Tick History REST API: Error in handle_url(handle, url, ...): can't find object 'location'
Hi. I'm following this tutorial (https://developers.refinitiv.com/en/article-catalog/article/using-tick-history-in-r-language-part-3) to use the Refinitiv REST API in R, but I think the question is general and applies to other languages (Python, C, etc.) as well.
I'm not able to get results within the 30-second time-out period, so I'm polling with the GET method. When running:
url <- "https://selectapi.datascope.refinitiv.com/RestApi/v1/Extractions/ExtractRaw"
r <- httr::GET(location,add_headers(prefer = "respond-async",Authorization = token))
I get the error:
Error in handle_url(handle, url, ...) : can't find object 'location'
But the tutorial never mentions how to define 'location'. Can I get some help with this issue? Thank you.
Best Answer
Referring to the source code on GitHub, the Location is in the HTTP response headers when the HTTP status code is 202.
RTHExtractRaw <- function(b, path, overwrite = FALSE) {
  url <- "https://selectapi.datascope.refinitiv.com/RestApi/v1/Extractions/ExtractRaw"
  token <- get("token", envir = cacheEnv)
  # Submit the extraction request asynchronously
  r <- httr::POST(url, add_headers(prefer = "respond-async", Authorization = token),
                  content_type_json(), body = b, encode = "json")
  if (status_code(r) == 202) {
    # Accepted but not yet completed: return the monitor URL from the Location header
    message("The request has been accepted but has not yet completed executing asynchronously.\r\nReturn monitor URL\r\n", r$headers$location)
    return(invisible(r$headers$location))
  } else if (status_code(r) == 200) {
    # Completed synchronously: retrieve and save the results
    a <- content(r, "parsed", "application/json", encoding = "UTF-8")
    message(a$Notes)
    return(RTHRawExtractionResults(a$JobId, path, overwrite))
  } else {
    warn_for_status(r)
    a <- content(r, "parsed", "application/json", encoding = "UTF-8")
    return(a)
  }
}
Then, the URL in the Location header is used with the HTTP GET method to check the request status.
RTHCheckRequestStatus <- function(location, path, overwrite = FALSE) {
  token <- get("token", envir = cacheEnv)
  # Poll the monitor URL returned by RTHExtractRaw
  r <- GET(location, add_headers(prefer = "respond-async", Authorization = token))
  if (status_code(r) == 202) {
    message("The request has not yet completed executing asynchronously.\r\nPlease wait a bit and check the request status again.\r\n")
    return(invisible(r$headers$location))
  } else if (status_code(r) == 200) {
    # Completed: retrieve and save the results
    a <- content(r, "parsed", "application/json", encoding = "UTF-8")
    message(a$Notes)
    return(RTHRawExtractionResults(a$JobId, path, overwrite))
  } else {
    warn_for_status(r)
    a <- content(r, "parsed", "application/json", encoding = "UTF-8")
    return(a)
  }
}
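For reference, a minimal polling sketch built on the two helpers above could look like the following. It assumes library(httr) is loaded, the request body b and an output path are already defined, the token has been stored in cacheEnv by the tutorial's sign-in step, and the request takes the asynchronous (202) path; the 30-second wait is an arbitrary choice.
library(httr)

# Submit the extraction request; on HTTP 202 this returns the monitor URL (the "location")
location <- RTHExtractRaw(b, path)
token <- get("token", envir = cacheEnv)

# Poll the monitor URL until the request is no longer in progress (HTTP 202)
repeat {
  r <- GET(location, add_headers(prefer = "respond-async", Authorization = token))
  if (status_code(r) != 202) break
  Sys.sleep(30)   # wait before checking again
}

# The request has finished; retrieve and save the results with the helper above
result <- RTHCheckRequestStatus(location, path)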
Answers
Hi. Thank you. I'm now able to run the functions. However, should I define path as a CSV file or something else? When I set path to a CSV file and run RTHCheckRequestStatus(), I get some garbled text, as shown in the picture. What should I do?
Please share the request's body (b) used with the RTHExtractRaw function.
Hi, my body is:
b='{
"ExtractionRequest": {
"@odata.type": "#DataScope.Select.Api.Extractions.ExtractionRequests.TickHistoryIntradaySummariesExtractionRequest",
"ContentFieldNames": [
"Close Ask",
"Close Bid"
],
"IdentifierList": {
"@odata.type": "#DataScope.Select.Api.Extractions.ExtractionRequests.InstrumentIdentifierList",
"InstrumentIdentifiers": [
{ "Identifier": "AAPL.O", "IdentifierType": "Ric" }
],
"ValidationOptions": null,
"UseUserPreferencesForValidationOptions": false
},
"Condition": {
"MessageTimeStampIn": "LocalExchangeTime",
"ReportDateRangeType": "Range",
"QueryStartDate": "2020-08-17T11:00:00.000Z",
"QueryEndDate": "2020-08-19T11:30:00.000Z",
"SummaryInterval": "OneMinute",
"TimebarPersistence": true,
"DisplaySourceRIC": true
}
}
}'
From my test, the output is a gzip file. Therefore, we need to save it as a gzip file.
result = RTHCheckRequestStatus(location,"c:\\d_drive\\test.csv.gz")
Then, uncompress it to get the CSV file.
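If it helps, one way to read the compressed output back into R without unpacking it first is base R's gzfile() connection (using the same hypothetical file path as above):
# Read the gzip-compressed CSV directly through a gzfile connection
data <- read.csv(gzfile("c:\\d_drive\\test.csv.gz"))
head(data)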