Automatic Decompression in HTTP request for Tick History Time and Sales report

Hi there!
I'm trying to download the Tick History Time and Sales report via R library httr.
Apparently, the GET method decompresses it automatically, which could be a real problem for large reports.
Is there a way to work around this?
I read on this page that if the request doesn't have the "Accept-Encoding" header, the response shouldn't have the "Content-Encoding" header and the client shouldn't decompress it. But that is not true: the "Content-Encoding" header is still present.
So I don't know if the problem is with R or with the API.
Best Answer
-
The TRTH servers deliver compressed or uncompressed data depending on several factors, as described under the heading "Compression" on this page. For an On Demand Time and Sales extraction I believe it should always deliver compressed data, whatever you set in the GET header.
What does the response header contain? I would expect it to contain this:
Content-Encoding: gzip
Content-Type: text/plain
If that is the case, the content is compressed.
But some HTTP clients automatically decompress data when they receive compressed data (Postman does that); I guess you have run into that as well. You must disable httr's content decoding, using "config(http_content_decoding=0)" in the GET call.
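To confirm what the server actually sent, you can inspect the response headers with httr::headers(), which returns a named list with lower-cased names. A minimal sketch of the check (the hdrs list below is a hypothetical stand-in for httr::headers(r) on a real response):

# Hypothetical stand-in for httr::headers(r) on a real extraction response.
hdrs <- list("content-encoding" = "gzip", "content-type" = "text/plain")

# httr lower-cases header names, so look up "content-encoding" directly.
is_compressed <- identical(hdrs[["content-encoding"]], "gzip")
is_compressed  # TRUE means the body you received is still gzip-compressed

If this is TRUE but the data you see is plain text, your client decompressed it for you.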
Here is a code snippet that does it:
TRTHRawExtractionResults <- function(token, jobid, Path, Overwrite = TRUE) {
  url <- paste0("https://hosted.datascopeapi.reuters.com/RestApi/v1/Extractions/RawExtractionResults('", jobid, "')/$value")
  # http_content_decoding = 0 stops httr from gunzipping the body;
  # write_disk streams the still-compressed payload straight to Path.
  r <- httr::GET(url,
                 httr::add_headers(prefer = "respond-async", Authorization = token),
                 httr::config(http_content_decoding = 0),
                 httr::write_disk(Path, Overwrite),
                 httr::progress())
  httr::stop_for_status(r)
  return(r)
}
This is an extract from this article, which describes an example of R code that does a Time and Sales extraction.
If that does not help, please post your code so we can have a look at it.
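With content decoding disabled, the file written by write_disk stays gzip-compressed on disk. Base R can read it back with a gzfile connection, which decompresses on the fly rather than loading everything into memory at once. A minimal sketch, using a small generated file in place of a real downloaded report (the path and column names are hypothetical):

# Create a small gzip-compressed CSV to stand in for a downloaded report.
path <- tempfile(fileext = ".csv.gz")
con <- gzfile(path, "w")
writeLines(c("Timestamp,Price", "2020-01-02T09:30:00Z,101.5"), con)
close(con)

# read.csv accepts a connection; gzfile decompresses as it reads.
report <- read.csv(gzfile(path))
nrow(report)  # one data row after the header

For a multi-gigabyte report you would read it in chunks (e.g. readLines on the open connection) instead of one read.csv call.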
Answers
-
Thank you very much! It is working perfectly now.