Inconsistent result from DSS TRTH v2

Hi there @manigandan.r,
When I request the extraction result from a Time and Sales report schedule, I get different results using exactly the same request.
The reported file size is exactly the same, but my call to response.getEntity().getContent() appears to return inconsistent results. E.g. the last 2 calls I made to the API returned 69937 vs 101031 lines of T&S data.
This is how I set up my client:
private CloseableHttpClient httpclient = HttpClientBuilder.create().build();
.....
String urlGet = this.url + "/Extractions/ReportExtractions('" + reportExtractionId + "')/Files";
HttpGet request = new HttpGet(urlGet);
request.addHeader("Authorization", "Token " + this.sessionToken);
HttpResponse response = this.httpclient.execute(request);
BufferedReader rd = new BufferedReader(new InputStreamReader(response.getEntity().getContent()));
StringBuffer result = new StringBuffer();
String line = "";
while ((line = rd.readLine()) != null) {
    result.append(line);
}
JSONObject jsonGetResponse = new JSONObject(result.toString());
JSONArray valueJArray = jsonGetResponse.getJSONArray("value");
String ExtractedFileId = "";
String ExtractedFileName = "";
String FileType = "";
boolean success = false;
for (int i = 0; i < valueJArray.length(); i++) {
    ExtractedFileId = valueJArray.getJSONObject(i).getString("ExtractedFileId");
    ......
    // FileType = "Full"
    urlGet = this.url + "/Extractions/ExtractedFiles('" + ExtractedFileId + "')/$value";
    request = new HttpGet(urlGet);
    request.addHeader("Authorization", "Token " + this.sessionToken);
    response = this.httpclient.execute(request);
    HttpEntity entity = response.getEntity();
    rd = new BufferedReader(new InputStreamReader(entity.getContent()));
    ......
    while ((line = rd.readLine()) != null) {
        ...
    }
    ......
}
The same code works fine for a Depth extraction, which also produces gzipped file output.
Am I missing something here, please?
Thanks
Best Answer
-
Sorry this took some time, but it was a complex nut to crack. The support team found a solution; kudos go to them.

The key problem is that the merged GZip files cause issues for the default decompression methods. For Java, the Apache Commons Compress library (https://commons.apache.org/proper/commons-compress/index.html) seems like a safe option. However, it needs to be implemented correctly. GZIPInputStream from the java.util.zip package will work if decompression is done on the local machine, but may fail if used while reading directly from the HTTP stream.

Key factors for the Apache Commons Compress implementation:
- Disable content compression on the HTTP client:
private CloseableHttpClient httpclient = HttpClientBuilder.create().disableContentCompression().build();
- Set decompressConcatenated = true when initializing the stream:
GzipCompressorInputStream gis = new GzipCompressorInputStream(myURLConnection.getInputStream(), true);

If these are set, you can use in-stream decompression and process the stream while downloading the file.

In a nutshell, here are the code details:
- Add the Apache Commons Compress library:
import org.apache.commons.compress.compressors.gzip.GzipCompressorInputStream;
- Use this setting when declaring the httpclient:
private CloseableHttpClient httpclient = HttpClientBuilder.create().disableContentCompression().build();

Full code of a method to extract and display on the fly is attached: solutioncode.txt.
I will update all our Java samples; in the next version they will use this method.
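For illustration, here is a small, self-contained sketch (not the attached solutioncode.txt; the sample strings and class name are made up) of what decompressConcatenated changes when a gzip payload contains several merged members, as the extraction files do:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;
import org.apache.commons.compress.compressors.gzip.GzipCompressorInputStream;

public class ConcatGzipDemo {

    // Compress a string into a single gzip member.
    static byte[] gzip(String s) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(s.getBytes(StandardCharsets.UTF_8));
        }
        return bos.toByteArray();
    }

    // Two members concatenated back to back, mimicking a merged GZip file.
    static byte[] buildMerged() throws IOException {
        ByteArrayOutputStream merged = new ByteArrayOutputStream();
        merged.write(gzip("first member\n"));
        merged.write(gzip("second member\n"));
        return merged.toByteArray();
    }

    static String drain(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) out.write(buf, 0, n);
        return out.toString("UTF-8");
    }

    // Default (decompressConcatenated = false): stops after the first member.
    static String readFirstMemberOnly(byte[] data) throws IOException {
        try (GzipCompressorInputStream gis =
                new GzipCompressorInputStream(new ByteArrayInputStream(data))) {
            return drain(gis);
        }
    }

    // decompressConcatenated = true: reads every member in the stream.
    static String readAllMembers(byte[] data) throws IOException {
        try (GzipCompressorInputStream gis =
                new GzipCompressorInputStream(new ByteArrayInputStream(data), true)) {
            return drain(gis);
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] merged = buildMerged();
        System.out.print(readFirstMemberOnly(merged)); // prints "first member"
        System.out.print(readAllMembers(merged));      // prints both members
    }
}
```

Reading from the HTTP response works the same way: pass entity.getContent() to the GzipCompressorInputStream constructor with true as the second argument, keeping content compression disabled on the client so the entity bytes arrive untouched.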
Answers
-
@chinelo.okoli, the code snippets you posted look OK.
Could you share your entire code so I can test it? Could you also share your report template and instrument list? If you do not want to post these on the forum, you can also send them to me directly.
Added notes:
Our sample code DSS2ImmediateScheduleTicksTRTHClient2 (available under the downloads tab of TRTH REST) was tested for extractions up to 14MB gzipped (99MB unzipped) without issue. Could you try that one with your template and instrument list to see if it exhibits the same issue?
If you are using a pre-defined instrument list and report template on the server, I can send you a variant of DSS2ImmediateScheduleTicksTRTHClient2 that uses them, which you could use for testing.
When you get 2 files of different size, what does the last line of data in the file look like? Is it truncated?
-
All the TRTH REST API Java samples have been checked, and updated where required to implement the correction. The new package, which also includes new samples and enhanced functionality, is available for download.
-
Thanks a lot, Christiaan