Extracting TRTH Time and Sales Data to a File

Hello. When I try to extract Time and Sales data to a file, I only get partial data. I've attached a snippet of my code for the extraction request, which writes to a file on my local machine. Am I missing a step, like aggregating files?
StreamWriter clear = new StreamWriter(dataOutputFile, false);
clear.Close();
Console.WriteLine("Cleared data output file " + dataOutputFile + "\n");

TickHistoryTimeAndSalesExtractionRequest extractionRequest = new TickHistoryTimeAndSalesExtractionRequest
{
    IdentifierList = InstrumentIdentifierList.Create(instrumentIdentifiers),
    Condition = reportTemplate.Condition,
    ContentFieldNames = { "Trade - Price", "Trade - Volume", "Quote - Bid Price", "Quote - Bid Size", "Quote - Ask Price", "Quote - Ask Size" }
};

extractionsContext.Options.AutomaticDecompression = true; // Decompress gzip to plain text
RawExtractionResult extractionResult = extractionsContext.ExtractRaw(extractionRequest);
Console.Write("{0:T} Extraction complete ... ", DateTime.Now);
DebugPrintAndWaitForEnter("");

sw = new StreamWriter(dataOutputFile, true);
Console.WriteLine("==================================== DATA =====================================");
DssStreamResponse streamResponse = extractionsContext.GetReadStream(extractionResult);
using (StreamReader reader = new StreamReader(streamResponse.Stream))
{
    string line = reader.ReadLine();
    if (string.IsNullOrEmpty(line))
    {
        Console.WriteLine("WARNING: no data returned. Check your request dates.");
        sw.WriteLine("WARNING: no data returned. Check your request dates.");
    }
    else
    {
        // The first line is the list of field names:
        sw.WriteLine(line);
        Console.WriteLine(line);
        // The remaining lines are the data.
        // Variant 1: write all lines individually to file and console (with a limit on the number of lines):
        for (int lineCount = 0; lineCount < maxDataLines && !reader.EndOfStream; lineCount++)
        {
            line = reader.ReadLine();
            sw.WriteLine(line);
            Console.WriteLine(line);
        }
        // Variant 2: write all lines to either file or console (not both!):
        //sw.WriteLine(line = reader.ReadToEnd());
        //Console.WriteLine(line = reader.ReadToEnd());
    }
}
sw.Close();
DebugPrintAndWaitForEnter("===============================================================================");
Best Answer
- What value of maxDataLines did you set in your test application? If it is the same value as in our tutorial (1000000), it may not be enough to hold the unpacked result. Some reports can generate more than 1,000,000 lines of data in CSV format, so you have to increase the value.
- Can you provide more details about the reportTemplate.Condition you are testing? I would like to test the instrument with the same date range.
- Please change ExtractionsContext.Options.AutomaticDecompression to false, and then change the code that writes the result to a file to:
// Download the result
using (var response = ExtractionsContext.RawExtractionResultOperations.GetReadStream(result))
using (var fileStream = File.Create("savedExtraction.gzip"))
    response.Stream.CopyTo(fileStream);

Then you can unpack the .gzip file and open savedExtraction.csv to compare the number of lines in the unpacked data.
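To make that comparison concrete, here is a minimal sketch of the local verification step, using only System.IO and System.IO.Compression from the .NET base library (the file name follows the snippet above; no TRTH SDK types are needed for this part):

```csharp
// Decompress savedExtraction.gzip and count its lines, so the total can be
// compared against the maxDataLines limit used in the extraction loop.
using System;
using System.IO;
using System.IO.Compression;

class ExtractionLineCounter
{
    static void Main()
    {
        long lineCount = 0;
        using (FileStream compressed = File.OpenRead("savedExtraction.gzip"))
        using (GZipStream gzip = new GZipStream(compressed, CompressionMode.Decompress))
        using (StreamReader reader = new StreamReader(gzip))
        {
            while (reader.ReadLine() != null)
                lineCount++;
        }
        // The first line is the CSV header, so data rows = lineCount - 1.
        Console.WriteLine("Total lines (including header): " + lineCount);
    }
}
```

If this count exceeds maxDataLines, the truncation happened in the display loop rather than in the extraction itself.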
Answers
This code is from our .NET Tutorial 5. It was only meant as a demo, not as productized code. We recently discovered that the native libraries are not designed to handle merged GZip files and are not very robust, especially when decompressing data on the fly, so they might fail with large data sets. I am reviewing the code for .NET Tutorials 4 and 5, and as soon as I have something more robust I will make it available.
Yes, I was using the demos to write a program to automate a daily extraction. I will remove the debugging code, etc., later. Do you have a data dictionary for the TRTH API?
Ok, thanks. I will increase the size. I am trying to extract the data from midnight to 1 AM, but I only get 13 seconds of history in the file. I'll try downloading to gzip. My report condition is below.
reportTemplate.Condition = new TickHistoryTimeAndSalesCondition
{
    MessageTimeStampIn = TickHistoryTimeOptions.GmtUtc,
    QueryStartDate = new DateTimeOffset(2017, 07, 20, 0, 0, 0, TimeSpan.FromHours(0)),
    QueryEndDate = new DateTimeOffset(2017, 07, 20, 1, 0, 0, TimeSpan.FromHours(0)),
    ReportDateRangeType = ReportDateRangeType.Range,
    ExtractBy = TickHistoryExtractByMode.Ric,
    SortBy = TickHistorySort.SingleByRic,
    Preview = PreviewMode.None,
    DisplaySourceRIC = false
};
@helen.ristov, there is a data dictionary under the documentation tab. Is that what you are looking for ?
On the code sample topic I made good progress. If all goes well tomorrow I will publish new samples and update .Net tutorials 4 & 5 with more robust code to display or save the data. But first I want to test them with large data sets, and that takes time ...
No, I have been using the demos to write code because I need to automate a daily data pull. The demos give examples of the intraday summary reports, but I need to download tick history. Do you have any other API help documentation? Let me know when you publish new samples. Thanks!
I just tried running the code you mentioned to save the file, and I got a small file again. Attached here.
using (var response = extractionsContext.RawExtractionResultOperations.GetReadStream(extractionResult))
using (var fileStream = File.Create("E:\\HomeShare\\apitest.txt"))
    response.Stream.CopyTo(fileStream);
I got it to work by extracting to gzip and setting the decompression option to false.
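For anyone landing here with the same problem, here is a sketch of the pattern that worked. The variable names follow the earlier snippets, the SDK calls are the ones shown in the answers above, and GZipStream comes from System.IO.Compression; exact API names may vary between SDK versions:

```csharp
// Leave the payload compressed on the wire, save the raw gzip to disk,
// then decompress locally in a second pass instead of on the fly.
extractionsContext.Options.AutomaticDecompression = false; // keep the stream as gzip

RawExtractionResult extractionResult = extractionsContext.ExtractRaw(extractionRequest);

// Save the compressed payload unchanged:
using (var response = extractionsContext.RawExtractionResultOperations.GetReadStream(extractionResult))
using (var fileStream = File.Create("savedExtraction.csv.gz"))
    response.Stream.CopyTo(fileStream);

// Decompress in a separate step, avoiding on-the-fly gzip handling:
using (var compressed = File.OpenRead("savedExtraction.csv.gz"))
using (var gzip = new GZipStream(compressed, CompressionMode.Decompress))
using (var output = File.Create("savedExtraction.csv"))
    gzip.CopyTo(output);
```

Splitting the download and the decompression sidesteps the merged-GZip issue described earlier, since the full file is on disk before any decompression starts.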
@helen.ristov, I have just uploaded a new set of C# code samples. Changes:
- Samples 4 and 5 are enhanced with better code to treat data on the fly.
- Sample 5 now has code to save the compressed data file.
- Note: there is a requirement for an additional library (the DLL is in the new code sample package).
.Net Tutorials 4 & 5 were updated accordingly.
Hope this helps !