

Time and Sales stream response capped

Hi,

I am retrieving Time and Sales data in a .NET application for up to 15 RICs over a maximum 15-minute query window. The extraction note for one such request:


Extraction Services Version 16.1.44123 (ed92f5e3e332), Built Aug 10 2022 01:55:41
User ID: -------
Extraction ID: 2000000447520516
Correlation ID: CiD/-------/0x0000000000000000/REST API/EXT.2000000447520516
Schedule: 0x082acb91eea331e9 (ID = 0x0000000000000000)
Input List (15 items):  (ID = 0x082acb91eea331e9) Created: 14/09/2022 12:19:01 Last Modified: 14/09/2022 12:19:01
Report Template (2 fields): _OnD_0x082acb91eea331e9 (ID = 0x082acb91eeb331e9) Created: 14/09/2022 12:17:57 Last Modified: 14/09/2022 12:17:57
Schedule dispatched via message queue (0x082acb91eea331e9), Data source identifier (F5CE03EFE93B42DEAF241C749ED80AE7)
Schedule Time: 14/09/2022 12:17:58
Processing started at 14/09/2022 12:17:58
Processing completed successfully at 14/09/2022 12:19:01
Extraction finished at 14/09/2022 12:19:01 UTC, with servers: tm07n01, TRTH (54.806 secs)
Historical Instrument <RIC,FGBMH0> expanded to 1 RIC: FGBMH0.
Historical Instrument <RIC,FGBSH0> expanded to 1 RIC: FGBSH0.
Historical Instrument <RIC,STXEH0> expanded to 1 RIC: STXEH0.
Historical Instrument <RIC,FFIH0> expanded to 1 RIC: FFIH0.
Historical Instrument <RIC,UNA.AS> expanded to 1 RIC: UNA.AS.
Instrument <RIC,IMCD.AS> expanded to 1 RIC: IMCD.AS.
Instrument <RIC,OREP.PA> expanded to 1 RIC: OREP.PA.
Instrument <RIC,PHG.AS> expanded to 1 RIC: PHG.AS.
Instrument <RIC,BAR.BR> expanded to 1 RIC: BAR.BR.
Instrument <RIC,ABI.BR> expanded to 1 RIC: ABI.BR.
Instrument <RIC,ATCOa.ST> expanded to 1 RIC: ATCOa.ST.
Instrument <RIC,AFPD.S> expanded to 1 RIC: AFPD.S.
Instrument <RIC,LVMH.PA> expanded to 1 RIC: LVMH.PA.
Instrument <RIC,AMPF.MI> expanded to 1 RIC: AMPF.MI.
Instrument <RIC,CHRH.CO> expanded to 1 RIC: CHRH.CO.
Total instruments after instrument expansion = 15

Range Query from 2019-12-20T09:02:34.000 to 2019-12-20T09:11:27.000 (UTC)
Quota Message: INFO: Tick History Compliance Quota Count Before Extraction: 2802; Instruments Approved for Extraction: 14; Tick History Compliance Quota Count After Extraction: 2802, 28.02% of Limit; Tick History Compliance Quota Limit: 10000
Manifest: #RIC,Domain,Start,End,Status,Count
Manifest: ABI.BR,Market Price,2019-12-20T09:02:39.336925949Z,2019-12-20T09:11:26.421164980Z,Active,930
Manifest: AFPD.S,Market Price,,,Inactive,0
Manifest: AMPF.MI,Market Price,2019-12-20T09:02:34.241595786Z,2019-12-20T09:11:26.674041158Z,Active,618
Manifest: ATCOa.ST,Market Price,2019-12-20T09:02:34.202579216Z,2019-12-20T09:11:24.300485901Z,Active,628
Manifest: BAR.BR,Market Price,2019-12-20T09:02:38.593195816Z,2019-12-20T09:09:59.962822200Z,Active,141
Manifest: CHRH.CO,Market Price,2019-12-20T09:02:41.101816484Z,2019-12-20T09:11:26.986073709Z,Active,208
Manifest: FFIH0,Market Price,2019-12-20T09:02:35.106607821Z,2019-12-20T09:11:25.914743782Z,Active,2071
Manifest: FGBMH0,Market Price,2019-12-20T09:02:34.110009742Z,2019-12-20T09:11:26.167625056Z,Active,4086
Manifest: FGBSH0,Market Price,2019-12-20T09:02:35.612086751Z,2019-12-20T09:11:26.669088468Z,Active,899
Manifest: IMCD.AS,Market Price,2019-12-20T09:02:37.045480176Z,2019-12-20T09:11:24.190847735Z,Active,396
Manifest: LVMH.PA,Market Price,2019-12-20T09:02:39.337839279Z,2019-12-20T09:11:24.194082022Z,Active,3583
Manifest: OREP.PA,Market Price,2019-12-20T09:02:34.404629591Z,2019-12-20T09:11:24.597156067Z,Active,1878
Manifest: PHG.AS,Market Price,2019-12-20T09:02:37.046180441Z,2019-12-20T09:11:24.934922999Z,Active,1636
Manifest: STXEH0,Market Price,2019-12-20T09:02:34.006535645Z,2019-12-20T09:11:26.355424258Z,Active,18739
Manifest: UNA.AS,Market Price,2019-12-20T09:02:37.808890854Z,2019-12-20T09:11:26.117026806Z,Active,6738


I am using IExtractionsContext.ExtractRawAsync() and subsequently IExtractionsContext.GetReadStreamAsync(RawExtractionResult) to obtain a response stream, which I then process with a StreamReader.
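
Roughly, the retrieval pattern looks like the sketch below. This is simplified: the exact DSS .NET SDK signatures are written from memory and may differ slightly (for instance, the read-stream call may hand back a wrapper around the Stream rather than the Stream itself), and error handling is omitted.

RawExtractionResult extractionResult =
    await extractionsContext.ExtractRawAsync(extractionsContext, extractionRequest, cancellation);

// Read the (automatically decompressed) CSV response line by line.
using (Stream responseStream = await extractionsContext.GetReadStreamAsync(extractionResult))
using (var reader = new StreamReader(responseStream))
{
    string line;
    while ((line = await reader.ReadLineAsync()) != null)
    {
        // process one Time and Sales CSV row here; for the request above,
        // this loop ends prematurely, as described below
    }
}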


For the extraction note above, however, the response stream stops after 13,466 lines (almost at the end of the LVMH.PA data). Looking at the manifest Count values, the limit of 10,000 is reached within the block of LVMH.PA lines. Coincidence? Because this quota limit applies to RICs, not to retrieved lines, correct?


I notice this behaviour consistently in other requests with more than 10,000 lines of reply data. Still a coincidence, or am I overlooking something?


Any help/info much appreciated !




Hello @Sven Thomas ,

Please share the request - we will run it on our side to verify the suspected issue.

You are correct - the extraction limits are per instrument returned, not per line.

Hi @Sven Thomas ,

Thank you for your participation in the forum. Is the reply below satisfactory in resolving your query?
If so, could you please click the 'Accept' text on the left side of the appropriate reply? This will guide all community members who have a similar question.

Thanks,
AHS


Hello @Sven Thomas ,

The Tick History result is always gzipped, and this is not adjustable: Tick History does not allow customizing the compression format. Please see DSS Key Mechanisms -> Streaming -> Compression for more information.

Please also see this previous discussion thread and the answer from Veerapath at the bottom; hope this will be of help.

If not, please include the details of how to reproduce the multipart gzip that causes the issue for you, so we can try to reproduce the same on our side.

One factor that may be relevant: oversized Tick History requirements that are implemented with a custom request approach (rather than VBD) often have to be partitioned into sub-requests. In that case we would first collect all of the zipped sub-results needed for the use case, and then decompress all of the pieces locally, outside of the DSS .NET SDK. .NET decompression is usually used as on-the-fly automatic decompression.



Hi @zoya faberov ... by 'share the request' do you mean the source code?


var extractionRequest = new TickHistoryTimeAndSalesExtractionRequest()
{
    IdentifierList = new InstrumentIdentifierList
    {
        InstrumentIdentifiers = new DataScope.Select.Api.Content.InstrumentIdentifier[]
        {
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "ABI.BR", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "AFPD.S", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "AMPF.MI", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "ATCOa.ST", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "BAR.BR", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "CHRH.CO", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "FFIH0", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "FGBMH0", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "FGBSH0", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "IMCD.AS", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "LVMH.PA", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "OREP.PA", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "PHG.AS", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "STXEH0", IdentifierType = IdentifierType.Ric },
            new DataScope.Select.Api.Content.InstrumentIdentifier { Identifier = "UNA.AS", IdentifierType = IdentifierType.Ric }
        },
        ValidationOptions = new InstrumentValidationOptions { AllowHistoricalInstruments = true },
        UseUserPreferencesForValidationOptions = false
    },
    Condition = new TickHistoryTimeAndSalesCondition()
    {
        ReportDateRangeType = ReportDateRangeType.Range,
        ExtractBy = TickHistoryExtractByMode.Ric,
        MessageTimeStampIn = TickHistoryTimeOptions.GmtUtc,
        SortBy = TickHistorySort.SingleByRic,
        TimeRangeMode = TickHistoryTimeRangeMode.Inclusive,
        QueryStartDate = new DateTimeOffset(new DateTime(2019, 12, 20, 9, 2, 34, DateTimeKind.Utc)),
        QueryEndDate = new DateTimeOffset(new DateTime(2019, 12, 20, 9, 11, 27, DateTimeKind.Utc)),
        DaysAgo = null
    },
    ContentFieldNames = new[]
    {
        RefinitivContentFieldNames.TimeAndSales.BidPrice,
        RefinitivContentFieldNames.TimeAndSales.AskPrice
    }
};


extractionsContext.Options.AutomaticDecompression = true;
            
RawExtractionResult extractionResult = await extractionsContext.ExtractRawAsync(extractionsContext, extractionRequest, cancellation);


extractionsContext is of type IExtractionsContext, on which I subsequently call GetReadStreamAsync; it is of course created with our RTH account credentials, well, you know what I mean.

Is that enough info for you to run the verification?

Many thanks already !



Hi @zoya faberov , I dug a bit further into this ...

I turned off AutomaticDecompression and persisted the returned response stream to a file (gzip).
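
In simplified form, that workaround looks roughly like this (same extractionsContext, extractionRequest and cancellation token as in my earlier post; the target path is just an example and the exact SDK signatures may differ slightly):

extractionsContext.Options.AutomaticDecompression = false;

RawExtractionResult extractionResult =
    await extractionsContext.ExtractRawAsync(extractionsContext, extractionRequest, cancellation);

// Persist the still-compressed payload untouched; it gets unzipped separately afterwards.
using (Stream responseStream = await extractionsContext.GetReadStreamAsync(extractionResult))
using (FileStream gzFile = File.Create(@"C:\temp\TimeAndSales.csv.gz"))
{
    await responseStream.CopyToAsync(gzFile);
}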

Magically, the unzipped file contains all expected lines ... the sum of all manifest Count values.

So now my best guess is that it has something to do with AutomaticDecompression. I came across the following thread on the forum ... Missing completed extraction returned via API - Forum | Refinitiv Developer Community

The thread mentions an advisory but the link seems outdated.

Once again ... any help/info ... mucho appreciado !!!




Hello @Sven Thomas ,

Thanks for sharing your request and for letting us know.

You are absolutely correct, disabling decompression is often helpful for longer extraction times and larger results.

The other feature that is often of help is Async, as described in detail in Key Mechanisms -> Async.

So, having run the request on our side and seen part of the result missing, I did both, i.e.

// ExtractionsContext.Preferences.WaitSeconds = 5;
// ExtractionsContext.Options.AutomaticDecompression = true;

And

//Start the job
var job = extractionsContext.ExtractRawStart(extractionRequest);
Status.Notify(extractionsContext, null, "MonitorJob", MethodType.Operation, Publish.Primary);
while (job != null && job.Status != JobStatus.Completed && job.Status != JobStatus.Error
       && job.Status != JobStatus.Cancelled && job.Status != JobStatus.Purged)
{
    //Retrieve an updated status of the job
    job = extractionsContext.MonitorJob(job);
}

//Output
if (job != null && job.Result != null && job.Status == JobStatus.Completed)
{
...

On my side, I have verified the complete result for your request as well.

But thanks to your information, we also know that simply not decompressing is sufficient for this request size, and start+monitor job is not strictly required.

Another note you may find useful: once we have the jobId, it can be helpful to quick-check the job status with Postman (expecting completed = 200) and to quick-get the result with Postman on the jobId as well, then compare the result - it was much larger for your request. This way I knew that the job submitted with the .NET SDK was completing successfully (no issue with the request or permissions), and that the result as retrieved via the .NET SDK on the spot was not the complete result.
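
For reference, the same quick-check can also be scripted instead of using Postman; a rough HttpClient sketch is below. The host name and resource paths are assumptions based on the public DSS REST documentation (the monitor URL normally comes from the Location header returned by ExtractRaw), and the token and jobId placeholders must come from your own session.

using System;
using System.Net.Http;
using System.Threading.Tasks;

class JobQuickCheck
{
    static async Task Main()
    {
        using var http = new HttpClient();
        // DSS REST authentication header; obtain the token from your own session.
        http.DefaultRequestHeaders.TryAddWithoutValidation("Authorization", "Token <your session token>");

        // 1) Job status: 202 = still in progress, 200 = completed.
        var monitorUrl =
            "https://selectapi.datascope.refinitiv.com/RestApi/v1/Extractions/ExtractRawResult(ExtractionId='<jobId>')";
        using var statusResponse = await http.GetAsync(monitorUrl);
        Console.WriteLine($"Monitor status: {(int)statusResponse.StatusCode}");

        // 2) Raw (gzipped) result: compare its size with what the SDK handed back.
        var resultUrl =
            "https://selectapi.datascope.refinitiv.com/RestApi/v1/Extractions/RawExtractionResults('<jobId>')/$value";
        using var resultResponse = await http.GetAsync(resultUrl);
        byte[] payload = await resultResponse.Content.ReadAsByteArrayAsync();
        Console.WriteLine($"Result size: {payload.Length} bytes (compressed)");
    }
}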

Hope that this information helps




Hi @zoya faberov

Sorry, but I'm not quite following your suggestion to use the Async mechanism ... I'm actually using ExtractRawAsync already. When this method returns an ExtractionResult, I would expect it to be complete whether AutomaticDecompression is true or false. This seems like a flaw in the DSS API to me, and apparently it has been reported before, I see now.

Anyway, I will work on the solution without automatic decompression, un-gzipping myself.

Many thanks !

CC @aleksandra.kluczniak




Good afternoon @zoya faberov

With AutomaticDecompression set to false and decompressing on our end, I now run into another issue. The TRTH API seems to return multipart (multi-member) gzip data, which is not supported by the standard .NET System.IO.Compression.GZipStream. As far as I know, only the third-party library SharpZipLib supports this. Do you perhaps have more experience with this, or can you recommend another decompression option for multipart gzip?
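
For reference, the SharpZipLib route I am experimenting with looks roughly like this (package ICSharpCode.SharpZipLib; the file name is just an example, and GZipInputStream is used here on the assumption that it keeps reading across concatenated gzip members):

using System;
using System.IO;
using ICSharpCode.SharpZipLib.GZip;

class MultiMemberGunzip
{
    static void Main()
    {
        using var compressed = File.OpenRead(@"C:\temp\TimeAndSales.csv.gz");
        using var gzip = new GZipInputStream(compressed);
        using var reader = new StreamReader(gzip);

        long lines = 0;
        while (reader.ReadLine() != null)
            lines++;   // count every Time and Sales row across all gzip members

        Console.WriteLine($"Decompressed {lines} lines.");
    }
}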

Are there any plans on the Refinitiv side for other types of data compression?


