

How to list all extractions created with ExtractRaw?

Hi!

I'm following the 'Tick History REST API Guide' to download historical market data. I made a POST request to 'https://selectapi.datascope.refinitiv.com/RestApi/v1/Extractions/ExtractRaw' with this body:

{"ExtractionRequest":
    {
"@odata.type": "#DataScope.Select.Api.Extractions.ExtractionRequests.TickHistoryIntradaySummariesExtractionRequest",
"ContentFieldNames": ["High", "Last", "Low", "Open", "Volume"],
"IdentifierList": {
"@odata.type": "#DataScope.Select.Api.Extractions.ExtractionRequests.InstrumentIdentifierList",
"InstrumentIdentifiers": [{"Identifier": "CLZ2", "IdentifierType": "Ric"}],
"UseUserPreferencesForValidationOptions": "False"
},
"Condition": {
"MessageTimeStampIn": "GmtUtc",
"ReportDateRangeType": "Range",
"QueryStartDate": "2022-08-22T00:00:00Z",
"QueryEndDate": "2022-08-23T23:59:00Z",
"DisplaySourceRIC": "True",
"SummaryInterval": "OneMinute"
}
}
}
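For reference, this is roughly how I build and serialize that body in Python (the `build_intraday_request` helper is my own wrapper, not part of the API; actually sending it still requires a valid Authorization token):

```python
import json

# Hypothetical helper that builds the on-demand intraday-summaries payload
# shown above; field values mirror the request body in this post.
def build_intraday_request(ric, start, end, interval="OneMinute"):
    return {
        "ExtractionRequest": {
            "@odata.type": "#DataScope.Select.Api.Extractions.ExtractionRequests."
                           "TickHistoryIntradaySummariesExtractionRequest",
            "ContentFieldNames": ["High", "Last", "Low", "Open", "Volume"],
            "IdentifierList": {
                "@odata.type": "#DataScope.Select.Api.Extractions."
                               "ExtractionRequests.InstrumentIdentifierList",
                "InstrumentIdentifiers": [
                    {"Identifier": ric, "IdentifierType": "Ric"}
                ],
                "UseUserPreferencesForValidationOptions": "False",
            },
            "Condition": {
                "MessageTimeStampIn": "GmtUtc",
                "ReportDateRangeType": "Range",
                "QueryStartDate": start,
                "QueryEndDate": end,
                "DisplaySourceRIC": "True",
                "SummaryInterval": interval,
            },
        }
    }

body = build_intraday_request("CLZ2", "2022-08-22T00:00:00Z", "2022-08-23T23:59:00Z")
print(json.dumps(body, indent=2)[:80])
```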

I got an ExtractionId in the response ('0x0830ea3024a33b1c'). So, I have two questions:

1. How can I get all extractions made by me? I tried a GET request to https://selectapi.datascope.refinitiv.com/RestApi/v1/Extractions/ReportExtractions but the result is empty:
{
    "@odata.context": "https://selectapi.datascope.refinitiv.com/RestApi/v1/$metadata#ReportExtractions",
    "value": []
}

2. How can I delete the extraction ('0x0830ea3024a33b1c') and all related files once I no longer need them?

tick-history-rest-api


@d.alishev

Sorry for the issue you are facing, let me see if I can help you in resolving this.

You can use the /Jobs/Jobs endpoint with the GET method to list all extraction jobs.

The output looks like this:

{
    "JobId": "0x0830eadfb9933af4",
    "UserId": xxx,
    "Status": "Completed",
    "StatusMessage": " ",
    "Description": "TickHistoryTimeAndSalesReportTemplate Extraction",
    "ProgressPercentage": 0,
    "CreateDate": "2022-10-04T01:32:22.826Z",
    "StartedDate": "2022-10-04T01:32:22.826Z",
    "CompletionDate": "2022-10-04T01:35:56.156Z",
    "MonitorUrl": "https://selectapi.datascope.refinitiv.com/restapi/v1/Extractions/ExtractRawResult(ExtractionId='0x0830eadfb9933af4')"
},
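A minimal Python sketch of that call, assuming a valid DSS session token (the helper name is mine, and the request is built but not sent here):

```python
import urllib.request

BASE = "https://selectapi.datascope.refinitiv.com/RestApi/v1"

# Builds (but does not send) the GET request that lists all extraction jobs.
# `token` stands in for a valid DataScope Select session token.
def list_jobs_request(token):
    return urllib.request.Request(
        f"{BASE}/Jobs/Jobs",
        headers={"Authorization": f"Token {token}"},
        method="GET",
    )

req = list_jobs_request("<your token>")
print(req.get_method(), req.full_url)
# Send with urllib.request.urlopen(req) once a real token is in place.
```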

You can cancel an in-progress extraction by using the DELETE method with the /Extractions/ExtractRawResult(ExtractionId='&lt;job id&gt;') endpoint.

1664852496786.png

The status of canceled jobs will be PendingCancellation.

{
    "@odata.context": "https://selectapi.datascope.refinitiv.com/RestApi/v1/$metadata#Jobs",
    "value": [
        {
            "JobId": "0x08314caddb733ba3",
            "UserId": xxx,
            "Status": "PendingCancellation",
            "StatusMessage": "PendingCancellation",
            "Description": "TickHistoryTimeAndSalesReportTemplate Extraction",
            "ProgressPercentage": 0,
            "CreateDate": "2022-10-04T02:59:10.086Z",
            "StartedDate": "2022-10-04T02:59:10.086Z",
            "MonitorUrl": "https://selectapi.datascope.refinitiv.com/restapi/v1/Extractions/ExtractRawResult(ExtractionId='0x08314caddb733ba3')"
        },
...
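The cancellation call above can be sketched the same way (again a hypothetical helper; the request is constructed but not sent):

```python
import urllib.request

BASE = "https://selectapi.datascope.refinitiv.com/RestApi/v1"

# Builds (but does not send) the DELETE call that cancels an in-progress
# on-demand extraction; job_id is the ExtractionId returned by ExtractRaw.
def cancel_extraction_request(token, job_id):
    return urllib.request.Request(
        f"{BASE}/Extractions/ExtractRawResult(ExtractionId='{job_id}')",
        headers={"Authorization": f"Token {token}"},
        method="DELETE",
    )

req = cancel_extraction_request("<your token>", "0x08314caddb733ba3")
print(req.get_method(), req.full_url)
```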

You can get a list of extracted files from the /Extractions/ExtractedFiles endpoint with the GET method. The output looks like this:

    "@odata.context": "https://selectapi.datascope.refinitiv.com/RestApi/v1/$metadata#ExtractedFiles",
    "value": [
        {
            "ExtractedFileId": "VjF8MHgwODMwZjU5ZTU4MzMzYjNkfA",
            "ReportExtractionId": "2000000455842161",
            "ScheduleId": "0x08314caddb733ba3",
            "FileType": "Note",
            "ExtractedFileName": "_OnD_0x08314caddb733ba3.csv.gz.notes.txt",
            "LastWriteTimeUtc": "2022-10-04T03:02:17.230Z",
            "ContentsExists": true,
            "Size": 953,
            "ReceivedDateUtc": "2022-10-04T03:02:17.230Z"
        },
        {
            "ExtractedFileId": "VjF8MHgwODMwZTk2NDYxNzMzYWYwfA",
            "ReportExtractionId": "2000000455817736",
            "ScheduleId": "0x0830eadf7fb33af6",
            "FileType": "Note",
            "ExtractedFileName": "_OnD_0x0830eadf7fb33af6.csv.gz.notes.txt",
            "LastWriteTimeUtc": "2022-10-04T01:45:36.740Z",
            "ContentsExists": true,
            "Size": 953,
            "ReceivedDateUtc": "2022-10-04T01:45:36.740Z"
        },

Then, you can delete a file by using the DELETE method with the /Extractions/ExtractedFiles('<ExtractedFileId>') endpoint.
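Both ExtractedFiles calls can be sketched as follows (hypothetical helpers; requests are built, not sent, and `token` stands in for a valid session token):

```python
import urllib.request

BASE = "https://selectapi.datascope.refinitiv.com/RestApi/v1"

# GET request that lists all extracted files.
def list_files_request(token):
    return urllib.request.Request(
        f"{BASE}/Extractions/ExtractedFiles",
        headers={"Authorization": f"Token {token}"},
        method="GET",
    )

# DELETE request that removes one file by its ExtractedFileId.
def delete_file_request(token, file_id):
    return urllib.request.Request(
        f"{BASE}/Extractions/ExtractedFiles('{file_id}')",
        headers={"Authorization": f"Token {token}"},
        method="DELETE",
    )

listing = list_files_request("<your token>")
deletion = delete_file_request("<your token>", "VjF8MHgwODMwZjU5ZTU4MzMzYjNkfA")
print(listing.full_url)
print(deletion.get_method(), deletion.full_url)
```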

1664852873469.png

For more information, please refer to the API Reference Tree.

I hope this will help.



Thank you! It is definitely what I was looking for! One more question: as I understand from @Gurpreet's answer, there is no need for a cleanup procedure on our side. So can I skip deleting files, since they will be deleted automatically?

Sorry, I found the answer in the manual:

On-demand extractions expire after 3 days 

@d.alishev

Correct. I checked the Best Practices & Fair Usage Policy for DataScope Select and Tick History guide and found the following statement.

1664874067681.png

The files from on-demand extractions will be removed after 3 days.


Hi @d.alishev,

There are two types of data extractions: scheduled and on-demand. On-demand extractions like the one you are executing are valid only while they are in progress. Once the extraction is complete and you have downloaded the resulting data file, no cleanup is required from the application. The extraction ID is invalidated automatically.
