TRTH Python API - Queuing Time

Question by gteage01 · Jun 30, 2017 at 07:09 AM · python · trthv2 · trth v2

Hi,

I received a question: "I see that generating the intraday summary report with a 5-second interval for 1 RIC for 1 month takes a long time (more than 30 minutes, excluding queuing time). Is this normal? Is there any way to speed up the process?"

I could not attach the .py file, so the code from it is copied below. I have removed the username and password from the Python script.

Best regards,

Gareth

-----------------------------------------------------------------------------------------------------------------------------------

# coding: utf-8

# In[4]:
# Step 1: token request
import requests
import json
import time

requestUrl = "https://hosted.datascopeapi.reuters.com/RestApi/v1/Authentication/RequestToken"

requestHeaders = {
    "Prefer": "respond-async",
    "Content-Type": "application/json"
}

requestBody = {
    "Credentials": {
        "Username": "",  # credentials removed before posting
        "Password": ""
    }
}

proxies = {'http': 'http://webproxy.ssmb.com:8080', 'https': 'http://webproxy.ssmb.com:8080'}

r1 = requests.post(requestUrl, json=requestBody, headers=requestHeaders, proxies=proxies)

if r1.status_code == 200:
    jsonResponse = json.loads(r1.text.encode('ascii', 'ignore'))
    token = jsonResponse["value"]
    print('Authentication token (valid 24 hours):')
    print(token)
else:
    print('Please replace myUserName and myPassword with valid credentials, then repeat the request')

# In[5]:
# Step 2: send an on demand extraction request using the received token
requestUrl = 'https://hosted.datascopeapi.reuters.com/RestApi/v1/Extractions/ExtractRaw'

requestHeaders = {
    "Prefer": "respond-async",
    "Content-Type": "application/json",
    "Authorization": "token " + token
}

requestBody = {
    "ExtractionRequest": {
        "@odata.type": "#ThomsonReuters.Dss.Api.Extractions.ExtractionRequests.TickHistoryIntradaySummariesExtractionRequest",
        "ContentFieldNames": [
            # "Close Ask",
            # "Close Bid",
            # "High",
            # "High Ask",
            # "High Bid",
            "Last",
            # "Low",
            # "Low Ask",
            # "Low Bid",
            # "No. Asks",
            # "No. Bids",
            "No. Trades",
            "Open",
            # "Open Ask",
            # "Open Bid",
            "Volume"
        ],
        # Alternative report type, kept commented out in the original script:
        # "@odata.type": "#ThomsonReuters.Dss.Api.Extractions.ExtractionRequests.TickHistoryTimeAndSalesExtractionRequest",
        # "ContentFieldNames": [
        #     "Trade - Price",
        #     "Trade - Volume"
        # ],
        "IdentifierList": {
            "@odata.type": "#ThomsonReuters.Dss.Api.Extractions.ExtractionRequests.InstrumentIdentifierList",
            "InstrumentIdentifiers": [
                {"Identifier": "ESU7", "IdentifierType": "Ric"}
            ],
            "UseUserPreferencesForValidationOptions": "false"
        },
        "Condition": {
            "MessageTimeStampIn": "GmtUtc",
            "ReportDateRangeType": "Range",
            "QueryStartDate": "2017-06-28T00:00:00.000Z",
            "QueryEndDate": "2017-06-29T00:00:00.000Z",
            "SummaryInterval": "FiveSeconds",
            "TimebarPersistence": "false",
            "DisplaySourceRIC": "true"
        }
    }
}

r2 = requests.post(requestUrl, json=requestBody, headers=requestHeaders, proxies=proxies)

# Display the response status, and the location URL to use to get the status of the extraction request.
# The initial response status (after approximately 30 seconds wait) will be 202.
print(r2.status_code)
print(r2.headers["location"])

# In[6]:
# Step 3: poll the status of the request using the received location URL, and get the jobId and extraction notes
requestUrl = r2.headers["location"]

requestHeaders = {
    "Prefer": "respond-async",
    "Content-Type": "application/json",
    "Authorization": "token " + token
}

while True:
    r3 = requests.get(requestUrl, headers=requestHeaders, proxies=proxies)
    if r3.status_code == 200:
        break
    else:
        print('Failed...Re-request in 30 secs...')
        time.sleep(30)

# When the status of the request is 200 the extraction is complete; display the jobId and the extraction notes.
print('response status = ' + str(r3.status_code))

if r3.status_code == 200:
    r3Json = json.loads(r3.text.encode('ascii', 'ignore'))
    jobId = r3Json["JobId"]
    print('jobId: ' + jobId + '\n')
    notes = r3Json["Notes"]
    print('Extraction notes:\n' + notes[0])
else:
    print('execute the cell again, until it returns a response status of 200')

# In[7]:
# Step 4: get the extraction results, using the received jobId
requestUrl = ("https://hosted.datascopeapi.reuters.com/RestApi/v1/Extractions/RawExtractionResults"
              + "('" + jobId + "')" + "/$value")

requestHeaders = {
    "Prefer": "respond-async",
    "Content-Type": "text/plain",
    "Accept-Encoding": "gzip",
    "Authorization": "token " + token
}

r4 = requests.get(requestUrl, headers=requestHeaders, proxies=proxies)
# print(r4.text)

# In[8]:
# Step 5 (cosmetic): format the response using a pandas DataFrame
from io import StringIO
import pandas as pd

timeSeries = pd.read_csv(StringIO(r4.text))
timeSeries

# In[ ]:


Beera.Rajesh · Jul 05, 2017 at 11:55 AM

Hi team, could someone look into this and provide an update? It has been quite a few days since this query was posted.

2 Replies

Best Answer
Answer by Nipat Kunvutipongsak · Aug 16, 2017 at 04:42 AM

Here is some useful information from the investigation of case 05649911:

This is expected behavior for TRTH: a request may take longer to process when the input RIC is extremely liquid.



Tick History has to parse a lot of ticks to create intraday summaries on the fly, so intraday extractions are expected to take longer. If you run the same extraction as a Time and Sales report you will receive a faster response, but with a huge number of messages.
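As a concrete illustration, the poster's script already contains the Time and Sales variant in commented-out form; swapping the Step 2 request body for something like the following would exercise that faster path. This is only a sketch based on those commented lines, reusing the same identifier and date range; the field names come from the poster's comments and may need to be checked against the report template.

requestBody = {
    "ExtractionRequest": {
        "@odata.type": "#ThomsonReuters.Dss.Api.Extractions.ExtractionRequests.TickHistoryTimeAndSalesExtractionRequest",
        "ContentFieldNames": [
            "Trade - Price",
            "Trade - Volume"
        ],
        "IdentifierList": {
            "@odata.type": "#ThomsonReuters.Dss.Api.Extractions.ExtractionRequests.InstrumentIdentifierList",
            "InstrumentIdentifiers": [
                {"Identifier": "ESU7", "IdentifierType": "Ric"}
            ],
            "UseUserPreferencesForValidationOptions": "false"
        },
        "Condition": {
            "MessageTimeStampIn": "GmtUtc",
            "ReportDateRangeType": "Range",
            "QueryStartDate": "2017-06-28T00:00:00.000Z",
            "QueryEndDate": "2017-06-29T00:00:00.000Z",
            "DisplaySourceRIC": "true"
        }
    }
}

The extraction, polling and download steps stay the same; only the report type, the field names and the Condition differ (no SummaryInterval, since there is no aggregation).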



As for the AWS Direct Download feature: it improves only the download speed of the extracted data, not the time needed to process it. The processing time remains the same as before.
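For reference, a sketch (an assumption based on the TRTH REST API documentation, not taken from this thread) of how Step 4 could opt into the AWS direct download by adding the X-Direct-Download header, reusing the variables from the poster's script. As noted above, this speeds up only the transfer of the finished file, not the extraction processing itself.

requestHeaders = {
    "Prefer": "respond-async",
    "Content-Type": "text/plain",
    "Accept-Encoding": "gzip",
    "X-Direct-Download": "true",  # assumption: header name per the TRTH REST docs; asks DSS to redirect the download to AWS
    "Authorization": "token " + token
}
r4 = requests.get(requestUrl, headers=requestHeaders, proxies=proxies)

requests follows the redirect to the AWS URL and, in recent versions, drops the Authorization header on the cross-host hop, which is what the AWS endpoint expects.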

Answer by Robert Gross · Jul 06, 2017 at 02:59 PM

This issue needs to be investigated. This doesn't appear to be a "how to" question, so we need more information in order to investigate. From an email I received, there is a case number associated with this issue: 05649911. Please provide the notes, or at least the user ID, via case 05649911.



