Long timeseries history requests

Hi, I'm updating a timeseries data downloader that was built against the Eikon C# .NET API (deprecated some years ago) to work with the new LSEG Data Library (lseg.data). I'm hitting a problem with data limits that did not seem to be an issue with the earlier C# API.
When the number of data points in a request hits the LSEG API's limit, only the last portion of the data is returned, without the earlier time slots. With both the lseg.data.get_history API and the lseg.data.content.historical_pricing API this happens completely silently: no error is raised and, as far as I can see, there is no other indication that the data has been truncated. The only way to tell anything is wrong is to notice that the first data point is not at the requested start (and even that check is questionable, since perhaps the stock did not trade the whole day, or the data is simply unavailable for that day, etc.)
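The only heuristic I have found so far looks roughly like the sketch below: compare the earliest timestamp in the returned frame with the requested start. This assumes an open lseg.data session and uses the access-layer ld.get_history call; the RIC and dates are purely illustrative.

import pandas as pd
import lseg.data as ld

ld.open_session()

requested_start = pd.Timestamp('2024-01-02')
df = ld.get_history(
    universe=['VOD.L'],        # illustrative RIC only
    interval='1D',
    start=str(requested_start.date()),
    end='2025-01-23',
)

# Heuristic truncation check: if the earliest returned timestamp is well after
# the requested start, the response may have been cut off at a row limit
# (or the data may simply not exist that far back, which is exactly the ambiguity).
first_returned = pd.Timestamp(df.index.min())
if first_returned > requested_start:
    print(f'Possible truncation: requested {requested_start}, earliest row is {first_returned}')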
Questions:
a) Am I correct here? Or is there some way to detect this other than checking that the first datapoint in the returned DataFrame is not the requested start date?
b) If we receive a given date for any stock in a universe, can we guarantee that the data for that date was received for all stocks requested at the same time? Or is it possible that different stocks are filled to different extents, or truncated differently?
c) If we are requesting intraday data (or interday, for that matter), how can we know whether a time point simply does not exist, versus being missing because the response was truncated, for example when requesting intraday event data?
Answers
Thank you for reaching out to us.
Does the application request historical data for TR.XXX fields or for real-time fields (those without the TR. prefix)?
The library uses different endpoints to retrieve historical data for TR.XXX fields and for real-time fields.
Hi Jirapongse, in this case I'm still trying out the new API, so I'm not even specifying fields yet. Some sample calls would be:
history_1d_bars = hp.summaries.Definition(
    universe=russell_2000,
    interval=hp.Intervals.MINUTE,
    start='2025-01-23 01:00',
    end='2025-01-23 23:00',
    adjustments=[hp.Adjustments.EXCHANGE_CORRECTION],
    sessions=[hp.MarketSession.NORMAL, hp.MarketSession.PRE, hp.MarketSession.POST],
)

history_5d = hp.summaries.Definition(
    universe=univ.get_uni_mappings('US_RT').ric_composite.tolist(),
    interval=hp.Intervals.DAILY,
    start='2024-12-20',
    end='2024-12-27',
    adjustments=hp.Adjustments.EXCHANGE_CORRECTION,
    sessions=hp.MarketSession.NORMAL,
)

history_full_manual_def = hp.summaries.Definition(
    universe=universe,
    interval=hp.Intervals.DAILY,
    start='2000-01-01',
    adjustments=hp.Adjustments.EXCHANGE_CORRECTION,
    sessions=hp.MarketSession.NORMAL,
)
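For context, the surrounding setup I'm assuming (not shown above) is roughly the following; hp is the historical_pricing module from the content layer, and the DataFrame is read from response.data.df:

import lseg.data as ld
from lseg.data.content import historical_pricing as hp

ld.open_session()

# ... define history_1d_bars / history_5d / history_full_manual_def as above ...
response = history_1d_bars.get_data()
df = response.data.df

ld.close_session()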
In particular, at least for the first and last calls above, I have had cases where I either hit a limit or got a 429 error, and nonetheless received non-null data for the same name from the same request in the .data part of the response.
Thank you for the information. It is a historical pricing API.
Regarding the 429 limit, please refer to the answer on this discussion. For the intraday summary data, the limit is 25 requests per second.
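As an illustration only (this is not a mechanism provided by the library itself), a simple client-side throttle can keep bursts of intraday summary requests under such a per-second limit:

import time

class RateLimiter:
    # Naive sliding-window throttle: allow at most max_calls per period seconds.
    def __init__(self, max_calls=25, period=1.0):
        self.max_calls = max_calls
        self.period = period
        self.calls = []

    def wait(self):
        now = time.monotonic()
        # Keep only the timestamps still inside the sliding window.
        self.calls = [t for t in self.calls if now - t < self.period]
        if len(self.calls) >= self.max_calls:
            time.sleep(self.period - (now - self.calls[0]))
        self.calls.append(time.monotonic())

limiter = RateLimiter()
# Call limiter.wait() immediately before each request, e.g. before Definition(...).get_data().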
Thanks Jirapongse, but this doesn't quite answer my question. When we get limited (not by a 429, but by hitting a datapoint limit), how do we tell that it happened, and how do we restart our request?
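Absent a definitive signal, the workaround I'm considering is to walk backwards: if the earliest row returned is later than the requested start, re-issue the request with end set to that earliest timestamp and prepend the result, repeating until the start is reached or a request comes back empty. A rough sketch of that idea (assuming hp and an open session as above, and with no guarantee this paging behaviour is officially supported):

import pandas as pd
import lseg.data as ld
from lseg.data.content import historical_pricing as hp

ld.open_session()

def fetch_with_backfill(ric, start, end):
    # Request the window repeatedly; if the response looks truncated at the
    # front, ask again for everything before the earliest row we received.
    start_ts = pd.Timestamp(start)
    frames = []
    current_end = pd.Timestamp(end)
    previous_earliest = None
    while True:
        response = hp.summaries.Definition(
            universe=ric,
            interval=hp.Intervals.DAILY,
            start=start,
            end=current_end,
        ).get_data()
        df = response.data.df
        if df is None or df.empty:
            break
        frames.append(df)
        earliest = pd.Timestamp(df.index.min())
        if earliest <= start_ts:
            break  # requested start reached
        if previous_earliest is not None and earliest >= previous_earliest:
            break  # no progress; avoid looping forever
        previous_earliest = earliest
        current_end = earliest  # assume truncation and page further back
    if not frames:
        return None
    combined = pd.concat(frames).sort_index()
    return combined[~combined.index.duplicated(keep='first')]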
I am not sure whether it is the data limit or the data is simply unavailable.
Please scope down the issue by providing cut-down, runnable code with the instruments, fields, and parameters.
Then we can verify what the problem is.