Fragility of the API? `get_data` raises `RDError(-1, except_msg)`

Working with the Refinitiv Data Python library, I have some bond identifiers for which I pull the Ultimate Parent Id in order to query further data:
>>> import refinitiv.data as rd
>>> import pandas as pd
...
>>> ParentIDs
Instrument Ultimate Parent Id
0 US013822AC54 5051045063
1 US013822AE11 5051045063
2 US013822AG68 5051045063
3 US013822AH42 5051045063
4 XS2342910689 5057937159
... ... ...
3238 US98955DAA81 5053129374
3239 XS2431015655 5053129374
3240 US98980BAA17 5043459833
3241 US98981BAA08 5041798671
3242 US98980HAA86 4295893641
[3243 rows x 2 columns]
>>> ParentIDs['Ultimate Parent Id'].unique().dropna()
<StringArray>
['5051045063', '5057937159', '4295903254', '4295901827', '5083796424',
'4298089043', '5038907093', '4295906590', '5000065666', '4295900267',
...
'5069399010', '4295908523', '5042953645', '5067514871', '4296579444',
'5082038238', '5053129374', '5043459833', '5041798671', '4295893641']
Length: 1431, dtype: string
However, if I pull the Primary Quote, I get the following error:
>>> RICs = rd.get_data(
... universe = ParentIDs['Ultimate Parent Id'].unique().dropna(),
... fields = ["TR.PrimaryQuote"]
... )
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\...\AppData\Roaming\Python\Python311\site-packages\refinitiv\data\_access_layer\get_data_func.py", line 126, in get_data
raise RDError(-1, except_msg)
refinitiv.data._errors.RDError: Error code -1 | Backend error. 400 Bad Request Requested universes: ['5051045063', '5057937159', '4295903254',
'4295901827', '5083796424', '4298089043', '5038907093', '4295906590', '5000065666', '4295900267', '5082539928', '5063764430', '4295863289', ...
On a second attempt, the same call worked.
- What does this error message mean?
- Is the API "fragile"? It also sometimes takes a very long time just to retrieve some data (e.g. "Ultimate Parent Id").
Best Answer
@fabian.echterling Thanks for your question. During busier periods of high load, some requests are occasionally dropped. The only real solution is to code defensively: wrap the call in a try-except structure and re-present it, perhaps after a small sleep. I will contact the service team, see if there are any specific issues, and report back. Could you also provide the full universe list, so I can try to replicate the problem or check whether there is an issue with the identifiers?
The API limit for get_data is 10,000 datapoints per API call, so you are well within that. I hope this helps.
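The try-except-and-retry approach described above could be sketched like this. This is a minimal sketch: `call_with_retry` is a hypothetical helper (not part of the Refinitiv library), and the commented usage assumes the `rd` module and `ParentIDs` DataFrame from the question.

```python
import time

def call_with_retry(func, *args, retries=3, delay=5.0, **kwargs):
    # Hypothetical helper: re-present a dropped call up to `retries` times,
    # sleeping `delay` seconds between attempts; re-raise if all attempts fail.
    last_exc = None
    for attempt in range(retries):
        try:
            return func(*args, **kwargs)
        except Exception as exc:
            last_exc = exc
            if attempt < retries - 1:
                time.sleep(delay)
    raise last_exc

# Usage (assumes `rd` is refinitiv.data and `ParentIDs` is the DataFrame above):
# RICs = call_with_retry(
#     rd.get_data,
#     universe=ParentIDs['Ultimate Parent Id'].unique().dropna(),
#     fields=["TR.PrimaryQuote"],
#     retries=3,
#     delay=2.0,
# )
```

Catching only `RDError` instead of the broad `Exception` would be tighter in practice, so that genuine bugs still surface immediately.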