Explain the Error "Error code -1 | UDF Core request failed. Gateway Time-out"

Hi Team, can you please explain the error the client is facing? They are pulling data for a list of ISINs and getting the error above. Previously, with their bulk list of ISINs, removing the data items ['TR.NIIssuePricePctPrint', 'TR.NIIssuerUltParentTRBCEcoSec', 'TR.NIOfferPricePrintUniform'] made the error go away and the script worked. However, upon further testing, we found that we can use those three fields with a smaller number of ISINs.
Is this a limitation error? We did not find any documentation for it.
import refinitiv.data as rd

rd.open_session()

df = rd.get_data(
    universe=['XS2624942764', 'US09710H5182', 'XS2591444422'],
    fields=[
        # 'TR.FiTicker',
        'TR.LegalEntityIdentifier',
        'TR.FiIssueDate',
        'TR.FiCurrency',
        # 'TICKER',
        # 'TR.ExchangeTicker',
        'TR.FiOriginalAmountIssued',
        'TR.FiOrgID',
        # 'TR.OrgidCode',
        'TR.FiParentOrgID',
        # 'TR.NACEClassification',
        'TR.FiESGBondType',
        # 'TR.GreenBondFlag',
        'TR.FiMaturityStandardYield',
        'TR.NIIssuePricePctPrint',
        'TR.NIIssuerUltParentTRBCEcoSec',
        'TR.NIOfferPricePrintUniform',
    ],
)
display(df)
Answers
Hi @gjastia, that message usually means the request was not completed by the server. A common reason is request size: requests to the service should not exceed 10,000 datapoints per API call (please see our limits guidance here). I can see that the client's request is already over that limit: 1756 rows x 12 columns = 21,072 datapoints, almost twice the recommended amount. When server load is light, such queries might pass, but during busier periods they may well fail, hence our guidance of 10K datapoints per call. The request also seems to mix reference fields with timeseries fields. Because they are using the get_data function, there is no row fidelity for the timeseries fields; they should use the get_history function to ensure row fidelity across different timeseries. They can open the Codebook App and look in the Examples folder there for get_history and get_data examples. I hope this helps.
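To illustrate the 10,000-datapoint guidance above, the universe can be split into batches sized from the field count, so that rows x columns stays under the limit for each call. This is only a sketch: the batch-size arithmetic follows the rows-times-columns estimate in the answer, and the actual `rd.get_data` calls are left commented out because they require a live session.

```python
MAX_DATAPOINTS = 10_000  # per-request guidance mentioned above

def batched_universe(universe, fields, max_points=MAX_DATAPOINTS):
    """Yield slices of the universe small enough that
    len(slice) * len(fields) stays at or under max_points."""
    batch_size = max(1, max_points // len(fields))
    for i in range(0, len(universe), batch_size):
        yield universe[i:i + batch_size]

# Example shapes from the thread: 12 fields and 1756 instruments
# (placeholder names; the real field and ISIN lists come from the question).
fields = [f'TR.Field{n}' for n in range(12)]
isins = [f'ISIN{i:04d}' for i in range(1756)]

batches = list(batched_universe(isins, fields))
# 10_000 // 12 = 833 ISINs per request, so 1756 instruments -> 3 requests.

# With a live session, each batch would be requested separately and the
# results concatenated, e.g.:
# frames = [rd.get_data(universe=b, fields=fields) for b in batches]
# df = pd.concat(frames, ignore_index=True)
```

Each request then carries at most 833 x 12 = 9,996 datapoints, comfortably under the limit.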