Retrieve and concatenate results returned from historical pricing parallel requests with events

Based on this GitHub example, I learned how to create parallel requests with events, where I need to retrieve snapshot data. However, I have a question:
- How do I concatenate the results to form a larger dataframe? The code below produces an error because df_snap is referenced before assignment. Can someone show me the proper way to concatenate the results with an index equal to the original dataframe, so I can reconcile / join the results with the original dataframe?
df_snap = pd.DataFrame()

def concat_response(response, definition, session):
    if response.is_success:
        df_snap = pd.concat([df_snap, response.data.df], axis=0)
    else:
        print(response.http_status)

for idx, row in data.iterrows():
    asyncio.get_event_loop().create_task(
        historical_pricing.events.Definition(
            universe=row['ricCode'],
            fields=["TRDPRC_1", "MID_PRICE"],
            start=row['GMT Time'],
            count=1,
        ).get_data_async(on_response=concat_response, closure=str(idx))
    )
Answers
-
Thank you for reaching out to us.
You can use the global keyword in the concat_response method.
df_snap = pd.DataFrame()

def concat_response(response, definition, session):
    global df_snap
    print(response.data.df)
    if response.is_success:
        df_snap = pd.concat([df_snap, response.data.df], axis=0)
    else:
        print(response.http_status)
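If you also need the results to line up with the rows of the original dataframe, another option is to bind each row's index into its own callback, collect the responses in a dict keyed by that index, and concatenate and join at the end. This is only a minimal sketch, not the library's prescribed pattern: the import path is assumed, calling get_data_async without the closure argument is an assumption, data is the original dataframe from the question, and results / make_callback / orig_idx are illustrative names.

import asyncio
import pandas as pd
from refinitiv.data.content import historical_pricing  # assumed RD Library import path

results = {}  # original row index -> snapshot dataframe

def make_callback(idx):
    # Bind the original row index into the callback so every response
    # can be tagged with the row it belongs to.
    def on_resp(response, definition, session):
        if response.is_success:
            results[idx] = response.data.df
        else:
            print(idx, response.http_status)
    return on_resp

tasks = [
    asyncio.get_event_loop().create_task(
        historical_pricing.events.Definition(
            universe=row['ricCode'],
            fields=["TRDPRC_1", "MID_PRICE"],
            start=row['GMT Time'],
            count=1,
        ).get_data_async(on_response=make_callback(idx))
    )
    for idx, row in data.iterrows()
]

# Once every task has finished (e.g. after awaiting asyncio.gather(*tasks)),
# stack the per-row frames under the original index and join them back.
df_snap = pd.concat(results, names=['orig_idx'])
joined = data.join(df_snap.reset_index(level=1, drop=True))

Because each response contains a single row (count=1), dropping the inner timestamp level leaves one snapshot row per original index, so the join reconciles cleanly with the original dataframe.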
-
Can I ask, is there a maximum limit to the number of parallel requests that the API can process at one time? For example, if I have a million rows in the dataframe, how many rows can be processed in parallel?
I found that when I send out 100 parallel requests, I receive
ReadTimeout('')
and Task exception was never retrieved
… long list of similar error messages …
-
I found the following limit on the historical pricing API. You may contact the support team directly via MyAccount to confirm this.
If you would like to request a lot of data (millions of requests), please contact your LSEG account team or Sales team directly to discuss this usage.
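In the meantime, one way to avoid the ReadTimeout errors is to cap how many requests are in flight at once rather than creating a task for every row in one go. Below is a minimal sketch using asyncio.Semaphore; it assumes get_data_async can simply be awaited and returns the response object, reuses historical_pricing and data from the code above, and the limit of 20 is only an illustrative number to tune against your own account limits.

import asyncio

MAX_IN_FLIGHT = 20  # illustrative; tune to what your account tolerates
sem = asyncio.Semaphore(MAX_IN_FLIGHT)

async def fetch_one(idx, row):
    # At most MAX_IN_FLIGHT coroutines pass this point at the same time;
    # the rest wait on the semaphore instead of all hitting the API at once.
    async with sem:
        response = await historical_pricing.events.Definition(
            universe=row['ricCode'],
            fields=["TRDPRC_1", "MID_PRICE"],
            start=row['GMT Time'],
            count=1,
        ).get_data_async()
        return idx, response

async def fetch_all(data):
    tasks = [fetch_one(idx, row) for idx, row in data.iterrows()]
    return await asyncio.gather(*tasks)

# In a notebook, where an event loop is already running, you can simply run:
# responses = await fetch_all(data)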