Retrieve and concatenate results returned from historical pricing parallel requests with events

thomas.ng Newcomer
edited December 2024 in Refinitiv Data Platform

Based on this GitHub example, I learned how to create parallel requests with events, where I need to retrieve snapshot data. However, I have a question:

  1. How do I concatenate the results to form a larger dataframe? The code below produces an error, as df_snap is referenced before assignment. Can someone show me the proper way to concatenate the results with an index equal to the original dataframe's, so I can reconcile / join the results with the original dataframe?

df_snap = pd.DataFrame()

def concat_response(response, definition, session):
    if response.is_success:
        # fails: df_snap is referenced before assignment inside the function
        df_snap = pd.concat([df_snap, response.data.df], axis=0)
    else:
        print(response.http_status)

for idx, row in data.iterrows():
    asyncio.get_event_loop().create_task(
        historical_pricing.events.Definition(
            universe=row['ricCode'],
            fields=["TRDPRC_1", "MID_PRICE"],
            start=row['GMT Time'],
            count=1,
        ).get_data_async(on_response=concat_response, closure=str(idx))
    )

Answers

  • Jirapongse ✭✭✭✭✭

    @thomas.ng

    Thank you for reaching out to us.

    You can use the global keyword in the concat_response method.

    df_snap = pd.DataFrame()

    def concat_response(response, definition, session):
        global df_snap
        print(response.data.df)

        if response.is_success:
            df_snap = pd.concat([df_snap, response.data.df], axis=0)
        else:
            print(response.http_status)
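
    To also line each snapshot up with the row of the original dataframe that requested it, the row index can be bound into the callback and applied to the returned frame before concatenating. A minimal sketch, assuming the same refinitiv.data historical_pricing API as in the question and that data is the original dataframe; it binds the index with functools.partial rather than the closure argument:

    import asyncio
    import functools
    import pandas as pd
    import refinitiv.data as rd
    from refinitiv.data.content import historical_pricing

    rd.open_session()   # assumes a configured session, as in the GitHub example

    df_snap = pd.DataFrame()

    def concat_response(row_idx, response, definition, session):
        # row_idx is bound via functools.partial so each result can be
        # re-indexed to match the originating row of the original dataframe
        global df_snap
        if response.is_success:
            result = response.data.df.copy()
            result.index = [row_idx] * len(result)
            df_snap = pd.concat([df_snap, result], axis=0)
        else:
            print(response.http_status)

    tasks = [
        asyncio.get_event_loop().create_task(
            historical_pricing.events.Definition(
                universe=row['ricCode'],
                fields=["TRDPRC_1", "MID_PRICE"],
                start=row['GMT Time'],
                count=1,
            ).get_data_async(on_response=functools.partial(concat_response, idx))
        )
        for idx, row in data.iterrows()
    ]

    # once the tasks have completed (e.g. await asyncio.gather(*tasks) in a notebook),
    # the results can be joined back onto the original dataframe:
    # combined = data.join(df_snap, how='left', rsuffix='_snap')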
    
  • Can I ask, is there a maximum limit to the number of parallel requests that the API can process at one time? For example, if I have a million rows in the dataframe, how many rows can be processed in parallel?

    I found that when I send out 100 parallel requests, I receive ReadTimeout('') and "Task exception was never retrieved" errors … a long list of similar error messages …

  • Jirapongse ✭✭✭✭✭

    @thomas.ng

    I found the following limit on the historical pricing API. You may contact the support team directly via MyAccount to confirm this.

    [image.png: screenshot of the documented request limit for the historical pricing API]

    If you would like to request a lot of data (millions of requests), please contact your LSEG account team or Sales team directly to discuss this usage.
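
Regarding the ReadTimeout errors seen when firing around 100 requests at once: a common client-side mitigation, separate from the server-side limit above, is to cap the number of requests in flight with an asyncio.Semaphore. A minimal sketch, assuming get_data_async() can be awaited directly to obtain the response (as in the library's asynchronous examples) and that data is the original dataframe; the cap of 20 is illustrative only, not a documented figure:

import asyncio
import pandas as pd
from refinitiv.data.content import historical_pricing

async def fetch_one(semaphore, idx, row):
    # the semaphore keeps at most max_concurrent requests in flight,
    # which reduces the chance of client-side ReadTimeout errors
    async with semaphore:
        response = await historical_pricing.events.Definition(
            universe=row['ricCode'],
            fields=["TRDPRC_1", "MID_PRICE"],
            start=row['GMT Time'],
            count=1,
        ).get_data_async()
        return idx, response

async def fetch_all(data, max_concurrent=20):
    semaphore = asyncio.Semaphore(max_concurrent)
    results = await asyncio.gather(
        *(fetch_one(semaphore, idx, row) for idx, row in data.iterrows())
    )
    frames = []
    for idx, response in results:
        if response.is_success:
            frame = response.data.df.copy()
            frame.index = [idx] * len(frame)   # align with the original dataframe's index
            frames.append(frame)
        else:
            print(idx, response.http_status)
    return pd.concat(frames, axis=0) if frames else pd.DataFrame()

# in a notebook:  df_snap = await fetch_all(data)
# in a script:    df_snap = asyncio.run(fetch_all(data))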