429 - Too Many Requests!

Hi everybody,

I have two questions:

1. Is there any way to find out in detail why I am, seemingly at random, getting a 429 response from get_data? I have read the documentation (Eikon Data API Usage and Limits Guideline) very carefully and I can confirm that:

  • I am, as far as I can observe, not sending more than four requests per second (however, is there any way to verify this? See the sketch after this list).
  • I am well below 50 MB per minute.
  • I am well below 10,000 data points per request.
  • I am well below 10,000 requests per day.
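
For what it is worth, the only way I could think of to check the request rate and the data points per request myself is to log a timestamp and the response size around every call, roughly as in the sketch below. The log_call() helper and the call_log data frame are just my own illustration and not part of eikonapir.

    # Sketch only: log when each get_data() request is sent and how many data
    # points come back, so the 4-requests-per-second and 10,000-data-points-
    # per-request limits can be checked afterwards.
    call_log <- data.frame(timestamp = as.POSIXct(character()),
                           data_points = integer())

    log_call <- function(response) {
      n_points <- if (is.data.frame(response)) nrow(response) * ncol(response) else 0L
      call_log <<- rbind(call_log,
                         data.frame(timestamp = Sys.time(), data_points = n_points))
      response
    }

    # Usage: next_df <- log_call(get_data(...))

    # Requests grouped per second of the run:
    table(format(call_log$timestamp, "%Y-%m-%d %H:%M:%S"))
    # Largest single request, in data points:
    max(call_log$data_points)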


2. My lockout duration is not 24 hours but rather around 4-6 hours. Is there any way to understand which timeout duration I am getting, and why?


Kind regards





Tags: eikon-data-api, python api

Accepted answer

@s2782245 OK, so the first point here is that the eikonapir library is a community library, and I'm not sure how up to date it is. It uses our older generation of Eikon Data APIs. We have a more modern set called the Refinitiv Data Libraries (Python, .NET, TypeScript), which one of our customers has ported to R (using Python-R interop via reticulate) and very kindly open-sourced. You can find that here: https://github.com/GreenGrassBlueOcean/RefinitivR.
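
If you want to try that library, installation and a first data call would look roughly like the sketch below. I am quoting the function names (EikonConnect(), EikonGetData()) and their arguments from memory of the repository README, so please treat them as assumptions and verify them against the current version of the package.

    # Rough sketch based on the RefinitivR README; function names and
    # arguments are assumptions; check the repository for the current API.
    # install.packages("devtools")
    # devtools::install_github("GreenGrassBlueOcean/RefinitivR")
    library(Refinitiv)

    # Connect through the locally running Eikon/Workspace desktop
    Eikon <- EikonConnect()

    # Request a couple of TR fields for one instrument
    ex <- EikonGetData(EikonObject  = Eikon,
                       rics         = "CNE100004116",
                       Eikon_fields = c("TR.CompanyName", "TR.SharesHeld"))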

Regarding the throttling limits: they are applied at the server end per user ID, so they simply go off what has been consumed or requested. From the documentation we can see that:

For Call-based and Daily limits, when the limit is reached, the Eikon platform returns an HTTP response with status code 429 and the message "Too many requests, please try again later.". Then, the Python library raises an EikonError exception with the following message:

[Screenshot: the EikonError exception raised for the HTTP 429 response]


So you are falling foul of either the call-based limits or the daily limits. There is a daily limit of 10,000 API calls, which you could be hitting if you are iterating over large numbers of ISINs or over ISIN/year combinations. To deal with this, you need to optimise your calls to account for this additional throttle, which could be what triggers the 429s. Add a count variable before every API call and keep a running total (there is a sketch of this at the end of this answer). Also, I am unsure how the code would run with other App IDs; sharing of IDs is not permitted, see:

Eikon is only licensed for individual use. Eikon users may not share their login credentials, run any instances of Eikon on a server or use, distribute or redistribute data in any way that is inconsistent with their organization’s agreement with us.


But it doesn't matter because, as I mentioned before, the throttles are applied against the underlying user ID. When you hit 10K calls you will be locked out for 24 hours. However, at the moment there is a bug which means the daily counter resets at midnight GMT rather than in your local timezone (I think this might explain the 4-6 hour reset you are experiencing); this will be corrected in the next release. Let me know about the number of API calls you are making. I hope this helps.
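
To make the counting suggestion concrete, something along these lines might work. It is a minimal sketch only: counted_get_data() and the threshold check are illustrative, the 10,000 figure comes from the usage and limits guideline, and get_data() is the eikonapir call used in the code in this thread.

    # Minimal sketch: keep a running count of API calls and stop (or pause)
    # before reaching the documented daily limit of 10,000 calls per user ID.
    daily_call_limit <- 10000L
    api_call_count   <- 0L

    counted_get_data <- function(...) {
      if (api_call_count >= daily_call_limit) {
        stop("Daily API call limit reached; wait for the counter to reset.")
      }
      api_call_count <<- api_call_count + 1L
      message("API call #", api_call_count)
      get_data(...)   # eikonapir::get_data, as used in the code in this thread
    }

    # Usage inside the loop:
    # next_df <- counted_get_data(current_stock, investors_reuters_function,
    #                             parameters = list("SDate" = year_df[q, 1]))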




Hi @s2782245, thanks for your question and sorry to hear about your issue. Can you please post the API call you are using so we can try to replicate it? Are you making asynchronous or sequential API calls? Which library and version are you using? Also, what is the size of your instrument list? Thanks.


Dear jason,

thank you very much for your answer. I am using an R wrapper for the Refinitiv Eikon API, as described in one of the posts in this forum (https://github.com/ahmedmohamedali/eikonapir). Hence, I am using the Eikon Data API (EDAPI). I am making multiple randomized sequential API calls. See the code below as an example:

    #### Loading required packages ####
    library(eikonapir)   # https://github.com/ahmedmohamedali/eikonapir
    library(dplyr)       # rename(), mutate(), %>%
    library(readr)       # write_delim()

    #### Setting up the connection ####
    eikonapir::set_proxy_port(9000L)
    eikonapir::set_app_id('XXXXXX')
    #### Creating Year Parameter ("0CY" ... "-25CY") ####
    year_df <- as.data.frame(0:-25)
    year_df <- year_df %>% rename("yearindikator" = "0:-25")
    year_df <- year_df %>%
      mutate("yearindikator" = paste(year_df$yearindikator, "CY", sep = ""))
    #### Reuters Functions to be pulled ####
    investors_reuters_function <- list(
      "TR.InstrumentType",
      "TR.CompanyName",
      "TR.FreeFloatPct",
      "TR.InvestorFullName.investorpermid",
      "TR.InvestorFullName",  
      "TR.HoldingsDate",
      "TR.EarliestHoldingsDate",
      "TR.SharesHeld",
      "TR.PctOfSharesOutHeld",
      "TR.SharesHeldValue",
      "TR.InvestorType",
      "TR.InvParentType",
      "TR.InvInvmtOrientation",
      "TR.FilingType",
      "TR.ConsHoldFilingDate",
      "TR.NbrOfInstrHeldByInv",
      "TR.InvAddrCountry",
      "TR.NbrOfInstrBoughtByInv",
      "TR.NbrOfInstrSoldByInv")
     
    #### Example ISIN (the full list is truncated here) ####
    # A data frame keeps the nrow()/[i, 1] indexing used below valid
    list_of_ISINS <- data.frame(isin = c("CNE100004116", ...))
     
    #### Main loop to get the needed data ####
    for (i in 1:nrow(list_of_ISINS)) {

      current_stock <- list_of_ISINS[i, 1]

      for (q in 1:nrow(year_df)) {

        # This loop takes one parameter item from the year_df created earlier
        # and attaches it to the get_data() call

        ### To avoid overload, wait a random 1-6 seconds between calls
        tmsleep <- sample(1:6, 1)
        Sys.sleep(tmsleep)

        print(q)

        next_df <-
          get_data(current_stock,
                   investors_reuters_function,
                   parameters = list("SDate" = year_df[q, 1]))

        ### As empty responses are sometimes received for no obvious reason,
        ### a while loop retries up to five times
        counter <- 0
        while (dim(next_df)[1] == 0 & counter < 5) {
          counter <- counter + 1

          tmsleep <- sample(1:5, 1)
          Sys.sleep(tmsleep)

          print(counter)
          next_df <-
            get_data(current_stock,
                     investors_reuters_function,
                     parameters = list("SDate" = year_df[q, 1]))
        }

        ### Saving the answers (one CSV per ISIN, appended year by year)
        savingsname <- paste(list_of_ISINS[i, 1], ".csv", sep = "")
        savingspfad <- paste("P:/17 ...", savingsname, sep = "")
        write_delim(next_df,
                    savingspfad,
                    delim = ";",
                    append = TRUE,
                    col_names = !file.exists(savingspfad))
      }   # end of year loop
    }     # end of ISIN loop


The number of stocks for which these TR fields are requested varies depending on the user's input. The very same code might also run in a different app with a different app_id, which is why I described the approach as multiple randomized sequential API calls. However, a single response rarely exceeds 2 MB, there are never more than two API calls within a single second, and I certainly exceed neither the 50 MB/minute rule nor the daily 5 GB limit...
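
What I am considering next is to make any 429s visible instead of only retrying on empty data frames, roughly as sketched below. I do not know yet how exactly eikonapir surfaces a 429 (as an R error, a message, or an empty result), so the error-message pattern in the sketch is an assumption on my part.

    ### Sketch: wrap get_data() so rate-limit errors are logged and retried
    ### with an increasing wait, instead of only retrying on empty data frames.
    ### How eikonapir reports a 429 is an assumption here; the grepl() pattern
    ### would need to be adjusted to the real error message.
    get_data_with_backoff <- function(..., max_tries = 5) {
      for (attempt in 1:max_tries) {
        result <- tryCatch(
          get_data(...),
          error = function(e) {
            if (grepl("429|Too many requests", conditionMessage(e))) {
              message("Rate limited on attempt ", attempt,
                      "; waiting ", 10 * attempt, " seconds")
              Sys.sleep(10 * attempt)   # simple linear backoff
              NULL
            } else {
              stop(e)                   # re-raise unrelated errors
            }
          }
        )
        if (!is.null(result) && nrow(result) > 0) return(result)
      }
      result   # may still be NULL or empty after max_tries attempts
    }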


Kind regards



