Data limits for Eikon API Proxy

Hello,

I know that Bloomberg has daily limits (max 500,000 hits) and monthly limits (max 7,000 unique securities). Something similar should also apply here:

"Every application using Eikon API Proxy must have an Application ID. An Application ID allows Thomson Reuters to monitor data usage by applications, enforce data limits and throttle any offending applications."

developers.thomsonreuters.com/tr-eikon-scripting-apis-eap-limited-access/eikon-web-and-scripting-apis-beta/quick-start

A) What are exact limits per day/month?


B) If two applications are developed on the same terminal with the same Eikon user ID/password, is there a limit per machine?

C) How can I check my data usage and see whether a limit is approaching?

Best Answer

  • Alex Putkov.1
    Alex Putkov.1 ✭✭✭✭✭
    Answer ✓

    Hi @tommy

    There are no daily or monthly data retrieval quotas in Eikon. To protect the backend, we throttle the requests you submit. In addition, if we determine that the amount of data you consume does not constitute "personal use" as stated in your contract with Thomson Reuters, we may reach out to you to discuss your data consumption needs and suggest products that better fit your data usage. But unlike in Bloomberg, there is no hard cap beyond which you are automatically blocked from submitting further data requests.
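Since throttling rather than a quota is the mechanism described above, client code can simply retry throttled calls with a backoff. A minimal sketch (the eikon package is not used here; `fake_request` is a stand-in for any data call, and the throttling error is simulated with RuntimeError):

```python
import random
import time

def with_backoff(request_fn, max_retries=5, base_delay=1.0):
    """Retry a throttled request with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return request_fn()
        except RuntimeError:                # stand-in for a throttling error
            if attempt == max_retries - 1:
                raise                       # give up after the last retry
            # wait longer after each throttled attempt
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.1))

# Demo: a fake call that is throttled twice, then succeeds.
calls = {"n": 0}
def fake_request():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("throttled")
    return "data"

result = with_backoff(fake_request, base_delay=0.01)
print(result)  # -> data
```

In real use, `request_fn` would wrap the actual Eikon call and catch whatever exception the library raises on throttling.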

Answers

  • tommy
    tommy Explorer

    @Alex Putkov.1 Great, keep up the good work.

  • tr105
    tr105 Explorer

    This answer is no longer up to date, right? Considering the info on this page: https://developers.refinitiv.com/eikon-apis/eikon-data-api/docs?content=49692&type=documentation_item

  • HeikoRR
    HeikoRR Explorer

    The limits now seem so restrictive that I am reconsidering my license. They make quant work very tedious, to the point that I spend more time writing data requests than doing the actual analysis.

    Refinitiv: Please reconsider reasonable data limits that are in line with Bloomberg and other competitors.

    Thank you in advance.

  • Alex Putkov.1
    Alex Putkov.1 ✭✭✭✭✭

    @HeikoRR
    Thank you for your comment. Would you mind being specific about your use case: what data you retrieve, which API calls you make, and which limits you are exhausting? For a detailed description of the data retrieval limits of the Eikon Data APIs, please see
    https://developers.refinitiv.com/eikon-apis/eikon-data-api/docs?content=49692&type=documentation_item

    It's not trivial to compare data retrieval limits through Eikon Data APIs with those in Bloomberg Terminal. We believe that the limits in Eikon are more generous. If you have evidence to the contrary, we'd appreciate you sharing it.

  • HeikoRR
    HeikoRR Explorer

    Hi Alex,

    Thank you for your quick reply.

    Below I detail what I mean and compare it to Bloomberg (I have been using the Bloomberg API for 10+ years).


    The EIKON data documentation states:

    "get_timeseries: The current limit value (10-Oct-2019) is 3,000 data points (rows) for interday intervals and 50,000 data points for intraday intervals. This limit applies to the whole request, whatever the number of requested instrument."

    Assume I want to do a quick analysis on the S&P500 looking at historic prices:

    Open, Low, High, Close and Volume (5 time series), daily, starting in 2007 (13 years ago).

    That is a request of 500 × 5 × 250 × 13 = 8,125,000 data points. Since the data limit is 3,000 data points per request, in theory I have to make 8,125,000 / 3,000 ≈ 2,709 requests.

    Practically it is much worse, since I have to make each request by ticker. Therefore there is a minimum of 500 × 5 = 2,500 requests.

    However, to retrieve e.g. just the daily Close of one constituent, the series has 250 × 13 = 3,250 data points, which is greater than 3,000, so I have to make 2 requests and merge them.

    So for every one of the 500 × 5 time series I have to make 2 requests, which brings the total amount of work to:

    5,000 requests and 2,500 time-series merges.
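The split-and-merge step described above can be sketched as follows. `split_date_range` is a hypothetical helper (not part of the Eikon package) that breaks the 13-year range into windows small enough to stay near the 3,000-row interday limit; the per-window results would then be concatenated (e.g. with pandas.concat):

```python
from datetime import date, timedelta

ROWS_LIMIT = 3000  # interday row limit per get_timeseries request (per the docs)

def split_date_range(start, end, max_rows, calendar_days_per_row):
    """Split [start, end] into windows of at most ~max_rows returned rows.

    Hypothetical helper: assumes roughly `calendar_days_per_row` calendar
    days per returned daily bar (365 / 250 for trading days).
    """
    window_days = int(max_rows * calendar_days_per_row)
    windows, cur = [], start
    while cur <= end:
        stop = min(cur + timedelta(days=window_days - 1), end)
        windows.append((cur, stop))
        cur = stop + timedelta(days=1)
    return windows

# 13 years of daily bars (~3,250 rows) exceeds the 3,000-row limit,
# so the range splits into 2 windows per series.
wins = split_date_range(date(2007, 1, 1), date(2019, 12, 31),
                        ROWS_LIMIT, calendar_days_per_row=365 / 250)
print(len(wins))  # -> 2
```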

    Further, I can make at most 3 requests per second, so the minimum time needed is 5,000 / 3 ≈ 1,667 seconds, or 27.8 minutes.

    EIKON: 5,000 requests and 2,500 time-series merges, 27.8 minutes

    BLOOMBERG: 1 request, no merging, 5 seconds
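The arithmetic above can be checked in a few lines of pure Python (no Eikon calls; the per-request limits are taken from the documentation quoted earlier):

```python
import math

instruments = 500                 # S&P 500 constituents
fields = 5                        # Open, Low, High, Close, Volume
trading_days = 250 * 13           # ~250 trading days/year over 13 years

total_points = instruments * fields * trading_days           # 8,125,000
rows_limit = 3_000                # interday row limit per get_timeseries call

# theoretical floor, if data points could be packed freely across requests
theoretical_requests = math.ceil(total_points / rows_limit)  # 2,709

# practical count: one series per request, and each 3,250-row series
# exceeds the 3,000-row limit, so it needs 2 calls plus a merge
requests_per_series = math.ceil(trading_days / rows_limit)   # 2
practical_requests = instruments * fields * requests_per_series  # 5,000
merges = instruments * fields                                # 2,500

rate_limit = 3                    # requests per second
minimum_seconds = practical_requests / rate_limit            # ~1,667 s
print(practical_requests, merges, round(minimum_seconds / 60, 1))  # 5000 2500 27.8
```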

    Please let me know if I missed anything.