
Eikon API Speed

I am requesting 35 fields for 6,600 stocks, and the request is taking an hour to complete. Is this expected, or do I have an issue somewhere? I have broken the request into chunks of 50 stocks, but this doesn't seem to help.
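For reference, a minimal sketch of the setup described above, assuming the standard eikon package; the app key, ticker_list, and fields values are illustrative placeholders, not taken from the original post:

import eikon as ek
import pandas as pd

ek.set_app_key('YOUR_APP_KEY')  # placeholder app key

ticker_list = ['VOD.L', 'IBM.N']            # stand-ins for the ~6,600 RICs
fields = ['TR.Revenue', 'TR.GrossProfit']   # stand-ins for the 35 requested fields

chunk_size = 50
frames = []
for i in range(0, len(ticker_list), chunk_size):
    # one get_data call per chunk of 50 instruments
    df, err = ek.get_data(ticker_list[i:i + chunk_size], fields)
    frames.append(df)
merged_df = pd.concat(frames, ignore_index=True)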

eikon-data-api#technology

Accepted

Hi @whardwick,

I am not sure if you have shared your full code with me, but with the code below, which I updated a bit from yours, I get the results for 5,000 RICs in about 6 minutes:

import eikon as ek
import numpy as np
import pandas as pd

# ticker_list and fields are as defined in your script
chunks = [ticker_list[i:i + 2000] for i in range(0, len(ticker_list), 2000)]
max_retries = 3
merged_df = pd.DataFrame()
for chunk in chunks:
    retries = 0
    while retries < max_retries:
        try:
            df, err = ek.get_data(chunk, fields)
            df.replace('', np.nan, inplace=True)
            merged_df = pd.concat([merged_df, df], ignore_index=True)
        except Exception:
            # the request failed - count the retry and try this chunk again
            retries += 1
            print(retries)
            continue
        # success - move on to the next chunk
        break

Please try with this and let me know how it goes. Please note that you may increase the chunk size even more and it will run quicker; however, it may throw bad-request errors. In any case, a chunk size of 2,000 seemed stable for me.
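For what it's worth, a sketch of that trade-off, assuming a hypothetical helper (fetch_with_fallback is not part of the eikon library) that starts with large chunks for speed and halves the chunk size whenever a request fails:

import eikon as ek
import pandas as pd

def fetch_with_fallback(rics, fields, chunk_size=2000, min_chunk_size=250):
    # Illustrative only: request in large chunks, halving the chunk size
    # whenever a chunk raises an error, down to min_chunk_size.
    frames = []
    i = 0
    while i < len(rics):
        try:
            df, err = ek.get_data(rics[i:i + chunk_size], fields)
            frames.append(df)
            i += chunk_size
        except Exception:
            if chunk_size <= min_chunk_size:
                raise  # already at the smallest chunk size; surface the error
            chunk_size //= 2  # bigger chunks are faster but less stable

    return pd.concat(frames, ignore_index=True)

# Usage (assuming ticker_list and fields are defined as in the code above):
# merged_df = fetch_with_fallback(ticker_list, fields)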


Best regards,

Haykaz


Thanks, Haykaz. Perhaps it's the fields that are the issue. They exceed the character limit, so I have uploaded them: fields.txt

fields.txt (1.8 KiB)
I have got the fields; it's just that your while loop with try/except doesn't seem complete. Please try with my example and let me know if that improves things.
Yes, that's much faster, thanks a lot.

Hi @whardwick,

I would say that is expected considering the number of stocks you are requesting. Would you mind sharing your code here, so I can have a look and see if there is a way to optimize it?


Best regards,

Haykaz


Hi Haykaz, thanks for the response:

# breaks tickers into chunks
chunks = [ticker_list[i:i + 200] for i in range(0, len(ticker_list), 200)]
max_retries = 3
for chunk in chunks:
    retries = 0
    while retries < max_retries:
        try:
            df, err = ek.get_data(chunk, fields)
            df.replace('', np.nan, inplace=True)
            merged_df = pd.concat([merged_df, df], ignore_index=True)
