Continuously getting backend errors and internal server errors.

Hi, I have around 6,000 RICs and I want to fetch the earnings data for them on a daily basis. I'm using Eikon, but I continuously get backend errors and internal server errors. Even when I run the request for all 6,000 RICs, I get data back for only about 3,000 to 3,500 of them. I tried adding a time.sleep of 1 or 2 seconds between requests, but that doesn't work either. How can I solve this issue? Is Eikon capable of fetching the data for all the RICs, or is there another solution? I need to fetch the data for all of them on every business day.
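For reference, a minimal sketch of the kind of loop described above, assuming a list rics holding the ~6,000 RICs and an illustrative earnings field such as TR.EPSActValue (the actual field used is not stated in the question):
import time
import eikon as ek
import pandas as pd

ek.set_app_key('YOUR_APP_KEY')  # hypothetical placeholder

chunk_size = 50                 # keep each request small
frames = []
for i in range(0, len(rics), chunk_size):
    chunk = rics[i:i + chunk_size]
    df, err = ek.get_data(chunk, fields=['TR.EPSActValue', 'TR.EPSActValue.date'])
    frames.append(df)
    time.sleep(2)               # pause between requests, as attempted above
earnings_df = pd.concat(frames, ignore_index=True)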
Answers
-
Hi @aramyan.h
Thank you for the response. I'm attempting to get the market cap, using the code below. I get an HTTP timeout exception if I increase the chunk size, and a backend error if I provide one RIC at a time.
I frequently receive the backend error even when I run it on a small list of RICs.
Our goal is to set up regular tasks to collect data such as market cap, earnings, mergers and acquisitions, estimates, and numerous other factors.
So how can I get the data without the errors?
import pandas as pd
import eikon as ek

chunk_size = 50
merged_df = pd.DataFrame()
try:
    for i in range(0, len(Final_df), chunk_size):
        rics_chunk = Final_df['ric'].iloc[i:i + chunk_size].tolist()
        df, err = ek.get_data(rics_chunk,
                              fields=['TR.MarketCapLocalCurn', 'TR.MarketCapLocalCurn.date'],
                              parameters={'SDate': '2010-01-01', 'EDate': '2023-11-07', 'Frq': 'D'})
        merged_chunk = pd.merge(df, Final_df.iloc[i:i + chunk_size],
                                left_on='Instrument', right_on='ric', how='left')
        merged_df = pd.concat([merged_df, merged_chunk])
        break  # NOTE: this exits after the first chunk; remove it to process all chunks
except Exception as e:
    print(f"Error occurred: {e}")
-
Hi @vishnu01,
Have you looked at the thread I shared? There is an example of how to do that. I have tried to modify your code; you may try it and see if it works:
chunk_size = 50
merged_df = pd.DataFrame()
for i in range(0, len(Final_df), chunk_size):
    retries = 0
    rics_chunk = Final_df['ric'].iloc[i:i + chunk_size].tolist()
    while retries < 3:
        try:
            df, err = ek.get_data(rics_chunk,
                                  fields=['TR.MarketCapLocalCurn', 'TR.MarketCapLocalCurn.date'],
                                  parameters={'SDate': '2010-01-01', 'EDate': '2023-11-07', 'Frq': 'D'})
            merged_chunk = pd.merge(df, Final_df.iloc[i:i + chunk_size],
                                    left_on='Instrument', right_on='ric', how='left')
            merged_df = pd.concat([merged_df, merged_chunk])
        except Exception as e:
            print(f"Error occurred: {e}")
            retries += 1
            continue
        break  # request succeeded; move on to the next chunk

Best regards,
Haykaz
-
Hi @aramyan.h,
When I attempted to use the code you provided above, Eikon gave me two errors. I tried reducing the chunk size all the way to 10, but the problem persisted.
Error code 408 | HTTP TimeoutException, and {"code": 504, "message": "A remote server did not reply in a timely fashion."}
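As a side note, the 408 means the client gave up before the server responded, and the eikon library allows raising that client-side limit. A minimal sketch, assuming an eikon version recent enough to expose set_timeout:
import eikon as ek

ek.set_app_key('YOUR_APP_KEY')  # hypothetical placeholder
ek.set_timeout(300)             # allow each request up to 300 seconds before raising a 408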
-
Hi @vishnu01,
I think your issue is coming from the line merged_chunk = pd.merge(df, Final_df.iloc[i:i + chunk_size], left_on='Instrument', right_on='ric', how='left'), not from the API.
I have commented out that line and tested with a chunk size of 100, and it works fine.
I have also indented the break so that it sits inside the while loop, as it should (updated in the code above).
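If the per-chunk merge is indeed the culprit, one option (a sketch of a possible rework, not tested against your data) is to collect the raw chunks and merge against Final_df once at the end:
chunk_size = 100
frames = []
for i in range(0, len(Final_df), chunk_size):
    rics_chunk = Final_df['ric'].iloc[i:i + chunk_size].tolist()
    df, err = ek.get_data(rics_chunk,
                          fields=['TR.MarketCapLocalCurn', 'TR.MarketCapLocalCurn.date'],
                          parameters={'SDate': '2010-01-01', 'EDate': '2023-11-07', 'Frq': 'D'})
    frames.append(df)

# a single merge at the end instead of one merge per chunk
merged_df = pd.concat(frames, ignore_index=True).merge(
    Final_df, left_on='Instrument', right_on='ric', how='left')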
Best regards,
Haykaz
-
Hi @aramyan.h, I tried your suggestions, and even when I reduce the chunk size to 50 and apply the same logic, I still receive the same error. Are these errors related to the server or to my Eikon desktop? I'm running Eikon version 4.0.64 and Python version 3.11.3. When I use chunk_size=100, I get the HTTP timeout exception.
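One more variable worth testing (an assumption about the cause, not confirmed in this thread): with Frq='D' from 2010 to 2023, each 50-RIC chunk requests on the order of 170,000 rows, so splitting the date range into smaller windows shrinks every response. A sketch:
# split the 2010-2023 range into one-year windows per chunk
frames = []
for year in range(2010, 2024):
    sdate = f'{year}-01-01'
    edate = f'{year}-12-31' if year < 2023 else '2023-11-07'
    for i in range(0, len(Final_df), chunk_size):
        rics_chunk = Final_df['ric'].iloc[i:i + chunk_size].tolist()
        df, err = ek.get_data(rics_chunk,
                              fields=['TR.MarketCapLocalCurn', 'TR.MarketCapLocalCurn.date'],
                              parameters={'SDate': sdate, 'EDate': edate, 'Frq': 'D'})
        frames.append(df)
merged_df = pd.concat(frames, ignore_index=True)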