Timeout Error retrieving data

Hi everyone,
I'm reaching out for some assistance with an Eikon Data API query I've been working on.
I want to retrieve data on the institutional ownership of 300 companies from 2002 to 2023, but I'm encountering a timeout error (Error code 408 | Request timeout occurred) when executing the following code:
import eikon as ek
import pandas as pd

ek.set_app_key('YOUR_APP_KEY')  # placeholder - required before any data request
ek.set_timeout(100)
instruments = ['EQNR.OL' ,'DNB.OL' ,'NHY.OL' ,'TEL.OL' ,'YAR.OL' ,'AKRBP.OL' ,'MOWI.OL' ,'ORK.OL' ,'STB.OL' ,'TOM.OL' ,'GJFG.OL' ,'KOG.OL' ,'ADEA.OL' ,'SALM.OL' ,'NOD.OL' ,'SCHB.OL' ,'SCHA.OL' ,'SRBNK.OL' ,'MING.OL' ,'BRGB.OL' ,'VAR.OL' ,'AKER.OL' ,'TGS.OL' ,'AUTO.OL' ,'PROT.OL' ,'EPR.OL' ,'ELK.OL' ,'LSG.OL' ,'ATEA.OL' ,'WAWI.OL' ,'AKSOA.OL' ,'HAUTO.OL' ,'ENTRA.OL' ,'SCATC.OL' ,'NEL.OL' ,'VEI.OL' ,'DNO.OL' ,'CRAYN.OL' ,'AUSS.OL' ,'MPCC.OL' ,'WWI.OL' ,'SPOLS.OL' ,'GSFG.OL' ,'ACCA.OL' ,'HEX.OL' ,'BONHR.OL' ,'EQNR.N' ,'AKH.OL' ,'BNOR.OL' ,'NONG.OL' ,'PGS.OL' ,'NAS.OL' ,'RECSI.OL' ,'NYKD.OL' ,'NSKOG.OL' ,'GCC.OL' ,'BELCO.OL' ,'AFK.OL' ,'ASA.OL' ,'BEWI.OL' ,'ELMRA.OL' ,'ELO.OL' ,'HPUR.OL' ,'LINK.OL' ,'PEXIP.OL' ,'SBOS.OL' ,'VOLUE.OL' ,'XXL.OL' ,'DOFG.OL' ,'KIT.OL' ,'RANA.OL' ,'KID.OL' ,'HAVI.OL' ,'SOFF.OL' ,'AZT.OL' ,'BOUV.OL' ,'MEDI.OL' ,'MULTI.OL' ,'SPOG.OL' ,'SVEG.OL' ,'SPOT.OL' ,'OPRA.OQ' ,'PHO.OL' ,'IDEX.OL' ,'BGBIO_r.OL' ,'AFGA.OL' ,'OLT.OL' ,'ULTI.OL' ,'ODF.OL' ,'MORGS.OL' ,'SOONS.OL' ,'WWIB.OL' ,'PENR.OL' ,'ABGA.OL' ,'KCCK.OL' ,'PARB.OL' ,'SALME.OL' ,'B2I.OL' ,'SMCRT.OL' ,'SADG.OL' ,'HELG.OL' ,'CLOUD.OL' ,'AKAST.OL' ,'ZAP.OL' ,'KOA.OL' ,'AMSCM.OL' ,'SATSS.OL' ,'ELABS.OL' ,'MGN.OL' ,'NORBT.OL' ,'AGLX.OL' ,'STRO.OL' ,'SOR.OL' ,'POL.OL' ,'ACR.OL' ,'TRER.OL' ,'ABL.OL' ,'AKBM.OL' ,'AURG.OL' ,'ZAL.OL' ,'OKEA.OL' ,'TOTG.OL' ,'VOW.OL' ,'MOBAM.OL' ,'OTEC.OL' ,'CARAC.OL' ,'AKVA.OL' ,'PLT.OL' ,'BGBIO.OL' ,'HBC.OL' ,'KMCP.OL' ,'EWIND.OL' ,'NRC.OL' ,'SAGAS.OL' ,'KOMPLK.OL' ,'TRMED.OL' ,'HUNT.OL' ,'SOAG.OL' ,'RING.OL' ,'SNI.OL' ,'SUBC.OL' ,'FRO.OL' ,'GIG.OL' ,'SIOFF.OL' ,'BWO.OL' ,'NOM.OL' ,'FLNG.OL' ,'NORAM.OL' ,'BAKKA.OL' ,'ODLO.OL' ,'BWLPG.OL' ,'AGAS.OL' ,'GOGL.OL' ,'BORR.OL' ,'20202.OL' ,'OET.OL' ,'HAFNI.OL' ,'BWE.OL' ,'CADLR.OL' ,'DVD.OL' ,'NORSE.OL' ,'ECIT.OL' ,'HSHP.OL' ,'CLCO.OL' ,'OTL.OL' ,'SDRL.OL' ,'DDRIL.OL' ,'SEAPT.OL' ,'NORCO.OL' ,'AASB.OL' ,'ABSA.OL' ,'ABTEC.OL' ,'ADSA.OL' ,'AEGA.OL' ,'AFISH.OL' ,'AIRX.OL' ,'ALNG.OL' ,'ANDF.OL' ,'AQUIL.OL' ,'ARGEO.OL' 
,'ARRA.OL' ,'AURAA.OL' ,'AYFIE.OL' ,'BALT.OL' ,'BBERG.OL' ,'BCS.OL' ,'BFISH.OL' ,'BIEN.OL' ,'BMA.OL' ,'BOR.OL' ,'BSPC.OL' ,'CAMBI.OL' ,'CAPSL.OL' ,'CIRCA.OL' ,'CODEC.OL' ,'CRNA.OL' ,'CYVIZ.OL' ,'DSRT.OL' ,'EAM.OL' ,'EIOF.OL' ,'ELIMP.OL' ,'EMGS.OL' ,'ENDUR.OL' ,'ENERG.OL' ,'ENSU.OL' ,'EQVA.OL' ,'EXTX.OL' ,'GEM.OL' ,'GENT.OL' ,'GEOS.OL' ,'GIGA.OL' ,'GOD.OL' ,'GRONG.OL' ,'GYL.OL' ,'HAVH.OL' ,'HDLY.OL' ,'HKY.OL' ,'HRGI.OL' ,'HSPG.OL' ,'HUDL.OL' ,'HYN.OL' ,'HYON.OL' ,'HYPRO.OL' ,'IFISH.OL' ,'INDCT.OL' ,'ININ.OL' ,'INSTA.OL' ,'IOX.OL' ,'ISLAX.OL' ,'ITERA.OL' ,'IWS.OL' ,'JAREN.OL' ,'KRAB.OL' ,'KYOTO.OL' ,'LEAL.OL' ,'LIFEA.OL' ,'LOKO.OL' ,'LUMI.OL' ,'LYTIX.OL' ,'MASM.OL' ,'MELG.OL' ,'MVE.OL' ,'MVWM.OL' ,'NAVA.OL' ,'NBX.OL' ,'NCOD.OL' ,'NEXT.OL' ,'NISB.OL' ,'NKR.OL' ,'NOAP.OL' ,'NOHAL.OL' ,'NORDH.OL' ,'NORTH.OL' ,'NSOL.OL' ,'NTG.OL' ,'NTI.OL' ,'NUMND.OL' ,'OBSRV.OL' ,'OBXEDNBN.OL' ,'OCEANO.OL' ,'OMDA.OL' ,'OSUN.OL' ,'OTOVO.OL' ,'OTS.OL' ,'PCIB.OL' ,'PHLY.OL' ,'PNOR.OL' ,'PPGP_p.OL' ,'PROXI.OL' ,'PRSO.OL' ,'RCR.OL' ,'REACH.OL' ,'ROMER.OL' ,'ROMSB.OL' ,'SB68.OL' ,'SCANA.OL' ,'SKAND.OL' ,'SKUE.OL' ,'SMOP.OL' ,'SNOR.OL' ,'SOFTX.OL' ,'SOGNS.OL' ,'SPIR.OL' ,'STST.OL' ,'STSU.OL' ,'SUNSB.OL' ,'TECH.OL' ,'TECO.OL' ,'TEKNA.OL' ,'TYSB.OL' ,'VGM.OL' ,'VISTN.OL' ,'VVL.OL' ,'WEST.OL' ,'WSTEP.OL' ,'XPLRA.OL' ,'ZWIPEZ.OL']
def split_instruments(instruments, chunk_size=5):
    """Yield successive chunk_size chunks from instruments."""
    for i in range(0, len(instruments), chunk_size):
        yield instruments[i:i + chunk_size]
instrument_chunks = list(split_instruments(instruments))
dfs = []
for chunk in instrument_chunks:
    # Fetch data for the current chunk of instruments
    df_chunk = ek.get_data(
        instruments=chunk,
        fields=[
            'TR.PctOfSharesOutHeld.date',
            'TR.PctOfSharesOutHeld',
            'TR.InvestorFullName'
        ],
        parameters={'SDate': '2002-01-01', 'EDate': '2024-01-01', 'Frq': 'FY'}
    )[0]
    # Keep only rows for the three large index investors
    filtered_df = df_chunk[df_chunk['Investor Full Name'].str.contains(
        "Vanguard|BlackRock|State Street", case=False, na=False)].copy()
    filtered_df['Date'] = pd.to_datetime(filtered_df['Date'])
    filtered_df['Date'] = filtered_df['Date'].dt.year
    filtered_df.rename(columns={'Date': 'Year'}, inplace=True)
    dfs.append(filtered_df)

final_df = pd.concat(dfs, ignore_index=True)
print(final_df)
Until about a month ago, this code worked fine for fetching data on up to 50 companies. Now, even with the same companies as before, it returns a timeout error (Error code 408 | Request timeout occurred), so I'm seeking guidance on how to resolve this. Any suggestions or optimizations would be greatly appreciated.
Thank you all in advance for your help!
Best Answer
Hi @jon.alonso ,
I can see you have already increased the timeout and are using chunks. Another approach I might suggest is to introduce a while loop to handle timeout extensions, as described in this thread: Recommended code pattern for handling Eikon get_data timeout - Forum | Refinitiv Developer Community. In the meantime, I will check with the product team to see if there are issues on the API side.
Best regards,
Haykaz
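For reference, the retry pattern from that thread can be sketched generically. A minimal sketch, assuming a hypothetical helper name `fetch_with_retry`; the `flaky_fetch` stub stands in for a real `ek.get_data` call so the example runs without an Eikon session (with the real API you would catch `ek.EikonError` instead of the generic `Exception`):

```python
import time

def fetch_with_retry(fetch, max_retries=5, base_delay=1.0):
    """Call fetch() until it succeeds, sleeping with exponential
    backoff between attempts. Re-raises the last error on exhaustion."""
    last_err = None
    for attempt in range(max_retries):
        try:
            return fetch()
        except Exception as e:  # with eikon: except ek.EikonError as e
            last_err = e
            delay = base_delay * (2 ** attempt)
            print(f"Attempt {attempt + 1} failed: {e}; retrying in {delay:.2f}s")
            time.sleep(delay)
    raise last_err

# Stub standing in for ek.get_data: fails twice, then succeeds.
calls = {"n": 0}

def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("Error code 408 | Request timeout occurred")
    return "data"

result = fetch_with_retry(flaky_fetch, max_retries=5, base_delay=0.01)
print(result)  # "data" after two failed attempts
```

Exponential backoff is gentler on the backend than a fixed sleep when the service is under load, which may matter if the 408s are capacity-related.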
Answers
Hi @aramyan.h, I solved the issue by forcing the code to retry several times. I'm sharing the solution in case anyone finds it valuable:
import time
import eikon as ek
import pandas as pd

ek.set_timeout(200)

instruments = ['EQNR.OL', 'DNB.OL', ...]

dataframes = {}

# Loop through each instrument and fetch data, retrying on failure
for instrument in instruments:
    success = False
    retries = 100
    for attempt in range(retries):
        try:
            df, err = ek.get_data(
                [instrument],
                ['TR.CompanyName', 'TR.PctOfSharesOutHeld.date',
                 'TR.PctOfSharesOutHeld', 'TR.InvestorFullName'],
                parameters={'SDate': '2002-01-01', 'EDate': '2024-01-02', 'Frq': 'FY'}
            )
            if err is None:
                dataframes[instrument] = df
                success = True
                print(f"Data successfully retrieved for {instrument}")
                break  # Exit the retry loop on success
        except ek.EikonError as e:
            print(f"Attempt {attempt + 1} failed for {instrument}: {e}")
            time.sleep(5)  # Wait 5 seconds before retrying
    if not success:
        print(f"Failed to retrieve data for {instrument} after {retries} attempts.")
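Once the per-instrument frames are collected in `dataframes`, they still need to be combined into one table as in the original question. A minimal sketch with synthetic stand-in frames (the column names are assumptions based on the fields requested; a real run would concatenate the frames returned by `ek.get_data`):

```python
import pandas as pd

# Synthetic stand-ins for the per-instrument frames from ek.get_data
dataframes = {
    'EQNR.OL': pd.DataFrame({'Instrument': ['EQNR.OL'],
                             'Date': ['2022-12-31'],
                             'Investor Full Name': ['BlackRock']}),
    'DNB.OL': pd.DataFrame({'Instrument': ['DNB.OL'],
                            'Date': ['2022-12-31'],
                            'Investor Full Name': ['Vanguard']}),
}

# Stack all per-instrument frames and derive the Year column
final_df = pd.concat(dataframes.values(), ignore_index=True)
final_df['Year'] = pd.to_datetime(final_df['Date']).dt.year
print(final_df[['Instrument', 'Year', 'Investor Full Name']])
```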