Error code 408 | Request timeout occured - JSON request

When I run my Python file from the terminal, it is supposed to download a large quantity of data and export it as a CSV file. Instead I get error code 408 and a traceback about a JSON request. How do I fix this?
(base) U:\Castellain\refinitiv>python dailyAPIcode.py
Request timeout occured
Traceback (most recent call last):
File "dailyAPIcode.py", line 25, in <module>
'TR.RNSARPctOSPostTrans'])
File "C:\Users\william\AppData\Local\Continuum\anaconda3\lib\site-packages\eikon\data_grid.py", line 186, in get_data
result = eikon.json_requests.send_json_request(_endpoint, payload, debug=debug)
File "C:\Users\william\AppData\Local\Continuum\anaconda3\lib\site-packages\eikon\json_requests.py", line 118, in send_json_request
_check_server_error(result)
File "C:\Users\william\AppData\Local\Continuum\anaconda3\lib\site-packages\eikon\json_requests.py", line 194, in _check_server_error
raise EikonError(int(server_response['ErrorCode']), error_message)
eikon.eikonError.EikonError: Error code 408 | Request timeout occured
FROM THE .PY FILE:
#import packages
import eikon as ek # the Eikon Python wrapper package
import pandas as pd
import numpy as np
import datetime
from datetime import timedelta, date, datetime
from pandas.tseries.offsets import BDay
#connects to Bill's Eikon terminal
ek.set_app_key('xxxxxxxxxxxxxxxxxxxxx')
#Buyback fields with dynamic date constraints for a list of RICs AND export as CSV to U:Castellain/refinitiv
df, e = ek.get_data("lists('Inv Trust List')", "TR.RIC")
ric_list = df['Instrument'].tolist()
df, e = ek.get_data(ric_list,
                    ['TR.RNSFilerName',
                     'TR.RNSAnnouncedDate',
                     'TR.RNSTransactionType',
                     'TR.RNSARNumShrsTransacted',
                     'TR.RNSARPctOSTransacted',
                     'TR.RNSARTransactionPrice',
                     'TR.RNSARMktValTransaction',
                     'TR.RNSARTotShrsPostTrans',
                     'TR.RNSARPctOSPostTrans'])
end_date = date.today()
start_date = end_date - timedelta(days=365)
end_date_str = datetime.strftime(end_date, "%Y-%m-%d")
start_date_str = datetime.strftime(start_date, "%Y-%m-%d")
df['RNS Announced Date'] = pd.to_datetime(df['RNS Announced Date'])
mask = (df['RNS Announced Date'] > start_date_str) & (df['RNS Announced Date'] <= end_date_str)
df = df.loc[mask]
df.rename(columns={'RNS AR Price (at Transaction) - £': 'RNS AR Price (at Transaction) GBP',
                   'RNS AR Market Value of Transaction - £': 'RNS AR Market Value of Transaction - GBP'},
          inplace=True)
#create file name and export
todays_date = date.today()
todays_date_str = datetime.strftime(todays_date, "%Y%m%d")
df.to_csv('Daily API Download_' + todays_date_str + '.csv')
Best Answer
Hi @bill39
Please read the API limitation guide at https://developers.refinitiv.com/eikon-apis/eikon-data-api/docs?content=49692&type=documentation_item
You are most likely exceeding the limit (10,000 records per API call).
I do not know what is inside your lists('Inv Trust List'), but assuming it is a large list, as you mentioned, it may be the culprit.
You can split it into smaller groups, one group per API call.
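As a rough sanity check before the call (just a sketch: estimate_records and rows_per_ric are hypothetical names, I am assuming one record per field per returned row, and RNS fields usually return many rows per RIC, so treat the result as a lower bound), you can estimate how big a single request would be:
#rough sanity check (sketch only): lower-bound estimate of the records returned
#by one ek.get_data call, assuming one record per field per returned row
fields = ['TR.RNSFilerName', 'TR.RNSAnnouncedDate', 'TR.RNSTransactionType',
          'TR.RNSARNumShrsTransacted', 'TR.RNSARPctOSTransacted',
          'TR.RNSARTransactionPrice', 'TR.RNSARMktValTransaction',
          'TR.RNSARTotShrsPostTrans', 'TR.RNSARPctOSPostTrans']

def estimate_records(rics, fields, rows_per_ric=1):
    #RNS transaction fields typically return many rows per instrument,
    #so pass a realistic rows_per_ric for your universe
    return len(rics) * rows_per_ric * len(fields)

#e.g. 300 instruments averaging 20 RNS rows each:
#300 * 20 * 9 = 54,000 records, well above a 10,000-per-call limit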
Here is modified code that breaks the list down into multiple smaller groups.
Please note that I added the time package (import time).
#import packages
import eikon as ek # the Eikon Python wrapper package
import pandas as pd
import numpy as np
import datetime
import time
from datetime import timedelta, date, datetime
from pandas.tseries.offsets import BDay
ek.set_app_key('xxxxxxxxxxxxxxxxxxxxx')
#assuming these are the RIC codes in ric_list; in your program it is read from lists('Inv Trust List')
ric_list = ['AAL.L','ABF.L','ADML.L','AHT.L','ANTO.L','AV.L','AZN.L','BAES.L']
#split them into groups of 'ric_per_call' RICs in array_ric_list
ric_per_call = 3 #change to any number of RICs per API call
array_ric_list = []
i = 0
while i < len(ric_list)/ric_per_call:
    array_ric_list.append(ric_list[i*ric_per_call:(i+1)*ric_per_call])
    i += 1
#loop API call between groups of RIC
end_date = date.today()
start_date = end_date - timedelta(days=365)
end_date_str = datetime.strftime(end_date, "%Y-%m-%d")
start_date_str = datetime.strftime(start_date, "%Y-%m-%d")
todays_date = date.today()
todays_date_str = datetime.strftime(todays_date, "%Y%m%d")
addHeader = True #used to add the header to the csv only once
for riccodes in array_ric_list:
    print("API calls for:", riccodes)
    df, e = ek.get_data(riccodes,
                        ['TR.RNSFilerName', 'TR.RNSAnnouncedDate', 'TR.RNSTransactionType',
                         'TR.RNSARNumShrsTransacted', 'TR.RNSARPctOSTransacted',
                         'TR.RNSARTransactionPrice', 'TR.RNSARMktValTransaction',
                         'TR.RNSARTotShrsPostTrans', 'TR.RNSARPctOSPostTrans'])
    df['RNS Announced Date'] = pd.to_datetime(df['RNS Announced Date'])
    mask = (df['RNS Announced Date'] > start_date_str) & (df['RNS Announced Date'] <= end_date_str)
    df = df.loc[mask]
    df.rename(columns={'RNS AR Price (at Transaction) - £': 'RNS AR Price (at Transaction) GBP',
                       'RNS AR Market Value of Transaction - £': 'RNS AR Market Value of Transaction - GBP'},
              inplace=True)
    if addHeader:
        df.to_csv('Daily API Download_' + todays_date_str + '.csv', header=True)
        addHeader = False #only add the header one time
    else:
        df.to_csv('Daily API Download_' + todays_date_str + '.csv', mode='a', header=False)
    time.sleep(5) #delay between each API call
print('Done')

Here is the result:
API calls for: ['AAL.L', 'ABF.L', 'ADML.L']
API calls for: ['AHT.L', 'ANTO.L', 'AV.L']
API calls for: ['AZN.L', 'BAES.L']
Done
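If one of the smaller groups still times out, another option (only a sketch: get_data_with_retry is a hypothetical helper, the retry count and delay are arbitrary, and the err.code attribute is assumed from the EikonError constructor shown in your traceback, so adjust to your eikon version) is to catch the 408 and retry that group:
import time
import eikon as ek
from eikon.eikonError import EikonError #exception class shown in the traceback

def get_data_with_retry(rics, fields, retries=3, delay=5):
    #call ek.get_data and retry a few times when the request times out (error 408)
    for attempt in range(1, retries + 1):
        try:
            return ek.get_data(rics, fields)
        except EikonError as err:
            if getattr(err, 'code', None) == 408 and attempt < retries:
                print('Timeout for', rics, '- retry', attempt, 'of', retries)
                time.sleep(delay)
            else:
                raise

In the loop above you would then call df, e = get_data_with_retry(riccodes, fields) with the same field list, in place of the direct ek.get_data call.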
Answers
Hello @bill39
Please refer to this question as it is similar.