Hello,
I'm using the Refinitiv Data Library for Python, specifically rd.get_history(), and I've been getting errors for too many requests.
We were pulling history for a total of 6,458 RICs, batching the requests into loops of 100 RICs and 3 fields. We had only submitted around 100 requests before we got the following:
RDError: Error code 429 | Too many requests, please try again later.
The code is as follows:
import time
import refinitiv.data as rd

fields_TS = ['B_YLD_1', 'OAS_BID', 'AST_SWPSPD']
df1 = []
counter = 0
RICS_per_loop_TS = 100

for k in range(0, len(RICs), RICS_per_loop_TS):
    batch = RICs[k:k + RICS_per_loop_TS]
    counter += 1
    # loop over the same batch of RICs, one request per field in fields_TS
    for i in range(3):
        df0 = rd.get_history(
            universe=batch,
            start="2016-07-23",
            end="2024-07-23",
            fields=fields_TS[i],
            interval='1W'
        )
        time.sleep(0.25)
        df1.append(df0)
I have removed all the irrelevant bits of code, but essentially in every loop we pull data for 100 RICs, making three separate requests, one per field.
The API was also only returning a response roughly every 30 seconds, and it worked consistently at that rate for around 100 requests over about 45 minutes, so I would assume this is some daily limit that we've run into.
But I also don't think I've reached the 10,000-requests-per-day limit, given I've submitted fewer than 200 requests today. The file only added up to around 65 MB after 45 minutes, so I don't think we've reached the daily volume limit either.
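In case it's relevant, the workaround I'm considering is wrapping each request in a generic retry with exponential backoff, so a transient 429 just delays the batch instead of killing the run. This is only a sketch: the helper below is my own code, not part of the Refinitiv library, and I'm assuming the 429 surfaces as an exception I can catch (e.g. RDError) and pass in via retry_on.

```python
import time

def call_with_backoff(fn, max_retries=5, base_delay=2.0, retry_on=(Exception,)):
    """Call fn(); on an exception in retry_on, sleep with exponential
    backoff (base_delay, 2*base_delay, 4*base_delay, ...) and retry.
    Re-raises after max_retries failed attempts."""
    for attempt in range(max_retries):
        try:
            return fn()
        except retry_on:
            if attempt == max_retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Intended usage with the loop above (RDError import path is my assumption):
# from refinitiv.data.errors import RDError
# df0 = call_with_backoff(
#     lambda: rd.get_history(universe=batch, start="2016-07-23",
#                            end="2024-07-23", fields=fields_TS[i],
#                            interval='1W'),
#     retry_on=(RDError,),
# )
```

Would this be the recommended way to handle the 429s, or is there a documented request rate I should throttle to instead?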
I would really appreciate it if you could help us out!
Thank you!