I am currently trying to implement a Python program to download time series data through the Eikon scripting API. For example, I want to download the minute-by-minute high price for certain US stocks (e.g. IBM, AAPL.O). The code is pretty straightforward, such as
tmp = ek.get_timeseries(chunk, fields=field, start_date=today, end_date=dt.datetime.utcnow(), interval='minute', corax='adjusted', raw_output=True)
where `chunk` is `['IBM', 'AAPL.O']`. In practice, I want to download as many stocks as possible, hence the length of `chunk` can be >1000.
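For reference, a minimal sketch of how I split the full RIC list into chunks before calling `get_timeseries` (the chunk size of 50 and the sample RICs are assumptions, not known limits):

```python
def chunked(rics, size=50):
    """Split a long list of RICs into fixed-size slices."""
    return [rics[i:i + size] for i in range(0, len(rics), size)]

# In practice this list holds >1000 RICs.
rics = ['IBM', 'AAPL.O', 'CSCO.O', 'MSFT.O', 'GOOGL.O']

for chunk in chunked(rics, size=2):
    # Each `chunk` is what gets passed to ek.get_timeseries(chunk, ...)
    # in the call shown above.
    print(chunk)
```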
However, I am running into the following error:
...
File "C:\ProgramData\Anaconda2\lib\site-packages\eikon\time_series.py", line 155, in get_timeseries
    result = eikon.json_requests.send_json_request(TimeSeries_UDF_endpoint, payload, debug=debug)
File "C:\ProgramData\Anaconda2\lib\site-packages\eikon\json_requests.py", line 82, in send_json_request
    check_server_error(result)
File "C:\ProgramData\Anaconda2\lib\site-packages\eikon\json_requests.py", line 130, in check_server_error
    raise requests.HTTPError(error_message, response=server_response)
HTTPError: Failed to deserialize backend response: invalid character 'E' looking for beginning of value
The weirdest part is that, with the same code, I only run into this error sporadically, so I am guessing I am hitting some kind of limit?
The error comes from the Web service delivering timeseries data, and it's not related to any limits. I can easily reproduce the error requesting a single row for a single RIC:
ek.get_timeseries('AAPL.O', count=1, interval='minute')
The issue seems to be limited to certain RICs and certain time periods. For example, I have no problem retrieving

ek.get_timeseries('IBM', count=1, interval='minute')
ek.get_timeseries('AAPL.O', count=1, end_date='2017-09-29T10:44:25.500903-04:00', interval='minute')

whereas

ek.get_timeseries('AAPL.O', count=1, end_date='2017-09-29T14:44:25.500903-04:00', interval='minute')
ek.get_timeseries('CSCO.O', count=1, end_date='2017-09-28T14:44:25.500903-04:00', interval='minute')

return the same error.
The issue also seems to be intermittent. A few minutes after I wrote the above, I could no longer reproduce the error with the exact calls that reproduced it before.
I'm escalating the issue.
Just a follow-up:
I still rely on luck to get the minute-by-minute data for the >2000 stocks I am tracking. To increase my odds, I simply wrap the call in a try/except block and make multiple attempts. I also set the timeout to 10 seconds, to avoid the 'request timeout' error (which I've seen a couple of times).
So far, I am able to download the minute-by-minute data, for 1000 stocks, in one get_timeseries() function call (sometimes after multiple attempts).
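The retry wrapper I use looks roughly like this (the attempt count and delay are assumptions, and the commented usage line shows where the `ek.get_timeseries` call from above would plug in):

```python
import time

def with_retries(fn, attempts=5, delay=2.0):
    """Call fn(), retrying on any exception up to `attempts` times."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts; let the last error propagate
            time.sleep(delay)

# Usage against the Eikon call shown earlier (requires a configured app key):
# data = with_retries(lambda: ek.get_timeseries(chunk, fields='HIGH',
#                                               interval='minute'))
```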
However, I do see a lot of NaNs in the downloaded data; I will open a new thread to describe the details.