1. get_timeseries cuts off historic data 2. get_data not working for all tickers

The two problems below only occur when using the API with R.
Everything works well when using Eikon Workstation or the Eikon Excel add-in.
#PROBLEM 1
#get_timeseries() cuts off timeseries - irregularly
#Example tickers (history start dates): IBM (17 Mar 1980), MSFT (13 Mar 1986), .IBLEM0002 (31 Oct 2010)
#Full historic data is available on Eikon and on the Excel-Eikon for all tickers
#Prepare
library(devtools)
library(eikonapir)
library(tidyr)
eikonapir::set_proxy_port(9000L)
eikonapir::set_app_id('Put your API Key here')
#Create tickers - run one of these combinations at a time
tickers = c("MSFT.O","IBM",".IBLEM0002")
#tickers = c("MSFT.O",".IBLEM0002")
#tickers = c("IBM",".IBLEM0002")
#tickers = c(".IBLEM0002")
#Create tickers input list
ticker.list = as.list(tickers)
#Timeframe
startDate = "2003-01-01T00:00:00"
endDate = paste("2020-03-27","T00:00:00",sep="")
#Get prices
p.out = get_timeseries(
rics = ticker.list,
fields = list("TIMESTAMP","CLOSE"),
start_date = startDate,
end_date = endDate,
interval = "daily")
names(p.out) = c("Date","Close","Tickers")
#Convert from stacked to wide format
p.out = spread(data = p.out, key = Tickers ,value = Close)
names(p.out) = c("Date",tickers)
head(p.out,3)
tail(p.out,3)
#RESULTS - different timeseries start dates depending on ticker combination
#If only .IBLEM0002 is included then timeseries gives the full series
#If only MSFT.O or IBM are included then timeseries is cut off 2009-04-29
#If tickers = c("MSFT.O","IBM",".IBLEM0002"), ".IBLEM0002" starts 2016-03-11 and MSFT, IBM later
#If tickers = c("IBM",".IBLEM0002"), ".IBLEM0002"starts 2016-03-11 and MSFT later
#If tickers = c("MSFT.O",".IBLEM0002"), ".IBLEM0002"starts 2014-03-05 and IBM later
#PROBLEM 2
#get_data gets the full timeseries for IBM and MSFT.O BUT returns nothing for .IBLEM0002
#Full historic data is available on Eikon and on the Excel-Eikon also for .IBLEM0002 (starting in 2010)
startDate = "2003-01-01"
endDate = "2020-03-27"
p.out = get_data(
instruments = ticker.list,
fields = list("TR.PriceClose.Date","TR.PriceClose"),
parameters = list("Frq"="M","SDate"=startDate,"EDate"=endDate))
names(p.out) = c("Tickers","Date","Close")
p.out = spread(data = p.out, key = Tickers ,value = Close)
head(p.out,3)
tail(p.out,3)
I need to download the full historic timeseries for a range of different ticker types. Please let me know if there is an error in my code or if there is a different, more consistent approach.
Thank you very much in advance
Best Answer
You can cut down the number of requests by retrieving multiple fields in a single request. Using the following code I retrieve 13 years of daily price history for the constituents of the S&P 500 in about 8 minutes. In this code, spx is the list of RICs for the index constituents.
import eikon as ek  # Eikon Data API for Python
import pandas as pd

ts = pd.DataFrame()
chunk_size = 50
for i in range(0, len(spx), chunk_size):
    rics = spx[i : i + chunk_size]
    df, err = ek.get_data(rics,
                          ['TR.CLOSEPRICE.date', 'TR.CLOSEPRICE',
                           'TR.HIGHPRICE', 'TR.LOWPRICE',
                           'TR.OPENPRICE', 'TR.ACCUMULATEDVOLUME'],
                          {'SDate': '-13Y', 'EDate': '0D'})
    ts = ts.append(df, sort=False)
ts
The resulting dataframe has 1.65 million rows. I must admit I have no first-hand experience with Bloomberg. I can easily believe that using the Bloomberg API one can retrieve all these timeseries in a single request. Then again, adding the above code to break the list of instruments into chunks and retrieve the chunks in a loop is not a huge task, and you only need to do it once. As for the data retrieval time, I find it very hard to believe that all this data can be retrieved in 5 seconds from any system. If it is not an exaggeration, that is mightily impressive. 8 minutes is not the best time you can possibly get out of Eikon, although further improvement would not be as straightforward as in the above code sample. And even with maximum optimization I don't think we can get the retrieval time to significantly under a minute.
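For R users following the original question, a rough eikonapir equivalent of the same chunking idea might look like the sketch below. This is only an illustration: it reuses get_data() as called in the question, assumes spx is a character vector of RICs, and guards against the NULL returns that oversized requests can produce (as reported later in this thread).
library(eikonapir)
chunk_size = 50
chunks = list()
for (i in seq(1, length(spx), by = chunk_size)) {
  rics = spx[i:min(i + chunk_size - 1, length(spx))]
  df = get_data(
    instruments = as.list(rics),
    fields = list("TR.CLOSEPRICE.date", "TR.CLOSEPRICE",
                  "TR.HIGHPRICE", "TR.LOWPRICE",
                  "TR.OPENPRICE", "TR.ACCUMULATEDVOLUME"),
    parameters = list("SDate" = "-13Y", "EDate" = "0D"))
  # get_data() can return NULL for oversized or failed requests - skip those
  if (!is.null(df)) chunks[[length(chunks) + 1]] = df
}
ts = do.call(rbind, chunks)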
Answers
For the first problem, it could be due to the limits on Eikon Data APIs described in the EIKON DATA API USAGE AND LIMITS GUIDELINE.
get_timeseries: the current limit value (as of 10-Oct-2019) is 3,000 data points (rows) for interday intervals and 50,000 data points for intraday intervals. This limit applies to the whole request, regardless of the number of requested instruments.
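Given that limit, one workaround on the R side is to keep each get_timeseries() request under 3,000 rows, e.g. by requesting one RIC at a time over date windows and row-binding the pieces. The sketch below is a minimal illustration, not an official recipe: get_full_history and years_per_call are hypothetical names, and the 10-year default window assumes roughly 250 trading days per year, which stays safely below the cap.
library(eikonapir)
get_full_history = function(ric, from, to, years_per_call = 10) {
  # Split [from, to] into consecutive windows of years_per_call years
  win_starts = seq(as.Date(from), as.Date(to), by = paste(years_per_call, "years"))
  win_ends = c(win_starts[-1] - 1, as.Date(to))
  pieces = list()
  for (i in seq_along(win_starts)) {
    df = get_timeseries(
      rics = list(ric),
      fields = list("TIMESTAMP", "CLOSE"),
      start_date = paste0(win_starts[i], "T00:00:00"),
      end_date = paste0(win_ends[i], "T00:00:00"),
      interval = "daily")
    if (!is.null(df)) pieces[[i]] = df
  }
  do.call(rbind, pieces)  # rbind() silently drops any NULL pieces
}
#e.g. the full IBM history from the question, pieced together from a handful of requests
ibm = get_full_history("IBM", "1980-03-17", "2020-03-27")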
For the second problem, the TR.PriceClose field is not available for .IBLEM0002. You can use the Data Item Browser (DIB) tool to verify this.
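As a quick programmatic cross-check alongside DIB, one could request the field for the single instrument and inspect the result (a sketch reusing the get_data() call from the question):
chk = get_data(instruments = list(".IBLEM0002"),
               fields = list("TR.PriceClose"))
print(chk)  # an empty or NA TR.PriceClose column suggests the field is not populated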
Thank you @irapongse.phuriphanvichai - a bit disappointing. This is much, much less than what Bloomberg allows, making the API less useful for even a simple analysis such as 10 years of S&P 500 data: 500 x 10 x 250 = 1,250,000 data points, or 417 requests - essentially looping over each constituent. Even just analyzing a single stock over the past 20 years needs 2 requests and a merge.
Question: How is the community dealing with this problem? Thank you very much in advance.
@HeikoRR
To retrieve timeseries of daily price history for exchange-traded instruments, including stocks and indices, you can use the get_data method with the TR.CLOSEPRICE field.
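For example, a minimal sketch along those lines - note that the "Frq"="D" daily-frequency parameter is an assumption by analogy with the "Frq"="M" used in the question, and field availability for a given RIC should still be confirmed in DIB:
px = get_data(
  instruments = list(".IBLEM0002"),
  fields = list("TR.CLOSEPRICE.date", "TR.CLOSEPRICE"),
  parameters = list("Frq" = "D", "SDate" = "2010-10-31", "EDate" = "2020-03-27"))
head(px, 3)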
Hi Alex, thank you for your reply.
get_timeseries():
I have answered you with respect to get_timeseries()
There, the results for OHLC + Volume, daily, 13 years of history for the S&P 500 constituents were:
EIKON: 5000 requests and 2500 merges of time series, 27.8 minutes
BLOOMBERG: 1 request, no merging, 5 seconds
get_data():
I tried to do the same using get_data(). The situation is a bit better, but still much worse than with the Bloomberg API. Assume again that I want to do a quick analysis of the S&P 500 with
Open, High, Low, Close and Volume (5 timeseries), daily, starting in 2007 (13 years ago).
For the time period (2007-01-03 to 2020-03-25, 3332 days) I can only request 63 to 64 tickers in one request (I tried many combinations). That amounts to 500 * 5 / 63 = 40 requests. If I request more, get_data() returns NULL without an error message.
EIKON: 40 requests, no merging of time series
BLOOMBERG: 1 request, no merging, 5 seconds
Note: Some tickers such as .IBLEM0002 do not have the TR.PriceClose field and have to be retrieved using get_timeseries()
Please let me know if I am missing anything.
Hi Alex,
Yes, thank you again. Sure, it is easy to loop over chunks of RICs.
But why have these restrictions in the first place? It would be helpful if you could check internally whether these restrictions could be adapted to allow retrieving a reasonable amount of data per day in a single request - the limits on get_timeseries in particular are way too low.
Thank you in advance, Heiko
Thanks again for the feedback. The reason for the limits on how much data can be retrieved in a single request is to protect the platform from abuse and to protect users from accidentally exhausting their daily data retrieval limits, which is what we hear some users complain about happening with competitor products. That said, I agree that the 3K row limit for a single get_timeseries call with interday intervals is too restrictive. It is an unfortunate limitation of the legacy backend system that currently sits behind the get_timeseries method. As we develop and roll out the new Refinitiv Data Platform, it will replace the legacy timeseries system along with its limitations.