Downloading Datastream data for varying dates in a large dataset

Hi all,
I have a question regarding downloading stock price data for specific dates that vary throughout the dataset. I use Datastream to download stock price data matching M&A announcements retrieved from the SDC Mergers&Acquisitions database. Specifically, I need stock price data for announcement-related dates that vary from company to company (e.g. Deal A was announced on 10jun2016, so I want the target's stock price 30 trading days prior to the transaction; for Deal B, the announcement date may be 26jun2016; and so on).
As the sample is quite large (>90,000 transactions over a 20-year period), I am looking for a way to download only the stock price data for the relevant dates of each transaction (i.e. I need only a few specific dates per transaction, but the dates differ for each transaction). I want to avoid downloading 20 years of stock price data for all companies, because that would result in an immense amount of unused data. Of course, I also want to avoid manually downloading the data for each company and date, as this would take me months.
I would be glad if someone knew a way to automatically download only the stock prices relevant to each transaction for a large dataset, based on the companies' Datastream codes.
Thanks a lot for your help!
Ben
Best Answer
I have been thinking about this request a fair bit. Personally, I think this is best tackled with a request table rather than through the API.
Step 1 - Identify the companies and the M&A date, then use Excel functionality to create two new dates (the start and end date for each instrument's time series).
Step 2 - Drop this information into a request table, and each request will run between your inputted start and end dates.
See the images attached for reference.
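Step 1's date arithmetic can also be scripted rather than done with Excel formulas. A minimal sketch in Python (my own illustration, not part of the request-table workflow): numpy's `busday_offset` behaves like Excel's WORKDAY, so "trading days" here really means weekdays, ignoring exchange holidays.

```python
import numpy as np

# Announcement dates for the two illustrative deals from the question
announce = np.array(["2016-06-10", "2016-06-26"], dtype="datetime64[D]")

# End date: the announcement itself, rolled back to the prior weekday
# if it falls on a weekend; start date: 30 weekdays earlier.
end = np.busday_offset(announce, 0, roll="backward")
start = np.busday_offset(end, -30)
```

Each (start, end) pair then becomes one row of the request table.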
Answers
Hello,
This forum is for software developers using Thomson Reuters APIs. The moderators on this forum don't have the expertise to answer general product or content search and explanation questions for Thomson Reuters products.
For questions such as this one you posted it is best to contact Thomson Reuters Helpdesk by either calling the Helpdesk number in your country or by following the link below:
@bendavid.dalka
I'm not sure I understand the challenge. Perhaps I'm missing something here, but what you're asking for seems trivial to me. When you're requesting timeseries from Datastream Web Service you can specify the time period. There's an example in the tutorial for Datastream Web Services on this portal:
https://developers.thomsonreuters.com/eikon-apis/datastream-web-service/learning?content=47128&type=learning_material_item
So, if I understand the desired workflow correctly, you read the data from the SDC Mergers&Acquisitions database (I don't know what that database is, but from my perspective it doesn't matter). This data contains the symbol for the company you're interested in and the deal date. You use this symbol and the 30-day time period preceding the deal date to construct and execute the timeseries request to Datastream Web Service. Where's the challenge?
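This suggestion boils down to one timeseries request per deal. A minimal sketch, assuming a helper that approximates 30 trading days as 42 calendar days; the symbols and the commented `ds.get_data` call are illustrative, so check the linked tutorial for the actual DSWS Python API:

```python
import datetime as dt

def build_request(symbol, announce_date, trading_days=30):
    """Return the (symbol, start, end) window for one deal.

    Approximates N trading days as N * 7/5 calendar days and lets
    Datastream return whatever trading days fall inside the window.
    """
    end = announce_date
    start = end - dt.timedelta(days=trading_days * 7 // 5)
    return symbol, start.isoformat(), end.isoformat()

# One request per deal, driven by the SDC extract
deals = [("VOD", dt.date(2016, 6, 10)), ("BARC", dt.date(2016, 6, 26))]
requests = [build_request(sym, d) for sym, d in deals]

# Each tuple would then feed a DSWS timeseries call, e.g. with the
# DatastreamDSWS package (check the tutorial for the exact signature):
# for sym, start, end in requests:
#     ds.get_data(tickers=sym, fields=["P"], start=start, end=end, freq="D")
```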
I guess the challenge would be that DSWS has a limit of 10 million datapoints a month. Given that the OP has >90,000 transactions, i.e. >180,000 companies (target and acquirer), that budget works out to roughly 55 datapoints per company per month, which I suppose may be insufficient once several datatypes and dates per deal are needed.
Hi @paul.bacon @Alex Putkov.1 ,
thanks for your answers. I forgot to update the status of this post; I had already found out how to solve the issue, and that this forum is not quite the right place for this question.
I did it pretty much as paul.bacon proposed: create start and end dates for each company in Stata based on the acquisition announcement date, then split the dataset into subsets so that the request can be processed via the Datastream Excel plug-in.
Thanks a lot for your input!
Hi @bendavid.dalka, I've been trying to do something similar to what you did, but am running into some problems, especially with handling requests for so many companies using request tables. Could you please elaborate a bit more on your approach? Of course, also via PM if that's better for you.
That would be incredibly helpful, thanks!
Hi @rajat.agarwal, I simply split the data into a volume that the request table can handle (e.g. 80,000 companies). The issue I had at first was that I wanted both the number of shares (NOSH) and the undisturbed price (UP). It was not really sensible to request both measures at once, because Datastream would output two lines per company, which makes copying the data annoying. So I ran separate requests for each measure. It is probably not the best way of doing it, because the macro does take some processing time, but it worked out fine for me.
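The splitting step described above is just chunking the company list into request-table-sized batches. A sketch, assuming (hypothetically) that one table comfortably handles 80,000 rows and that each datatype gets its own run:

```python
def chunk(items, size=80_000):
    """Split a list into consecutive batches of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# e.g. 180,000 Datastream codes -> batches of 80,000 + 80,000 + 20,000;
# run one request table per batch and per datatype (NOSH, then UP)
codes = [f"CODE{i}" for i in range(180_000)]
for datatype in ("NOSH", "UP"):
    for batch in chunk(codes):
        pass  # fill a request table with `batch` rows for `datatype`
```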