Downloading ownership data for US-listed REITs

Dear Sir/Madam,
I am a PhD student in the REP department at the University of Reading, currently working on a research project that requires historical investor ownership data for European and US-listed Real Estate Investment Trusts (REITs).
Our university provides access to LSEG data, and I have been using the LSEG Data Library for Python (via Anaconda/Jupyter Notebook) to download this information. I have successfully downloaded all the investor data for European-listed REITs, but I cannot download the equivalent data for US-listed REITs. I cannot even retrieve the RICs for the US market through Python (which works fine for the European-listed REITs), so I have to import a CSV file of RICs that I generated myself to proceed with the download. Also, while I can establish a session and download some financial data, I am encountering persistent and significant issues when trying to retrieve **investor ownership data** for a list of US REITs.
The primary errors I'm encountering, even after extensive troubleshooting (including fetching data for one instrument at a time, adjusting request batch sizes, implementing significant delays between requests, and multiple retries), are network-related and include:
Asynchronous Query library internal error
ReadTimeout('timed out')
[WinError 10054] An existing connection was forcibly closed by the remote host
These errors often point to the LSEG API endpoint `http://localhost:9000/api/udf`, suggesting potential issues with the local LSEG service/proxy or the stability of that endpoint for these specific data requests. Interestingly, I can retrieve some data types for individual US RICs, which suggests my basic API access is functional; it is the bulk or ownership-specific retrieval that is problematic.
Content Set: Ownership (Shareholdings)
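For reference, the retry logic I mentioned above looks roughly like this (a simplified sketch of my own wrapper; the helper name `with_retries` is illustrative and not part of the LSEG library):

```python
import time

def with_retries(request_fn, max_tries=5, base_delay=2.0):
    """Call request_fn, retrying with exponential backoff on transient errors."""
    for attempt in range(max_tries):
        try:
            return request_fn()
        except Exception:
            # Catches errors such as ReadTimeout('timed out') and
            # [WinError 10054] connection resets from the local proxy.
            if attempt == max_tries - 1:
                raise  # give up after the last attempt
            time.sleep(base_delay * (2 ** attempt))  # wait 2s, 4s, 8s, ...
```

Even with this wrapper in place, the timeouts and connection resets persist for the US ownership requests.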
I have consulted the LSEG Helpdesk and ran a live session with them; after some troubleshooting they suspect this could be a developer-level issue, because they confirmed that the code itself runs.
I have attached the code and the log file for your reference.
Many thanks in advance,
Best regards,
Yusuf Zhang
Answers
Thank you for reaching out to us.
I checked the log file and found that the problem occurs with this request:
{'Entity': {'E': 'DataGrid_StandardAsync', 'W': {'requests': [{'instruments': ['FRT.N'], 'fields': [{'name': 'TR.INVESTORFULLNAME'}, {'name': 'TR.PCTOFSHARESOUTHELD'}, {'name': 'TR.ADJSHARESHELD'}, {'name': 'TR.INVESTORTYPE'}, {'name': 'TR.INVESTORFULLNAME.DATE'}], 'parameters': {'SDate': '-50Y', 'EDate': '0D'}}]}}}
It seems you requested data dating back 50 years.
The library used in the desktop session is not intended for high-volume data requests. If you request large amounts of data, the request may time out.
I tried the following code to get data dating back 1 week.
ld.get_data(universe=['FRT.N'], fields=['TR.INVESTORFULLNAME', 'TR.PCTOFSHARESOUTHELD', 'TR.ADJSHARESHELD', 'TR.INVESTORTYPE', 'TR.INVESTORFULLNAME.DATE'], parameters={'SDate': '-1W', 'EDate': '0D'})
The response contains 12792 rows × 6 columns.
For more information, please refer to the LSEG Data Library for Python - Reference Guide.
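If the full history is needed, one possible workaround (a sketch, not an officially supported pattern; it assumes a desktop session has already been opened with `ld.open_session()`, and the helper names are illustrative) is to split the 50-year range into one-year windows and concatenate the results:

```python
import pandas as pd

FIELDS = ['TR.INVESTORFULLNAME', 'TR.PCTOFSHARESOUTHELD', 'TR.ADJSHARESHELD',
          'TR.INVESTORTYPE', 'TR.INVESTORFULLNAME.DATE']

def yearly_windows(years_back):
    """Return (SDate, EDate) pairs covering years_back years, oldest first."""
    return [(f'-{y}Y', f'-{y - 1}Y' if y > 1 else '0D')
            for y in range(years_back, 0, -1)]

def fetch_ownership(ric, years_back=50):
    """Fetch ownership history one year at a time to keep each request small."""
    import lseg.data as ld  # requires an open desktop session
    frames = [ld.get_data(universe=[ric], fields=FIELDS,
                          parameters={'SDate': sdate, 'EDate': edate})
              for sdate, edate in yearly_windows(years_back)]
    return pd.concat(frames, ignore_index=True)
```

Each call then covers at most one year of history, which keeps the volume per request far below the 50-year request that timed out; rows that fall on window boundaries may be duplicated and need de-duplicating afterwards.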
Hi Jirapongse,
Thank you for your reply. I used the same code to get 50 years of historical ownership data for all publicly listed real estate companies in Europe. My understanding is that the API is meant to handle "large" data requests. I tried shortening the duration from 50 years to 5 years, but the problem remains.
Best regards,
Yusuf
No, the get_data method is not designed to retrieve large amounts of data.
You may contact your administrator, who can then reach out to the LSEG account or sales team for alternative solutions.
The get_data method is supposed to be designed to retrieve large amounts of data, and I can get large data for the European market via the get_data method, so it is clearly a design fault that I cannot get the same data for the US market.
Please fix it rather than pushing the problem away!
It is stated in the documentation:
If the application requests a lot of data, the server can time out the request.