Downloading ownership data for US-listed REITs

Dear Sir/Madam,

I am a PhD student in the REP department at the University of Reading, currently working on a research project that requires historical investor ownership data for European and US-listed Real Estate Investment Trusts (REITs).

Our university provides access to LSEG data, and I have been using the LSEG Data Library for Python (via Anaconda/Jupyter Notebook) to download this information. I have successfully downloaded all the investor data for European-listed REITs, but I cannot download the equivalent data for US-listed REITs. I cannot even retrieve the RICs for the US market through the API, so I have to import them from a CSV file I generated myself before downloading, whereas for European-listed REITs I can obtain the RICs directly in Python; in outline, my US workflow looks like the sketch below. Also, while I can establish a session and download some financial data, I encounter persistent and significant issues when trying to retrieve **investor ownership data** for a list of US REITs.
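For reference, this is roughly what my US workflow looks like (a minimal sketch; the file name and column name are placeholders for my own CSV, and the fields are the ones from my attached code):

    import pandas as pd
    import lseg.data as ld

    ld.open_session()  # desktop session; LSEG Workspace must be running

    # US RICs come from a CSV I generated myself, since I cannot
    # retrieve them through the API (file/column names are placeholders)
    us_rics = pd.read_csv("us_reit_rics.csv")["RIC"].dropna().tolist()

    df = ld.get_data(
        universe=us_rics,
        fields=['TR.INVESTORFULLNAME', 'TR.PCTOFSHARESOUTHELD',
                'TR.ADJSHARESHELD', 'TR.INVESTORTYPE',
                'TR.INVESTORFULLNAME.DATE'],
        parameters={'SDate': '-50Y', 'EDate': '0D'},
    )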

The primary errors I encounter, even after extensive troubleshooting (fetching data for one instrument at a time, adjusting request batch sizes, adding significant delays between requests, and retrying multiple times; a sketch of the retry logic appears below), are network-related and include:

• Asynchronous Query library internal error
• ReadTimeout('timed out')
• [WinError 10054] An existing connection was forcibly closed by the remote host

These errors often point to the LSEG API endpoint `http://localhost:9000/api/udf`, suggesting potential issues with the local LSEG service/proxy or with the endpoint's stability for these specific data requests. Interestingly, I can retrieve some data types for individual US RICs, which suggests my basic API access is functional; it is bulk retrieval of ownership data in particular that is problematic.

Content set: Ownership (Shareholdings)
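For context, this is a minimal sketch of the retry logic I have been using (the retry count and delay are illustrative, and it assumes a session is already open as above):

    import time
    import lseg.data as ld

    def get_ownership_with_retries(ric, retries=3, delay=30):
        # Fetch ownership data for one RIC, retrying on network errors;
        # retry count and delay are illustrative values.
        fields = ['TR.INVESTORFULLNAME', 'TR.PCTOFSHARESOUTHELD',
                  'TR.ADJSHARESHELD', 'TR.INVESTORTYPE',
                  'TR.INVESTORFULLNAME.DATE']
        for attempt in range(1, retries + 1):
            try:
                return ld.get_data(universe=[ric], fields=fields,
                                   parameters={'SDate': '-50Y', 'EDate': '0D'})
            except Exception as exc:  # ReadTimeout, WinError 10054, etc.
                print(f"{ric}: attempt {attempt} failed: {exc}")
                time.sleep(delay)  # wait before retrying
        return None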

I have consulted your LSEG Helpdesk and ran a live session with them; after some troubleshooting, they suspect that this could be a developer-level issue, because they confirmed that the code itself runs.

I have attached the code and the log file for your reference.

Many thanks in advance,

Best regards,

Yusuf Zhang

Answers

  • Jirapongse

    @Yusuf212122

    Thank you for reaching out to us.

    I checked the log file and found that the problem occurs with this request:

    {'Entity': {'E': 'DataGrid_StandardAsync',
                'W': {'requests': [{'instruments': ['FRT.N'],
                                    'fields': [{'name': 'TR.INVESTORFULLNAME'},
                                               {'name': 'TR.PCTOFSHARESOUTHELD'},
                                               {'name': 'TR.ADJSHARESHELD'},
                                               {'name': 'TR.INVESTORTYPE'},
                                               {'name': 'TR.INVESTORFULLNAME.DATE'}],
                                    'parameters': {'SDate': '-50Y', 'EDate': '0D'}}]}}}

    It seems you requested data dating back 50 years.

    The library used in the desktop session is not intended for high-volume data requests. If you request large amounts of data, the request may time out.

    I tried the following code to get data going back one week.

    import lseg.data as ld

    ld.open_session()  # opens the default (desktop) session

    ld.get_data(
        universe=['FRT.N'],
        fields=['TR.INVESTORFULLNAME', 'TR.PCTOFSHARESOUTHELD',
                'TR.ADJSHARESHELD', 'TR.INVESTORTYPE',
                'TR.INVESTORFULLNAME.DATE'],
        parameters={'SDate': '-1W', 'EDate': '0D'}
    )
    

    The response contains 12792 rows × 6 columns.
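    If you need the full history, one possible workaround (a sketch, not an official recommendation) is to split the 50-year window into shorter date ranges and concatenate the results, for example one year at a time:

    import pandas as pd
    import lseg.data as ld

    fields = ['TR.INVESTORFULLNAME', 'TR.PCTOFSHARESOUTHELD',
              'TR.ADJSHARESHELD', 'TR.INVESTORTYPE',
              'TR.INVESTORFULLNAME.DATE']

    # Request one year at a time instead of 50 years at once;
    # the window size is illustrative and may need tuning.
    chunks = []
    for years_back in range(50, 0, -1):
        end = f'-{years_back - 1}Y' if years_back > 1 else '0D'
        chunks.append(ld.get_data(
            universe=['FRT.N'],
            fields=fields,
            parameters={'SDate': f'-{years_back}Y', 'EDate': end},
        ))

    history = pd.concat(chunks, ignore_index=True).drop_duplicates()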

    For more information, please refer to the LSEG Data Library for Python - Reference Guide.

  • Yusuf212122

    Hi Jirapongse,

    Thank you for your reply. I was using the same code to get 50 years of historical ownership data for all the publicly listed real estate companies in Europe. My understanding is that the API is meant to handle "large" data requests. I tried shortening the duration from 50 years to 5 years, but the problem remains.

    Best regards,

    Yusuf

  • Jirapongse

    @Yusuf212122

    No, the get_data method is not designed to retrieve large amounts of data.

    You may contact your administrator, who can then reach out to the LSEG account or sales team for alternative solutions.

  • Yusuf212122

    The get_data method is supposed to be designed for large data requests, and I can get large amounts of data for the European market via the get_data method, so it is clearly a design fault that I cannot get the same data for the US market.

    Please fix it instead of pushing the problem away!

  • Jirapongse

    @Yusuf212122

    It is stated in the documentation.

    (screenshot of the documentation)

    If the application requests a lot of data, the server can time out the request.