Hi team,
On 22 Mar, the client exceeded the limit of 10 million data points per month, so we asked our product side to reset it, and DSWSAPIC was removed from ZMUT001 accordingly. However, the client is still seeing the error message below.
“HTTPConnectionPool(host='datastream.thomsonreuters.com', port=80): Max retries exceeded with url: /DswsClient/V1/DSService.svc/rest/Data?token=<token>&instrument=%40AAPL%28RI%29&datatypes=&datekind=TimeSeries&start=1980-01-01&end=2022-04-27&freq=D (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000002224A2ED8B0>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed'))”
Could anybody help with this?
Regards,
Hiro
The error is not related to the data-point limit. I think the client is using the old hostname (datastream.thomsonreuters.com). The current DatastreamDSWS hostname is product.datastream.com.
Please verify the name and version of the Datastream library the client is using. The client can use our DatastreamPy library.
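To narrow this down, the client can first check which Datastream package is actually installed, and then try a minimal request through the supported DatastreamDSWS library, which targets the current hostname internally. Below is a minimal sketch; the credentials are placeholders, and it assumes that @AAPL(RI) in the failing URL corresponds to the @AAPL instrument with the RI datatype:

```python
from importlib import metadata

# Check which Datastream-related packages are installed and their versions.
for pkg in ("DatastreamPy", "DatastreamDSWS", "PyDSWS"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "is not installed")

import DatastreamDSWS as DSWS

# Placeholder credentials; the library handles the DSWS endpoint itself,
# so no hostname needs to be configured in client code.
ds = DSWS.Datastream(username="YourDSWSUserID", password="YourDSWSPassword")

# Assumed mapping of the failing request: @AAPL with the RI datatype,
# daily frequency, from 1980-01-01 to 2022-04-27.
df = ds.get_data(tickers="@AAPL", fields=["RI"],
                 start="1980-01-01", end="2022-04-27", freq="D")
print(df.head())
```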
According to the client, he is not specifying the hostname himself. He guesses that the library name has been changed. He is currently using "import PyDSWS". Please advise the correct library name.
Regards,
Hiro
PyDSWS is not maintained by Refinitiv. We only maintain the DatastreamPy and DatastreamDSWS libraries. We recommend using DatastreamPy, which is the new name for DatastreamDSWS.
The client can either submit this issue to the PyDSWS developer via GitHub or migrate the application to our DatastreamPy library. The Getting Started with Python article and the DSWS Python Tutorial explain how to use this library.
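If the client decides to migrate, the change should be small, since DatastreamPy is the renamed DatastreamDSWS library. A minimal sketch, assuming DatastreamPy keeps the Datastream/get_data interface shown in the DSWS Python Tutorial, with placeholder credentials:

```python
# The client currently has: import PyDSWS (not maintained by Refinitiv).

# Refinitiv-maintained package. This assumes DatastreamPy keeps the
# DatastreamDSWS-style interface, since it is the renamed library.
import DatastreamPy as DSWS

ds = DSWS.Datastream(username="YourDSWSUserID", password="YourDSWSPassword")

# Same request as the failing URL, expressed through the library:
# @AAPL with the RI datatype, daily, 1980-01-01 to 2022-04-27.
df = ds.get_data(tickers="@AAPL", fields=["RI"],
                 start="1980-01-01", end="2022-04-27", freq="D")
print(df.head())
```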