Websocket connection is disconnecting using lseg.data library

Hi,
I am using v2.1.1 of the lseg.data Python library to stream realtime data.
My setup is working most of the time, but once in a while (every few days, but no set pattern), the websocket disconnects for ~2 seconds, during which I lose data.
I have identified one such event and captured the logs surrounding the event, attached.
Can someone please help me diagnose what the issue is?
I am only using the lseg.data library, so all the connection / reconnection logic is handled by the library. The only thing I'm doing is starting a stream and letting it run.
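For reference, the setup is essentially the following (a simplified sketch: the instrument and fields are placeholders, the session comes from lseg-data.config.json, and the production code subscribes to the full instrument list):

import lseg.data as ld

# Open the session defined in lseg-data.config.json
ld.open_session()

def on_data(data, instrument, stream):
    # Placeholder handler for each real-time update
    print(instrument, data)

# Start the pricing stream; the library manages the websocket itself,
# including ping/pong and reconnection logic.
stream = ld.open_pricing_stream(
    universe=["LCOX5-J6"],   # placeholder; ~2000 instruments in production
    fields=["BID", "ASK"],
    on_data=on_data,
)

# ... the stream then runs until the application shuts down ...
# stream.close()
# ld.close_session()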
Answers
-
Thank you for reaching out to us.
I have done a quick test and found that the library could reconnect and re-subscribe to items properly.
I found that the stream ID reaches 11940.
2025-08-11 01:54:05,523 - DEBUG - sessions.platform.default.0 - [OMMSTREAMING_PRICING_0.0] send {"ID": 11940, "Domain": "MarketPrice", "Streaming": true, "Key": {"Name": "LCOX5-J6"}}
Does the application subscribe to a lot of items?
You may contact the server team via LSEG Support to verify if the server cut the connection. Please include the URL of this discussion in your raised question to prevent it from being redirected back to this Q&A forum.
The library may not perform well when subscribing to a large number of streaming items. Please refer to the Choosing a Real-time Streaming API article for more information.
-
Hi @Jirapongse thank you for looking into this.
My application subscribes to up to ~2000 instruments - would you say that this is too many for the lseg.data library to handle?
I will reach out to LSEG support to see if they have any more insights.
-
It depends on the total update rate of those instruments.
Did you see log entries like the following in the log file?
[2025-08-13T14:11:10.681148+07:00] - [sessions.platform.deployed.0] - [DEBUG] - [35184 - Msg-Proc-ThreadOMMSTREAMING_PRICING_0.0] - [stream_connection] - [_get_messages] - [OMMSTREAMING_PRICING_0.0] received messages: 8 | processed messages: 8
You may consider disabling the debug log, as it can negatively affect the application's performance.
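For example, the debug output can be turned down through the library configuration before opening the session. A minimal sketch, assuming the logs.* configuration keys documented for the library (please verify the exact key names against the library's logging documentation):

import lseg.data as ld

# Assumed keys: raise the log level and disable the transports so the
# verbose per-message debug lines are no longer written.
config = ld.get_config()
config.set_param("logs.level", "info")
config.set_param("logs.transports.console.enabled", False)
config.set_param("logs.transports.file.enabled", False)

ld.open_session()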
-
Yes, I have checked that received messages = processed messages. Just before the last disconnection, my logs show:
received messages: 21256975 | processed messages: 21256975
I only enabled the debug logs because of this disconnection issue. Previously I had them disabled, but kept getting disconnected, which is why I turned them on.
I also contacted the support team and they said that it was a client-side disconnection. The logs from their side are pasted below.
Since I am using the lseg.data library, I am not handling any ping / reconnects myself. Is there anything else I can do to further diagnose this?
<ads-fanout-sm-az1-apse1-prd.1.ads:Info:Thu Aug 14 13:31:35.041971 2025> WS JSON2 disconnect from "GE-******" at position "10.1.210.54/lseg-realtime-acquisition-staging-7959d67c6d-h4w9z" on host "" and port number "19418" using application "256" on channel 73 (Client-side initiated disconnection). Reason:rsslRead() failed with code -1 and system error 0. Text:WS Code 1000 Normal Closure <END>
<ads-fanout-sm-az2-apne1-prd.1.ads:Info:Wed Aug 13 18:12:08.269419 2025> WS JSON2 disconnect from "GE-******" at position "10.1.210.54/lseg-realtime-acquisition-staging-7959d67c6d-h4w9z" on host "" and port number "58125" using application "256" on channel 73 (Client-side initiated disconnection). Reason:rsslRead() failed with code -1 and system error 11. Text:<rwsReadTransportMsg:4444> Error:1002ipcRead() failure. System errno: (11)<END>
<ads-fanout-sm-az2-euw1-prd.1.ads:Info:Wed Aug 13 07:46:06.497160 2025> WS JSON2 disconnect from "GE-******" at position "10.1.210.45/lseg-volume-stream-5778f9bb47-v6m48" on host "" and port number "57606" using application "256" on channel 185 (Server-side initiated disconnection). Reason:Client application did not ping. <END>
-
The logs show both a Client-side initiated disconnection and a Server-side initiated disconnection.
Do you have the full debug log files when the problem occurred?
If yes, please share those files. We can check the ping/pong messages from the debug log files.
[2025-08-15T17:14:21.675195+07:00] - [sessions.platform.ldpv2.0] - [DEBUG] - [35244 - Msg-Proc-ThreadOMMSTREAMING_PRICING_0.0] - [omm_stream_connection] - [_process_message] - [OMMSTREAMING_PRICING_0.0] process message {"Type": "Ping"}
[2025-08-15T17:14:21.675195+07:00] - [sessions.platform.ldpv2.0] - [DEBUG] - [35244 - Msg-Proc-ThreadOMMSTREAMING_PRICING_0.0] - [stream_connection] - [send_message] - [OMMSTREAMING_PRICING_0.0] send {"Type": "Pong"}
-
I have looked into my debug logs and found the 3 corresponding disconnection events that match up with what was reported by your support team on the server side.
There do indeed seem to be different disconnect reasons.
I've attached the logs on my side for the first 2 instances. In order to send them here, I had to apply a filter. Even within a single 20-second ping/pong window, I have >100MB of logs. The filter I applied removes any line that contains either "ELEKTRON_DD" or "received_messages". Please let me know if there's a better way to create a filter such that we don't lose any relevant information.
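For reference, the filter was essentially the following (a minimal sketch; the input and output file names here are hypothetical):

# Drop any line containing either pattern so the file is small enough to attach.
EXCLUDE = ("ELEKTRON_DD", "received_messages")

with open("lseg-data-lib.log", encoding="utf-8") as src, \
        open("filtered.log", "w", encoding="utf-8") as dst:
    for line in src:
        if not any(pattern in line for pattern in EXCLUDE):
            dst.write(line)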
The file names are timestamped to the disconnection event. I have just realized the third event (at Wed Aug 13 07:46:06.497160) was from a test application, so we do not need to look into that one.
Appreciate your time looking into this and any insights you can provide.
-
The server sends a ping message every 20 seconds. If it doesn't receive a response (a pong message) from the client within 30 seconds, it will terminate the connection.
I reviewed the log file and found that the ping and pong messages appear to be functioning correctly.
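As an additional client-side cross-check, the debug log can be scanned for gaps between consecutive pong messages that exceed the timeout. A rough sketch, assuming the timestamp format shown in the log lines above and a hypothetical file name:

from datetime import datetime

TIMEOUT_SECONDS = 30  # the server closes the connection if no pong arrives in time

previous_pong = None
with open("filtered.log", encoding="utf-8") as log:
    for line in log:
        # Pong lines look like: [2025-08-15T17:14:21.675195+07:00] - ... send {"Type": "Pong"}
        if 'send {"Type": "Pong"}' in line:
            ts = datetime.fromisoformat(line.split("]", 1)[0].lstrip("["))
            if previous_pong is not None:
                gap = (ts - previous_pong).total_seconds()
                if gap > TIMEOUT_SECONDS:
                    print(f"{gap:.1f}s without a pong, ending at {ts}")
            previous_pong = ts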
If you're connecting to the server through a proxy or firewall, please check with your IT support team to determine whether the disconnections are originating from those systems.
I’ve reached out to the product team to verify the log files.