Where can I check how much data I have extracted using API calls for the day so as to prevent myself from hitting the daily limit?

I have been checking the logs from the Configuration Manager, and the log text file does show my usage information. Can you please let me know how frequently the logs in the log file get updated? They do not seem to be refreshed dynamically in near real time.
Also, I would like to know if there is a better approach to find out how much data I have used and/or how many API calls I have made.
So there seems to be no direct way to get a data-usage-limit alert. I would like to know how frequently the logs are updated in the log file so that I can monitor my usage there. Can you please help me with this?
(Raising it on behalf of external client)
Answers
Thank you for reaching out to us.
To the best of my knowledge, there is currently no API available to check the usage limits for Workspace Desktop or Data Platform sessions.
For assistance with log files, please contact the product support team directly.
Hi @meenakshi.j ,
Currently, you can check the amount of your data usage with the steps below:
- Turn on the debug log in the Configuration Manager of the Workspace application: in the LSEG Workspace Configuration Manager, go to the Logs page in the Advanced section and set the Trace Level to Debug. (This trace level may impact system performance, so consider using it only when you need debug-level logging for troubleshooting.)
- Then click OK to apply this configuration and restart LSEG Workspace so that the changes take effect.
- Then, in your Python script, set the log level to debug (in the configuration file or in-line code) and monkey-patch the request method in the http_service module of the LSEG Data Library so that it prints the response headers:
# Monkey Patching
import lseg.data._core.session.http_service as http_service
from httpx import Response

# Save original method if needed
original_request = http_service.HTTPService.request

# Define your custom version
def custom_request(self, request):
    response: Response = self._client.send(request)
    print(f"HTTP Response id {request.id} {response.headers}")
    return response

# Monkey-patch the method
http_service.HTTPService.request = custom_request

# Set log level to debug and turn on printing the log into console (or file)
import lseg.data as ld

config = ld.get_config()
config.set_param("logs.transports.console.enabled", True)
# config.set_param("logs.transports.file.enabled", True)
config.set_param("logs.level", "debug")

ld.open_session()
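If you only want the limit-related values rather than the full header dump, the patched method can filter the headers before printing. The snippet below is just a sketch of that idea: the actual header names vary by endpoint, so the "limit"/"remaining" substring match is an assumption, and you should first check which headers your full console output actually contains.

# Variation of the patch above: print only headers whose names suggest a
# usage/limit value. The "limit"/"remaining" match is an assumption; adjust it
# to the header names you actually see in the full header dump.
import lseg.data._core.session.http_service as http_service

def custom_request_limits_only(self, request):
    response = self._client.send(request)
    limit_headers = {
        name: value
        for name, value in response.headers.items()
        if "limit" in name.lower() or "remaining" in name.lower()
    }
    if limit_headers:
        print(f"Usage/limit headers for request {request.id}: {limit_headers}")
    return response

http_service.HTTPService.request = custom_request_limits_only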
After that, when you run a Data Library function to retrieve data, the limit-remaining details carried in the HTTP response headers will appear in the console log (or log file, depending on how the configuration has been set).
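For example, any ordinary Data Library request made after the setup above goes through the patched method, so its response headers are printed alongside the returned data. The instrument and field below are only placeholders:

import lseg.data as ld

# Assumes the session from the snippet above is already open.
# "LSEG.L" and "TR.Revenue" are placeholder inputs; use your own universe and fields.
df = ld.get_data(universe=["LSEG.L"], fields=["TR.Revenue"])
print(df)

# The usage/limit details show up in the printed response headers and in the
# debug log, not in the returned DataFrame itself.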
More detail on the limit can be found inside the Refinitiv Workspace Logs folder: browse to the most recently created sub-folder named "Desktop.<date>.<time>.p<process-ID>", then check the file named node-sxs.<date>.p<process-ID>.log.
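If you need to find that file repeatedly, a small helper can locate the newest session folder and its node-sxs log. This is only a sketch: the logs_root path is a placeholder you must point at your own Refinitiv Workspace Logs folder, and the folder/file patterns simply follow the naming convention described above.

from pathlib import Path

# Placeholder: point this at your local Refinitiv Workspace Logs folder.
logs_root = Path(r"C:\path\to\Refinitiv Workspace Logs")

# Pick the newest "Desktop.<date>.<time>.p<process-ID>" sub-folder.
desktop_dirs = [p for p in logs_root.glob("Desktop.*") if p.is_dir()]
if desktop_dirs:
    latest_dir = max(desktop_dirs, key=lambda p: p.stat().st_mtime)
    # List the "node-sxs.<date>.p<process-ID>.log" file(s) inside it.
    for log_file in latest_dir.glob("node-sxs.*.log"):
        print(log_file)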
An article with more information and detail on this will be published by next month; I'll keep you posted.