About the Cleanup instruction

Hello everybody,
I noticed the "Cleanup" instruction at the end of the examples in the example application.
I presume I have to pass it all the objects created in the function I write. I tried to find documentation about it, but the search engine was not very helpful. Is this just to clean up memory, changing nothing on the server?
Best Answer
To clean up is to delete the extracted files, the schedule, the instrument list and the report template. The C# example app has a "Cleanup" example under "Scheduled Extractions" that shows how you could do that. Both the C# example app and the tutorials clean up after execution, so as not to leave any unused objects in your environment.
But cleaning up is not an obligation. If you created a recurring schedule, then you certainly don't want to delete its objects, because the schedule will require them the next time it runs!
Cleaning up or not depends on your workflow. If you want to reuse objects you created (be it the instrument list, the report template or the schedule), then you should not delete them!
Added later:
When you create objects like an instrument list, a report template, or a schedule, they are stored on the DSS server, and will stay there until you delete them. So the best practice is to delete them IF and WHEN you do not need them any more.
So just to be quite clear: the cleanup is not about garbage collection, process memory and such. It is just the action of deleting objects that are no longer required (instrument list, report template, schedule). The C# example app implemented that and named it "Cleanup", but it is not an API call, which is why it is not in the documentation.
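To illustrate what such a "Cleanup" step amounts to, here is a minimal Python sketch against the DSS REST API. This is an assumption-laden illustration, not the example app's actual C# code: the base URL, the OData-style resource paths and the `Token` authorization header follow the conventions the DSS REST API documents, but verify them against the REST API reference before use. The function and parameter names are my own.

```python
import urllib.request

# Assumed DSS REST base URL; confirm against the REST API reference.
BASE = "https://selectapi.datascope.refinitiv.com/RestApi/v1"

def resource_url(collection, object_id):
    # DSS addresses individual resources OData-style: Collection('Id')
    return f"{BASE}/Extractions/{collection}('{object_id}')"

def delete_request(collection, object_id, session_token):
    # Deleting a server-side object is a plain HTTP DELETE with the session token.
    return urllib.request.Request(
        resource_url(collection, object_id),
        method="DELETE",
        headers={"Authorization": f"Token {session_token}"},
    )

def cleanup(session_token, schedule_id, instrument_list_id, report_template_id):
    """Delete the three server-side objects an extraction example created."""
    # Delete the schedule first: it references the list and the template.
    for collection, object_id in [
        ("Schedules", schedule_id),
        ("InstrumentLists", instrument_list_id),
        ("ReportTemplates", report_template_id),
    ]:
        with urllib.request.urlopen(
            delete_request(collection, object_id, session_token)
        ):
            pass  # a successful delete returns 204 No Content
```

As discussed above, you would only call `cleanup` for one-off objects; for a recurring schedule, leave all three objects on the server so the next scheduled run can find them.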
Answers
Hello, thank you for your answer.
"But cleaning up is not an obligation. If you created a recurring schedule, then you certainly don't want to delete its objects, because the schedule will require them the next time it runs!"
Well, this is exactly why I wanted to verify whether the aim of Cleanup is just to clean up memory, changing nothing on the server.
Of course, the next time, I retrieve the schedule by its name (rather than by its id, since I have to create it again if it is not found).
Oh, thank you.
Sorry for the delay, I read it more carefully today.