I'm currently testing the Python API and it's looking really great so far, but I'd now like to take my testing to the next level and dynamically retrieve a large set of data to run through a model I'm using for the test. I'm hoping there is a good solution for dynamically getting data items; chances are I'm missing something here, so any pointers are highly appreciated.
The problem I'm encountering is two-fold:
1) For a larger test, it's too time-consuming to manually copy/paste each data item from the Eikon Data Item Browser in order to query the data from within my Python code. The data items of the StarMine Models filtered for the Equity asset class alone would take me an estimated 2 to 3 hours to copy into the code (based on timing my progress over several minutes of work), and there are thousands upon thousands of available data items, which is really great, but manual copy/paste is not a feasible way for me to test at that scale.
2) I wasn't able to find a bulk export option in either the Data Item Browser built into the scripting proxy or the one in Eikon Web -- an option I was hoping to find in one of them.
What would be really great, and I imagine this must exist in some form, is the ability to dynamically query all of the data fields from the Python API -- filtering by type, category, classification, etc., the way the Data Item Browser does -- but from within my code. Alternatively, being able to bulk-download the data item lists from the Eikon Data Item Browser interface into a CSV or some such machine-readable format would work just as well.
I've looked through the entire API and didn't see an obvious, direct way to do this, but I'm guessing something must be exposed for it, since it's all already right there in Eikon.
Would I need to use eikon.send_json_request() for something like this? If so, how would I go about determining the data structure to send in, or the options to query?
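For context, here is roughly what a raw send_json_request() call looks like. The function does exist in the eikon package as a low-level escape hatch, but the entity name and payload schema below are assumptions modeled on the DataGrid requests the library emits in debug mode -- they are not a documented field-search endpoint, which is exactly what I can't find.

```python
import json

# Hypothetical payload: this mirrors the shape of DataGrid requests the
# eikon library logs in debug mode. The exact schema is an assumption
# and would need to be confirmed against the scripting proxy's logs.
payload = {
    "requests": [
        {
            "instruments": ["IBM.N"],
            "fields": [{"name": "TR.Revenue"}],
        }
    ]
}

# With a live session it would be sent along these lines:
#   import eikon as ek
#   ek.set_app_key("YOUR_APP_KEY")  # key from the App Key Generator
#   response = ek.send_json_request("DataGrid", payload)
print(json.dumps(payload, indent=2))
```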
Or is there perhaps a way to bulk export data from the Data Item Browser that I've somehow missed?
Update: One hacky temporary workaround I just thought of, which seems like it would work: since Eikon Web dynamically generates HTML, I can manually save the page as HTML and then write some code that reads it with a DOM parser and extracts the available data items. But this feels pretty kludgy and like a pain, and there must be a better way.
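To make the workaround concrete, here is a minimal sketch using only the standard library. The "data-item" class name and the HTML fragment are made-up stand-ins -- in practice you'd inspect the saved page to find whatever markers the Data Item Browser actually uses.

```python
from html.parser import HTMLParser

class DataItemExtractor(HTMLParser):
    """Collects the text content of elements tagged with a given CSS class."""

    def __init__(self, marker_class="data-item"):
        super().__init__()
        self.marker_class = marker_class
        self._capturing = False
        self.items = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; check the class attribute.
        classes = dict(attrs).get("class", "").split()
        if self.marker_class in classes:
            self._capturing = True

    def handle_endtag(self, tag):
        self._capturing = False

    def handle_data(self, data):
        if self._capturing and data.strip():
            self.items.append(data.strip())

# Hypothetical fragment standing in for the saved Data Item Browser page.
saved_html = (
    '<ul>'
    '<li class="data-item">TR.Revenue</li>'
    '<li class="data-item">TR.EPSMean</li>'
    '</ul>'
)
parser = DataItemExtractor()
parser.feed(saved_html)
print(parser.items)  # -> ['TR.Revenue', 'TR.EPSMean']
```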
Note: I'm currently running the data item browser from Eikon web on Linux.
Thanks in advance for your help!
This is a great idea. We should definitely have an API to query data items. Most likely we wouldn't want to expose the entire database/metadata for a bulk download, as it would be a very large request. Instead we'd take a query approach: given a string (i.e. "revenue"), return all the items that match it; or given a set of metadata values (i.e. real_time=True AND asset_class='FX'), return the matching items.
Then a second API request would get more info per data item, i.e. get_fld_info("TR.Revenue") -- or a list of fields -- to retrieve all the surrounding metadata.
This is basically exposing the two APIs behind the Data Item Browser.
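Purely to illustrate the proposed shape (none of this exists in the API today), the two calls might look something like the following, mocked here against a tiny hard-coded stand-in catalogue:

```python
# Hypothetical stand-in for the field metadata catalogue; field names
# and metadata keys are illustrative only.
CATALOGUE = {
    "TR.Revenue": {"asset_class": "Equity", "real_time": False},
    "TR.EPSMean": {"asset_class": "Equity", "real_time": False},
    "BID": {"asset_class": "FX", "real_time": True},
}

def search_fields(text="", **filters):
    """Proposed call 1: match fields by substring and/or metadata values."""
    hits = []
    for name, meta in CATALOGUE.items():
        if text.lower() in name.lower() and all(
            meta.get(key) == value for key, value in filters.items()
        ):
            hits.append(name)
    return hits

def get_fld_info(names):
    """Proposed call 2: return the metadata for one field or a list of fields."""
    if isinstance(names, str):
        names = [names]
    return {name: CATALOGUE[name] for name in names}

print(search_fields("revenue"))                          # -> ['TR.Revenue']
print(search_fields(real_time=True, asset_class="FX"))   # -> ['BID']
print(get_fld_info("TR.Revenue"))
```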
Would this work for you? Or do you have a better example of what you are trying to achieve?
Regarding the Linux API/Web access -- we are working on a proxy solution that will allow simultaneous access to both Web and API, so it doesn't log you out. I will update you as soon as we have a beta version of this.