
Workspace API -> Docker Container Application

Hi,

We are a Workspace user and have developed a series of dashboards using Python / Streamlit library.

When trying to containerize our app with Docker, we face issues: the Docker image is unable to detect that Workspace is open and running.

1. What would be the best practice for consuming real-time streaming data with Refinitiv tools (so that we can integrate it into our application that is containerized with Docker)? Or what alternative method would achieve our objectives?

2. Is there a limit on the size of the instrument list when opening a data stream? If so, what would be the workaround if we have a huge list of instruments that we’d like to track, where we just need the last price, or last available traded price, only?

Thanks!

Tags: workspace-data-api, #technology, datastream-api, docker

Accepted

@ajaber

If you use the Refinitiv Data Library for Python in a Docker container, you can change the base-url configuration of the desktop.workspace session to "http://host.docker.internal:9000" in the RD library configuration file (refinitiv-data.config.json). However, you need to know the TCP port used by the API proxy server; the default port is 9000. For example:

    {
        "sessions": {
            "desktop": {
                "workspace": {
                    "app-key": "<app-key>",
                    "base-url": "http://host.docker.internal:9000"
                }
            }
        }
    }
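With that configuration in place, the code inside the container stays the same as outside it. A minimal sketch, assuming the configuration file above sits in the working directory and Workspace is running on the Docker host:

```python
import refinitiv.data as rd

# Opens the desktop.workspace session defined in refinitiv-data.config.json;
# the base-url there points the library at the API proxy on the Docker host.
rd.open_session("desktop.workspace")

# Ordinary data calls then work as they would on the host machine.
df = rd.get_data(universe=["EUR="], fields=["BID", "ASK"])
print(df)

rd.close_session()
```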



Thank you @Jirapongse. Does that mean that we can use the desktop session even in Docker?
Yes, I tested this setting with the rd.get_data and rd.get_history methods in Docker and it works.
Thanks, I’ll give it a shot and keep you posted. Question: what’s the difference then between open_streams and OMM? What would be the use case for the latter?

Hello @ajaber

### Docker ###

I am assuming you are using the Workspace SDK or the Refinitiv Data Library for Python with a "Desktop Session"; please correct me if I am wrong. If so, please be informed that the Desktop session needs Refinitiv Workspace/Eikon running on the same machine as the API proxy. Workspace/Eikon does not support Docker, so I do not think the Desktop session application can be containerized.


Alternatively, the Refinitiv Data Library for Python supports the "Platform Session", which connects to and consumes data from the Refinitiv Data Platform (RDP) and Real-Time Optimized (RTO). It does not require Workspace/Eikon to run on the same machine.
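As a sketch, the matching session block in refinitiv-data.config.json might look like the following (the placeholder values are assumptions; the real machine ID, password, and App Key come with your RDP account):

```json
{
    "sessions": {
        "platform": {
            "rdp": {
                "app-key": "<app-key>",
                "username": "<machine-id>",
                "password": "<password>"
            }
        }
    }
}
```

Opening the session then uses the platform session name, e.g. rd.open_session("platform.rdp"), instead of the desktop one.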

You can find the example and tutorials for Streaming on the GitHub repository page.

### DataStream ###

There is a document about the limitations on the DSWS user stats and limits page.

You may need to split the instrument list into smaller batches for each request.
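The batching itself is plain Python; a small sketch (the 50-instrument batch size is an arbitrary placeholder, not a documented DSWS limit):

```python
def chunked(instruments, size):
    """Yield successive batches of at most `size` instruments."""
    for i in range(0, len(instruments), size):
        yield instruments[i:i + size]

# e.g. turn one huge list into several smaller requests
rics = ["RIC{}".format(n) for n in range(230)]
batches = list(chunked(rics, 50))  # 4 batches of 50 plus one of 30
```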


@wasin.w Yes, Workspace is running on the same machine; we are aware of this requirement.

Hi @ajaber

Additionally, regarding the Refinitiv Data Library for Python "Platform Session": it requires an RDP credentials account (username, password, and App Key), which has a different permission set from the Desktop session's App Key.

You may need to contact your Refinitiv representative/Account Manager to help you with the RDP access.


Thank you @wasin.w. I managed to create a new App Key from Workspace and enabled the RDP API. After that, we tried the OMM stream to get CF_LAST, but that field is not even present.

Hi @ajaber

Unfortunately, the CF_XXX fields are available via Eikon/Workspace only.

@wasin.w and the alternative?

@wasin.w @Jirapongse

What’s the difference here between OMM and open_streams? And what are the limits?


Our ultimate objective is to be able to get the following fields:


- trading status: i.e. is the stock open for trading, or is the market closed

- last traded price

- if the last traded price is null, then we need the last available price and its respective date

- last bid

- last ask

- volume

- close price or official close price at a predetermined date.


From the above what we need in real-time stream are:

- last traded price

- last bid

- last ask

- volume


Would appreciate your help on this please.



Hello @ajaber

### OMM Stream and open_stream ###

Could you please clarify "open_streams"? I cannot find this class or method anywhere in the https://github.com/Refinitiv-API-Samples/Example.DataLibrary.Python GitHub repository.

The omm_stream interface is the Delivery layer interface for consuming real-time data from Refinitiv Real-Time Optimized (RTO - cloud hosted) and Refinitiv Real-Time Distribution System (deployed RTDS). It supports Refinitiv Real-Time data domains such as Market Price, Market By Price, etc.

Once you create the omm_stream object, you can call the <omm_stream obj>.open() method to start the streaming data request flow.

### Real-Time fields ###

About the real-time fields, you can try these fields:

  • last traded price: TRDPRC_1 "Last trade price or value."
  • last bid: CLOSE_BID "Last bid price of the day."
  • last ask: CLOSE_ASK "Last ask price of the day."
  • volume: ACVOL_1 "Today's total trading volume."

However, I admit that I am not the field/content expert. The field definition/behavior of each RIC also differs based on its exchange. I highly recommend you contact the Content support team directly to help you with the field definitions. You can contact the team via the https://my.refinitiv.com/content/mytr/en/helpandsupport.html website.
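A minimal omm_stream sketch requesting those candidate fields (assumes a session can be opened per your configuration; the callback wiring follows the Delivery layer examples on the GitHub repository):

```python
import refinitiv.data as rd
from refinitiv.data.delivery import omm_stream

rd.open_session()  # desktop or platform session, per your configuration

stream = omm_stream.Definition(
    name="EUR=",
    fields=["TRDPRC_1", "CLOSE_BID", "CLOSE_ASK", "ACVOL_1"],
    domain="MarketPrice",
).get_stream()

# Called once with the initial image, then on every field update.
stream.on_refresh(lambda message, s: print("Refresh:", message))
stream.on_update(lambda message, s: print("Update:", message))

stream.open()
```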

content-questions-2.png




@wasin.w thank you for the clarification. To answer your question above:

1. By open_streams, I meant Price Streaming with Data Frame.

2. Will test the fields above and revert back.

Separately, we continue to have an issue accessing Refinitiv Workspace in Docker. Despite having the “base-url” configured with the right port number in refinitiv-data.config.json as @Jirapongse suggested, the session still fails to open. A few points from the debugging process:

- The port number for Workspace identified on the local machine is 9015, and Workspace is confirmed to be running well (accessing http://localhost:9015/api/status gives a status of “ST_PROXY_READY”).


1693555959004.png

- Based on the log in DEBUG mode, the desktop session is initially opened with port 9015. However, there is a warning stating that the file “.portInUse” was not found; the library then falls back to the default port 9000, and the desktop session fails to open.

1693555996404.png

@ajaber

Yes, this is a problem when using the RD Library in a Docker container. You need to know the TCP port used by the Eikon API Proxy on the host machine.

The base URL in a Docker container must be host.docker.internal, not localhost.
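For an API proxy listening on port 9015 (as identified on the machine above), the non-default port has to be spelled out in the session block, e.g.:

```json
"desktop": {
    "workspace": {
        "app-key": "<app-key>",
        "base-url": "http://host.docker.internal:9015"
    }
}
```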

@Jirapongse yes, the base URL is correct. Below is the log; any thoughts, please?

1694461364544.png


1694461398413.png (153.3 KiB)

Hello @ajaber

Thank you for the information. The "rd.content.pricing.Definition" interface in the EX-2.02.02-Pricing-StreamingWithDataFrame.ipynb example is a Content layer interface that is optimized for Market Price data only.

The content layer refers to logical market data objects, largely representing financial items, such as level 1 market data prices and quotes, order books, news, historical pricing, company research data and so on. These objects are built on top of the delivery abstraction layer and provide value-added capabilities to manage and access the content within the interface.

The omm_stream interface is a Delivery layer interface that supports any data domain (Market Price, Market By Price, Market By Order, etc.).

The delivery layer implements classes that can be used to interact with the different delivery modes provided by the Refinitiv Data Platform to retrieve data:

  • Request (HTTP Request/Response)
  • Stream (WebSockets)
  • Queue (Alerts)
  • Files (bulk)

For each of these delivery modes, defined classes can be used to retrieve data from the platform in raw JSON format.

Classes defined in the delivery layer are not dependent on any specific data service. In other words, they are service agnostic. Designed as a low abstraction layer of the library, the delivery layer targets developers who need specific features that are not offered by other higher level abstraction layers (content and access).

refinitiv-data-platform-libraries-layers.png

You can find more detail about the RD Library layers in this https://cdn.refinitiv.com/public/rd-lib-python-doc/1.0.0.0/book/en/sections/concepts-and-design.html document.





(Cont) @ajaber

Regarding the difference between the "rd.content.pricing.Definition" and "omm_stream" interfaces: omm_stream supports requests in any data domain (level 1, level 2, etc.), while pricing supports level 1 market data only.
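As a sketch of the Content layer usage (assumes a session can be opened; pricing.Definition builds the level 1 stream, and get_snapshot() returns the current values as a DataFrame):

```python
import refinitiv.data as rd
from refinitiv.data.content import pricing

rd.open_session()

stream = pricing.Definition(
    universe=["EUR=", "GBP="],
    fields=["BID", "ASK", "TRDPRC_1"],
).get_stream()

stream.open()
# Content layer convenience: the current field values as a pandas DataFrame.
print(stream.get_snapshot())
stream.close()
```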

You can use the Python help() command to check the class detail as follows:

pricing.png

omm.png


