Microsecond and nanosecond precision timestamp support in RFA 7.6

Is the microsecond and nanosecond precision timestamp only available in RFA 8?
In the RFA 8.0 README I see:
Time and DateTime Microsecond and Nanosecond Support
----------------------------------------------------
This release contains Time and DateTime enhancements to
support microsecond and nanoseconds granularity.
Note: Applications based on RWF Major Version 14 and Minor Version 0 do
not have the ability to recognize microsecond and nanosecond precision time
values. If receiving content encoded with this larger precision,
the older versions of RFA's decoder will result in an exception.
Applications can be coded to handle the exception and continue decoding
the remaining content in the message.
Our application uses RFA 7.6. Does that mean it will not be able to get any
microsecond or nanosecond precision time values, and will get an exception
instead? If our application calls DataBuffer::getAsString(), will we still get
an exception? What is the recommended way to handle such fields in an RFA 7
application?
Best Answer
-
Microsecond and nanosecond support in Time and DateTime fields is currently
available only in RFA 8.x. When an RFA 7.6.x or older application tries to
decode a field published with microsecond or nanosecond precision
(i.e. via DataBuffer::getTime(), DataBuffer::getAsString(), or
DataBuffer::isBlank()), RFA throws an invalid usage exception. Below is an
example of the exception in version 7.6.2:
Exception Type: InvalidUsageException
Exception Severity: Error
Exception Classification: IncorrectAPIUsage
Exception Status Text: Data decoding failed in DataBuffer::getTime(); Reason: RSSL_RET_INCOMPLETE_DATA
An RFA 7.x application needs to handle the exception and continue decoding the
remaining content in the message (i.e. the next field entry). This means the
application will not be able to receive any value from that field, so we would
recommend upgrading to RFA 8.x. That said, if no nanosecond or microsecond
precision is published in the Time and DateTime field, RFA 7.x will
decode the TIME and DATETIME data types properly.
Answers
-
My understanding is that in order to convert to string, the API still needs to decode the value before it can convert.
Therefore, from a programming perspective you will either need to upgrade to the v8 API or amend your code to handle the exceptions - if your code does not just decode a predetermined set of fields, i.e. there is a chance it will attempt to decode one of the newer fields.
From the perspective of your Market Data team, they should be provided with some workarounds they can implement at the TREP level. e.g. having a dedicated set of ADS servers for legacy API applications - where the ADS filters out / does not send out the newer fields. They should contact their technical account manager or the TREP helpdesk for further guidance.
-
It may be interesting to note that, in order to preserve compatibility with
existing applications, the microsecond and nanosecond precision time values
have only been introduced in new fields. These fields have an acronym with a
suffix of “_NS” (like “ASK_TIM_NS”). This means that applications that do not
try to decode these new “_NS” fields will not experience the
InvalidUsageException.
Now the question is: how does your application decode the fields it receives?
- If it iterates through the whole field list and tries to decode each field
whatever its name is, then it will probably experience the
InvalidUsageException. To fix this issue you will have either to change your
code to handle the exception (as described above by Veerapath) or to upgrade
your application to RFA 8 (recommended solution).
- If your application iterates through the field list but only decodes the
fields it knows (either because it recognizes them by acronym or by field ID),
then there is no chance your application tries to decode any of the new “_NS”
fields. In that case, you don’t have to worry about the exception and you can
keep your RFA version. Obviously, you will not benefit from the new
microsecond and nanosecond precision.