Incorrect Daylight Saving Time adjustment of historical Hourly summaries in NYSE equities
When requesting Hourly bars for NYSE equities, we have noticed that the raw UTC time stamps are incorrectly adjusted for Daylight Saving Time. The global metadata provides definitions for all existing time zones, including the UTC offset, DST offset, and DST start and end dates, but only for the current year. NYC is defined with a UTC offset of -300 and a DST offset of -240.
When we convert all the data records from UTC to exchange time for display in the chart, we use the relevant time zone definition to determine when to apply the DST offset and when not to. Hourly data goes back a year for NYSE equities, so we get summarized data back to June 2023. Because the DST definition does not cover last year, the code cannot correctly adjust last year's data for DST.
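To make the failure mode concrete, here is a minimal Python sketch of a conversion that relies only on a single-year DST window, as the metadata provides. The dictionary layout, field names, and the `to_exchange_time` helper are illustrative assumptions, not the actual library code:

```python
from datetime import datetime, timedelta

# Hypothetical single-year time zone definition, mirroring what the
# global metadata delivers (structure and names are assumptions).
NYC_TZ = {
    "utc_offset_min": -300,                 # EST
    "dst_offset_min": -240,                 # EDT
    "dst_start": datetime(2024, 3, 10, 7),  # 02:00 EST expressed in UTC
    "dst_end": datetime(2024, 11, 3, 6),    # 02:00 EDT expressed in UTC
}

def to_exchange_time(utc_ts: datetime, tz: dict) -> datetime:
    """Convert a UTC bar timestamp to exchange time using only the
    current year's DST window. This reproduces the reported bug for
    timestamps inside *last* year's DST window."""
    if tz["dst_start"] <= utc_ts < tz["dst_end"]:
        offset = tz["dst_offset_min"]
    else:
        offset = tz["utc_offset_min"]
    return utc_ts + timedelta(minutes=offset)

# A June 2023 bar falls outside the 2024 window, so it is shifted by
# -300 instead of the -240 that was actually in force: off by an hour.
bar = datetime(2023, 6, 5, 13, 30)
print(to_exchange_time(bar, NYC_TZ))  # 2023-06-05 08:30 (should be 09:30)
```

The same helper handles current-year data correctly, which is why the defect only shows up once requests reach back past the current year's DST start.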
Here is a concrete example.
The NYSE trades from 0930 to 1602 local time, so every trading day there are 8 hourly bars stamped (start-of-period) 0900, 1000, 1100, 1200, 1300, 1400, 1500, 1600. The data we receive from the Summaries API is stamped with UTC times, so when DST is in effect (records between 3/10/2024 and 11/3/2024) the bars should be stamped +240: 1300, 1400, 1500, 1600, 1700, 1800, 1900, 2000. So far so good. Prior to 3/10/2024 the UTC bars we receive should be stamped +300: 1400, 1500, 1600, 1700, 1800, 1900, 2000, 2100. Also good. The problem arises when the data crosses into last year's DST period: bars prior to 11/5/2023 are stamped with the DST offset again, so we receive 1300-2000. When we convert to local time, we know nothing about last year's DST definition, so we adjust using the standard offset of -300 and the time stamps all come out an hour off.
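For comparison, a conversion that carries the full historical DST rules (rather than a single-year window) recovers the correct local time for all three periods above. This is a sketch using Python's standard `zoneinfo` database, not the library's actual conversion path:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib, Python 3.9+

NY = ZoneInfo("America/New_York")

def utc_bar_to_local(y, m, d, hh, mm):
    """Convert a UTC-stamped bar to NYSE local time; zoneinfo applies
    the DST rules that were in force on that specific date."""
    return datetime(y, m, d, hh, mm, tzinfo=timezone.utc).astimezone(NY)

summer = utc_bar_to_local(2024, 6, 3, 13, 30)   # 2024 DST: UTC-4
winter = utc_bar_to_local(2023, 12, 4, 14, 30)  # standard time: UTC-5
last_dst = utc_bar_to_local(2023, 6, 5, 13, 30) # 2023 DST: UTC-4

print(summer.strftime("%H:%M %z"))    # 09:30 -0400
print(winter.strftime("%H:%M %z"))    # 09:30 -0500
print(last_dst.strftime("%H:%M %z"))  # 09:30 -0400
```

All three bars land on the 0930 open because each date is converted with the offset that actually applied on that date, including last year's DST window.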
The UTC data we receive should only be adjusted for DST within the current year, as defined by the global metadata time zone definition. Hourly data prior to the current year's DST period should not be DST-adjusted. This is how we receive it from TSI in Eikon.
Best Answer
Hello @cory.schmidt.1
After investigating your question, it seems that this is a content issue.
On the library side, we request historical data from the backend as is and do not process it further, unlike streaming updates.
For your convenience I have submitted a ticket on your behalf with the number 13630510.
If you identify further content issues related to historical data, feel free to submit a ticket yourself.
Best regards,
Baciu Wahl Cristian