Cotton Continuation contracts pulling CRT_MNTH and EXPIR_DATE

Hi,
I have the code below:
import lseg.data as ld
import pandas as pd
import time
from datetime import datetime, timedelta
from dateutil.relativedelta import relativedelta

# ------------------------------------------------------------------------------
# 1. Open LSEG Data session
# ------------------------------------------------------------------------------
print("Opening LSEG Data session...")
ld.open_session()
print("Session opened.\n")

# ------------------------------------------------------------------------------
# 2. Define RICs and requested fields
# ------------------------------------------------------------------------------
continuation_rics = [
    "CTc1", "CTc2", "CTc3", "CTc4", "CTc5",
    "CTc6", "CTc7", "CTc8", "CTc9", "CTc10"
]
requested_fields = []

# ------------------------------------------------------------------------------
# 3. Define date range and chunking logic
# ------------------------------------------------------------------------------
start_date_str = "2000-01-01"
end_date_str = "2024-03-02"
start_date = datetime.strptime(start_date_str, "%Y-%m-%d")
end_date = datetime.strptime(end_date_str, "%Y-%m-%d")
print(f"Date range set from {start_date_str} to {end_date_str}.\n")

chunk_years = 1  # Chunk size: 1 year at a time
current_start = start_date
all_chunks = []

# ------------------------------------------------------------------------------
# 4. Fetch data in chunks and log timing
# ------------------------------------------------------------------------------
while current_start <= end_date:
    # Determine chunk end (inclusive)
    chunk_end = current_start + relativedelta(years=chunk_years) - timedelta(days=1)
    if chunk_end > end_date:
        chunk_end = end_date

    chunk_start_str = current_start.strftime("%Y-%m-%d")
    chunk_end_str = chunk_end.strftime("%Y-%m-%d")
    print(f"Fetching data from {chunk_start_str} to {chunk_end_str}...")

    chunk_start_time = time.time()
    df_chunk = ld.get_history(
        continuation_rics,
        requested_fields,
        start=chunk_start_str,
        end=chunk_end_str,
        interval="1d"  # Assuming daily data intervals; adjust if needed.
    )
    chunk_end_time = time.time()
    print(f"Chunk fetched in {chunk_end_time - chunk_start_time:.2f} seconds.\n")

    if df_chunk is not None and not df_chunk.empty:
        all_chunks.append(df_chunk)
    else:
        print("No data returned for this chunk.")

    # Move to the next chunk (day after the current chunk_end)
    current_start = chunk_end + timedelta(days=1)

# ------------------------------------------------------------------------------
# 5. Combine the chunks and flatten multi-index columns if necessary
# ------------------------------------------------------------------------------
if all_chunks:
    full_df = pd.concat(all_chunks)
    full_df.drop_duplicates(inplace=True)
    print(f"Combined data shape after concatenating chunks: {full_df.shape}")
else:
    full_df = pd.DataFrame()
    print("No data retrieved from any chunks.")

if full_df.empty:
    print("No data in the final DataFrame. Exiting.")
else:
    # If the DataFrame has multi-index columns, flatten them
    if isinstance(full_df.columns, pd.MultiIndex):
        full_df.columns = [
            f"{col[0]}_{col[1]}" if col[1] else str(col[0])
            for col in full_df.columns
        ]
        print("Flattened multi-index columns.\n")

# ------------------------------------------------------------------------------
# 6. Log missing fields for specific instruments (CTc3 to CTc10)
# ------------------------------------------------------------------------------
# Check for each instrument whether CRT_MNTH and EXPIR_DATE are present.
missing_info_instruments = []
for ric in continuation_rics:
    col_crt = f"CRT_MNTH_{ric}"
    col_exp = f"EXPIR_DATE_{ric}"
    if col_crt not in full_df.columns or col_exp not in full_df.columns:
        missing_info_instruments.append(ric)

if missing_info_instruments:
    print("Note: The following instruments did not return data for CRT_MNTH and/or EXPIR_DATE:")
    print(missing_info_instruments)
else:
    print("All instruments returned CRT_MNTH and EXPIR_DATE data.\n")
My issue is that some fields, such as CRT_MNTH and EXPIR_DATE, return no data prior to about 2022 for all the continuous contracts. What do I need to change in the code?
Could you please provide the fix?
Answers
Hi Dev Team, I am Kevin from the Frontline Helpdesk, currently assisting the user. Our content team informed us that the CRT_MNTH and EXPIR_DATE fields only started returning data at the start of 2021.
Is it possible to facilitate an MS Teams meeting with the user and help the user fix the issue?
Thank you for reaching out to us.
This could be a content question. I ran the code and got the following data for the CRT_MNTH and EXPIR_DATE fields when requesting historical data from 2000-01-01 to 2024-03-02.
RIC     CRT_MNTH         EXPIR_DATE
CTc1    202207-202403    20000309-20240306
CTc2    202210-202405    20000508-20240508
CTc3    202212-202407    20220309-20240709
CTc4    202303-202410    20220506-20241009
CTc5    202305-202412    20220707-20241206
CTc6    202307-202503    20221007-20250507
CTc7    202310-202505    20221207-20250507
CTc8    202312-202507    20230309-20250709
CTc9    202403-202510    20230508-20251009
CTc10   202405-202512    20230707-20251208
Please contact the Historical Pricing API support team directly via MyAccount to verify the content.
@Jirapongse
CRT_MNTH and EXPIR_DATE data is only available for the continuous RICs from 2021 onwards. There must be a way to get these fields; on Bloomberg this data is available.
For CT1 and CT2, I believe you can get data for these two fields from the year 2000; however, you can't for CT3-CT10.
How do we fix this?
@Jirapongse
please see this code

import lseg.data as ld
import pandas as pd

ld.open_session()

continuous_rics = [
    "CTc1", "CTc2", "CTc3", "CTc4", "CTc5",
    "CTc6", "CTc7", "CTc8", "CTc9", "CTc10"
]

df_continuous = ld.get_history(continuous_rics, 'EXPIR_DATE',
                               start='01-Jan-2020', end='13-Mar-2025', interval='daily')
print(df_continuous)
For content availability type questions, I recommend you raise a ticket at MyAccount.
@Jirapongse
When you run that code, you will see NAs for EXPIR_DATE and CRT_MNTH. Is there a workaround to get these fields?
@Jirapongse In the meantime, is it possible for you to create a workaround?
The library just retrieves data from the endpoint. In this case, the data is from the historical pricing API.
To verify the content, please contact the helpdesk team directly via MyAccount.
@Jirapongse I have already verified the content with the helpdesk. Prior to 2021 there isn't data for the continuation RICs for EXPIR_DATE and CRT_MNTH. However, I believe there could be a way for you to build a script to work around this?
If the data is not available in the historical database, I don't think we can modify the script to retrieve it with the get_history method.
You may check with the helpdesk whether this data is available in Workspace or Workspace Excel. It may also be available via the Search API.
OK, how can we do it via the Search API, please?
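One possible direction with the Search API, offered only as a sketch: the view, the OData-style filter syntax, and the property names below (RicRoot, AssetCategory, ExpiryDate) are assumptions to verify against the Search API metadata, not a confirmed recipe.

```python
# The filter builder below is plain Python; the actual search call needs the
# lseg-data package and a configured session, so it is left commented out.

def cotton_futures_filter(ric_root: str = "CT") -> str:
    # Build an OData-style Search filter for futures on the given RIC root.
    # "RicRoot" and "AssetCategory" are assumed property names -- check them
    # against the Search metadata before relying on this.
    return f"RicRoot eq '{ric_root}' and AssetCategory eq 'Future'"

# import lseg.data as ld
# ld.open_session()
# response = ld.discovery.search(
#     view=ld.discovery.Views.SEARCH_ALL,  # a futures-specific view may fit better
#     filter=cotton_futures_filter(),
#     select="RIC,DocumentTitle,ExpiryDate",
#     top=100,
# )
# print(response)

print(cotton_futures_filter())
```

If the ExpiryDate property is populated for the individual cotton contracts, the search results could be joined back onto the continuation history by contract.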
It's not available in Workspace Excel.
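For readers who hit the same gap: since the fields are simply absent before 2021, one local workaround sketch (not LSEG-confirmed) is to rebuild the cotton contract-month calendar in plain Python. ICE Cotton No. 2 trades March (H), May (K), July (N), October (V) and December (Z) contracts, so the sequence behind CTc1...CTc10 can be approximated; the exact roll timing of the continuation RICs is simplified here.

```python
from datetime import date

# ICE Cotton No. 2 delivery months (public schedule) with the standard
# futures month codes.
COTTON_MONTHS = {3: "H", 5: "K", 7: "N", 10: "V", 12: "Z"}

def contract_sequence(as_of: date, n: int):
    """Return the next n cotton contract (year, month, code) tuples on or
    after as_of. Mapping position k to CTc<k> ignores the continuation
    RICs' exact roll rule, so treat the result as an approximation."""
    out = []
    year = as_of.year
    while len(out) < n:
        for month, code in sorted(COTTON_MONTHS.items()):
            # Skip months that fall before the as-of date.
            if (year, month) < (as_of.year, as_of.month):
                continue
            out.append((year, month, code))
            if len(out) == n:
                break
        year += 1
    return out

print(contract_sequence(date(2020, 1, 15), 3))
```

From the approximate schedule, the individual contract RICs (including expired ones) could then be requested one by one for their own EXPIR_DATE history; the RIC convention for expired cotton contracts should be confirmed with the helpdesk before automating this.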