For a deeper look into our Eikon Data API, see:

Overview |  Quickstart |  Documentation |  Downloads |  Tutorials |  Articles


Eikon Data API: Error code 500

Hi, I have been using the Eikon Data API for more than a year and it worked well until recently. I'm getting the following error message: Error code 500 | Server Error: Internal Server Error - {"code":500,"message":"connect ETIMEDOUT 159.220.40.41:443","statusMessage":"Internal Server Error"}

Can you help me? I have already checked with the IT team to see if there was a problem with the proxy, and I have updated the Python package.

eikon, eikon-data-api, python, workspace, refinitiv-dataplatform-eikon, workspace-data-api, error-500

27 Answers

Accepted

@victor.koyama Hi - I have checked with one of my colleagues and a similar customer issue was resolved by downgrading Eikon to the previous version (4.0.51). It's probably worth trying this here:

Can you or your IT team try the following steps:

1. Close all Office applications (except Outlook) and all Eikon processes. You might need to check Task Manager to make sure no Eikon or Office processes are running.

2. Open a command prompt.

3. Go to the directory C:\Program Files (x86)\Thomson Reuters\Eikon (cd C:\Program Files (x86)\Thomson Reuters\Eikon).

4. Enter the command Eikon.exe -rollback, or roll back silently with Eikon.exe -rollbacksilent.

5. Wait until the rollback process finishes in the UI or the Eikon.exe process disappears from Task Manager.

After this, you will also need to stop Eikon automatic updates until we can establish what the issue with Eikon 4.0.52 is.

Then please test your script again.

I hope this can help.


@jason.ramchandani Hello Jason, I tried the rollback approach described above, but it does not roll back to version 4.0.51; it only rolls back to either .52 or .53.

The Refinitiv rollback window says Refinitiv Eikon Desktop 4.0.52 has been installed.

I tried rolling back several times, to no avail.

What should I do?

@alexander.yermolayev a similar issue was resolved by:

opening firewall access to:

emea1.apps.cp.thomsonreuters.com 159.220.1.19
amers1.apps.cp.thomsonreuters.com 159.220.40.41
apac1.apps.cp.thomsonreuters.com 159.220.16.215

for the API application. I hope this can help - I posted this to your other question here as well.
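As a quick check before touching firewall rules, the three endpoints listed above can be probed from Python with only the standard library (a sketch; `resolve` is an illustrative helper that returns None when a name does not resolve):

```python
import socket

def resolve(host):
    """Return the IPv4 address a hostname resolves to, or None if it does not resolve."""
    try:
        return socket.gethostbyname(host)
    except socket.gaierror:
        return None

# the three application endpoints listed above
for host in ("emea1.apps.cp.thomsonreuters.com",
             "amers1.apps.cp.thomsonreuters.com",
             "apac1.apps.cp.thomsonreuters.com"):
    print(host, resolve(host))
```

If a host resolves but API calls still time out, the block is more likely at the firewall or proxy layer than at DNS.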


@victor.koyama Please could you provide the following: Eikon Desktop version, Eikon library version (ek.__version__), httpx library version and nest-asyncio library version (you can get these from a pip freeze), and also which version of Python you are using.

Please also include the API call code that is failing.

You might try downgrading the httpx library to 0.14.2 (pip install httpx==0.14.2) and nest-asyncio to 1.3.3 (pip install nest-asyncio==1.3.3), then try again to see if this helps at all.

If it does not, please follow the steps in our detailed troubleshooting guide, including enabling the logfiles. This will help us analyse the issue further.

Also have there been any changes to your firewall recently?

I hope this can help.
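The versions asked for above can also be collected in one go from Python rather than picking them out of a pip freeze (a sketch; `report_versions` is an illustrative helper, and importlib.metadata requires Python 3.8+, so on older interpreters pip freeze remains the way to go):

```python
import sys
import importlib.metadata as md  # stdlib in Python 3.8+

def report_versions(packages=("eikon", "httpx", "nest-asyncio")):
    """Return the installed version of each package, or None if it is missing."""
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = md.version(pkg)
        except md.PackageNotFoundError:
            versions[pkg] = None  # not installed in this environment
    versions["python"] = sys.version.split()[0]
    return versions

print(report_versions())
```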



Hi,


The Eikon Desktop version is 4.0.52 (4.0.52055).

How can I find the Eikon library version, the httpx library version and the nest-asyncio library version?

The Python version I have is 3.7.


I was able to downgrade httpx using the command (pip install httpx==0.14.2). However, for the asyncio one, (pip install nest-aynchio==1.3.3) gives this response: ERROR: Could not find a version that satisfies the requirement. No matching distribution found for nest-aynchio==1.3.3.

It continues with the same error code 500.


There hasn't been any change to the firewall either.


Thank you



Please also check the details on this thread - a similar issue turned out to be nest-asyncio. Hence I ask you to do a pip freeze from the command line and tell me the versions of both the eikon library and nest-asyncio. See my screenshot below:

Please also include the API call that is failing. I await your response.


1607013599128.png (203.0 KiB)


Please try updating your version of pip:

(Mac/Unix)

python -m pip install -U pip

Windows

py -m pip install -U pip


You can check the release history for nest-asyncio to confirm 1.3.3 is definitely available.



eikon==1.1.2

nest-asyncio==1.0.0



1607014245845.png (11.6 KiB)
1607014342081.png (31.8 KiB)


@victor.koyama OK - better. Now please send the API call that is failing. Thanks.



The API call is the code I am sending from my Python script? So import eikon as ek?



@victor.koyama Yes, please - the API call that is failing, i.e. ek.get_data or ek.get_timeseries. Thanks.



Oh OK, it is ek.get_data.




@victor.koyama Yes... please paste the full API call, with instruments, fields and parameters - thanks - it's the third time I have asked.



@victor.koyama You also said that you had updated your eikon library - the version here is 1.1.2. Are you running multiple Python instances or environments, by any chance?



return ek.get_data(stock_list,features,{'SDate':first_date.strftime('%Y-%m-%d'), 'EDate':end_date.strftime('%Y-%m-%d'), 'FRQ':'D'})[0]

ek.get_data(stock_list,features,{'Period':item})


Oh, my bad - I did the update, but I did a downgrade on the Excel version. What is the command again to update eikon in cmd with the proper version you recommend?
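For reference, the third argument to ek.get_data in the calls above is a plain dict of request parameters; building it from date objects can be sketched like this (`date_params` is an illustrative helper name):

```python
import datetime as dt

def date_params(first_date, end_date, frq="D"):
    """Build the parameters dict passed as the third argument to ek.get_data."""
    return {"SDate": first_date.strftime("%Y-%m-%d"),
            "EDate": end_date.strftime("%Y-%m-%d"),
            "FRQ": frq}

print(date_params(dt.date(2020, 11, 2), dt.date(2020, 12, 3)))
```

Keeping this in one place avoids repeating the strftime formatting at every call site.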



Not Excel - the Eikon version.



I tried this command as well:

  1. py -m pip install -U pip

and I am getting this error. Do you know what it might be?



1607016669362.png (61.1 KiB)


Look, we are going a bit round in circles here - when I say all instruments, fields and parameters, I want to see the instrument list and the list of actual fields. I need these to help you. Please post the full API call - I cannot intuit the instruments or fields from a variable name.

First though - can you update your eikon library to 1.1.8 (pip install eikon==1.1.8)? Then try the request again, do another pip freeze, and tell me the versions of httpx and nest-asyncio, please.



import pandas as pd
import eikon as ek
import datetime as dt
from datetime import datetime


# ### Functions

# #### Related to main databases

# In[18]:


def get_data_reuters(stock_list, features, first_date, end_date):  # Function to retrieve data from the Reuters API
    return ek.get_data(stock_list, features, {'SDate': first_date.strftime('%Y-%m-%d'), 'EDate': end_date.strftime('%Y-%m-%d'), 'FRQ': 'D'})[0]


def working_day(date, increment):
    while date.weekday() in (5, 6) or date.strftime("%Y-%m-%d") in np_feriados:
        date = date + dt.timedelta(days=increment)
    return date


def retrieve_database(csv_path, att_date, features):  # retrieve our database
    # Open the chosen file
    df_file = pd.read_csv(csv_path, index_col=False)
    # Find the latest date
    file_date = df_file['date'].iloc[1]
    file_date = datetime.strptime(file_date, '%Y-%m-%d').date()
    delta_period = (att_date - file_date).days
    first_day_month = dt.datetime.today().replace(day=1).date()
    first_working_day = working_day(first_day_month, 1)
    if att_date == first_working_day:  # just update the stock list on the first working day of the month
        stock_tuple = ek.get_data('SCREEN(U(IN(Equity(active,public))/*UNV:Public*/), IN(TR.ExchangeCountryCode,"BR"))', 'TR.CommonName')
        stock_list = list(stock_tuple[0]['Instrument'])
        stock_list.extend(foreign_ric)  # add foreign RICs at the beginning of the month
    else:  # if not the first day of the month, get stocks from the file
        stock_list = list(df_file.drop_duplicates(['ticker'])['ticker_reuters'].dropna())
    # Define the search interval
    if file_date < att_date:
        first_date = working_day(file_date + dt.timedelta(1), 1)
    else:
        first_date = att_date
    end_date = att_date
    print("Accessing the Reuters API to extract data...")
    df = get_data_reuters(stock_list, features, first_date, end_date)
    return (df, df_file, file_date, stock_list)  # Return df with retrieved data, the history file and the latest history date


def data_cleansing(df_stock, column_pivot, att_date, end_date):  # Clean and format the raw data
    # Treat the date
    df_stock['Date'] = df_stock['Date'].str.slice(start=0, stop=10)
    # Delete rows with NaN and 0 values for the pivot column
    df_stock = df_stock[df_stock[column_pivot].notnull()]
    df_stock = df_stock[df_stock[column_pivot] != 0]

    # Lowercase the column names and change the order
    columns_order = list(df_stock.columns.copy())
    df_stock.columns = map(str.lower, df_stock.columns)
    df_stock = df_stock.rename(columns={'instrument': 'ticker'})
    columns_order[0], columns_order[1] = columns_order[1], columns_order[0]
    columns_order[1] = 'ticker'
    columns_order.insert(2, 'ticker_reuters')
    columns_order = list(map(str.lower, columns_order))
    # Insert the ticker_reuters column for future searches
    df_stock['ticker_reuters'] = df_stock['ticker']
    df_stock = df_stock[columns_order]
    df_stock['ticker'] = df_stock['ticker'].apply(lambda col: col[0:col.find('.')] if (col.find('.') != -1 and col.find('.') != 0) else (col if col.find('.') != 0 else col[1:]))
    df_stock = df_stock.sort_values(['date', 'ticker'])
    df_stock = df_stock.loc[(df_stock['date'] <= att_date.strftime("%Y-%m-%d")) & (df_stock['date'] >= end_date.strftime("%Y-%m-%d"))]
    return df_stock  # return the treated dataframe


def update_database(csv_path, df_file, df_stock_clean, file_date, att_date, stock_list, features):  # append the cleaned dataframe to the history
    if file_date != att_date:
        df_base = pd.concat([df_file, df_stock_clean], join='outer', sort=False)  # if updating for the first time, just append
    else:
        df_file = df_file.drop(df_file[df_file['date'] == att_date.strftime("%Y-%m-%d")].index)  # otherwise, delete first
        df_base = pd.concat([df_file, df_stock_clean], join='outer', sort=False)  # then append
    df_base = df_base.sort_values(['date', 'ticker'], ascending=False).drop_duplicates(['date', 'ticker'])
    if 'TR.PriceClose' in features:  # check for company events for base_price only
        print("Checking company events...")
        df_base = company_events(df_base, stock_list, features, att_date)
    df_base.to_csv(csv_path, index=False)


def company_events(df_file, stock_list, features_price, att_date):
    features_price_only = ['TR.PriceClose.Date', 'TR.PriceClose']
    check_date = working_day(att_date - dt.timedelta(days=1), -1)  # get one day before att_date
    df_price = get_data_reuters(stock_list, features_price, check_date, check_date)  # get prices from check_date
    # clean the retrieved df
    df_price_clean = data_cleansing(df_price, 'Price Close', check_date, check_date)
    df_price_clean = df_price_clean.loc[df_price_clean['date'] == check_date.strftime("%Y-%m-%d"), :]  # the API might bring old dates
    df_values = df_price_clean[['ticker_reuters', 'price close']].set_index('ticker_reuters')
    df_hist = df_file.loc[df_file['date'] == check_date.strftime("%Y-%m-%d")]  # get prices for check_date from the current base
    df_hist = df_hist[['ticker_reuters', 'price close']].set_index('ticker_reuters')
    # check whether prices for check_date are equal in both base_price and Reuters
    df_compare = pd.concat([df_hist, df_values], join='inner', axis=1, sort=False)
    df_compare['compare'] = (df_compare.iloc[:, 0].round(decimals=5) == df_compare.iloc[:, 1].round(decimals=5))
    # If they differ, it is due to company events
    df_stock_event = df_compare[df_compare['compare'] == False].reset_index()  # get stocks where prices differ
    stock_event = list(df_stock_event['ticker_reuters'])
    if stock_event:
        print("Updating events...")
        start_date = df_file['date'].iloc[-1]
        # update the price history in base_price for the selected stocks
        df_att_hist = ek.get_data(stock_event, features_price, {'SDate': start_date, 'EDate': att_date.strftime("%Y-%m-%d")})[0]
        df_att_hist_clean = data_cleansing(df_att_hist, 'Price Close', att_date, dt.datetime.strptime(start_date, "%Y-%m-%d"))
        df_file = df_file.loc[df_file['ticker_reuters'].isin(stock_event) == False, :]
        df_file = pd.concat([df_file, df_att_hist_clean], ignore_index=True).reset_index().drop(columns='index').sort_values(['date', 'ticker'], ascending=False)
    return df_file  # returns the corrected price history


# #### Related to the financial database

# In[19]:


def get_data_reuters_financial(csv_path, att_date, features, period):  # Retrieve data from the Eikon API (different years)
    df_file = pd.read_csv(csv_path, index_col=False)
    first_day_month = dt.datetime.today().replace(day=1).date()
    first_working_day = working_day(first_day_month, 1)
    if att_date == first_working_day:  # just update the stock list on the first working day of the month
        stock_tuple = ek.get_data('SCREEN(U(IN(Equity(active,public))/*UNV:Public*/), IN(TR.ExchangeCountryCode,"BR"))', 'TR.CommonName')
        stock_list = list(stock_tuple[0]['Instrument'])
        stock_list.extend(foreign_ric)  # add foreign RICs at the beginning of the month
    else:  # if not the first day of the month, get stocks from the file
        stock_list = list(df_file.drop_duplicates(['ticker'])['ticker_reuters'].dropna())
    data = []
    print("Accessing the Reuters API to extract data...")
    for item in period:
        df = ek.get_data(stock_list, features, {'Period': item})
        df[0]['date'] = item[2:6]
        data.append(df[0])
    return data


def treat_data_financial(reuters_root_data):
    for x in range(len(reuters_root_data)):  # Build a single df appending the data from Reuters
        if x == 0:
            df = reuters_root_data[0]
        else:
            df = df.append(reuters_root_data[x], ignore_index=0)
    df = df.rename(columns={'Instrument': 'ticker_reuters'})
    df.columns = map(str.lower, df.columns)
    df['ticker'] = df['ticker_reuters'].apply(lambda col: col[0:col.find('.')] if col.find('.') != -1 else col)
    df = df.sort_values(['ticker', 'date'], ascending=False)
    df = df.set_index(['ticker', 'ticker_reuters', 'date'])
    df = df.dropna(how='all')
    df = df.drop_duplicates()
    return df


def update_database_financial(path, df_data, year_array, index_col):
    df_hist = pd.read_csv(path)
    df_hist = df_hist[df_hist['date'].isin(year_array) == False].set_index(index_col)  # drop the data to be updated
    df_data = pd.concat([df_data, df_hist], join='outer', sort=False, axis=0).drop_duplicates()
    df_data = df_data.reset_index().sort_values(['ticker', 'date'], ascending=[True, False]).set_index(index_col)
    df_data.to_csv(path)


# ### Set the Reuters API key

# In[20]:


ek.set_app_key('706f7cf0f1d14a1983f54c32c70dd75ee03b308f')


# ### Update the databases

# #### Define all the bases which will be updated

# In[21]:


np_feriados = pd.read_csv('//pfs14/PSI/GESTAO TERCEIROS/Mesa Bolsa/Database/feriados/feriados.csv').to_numpy()

# database paths
path_beta = '//pfs14/PSI/GESTAO TERCEIROS/Mesa Bolsa/Database/dados_acoes/beta_acoes.csv'
path_price = '//pfs14/PSI/GESTAO TERCEIROS/Mesa Bolsa/Database/dados_acoes/preco_acoes.csv'
path_valuation = '//pfs14/PSI/GESTAO TERCEIROS/Mesa Bolsa/Database/dados_acoes/valuation_acoes.csv'
path_starmine = '//pfs14/PSI/GESTAO TERCEIROS/Mesa Bolsa/Database/dados_acoes/starmine_acoes.csv'
path_index = '//pfs14/PSI/GESTAO TERCEIROS/Mesa Bolsa/Database/dados_indice/preco_indice.csv'
path_financial = '//pfs14/PSI/GESTAO TERCEIROS/Mesa Bolsa/Database/dados_acoes/financial_acoes.csv'  # different update

# database features
features_beta = ['TR.BetaDaily180D.Date', 'TR.BetaDaily180D', 'TR.BetaWklyAdj2Y', 'TR.BetaDaily90D']
features_price = ['TR.PriceClose.Date', 'TR.PriceClose', 'TR.Volume', 'TR.MarketCapLocalCurn']
features_valuation = ['TR.FwdPtoEPSSmartEst.Date', 'TR.FwdPtoEPSSmartEst', 'TR.EVtoEBITDASmartEst', 'TR.CompanySharesOutstanding',
                      'TR.DPSSmartEst', 'TR.EPSSMARTEST', 'TR.EPSActValue', 'TR.ReturnOnCapitalPercent', 'TR.EVSmartEst']
features_starmine = ['TR.CreditTextRegRank.Date', 'TR.CreditTextRegRank', 'TR.ValMoRegionRank', 'TR.ARM100Region', 'TR.ARM100RegionChange',
                     'TR.NumberOfAnalysts', 'TR.CombinedAlphaRegionRank']
features_financial = ['TR.EBITDAActValue', 'TR.NetprofitSmartEst', 'TR.NDebtSmartEst', 'TR.TotalDebtOutstanding',
                      'TR.NetprofitActValue', 'TR.EBITDASmartEst', 'TR.NetDebtToEBITDA', 'TR.RevenueActValue',
                      'TR.RevenueSmartEst', 'TR.EBITActValue', 'TR.DepreciationAmortizationActValue', 'TR.SGandAExpActual',
                      'TR.OperatingExpActual', 'TR.GenAdminExpActValue', 'TR.SellMarkexpActValue', 'TR.RnDExpActual',
                      'TR.PreTaxProfitActValue', 'TR.TaxProvisionActValue', 'TR.CapexActValue', 'TR.IntExpSmartEst', 'TR.IntExpActual']
features_index = ['TR.PriceClose.Date', 'TR.PriceClose', 'TR.Volume']

index_list = ['.BVSP']
foreign_ric = ['CZZ', 'GDXJ.K', 'VALE.K', '.STOXX']  # foreign RICs to be added at the beginning of every month (when the stock list is updated)
index_col_financial = ['ticker', 'ticker_reuters', 'date']

# define the years to be updated in the financial database
next_year = dt.date.today().year + 1
year_array = [next_year - i for i in range(0, 3)]
period = ['FY' + str(item) for item in year_array]

path_list = [path_beta, path_price, path_valuation, path_starmine, path_index]  # the financial database must not be added to this list
features_list = [features_beta, features_price, features_valuation, features_starmine, features_index]
column_pivot = ['Daily Beta - 90 Day', 'Price Close', 'Company Shares ', 'Credit Text Mining Region Rank', 'Volume']

base_list = [None] * len(path_list)  # build a list with the bases' names
index = 0
for item in path_list:
    i = -1
    while item[i] != "/":
        i = i - 1
    base_list[index] = item[i + 1:len(item)]
    index = index + 1


# #### Execute the query

# In[22]:


# define the dates
att_date = working_day(dt.date.today() - dt.timedelta(days=1), -1)

print(" ********** Starting the database update process **********\n")
print("Number of databases to be updated: " + str(len(path_list) + 1) + "\n")

index = 0
for path in path_list:
    print("Updating database " + base_list[index])
    (df_stock, df_file, file_date, stock_list) = retrieve_database(path, att_date, features_list[index])
    print("Cleaning and treating the extracted data...")
    df_stock_clean = data_cleansing(df_stock, column_pivot[index], att_date, file_date)
    print("Saving the database...")
    update_database(path, df_file, df_stock_clean, file_date, att_date, stock_list, features_list[index])
    print("Database " + base_list[index] + " updated! \n")
    index = index + 1

# a different update procedure (financial base)
print("Updating database financial_acoes.csv")
data_reuters = get_data_reuters_financial(path_financial, att_date, features_financial, period)

print("Cleaning and treating the extracted data...")
df_data = treat_data_financial(data_reuters)

print("Saving the database...")
df = update_database_financial(path_financial, df_data, year_array, index_col_financial)

print("Financial database updated! \n")

print("********** Update process finished! **********")
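The working_day helper in the script above is the piece that steps a date off weekends and holidays; it can be exercised stand-alone with a stubbed holiday list (the holiday date below is illustrative, the script reads its real list from feriados.csv):

```python
import datetime as dt

feriados = {"2020-12-25"}  # illustrative holiday list

def working_day(date, increment):
    # step by `increment` days until the date is neither a weekend day nor a listed holiday
    while date.weekday() in (5, 6) or date.strftime("%Y-%m-%d") in feriados:
        date = date + dt.timedelta(days=increment)
    return date

# 2020-12-26 is a Saturday; stepping backwards also skips the 25th (a listed holiday)
print(working_day(dt.date(2020, 12, 26), -1))
```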




I will send you the other information in just a minute.



After you have installed the latest eikon library, please try the following:

ek.get_data('EUR=','CF_CLOSE')

and send me the result - I just want to see if you have connectivity with a simple call.



Do you know why I get this error when I run pip install eikon==1.1.8?



1607018820981.png (69.9 KiB)


@victor.koyama I'm not exactly sure, but take a look at this thread.



I updated the eikon library to version 1.1.8.

httpx is 0.14.3.

nest-asyncio is 1.3.3.


Regarding the code you sent me to try, here is the error:



1607024669780.png (49.5 KiB)


@victor.koyama OK, could you please try the following 4 steps:

  1. Open http://localhost:9000/api/status to check whether the API Proxy is running.
  2. Verify that the Refinitiv server is authorized by your proxy.
    Run cmd.exe, then type "ping emea1.apps.cp.thomsonreuters.com"; you should get a result like:
    Reply from 159.220.1.19: bytes=32 time=34ms TTL=242
  3. In your script, activate the verbose log when you set the app key; you can then test that the connection is up before continuing, as follows:

    ek.set_log_level(1)
    ek.set_app_key('.....')
    state = ek.get_desktop_session().get_open_state()
    if state == ek.Session.State.Open:
        ek.get_news_headlines('R:LHAG.DE', date_from='2019-03-06T09:00:00', date_to='2019-03-06T18:00:00')

  4. Lastly, if you can, define the environment variable HTTPX_LOG_LEVEL=trace and then run your script; we will then have full logs to investigate the root cause of this error code 500.
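Steps 1 and 2 above can also be scripted with the Python standard library, which makes them easy to rerun while IT investigates (a sketch; `proxy_status` and `can_connect` are illustrative helpers that return a falsy value when the check fails):

```python
import json
import socket
import urllib.request

def proxy_status(url="http://localhost:9000/api/status"):
    """Step 1: query the local API Proxy status endpoint; None if unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=3) as resp:
            return json.loads(resp.read().decode())
    except (OSError, ValueError):
        return None

def can_connect(host, port=443, timeout=3):
    """Step 2: a plain TCP reachability probe, similar in spirit to the ping test."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(proxy_status())
print(can_connect("emea1.apps.cp.thomsonreuters.com"))
```

A None from the first call points at the desktop/API Proxy side; a False from the second points at DNS, the proxy, or the firewall.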




Hi,


I guess the Refinitiv server is not authorized by my proxy. See the picture below:

The second step gives me an error in cmd. Please, could you confirm?


1607028614264.png (35.7 KiB)


@victor.koyama This is why I asked about changes to the firewall. I will email you the Eikon networking guide and also the following instructions for your IT team - you may need to check with them. See the main points below:

Please see pages 15-20 for configuring firewall, proxy and internet options in the attached document.

Also on page 14, you will see the IP Subnet and TCP Port that we are using.

Here I will give you a summary of the changes that have to be made on your proxy server and the firewall.

Proxy Server:

Please approve the connection to the following domains (on port 80 and 443)

*.cp.extranet.thomsonreuters.biz
*.refinitiv.biz
*.thomsonreuters.net
*.refinitiv.net

Firewall and Antivirus:

Please also make sure that the firewall and antivirus allows the following processes:

Eikon.exe (Thomson Reuters Eikon Desktop)
EikonBox.exe (Thomson Reuters Eikon, Interactive Map Component)
EikonDM.exe (Thomson Reuters Eikon Deployment / Update Manager)
TRUserServiceHostv4.exe (Thomson Reuters Common Office User Service Daemon)
Excel.exe (Thomson Reuters Eikon Excel)
EikonSupportTool.exe (Eikon Support Tool)

Please make sure that the firewall server allows you to download EXE files from the following domains (on port 80 and 443):

*.cp.extranet.thomsonreuters.biz
*.refinitiv.biz
*.thomsonreuters.net
*.refinitiv.net

HTTPS Tunneling
Thomson Reuters Eikon uses HTTPS tunneling to connect to the Thomson Reuters Eikon platform for the streaming service. On some proxy servers, HTTP/HTTPS tunneling is disabled by default because it is an unknown protocol (e.g. Websense), while other proxy servers can set up a rule to enable or disable it.

It is necessary that customers set up a rule to ALLOW HTTPS tunneling to the following URLs:

*.cp.extranet.thomsonreuters.biz
*.refinitiv.biz
*.thomsonreuters.net
*.refinitiv.net

You may also need to allow our domains if you are implementing SSL/TLS decryption or inspection, as this might break the certificate from our servers and break the connection to the Eikon Desktop application.

*.cp.extranet.thomsonreuters.biz
*.refinitiv.biz
*.thomsonreuters.net
*.refinitiv.net

I hope this can help.


Hello @jason.ramchandani

Thank you for the detailed information. Has this change been made recently? I am experiencing the same issue as Victor: all my EDAPI scripts used to work fine and now they are getting blocked. When were these changes in domain usage implemented, and why weren't users informed of them in advance?

Many thanks.


@victor.koyama As an outside chance, can you please also try the following:

1. Clear the Eikon cache: from the Windows menu, Thomson Reuters ==> Clear Cache.

2. Reboot the machine, then retry the API call.


1607034857316.png (109.2 KiB)


Hi, sorry for the late response. The IT team was looking into this matter. What they told me was that there is no problem with the proxy; the problem is with the Reuters server. We can't connect to the Reuters server at this address: 159.220.40.41:443. Please, could you verify whether this server is working or whether it was changed/updated? How can I proceed with this matter?

Also, I tried clearing the cache, but the problem still persists.

Thank you!
