
Using the TIM Python Client

When the TIM Python client is installed, you can use the Python commands described in the following subsections to call the TIM API from your Python code.

You can explore the capabilities of the TIM Python client by downloading and running the Jupyter notebook examples.

Import the TIM Python client

The TIM Python client must first be imported with a standard Python import statement:

import tim_client

Configure Python logging

The TIM Python client uses the built-in Python logging package and respects existing program-wide logging configurations. It is recommended to configure logging before using the TIM Python client.

Logs generated by the TIM Python client can be divided into 4 standard categories:

  • ERROR - an error occurred and the called function cannot proceed; an error log is displayed and an exception is raised out of the TIM Python client scope (a generic handling pattern is sketched after this list)
  • WARNING - a warning informs the user about an undesired state, but the function can recover and continue processing the given task
  • INFO - info logs inform the user about the state of the function's execution
  • DEBUG - debug logs provide detailed reports about the function's execution (should be enabled only for development and testing)
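
Because errors propagate out of the client as exceptions (the ERROR level above), a generic wrapper can centralise error handling. The helper below is a sketch, not part of the TIM Python client:

import logging

def call_tim_safely(api_call, *args, **kwargs):
    """Call a TIM Python client function, log any failure and re-raise the exception."""
    try:
        return api_call(*args, **kwargs)
    except Exception as exc:
        # the TIM Python client has already emitted its own ERROR log at this point
        logging.getLogger(__name__).error("TIM API call failed: %s", exc)
        raise

# Usage, once api_client, dataset and prediction_configuration exist (see the sections below):
# prediction = call_tim_safely(api_client.prediction_build_model_predict, dataset, prediction_configuration)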

The following code example demonstrates how to configure logging in your Python program:

Note: This configuration affects other parts and packages in your program.

import logging
LOGGING_LEVEL = "INFO"  # one of "DEBUG", "INFO", "WARNING", "ERROR"
level = logging.getLevelName(LOGGING_LEVEL)
logging.basicConfig(level=level, format='[%(levelname)s] %(asctime)s - %(name)s:%(funcName)s:%(lineno)s - %(message)s')

More information about the logging package can be found in the official Python documentation.
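
If the client's loggers follow the usual convention of being named after their modules (an assumption; verify the actual logger names against the DEBUG output), its verbosity can be tuned independently of the rest of your program:

import logging

logging.basicConfig(level=logging.INFO)                    # application-wide level
logging.getLogger('tim_client').setLevel(logging.WARNING)  # assumed logger name for the TIM Python client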

Authentication

Calls to the TIM API are handled by the ApiClient class (tim_client.api_client.ApiClient). The constructor of this class takes a Credentials object as an argument (tim_client.credentials.Credentials).

The constructor of the Credentials class takes three string arguments: license_key, email, and password. Populate these arguments with the data you obtained from your TIM License:

  • License key (e.g. abcd-1234-defg-5678)
  • Email address (e.g. tim.user@example.com)
  • Password (e.g. password123)

Create an instance of the Credentials class and store it in the credentials variable:

credentials = tim_client.Credentials('abcd-1234-defg-5678', 'tim.user@example.com', 'password123', tim_url='https://timws.tangent.works/v4/api')
  • Note: The Python client communicates with the TIM endpoint defined by the tim_url argument. For the SaaS solution, use the value https://timws.tangent.works/v4/api. For on-premise installations, set tim_url to http://localhost/v4/api.

Create an instance of the ApiClient class with the created credentials object and store it in the api_client variable:

api_client = tim_client.ApiClient(credentials)

The object stored in the api_client variable will be used for all API calls in the following sections.
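
Hard-coding credentials is usually best avoided. One common pattern, sketched below, is to read them from environment variables; the variable names TIM_LICENSE_KEY, TIM_EMAIL and TIM_PASSWORD are hypothetical and chosen only for this example:

import os

import tim_client

# Hypothetical environment variable names; any secret-management mechanism works the same way.
credentials = tim_client.Credentials(
    os.environ['TIM_LICENSE_KEY'],
    os.environ['TIM_EMAIL'],
    os.environ['TIM_PASSWORD'],
    tim_url='https://timws.tangent.works/v4/api'
)
api_client = tim_client.ApiClient(credentials)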

Saving full request and response JSONs

The TIM Python client can save full requests and responses as JSON files to a defined target directory. This behavior is disabled by default. To enable it, use the following command:

api_client.save_json = True

To disable saving JSONs, use the command:

api_client.save_json = False

By default, JSONs are saved to the ./logs/ target directory. The target directory can be modified by running:

api_client.json_saving_folder_path = [relative-or-absolute-path-to-existing-directory]

The absolute path of the target directory can be obtained by the following command:

str(api_client.json_saving_folder_path.absolute())
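
Putting these pieces together, a short sketch follows; the directory name is an arbitrary example, and the path is passed as a pathlib.Path on the assumption that the attribute stores one, as the .absolute() call above suggests:

from pathlib import Path

target_dir = Path('./tim_json_logs')           # arbitrary example directory
target_dir.mkdir(parents=True, exist_ok=True)  # the target directory must already exist

api_client.save_json = True
api_client.json_saving_folder_path = target_dir
print(str(api_client.json_saving_folder_path.absolute()))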

The following naming conventions are used for the JSON files (an example follows the list):

  • Requests: [timestamp]_request_[endpoint].json
    • timestamp is in the format YYYYmmddHHMMSS
    • endpoint is in the format [category]-[action] (e.g. prediction-build-model)
  • Responses: [timestamp]_response_[endpoint]_[request_uuid].json
    • timestamp is in the format YYYYmmddHHMMSS
    • endpoint is in the format [category]-[action] (e.g. prediction-build-model)
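
For example, a prediction-build-model request sent on 1 May 2021 at 14:30:00 would be saved as 20210501143000_request_prediction-build-model.json, and its response as 20210501143000_response_prediction-build-model_[request_uuid].json.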

Load CSV dataset

Let's suppose a dataset CSV file with semicolon-separated (;) columns has been created; using the UTF-8 charset is recommended. For the purpose of this example, the file is assumed to be stored at /home/user/dataset.csv. The dataset can be loaded into a Pandas Dataframe data structure with the following statement:

dataset = tim_client.load_dataset_from_csv_file('/home/user/dataset.csv', sep=';')

The variable dataset now contains a Pandas Dataframe object with our dataset.
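
As a quick sanity check (plain pandas, nothing TIM-specific), the loaded data can be inspected before it is sent to the API:

print(dataset.shape)   # (number of rows, number of columns)
print(dataset.head())  # first few rows of the loaded dataset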

Each predictor in the dataset can optionally be extended with updateTime and updateUntil properties. To define the updateTime and updateUntil values, create a list of dictionaries with the following schema and pass this list to the function calls mentioned in the next sections (a complete example is shown after the list below):

dataset_updates = [
    {
        "uniqueName": "[predictor-unique-name]",
        "updateTime": [
            [updateTime-list]
        ],
        "updateUntil": {
            [updateUntil-object]
        }
    }
]
  • predictor-unique-name is the column name in the CSV file
  • updateTime-list - Time of the update - must follow the definition
    • Example:
      [
          {
              "type": "Day",
              "value": "-"
          },
          {
              "type": "Hour",
              "value": "7,11"
          },
          {
              "type": "Minute",
              "value": "0"
          }
      ]
  • updateUntil-object - Timestamp of the last (most recent) updated value - must follow the definition
    • Example:
      {
          "baseUnit": "Sample",
          "offset": 0
      }
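
Putting the schema and the examples above together, a complete dataset_updates list for a single predictor could look as follows; the column name Temperature is a hypothetical example:

dataset_updates = [
    {
        "uniqueName": "Temperature",  # hypothetical predictor column name
        "updateTime": [
            {"type": "Day", "value": "-"},
            {"type": "Hour", "value": "7,11"},
            {"type": "Minute", "value": "0"}
        ],
        "updateUntil": {
            "baseUnit": "Sample",
            "offset": 0
        }
    }
]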

Making a prediction

Before making any prediction, it is necessary to define the model configuration. This can be achieved by creating a Python dictionary:

prediction_configuration = {
    'usage': {
        'predictionTo': {
            'baseUnit': 'Day',
            'offset': 1
        }
    }
}

Note: See the full list of configuration options.
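
For illustration, the same structure can express a different horizon. This sketch assumes that 'Hour' is among the supported baseUnit values; consult the configuration options for the full list:

prediction_configuration = {
    'usage': {
        'predictionTo': {
            'baseUnit': 'Hour',  # assumed to be among the supported baseUnit values
            'offset': 2          # predict two hours ahead
        }
    }
}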

There are two possible ways to make predictions:

  • Case 1: A model is built in the first step and then this model is used to make the prediction itself in the second step.
  • Case 2: The two steps of case 1 are combined and the prediction is obtained directly, without explicitly building a model (this is called RTInstantML).

Case 1

Building the model:

prediction_model = api_client.prediction_build_model(dataset, prediction_configuration, predictors_update=dataset_updates)

Note: The predictors_update argument is optional.

Making a prediction:

prediction = api_client.prediction_predict(dataset, prediction_model, predictors_update=dataset_updates)

Note: The predictors_update argument is optional.

Getting the prediction result (as a Pandas Dataframe):

prediction_result_df = prediction.get_prediction()

Case 2

Making an RTInstantML prediction:

rt_instant_ml_prediction = api_client.prediction_build_model_predict(dataset, prediction_configuration, predictors_update=dataset_updates)

Note: The predictors_update argument is optional.

Getting the RTInstantML prediction result (as a Pandas Dataframe):

rt_instant_ml_prediction_result_df = rt_instant_ml_prediction.get_prediction()

The basic prediction table can be extended with prediction intervals. To include prediction intervals in the prediction result, pass the argument include_intervals=True to the get_prediction function call:

rt_instant_ml_prediction_result_df = rt_instant_ml_prediction.get_prediction(include_intervals=True)

The returned dataframe will contain four columns: Timestamp, Prediction, LowerValues and UpperValues.
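
A minimal sketch of working with the interval columns, using plain pandas and the column names listed above:

result_df = rt_instant_ml_prediction_result_df

# Width of the prediction interval for each forecasted timestamp.
result_df['IntervalWidth'] = result_df['UpperValues'] - result_df['LowerValues']
print(result_df[['Timestamp', 'Prediction', 'IntervalWidth']].head())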

Detecting anomalies

Before detecting anomalies, it is necessary to define the model configuration. This can be achieved by creating a Python dictionary:

anomaly_detection_configuration = {
    "normalBehaviorModellingConfiguration": {
        "useTargetOffsets": False
    }
}

Building a model for anomaly detection:

anomaly_detection_model = api_client.detection_build_model(dataset, anomaly_detection_configuration, predictors_update=dataset_updates)

Note: The predictors_update argument is optional.

Rebuilding a model for anomaly detection:

new_anomaly_detection_model = api_client.detection_rebuild_model(dataset, anomaly_detection_model, predictors_update=dataset_updates)

Note: The predictors_update argument is optional.

Performing the anomaly detection:

anomaly_detection = api_client.detection_detect(dataset, anomaly_detection_model, predictors_update=dataset_updates)

Note: The predictors_update argument is optional.

Getting the anomaly indicators (as a Pandas Dataframe):

anomaly_indicators_df = anomaly_detection.get_anomaly_indicator()

Getting the normal behavior values (as a Pandas Dataframe):

normal_behavior_df = anomaly_detection.get_normal_behavior()
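
A short post-processing sketch with plain pandas; it assumes both DataFrames cover the same timestamps, so their rows can be inspected side by side:

print(anomaly_indicators_df.head())
print(normal_behavior_df.head())

# Place both outputs next to each other for inspection (index-aligned join; the suffixes
# avoid clashes in case the two frames share column names).
combined_df = anomaly_indicators_df.join(normal_behavior_df, lsuffix='_indicator', rsuffix='_normal')
print(combined_df.head())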