Single Asset Solar Production

Problem description¶

Owners of PV plants, electricity traders and system regulators need accurate forecasts of PV plant production over different time horizons and at different granularities to optimize their maintenance, trading and regulation strategies.

In essence, forecasting solar production is a regression problem: solar production is a time series of production volumes for individual periods (usually specific hours or quarter-hours), and the predictors are forecasts of various meteorological parameters. This is, in general, the class of problems that TIM is designed to solve.
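To make the regression framing concrete, here is a minimal sketch that fits a toy least-squares line relating irradiance to production. The data values and the helper `fit_line` are purely illustrative assumptions; a real model (such as the ones TIM builds) uses many meteorological predictors and nonlinear features.

```python
def fit_line(x, y):
    """Ordinary least squares for y = a*x + b (pure Python, single predictor)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var = sum((xi - mean_x) ** 2 for xi in x)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical hourly samples: GHI in W/m^2, production in MW.
ghi = [0, 150, 420, 680, 540, 90]
prod = [0.0, 0.9, 2.6, 4.1, 3.3, 0.5]

slope, intercept = fit_line(ghi, prod)
predicted = [slope * g + intercept for g in ghi]
```

More irradiance means more production, so the fitted slope is positive; everything beyond this one-predictor picture (temperature, sun geometry, nonlinearity) is what the later sections add.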

Data Recommendation Template¶

Apart from maintenance, solar power production is not driven by sociological factors such as day of the week or holidays. The main factors affecting the production of PV plants are the amount of irradiance reaching the panel, the angle between the panel surface and the solar flux, and the Temperature (TEMP), which affects the panel's efficiency in converting irradiance to electricity.

The minimal requirement for a good forecast of PV plant production is a good point forecast of Global Horizontal Irradiance (GHI) for the location of the PV plant. However, TIM achieves much better accuracy when provided with the components of GHI – Direct Normal Irradiance (DNI) and Diffuse Irradiance (DIF) – as they arrive at the panel from different angles.
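The three irradiance quantities are tied together by the standard decomposition GHI = DIF + DNI·cos(zenith), where the zenith angle is 90° minus the sun elevation. A small sketch (the function name and sample numbers are illustrative):

```python
import math

def ghi_from_components(dni, dif, sun_elevation_deg):
    """Standard closed-sky decomposition: GHI = DIF + DNI * sin(elevation).
    When the sun is below the horizon, the direct term is clipped to zero."""
    direct_horizontal = dni * max(0.0, math.sin(math.radians(sun_elevation_deg)))
    return dif + direct_horizontal

# Example: sun 30 degrees above the horizon, DNI 800 W/m^2, DIF 100 W/m^2
# gives GHI = 800 * 0.5 + 100 = 500 W/m^2.
ghi = ghi_from_components(800.0, 100.0, 30.0)
```

This is why providing DNI and DIF separately carries more information than GHI alone: the same GHI value can come from very different direct/diffuse mixes, which hit a tilted panel differently.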

Finally, to help TIM capture the geometry of the PV plant (fixed, tracking, …), it is also useful to upload data about the angular position of the sun in the sky with respect to the GPS coordinates of the PV plant – Sun Elevation (SE) and Sun Azimuth (SA). Provided with these variables, TIM is able to create accurate models for forecasting solar power without needing to know the specifics of the modeled PV plant's configuration.

TIM Setup¶

TIM requires no setup of its mathematical internals and works well in business user mode. All that is required from the user is to let TIM know the forecasting routine and the desired prediction horizon.

TIM can automatically learn that there is no weekly pattern; in some cases, however (e.g. short datasets), this can be difficult to learn, so we recommend switching off the weekday dictionary.

Demo example¶

A prepared dataset, along with a prepared configuration YAML-file, can be downloaded here.

Target¶

The data used in this example are assembled from an individual PV plant in Central Europe. The production of this PV plant is our target. It is the second column in the CSV file, right after the column with timestamps. In this case, the name of the target is ‘PV_obs’. The data are sampled hourly.

Predictor candidates¶

The meteorological predictors used are GHI, DNI, DIF, TEMP, SA, and SE, as discussed in the section ‘Data Recommendation Template’. In this demo, we use historical actuals for both model building and out-of-sample forecasting.

Data used in this example are from 2015-01-01 to 2017-01-12.

Timestamp¶

The timestamp is in the UTC+01:00 timezone, and each timestamp value marks the beginning of the period it corresponds to, i.e. the ‘PV_obs’ value in the row with timestamp 2015-01-01 00:00:00 corresponds to the production of the PV plant during the period between 2015-01-01 00:00:00 and 2015-01-01 01:00:00.
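The period-beginning convention can be captured in a couple of lines (the helper name is illustrative); this is worth checking before upload, since data exported with period-ending timestamps would be shifted by one sample:

```python
from datetime import datetime, timedelta

def period_for(timestamp, sampling=timedelta(hours=1)):
    """Timestamps label the *beginning* of the period they cover, so the
    sample at time T describes production during [T, T + sampling)."""
    return timestamp, timestamp + sampling

# The 'PV_obs' value at 2015-01-01 00:00:00 covers 00:00:00-01:00:00.
start, end = period_for(datetime(2015, 1, 1, 0, 0))
```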

Forecasting scenario¶

In this example, we will simulate a day-ahead scenario. Each day at 10:15, we wish to have a forecast for each hour of the next day.

The last available target value will be from the hour 09:00 (the data are updated at the fifth minute of each hour with a one-sample delay, so the update at 10:05 covers the period 09:00 – 10:00 and the update at 11:05 covers the period 10:00 – 11:00). Meteo predictors will be available for every hour of our prediction.
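The data-availability rule above can be sketched as a small helper (the function name is hypothetical; the minute-5 update time and one-sample delay are the ones described in this scenario):

```python
from datetime import datetime, timedelta

def last_available_target(now):
    """Last target timestamp available at forecast time `now`, assuming the
    target is updated at minute 5 of every hour with a one-sample delay:
    the update at HH:05 delivers the sample timestamped (HH-1):00."""
    last_update_hour = now.replace(minute=0, second=0, microsecond=0)
    if now.minute < 5:  # this hour's update has not happened yet
        last_update_hour -= timedelta(hours=1)
    return last_update_hour - timedelta(hours=1)  # one-sample delay

# At the 10:15 forecasting time, the freshest target sample is 09:00,
# covering production between 09:00 and 10:00.
latest = last_available_target(datetime(2016, 1, 1, 10, 15))
```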

Model building and validation¶

The model is built using the range between 2015-01-01 00:00:00 and 2015-12-31 23:00:00.

Out-of-sample forecasts are made in the range between 2016-01-01 00:00:00 and 2016-10-18 19:00:00. In this demo dataset, out-of-sample validation is performed using historical actuals of meteorological data. A more representative validation may be obtained by using historical forecasts of meteorological data instead.

Demonstration¶

TIM Studio¶

This section covers the use of TIM Studio to solve the challenge described above. Additional information regarding TIM Studio can be found here.

Select workspace¶

In the Workspaces screen, select the workspace in which the dataset should be added. If there is no available workspace, create one by clicking "Add Workspace". In this solution template, the workspace called "TIM Solution Templates" is used.

In the Datasets screen, click "Add New Dataset". Stay in the "CSV-File" tab and insert the name of the dataset. In this example, the dataset is called Solar_single. Click "Browse" and select the dataset from the computer. Click "Add Dataset" to confirm.

Model building definition¶

Go to the Model Building Definition screen in the panel on the left. Click "Add New Definition" and fill in the desired definition name. In this demo, the MBD is called Solar_single day-ahead. In the next screen, select the dataset that was previously uploaded (Solar_single).

In step 2, define the desired forecasting scenario. In this example, the model is used each day at 10:15. Therefore, leave all "Weekdays ranges" checked on. Then, set "Hour ranges" to 10, "Minutes ranges" to 15 and leave "Seconds ranges" at 00. Look into the section about the Cron notation for more details. Since forecasts are to be made for each hour of the next day, leave default settings for "Forecast from" and "Forecast to", i.e. Day with offset 1. Look into the section about the relative time notation to learn more about this.
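The Cron-style settings in step 2 simply select the moments at which forecasting runs. A minimal illustration of how "all weekdays, hour 10, minute 15" filters timestamps (the helper `matches_usage_time` is hypothetical, not part of TIM):

```python
from datetime import datetime

def matches_usage_time(ts):
    """Mirror of the Cron-like usage settings in this example:
    every weekday ("*"), hour 10, minute 15, second 0."""
    return ts.hour == 10 and ts.minute == 15 and ts.second == 0

# Out of all quarter-hours in a day, only 10:15 triggers forecasting.
moments = [datetime(2016, 5, 4, h, m) for h in range(24) for m in (0, 15, 30, 45)]
forecast_moments = [t for t in moments if matches_usage_time(t)]
```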

Click "Next" to advance to the next step. It is also possible to already finalize all settings at this point, in which case everything else would be set up automatically. In this example, some more changes will be made to the data updates in the third step. The target variable, PV_obs, is updated at the fifth minute of each hour. Click on the small arrow next to PV_obs and change the settings of this variable. Leave all Weekdays ranges checked on. Select "All" for the Hour, since this variable is updated each hour, and set the Minutes ranges to 5, since the data is updated in the fifth minute of every hour. Leave the Second ranges at 0. Then, set "Update until" to Sample with offset -1, since the target variable is updated with a delay of one sample.

Leave the default settings for all other predictors, i.e. they are set to update at 8:20 until Day+1. Since forecasts are made at 10:15 and forecasts of the predictor values will be available for the whole next day, these default settings are fine; exactly when these predictors update does not matter in this case. We could already finalize the settings here, since no further changes are needed, but we will go through all remaining steps.

Click "Next" to advance to step 4. Here, training regions can be selected. Since the goal is to move on to back-testing, this screen will be left in its default settings (i.e. Use All Data). Click "Next" to advance to the next screen.

In this fifth step, the mathematical settings can be changed; e.g. weekdays could be switched off in TIM Transformations, since this dictionary is not relevant for solar forecasting. Since TIM can select features automatically, we will leave the default settings. Then, click "Finalize" to complete the model building definition.

Experiments¶

Click "Experiments" in the panel on the left to move on to backtesting. Then, click "Make experiment" next to the correct model building definition (Solar_single day-ahead).

Click "Build Model" and select the appropriate training range, i.e. 2015-01-01 00:00:00 - 2015-12-31 23:00:00.

The in-sample prediction as well as the Model Tree Map become visible.

Click "Validate model" and select the correct out-of-sample period for backtesting, i.e. 2016-01-01 00:00:00 - 2016-10-18 19:00:00.

This generates the aggregated forecast for day D+1. Switching off all predictors displayed in the graph except the target PV_obs leaves only our results visible.

TIM Connector¶

This section covers the use of TIM Connector to solve the challenge described above. Additional information regarding TIM Connector can be found in the respective section.

2. Create folder with dataset¶

Create a folder (e.g. Solar_single) with the dataset file [data.csv] and the configuration file [conf.yaml].

Solar_single/
    data.csv
    conf.yaml

The conf.yaml file defines the forecasting scenario, in this case the one described above. An exemplary configuration YAML file is shown below, and the different aspects of the file are explained in more detail.

Configuration: The model is used repeatedly each day at 10:15. Forecasts are made for each hour of the next day.

Forecasting: Out-of-sample forecasts are made on the range from 2016-01-01 00:00:00 to 2016-10-18 19:00:00.

Model building: Model building in this example considers the range from 2015-01-01 00:00:00 to 2015-12-31 23:00:00. The target is named PV_obs and is updated with a delay of one sample (at 10:15 only until 10:00, i.e. the sample corresponding to hour 09:00). It is updated at the fifth minute of each hour.

version: "1.0"
type: Forecasting

modelBuilding:
  data:
    rows:
      - from: 2015-01-01 00:00:00
        to:   2015-12-31 23:00:00
      - uniqueName: PV_obs
        updateUntil:
          baseUnit: Sample
          offset: -1
        updateTime:
          - type: Day
            value: "*"
          - type: Hour
            value: "*"
          - type: Minute
            value: "5"

configuration:
  usage:
    usageType: Repeating
    usageTime:
      - type: Day
        value: "*"
      - type: Hour
        value: "10"
      - type: Minute
        value: "15"
  predictionFrom:
    baseUnit: Day
    offset: 1
  predictionTo:
    baseUnit: Day
    offset: 1

forecasting:
  configuration:
    predictionScope:
      type: Ranges
      ranges:
        - from: 2016-01-01 00:00:00
          to:   2016-10-18 19:00:00


3. Fill in user credentials¶

Following the previous command, the user will be prompted to fill in their user credentials. Fill in the correct information and click "OK" to continue.

4. Call connector from the command line (terminal)¶

First, change the directory to the TIM Connector's builddir with the command: > cd pathToConnector\builddir. Then, call the connector with the following command: > pathToConnector\timconnect.exe path\to\Solar_single\conf.yaml

Output in folder:

Predictions: Report/timeStamp/conf/prediction.csv
Errors: Report/timeStamp/conf/accuracy.txt

The following accuracies were reported by TIM:

Model building stage: RMSE = 0.42, MAE = 0.17
Validation stage: RMSE = 0.35, MAE = 0.17
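The two accuracy measures reported above are standard and can be reproduced from any forecast/actual pair. A minimal sketch with toy numbers (not the demo data):

```python
import math

def rmse(actual, forecast):
    """Root mean squared error: penalizes large misses more heavily."""
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def mae(actual, forecast):
    """Mean absolute error: average magnitude of the miss."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

actual   = [0.0, 1.2, 3.4, 2.1]  # illustrative production values
forecast = [0.1, 1.0, 3.0, 2.4]
errors = (rmse(actual, forecast), mae(actual, forecast))
```

Comparing the model building (in-sample) and validation (out-of-sample) figures is the usual sanity check: a large gap between them would suggest overfitting, which is not the case here.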