How to Use the TimeseriesGenerator for Time Series Forecasting in Keras


Time series data must be transformed into a structure of samples with input and output components before it can be used to fit a supervised learning model.

This can be challenging if you have to perform this transformation manually. The Keras deep learning library provides the TimeseriesGenerator to automatically transform both univariate and multivariate time series data into samples, ready to train deep learning models.

In this tutorial, you will discover how to use the Keras TimeseriesGenerator for preparing time series data for modeling with deep learning methods.

After completing this tutorial, you will know:

How to define the TimeseriesGenerator generator and use it to fit deep learning models.
How to prepare a generator for univariate time series and fit MLP and LSTM models.
How to prepare a generator for multivariate time series and fit an LSTM model.

Let’s get started.


How to Use the TimeseriesGenerator for Time Series Forecasting in Keras

Photo by Chris Fithall, some rights reserved.

Tutorial Overview

This tutorial is divided into six parts; they are:

Problem with Time Series for Supervised Learning
How to Use the TimeseriesGenerator
Univariate Time Series Example
Multivariate Time Series Example
Multivariate Inputs and Dependent Series Example
Multi-step Forecasts Example

Problem with Time Series for Supervised Learning

Time series data requires preparation before it can be used to train a supervised learning model, such as a deep learning model.

For example, a univariate time series is represented as a vector of observations:

[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

A supervised learning algorithm requires that data is provided as a collection of samples, where each sample has an input component (X) and an output component (y).

X, y
example input, example output
example input, example output
example input, example output
...

The model will learn how to map inputs to outputs from the provided examples.

y = f(X)

A time series must be transformed into samples with input and output components. The transform informs both what the model will learn and how you intend to use the model in the future when making predictions, e.g. what is required to make a prediction (X) and what prediction is made (y).

For a univariate time series where we are interested in one-step predictions, the observations at prior time steps, so-called lag observations, are used as input and the output is the observation at the current time step.

For example, the above 10-step univariate series can be expressed as a supervised learning problem with three time steps for input and one step as output, as follows:

X, y
[1, 2, 3], [4]
[2, 3, 4], [5]
[3, 4, 5], [6]
...

You can write code to perform this transform yourself; for example, see the post:

How to Convert a Time Series to a Supervised Learning Problem in Python
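
As a rough illustration of the manual approach, a small helper like the one below can build the input/output pairs with a simple loop; the split_sequence name is illustrative and not taken from the post above:

# a minimal sketch of the manual transform (illustrative helper, not the Keras API)
from numpy import array

def split_sequence(sequence, n_steps):
    X, y = list(), list()
    for i in range(len(sequence) - n_steps):
        # n_steps lag observations as input, the next observation as output
        X.append(sequence[i:i + n_steps])
        y.append(sequence[i + n_steps])
    return array(X), array(y)

X, y = split_sequence([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], 3)
print(X[0], y[0])  # expected: [1 2 3] 4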

Alternately, when you are interested in training neural network models with Keras, you can use the TimeseriesGenerator class.


How to Use the TimeseriesGenerator

Keras provides the TimeseriesGenerator that can be used to automatically transform a univariate or multivariate time series dataset into a supervised learning problem.

There are two parts to using the TimeseriesGenerator: defining it and using it to train models.

Defining a TimeseriesGenerator

You can create an instance of the class and specify the input and output aspects of your time series problem. It will provide an instance of a Sequence class that can then be used to iterate across the inputs and outputs of the series.

In most time series prediction problems, the input and output series will be the same series.

For example:

# load data
inputs = ...
outputs = ...
# define generator
generator = TimeseriesGenerator(inputs, outputs, ...)
# iterate over the generator
for i in range(len(generator)):
    ...

Technically, the class is not a generator in the sense that it is not a Python Generator and you cannot use the next() function on it.

In addition to specifying the input and output aspects of your time series problem, there are some additional parameters that you should configure; for example:

length: The number of lag observations to use in the input portion of each sample (e.g. 3).
batch_size: The number of samples to return on each iteration (e.g. 32).

You must define a length argument based on your designed framing of the problem. That is the desired number of lag observations to use as input.

You must also define the batch size; it is used as the batch size of your model during training. If the number of samples in your dataset is less than your batch size, you can set the batch size in the generator, and in your model, to the total number of samples in the generator, found by calculating its length; for example:

print(len(generator))
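
For instance, with the default stride and sampling rate, the number of samples is the length of the series minus the number of lag observations, so a single batch containing every sample could be configured with something like the sketch below (series and n_input are assumed to be defined as in the examples that follow):

# sketch: one batch holding all samples (default sampling_rate and stride assumed)
n_samples = len(series) - n_input
generator = TimeseriesGenerator(series, series, length=n_input, batch_size=n_samples)
print(len(generator))  # 1 batch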

There are also other arguments such as defining start and end offsets into your data, the sampling rate, stride, and more. You are less likely to use these features, but you can see the full API for more details.
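
As a hedged illustration only, these optional arguments can be set when the generator is defined; the values below are arbitrary placeholders:

# illustrative values only; see the Keras API documentation for details
generator = TimeseriesGenerator(series, series,
                                length=2,
                                sampling_rate=1,   # spacing between lag observations within a sample
                                stride=1,          # spacing between the start points of successive samples
                                start_index=0,     # first index of the data to use
                                end_index=None,    # last index of the data to use (None means the end)
                                batch_size=8)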

The samples are not shuffled by default. This is useful for some recurrent neural networks like LSTMs that maintain state across samples within a batch.

It can benefit other neural networks, such as CNNs and MLPs, to shuffle the samples when training. Shuffling can be enabled by setting the 'shuffle' argument to True. This will have the effect of shuffling samples returned for each batch.
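
For example, assuming a series prepared as in the examples below, a shuffling generator might be defined as:

# shuffle the samples returned in each batch (sketch)
generator = TimeseriesGenerator(series, series, length=2, batch_size=8, shuffle=True)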

At the time of writing, the TimeseriesGenerator is limited to one-step outputs. Multi-step time series forecasting is not supported.

Training a Model with a TimeseriesGenerator

Once a TimeseriesGenerator instance has been defined, it can be used to train a neural network model.

A model can be trained using the TimeseriesGenerator as a data generator. This can be achieved by fitting the defined model using the fit_generator() function.

This function takes the generator as an argument. It also takes a steps_per_epoch argument that defines the number of samples to use in each epoch. This can be set to the length of the TimeseriesGenerator instance to use all samples in the generator.

For example:

# define generator
generator = TimeseriesGenerator(...)
# define model
model = ...
# fit model
model.fit_generator(generator, steps_per_epoch=len(generator), ...)

Similarly, the generator can be used to evaluate a fit model by calling the evaluate_generator() function, and using a fit model to make predictions on new data with the predict_generator() function.

A model fit with the data generator does not have to use the generator versions of the evaluate and predict functions. They are only needed if you wish to have the data generator prepare the data for the model.
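
A minimal sketch, assuming a fit model and a generator prepared over the data to be evaluated or predicted:

# evaluate a fit model on data prepared by the generator
loss = model.evaluate_generator(generator, steps=len(generator))
# make predictions on data prepared by the generator
yhat = model.predict_generator(generator, steps=len(generator))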

Univariate Time Series Example

We can make the TimeseriesGenerator concrete with a worked example with a small contrived univariate time series dataset.

First, let’s define our dataset.

# define dataset
series = array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])

We will choose to frame the problem where the last two lag observations will be used to predict the next value in the sequence. For example:

X, y
[1, 2], 3

For now, we will use a batch size of 1, so that we can explore the data in the generator.

# define generator
n_input = 2
generator = TimeseriesGenerator(series, series, length=n_input, batch_size=1)

Next, we can see how many samples will be prepared by the data generator for this time series.

# number of samples
print('Samples: %d' % len(generator))

Finally, we can print the input and output components of each sample, to confirm that the data was prepared as we expected.

for i in range(len(generator)):
    x, y = generator[i]
    print('%s => %s' % (x, y))

The complete example is listed below.

# univariate one step problem
from numpy import array
from keras.preprocessing.sequence import TimeseriesGenerator
# define dataset
series = array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
# define generator
n_input = 2
generator = TimeseriesGenerator(series, series, length=n_input, batch_size=1)
# number of samples
print('Samples: %d' % len(generator))
# print each sample
for i in range(len(generator)):
    x, y = generator[i]
    print('%s => %s' % (x, y))

Running the example first prints the total number of samples in the generator, which is eight.

We can then see that each input array has the shape [1, 2] and each output has the shape [1,].

The observations are prepared as we expected, with two lag observations that will be used as input and the subsequent value in the sequence as the output.

Samples: 8
[[1. 2.]] => [3.]
[[2. 3.]] => [4.]
[[3. 4.]] => [5.]
[[4. 5.]] => [6.]
[[5. 6.]] => [7.]
[[6. 7.]] => [8.]
[[7. 8.]] => [9.]
[[8. 9.]] => [10.]

Now we can fit a model on this data and learn to map the input sequence to the output sequence.

We will start with a simple Multilayer Perceptron, or MLP, model.

The generator will be defined so that all samples will be used in each batch, given the small number of samples.

# define generator
n_input = 2
generator = TimeseriesGenerator(series, series, length=n_input, batch_size=8)

We can define a simple model with one hidden layer with 100 nodes and an output layer that will make the prediction.

# define model
model = Sequential()
model.add(Dense(100, activation='relu', input_dim=n_input))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')

We can then fit the model with the generator using the fit_generator() function. We only have one batch's worth of data in the generator, so we'll set steps_per_epoch to 1. The model will be fit for 200 epochs.

# fit model
model.fit_generator(generator, steps_per_epoch=1, epochs=200, verbose=0)

Once fit, we will make an out-of-sample prediction.

Given the inputs [9, 10], we will make a prediction and expect the model to predict [11], or close to it. The model is not tuned; this is just an example of how to use the generator.

# make a one step prediction out of sample
x_input = array([9, 10]).reshape((1, n_input))
yhat = model.predict(x_input, verbose=0)

The complete example is listed below.

# univariate one step problem with mlp
from numpy import array
from keras.models import Sequential
from keras.layers import Dense
from keras.preprocessing.sequence import TimeseriesGenerator
# define dataset
series = array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
# define generator
n_input = 2
generator = TimeseriesGenerator(series, series, length=n_input, batch_size=8)
# define model
model = Sequential()
model.add(Dense(100, activation='relu', input_dim=n_input))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')
# fit model
model.fit_generator(generator, steps_per_epoch=1, epochs=200, verbose=0)
# make a one step prediction out of sample
x_input = array([9, 10]).reshape((1, n_input))
yhat = model.predict(x_input, verbose=0)
print(yhat)

Running the example prepares the generator, fits the model, and makes the out-of-sample prediction, correctly predicting a value close to 11.

[[11.510406]]

We can also use the generator to fit a recurrent neural network, such as a Long Short-Term Memory network, or LSTM.

The LSTM expects data input to have the shape [samples, timesteps, features], whereas the generator described so far is providing lag observations as features, or the shape [samples, features].

We can reshape the univariate time series prior to preparing the generator from [10, ] to [10, 1] for 10 time steps and 1 feature; for example:

# reshape to [10, 1]
n_features = 1
series = series.reshape((len(series), n_features))

The TimeseriesGenerator will then split the series into samples with the shape [batch, n_input, 1], or [8, 2, 1] for all eight samples in the generator and the two lag observations used as time steps.

The complete example is listed below.

# univariate one step problem with lstm
from numpy import array
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LSTM
from keras.preprocessing.sequence import TimeseriesGenerator
# define dataset
series = array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
# reshape to [10, 1]
n_features = 1
series = series.reshape((len(series), n_features))
# define generator
n_input = 2
generator = TimeseriesGenerator(series, series, length=n_input, batch_size=8)
# define model
model = Sequential()
model.add(LSTM(100, activation='relu', input_shape=(n_input, n_features)))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')
# fit model
model.fit_generator(generator, steps_per_epoch=1, epochs=500, verbose=0)
# make a one step prediction out of sample
x_input = array([9, 10]).reshape((1, n_input, n_features))
yhat = model.predict(x_input, verbose=0)
print(yhat)

Again, running the example prepares the data, fits the model, and predicts the next out of sample value in the sequence.

[[11.092189]]

Multivariate Time Series Example

The TimeseriesGenerator also supports multivariate time series problems.

These are problems where you have multiple parallel series, with observations at the same time step in each series.

We can demonstrate this with an example.

First, we can contrive a dataset of two parallel series.

# define dataset
in_seq1 = array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100])
in_seq2 = array([15, 25, 35, 45, 55, 65, 75, 85, 95, 105])

A standard structure for multivariate time series is to have each time series as a separate column, with rows for the observations at each time step.

The series we have defined are vectors, but we can convert them into columns. We can reshape each series into an array with the shape [10, 1], that is, one column of 10 time steps.
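
As a hedged sketch of this step, each series can be reshaped into a single column and the columns stacked side by side; hstack is used here as one way to do the stacking and is an assumption, not taken from the text above:

from numpy import array
from numpy import hstack
# define the two parallel series
in_seq1 = array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100])
in_seq2 = array([15, 25, 35, 45, 55, 65, 75, 85, 95, 105])
# reshape each series into a [10, 1] column
in_seq1 = in_seq1.reshape((len(in_seq1), 1))
in_seq2 = in_seq2.reshape((len(in_seq2), 1))
# stack the columns horizontally into a [10, 2] dataset
dataset = hstack((in_seq1, in_seq2))
print(dataset.shape)  # (10, 2)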
