
How to Develop Multilayer Perceptron Models for Time Series Forecasting


Multilayer Perceptrons, or MLPs for short, can be applied to time series forecasting.

A challenge with using MLPs for time series forecasting is in the preparation of the data. Specifically, lag observations must be flattened into feature vectors.

In this tutorial, you will discover how to develop a suite of MLP models for a range of standard time series forecasting problems.

The objective of this tutorial is to provide standalone examples of each model on each type of time series problem as a template that you can copy and adapt for your specific time series forecasting problem.


After completing this tutorial, you will know:

How to develop MLP models for univariate time series forecasting.
How to develop MLP models for multivariate time series forecasting.
How to develop MLP models for multi-step time series forecasting.

Let’s get started.


How to Develop Multilayer Perceptron Models for Time Series Forecasting
Photo by Bureau of Land Management, some rights reserved.

Tutorial Overview

This tutorial is divided into four parts; they are:

Univariate MLP Models
Multivariate MLP Models
Multi-Step MLP Models
Multivariate Multi-Step MLP Models

Univariate MLP Models

Multilayer Perceptrons, or MLPs for short, can be used to model univariate time series forecasting problems.

Univariate time series are a dataset comprised of a single series of observations with a temporal ordering and a model is required to learn from the series of past observations to predict the next value in the sequence.

This section is divided into two parts; they are:

Data Preparation
MLP Model

Data Preparation

Before a univariate series can be modeled, it must be prepared.

The MLP model will learn a function that maps a sequence of past observations as input to an output observation. As such, the sequence of observations must be transformed into multiple examples from which the model can learn.

Consider a given univariate sequence:

[10, 20, 30, 40, 50, 60, 70, 80, 90]

We can divide the sequence into multiple input/output patterns called samples, where three time steps are used as input and one time step is used as output for the one-step prediction that is being learned.

X,              y
10, 20, 30      40
20, 30, 40      50
30, 40, 50      60
...

The split_sequence() function below implements this behavior and will split a given univariate sequence into multiple samples where each sample has a specified number of time steps and the output is a single time step.

# split a univariate sequence into samples
def split_sequence(sequence, n_steps):
	X, y = list(), list()
	for i in range(len(sequence)):
		# find the end of this pattern
		end_ix = i + n_steps
		# check if we are beyond the sequence
		if end_ix > len(sequence)-1:
			break
		# gather input and output parts of the pattern
		seq_x, seq_y = sequence[i:end_ix], sequence[end_ix]
		X.append(seq_x)
		y.append(seq_y)
	return array(X), array(y)

We can demonstrate this function on our small contrived dataset above.

The complete example is listed below.

# univariate data preparation
from numpy import array

# split a univariate sequence into samples
def split_sequence(sequence, n_steps):
	X, y = list(), list()
	for i in range(len(sequence)):
		# find the end of this pattern
		end_ix = i + n_steps
		# check if we are beyond the sequence
		if end_ix > len(sequence)-1:
			break
		# gather input and output parts of the pattern
		seq_x, seq_y = sequence[i:end_ix], sequence[end_ix]
		X.append(seq_x)
		y.append(seq_y)
	return array(X), array(y)

# define input sequence
raw_seq = [10, 20, 30, 40, 50, 60, 70, 80, 90]
# choose a number of time steps
n_steps = 3
# split into samples
X, y = split_sequence(raw_seq, n_steps)
# summarize the data
for i in range(len(X)):
	print(X[i], y[i])

Running the example splits the univariate series into six samples where each sample has three input time steps and one output time step.

[10 20 30] 40
[20 30 40] 50
[30 40 50] 60
[40 50 60] 70
[50 60 70] 80
[60 70 80] 90

Now that we know how to prepare a univariate series for modeling, let’s look at developing an MLP model that can learn the mapping of inputs to outputs.


MLP Model

A simple MLP model has a single hidden layer of nodes, and an output layer used to make a prediction.

We can define an MLP for univariate time series forecasting as follows.

# define model
model = Sequential()
model.add(Dense(100, activation='relu', input_dim=n_steps))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')

Important in the definition is the shape of the input; that is, what the model expects as input for each sample in terms of the number of time steps.

The number of time steps as input is the number we chose when preparing our dataset as an argument to the split_sequence() function.

The input dimension for each sample is specified in the input_dim argument on the definition of the first hidden layer. Technically, the model will view each time step as a separate feature instead of separate time steps.

We almost always have multiple samples, therefore, the model will expect the input component of training data to have the dimensions or shape:

[samples, features]

Our split_sequence() function in the previous section outputs the X with the shape [samples, features], ready to use for modeling.
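As a quick check, assuming the split_sequence() function and the contrived dataset from the previous section, we can confirm this shape directly; this small snippet is illustrative and not part of the original listing.

# illustrative check of the [samples, features] shape produced by split_sequence()
from numpy import array

raw_seq = [10, 20, 30, 40, 50, 60, 70, 80, 90]
n_steps = 3
X, y = split_sequence(raw_seq, n_steps)  # function defined in the previous section
print(X.shape, y.shape)  # expected: (6, 3) (6,)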

The model is fit using the efficient Adam version of stochastic gradient descent and optimized using the mean squared error, or 'mse', loss function.

Once the model is defined, we can fit it on the training dataset.

# fit model
model.fit(X, y, epochs=2000, verbose=0)

After the model is fit, we can use it to make a prediction.

We can predict the next value in the sequence by providing the input:

[70, 80, 90]

And expecting the model to predict something like:

[100]

The model expects the input shape to be two-dimensional with [samples, features]; therefore, we must reshape the single input sample before making the prediction, e.g. with the shape [1, 3] for 1 sample and 3 time steps used as input features.

# demonstrate prediction
x_input = array([70, 80, 90])
x_input = x_input.reshape((1, n_steps))
yhat = model.predict(x_input, verbose=0)

We can tie all of this together and demonstrate how to develop an MLP for univariate time series forecasting and make a single prediction.

# univariate mlp example
from numpy import array
from keras.models import Sequential
from keras.layers import Dense
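The listing is truncated above. A complete, runnable sketch that assembles the earlier snippets is shown below; the hidden layer size (100) and the number of epochs (2000) follow the snippets shown earlier, a final print(yhat) is added here to display the result, and the exact completion of the original listing is an assumption.

# univariate mlp example: a complete sketch assembled from the snippets above
# (the full original listing is truncated, so this completion is an assumption)
from numpy import array
from keras.models import Sequential
from keras.layers import Dense

# split a univariate sequence into samples
def split_sequence(sequence, n_steps):
	X, y = list(), list()
	for i in range(len(sequence)):
		# find the end of this pattern
		end_ix = i + n_steps
		# check if we are beyond the sequence
		if end_ix > len(sequence)-1:
			break
		# gather input and output parts of the pattern
		seq_x, seq_y = sequence[i:end_ix], sequence[end_ix]
		X.append(seq_x)
		y.append(seq_y)
	return array(X), array(y)

# define input sequence
raw_seq = [10, 20, 30, 40, 50, 60, 70, 80, 90]
# choose a number of time steps
n_steps = 3
# split into samples
X, y = split_sequence(raw_seq, n_steps)
# define model
model = Sequential()
model.add(Dense(100, activation='relu', input_dim=n_steps))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')
# fit model
model.fit(X, y, epochs=2000, verbose=0)
# demonstrate prediction
x_input = array([70, 80, 90])
x_input = x_input.reshape((1, n_steps))
yhat = model.predict(x_input, verbose=0)
print(yhat)

Running a sketch like this should print a value close to 100, although the exact output will vary given the stochastic nature of the algorithm.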
