Time Delayed Recurrent Neural Network
for Multi-Step Prediction
By Kostas Hatalis
hatalis@gmail.com
Dept. of Electrical & Computer Engineering
Lehigh University, Bethlehem, PA
2014
Renewable Power Forecasting
Forecasting is essential to the integration of renewable power generation into the smart grid.
General Short Term Time Frames and Methods
Resolution of forecasting:
Seconds: e.g. turbine/converter control.
Minutes: e.g. balancing and transmission.
Hours: e.g. storage and scheduling.
Days: e.g. farm operations.
Two categories of forecasting methods:
Physical: meteorological models.
Statistical: time series / machine learning.
My research focuses on statistical models.
Time Series Forecasting
Time series forecasts are made as follows:
1 Study features of the series.
2 Remove trend and seasonality to get stationary residuals.
3 Test whether the residuals are IID noise; if not, fit an ARMA model to them.
4 Forecast the residual series using the minimum mean squared error predictor, i.e. the conditional expectation of the future value.
Trend and Seasonal Fitting
The classical decomposition model:
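The decomposition itself was shown as a figure on this slide; in its standard additive form it is
$$X_t = m_t + s_t + Y_t,$$
where $m_t$ is the slowly varying trend component, $s_t$ is the seasonal component, and $Y_t$ is the stationary random noise component left over for ARMA modelling.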
Fitting ARMA Models
For our residual sequence, $Y_t$ is an ARMA(p, q) process if
$$Y_t - \phi_1 Y_{t-1} - \dots - \phi_p Y_{t-p} = Z_t + \theta_1 Z_{t-1} + \dots + \theta_q Z_{t-q},$$
where $Z_t$ is white noise with mean zero and variance $\sigma^2$.
Once we estimate all the parameters we can forecast future values at time $t + k$. E.g., for an ARMA(2,1) model the one-step-ahead forecast is
$$\hat{Y}_{t+1} = \phi_1 Y_t + \phi_2 Y_{t-1} + \theta_1 Z_t,$$
where the unknown future noise $Z_{t+1}$ is replaced by its expectation, zero.
ARMA is a model for the conditional mean of a process.
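As an illustrative sketch only (not code from this work), fitting an ARMA(2,1) model to a residual series and producing a one-step forecast with statsmodels might look like this; the residual series below is simulated purely as a placeholder:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA  # ARMA(p, q) is ARIMA(p, 0, q)

# Placeholder residual series; in practice this is the detrended,
# deseasonalized series from the previous step.
rng = np.random.default_rng(0)
residuals = rng.standard_normal(500)

fit = ARIMA(residuals, order=(2, 0, 1)).fit()  # ARMA(2,1)
y_hat = fit.forecast(steps=1)                  # minimum-MSE forecast of Y_{t+1}
print(fit.params, y_hat)
```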
Time Series Forecasting
Other time series forecasting models include:
Weighted moving average
Kalman filtering
Exponential smoothing
Autoregressive integrated moving average (ARIMA)
Seasonal ARIMA (SARIMA)
Extrapolation methods
Downside? Most assume the data are stationary (the distribution and its parameters do not change over time), but renewable generation is non-stationary!
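A quick way to check the stationarity assumption in practice is an augmented Dickey-Fuller test; a small sketch with statsmodels (the series here is a simulated stand-in, not real renewable data):

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

# Simulated stand-in for a renewable power measurement series.
rng = np.random.default_rng(0)
series = np.cumsum(rng.standard_normal(300))  # a random walk: non-stationary

# Null hypothesis of the ADF test: the series has a unit root (non-stationary).
stat, pvalue, *_ = adfuller(series)
print(f"ADF statistic = {stat:.3f}, p-value = {pvalue:.3f}")
# A large p-value means we cannot reject non-stationarity.
```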
Evaluation Methods
Methods for evaluating predicted values against observed values include the following (a short sketch computing several of them follows the list):
Mean Squared Error (MSE)
Root Mean Squared Error (RMSE)
Median Absolute Deviation (MAD)
Mean Absolute Percentage Error (MAPE)
Sum of Squared Error (SSE)
Mean Absolute Error (MAE)
Signed Mean Squared Error (SMSE)
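A minimal sketch (illustrative, not code from this work) computing several of these measures with NumPy:

```python
import numpy as np

def forecast_errors(y_true, y_pred):
    """Common point-forecast error measures for observed vs. predicted values."""
    err = np.asarray(y_true) - np.asarray(y_pred)
    return {
        "MSE":  np.mean(err ** 2),
        "RMSE": np.sqrt(np.mean(err ** 2)),
        "MAE":  np.mean(np.abs(err)),
        "MAD":  np.median(np.abs(err - np.median(err))),          # median absolute deviation
        "MAPE": 100 * np.mean(np.abs(err / np.asarray(y_true))),  # assumes no zeros in y_true
        "SSE":  np.sum(err ** 2),
    }
```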
Machine Learning Approach
Machine learning based forecasting updates its model with every new observation.
Prediction is based on supervised learning, which consists of learning the link between two datasets: the observed data X and a variable y that we are trying to predict, usually called the “targets”.
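Concretely, a univariate series can be framed as supervised learning by using lagged values as the inputs X and the next value as the target y; a small illustrative sketch:

```python
import numpy as np

def make_supervised(series, n_lags):
    """Turn a 1-D series into (X, y): each row of X holds n_lags past values,
    and y holds the value to predict one step ahead."""
    series = np.asarray(series)
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y
```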
Statistical Learning Forecasting Methods
A number of machine learning methods have been applied to point forecasting, such as support vector machines, Bayesian networks, and k-nearest neighbors. For point forecasting, Artificial Neural Networks (ANNs) are amongst the most popular.
Universal Approximation Theorem
An artificial neural network with a single hidden layer containing a finite number of neurons can approximate any continuous function on a compact input domain to arbitrary accuracy, provided the activation function f of the hidden neurons is non-linear.
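As a toy illustration of this setting (not part of this work), a single hidden layer with a tanh activation fit to a smooth target using scikit-learn:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

X = np.linspace(-3, 3, 300).reshape(-1, 1)
y = np.sin(X).ravel()  # smooth target function to approximate

# One hidden layer, finite number of neurons, non-linear activation.
net = MLPRegressor(hidden_layer_sizes=(20,), activation="tanh",
                   max_iter=5000, random_state=0)
net.fit(X, y)
print("training R^2:", net.score(X, y))
```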
NARNET - Forecasting Ocean Waves
In my initial forecasting work I used an ANN called a nonlinear autoregressive network (NARNET) to forecast wave heights.
https://en.wikipedia.org/wiki/Significant_wave_height
NARNET - Building a Neural Network
When building an ANN it's important to define:
Architecture: number of layers and nodes.
Activation function: sigmoid, tanh, etc.
Cost function: mean squared error, etc.
Learning: gradient descent backpropagation, etc.
NARNET - Model
For time series prediction, NARNET is a recurrent ANN.
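The network diagram was shown as a figure on this slide. A rough sketch of the underlying idea, a plain MLP over lagged values with closed-loop multi-step prediction (illustrative only, not the MATLAB narnet configuration used in the actual experiments):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def fit_nar(series, n_lags=5):
    """Nonlinear autoregression: y_t = f(y_{t-1}, ..., y_{t-n_lags})."""
    series = np.asarray(series)
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    net = MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                       max_iter=5000, random_state=0)
    return net.fit(X, y)

def multi_step_forecast(net, history, n_lags, horizon):
    """Closed-loop multi-step prediction: feed each forecast back in as an input."""
    window = list(np.asarray(history)[-n_lags:])
    preds = []
    for _ in range(horizon):
        y_hat = net.predict(np.array(window[-n_lags:]).reshape(1, -1))[0]
        preds.append(y_hat)
        window.append(y_hat)
    return np.array(preds)
```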
NARNET - Results
(Forecast vs. observed wave height results were presented as plots in the original slides.)
PSONET - Simulated Ocean Waves
For short term forecasting, the ocean wave data had to be simulated.
NARNET also has a few problems, such as using too much memory and occasionally getting stuck in local minima, so a new training solution was needed.
PSONET - Particle Swarm Optimization
The adaptive particle swarm optimization (APSO) algorithm is inspired by flocks of birds or schools of fish searching for food:
$$\vec{v}_i(t+1) = \omega\,\vec{v}_i(t) + c_1\phi_1\left(\vec{p}_i(t) - \vec{x}_i(t)\right) + c_2\phi_2\left(\vec{p}_g(t) - \vec{x}_i(t)\right)$$
$$\vec{x}_i(t+1) = \vec{x}_i(t) + \Delta t\,\vec{v}_i(t+1)$$
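A minimal sketch of this update rule (the basic velocity/position update only; the adaptive parameter control of APSO and its coupling to the network weights are omitted):

```python
import numpy as np

def pso_step(x, v, p_best, g_best, w=0.7, c1=1.5, c2=1.5, dt=1.0, rng=None):
    """One velocity/position update per particle: inertia + cognitive + social terms."""
    rng = rng or np.random.default_rng()
    phi1 = rng.random(x.shape)  # random factors in [0, 1)
    phi2 = rng.random(x.shape)
    v_new = w * v + c1 * phi1 * (p_best - x) + c2 * phi2 * (g_best - x)
    x_new = x + dt * v_new
    return x_new, v_new
```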
PSONET - Results
Table: Error statistics on the higher-noise data.
PSONAR 5s 10s 30s 60s
MSE 0.1603 0.9153 8.9066 13.4073
RMSE 0.4579 0.9567 2.9843 3.6615
MAPE 0.1734 0.2964 2.0758 3.2429
MAD 0.3470 0.6783 2.4161 2.8644
CC 0.9974 0.9693 0.3743 0.0902
NARNET 5s 10s 30s 60s
MSE 0.2097 1.4119 10.8491 15.0918
RMSE 0.4003 1.1882 3.2937 3.7525
MAPE 0.2531 0.7421 1.3229 4.7647
MAD 0.3698 0.8938 2.0769 3.0842
CC 0.9971 0.9165 0.31433 0.12301
Work Done
[1] Hatalis, Kostas, et al. "Multi-step forecasting of wave power using a nonlinear recurrent neural network." 2014 IEEE PES General Meeting | Conference & Exposition. IEEE, 2014.
[2] Hatalis, Kostas, et al. "Adaptive particle swarm optimization learning in a time delayed recurrent neural network for multi-step prediction." 2014 IEEE Symposium on Foundations of Computational Intelligence (FOCI). IEEE, 2014.
[3] Hatalis, Kostas, et al. "Swarm Based Parameter Estimation of Wave Characteristics for Control in Ocean Energy Farms." Proceedings of the IEEE Power & Energy Society General Meeting, 2015.
[4] Hatalis, Kostas, et al. "Particle Swarm Based Model Exploitation for Parameter Estimation of Wave Realizations." IEEE Symposium Series on Computational Intelligence, 2016.