ARTIFICIAL NEURAL
NETWORK FOR LOAD
FORECASTING IN SMART
GRID
IEEE PAPER PRESENTATION in the subject of
Artificial Intelligence (A.I.)
By
Utsav Yagnik
(150430707017)
M.E. Electrical (Semester 2), SSEC, Bhavnagar
ABOUT THE PAPER
• Author(s):
1. HAO-TIAN ZHANG
2. FANG-YUAN XU
3. LONG ZHOU
• Energy System Group, City University London, UK.
• Proceedings of the Ninth International Conference on Machine Learning and
Cybernetics, Qingdao, 11-14 July 2010
OUTLINE
 Abstract
 Introduction
 Back propagation network
 Architecture of back propagation network
 Activation functions of back propagation network
 Training procedure
 Data acquisition
 Data normalization
 Creation of neural network
 Training of neural network
 Problems while training the neural network
 Neural network simulation
 Conclusion
ABSTRACT
• To achieve optimized power configuration and energy saving, a large number of new technologies are being applied to the power system in the smart grid.
• Load forecasting is essential for the planning and operation of a power system.
• This paper demonstrates the application of an ANN to forecast the load in a practical situation in Ontario, Canada.
INTRODUCTION
• To provide accurate information for power purchasing and generation, highly accurate load forecasting is required.
• Factors such as seasonal differences, climate changes, weekends and holidays, disasters and political events, power plant operation scenarios, and faults occurring on the network all lead to changes in load demand and generation.
• Owing to their superior characteristics, ANNs are among the most competent methods for practical tasks such as load forecasting.
• In this paper, an ANN is applied to Ontario, Canada, and the factors affecting load forecasting are analysed.
BACK PROPAGATION NETWORK
• As the problem is defined, the relationship between the input and the target is non-linear and complicated, so an ANN is an appropriate method for forecasting the load.
• Various methods are available for load forecasting, but back propagation remains one of the most widely used.
• It learns a relationship between input data and output data that can then be used to predict future outcomes.
ARCHITECTURE OF BACK
PROPAGATION NETWORK
Single neuron model of back propagation network
ARCHITECTURE OF BACK
PROPAGATION NETWORK
• Generally, the output is a function of the bias plus the weighted input. The activation function can be any kind of function, though different choices produce different outputs.
• For a feed-forward network, in general at least one hidden layer is needed before the output layer. A three-layer network is selected as the architecture, because this kind of architecture can approximate any function with a finite number of discontinuities.
• The three-layer architecture is shown in the figure on the next slide.
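• As a minimal sketch (not taken from the paper), the single-neuron computation and the three-layer forward pass can be written in MATLAB as follows; the weights, biases, input values and the hidden-layer size of 8 are illustrative only:

% Illustrative forward pass of the three-layer back propagation network.
p  = [0.3; -0.5; 0.8];                % example input vector (normalized factors)
W1 = randn(8, 3);  b1 = randn(8, 1);  % hidden layer: 8 neurons (assumed size)
W2 = randn(1, 8);  b2 = randn(1, 1);  % output layer: 1 neuron (load estimate)
a1 = tansig(W1*p + b1);               % hidden output: activation of bias + weighted input
y  = purelin(W2*a1 + b2);             % linear output layer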
ARCHITECTURE OF BACK
PROPAGATION NETWORK
Architecture with three layers
ACTIVATION FUNCTIONS OF
BACK PROPAGATION NETWORK
(a) Log-sigmoid
(b) Tan-sigmoid
(c) Linear
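• These are the Neural Network Toolbox's standard transfer functions; a small sketch of their definitions (the input range is an assumption for plotting):

n = linspace(-5, 5, 101);             % example range of net inputs
a_log = 1 ./ (1 + exp(-n));           % (a) log-sigmoid, logsig(n), output in (0, 1)
a_tan = 2 ./ (1 + exp(-2*n)) - 1;     % (b) tan-sigmoid, tansig(n), output in (-1, +1)
a_lin = n;                            % (c) linear, purelin(n), unbounded output
plot(n, a_log, n, a_tan, n, a_lin); legend('logsig', 'tansig', 'purelin');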
TRAINING PROCEDURES
• The neural network toolbox (nntool) is used for training the network.
• The training is based on the load demand and weather conditions of the province of Ontario, located in southern Canada.
• According to its weather conditions, Ontario can be divided into three regions: southwest, central and east, and north. The population is concentrated in the southeastern part of the province, which includes two of Canada's largest cities, Toronto and Ottawa.
DATA ACQUISITION
• To train a neural network, input and target output data must be supplied to it.
• For load forecasting, the input vectors include the information on the factors affecting changes in load demand, such as weather information, holidays or working days, faults occurring in the network, and so on.
• The output targets are the real-time load scenarios, i.e. the demand observed at the same time as the corresponding input vectors.
DATA ACQUISITION
• This paper takes the following factors into consideration:
1. Temperature (˚C)
2. Dew Point Temperature (˚C)
3. Relative Humidity (%)
4. Wind speed (km/h)
5. Wind Direction (10's deg)
6. Visibility (km)
7. Atmospheric pressure (kPa)
8. Logical adjustment of weekday or weekend
DATA ACQUISITION
• The data was gathered hourly from the historical weather records kept by the weather stations.
• The load demand data also needs to be gathered hourly, so that it corresponds to the weather data.
• In this paper, two years of weather data and load data are collected to train and test the created network.
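• A sketch of how the hourly records could be arranged for training (variable names are illustrative, not from the paper); each column is one hour and the rows follow the eight factors listed on the previous slide:

X = [temperature; dew_point; humidity; wind_speed; ...
     wind_direction; visibility; pressure; is_weekday];   % 8-by-N input matrix, one column per hour
T = load_demand;                                          % 1-by-N hourly load target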
DATA NORMALIZATION
• To avoid driving the neurons too far into saturation, all the gathered data needs to be normalized once it is acquired.
• As in a per-unit system, each input and target value is divided by the maximum absolute value of the corresponding factor.
• Each normalized value then lies in the range −1 to +1, so that the ANN can handle the data easily.
• In addition, weekdays are represented as 1 and weekends as 0.
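• A hedged sketch of this per-unit style normalization, assuming X and T are arranged as in the earlier sketch (names illustrative):

Xn = X ./ max(abs(X), [], 2);   % divide each factor (row) by its maximum absolute value
                                % (implicit expansion; use bsxfun on older MATLAB releases)
Tn = T ./ max(abs(T));          % normalize the load target in the same way
Xn(8, :) = is_weekday;          % weekday/weekend row: 1 = weekday, 0 = weekend (already in range)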
CREATION OF NEURAL NETWORK
• In MATLAB, a neural network can be created with the “nntool” command, which opens a toolbox GUI in which the input and target output data can be inserted.
• A three-layer architecture has been chosen for the simulation.
• Because the practical input values range from −1 to +1, the transfer function of the first layer is set to tan-sigmoid, a hyperbolic tangent sigmoid transfer function.
• The transfer function of the output layer is set to a linear function, which calculates the layer's output directly from its net input.
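• The same setup can also be scripted instead of built in the nntool GUI; a minimal sketch (assuming one hidden layer of 8 neurons, as used with Trainlm later):

net = feedforwardnet(8);                % three-layer network: inputs -> 8 hidden neurons -> output
net.layers{1}.transferFcn = 'tansig';   % hidden layer: hyperbolic tangent sigmoid
net.layers{2}.transferFcn = 'purelin';  % output layer: linear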
CREATION OF NEURAL NETWORK
• One advantage of the linear output transfer function is that, because linear output neurons let the output take on any value, there is no difficulty in measuring the differences between output and target.
• The next step is the selection of the number of neurons and the training function. Generally, Trainbr and Trainlm are the best choices among all the training functions in the MATLAB toolbox.
• In this paper, 8 neurons are used with the Trainlm algorithm and 30 neurons with the Trainbr algorithm.
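• The two configurations compared in the paper could be set up as follows (a sketch using the neuron counts stated above):

net_lm = feedforwardnet(8);     % 8 hidden neurons
net_lm.trainFcn = 'trainlm';    % Levenberg-Marquardt back propagation
net_br = feedforwardnet(30);    % 30 hidden neurons
net_br.trainFcn = 'trainbr';    % Bayesian regularization back propagation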
TRAINING OF THE NETWORK
• The hourly weather conditions for 2007 and the weekday/weekend logic in Ontario are defined as the training input; the hourly load demand changes in Ontario during 2007 are defined as the training target.
• The training performances of the Trainlm and Trainbr algorithms are shown in the figures on the following two slides.
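• A hedged sketch of the training step, assuming X2007 and T2007 hold the normalized 2007 input and target described above:

[net, tr] = train(net, X2007, T2007);   % back propagation training; stops on the criteria listed later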
“TRAINLM” TRAINING
PERFORMANCE
“TRAINBR” TRAINING
PERFORMANCE
PROBLEMS WHILE TRAINING THE
NETWORKS
• For both training algorithms, Trainbr and Trainlm, the procedure stops when any of the following conditions occurs (the corresponding trainParam settings are sketched after this list):
1. The number of epochs reaches the maximum value
2. The training time reaches the preset limit
3. The performance goal is reached
4. The gradient falls below min_grad
5. Mu exceeds mu_max
6. The validation performance has increased more than max_fail times since the last time it decreased
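• These stopping criteria correspond to trainParam fields of the chosen training function; the values below are illustrative, not the settings used in the paper:

net.trainParam.epochs   = 1000;   % 1. maximum number of epochs
net.trainParam.time     = Inf;    % 2. maximum training time (seconds)
net.trainParam.goal     = 1e-3;   % 3. performance (error) goal
net.trainParam.min_grad = 1e-7;   % 4. minimum performance gradient
net.trainParam.mu_max   = 1e10;   % 5. maximum value of mu
net.trainParam.max_fail = 6;      % 6. maximum consecutive validation failures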
NEURAL NETWORK SIMULATION
• After training, the network has to be checked to see whether it meets expectations. Another set of input vectors and demand scenarios is needed to test the network.
• A comparison has to be made between the test output and the real demand.
• After the simulation, a set of outputs is obtained from the trained neural network.
• The simulation output and the simulation target are used to compute the mean squared error, in order to analyse how successful the neural network application is.
NEURAL NETWORK SIMULATION
• In this project, the hourly weather conditions for 2008 and the weekday/weekend logic in Ontario are used as the simulation input, and the hourly load demand scenarios in Ontario during 2008 are used as the simulation target.
• The mean squared error follows the standard definition MSE = (1/N) * Σ (target_i − output_i)^2, where N is the number of simulated hours.
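• A hedged sketch of the simulation and error check, assuming X2008 and T2008 hold the normalized 2008 input and target:

Y = net(X2008);                    % simulate the trained network on the 2008 input
mse_2008 = mean((T2008 - Y).^2);   % mean squared error, as defined above
% equivalently: perform(net, T2008, Y), since the default performance function is 'mse'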
COMPARISON OF OUTCOMES WITH
DIFFERENT NUMBER OF NEURONS
COMPARISON OF OUTCOMES WITH
DIFFERENT NUMBER OF NEURONS
• On the previous slide, the upper figure shows the Trainbr outcome with 8 neurons and the lower figure shows the same with 10 neurons.
• The green track is the test simulation result and the blue track is the real load demand, which is provided by the electricity industry in Ontario.
• The horizontal axis represents time and the vertical axis represents the load, which has been normalized.
• The smaller the mean squared error, the better the created neural network performs.
COMPARISON OF OUTCOMES WITH
DIFFERENT NUMBER OF NEURONS
COMPARISON OF OUTCOMES WITH
DIFFERENT NUMBER OF NEURONS
• On the previous slide, the upper figure shows the results of Trainlm with 8 and 30 neurons, and the lower figure shows the results of both Trainbr and Trainlm with 8 neurons.
• In the lower figure, the blue track is the test simulation target, the red one is the result simulated by Trainlm, and the green one is the result simulated by Trainbr.
• The simulation result of Trainbr is much closer to the test target than that of Trainlm, although the error goal of Trainbr is much larger. The Trainbr result follows the general track of the test target, whereas Trainlm can only reproduce the maximum and minimum values in each cycle.
COMPARISON OF OUTCOMES WITH
DIFFERENT NUMBER OF NEURONS
• Theoretically, applying more neurons to the neural network could improve its performance.
• However, over-fitting then occurs much more noticeably. In the figure on the previous slide, the red track, which represents Trainlm with 30 neurons, is much closer to the test target. Over-fitting is the main problem, and it cannot be avoided entirely.
• So, although the MSE of Trainlm is lower than that of Trainbr when the number of neurons is increased, as shown on the next slide, Trainlm suffers from over-fitting, which can degrade the quality of the simulation.
• Finally, Trainbr appears to be the better algorithm to employ for load forecasting by back propagation.
MSE (MEAN SQUARED ERRORS)
CONCLUSION
• This paper focused on the behaviour of different training algorithms for load forecasting with a back propagation neural network.
• Based on this work, the Trainbr algorithm, which is integrated in the Neural Network Toolbox in MATLAB, is regarded as one of the best choices for load forecasting. If more accurate load forecasts are required, more neurons need to be applied in the network architecture.
• On the other hand, over-fitting must be considered to ensure that the network simulates the load situation well. Owing to practical limitations, the input vectors did not take all of the relevant information into account.
• Parts of the simulation did not match the real demand very well, and large squared errors sometimes occurred. If enough information were gathered and the networks were trained more meticulously, better results could be obtained for load forecasting in the smart grid.
THANK YOU