Short-Term Load Forecasting of Australian
National Electricity Market by Hierarchical
Extreme Learning Machine
Park Jee Hyun
07 JUL 2017
-
Sungkyunkwan University – undergraduate student
Nanyang Technological University – exchange student
Contents
1. Introduction
2. Hierarchical extreme learning machine (H-ELM)
3. Implemented structure
4. Test results
5. Conclusion
6. References
1. Introduction
• Predicting the future electricity demand (load) is considered an
essential process in power system operation.
• Good-quality Short-Term Load Forecasting (STLF) is very important
from an economic standpoint and contributes to the reliability of
the power system [5], [6].
• There have been continuous efforts to achieve high load
forecasting accuracy, focusing on:
• Improving the performance of the learning algorithm (model)
• Extracting the input data features well through the pre-processing
process
2. Hierarchical Extreme Learning Machine
a. Extreme Learning Machine theory
b. H-ELM framework
2.a. Extreme Learning Machine theory
• ELM learning can be summarized in the following three major
steps (a minimal code sketch follows):
1) Randomly assign the input weights ω and hidden-node biases b.
2) Calculate the hidden-layer output matrix H for the training data.
3) Analytically calculate the output weights β as the least-squares
solution of Hβ = T.
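Below is a minimal NumPy sketch of these three steps, intended only as an illustration: it assumes a sigmoid activation and a small ridge term in place of the plain Moore-Penrose pseudoinverse, and the names (ELMRegressor, n_hidden, reg) are ours, not from the slides.

```python
import numpy as np


class ELMRegressor:
    def __init__(self, n_hidden=100, reg=1e-3, rng=None):
        self.n_hidden = n_hidden
        self.reg = reg                      # small ridge term for numerical stability
        self.rng = np.random.default_rng(rng)

    def _hidden(self, X):
        # Step 2: hidden-layer output matrix H = sigmoid(X w + b)
        return 1.0 / (1.0 + np.exp(-(X @ self.w + self.b)))

    def fit(self, X, T):
        n_features = X.shape[1]
        # Step 1: randomly assign input weights w and hidden-node biases b (never tuned afterwards)
        self.w = self.rng.standard_normal((n_features, self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = self._hidden(X)
        # Step 3: analytically solve the output weights beta by regularized least squares,
        # the ridge counterpart of the pseudoinverse solution of H beta = T
        A = H.T @ H + self.reg * np.eye(self.n_hidden)
        self.beta = np.linalg.solve(A, H.T @ T)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta


# toy usage: fit a small synthetic regression problem
X_toy = np.random.rand(200, 5)
T_toy = np.sin(X_toy.sum(axis=1, keepdims=True))
print(ELMRegressor(n_hidden=50, rng=0).fit(X_toy, T_toy).predict(X_toy[:3]))
```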
2.a. Extreme Learning Machine theory
• Pros:
• Due to the characteristics of the ELM-based SLFN, we can obtain very
short training times, excellent efficiency, and good generalization
performance [3].
• Cons:
• ELM randomly selects the input weights and biases of the hidden nodes,
which undermines the stability of its outputs [4].
2.b. H-ELM framework
• H-ELM consists of two parts:
1) Unsupervised hierarchical feature representation
2) Supervised feature classification
2.b. H-ELM framework
1) Unsupervised hierarchical feature representation
• It receives the raw input data and projects it into the ELM random feature
space, which plays an important role in extracting the hidden feature
information of the input training data.
• High-level sparse features are obtained as output values through N layers
of unsupervised learning and are then passed on as the input data of
the second stage.
2) Supervised feature classification
• The final decision is made through regression, following the same
process as the original ELM (a simplified sketch of the two-part
structure follows).
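The two-part structure can be sketched as follows. This is a simplified illustration, not the exact H-ELM algorithm: the original learns each autoencoder's weights with an l1 (sparsity) penalty, whereas this sketch uses a ridge (l2) solution, and all class and parameter names are ours.

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


class ELMAutoEncoder:
    """One unsupervised feature-extraction layer (part 1 of H-ELM)."""

    def __init__(self, n_hidden, reg=1e-3, rng=None):
        self.n_hidden, self.reg = n_hidden, reg
        self.rng = np.random.default_rng(rng)

    def fit(self, X):
        d = X.shape[1]
        w = self.rng.standard_normal((d, self.n_hidden))   # random, never tuned
        b = self.rng.standard_normal(self.n_hidden)
        H = sigmoid(X @ w + b)
        # Learn beta so that H @ beta reconstructs X (autoencoding); here via
        # ridge regression instead of the original l1-penalized solution.
        A = H.T @ H + self.reg * np.eye(self.n_hidden)
        self.beta = np.linalg.solve(A, H.T @ X)             # shape (n_hidden, d)
        return self

    def transform(self, X):
        # The learned output weights act as the projection to the new feature space.
        return sigmoid(X @ self.beta.T)


class HELMRegressor:
    """Part 1: stacked ELM autoencoders. Part 2: an ordinary ELM regressor."""

    def __init__(self, ae_sizes=(100, 100), n_hidden=500, reg=1e-3, seed=0):
        self.ae_sizes, self.n_hidden, self.reg, self.seed = ae_sizes, n_hidden, reg, seed

    def fit(self, X, T):
        rng = np.random.default_rng(self.seed)
        self.layers, Z = [], X
        for size in self.ae_sizes:                          # unsupervised stage
            ae = ELMAutoEncoder(size, self.reg, rng).fit(Z)
            Z = ae.transform(Z)
            self.layers.append(ae)
        # Supervised stage: a plain ELM trained on the high-level features Z.
        self.w = rng.standard_normal((Z.shape[1], self.n_hidden))
        self.b = rng.standard_normal(self.n_hidden)
        H = sigmoid(Z @ self.w + self.b)
        A = H.T @ H + self.reg * np.eye(self.n_hidden)
        self.beta = np.linalg.solve(A, H.T @ T)
        return self

    def predict(self, X):
        Z = X
        for ae in self.layers:
            Z = ae.transform(Z)
        return sigmoid(Z @ self.w + self.b) @ self.beta


# toy usage
X_toy = np.random.rand(300, 10)
T_toy = np.sin(X_toy.sum(axis=1, keepdims=True))
print(HELMRegressor(ae_sizes=(64, 64), n_hidden=200).fit(X_toy, T_toy).predict(X_toy[:2]))
```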
3. Implementation Structure
a. Data used in experiments
b. Data pre-processing strategies
c. Correlation study
d. STLF architecture
e. Input & Output
3.a. Data used in experiments
• Australian National Electricity Market (NEM)
• Contents:
• Load profile of the New South Wales (NSW) region from 1 January 2006 to 31
December 2015, indexed by the k-th date of the data
• Australian Bureau of Meteorology (BOM)
• Contents:
• Weather records (maximum temperature, minimum temperature, rainfall, and
solar exposure) of the NSW region from 1 January 2006 to 31 December 2015,
indexed by the k-th date of the data
3.b. Data pre-processing strategies
• The weather data alone is too feature-poor to be used as the
input feature data of the STLF.
• The BOM data provide only one record per day for maximum temperature,
minimum temperature, rainfall, and solar exposure.
• Using these data directly as the input feature dataset cannot provide
sufficient features to train the STLF model.
• To overcome this problem, we suggest the following data pre-processing
strategy:
• To enrich the features of the input data, include the historical load data
from the NEM in the input dataset consisting of the BOM weather data.
3.b. Data pre-processing strategies
• In short, the forecasting method is based on historical load data
and uses the weather forecast as a hint to predict the future load
(a pandas sketch of this strategy follows).
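The sketch below illustrates one possible way to build such an input dataset with pandas; the file names and column layout (a daily BOM weather table and a daily table of 48 half-hourly NEM load values) are assumptions made for illustration, not taken from the slides.

```python
import pandas as pd

# Hypothetical daily-indexed inputs: BOM weather (one row per day) and NEM
# load profiles (48 half-hourly columns per day, consecutive daily rows).
weather = pd.read_csv("bom_nsw_daily.csv", index_col="date", parse_dates=True)
load = pd.read_csv("nem_nsw_halfhourly.csv", index_col="date", parse_dates=True)

# Enrich the weather features with historical load: the profile of the
# previous day (lag 1 day) and of one week before (lag 7 days).
features = weather.join(load.shift(1).add_prefix("load_d1_")) \
                  .join(load.shift(7).add_prefix("load_d7_"))

# Label data: the day-ahead load profile that the model should forecast.
dataset = features.join(load.add_prefix("target_")).dropna()
print(dataset.shape)
```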
3.c. Correlation study
• NEM load data will be used not only as label data but also as
feature data.
• Selecting the right amount of data and the appropriate data features as
input is an important process that must be performed first for accurate
load forecasting [15].
• We used Pearson's correlation coefficient (PCC) to examine the
relationship between the load of the target date and the load of
earlier dates.
3.c. Correlation study
• As a result, the load data of the previous day and of one week
before showed the highest PCC values.
• We therefore add the load data of the previous day and of one
week before to the input feature dataset (a small sketch of the
correlation check follows).
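A small sketch of such a correlation check is shown below, using the same hypothetical half-hourly load file as in the pre-processing sketch; Series.corr computes Pearson's r by default, and the lag range is illustrative.

```python
import pandas as pd

# Hypothetical NEM load file: one row per day, 48 half-hourly columns.
load = pd.read_csv("nem_nsw_halfhourly.csv", index_col="date", parse_dates=True)
daily_load = load.sum(axis=1)                      # total daily load as a single series

# PCC between the target day's load and the load lagged by 1..14 days.
for lag in range(1, 15):
    pcc = daily_load.corr(daily_load.shift(lag))   # Pearson's r by default
    print(f"lag {lag:2d} days: PCC = {pcc:.3f}")
```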
3.d. STLF architecture
• We will compare the performance of each STLF architecture
implemented with ELM and H-ELM.
3.e. Input & Output
• Input data consist of feature data and label data.
• The output is the forecasted daily load profile, namely the 48 load
points corresponding to each half-hour of the day (a shape sketch
follows).
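For concreteness, here is a stand-in sketch of the input/output shapes; the feature count (4 weather values plus two lagged 48-point profiles) follows the strategy above, and the arrays themselves are synthetic placeholders.

```python
import numpy as np

# Synthetic stand-in arrays just to show the shapes: one row per target day.
rng = np.random.default_rng(0)
n_days = 3650                                # roughly ten years of daily samples
X = rng.random((n_days, 4 + 2 * 48))         # feature data: weather + lagged load profiles
y = rng.random((n_days, 48))                 # label data: 48 half-hour load points per day
print(X.shape, y.shape)
```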
4. Test Results
a. Optimal number of hidden nodes
b. Performance comparison between ELM and H-ELM
4.a. Optimal number of hidden nodes
• The STLF model designed in this experiment needs to be
optimized for maximum performance; the main tuning task is
determining the number of hidden nodes (a grid-search sketch follows).
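A sketch of this optimization as a simple grid search is shown below. It assumes the ELMRegressor class and the synthetic (X, y) arrays from the earlier sketches; the candidate node counts are illustrative, while the averaging over 20 repeated runs follows the slide notes.

```python
import numpy as np

def mape(y_true, y_pred):
    # mean absolute percentage error, in percent
    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100

# Simple chronological train/test split of the (X, y) arrays defined above.
split = int(0.8 * len(X))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

best = None
for n_hidden in [100, 200, 500, 1000, 2000]:
    # Average over 20 repeated runs, since ELM results vary with the random weights.
    scores = [mape(y_te, ELMRegressor(n_hidden=n_hidden, rng=r).fit(X_tr, y_tr).predict(X_te))
              for r in range(20)]
    avg = float(np.mean(scores))
    if best is None or avg < best[1]:
        best = (n_hidden, avg)
print("best number of hidden nodes:", best)
```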
4.b. Performance : ELM vs. H-ELM
• The accuracy of the prediction is evaluated using the mean absolute
percentage error (MAPE) and the mean absolute error (MAE).
• The stability of the prediction is evaluated as the standard
deviation of the distribution of the prediction accuracy.
➔To measure these two evaluation items, 1,000 repetitions of the
identical experiment were conducted for each STLF model.
➔To evaluate the accuracy, the mean of the MAPE and MAE values
was taken as the final value, and their standard deviation was
calculated to evaluate the stability (a sketch of this protocol follows).
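Below is a sketch of this evaluation protocol. It reuses the hypothetical ELMRegressor/HELMRegressor classes, the mape function, and the train/test split from the earlier sketches; only the 1,000-repetition count and the mean/standard-deviation summary come from the slides.

```python
import numpy as np

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def evaluate(model_factory, n_runs=1000):
    """Repeat the identical experiment n_runs times; mean = accuracy, std = stability."""
    mapes, maes = [], []
    for r in range(n_runs):
        model = model_factory(r).fit(X_tr, y_tr)      # fresh random initialization each run
        pred = model.predict(X_te)
        mapes.append(mape(y_te, pred))
        maes.append(mae(y_te, pred))
    return (np.mean(mapes), np.std(mapes)), (np.mean(maes), np.std(maes))

elm_mape, elm_mae = evaluate(lambda r: ELMRegressor(n_hidden=1000, rng=r))
helm_mape, helm_mae = evaluate(lambda r: HELMRegressor(seed=r))
print("ELM   MAPE mean/std:", elm_mape, " MAE mean/std:", elm_mae)
print("H-ELM MAPE mean/std:", helm_mape, " MAE mean/std:", helm_mae)
```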
4.b. Performance : ELM vs. H-ELM
• Forecasting MAPE and stability
4.b. Performance : ELM vs. H-ELM
• Forecasting MAE and stability
5. Conclusion
• Improvement of Learning Algorithm
• Due to the characteristics of ELM, the output of the ELM-based STLF is
unstable, which degrades the accuracy of the prediction.
• By constructing the STLF model with H-ELM, it is possible to resolve
the instability of the output values and obtain more accurate
predictions.
• Data pre-processing
• Data pre-processing plays a significant role in improving the
performance of the STLF model.
➔ In summary, in order to improve the performance of the STLF,
it is necessary to construct an STLF model using H-ELM and to
provide well-defined input features through proper data
pre-processing.
6. References
1. Zhiyi Li, Xuan Liu, Liyuan Chen, "Load Interval Forecasting Methods Based on An Ensemble of Extreme Learning Machines," Power & Energy Society General Meeting, 2015 IEEE, October 2015.
2. Guang-Bin Huang, Erik Cambria, "Extreme Learning Machines," IEEE Transactions on Cybernetics, vol. 28, no. 6, pp. 30-59, 2013.
3. G.-B. Huang, Q.-Y. Zhu, and C.-K. Siew, "Extreme Learning Machine: A New Learning Scheme of Feedforward Neural Networks," 2004 International Joint Conference on Neural Networks (IJCNN'2004), July 25-29, 2004.
4. Rui Zhang, Zhao Yang Dong, Yan Xu, Ke Meng, Kit Po Wong, "Short-term load forecasting of Australian National Electricity Market by an ensemble model of extreme learning machine," December 2012.
5. Muhammad Qamar Raza, Dr-Badar Islam, M. A. Zakariya, "Neural Network Based STLF Model to Study the Seasonal Impact of Weather and Exogenous Variables," Research Journal of Applied Sciences, Engineering and Technology, November 2013.
6. Damitha K. Ranaweera, George G. Karady, Richard C. Farmer, "Economic Impact Analysis of Load Forecasting," IEEE Transactions on Power Systems, Vol. 12, No. 3, August 1997.
7. C.L. Wu, K.W. Chau, C. Fan, "Prediction of rainfall time series using modular artificial neural networks coupled with data-preprocessing techniques," Journal of Hydrology, Volume 389, Issues 1-2, 28 July 2010.
8. Huang, G.-B., Zhu, Q.-Y., Siew, C.-K., "Extreme learning machine: theory and applications," Neurocomputing, 2006, 70, pp. 489-501.
9. Bartlett, P.L., "The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network," IEEE Trans. Inf. Theory, 1998, 44, (2), pp. 525-536.
6. References
10. Jiexiong Tang, Chenwei Deng, Guang-Bin Huang, "Extreme Learning Machine for Multilayer Perceptron," IEEE Transactions on Neural Networks and Learning Systems, Vol. 27, No. 4, April 2016.
11. Pradeep C. Gupta, Keigo Yamada, "Adaptive Short-Term Forecasting of Hourly Loads Using Weather Information," IEEE Transactions on Power Apparatus and Systems, Volume PAS-91, Issue 5, Sept. 1972, pp. 2085-2094.
12. Muhammad Qamar Raza, Zuhairi Baharudin, Badar-Ul-Islam, Mohd. Azman Zakariya, Mohd Haris Md Khir, "Neural Network Based STLF Model to Study the Seasonal Impact of Weather and Exogenous Variables," Intelligent and Advanced Systems (ICIAS), 2014 5th International Conference on, August 2014.
13. S.J. Kiartzis, C.E. Zoumas, A.G. Bakirtzis, V. Petridis, "Data Pre-Processing For Short-Term Load Forecasting In An Autonomous Power System Using Artificial Neural Networks," Electronics, Circuits, and Systems, 1996 (ICECS '96), Proceedings of the Third IEEE International Conference on, October 1996.
14. Agnaldo J. Rocha Reis, Alexandre P. Alves da Silva, "Feature Extraction via Multiresolution Analysis for Short-Term Load Forecasting," IEEE Transactions on Power Systems, Vol. 20, No. 1, February 2005.
15. Abdussalam Mohamed, M. E. El-Hawary, "Effective Input Features Selection For Electricity Price Forecasting," 2016 IEEE Canadian Conference on Electrical and Computer Engineering (CCECE).
16. Bunn, D.W., Farmer, E.D. (Eds.), "Comparative models for electrical load forecasting" (John Wiley & Sons, 1985).
17. Australian Energy Market Operator [Online]. Available at http://www.aemo.com.au
18. Bureau of Meteorology of Australian Government [Online]. Available at http://www.bom.gov.au/index.shtml
Editor's Notes
• #6: Given a training data set with N samples, the output function of the SLFN with L hidden nodes and activation function θ is f_L(x) = Σ_{i=1}^{L} β_i θ(ω_i · x + b_i), expressed as equation (1) on the slide.
• #7: ELM is completely different from traditional iterative learning algorithms: it randomly selects the input weights and biases of the hidden nodes, ω and b, and analytically calculates the output weights β by finding the least-squares solution [8]. Once the hidden-layer parameters of the ELM are generated randomly, they do not need to be tuned; therefore, its training speed can be thousands of times faster [3].
• #8: Due to the characteristics of the ELM-based SLFN, we can obtain very short training times, excellent efficiency, and good generalization performance [3]. However, the random selection of input weights and hidden-node biases undermines the stability of its outputs [4].
• #10: Since H-ELM receives high-level sparse features obtained from unsupervised feature learning instead of raw input data, it can achieve more accurate and stable performance. The experiments described later confirm that H-ELM performs better than ELM.
• #12: Normally in STLF, load data are used as the input label data and weather data are used as the input feature data.
• #13: Which historical load data are included in the input feature dataset is described in the following correlation study.
• #17: This architecture is divided into two phases: a) training and b) testing. In the training phase, the learning model receives the feature vector obtained through the data pre-processing process and performs model training. The STLF model trained in this phase is used in the next phase. In the testing phase, the feature vector is input and forecasting is performed with the STLF model trained in the previous phase.
• #18: In this experiment, we will confirm the role of data pre-processing in addition to the performance comparison between ELM and H-ELM. Therefore, based on the PCC results, we propose the following 4 cases.
• #20: The STLF model designed in this experiment needs to be optimized for maximum performance, which is the task of determining the number of hidden nodes. The MAPE method was used to measure accuracy, and the forecasting accuracy obtained from this optimization process corresponds to the average over 20 identical repeated prediction experiments. In each case, the optimal number of hidden nodes differed slightly; the optimization results are shown in the figure.