Mathematical Problems in Engineering


Research Article | Open Access

Volume 2017 |Article ID 3485182 | https://doi.org/10.1155/2017/3485182

Zhisheng Zhang, "Short-Term Load Forecasting Model Based on the Fusion of PSRT and QCNN", Mathematical Problems in Engineering, vol. 2017, Article ID 3485182, 7 pages, 2017. https://doi.org/10.1155/2017/3485182

Short-Term Load Forecasting Model Based on the Fusion of PSRT and QCNN

Academic Editor: Yaguo Lei
Received: 11 Apr 2017
Revised: 12 Jul 2017
Accepted: 29 Aug 2017
Published: 03 Oct 2017

Abstract

A short-term load forecasting (STLF) model based on the fusion of Phase Space Reconstruction Theory (PSRT) and Quantum Chaotic Neural Networks (QCNN) is proposed. Quantum computation and a chaotic mechanism are integrated into QCNN, which is composed of quantum neurons and chaotic neurons. QCNN has four layers: the input layer, the first hidden layer of quantum hidden nodes, the second hidden layer of chaotic hidden nodes, and the output layer. Phase Space Reconstruction Theory (PSRT) is the theoretical basis for constructing QCNN. Simulation on an actual example shows that the proposed model has good forecasting precision and stability.

1. Introduction

Short-term load forecasting (STLF) is a key basic research topic. It plays an important role in the economy, reliability, and operation management of power systems. Smart grids are the trend of future development of the electric power system, and their higher economy and reliability demand higher forecasting accuracy from STLF models [1–3].

In the early stage of STLF, many forecasting methods based on mathematical statistics theory were put forward [4, 5]. But these methods are not well suited to the prediction of dynamic load time series. With the development of artificial intelligence technology, many forecasting models based on intelligent theory have been proposed and applied [6, 7]. As an important branch of intelligent theory, the neural network has been widely used in the STLF field because of its unique advantages.

The research results show that there are quantum phenomena and quantum mechanisms in the process of human brain information processing [8]. Introducing quantum theory into the modeling of artificial neural networks can improve their generalization ability and information processing efficiency. The fusion of quantum theory and artificial neural networks has been applied in many fields [9–11]. The quantum neural network has been applied to STLF, and satisfactory prediction results have been obtained [12].

Some studies have shown that the load time series is nonlinear and chaotic [13]. Chaotic characteristics have also been found in biological neurons. A chaotic neural network is a neural network into which chaotic characteristics are integrated. It has a strong sensitivity to initial values and the ability to process complicated information. A number of studies have discussed the application of chaotic neural networks in STLF, with satisfactory prediction results [14, 15].

The STLF model based on the fusion of Phase Space Reconstruction Theory (PSRT) and Quantum Chaotic Neural Networks (QCNN) is constructed in this paper. QCNN is composed of quantum neurons and modified chaotic neurons, and quantum computation and a chaotic mechanism are integrated into the neural network. PSRT is the theoretical basis of constructing QCNN. The k-nearest neighbor approach (KNNA) is used to form the training samples.

Through the example simulation, the simulation results show that the STLF model based on the fusion of PSRT and QCNN has good forecasting performance.

2. Quantum Chaotic Neural Networks

QCNN is composed of quantum neurons and chaotic neurons, and quantum computation and chaotic mechanism are integrated into QCNN.

2.1. Quantum Neuron Model

In quantum neurons, the basic storage unit of quantum information is the qubit. The state of a qubit is a vector in a two-dimensional complex space. The qubit differs from the classical bit in that a quantum state can exist in any superposition of the two polarized basis states. Quantum gates are the basis for the physical implementation of quantum computation, and the 1-bit qubit rotation gate and the 2-bit controlled NOT gate are the most basic universal gates.

2.1.1. Qubit

In quantum computation, a qubit state can be expressed as |φ⟩ = α|0⟩ + β|1⟩, where |0⟩ and |1⟩ are the basic states of microscopic particles. α and β are the probability amplitudes, and they should satisfy the following equation: |α|² + |β|² = 1.
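As an illustration of the normalization constraint, the sketch below (not code from the paper; the function names are assumptions for illustration) parameterizes a real-amplitude qubit by a single phase angle, which automatically satisfies |α|² + |β|² = 1:

```python
import math

def make_qubit(theta):
    """Encode a phase angle theta as a real-amplitude qubit
    (alpha, beta) = (cos(theta), sin(theta))."""
    return (math.cos(theta), math.sin(theta))

def is_normalized(qubit, tol=1e-9):
    """Check the constraint |alpha|^2 + |beta|^2 = 1."""
    alpha, beta = qubit
    return abs(alpha ** 2 + beta ** 2 - 1.0) < tol
```

Any phase angle yields a valid qubit, which is why phase-based encodings are convenient in quantum-inspired neural networks.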

Quantum logic gates are the basis of quantum computation. The 1-bit qubit rotation gate and the 2-bit controlled NOT gate are the basic universal quantum gates.

2.1.2. 1-Bit Qubit Rotation Gate

The 1-bit qubit rotation gate is expressed as

R(θ) = [cos θ  −sin θ; sin θ  cos θ]

The quantum state is expressed as |φ⟩ = [cos t, sin t]ᵀ.

By the 1-bit qubit rotation gate, |φ⟩ can be transformed to R(θ)|φ⟩ = [cos(t + θ), sin(t + θ)]ᵀ.
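The rotation gate can be sketched directly as a 2×2 matrix acting on the amplitude vector (an illustrative sketch, not the paper's code):

```python
import math

def rotation_gate(theta):
    """1-bit qubit rotation gate R(theta) as a 2x2 matrix
    [[cos, -sin], [sin, cos]] acting on real amplitudes."""
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]

def apply_gate(gate, qubit):
    """Multiply the gate matrix by the amplitude vector; a qubit
    at phase t is carried to phase t + theta."""
    alpha, beta = qubit
    return (gate[0][0] * alpha + gate[0][1] * beta,
            gate[1][0] * alpha + gate[1][1] * beta)
```

For example, rotating the basis state |0⟩ = (1, 0) by π/2 yields |1⟩ = (0, 1).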

2.1.3. 2-Bit Controlled NOT Gate

The 2-bit controlled NOT gate transforms the quantum state |φ⟩ = [cos t, sin t]ᵀ according to a control value γ:

C(γ)|φ⟩ = [cos(π/2·γ + (1 − 2γ)t), sin(π/2·γ + (1 − 2γ)t)]ᵀ

When γ = 1, C(1)|φ⟩ = [sin t, cos t]ᵀ, and it represents the reversal rotation; when γ = 0, the quantum state is unchanged.
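At the two extreme control values the gate reduces to an amplitude swap or the identity, which can be sketched as follows (a simplified sketch; the paper's gate may also interpolate continuously in the control value):

```python
def controlled_not(qubit, gamma):
    """2-bit controlled NOT at the extreme control values: with
    gamma = 1 the two probability amplitudes are swapped (the
    reversal rotation); with gamma = 0 the qubit passes through
    unchanged."""
    alpha, beta = qubit
    return (beta, alpha) if gamma == 1 else (alpha, beta)
```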

2.1.4. Quantum Neuron Model

The quantum neuron model is shown in Figure 1. It consists of the input part, the rotation part, the aggregation part, the reversal rotation part, and the output part.

In order to solve practical problems, the input variable is first transformed into the phase representation of a qubit.

In the rotation part, the input qubit is transformed by the 1-bit qubit rotation gate.

The aggregation part sums the rotated quantum states. The reversal part then applies the 2-bit controlled NOT gate, whose control value is produced by the activation function.

The output of the quantum neuron is the squared probability amplitude of the aggregated quantum state, which defines the relationship between input and output.
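The five parts above can be sketched as one forward pass, following the common qubit-neuron formulation of quantum-inspired neural networks (all parameter names are illustrative assumptions; the paper's exact gate composition may differ):

```python
import math

def quantum_neuron(inputs, weights, reversal, lam=1.0):
    """Hedged sketch of the quantum neuron in Figure 1: input,
    rotation, aggregation, reversal, and output parts in order."""
    # Input part: map each input in [0, 1] to a qubit phase.
    phases = [math.pi / 2.0 * x for x in inputs]
    # Rotation part: the 1-bit rotation gate adds a weight phase.
    rotated = [p + w for p, w in zip(phases, weights)]
    # Aggregation part: sum the quantum states in the complex plane.
    z = sum(complex(math.cos(p), math.sin(p)) for p in rotated)
    arg = math.atan2(z.imag, z.real)
    # Reversal part: a sigmoid-controlled NOT gate shifts the phase.
    g = 1.0 / (1.0 + math.exp(-lam * reversal))
    y = math.pi / 2.0 * g - arg
    # Output part: squared probability amplitude of the |1> state.
    return math.sin(y) ** 2
```

By construction the output lies in [0, 1], matching the probability-amplitude interpretation.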

2.2. Chaotic Neuron Model

The modified Aihara chaotic neuron was presented in [16], which is shown in Figure 2. The modified Aihara chaotic neuron is used to construct QCNN model in this paper.

The internal state y_i(t + 1) of the ith chaotic neuron at time t + 1 can be expressed as

y_i(t + 1) = k·y_i(t) + Σ_{j=1}^{M} w_ij·x_j(t) + Σ_{j=1}^{N} v_ij·z_j(t) − α·z_i(t)

where y_i(t) is the internal state of the ith chaotic neuron at time t. w_ij is the weight between the jth external input neuron and the ith chaotic neuron. x_j(t) is the input value of the jth external input neuron at time t. The number of external input neurons is M. v_ij is the weight between the jth internal feedback neuron and the ith chaotic neuron. z_j(t) is the output of the jth chaotic neuron at time t. The number of internal feedback input neurons applied to the chaotic neuron is N. α is the refractory parameter. The damping factors of the external, feedback, and refractoriness terms are all equal to k.

The output z_i(t + 1) of the ith chaotic neuron at time t + 1 can be expressed as z_i(t + 1) = f(y_i(t + 1)), where f is a sigmoid function, f(y) = 1/(1 + exp(−y/ε)).
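One update of this neuron can be sketched as follows (a sketch of the standard Aihara-type recurrence with illustrative parameter values, not the exact modified model of [16]):

```python
import math

def chaotic_neuron_step(y, external, feedback, k=0.1, alpha=1.0, eps=0.05):
    """One update of an Aihara-type chaotic neuron: the internal
    state decays by damping factor k, gains the weighted external
    and feedback sums, and loses a refractoriness term alpha*z."""
    out = 1.0 / (1.0 + math.exp(-y / eps))        # sigmoid output z = f(y)
    y_next = k * y + external + feedback - alpha * out
    return y_next, out
```

Iterating this map with a constant input already produces the non-periodic responses that give the chaotic neuron its sensitivity to initial values.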

2.3. Quantum Chaotic Neural Networks Model

The QCNN model is constructed by quantum neurons and chaotic neurons, which is shown in Figure 3.

As shown in Figure 3, QCNN has four layers: the input layer, the first hidden layer of quantum hidden nodes, the second hidden layer of chaotic hidden nodes, and the output layer.

The input layer, the first hidden layer, and the second hidden layer have fixed numbers of input nodes, quantum hidden nodes, and chaotic hidden nodes, respectively. There is one output node.

There are three different types of connection weights in QCNN. The first type is the forward weights between the input layer and the first hidden layer. The second type is the connection weights from the first hidden layer to the second hidden layer and from the second hidden layer to the output layer. The third type is the lateral connection weights among neurons within a layer, including self-feedback, in the second hidden layer and in the output layer.

In the input layer, each node holds an input qubit, transformed from the corresponding input variable according to (8); together the input qubits form the input vector of QCNN.

In the first hidden layer, the output of each quantum hidden neuron at time t is given by the quantum neuron model of Section 2.1.

In the second hidden layer, the internal state of each chaotic neuron at time t + 1 follows the chaotic neuron model: its external inputs are the outputs of the quantum hidden neurons in the first hidden layer, weighted by the first-to-second hidden layer weights; its feedback inputs are the outputs of the chaotic neurons in the second hidden layer, weighted by the interconnecting weights among them; and the refractory parameter of the chaotic neuron enters the self-inhibition term.

The output of each chaotic neuron in the second hidden layer is obtained by applying the activation function to its internal state, where the activation function is a sigmoid function.

In the output layer, the internal state of the single chaotic neuron at time t + 1 is formed from the outputs of the chaotic neurons in the second hidden layer, weighted by the second-hidden-to-output weights, together with the interconnecting (self-feedback) weight in the output layer. The output of this chaotic neuron at time t + 1 is then obtained through the sigmoid activation function.

3. STLF Model Based on the Fusion of PSRT and QCNN

The STLF model fuses PSRT and QCNN. QCNN is the core of the forecasting model. PSRT provides a theoretical basis for the construction of the model: the number of input nodes is decided by the embedding dimension according to PSRT. The k-nearest neighbor phase points are obtained by the k-nearest neighbor approach (KNNA) [17], and they form the training samples of QCNN.
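Forming training samples with the nearest-neighbor approach can be sketched as follows (an illustrative sketch, not the paper's code; in practice each selected phase point would serve as an input vector and its successor value as the training target):

```python
def nearest_neighbor_samples(phase_points, query, k):
    """Return the indices of the k reconstructed phase points
    closest (Euclidean distance) to the current phase point."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    ranked = sorted(range(len(phase_points)),
                    key=lambda i: dist(phase_points[i], query))
    return ranked[:k]
```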

3.1. Phase Space Reconstruction Theory

PSRT is the important theoretical basis for the analysis of chaotic time series. A chaotic time series can be reconstructed by PSRT. For a chaotic time series {x(i)}, i = 1, 2, …, N, the reconstructed phase space can be expressed as

X(i) = [x(i), x(i + τ), …, x(i + (m − 1)τ)]

where m is the embedding dimension and τ is the delay time. According to [18], the G-P method is used to acquire the embedding dimension m. According to [19], the multiple-autocorrelation method is used to acquire the delay time τ.
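The delay-coordinate embedding above translates directly into code (a minimal sketch of the standard construction):

```python
def reconstruct_phase_space(series, m, tau):
    """Delay-coordinate embedding: build the phase points
    X(i) = [x(i), x(i+tau), ..., x(i+(m-1)*tau)] for every
    valid starting index i of the time series."""
    n = len(series) - (m - 1) * tau
    return [[series[i + j * tau] for j in range(m)] for i in range(n)]
```

For a series of length N, this yields N − (m − 1)τ phase points in m-dimensional space.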

The main steps of the G-P method are as follows:

(1) For the time series {x(i)}, a smaller embedding dimension value m is used to reconstruct the phase space. For all the phase points, an arbitrary small number r is given, and the Euclidean distances between the phase points in the phase space are computed.

(2) Computing the correlation integral, which can be calculated by the following formula:

C(r) = (1/M²) Σ_{i,j=1}^{M} θ(r − |X(i) − X(j)|)

where M is the number of phase points and |X(i) − X(j)| is the distance between the phase point X(i) and the phase point X(j). θ(·) is the Heaviside function: θ(u) = 1 for u > 0 and θ(u) = 0 otherwise. C(r) is the cumulative distribution function, which represents the probability that the distance between two phase points is less than r in phase space.

(3) For an appropriate range of r, the dimension D of the attractor and the cumulative distribution function C(r) should approach the logarithmic linear relationship ln C(r) ≈ D·ln r. Thus, the correlation dimension D corresponding to m can be obtained by a fitting method.

(4) Increasing the embedding dimension (m → m + 1), calculation steps (2) and (3) are repeated until the corresponding correlation dimension no longer changes with the increase of m. It then achieves convergence, and this m is taken as the embedding dimension.
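The correlation integral at the heart of the G-P method can be sketched as follows (a direct, unoptimized sketch of the Grassberger-Procaccia pair count):

```python
import math

def correlation_sum(points, r):
    """Grassberger-Procaccia correlation integral C(r): the fraction
    of distinct phase-point pairs whose distance is less than r
    (the Heaviside-counted pair fraction)."""
    M = len(points)
    count = 0
    for i in range(M):
        for j in range(i + 1, M):
            if math.dist(points[i], points[j]) < r:
                count += 1
    return 2.0 * count / (M * (M - 1))
```

The correlation dimension is then estimated as the slope of ln C(r) versus ln r over the scaling range.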

PSRT has been applied to the field of STLF, which opens up new research ideas for STLF [13, 20].

3.2. The Architectural Structure of STLF Model

The architectural structure of STLF model based on the fusion of PSRT and QCNN is shown in Figure 4.

In Figure 4, the function of “Historical Load Database” and “Storage Area for Forecasting Data” is to store the historical load data and forecasting data. The function of “Module for Calculating m and τ by PSRT” is to provide the basis for model construction. The function of “Module for Forming the Training Samples by KNNA” is to form the training samples. The function of “Module for Data Normalization” is to normalize the sample data.

4. Actual Example Simulation

4.1. Four STLF Models and Simulation Parameters

In order to verify the validity of the STLF model based on the fusion of PSRT and QCNN (Model I), three other models are also constructed to compare with Model I. They are the STLF model based on the fusion of PSRT and the quantum neural networks (Model II), the STLF model based on the fusion of PSRT and the chaotic neural networks (Model III), and the STLF model based on the fusion of PSRT and the conventional BP-NN (Model IV).

The time series in the simulation is the actual load in a regional power grid of Shandong Province in China in 2012. Twenty-four load points are collected each day, and the collected load values form the load time series.

The embedding dimension m is 7 according to the G-P method. The delay time τ is 1 according to the multiple-autocorrelation method. The number of neighbor phase points is 10 by the k-nearest neighbor approach.

For Model I, the number of the input nodes is 7, the number of the quantum hidden nodes is 9, the number of the chaotic hidden nodes is 7, and the number of the output points is 1.

For Model II, the number of the input nodes is 7, the number of the quantum hidden nodes is 9, and the number of the output points is 1.

For Model III, the number of the input nodes is 7, the number of the chaotic hidden nodes is 11, and the number of the output points is 1.

For Model IV, the number of the input nodes is 7, the number of the hidden nodes is 10, and the number of the output points is 1.

The maximum permissible error of the four STLF models is . For Model I, Model II, and Model III, a genetic algorithm was used to adjust the parameters of the QCNN, the quantum neural networks, and the chaotic neural networks. The fitness function mainly considers the accuracy of the output of the forecasting model, that is, the mean square deviation between the desired output and the actual output of the forecasting model. The crossover probability is 0.90, and the mutation probability is 0.10. For Model I and Model III, the value of is 0.1. For Model IV, the learning rate is 0.10, and the momentum factor is 0.90.

4.2. Real Example Simulation

Four models are used to forecast the actual load for different seasons including spring, summer, autumn, and winter.

The daily load forecasting errors of the four models for spring, summer, autumn, and winter are shown in Figures 5–8.

The mean absolute percentage error (MAPE) and the maximum relative error (MRE) of the four STLF models for spring, summer, autumn, and winter are described in Table 1.


Season   | Model I       | Model II      | Model III     | Model IV
         | MAPE%   MRE%  | MAPE%   MRE%  | MAPE%   MRE%  | MAPE%   MRE%
Spring   | 1.166   2.976 | 2.091   4.529 | 1.827   4.295 | 2.629   5.429
Summer   | 1.437   3.550 | 2.622   5.277 | 2.549   4.912 | 3.270   6.125
Autumn   | 1.082   2.889 | 1.957   4.476 | 1.816   4.209 | 2.603   5.388
Winter   | 1.299   3.270 | 2.360   4.829 | 2.277   4.515 | 2.910   5.889
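The two error indices reported above are commonly defined as follows (a standard-definition sketch; the paper does not give the formulas explicitly):

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    n = len(actual)
    return 100.0 / n * sum(abs((a - f) / a) for a, f in zip(actual, forecast))

def max_relative_error(actual, forecast):
    """Maximum relative error, in percent."""
    return 100.0 * max(abs((a - f) / a) for a, f in zip(actual, forecast))
```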

According to Table 1, the mean absolute percentage error and the maximum relative error of Model IV are the largest, because the conventional BP-NN is a static neural network, which cannot effectively characterize the chaotic dynamic behavior of the load system.

Compared with Model IV, the mean absolute percentage error of Model II is reduced by 0.538%–0.648%, and the maximum relative error of Model II is reduced by 0.848%–1.060%. This shows that quantum computation can enhance the information processing capability and generalization ability of neural networks.

The error performance of Model III is better than that of Model II and Model IV, because chaotic neural networks can effectively characterize the chaotic dynamic behavior of the load system. The forecasting performance of Model I is the best, because quantum computing and chaotic mechanism are integrated into neural networks.

From Table 1 it can be seen that the forecasting errors are larger in summer and winter, with the summer forecasting error being the largest. In summer and winter, weather factors such as temperature have a greater impact on the load, and the load fluctuation is relatively large, which makes the forecasting error slightly larger than that of spring and autumn. In spring and autumn, weather factors have little impact on the load, and the load fluctuation is relatively small, which keeps the forecasting error slightly smaller.

In order to further verify the prediction effect of the model proposed in this paper, an STLF model based on the fusion of PSRT and SVM is constructed and compared with the STLF model based on the fusion of PSRT and QCNN (Model I). The number of input nodes is 7, and the number of output points is 1. The SVM uses the Gaussian kernel function, and its kernel parameter is 0.15. The loss function parameter ε is 0.025, and the penalty factor is 3.

From Table 2 it can be seen that Model I is clearly superior to the STLF model based on the fusion of PSRT and SVM.


Season   | Model I       | Model-SVM
         | MAPE%   MRE%  | MAPE%   MRE%
Spring   | 1.166   2.976 | 2.252   4.640
Summer   | 1.437   3.550 | 2.800   5.335
Autumn   | 1.082   2.889 | 2.042   4.612
Winter   | 1.299   3.270 | 2.411   5.010

In order to further verify the precision performance and predictive stability, real loads are used for testing over one week in spring (2012-4-23~2012-4-29, a mild-temperature week) and one week in summer (2012-8-6~2012-8-12, a high-temperature week). The mean absolute percentage error and the maximum relative error in the mild-temperature week are described in Table 3. The mean absolute percentage error and the maximum relative error in the high-temperature week are described in Table 4. The mean absolute percentage error and the maximum relative error of the proposed Model I can meet the requirements of actual prediction.


Day        | Temperature (°C) | Model I Error (%)
           |                  | MAPE    MRE
Monday     | 11~17            | 1.166   2.976
Tuesday    | 11~15            | 1.097   2.885
Wednesday  | 12~16            | 1.138   2.903
Thursday   | 10~18            | 1.085   2.870
Friday     | 12~18            | 1.127   2.951
Saturday   | 11~17            | 1.205   3.016
Sunday     | 12~16            | 1.219   3.209


Day        | Temperature (°C) | Model I Error (%)
           |                  | MAPE    MRE
Monday     | 24~36            | 1.437   3.550
Tuesday    | 26~34            | 1.362   3.439
Wednesday  | 25~37            | 1.466   3.510
Thursday   | 25~35            | 1.420   3.452
Friday     | 26~37            | 1.319   3.377
Saturday   | 24~34            | 1.492   3.980
Sunday     | 25~33            | 1.483   3.901

5. Conclusions

An STLF model based on the fusion of PSRT and QCNN was proposed in this paper. QCNN is composed of quantum neurons and chaotic neurons. Quantum computation can effectively improve the information processing ability and approximation capability of QCNN. Because of its chaotic mechanism, QCNN is sensitive to the initial load value and to the whole chaotic track. PSRT is an important analytical method for chaotic time series and the theoretical basis of constructing QCNN. The testing results of the actual simulation show that the proposed model has good prediction accuracy and stability.

Conflicts of Interest

The author declares that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This work was supported by National Natural Science Foundation of China (51477078).

References

  1. B. A. Hoverstad, A. Tidemann, H. Langseth, and P. Ozturk, “Short-term load forecasting with seasonal decomposition using evolution for parameter tuning,” IEEE Transactions on Smart Grid, vol. 6, no. 4, pp. 1904–1913, 2015.
  2. A. Ghasemi, H. Shayeghi, M. Moradzadeh, and M. Nooshyar, “A novel hybrid algorithm for electricity price and load forecasting in smart grids with demand-side management,” Applied Energy, vol. 177, pp. 40–59, 2016.
  3. S. A. Villalba and C. Á. Bel, “Hybrid demand model for load estimation and short term load forecasting in distribution electric systems,” IEEE Transactions on Power Delivery, vol. 15, no. 2, pp. 764–769, 2000.
  4. A. El Desouky and M. Elkateb, “Hybrid adaptive techniques for electric-load forecast using ANN and ARIMA,” IEE Proceedings—Generation, Transmission and Distribution, vol. 147, no. 4, pp. 213–217, 2000.
  5. C.-M. Huang, C.-J. Huang, and M.-L. Wang, “A particle swarm optimization to identifying the ARMAX model for short-term load forecasting,” IEEE Transactions on Power Systems, vol. 20, no. 2, pp. 1126–1133, 2005.
  6. C. Xia, J. Wang, and K. McMenemy, “Short, medium and long term load forecasting model and virtual load forecaster based on radial basis function neural networks,” International Journal of Electrical Power and Energy Systems, vol. 32, no. 7, pp. 743–750, 2010.
  7. D.-X. Niu, H.-F. Shi, and D. D. Wu, “Short-term load forecasting using bayesian neural networks learned by Hybrid Monte Carlo algorithm,” Applied Soft Computing Journal, vol. 12, no. 6, pp. 1822–1827, 2012.
  8. S. Kak, “On quantum neural computing,” Information Sciences, vol. 83, no. 3-4, pp. 143–160, 1995.
  9. K. Takahashi, M. Kurokawa, and M. Hashimoto, “Multi-layer quantum neural network controller trained by real-coded genetic algorithm,” Neurocomputing, vol. 134, pp. 159–164, 2014.
  10. C. Liu, Z. Y. He, and J. W. Yang, “A quantum neural network based fault diagnosis algorithm for power grid,” Power System Technology, vol. 32, no. 9, pp. 56–60, 2008.
  11. J. Li, “Quantum-inspired neural networks with application,” Open Journal of Applied Sciences, vol. 5, no. 6, pp. 233–239, 2015.
  12. P. Li, Y. Yan, W.-J. Zheng, J.-F. Jia, and B. Bai, “Short-term load forecasting based on quantum neural network by evidential theory,” Power System Protection and Control, vol. 38, no. 16, pp. 49–53, 2010.
  13. S. Kouhi, F. Keynia, and S. N. Ravadanegh, “A new short-term load forecast method based on neuro-evolutionary algorithm and chaotic feature selection,” International Journal of Electrical Power & Energy Systems, vol. 62, pp. 862–867, 2014.
  14. D. Niu, Y. Lu, X. Xu, and B. Li, “Short-term power load point prediction based on the sharp degree and chaotic RBF neural network,” Mathematical Problems in Engineering, vol. 2015, Article ID 231765, 8 pages, 2015.
  15. Y. He, Q. Xu, J. Wan, and S. Yang, “Electrical load forecasting based on self-adaptive chaotic neural network using Chebyshev map,” Neural Computing and Applications, pp. 1–10, 2016.
  16. S.-H. Kim, S.-D. Hong, and W.-W. Park, “An adaptive neurocontroller with modified chaotic neural networks,” in Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), pp. 509–514, Washington, DC, USA, July 2001.
  17. N. Zhang, A. Lin, and P. Shang, “Multidimensional k-nearest neighbor model based on EEMD for financial time series forecasting,” Physica A: Statistical Mechanics and its Applications, vol. 477, pp. 161–173, 2017.
  18. P. Grassberger and I. Procaccia, “Measuring the strangeness of strange attractors,” Physica D: Nonlinear Phenomena, vol. 9, no. 1-2, pp. 189–208, 1983.
  19. J. Y. Lin, Y. K. Wang, and Z. P. Huang, “Selection of proper time-delay in phase space reconstruction of speech signals,” Signal Processing, vol. 15, no. 3, pp. 220–225, 1999.
  20. H. P. Yang, C. F. Wang, and K. C. Zhu, “Short-term load forecasting based on phase space reconstruction and Chebyshev orthogonal basis neural network,” Power System Protection and Control, vol. 40, no. 24, pp. 95–99, 2012.

Copyright © 2017 Zhisheng Zhang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

