Mathematical Problems in Engineering
Volume 2018, Article ID 4276176, 16 pages
Research Article

A Simple Method of Residential Electricity Load Forecasting by Improved Bayesian Neural Networks

School of Urban Railway Transportation, Shanghai University of Engineering Science, China

Correspondence should be addressed to Qianwen Zhong; datouzqw@aliyun.com

Received 4 April 2018; Revised 5 July 2018; Accepted 16 August 2018; Published 13 September 2018

Academic Editor: Gaetano Zizzo

Copyright © 2018 Shubin Zheng et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Electricity load forecasting is becoming one of the key issues in solving the energy crisis, and the time-series Bayesian Neural Network is a popular method in load forecast models. However, at the residential level it suffers from long running times and a relatively strong dependence on time and weather factors. To solve these problems, this article presents an improved Bayesian Neural Network (IBNN) forecast model that augments a simple feedforward structure with historical load data as inputs. From the analysis of load time-delay correlations and impact factors, covering different inputs, the number of hidden neurons, the historic period of data, the forecasting time range, and the range requirement of sample data, advice is given on how to better choose these factors. To validate the performance of the improved model, several residential sample datasets covering one whole year from Ausgrid were selected to build IBNN models. Compared with the time-series load forecast model, the IBNN reduces calculation time by more than a factor of 30, and even when time or meteorological factors are missing, it can still predict the load with high accuracy. Compared with other widely used prediction methods, the IBNN also achieves better accuracy and relatively shorter computing time. This improved Bayesian Neural Network forecasting method can be applied in residential energy management.

1. Introduction

In recent years, how to continuously meet people's energy needs has become a hot research topic of global concern. In addition to improving energy efficiency and developing new energy sources, rational management of energy is also a useful way to address the energy crisis. The premise of effective energy management is more accurate energy load forecasting. Therefore, researchers have done a lot of research work on the prediction of electricity demand. Yu et al. [1] studied the use of sparse coding for modeling and forecasting individual household electricity loads. Qiu et al. [2] presented an ensemble method composed of the Empirical Mode Decomposition (EMD) algorithm and a deep learning approach. Hsiao [3] proposed an approach to model the very short-term load of individual households based on context information and daily schedule pattern analysis. Aman et al. [4] analyzed dynamic demand response (D2R) prediction models, with a particularly deep discussion of the small-customer scenario. Behl et al. [5] provided model-based control with a regression trees algorithm, which allows performing closed-loop control for demand response (DR) strategy synthesis for large commercial buildings. Chen et al. [6] designed a Support Vector Regression (SVR) forecasting model with the ambient temperature two hours before the DR event as an input variable. Cabrera et al. [7] proposed a methodology to obtain probabilistic forecasts of electricity load based on functional data analysis of generalized quantile curves. Marszal-Pomianowska et al. [8] presented a high-resolution model of household electricity use developed from a combination of measured and statistical data. Liang et al. [9, 10] proposed two electricity demand forecasting methods: one applied a new weight determination method, with forecasting accuracy as the induced variable, based on extreme learning machine and multiple regression models; the other, from a carbon-emission view, gave a hybrid model based on wavelet transform and least squares support vector machine optimized by an improved cuckoo search.

With the vigorous development of artificial intelligence, the neural network method has also been applied by many researchers in the field of electricity load forecasting [11–15]. Neural network methods already have some mature training algorithms and network structures. Among the widely used training algorithms, such as Scaled Conjugate Gradient, Levenberg–Marquardt, and Bayesian methods, Bayesian Neural Networks (BNN) have been validated as one of the most effective ways to build electricity load prediction models [16–22]. Regarding structures, the Time-Series Neural Network (TSNN) structure is more reasonable and effective than the simple Feedforward Neural Network (FFNN) structure, since the electricity load has obvious time-cycle characteristics [23–25]. However, as the traditional continuous time-delay feedback increases, the efficiency of the prediction model is significantly reduced under the TSNN structure.

To improve on the above problem, an improved BNN (IBNN) model of residential short-term electricity demand is proposed using a relatively simple method with both high performance and high efficiency. This method is based on the basic BNN method and the simple FFNN structure, and it considers the cyclic variation of electricity demand over time. By analyzing the correlations between historical electricity demand data and current electricity demand data over different time ranges, the historical demand data with stronger correlations are selected as predictive vectors to construct the prediction model, instead of the continuous feedback used in the TSNN.
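The lag-selection idea described above can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' MATLAB implementation; the function name, the candidate-lag list, and the use of the absolute Pearson correlation as the ranking score are assumptions.

```python
import numpy as np

def select_lag_inputs(load, candidate_lags, top_k=4):
    """Rank candidate time lags by the absolute Pearson correlation
    between the lagged series and the current load, and keep the
    top_k most correlated lags as inputs for the forecast model.

    load: 1-D array of historical demand (one value per interval).
    candidate_lags: iterable of positive integer lags (in intervals).
    """
    load = np.asarray(load, dtype=float)
    scores = {}
    for lag in candidate_lags:
        x, y = load[:-lag], load[lag:]  # lagged values vs. current values
        scores[lag] = abs(np.corrcoef(x, y)[0, 1])
    # sort lags by descending correlation and keep the strongest ones
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:top_k], scores
```

On a series with a strong daily cycle, a lag of one full day will typically outrank the nearest half-hour lags, which mirrors the correlation analysis in Section 2.2.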

The remainder of this paper is organized as follows. Firstly, an IBNN based on FFNN is built by adding historical demand data as inputs through correlation analysis of electricity consumption at different delay time scales. Moreover, the input selection of the IBNN forecast model is discussed, together with an analysis of the effect of related factors on forecast performance. Then, the results of the time-series BNN model and the IBNN model are compared and discussed, especially regarding program running time and the dependence on time or meteorology factors. A further comparison is made between the IBNN method and several commonly applied machine learning regression methods; the results show that the IBNN model performs relatively better on all evaluation indicators. Finally, conclusions are summarized and future work is briefly mentioned.

2. Method of Improved Bayesian Neural Networks (IBNN)

This section first provides the basic BNN model structure and then presents the improved BNN model proposed in this paper. The selected evaluation metrics of the IBNN prediction model are briefly introduced. Finally, the impacts of the inputs and related factors are discussed.

2.1. Basic BNN Model Structure

A Bayesian feedforward neural network (BFFNN) model is selected as the basic neural network in our study for residential load forecasting. The principle of the Bayesian approach is described in [16, 26]. The FFNN structure is shown in Figure 1.

Figure 1: Structure of FFNN.

To obtain high accuracy in the electricity demand prediction, the authors also apply the most popular NN structure, the multilayer perceptron (MLP). This structure typically has an input layer, one or several hidden layers, and an output layer. Every layer obtains its weight and bias matrices through the Bayesian training algorithm.

It can be found from the process of BNN in Figure 2 that the NN forecasting model is built by finding a minimum error between the predicted values and the actual observed values; the weight matrices W and bias vectors b are adjusted by the defined rules in the software until the error E satisfies a defined error rule (E < ε), as shown in the following:

    E = (1/N) Σ_{i=1}^{N} [y_i − f_o(W_o · f_h(W_h · x_i + b_h) + b_o)]² < ε.    (1)

Figure 2: Structure diagram of BNN training algorithm.

Here in (1), y_i is the real historical value, W_h is the weight matrix of the hidden layer, b_h is the bias vector of the hidden layer, and W_o and b_o express the weight matrix and bias vector of the output layer, respectively. f_o and f_h are the functions of the output layer and hidden layer, respectively; with the typical choices of a hyperbolic tangent sigmoid hidden layer and a linear output layer, they are given by

    f_h(z) = 2 / (1 + e^(−2z)) − 1,    (2)

    f_o(z) = z.    (3)

After obtaining the forecast model, the forecast load can be given by (4). In the following equation, x is a vector containing the latest values of the inputs that form this forecast model, among which there may be one or several past historical load values, and ŷ is the forecast load value given by the built IBNN model:

    ŷ = f_o(W_o · f_h(W_h · x + b_h) + b_o).    (4)
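Once the trained weights and biases are available, the forward pass of (4) reduces to a few matrix operations. The following NumPy sketch assumes the tanh hidden layer and linear output layer described above; the function name is illustrative, and the weights would come from the Bayesian training step rather than being set by hand.

```python
import numpy as np

def ffnn_forecast(x, W_h, b_h, W_o, b_o):
    """Forward pass of the two-layer feedforward network in Eq. (4):
    yhat = f_o(W_o * f_h(W_h * x + b_h) + b_o), with a tanh hidden
    layer (f_h) and a linear output layer (f_o)."""
    hidden = np.tanh(W_h @ x + b_h)  # f_h: hyperbolic tangent sigmoid
    return W_o @ hidden + b_o        # f_o: linear (identity)
```

For the BNN_16 model of Section 3, x would be the 16-element input vector and the hidden layer would have around 8 neurons.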

From a time-series view, a dynamic BNN method is widely applied in prediction models, with past data fed back into the model. In this paper, the time-series forecasting problem is defined as Nonlinear Autoregressive with Exogenous Input (NARX), with feedback connections enclosing several layers of the network [27, 28]. The Bayesian time-series neural network structure can be simplified as in Figure 3, where TDL means tapped delay line.

Figure 3: Structure of TSNN.

However, the performance of the time-series BNN model is affected by the setup of the time delay line, and as the number of time delay values increases, the computing time grows significantly. Moreover, since there are only a few input vectors, such as time of day, day-type, ambient temperature, and relative humidity, and many of them are meteorological data, another problem is that the model is strongly affected by the meteorological data.

2.2. Improved Bayesian Neural Networks (IBNN) Model

The structure of the IBNN is illustrated in Figure 4. Besides the time inputs (time and day-type) and the meteorology inputs (temperature and humidity), the model also uses historical load data as inputs.

Figure 4: Structure and process of IBNN model.

Different from a traditional time series with continuous historical values or fixed-interval historical feedback, an improved model with highly correlated historical values of the prediction target as inputs is designed to obtain relatively higher accuracy with shorter computing time. The chosen historical data may be only one vector close to the forecast interval, or several vectors from the same time intervals and nearby intervals.

The Pearson correlation coefficient (ρ) and the Spearman correlation coefficient (ρ_s) are used to measure these correlations. If each variable has N scalar observations, the Pearson correlation coefficient ρ is defined as [29–31]

    ρ(A, B) = (1/(N−1)) Σ_{i=1}^{N} ((A_i − μ_A)/σ_A) · ((B_i − μ_B)/σ_B),    (5)

where μ_A and σ_A are the mean and standard deviation of A, respectively, and μ_B and σ_B are the mean and standard deviation of B. The above equation can also be described as a correlation coefficient based on the covariance of A and B,

    ρ(A, B) = cov(A, B) / (σ_A · σ_B).

The Spearman correlation coefficient ρ_s [32, 33], where d_i is the difference between the ranks of the i-th pair of observations, can be computed by the following equation:

    ρ_s = 1 − 6 Σ_{i=1}^{N} d_i² / (N(N² − 1)).    (6)
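Both coefficients are straightforward to compute from two observation vectors. The sketch below is illustrative (function names are assumptions); the rank-difference form of the Spearman coefficient used here is only valid when neither series contains ties.

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation: covariance of a and b divided by the
    product of their sample standard deviations."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.cov(a, b)[0, 1] / (a.std(ddof=1) * b.std(ddof=1))

def spearman(a, b):
    """Spearman rank correlation, 1 - 6*sum(d^2)/(N(N^2-1)),
    assuming no ties in either series."""
    a, b = np.asarray(a), np.asarray(b)
    ra = a.argsort().argsort()  # 0-based ranks of a
    rb = b.argsort().argsort()  # 0-based ranks of b
    d = (ra - rb).astype(float)
    n = len(a)
    return 1 - 6 * np.sum(d ** 2) / (n * (n ** 2 - 1))
```

A monotonic but nonlinear relationship illustrates the difference: the Spearman coefficient is exactly 1 while the Pearson coefficient falls below 1.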

Figures 5–7 show the correlation coefficient results of three residences (No. 11, No. 17, and No. 50, respectively) from a sample of 300 homes supplied by Ausgrid from July 2010 to June 2011.

Figure 5: Correlation coefficients of past time load vectors of No. 11.
Figure 6: Correlation coefficients of past time load vectors of No. 17.
Figure 7: Correlation coefficients of past time load vectors of No. 50.

As can be seen from the figures, the solid line indicates the correlation coefficients calculated between the different time-delay vectors and the current predicted value, and the dotted line connects the highest-value points of the correlation coefficients within the delays of each daily cycle. It can be clearly seen from the dotted line and the calculated values that the highest correlation coefficient in each daily cycle is obtained at a delay that is an integral multiple of 24 hours, that is, at the same time period of a past day. In addition, from the calculation results of No. 11 and No. 50 it can also be seen that the correlation coefficients calculated at weekly-cycle delays may be slightly higher than the other daily values. This is because most households show significant differences between working days and rest days, and individual households have a special weekly electricity consumption pattern, which gives the current forecast value high correlations with the same time delays on the same day-type of past weeks. To further compare the correlation coefficients numerically, partial results are listed in Table 1.

Table 1: Correlation coefficients of past time load vectors of three residences.

It can be seen from the table that the correlations between the historical electricity consumption over the past two hours and the current forecast period are significantly lower, and are even generally lower than those between the daily-cycle delays and the current forecast period. For families with obvious weekly cycles, such as No. 11 and No. 50, the correlations at the same weekly time period are even higher than the closest daily-delay correlation.

It can also be concluded from the above analysis that there are differences in the electricity consumption patterns of different households; different historical input data make the forecast models, and their performance, different. Specific analysis of specific targets is required, so that predictive variables with higher prediction accuracy can be selected. To find out how selecting historical data as inputs can best improve the performance of the forecast model, and how the model-related factors affect that performance, several comparison forecast models are designed in the next section.

2.3. Performance Evaluation

The following four indexes are selected to evaluate the forecast model performance. MSE is the mean square error, or the residual mean square. Equation (7) shows how to calculate the MSE value, where ŷ is the vector of predictions and y is the vector of observed values corresponding to the inputs of the function that generates the predictions; (y_i − ŷ_i)² stands for the square of the errors. An MSE value closer to 0 indicates a fit that is more useful for prediction, provided the model is not overfitted:

    MSE = (1/N) Σ_{i=1}^{N} (y_i − ŷ_i)².    (7)

MAPE is the mean absolute percentage error, an accuracy evaluation and comparison metric for forecasting methods in statistics, and an especially widely used metric in energy [32, 34]. The definition of MAPE is as follows:

    MAPE = (100%/N) Σ_{i=1}^{N} |(y_i − ŷ_i) / y_i|.    (8)

However, for small-household electricity consumption, the daily electricity use varies greatly with time. If only the MAPE is considered, large percentage errors will be obtained in low-consumption periods, so that the MAPE over the whole day will become very large. Low consumption is often only tens to hundreds of watt-hours per hour, but at peak times consumption often exceeds a kilowatt-hour per hour. In other words, it is more important to accurately predict the electricity consumption during peak hours, so the mean absolute error (MAE) is also adopted to evaluate the prediction accuracy. The MAE can be calculated as

    MAE = (1/N) Σ_{i=1}^{N} |y_i − ŷ_i|.    (9)

Another statistical metric, the regression coefficient R², is also applied to indicate the amount of variance explained by the model. R² is defined as in (10), where ȳ is the mean of the observed data values, SS_res = Σ(y_i − ŷ_i)² is the residual sum of squares, and SS_tot = Σ(y_i − ȳ)² is the total sum of squares (the sum of the residual and explained sums of squares). R² can take on any value between 0 and 1, with a value closer to 1 indicating that a greater proportion of variance is accounted for by the model [35]:

    R² = 1 − SS_res / SS_tot.    (10)
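The four indexes of (7)–(10) can be computed directly from the prediction and observation vectors. A NumPy sketch (the function name is an assumption); note that MAPE is undefined when an observed value is zero:

```python
import numpy as np

def evaluate(y_true, y_pred):
    """Compute the four evaluation indexes of Eqs. (7)-(10):
    MSE, MAPE (in percent), MAE, and R^2."""
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    err = y_true - y_pred
    mse = np.mean(err ** 2)                                # Eq. (7)
    mape = 100.0 * np.mean(np.abs(err) / np.abs(y_true))   # Eq. (8)
    mae = np.mean(np.abs(err))                             # Eq. (9)
    ss_res = np.sum(err ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot                             # Eq. (10)
    return {"MSE": mse, "MAPE": mape, "MAE": mae, "R2": r2}
```

A perfect forecast gives MSE = MAE = 0 and R² = 1, while a forecast no better than the observed mean gives R² near 0.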

3. Inputs Selection and Relative Factors Analysis

In this part, aiming to give better guidance on choosing the input vectors of the IBNN load forecasting model, the authors define several BNN models to analyze different factors.

3.1. Basic Inputs Selection of IBNN Model

At first, an IBNN model is built with 16 load-related inputs, named BNN_16. The vector of inputs is

    x = [t, d, T_t, T_{t−1}, T_{t−2}, H_t, H_{t−1}, H_{t−2}, L_{t−1}, L_{t−2}, L_{t−3}, L_{t−4}, L_{t−48}, L_{t−96}, L_{t−336}, L_{t−672}].    (11)

In the above equation, the input factors most often considered in existing models belong to the time category and the meteorological category. In this article, time of day, t, day-type, d (defined as the integers 1 to 7 for Monday to Sunday and 8 for special holidays), ambient temperature, T_t, and relative humidity, H_t, are considered first as inputs. The subscript t−i represents the value observed i intervals in the past, where the interval length Δt is the observation period. For instance, T_{t−1} means the temperature observed one interval in the past. Since there is no record before the first historical interval, the first available value is used as the initial value to complete the vector; if other vectors lack some items, the same completion method is used. According to [36], for marine climates or inshore areas, relative humidity may also affect electricity consumption. Since there is no time delay in this NN structure, and a study suggests that the human perception of temperature and relative humidity lags by some time [37], adding historical environmental data as input vectors is also very likely to improve the BNN prediction model's performance. With the same subscript convention, the eight historical load vectors at the end are the actual load in the first past interval, L_{t−1}, the second past interval, L_{t−2}, the third past interval, L_{t−3}, the fourth past interval, L_{t−4}, the same interval of yesterday, L_{t−48}, the same interval of the day before yesterday, L_{t−96}, the same interval of the same day-type last week, L_{t−336}, and the same interval of the same day-type in the week before last, L_{t−672}. These eight inputs basically cover the most relevant historical load values within the past two weeks. The target vector is defined as y_t, the time series of the latest observations.
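With half-hour intervals (48 per day, 336 per week), the eight historical-load inputs correspond to lags of 1–4, 48, 96, 336, and 672 intervals. The sketch below builds that part of the BNN_16 input matrix; as a simplification, and unlike the paper's padding of missing history with initial values, it drops the first rows that lack a complete history. Names are illustrative.

```python
import numpy as np

# Half-hour intervals: the eight historical-load lags used by BNN_16.
LOAD_LAGS = [1, 2, 3, 4, 48, 96, 336, 672]  # 48 = 1 day, 336 = 1 week

def build_load_features(load, lags=LOAD_LAGS):
    """Build the historical-load part of the input matrix.
    Row i holds load[i - lag] for each lag; the first max(lags)
    samples are dropped because they have no complete history."""
    load = np.asarray(load, dtype=float)
    start = max(lags)
    X = np.column_stack(
        [load[start - lag: len(load) - lag] for lag in lags])
    y = load[start:]  # forecast targets aligned with the rows of X
    return X, y
```

The time, day-type, temperature, and humidity columns of (11) would then be stacked alongside X to give the full 16-input matrix.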

The actual load data used in this article are from a sample of 300 homes supplied by Ausgrid from July 2010 to June 2011. The related weather information is from the Australian Bureau of Meteorology. No. 17 of the 300 homes is selected to validate and discuss the built forecasting models, as it has the largest average daily electricity demand: the average daily electricity consumption is 36.83 kWh and the average interval electricity consumption is 0.77 kWh. As the data are observed every half hour, the total sample number is 17520 for one whole year of 365 days. The training, validation, and test sets are created by a fixed partition algorithm called 'divideind' in MATLAB with 60%, 20%, and 20% of the samples, respectively.
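The fixed 60/20/20 partition can be reproduced outside MATLAB. The sketch below mimics the spirit of 'divideind' by assigning consecutive index blocks without shuffling (the function name and the use of rounding are assumptions):

```python
def divideind_split(n, ratios=(0.6, 0.2, 0.2)):
    """Fixed-index partition in the spirit of MATLAB's 'divideind':
    the first 60% of sample indices go to training, the next 20%
    to validation, and the remainder to testing (no shuffling)."""
    n_train = round(n * ratios[0])
    n_val = round(n * ratios[1])
    train = list(range(0, n_train))
    val = list(range(n_train, n_train + n_val))
    test = list(range(n_train + n_val, n))
    return train, val, test
```

For the 17520 half-hour samples of one year, this yields 10512 training, 3504 validation, and 3504 test indices.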

3.2. Models with Different Periods of Historical Data

As electricity demand has been shown to vary with changes in weather factors, the period of historical data, to a great extent, decides the performance of the obtained forecasting model. In Table 2, the number of input vectors is increased by extending the range of historical data.

Table 2: Inputs of IBNN with different period of historic data.

Because the demand usually has a weekly cycle, only vectors of real load data at the same time in past weeks are added here. The model is studied with real load data in the next part. Because the time interval of the data applied in this paper is 0.5 hour, to clearly illustrate the different time periods, the compared models are named after the real time period of their input selection.

According to the models built in Table 2, the following values are obtained from the IBNN program. When increasing the number of inputs with historic data, the results in Table 3 show that the performance on the training set becomes slightly better as the number of inputs rises. However, the test set is obtained with a random algorithm, and, as the table shows, the performance on the test set does not keep improving as the number of inputs increases. This suggests that if there are enough real load observations for training, there is no need to add too many historic input vectors.

Table 3: Results of IBNN models with different range of history data.

The comparison between the real data and the IBNN models with different numbers of historic inputs is illustrated in Figure 8. Obviously, the prediction model BNN_0.5hour is significantly less effective than the other three. The graph also shows that once the historical data reach a certain time range, using more input vectors with longer history cannot appreciably improve the accuracy of the forecasting model and only increases the computing time.

Figure 8: Comparison of real data and forecasting values under BNN_0.5hour, BNN_2 weeks, BNN_1 month, and BNN_2 months of 8 hidden neurons.
3.3. Models with Different Prediction Time Range

To test the accuracies of the IBNN models under different prediction times, comparison models with 16 inputs are designed, as listed in Table 4.

Table 4: Inputs of IBNN under different prediction time.

For better updating of the forecast learning model, in our design the chosen historical inputs should be the latest observed actual load data. With this consideration, it can be seen in Table 4 that the historical load input vectors change according to the prediction time range: the first model forecasts the load half an hour ahead, the second one hour ahead, and so on until the last, which forecasts the load 24 hours ahead. The models are distinguished by subscripts representing the prediction time range. The prediction of electricity demand is usually used for energy management and the control of electrical devices, so, depending on the optimization method, the forecasting model needs to have a reasonable prediction time.
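The paper does not give an explicit formula for how the historical inputs shift with the forecast horizon. One plausible reading, sketched below, keeps the four most recent loads that are actually observable at forecast time (lags horizon to horizon+3) while the daily and weekly cycle lags stay fixed; the function name and the exact rule are assumptions.

```python
def horizon_lags(horizon):
    """Load lags (in half-hour intervals) usable when forecasting
    `horizon` intervals ahead: 1 = 0.5 h ahead, 48 = 24 h ahead."""
    assert 1 <= horizon <= 48
    near = [horizon + k for k in range(4)]        # four latest observed loads
    cycles = [48, 96, 336, 672]                   # 1 day, 2 days, 1 week, 2 weeks
    cycles = [c for c in cycles if c >= horizon]  # drop lags inside the horizon
    return near + cycles
```

For a 0.5-hour horizon this reproduces the BNN_16 lags of (11), while for a 24-hour horizon the nearest usable load is already a full day old.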

To predict the electricity load several hours in advance, five forecasting models using the same sample data are built to examine the effectiveness of the IBNN model described above. Table 5 gives the results of the models with different forecasting times ahead.

Table 5: Results of BNN_0.5h, BNN_12h, and BNN_24h.

From the table it is clear that the IBNN forecasting model can give relatively high accuracy in very short-term forecasting. As the prediction time range extends, the forecasting accuracy and the R-square values become lower. However, when the range is more than 4 hours, there is no evident trend in the results. This means that when the existing sample data are used to forecast the load a very short time in advance, the accuracy is very high, but when forecasting the load in the following few hours or a day, the accuracy seems to settle around a lower boundary.

Following the discussion above, Figure 9 shows a two-day comparison between the real data and the forecast data at different forecasting times. From the graph, it is apparent that the dashed line, the result of the model forecasting 0.5 hour ahead, is very close to the black line, the actual load observation. The other two lines are the results of the models forecasting 12 hours and 24 hours in advance, respectively, and it is not easy to tell which of them is better.

Figure 9: Comparison of real data and forecasting values under BNN_0.5h, BNN_12h, and BNN_24h of 8 hidden neurons.
3.4. Range Requirement of Load Sample Data

Evidently, the forecasting model will have higher accuracy with a longer period of historic electricity data. However, it is more useful to provide a forecasting model that can predict the electricity load even with a relatively short recorded period, such as one year or just several months. To discuss this problem, the authors assess the IBNN forecasting model (BNN_16) with 8 hidden neurons on five different periods of actual sample data: 1 month, 3 months, half a year, 9 months, and a whole year.

The values in Table 6 show that, with a longer history period, the MSE of the training set becomes higher and the R-square becomes lower; however, the performance on the test set gets better. As the case studied here is one home in Sydney, seasonal weather factors must be taken into consideration. From the table, it can be concluded that, to obtain higher prediction accuracy, at least half a year of load records should be used to build the forecasting model.

Table 6: Results of BNN_16 model with different periods of historical data.

4. Results and Discussion

4.1. Comparison between IBNN and TSNN

The same dataset as above is chosen for the following validation and analysis. The authors use the Neural Net toolbox in MATLAB to run the designed forecasting model. The time-series Bayesian Neural Network (TS_BNN) method has been described in the Introduction. The inputs of the time-series BNN are defined as follows, with the past load values fed back through the tapped delay line:

    x_TS = [t, d, T_t, H_t].    (12)

Table 7 lists the computing results of the TS_BNN model and BNN_16 under different numbers of hidden neurons or time delays. The authors first analyze the effect of the time delays and hidden neurons on the TS_BNN forecasting model. From the results it can be found that, as the closed-loop input time delays of TS_BNN increase, the performance of the prediction model evidently improves. However, when the time delays are set to 50 intervals (25 hours, one hour more than one day), the running time reaches three and a half minutes.

Table 7: Results of time-series BNN and BNN_16 model.

Table 7 also shows that, with an increasing number of hidden neurons, the MSE of the training set becomes lower and the R-square values become higher; on the surface, performance seems better. However, the MSE values of the test set do not decrease, nor do its R-square values increase, consistently with the number of hidden neurons. These computing results show that the prediction model does not obtain uniformly better performance as hidden neurons are added. In Table 7, when BNN_16 has 8 hidden neurons, the MSE and R-square of the test set are already close to their optimal values; with more than 8 hidden neurons, the results fluctuate repeatedly. Correspondingly, it can be seen that, with more hidden neurons, the running time of the BNN prediction model is obviously prolonged. Therefore, it is very important to select an appropriate number of hidden neurons for the model, to improve the performance and efficiency of the BNN model and reduce the computation time. In general, the number of hidden neurons lies within the range of 3 to 8. It should also be noted that the best performance of the TS_BNN model occurs with 50 time delays and 2 hidden neurons, at more than three minutes of running time, while the closest performance of BNN_16 occurs with 8 hidden neurons at only six seconds. Obviously, the BNN_16 forecast model significantly reduces the time.

A randomly selected comparison between the actual recorded data and the IBNN model can be seen in Figure 10, which covers one week in October 2010; the week shown for this comparison of forecast and real load values was simply chosen at random. From the figure, it is obvious that the forecasting model follows the varying trend of the real data well. However, when the actual data change severely, the deviation between the actual and forecast data increases to a certain extent.

Figure 10: Comparison of real data and forecasting values under IBNN of 8 hidden neurons.

Table 8 discusses the influence of input vectors under the TS_BNN model. Because of the time-delay feedback of historical load data, the TS_BNN model uses only four related factors as input vectors: time, temperature, relative humidity, and day-type. The approach used here is to eliminate one of the relevant factors and observe the computation results.

Table 8: Results of time-series BNN under different input vectors.

From the above results, although it is difficult to say which related factor has more impact on the consumer's electricity consumption, comparing the situations of eliminating one factor against using all four input vectors shows that removing any one of the four inputs lowers the performance of the prediction model below that of the model with all four inputs. This means that all four factors have an influence on the consumer's electricity load.

To analyze the impacts of the different input factors considered on the IBNN model, the authors define 13 models with different numbers of input vectors (listed in Table 9).

Table 9: Comparison models with the basic IBNN with 16 inputs.

Besides discussing the impact of each factor, the authors also want to find out whether increasing the number of historic temperature or humidity inputs can improve the forecasting performance. Under this consideration, two models called Less T and Less RH are defined: instead of applying two additional past historical input vectors of temperature and humidity, they apply only one temperature input and one humidity input.

As mentioned above, the effects of different input factors on the model are analyzed. The MSE and R-square of the training set are calculated with 3, 8, and 15 hidden neurons, respectively, to look for consistent results and exclude the impact of the system randomly selecting data.

From the calculated values in Table 10, the following results are obtained. First, the comparison of 'Less T' and 'Less RH' with the normal 16-input model (BNN_16) shows that increasing the number of historic weather inputs cannot improve the model's forecasting performance and, surprisingly, slightly lowers it. However, the results in the table only show that this household is not very sensitive to temperature and humidity, or suggest the deduction that adding historical load vectors as inputs enhances the robustness of the prediction model. Second, deleting one related factor from the input vectors gives four items, 'no T', 'no RH', 'no time', and 'no day-type', to compare with the results of BNN_16. The temperature has the largest impact on the model's accuracy, as the MSE of 'no T' becomes the biggest and its R-square the lowest. There is no noticeable effect of the other three factors in the obtained values. However, it cannot be concluded from the simple calculated values alone that these three factors have no positive effect on improving the model. According to basic load forecasting knowledge, time and day-type significantly affect some users' consumption patterns, so the above results only show that, under this IBNN model, the effect of these three factors is not obvious.

Table 10: Results on analyzing impacts of related input factors.

From Tables 9 and 10, the comparison of results between the TS_BNN model and the BNN_16 model validates the earlier deduction that the historical load input vectors enhance the robustness of the prediction model. In other words, increasing the number of input vectors of historical load data may improve the stability of the prediction model, while the influence of other factors on the performance is relatively reduced.

In order to further validate the effectiveness of the model, Table 11 uses the same BNN_16 structure to train prediction models for 15 households randomly selected from the same Ausgrid yearly dataset.

Table 11: Results on different datasets.

As can be found from Table 11, the training performances of TSNN and IBNN are very close, but the computing time is reduced by an average factor of 31. The smallest reduction is for household No. 28, with an 8-fold speedup, and the largest is for household No. 55, with a speedup of more than 83 times.

4.2. Comparison with Other Prediction Methods

The comparison is between the proposed IBNN method and the methods from the MATLAB Statistics and Machine Learning Toolbox. The parameters of the machine learning methods are left at their defaults. Table 12 lists all the computing results, with the same inputs as BNN_16 on the No. 17 household dataset.

Table 12: Performance results of prediction methods.

As can be seen from Table 12, the MSE and R values of the IBNN method are the best among the compared machine learning methods. Although the MAE of the IBNN is slightly inferior to that of the Bagged Trees method, its computation time is reduced by nearly five times. In order to compare the various methods more intuitively, the authors assign each method a number as the horizontal coordinate and plot the four evaluation indicators, respectively, as shown in Figure 11.

Figure 11: Performance comparison of prediction methods.

5. Conclusion

The traditional time-series BNN load forecast model has some problems when applied to residential load forecasting, such as long running time and relatively strong dependence on time and weather factors. To solve these problems, an improved BNN forecast model based on the basic BNN training method and a simple FFNN structure is built by augmenting historical load data as inputs, guided by correlation analysis of electricity consumption at different time delays. Further, from the analysis of impact factors, covering different inputs, number of hidden neurons, historic period of data, forecasting time range, and range requirement of sample data, advice is given on how to better choose these factors. To validate the effectiveness of the IBNN model, several residential sample datasets covering a whole year from Ausgrid were selected to build the IBNN models. Compared with the time-series prediction model and commonly applied machine learning methods, the IBNN model significantly reduces calculating time and, even when the time or meteorological factors are missing, still predicts the electricity demand with high accuracy. Future work will focus on the application of the IBNN forecasting model in renewable residential energy management, especially for PV-storage systems.

Data Availability

The data used in this article are provided by the power company Ausgrid and can be found via the following link.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors would like to thank Ausgrid and the Bureau of Meteorology in Australia for the data used in this article. This work was supported by the National Natural Science Foundation of China (Grants nos. 51478258 and 51405287) and the Shanghai Committee of Science and Technology (Grant no. 18030501300).

