Journal of Applied Mathematics
Volume 2013, Article ID 953548, 10 pages
http://dx.doi.org/10.1155/2013/953548
Research Article

Sensitivity Analysis of Wavelet Neural Network Model for Short-Term Traffic Volume Prediction

Jinxing Shen and Wenquan Li

Transportation College, Southeast University, Nanjing, Jiangsu 210096, China

Received 23 August 2013; Revised 12 December 2013; Accepted 13 December 2013

Academic Editor: Han H. Choi

Copyright © 2013 Jinxing Shen and Wenquan Li. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

In order to achieve a more accurate and robust traffic volume prediction model, the sensitivity of the wavelet neural network model (WNNM) is analyzed in this study. Based on real loop detector data provided by the traffic police detachment of Maanshan, WNNM is examined with different numbers of input neurons, different numbers of hidden neurons, and traffic volumes aggregated over different time intervals. The test results show that the performance of WNNM depends heavily on the network parameters and on the time interval of the traffic volume. The WNNM with 4 input neurons and 6 hidden neurons is the optimal predictor in terms of accuracy, stability, and adaptability, and a much better prediction record is achieved when the time interval of the traffic volume is 15 minutes. In addition, the optimized WNNM is compared with the widely used back-propagation neural network (BPNN). The comparison results indicate that WNNM produces much lower values of MAE, MAPE, and VAPE than BPNN, which shows that WNNM performs better for short-term traffic volume prediction.

1. Introduction

The main purpose of short-term traffic volume prediction is to determine the traffic volume in the next short period of time [1]. There are a wide variety of needs for short-term traffic volume prediction depending on particular applications [2]. In fact, an accurate and robust short-term traffic volume prediction model is essential for developing proactive traffic control strategies to reduce congestion, delay, accident risks, and fuel consumption. In particular, the results of the short-term traffic volume prediction are fundamental to the performance of many components in intelligent transportation systems [3].

In the past decades, a great deal of work has focused on the subject of short-term traffic volume prediction. The methodologies most commonly used in previous work include Kalman filter theory [4, 5], time series models [6–9], nonparametric methods [10, 11], and so forth. Previous studies have reported that the above models accurately predict short-term traffic volume when the traffic volume is relatively constant. However, when short-term traffic volume fluctuates widely as the time interval shortens, the above methods do not perform well in predicting traffic volume variations.

Researchers then proposed several nonlinear prediction approaches, such as neural networks, to capture the nondeterministic and complex nonlinearity of the time series in traffic volume data. Among them, the multilayer perceptron (MLP) network was the most commonly used nonlinear approach for short-term traffic volume data [12]. An MLP network consists of three types of layers: input, hidden, and output. It is normally trained with the back-propagation algorithm, which minimizes the sum of squared errors between the desired and actual outputs (this method is referred to as “BPNN”) [13]. The BPNN model is superior to linear statistical models, such as ARIMA, because it is more sensitive to the dynamics of traffic volume and does not exhibit the overprediction characteristic of linear statistical models [14]. However, as noted by Adeli and Hung [15] and by Karlaftis and Vlahogianni [16], the MLP model has inherent shortcomings, such as the lack of an efficient constructive model, a slow convergence rate resulting in excessive computation time, and entrapment in local minima.

In recent years, to overcome the drawbacks associated with the MLP network, the wavelet neural network model (WNNM) has been increasingly used for short-term traffic volume modeling and prediction. The advantage of WNNM is that it combines the strengths of the discrete wavelet transform and neural network processing to achieve strong nonlinear approximation ability. As discussed by Chen [17], WNNM yields much better results than BPNN and can enhance prediction accuracy; thus, WNNM has shown potential superiority in short-term traffic volume prediction. Xie and Zhang [18] investigated the performance of WNNM, the BPNN model, and the radial basis function neural network (RBFNN) for predicting short-term traffic volume. Previous studies have generally reported that WNNM achieves higher prediction accuracy for short-term traffic volume than the other models and is a good predictor with reasonable accuracy, stability, and adaptability. Thus, the WNNM was adopted in this study for the analysis of short-term traffic volume prediction.

One of the most important features of the WNNM is that its performance relies heavily on the inherent network design parameters, such as the number of input neurons and the number of hidden neurons. Optimizing the WNNM significantly improves the predictive accuracy and reduces the training time [19]. The number of parameter combinations in a neural network is extremely large [20], so many researchers [21, 22] have used the genetic algorithm (GA) to optimize neural network parameters. However, although the GA can obtain the optimal parameters, the whole operation and decision process remains opaque. In other words, the GA does not provide information about the relationship between the prediction results and changes in the model parameters. In our study, we would like to determine how different parameters of the WNNM affect the predictions of short-term traffic volume, rather than merely to obtain the optimal model. Thus, the GA was considered inappropriate for our study purposes and is not analyzed further.

Additionally, the time interval should be carefully selected when predicting traffic volume, since its determination can play a significant role in prediction performance [23]. In practical engineering, the time interval for prediction should not be too short, because the data may contain many random fluctuations and white noise that affect the prediction; besides, a short time interval may not be directly useful for field traffic management [24]. A reasonably selected time interval saves space for data storage and improves the prediction accuracy [25]. However, the selected time interval should not be too long either, since the changing trend of short-term traffic volume must still be captured [26]. Previously, the time intervals considered for short-term traffic volume prediction varied from 5 minutes to 20 minutes. In our study, we evaluated several different time intervals to determine the optimal value for traffic volume prediction on our dataset.

From the analysis above, it becomes clear that there is both a practical and a theoretical need to study how the uncertainty in the prediction results of WNNM can be apportioned to different sources of uncertainty in the inputs. Accordingly, the sensitivity of WNNM is analyzed in this study. The optimal model parameters are selected by studying the uncertain factors associated with different numbers of input neurons, different numbers of hidden neurons, and different time intervals of the data set. The remainder of the paper is organized as follows. Section 2 briefly introduces the methods for predicting short-term traffic volume. The data collection process and the approaches to sensitivity analysis used in this study are described in Section 3. In Section 4, the sensitivity analysis of WNNM and the comparison between WNNM and BPNN are carried out. Finally, Section 5 summarizes the concluding remarks of the study.

2. Methods

2.1. Wavelet Neural Network Model

Wavelet neural networks integrate the theory of wavelets with neural networks. One application of wavelet neural networks is function estimation: given a series of observed values of a function, a wavelet network can be trained to learn the composition of that function and hence calculate an expected value for a given input [27]. The structure of a wavelet neural network is very similar to that of a three-layer neural network (see Figure 1).

Figure 1: Structure of a wavelet neural network.

Figure 1 shows a feed-forward neural network with an input layer, a hidden layer, and an output layer consisting of one or more linear combiners, or summers. The hidden layer is made of neurons whose activation functions are drawn from a wavelet basis. The network output is defined as
\[ y = \sum_{j=1}^{h} w_j \, \psi\!\left(\frac{x - b_j}{a_j}\right), \]
where $x$ is the input data, $a_j$ is the scale or dilation factor that determines the characteristic frequency, so that its variation gives rise to a “spectrum”, and $b_j$ stands for the translation in time, so that its variation represents the “sliding” of the wavelet over $x$. $w_j$ is the weight connecting hidden neuron $j$ to the output layer, and $\psi$ is the Morlet wavelet function, with which the wavelet coefficients on discrete dyadic scales and positions in time can be calculated:
\[ \psi(x) = \cos(1.75x)\, e^{-x^{2}/2}. \]
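As an illustration, the forward pass described above can be sketched in a few lines. This is a minimal sketch, not the authors' implementation: the layer sizes, the random weights, and the choice to feed the summed inputs into each wavelet neuron are all assumptions made for the example.

```python
import numpy as np

def morlet(x):
    # Morlet mother wavelet: cos(1.75 x) * exp(-x^2 / 2)
    return np.cos(1.75 * x) * np.exp(-x**2 / 2)

def wnn_forward(x, a, b, w):
    """Forward pass of a single-output wavelet neural network.
    x: input vector; a: dilations (n_hidden,); b: translations
    (n_hidden,); w: output weights (n_hidden,).
    Assumption for this sketch: each hidden neuron applies the
    Morlet wavelet to the dilated, translated sum of the inputs."""
    z = (np.sum(x) - b) / a          # dilate and translate
    return float(w @ morlet(z))      # linear combiner (summer)

rng = np.random.default_rng(0)
x = rng.random(4)                     # 4 input neurons
a = rng.random(6) + 0.5               # 6 hidden neurons
b = rng.random(6)
w = rng.random(6)
print(wnn_forward(x, a, b, w))
```

With 4 input and 6 hidden neurons this mirrors the network sizes that the sensitivity analysis below identifies as optimal.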

The prediction can be expressed in terms of the input-output data sets as
\[ \hat{y}(t+1) = f\big(y(t)\big) + g\big(u(t)\big) + e(t), \]
where $y(t)$ and $u(t)$ represent the current flow status and the feedback traffic flow input vector at time $t$, $f$ and $g$ are scalar nonlinear mapping functions, and $e(t)$ is the error between the actual and predicted values of the future traffic flow output $y(t+1)$.

The prediction error $E$ is calculated by the function
\[ E = \frac{1}{2} \sum_{t} \big( \hat{y}(t) - y(t) \big)^{2}, \]
where $\hat{y}(t)$ is the forecast data and $y(t)$ stands for the actual data. During the calculation process, the parameters $w_j$, $a_j$, and $b_j$ are updated by feeding back the prediction error:
\[ w_j \leftarrow w_j - \eta \frac{\partial E}{\partial w_j}, \qquad a_j \leftarrow a_j - \eta \frac{\partial E}{\partial a_j}, \qquad b_j \leftarrow b_j - \eta \frac{\partial E}{\partial b_j}. \]

Here, $\eta$ is the learning rate with which the variables are updated in each iteration.

2.2. Performance Criteria

The performance of WNNM was evaluated using the statistical indices mean absolute error (MAE), mean absolute percentage error (MAPE), variance of absolute percentage error (VAPE), and the correlation coefficient $R$. MAE and MAPE measure the mean prediction accuracy, and VAPE reflects the prediction stability. $R$ measures the linear correlation between observed and modeled values; its optimal value is 1.0:
\[ \mathrm{MAE} = \frac{1}{n} \sum_{t=1}^{n} \left| y_t - \hat{y}_t \right|, \qquad \mathrm{MAPE} = \frac{1}{n} \sum_{t=1}^{n} \left| \frac{y_t - \hat{y}_t}{y_t} \right|, \]
\[ \mathrm{VAPE} = \mathrm{Var}\!\left( \left| \frac{y_t - \hat{y}_t}{y_t} \right| \right), \qquad R = \frac{\sum_{t=1}^{n} (y_t - \bar{y})(\hat{y}_t - \bar{\hat{y}})}{\sqrt{\sum_{t=1}^{n} (y_t - \bar{y})^{2}} \, \sqrt{\sum_{t=1}^{n} (\hat{y}_t - \bar{\hat{y}})^{2}}}, \]
where $n$ is the number of observations, $y_t$ and $\hat{y}_t$ are the observed and calculated values at time $t$, respectively, and $\bar{y}$ and $\bar{\hat{y}}$ are the means of the observed and calculated values.
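These four criteria are straightforward to compute. A small sketch of the standard definitions (the function name is illustrative):

```python
import numpy as np

def criteria(obs, pred):
    """MAE, MAPE, VAPE, and correlation coefficient R between
    observed and predicted series. Sketch of the standard
    definitions; obs must be strictly positive for MAPE/VAPE."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    ape = np.abs(obs - pred) / obs          # absolute percentage errors
    mae = np.mean(np.abs(obs - pred))       # mean absolute error
    mape = np.mean(ape)                     # mean absolute percentage error
    vape = np.var(ape)                      # variance of APE (stability)
    r = np.corrcoef(obs, pred)[0, 1]        # linear correlation
    return mae, mape, vape, r

obs = [120, 98, 110, 130]                   # illustrative volumes
pred = [115, 100, 108, 126]
print(criteria(obs, pred))
```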

3. Data Collection and Test Design

3.1. Data Collection

All the data used in this paper were provided by the Traffic Police Detachment of Maanshan. These data were collected by loop detectors installed on the segment of Yushan Road from Huxi Road to Jiangdong Road in Maanshan, China. The data set contains 24-hour traffic volume data from May 1 to May 31, 2012. The traffic volume data gathered from Yushan Road were recorded at 5-minute intervals.

Yushan Road, as a trunk road of the city, plays an important role in residents' trips, and traffic volumes are similar on each workday because of regular commuting. However, an analysis of the raw traffic volume data shows that traffic volume differs markedly between workdays and weekends. In addition, traffic volume varies noticeably in response to special events, such as incidents, accidents, and even severe weather. Therefore, to avoid such disturbances, the data series collected during accidents and bad weather were excluded.

However, owing to causes beyond human control, some data were lost on May 9, 22, and 30. The lost data were filled in with the mean of the traffic volumes recorded at the same loop detector during the two adjacent time intervals on those three days.
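The imputation step above can be sketched as follows. This is a minimal sketch that assumes isolated single-interval gaps; the function name and the NaN encoding of missing counts are assumptions for the example.

```python
import numpy as np

def impute_adjacent(volumes):
    """Fill a missing (NaN) 5-minute count with the mean of the
    two adjacent intervals at the same detector. Sketch only:
    handles isolated gaps and leaves the endpoints untouched."""
    v = np.asarray(volumes, float).copy()
    for i in range(1, len(v) - 1):
        if np.isnan(v[i]) and not np.isnan(v[i - 1]) and not np.isnan(v[i + 1]):
            v[i] = 0.5 * (v[i - 1] + v[i + 1])
    return v

print(impute_adjacent([100, np.nan, 110, 95]))
```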

In total, this yielded a data set of 12 days containing 3456 five-minute traffic volume observations, as shown in Figure 2.

Figure 2: Traffic volume data for 5-minute intervals from May, 2012.

Figure 2 shows that, although the traffic volumes in any given period differ across days, all days exhibit the same overall trend.

3.2. Approaches to Sensitivity Analysis

This paper focuses on how to obtain the optimal model parameters of WNNM so as to improve the accuracy and stability of short-term traffic volume prediction. To solve this problem, we have to evaluate the outputs of WNNM for different numbers of input neurons, different numbers of hidden neurons, and traffic volume data sets with different time intervals. Given the large number of possible parameter combinations, one of the simplest and most common approaches is to change one factor at a time and observe the effect on the output [28].

Here, we assume that $\hat{Q}$ is the short-term traffic volume predicted by WNNM, $Q$ is the real traffic volume, $m$ is the number of input neurons, $h$ is the number of hidden neurons, and $\Delta t$ is the time interval of the traffic volume, so that
\[ \hat{Q} = F(m, h, \Delta t). \]

The relationship between $Q$ and $\hat{Q}$ can then be written as
\[ E = Q - \hat{Q}, \]

where $E$ represents the error between $Q$ and $\hat{Q}$. This error cannot be judged by any single evaluation index. Hence, the values of MAE, MAPE, VAPE, and $R$ are used to assess how the different parameters influence the prediction results. The steps of the sensitivity analysis of WNNM can be described as follows.

At first, the number of input neurons, $m$, is changed; the number of hidden neurons $h$ is generated at random and remains the same throughout the procedure, and the time interval $\Delta t$ also remains the same. The optimal number of input neurons, $m^{*}$, can be inferred from a comparative analysis of the different prediction results.

Then, the number of hidden neurons, $h$, is changed, with the number of input neurons fixed at $m^{*}$ and $\Delta t$ unchanged. The optimal number of hidden neurons, $h^{*}$, is obtained by repeating the computing process described above.

Finally, the time interval of the traffic volume, $\Delta t$, is changed, with the number of input neurons fixed at $m^{*}$ and the number of hidden neurons fixed at $h^{*}$. The best time interval for the short-term traffic volume is obtained after examining its influence on the predictions of WNNM.

This is a logical approach: if a single parameter changes, the prediction results change accordingly. Furthermore, by changing one parameter at a time, all other parameters can be kept fixed. This increases the comparability of the results and minimizes the chance of program crashes, which is higher when several parameters are changed simultaneously.
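The three-step, one-factor-at-a-time procedure can be sketched as a simple search loop. `train_and_score` is a hypothetical placeholder for training the WNNM and returning an error criterion; the candidate ranges mirror those explored later in the paper.

```python
def train_and_score(n_input, n_hidden, interval_min):
    # Placeholder objective standing in for "train WNNM, return error".
    # Any real implementation would train the network and compute MAPE.
    return abs(n_input - 4) + abs(n_hidden - 6) + abs(interval_min - 15) / 5

def one_factor_at_a_time():
    base = {"n_input": 1, "n_hidden": 5, "interval_min": 5}
    # Step 1: vary the number of input neurons, others fixed
    base["n_input"] = min(range(1, 9),
        key=lambda m: train_and_score(m, base["n_hidden"], base["interval_min"]))
    # Step 2: vary the number of hidden neurons with the chosen input size
    base["n_hidden"] = min(range(1, 12),
        key=lambda h: train_and_score(base["n_input"], h, base["interval_min"]))
    # Step 3: vary the aggregation interval with both sizes fixed
    base["interval_min"] = min([5, 10, 15, 20, 30],
        key=lambda dt: train_and_score(base["n_input"], base["n_hidden"], dt))
    return base

print(one_factor_at_a_time())
```

With the placeholder objective the loop recovers the configuration (4 input neurons, 6 hidden neurons, 15-minute interval) that the experiments below identify.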

4. Implementation and Findings

4.1. Sensitivity Analysis of Variable Input Neurons

For the purpose of evaluating how the number of input neurons influences the prediction accuracy of WNNM, the number of input neurons was set from 1 to 8. When it was set to 1, the traffic volume on May 23 and 24 was selected as the training set: data on May 23 served as the input and data on May 24 as the output. In addition, the traffic volume on May 30 and 31 was selected as the testing set: data on May 30 served as the input and data on May 31 as the output. In the same way, when the number of input neurons was set to 8, the data from May 8 to 24 was selected as the training set: data from May 8 to 23 was the input and data on May 24 the output. The data from May 15 to 31 was set as the testing set: data from May 15 to 30 was the input and data on May 31 the output. The other parameters of WNNM were determined as follows: the number of hidden neurons was generated at random from 1 to 10 and stayed constant during training; the learning rate was set to 0.1; the momentum rate was fixed at 0.05; and the number of training epochs was set to 1000. The criterion values are shown in Table 1.
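The construction of day-indexed input-output pairs in the spirit of the splits above can be sketched as a sliding window over days. This is a simplified interpretation, not the authors' exact pairing: it takes m consecutive days as the input and the following day as the target.

```python
def make_day_windows(days, m):
    """Build (input, output) pairs where m consecutive days form
    the input and the following day the target. Sketch over a
    day-indexed series; `days` can hold day labels or volume arrays."""
    pairs = []
    for i in range(len(days) - m):
        pairs.append((days[i:i + m], days[i + m]))
    return pairs

days = list(range(8, 25))                 # May 8 ... May 24 as day labels
print(make_day_windows(days, 8)[-1])      # last pair: 8 days in, next day out
```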

Table 1: Comparison of performance among WNNMs with different number of input neurons.

From Table 1, it is obvious that when the number of input neurons is set to 4, the minimum values of MAE and VAPE are obtained, 6.3971 and 0.1243, respectively. When the number of input neurons is set to 6, the minimum MAPE of 0.2949 is obtained. As for the correlation indicator, when the number of input neurons is set to 4, the correlation coefficient $R$ reaches 0.9232, which means that the predicted and observed short-term traffic volumes are highly correlated.

From Figure 3, it is obvious that the optimal number of input neurons is 4. The WNNM with 4 input neurons produces the lowest MAE of all the models, about 7.92% lower than the average. Although the WNNM with 6 input neurons produces the lowest MAPE, the WNNM with 4 input neurons shows better correlation between observed and modeled values: its linear correlation value is about 2.20% higher than that of the WNNM with 6 input neurons.

Figure 3: Trends of performance criteria of WNNMs with different number of input neurons.
4.2. Sensitivity Analysis of Variable Hidden Neurons

In this experiment, the number of hidden neurons varied from 1 to 11 during the training of WNNM, and the number of input neurons was set to 4. Accordingly, the traffic volume from May 16 to 24 was selected as the training set: the data from May 16 to 23 was the input and the data on May 24 the output. The data from May 23 to 31 was selected as the testing set: the data from May 23 to 30 was the input and the data on May 31 the output.

In addition, the other parameters of WNNM were determined as follows: the learning rate was set to 0.1, the momentum rate was fixed at 0.05, and the number of training epochs was set to 1000. The criterion values of MAE, MAPE, VAPE, and $R$ for the WNNM predictions are shown in Table 2.

Table 2: Comparison of performance among WNNMs with different number of hidden neurons.

From Table 2, the test results show that when the number of hidden neurons is 6, the lowest values of MAE, MAPE, and VAPE are obtained: 6.3971, 0.3169, and 0.1243, respectively. However, the highest correlation coefficient, 0.9254, is obtained when the number of hidden neurons is set to 9.

From Figure 4, it is apparent that 6 is the optimal number of hidden neurons. The WNNM with 6 hidden neurons produces the lowest values of MAE, MAPE, and VAPE of all the models: its MAE is about 5.14% lower than the average MAE, its MAPE about 17.92% lower than the average MAPE, and its VAPE about 5.14% lower than the average VAPE. The WNNM with 9 hidden neurons has the best linear correlation value, but it is only about 0.24% higher than that of the WNNM with 6 hidden neurons. Therefore, the WNNM with 6 hidden neurons achieves a correlation similar to that of the WNNM with 9 hidden neurons.

Figure 4: Trends of performance criteria of WNNMs with different number of hidden neurons.
4.3. Sensitivity Analysis of Variable Traffic Volume for Different Time Interval

Whether different time intervals of the short-term traffic volume influence the prediction accuracy of WNNM deserves further discussion. We therefore prolonged the time interval from 5 minutes to 10, 15, 20, and 30 minutes. During the training of WNNM, the number of input neurons was set to 4, and the traffic volume from May 16 to 24 was selected as the training set, in which the data from May 16 to 23 was the input and the data on May 24 the output. The data from May 23 to 31 was selected as the testing set, in which the data from May 23 to 30 was the input and the data on May 31 the output.
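Aggregating the 5-minute counts into these longer intervals amounts to summing consecutive bins. A minimal sketch, assuming the interval is a multiple of 5 minutes and the series length is a multiple of the bin size:

```python
import numpy as np

def aggregate(volumes_5min, interval_min):
    """Aggregate 5-minute counts into longer intervals by summing
    consecutive bins. Sketch only: interval_min must be a multiple
    of 5 and len(volumes_5min) a multiple of the bin size."""
    k = interval_min // 5
    v = np.asarray(volumes_5min)
    return v.reshape(-1, k).sum(axis=1)

print(aggregate([10, 12, 8, 15, 9, 11], 15))   # two 15-minute bins
```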

What is more, the number of hidden neurons was set to 6, and the initial weights were updated as soon as the output error of one training pattern was calculated. The learning rate was set to 0.1, the momentum rate was fixed at 0.05, and the number of training epochs was set to 1000. The criterion values of MAE, MAPE, VAPE, and $R$ for the WNNM predictions are shown in Table 3.

Table 3: Comparison of performance among WNNMs with different time intervals.

From Table 3, it is clear that when the time interval is set to 5 minutes, the lowest MAE, 6.3971, is obtained, while the lowest values of MAPE and VAPE, 0.3021 and 0.0322, are obtained when the time interval is prolonged to 15 minutes.

Besides, the testing results indicate that the largest correlation coefficient, 0.9407, is obtained when the time interval is set to 15 minutes, which demonstrates the highest correlation between the predictions of WNNM and the observed volumes.

From Figure 5, the optimal time interval is evidently 15 minutes. Although the data aggregated at the 5-minute interval yields the lowest MAE, the values of MAPE, VAPE, and $R$ all indicate that the results at the 15-minute interval are superior to the others. As the time interval increases, the magnitude of the values in the input data set grows correspondingly; in this situation, the criterion MAE is likely unsuitable for comparing models across different time intervals.

Figure 5: Trends of performance criteria of data set with different time intervals.
4.4. Comparison between WNNM and BPNN

Previously, many methods have been used for the prediction of short-term traffic volume. Among them, the BPNN has been widely accepted as a commonly used method that has good performance for predicting traffic volume [14]. As a result, BPNN was selected to make comparisons with the optimized WNNM on short-term traffic volume prediction.

For the purpose of comparison, the traffic volume from May 16 to 24 was selected as the training dataset for both BPNN and WNNM: when training the two models, data from May 16 to 23 was the input and data on May 24 the output. The traffic volume data from May 23 to 31 was selected as the testing dataset: data from May 23 to 30 was the input and data on May 31 the output. The other parameters of the two models were determined as follows: the number of hidden neurons was set to 6, the learning rate was set to 0.1, the momentum rate was fixed at 0.05, and the number of training epochs was set to 1000. The predictions of traffic volume by WNNM and BPNN are shown in Figure 6.

Figure 6: Traffic volume in test set and its prediction by WNNM and BPNN.

The original traffic volume data is shown as the black line in Figure 6, together with the predictions of the two models. The predictions made by WNNM fit the original traffic volume data better than those of BPNN, especially when the traffic volume exceeds 100 pcu. We also calculated the values of MAE, MAPE, VAPE, and $R$ to compare the two models; the comparison results are given in Table 4.

Table 4: Comparison of performance between WNNM and BPNN.

The comparison results are quite encouraging. As shown in Table 4, the optimal WNNM achieves lower values of MAE, MAPE, and VAPE than BPNN, and a larger value of $R$. Relative to BPNN, the optimal WNNM reduced the values of MAE, MAPE, and VAPE by 16.85%, 23.47%, and 187.35%, respectively, and increased $R$ by 0.69%. These results suggest that the optimal WNNM has higher prediction accuracy than BPNN in the field of short-term traffic volume prediction.

5. Conclusion

In order to achieve a more accurate and robust traffic flow prediction model, the sensitivity of WNNM was discussed in detail in this study. Models with different numbers of input neurons, different numbers of hidden neurons, and data sets with different time intervals were studied. The average criterion values of MAE, MAPE, VAPE, and $R$ for each model were calculated and provided in the last line of each table. The test results show that the performance of WNNM depends heavily on the network design parameters and on the time interval of the data set, and that the WNNM with 4 input neurons and 6 hidden neurons is the best choice in terms of accuracy, stability, and adaptability. In general, the optimal time interval is considered to be 15 minutes, which yields the most accurate prediction.

In this study, the short-term traffic volume data for the sensitivity analysis of WNNM was provided by the traffic police detachment of Maanshan. It should be cautioned that the research conditions, in particular that the sample data comes from a single city rather than from several cities, limit further discussion of the sensitivity of WNNM. Nevertheless, WNNM can be applied in different situations, as its superiority in forecasting short-term volume is well documented in the literature.

To explore this superiority, the optimal WNNM was compared with the widely used back-propagation neural network (BPNN). The comparison results show that WNNM produces lower values of MAE, MAPE, and VAPE than BPNN, suggesting that the optimal WNNM is a better model for short-term traffic volume prediction. Many research areas, such as the complex temporal structure of the predicted time series and the training method, have not yet been fully explored. It is reasonable to anticipate that, as research on WNNM advances, even better performance will be achieved.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This research was supported by a grant from the Major State Basic Research Development Program of China (973 Program) (Grant no. 2012CB725402) and a grant from the 2013 Scientific Research and Innovation Project for Postgraduates in Jiangsu Province (Grant no. CXZZ13_0121). The authors appreciate the support from Southeast University and the voluntary respondents.

References

  1. S. A. Zargari, S. Z. Siabil, A. H. Alavi, and A. H. Gandomi, “A computational intelligence-based approach for short-term traffic flow prediction,” Expert Systems, vol. 29, no. 2, pp. 124–142, 2012.
  2. E. I. Vlahogianni, J. C. Golias, and M. G. Karlaftis, “Short-term traffic forecasting: overview of objectives and methods,” Transport Reviews, vol. 24, no. 5, pp. 533–557, 2004.
  3. L. Li, W. H. Lin, and H. Liu, Type-2 Fuzzy Logic Approach for Short-Term Traffic Forecasting, 2006.
  4. Y. Xie, Y. Zhang, and Z. Ye, “Short-term traffic volume forecasting using Kalman filter with discrete wavelet decomposition,” Computer-Aided Civil and Infrastructure Engineering, vol. 22, no. 5, pp. 326–334, 2007.
  5. D. Ngoduy, “Applicable filtering framework for online multiclass freeway network estimation,” Physica A, vol. 387, no. 2-3, pp. 599–616, 2008.
  6. B. Ghosh, B. Basu, and M. O'Mahony, “Bayesian time-series model for short-term traffic flow forecasting,” Journal of Transportation Engineering, vol. 133, no. 3, pp. 180–189, 2007.
  7. A. Stathopoulos and M. G. Karlaftis, “A multivariate state space approach for urban traffic flow modeling and prediction,” Transportation Research C, vol. 11, no. 2, pp. 121–135, 2003.
  8. J.-N. Xue and Z.-K. Shi, “Short-time traffic flow prediction using chaos time series theory,” Journal of Transportation Systems Engineering and Information Technology, vol. 8, no. 5, pp. 68–72, 2008.
  9. X. Fei, C.-C. Lu, and K. Liu, “A Bayesian dynamic linear model approach for real-time short-term freeway travel time prediction,” Transportation Research C, vol. 19, no. 6, pp. 1306–1318, 2011.
  10. B. L. Smith, B. M. Williams, and R. Keith Oswald, “Comparison of parametric and nonparametric models for traffic flow forecasting,” Transportation Research C, vol. 10, no. 4, pp. 303–321, 2002.
  11. R. E. Turochy and B. D. Pierce, “Relating short-term traffic forecasting to current system state using nonparametric regression,” in Proceedings of the 7th International IEEE Conference on Intelligent Transportation Systems (ITSC '04), pp. 239–244, October 2004.
  12. S.-Y. Yun, S. Namkoong, J.-H. Rho, S.-W. Shin, and J.-U. Choi, “A performance evaluation of neural network models in traffic volume forecasting,” Mathematical and Computer Modelling, vol. 27, no. 9–11, pp. 293–310, 1998.
  13. B. Ghosh, B. Basu, and M. O'Mahony, “Random process model for urban traffic flow using a wavelet-Bayesian hierarchical technique,” Computer-Aided Civil and Infrastructure Engineering, vol. 25, no. 8, pp. 613–624, 2010.
  14. B. L. Smith and M. J. Demetsky, “Traffic flow forecasting: comparison of modeling approaches,” Journal of Transportation Engineering, vol. 123, no. 4, pp. 261–266, 1997.
  15. H. Adeli and S. Hung, Machine Learning: Neural Networks, Genetic Algorithms, and Fuzzy Systems, John Wiley & Sons, 1994.
  16. M. G. Karlaftis and E. I. Vlahogianni, “Statistical methods versus neural networks in transportation research: differences, similarities and some insights,” Transportation Research C, vol. 19, no. 3, pp. 387–399, 2011.
  17. S. Y. Chen and W. Wang, “Traffic volume forecasting based on wavelet transform and neural networks,” in Advances in Neural Networks—ISNN, vol. 3973, pp. 1–7, 2006.
  18. Y. Xie and Y. Zhang, “A wavelet network model for short-term traffic volume forecasting,” Journal of Intelligent Transportation Systems, vol. 10, no. 3, pp. 141–150, 2006.
  19. E. I. Vlahogianni, “Enhancing predictions in signalized arterials with information on short-term traffic flow dynamics,” Journal of Intelligent Transportation Systems, vol. 13, no. 2, pp. 73–84, 2009.
  20. B. Abdulhai, H. Porwal, and W. Recker, “Short-term traffic flow prediction using neuro-genetic algorithms,” ITS Journal, vol. 7, no. 1, pp. 3–41, 2002.
  21. R. K. Belew, J. McInerney, and N. N. Schraudolph, Evolving Networks: Using the Genetic Algorithm with Connectionist Learning, 1990.
  22. J. R. Koza and J. P. Rice, “Genetic generation of both the weights and architecture for a neural network,” in Proceedings of the International Joint Conference on Neural Networks (IJCNN '91), pp. 397–404, July 1991.
  23. J. Guo, B. M. Williams, and B. L. Smith, “Data collection time intervals for stochastic short-term traffic flow forecasting,” Journal of the Transportation Research Board, vol. 2024, no. 1, pp. 18–26, 2007.
  24. C. Oh, S. G. Ritchie, and J.-S. Oh, “Exploring the relationship between data aggregation and predictability to provide better predictive traffic information,” Journal of the Transportation Research Board, vol. 1935, no. 1, pp. 28–36, 2005.
  25. F. Qiao, X. Wang, and L. Yu, “Optimizing aggregation level for intelligent transportation system data based on wavelet decomposition,” Journal of the Transportation Research Board, vol. 1840, no. 1, pp. 10–20, 2003.
  26. F. Qiao, L. Yu, and X. Wang, “Double-sided determination of aggregation level for intelligent transportation system data,” Journal of the Transportation Research Board, vol. 1879, no. 1, pp. 80–88, 2004.
  27. D. Veitch, Wavelet neural networks and their application in the study of dynamical systems [MSc dissertation], Department of Mathematics, University of York, Heslington, UK, 2005.
  28. A. Saltelli, M. Ratto, T. Andres et al., Global Sensitivity Analysis: The Primer, Wiley-Interscience, 2008.