Mathematical Problems in Engineering
Volume 2014, Article ID 787102, 2 pages

Dynamics of Neural Networks and Applications in Optimization

1Department of Applied Mathematics, Yanshan University, Qinhuangdao 066001, China
2Department of Mathematics, Harbin Institute of Technology, Harbin 150001, China
3Department of Basic Science, Shijiazhuang Mechanical Engineering College, Shijiazhuang 050003, China
4Ramanujan Centre for Higher Mathematics, Alagappa University, Karaikudi, Tamil Nadu 630 004, India

Received 15 May 2014; Accepted 15 May 2014; Published 1 June 2014

Copyright © 2014 Huaiqin Wu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

In recent years, neural networks have been the subject of intense research activity owing to their wide applications in areas such as image processing, pattern recognition, associative memory, and combinatorial optimization. In practical engineering applications, it is crucial to characterize the dynamical properties of neural networks completely via mathematical methods; hence, the mathematical analysis of the nonlinear dynamics of neural networks continues to pose new challenges for researchers in this area. In this special issue, global exponential stability is considered for the almost periodic solution of neural networks with mixed time-varying delays and discontinuous neuron activations, and sufficient conditions for the existence, uniqueness, and global exponential stability of the almost periodic solution are obtained in terms of certain linear matrix inequalities. A new asymptotic stability theorem and two corollaries are presented for unified recurrent neural networks; in addition, the exponential state estimation problem is discussed for a class of discrete-time fuzzy cellular neural networks with mixed time delays.
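To give a flavor of such stability conditions, the following minimal sketch checks a classical sufficient criterion for global exponential stability of a Hopfield-type network x'(t) = -Dx(t) + Wf(x(t)) + u, where D is a positive diagonal matrix and each activation f_i is Lipschitz with constant l: if l·‖W‖₂ < minᵢ dᵢ, the equilibrium is globally exponentially stable. This is a textbook norm condition, simpler than (and not the same as) the delay-dependent LMI conditions derived in this issue; the matrices below are illustrative choices.

```python
import numpy as np

# Self-inhibition rates d_i of the network x' = -D x + W f(x) + u
d = np.array([2.0, 3.0, 2.5])
# Connection weight matrix (illustrative values)
W = np.array([[ 0.3, -0.4,  0.2],
              [ 0.1,  0.2, -0.3],
              [-0.2,  0.5,  0.1]])
l = 1.0  # Lipschitz constant of the activations (e.g., tanh)

# Sufficient condition for global exponential stability:
#   l * ||W||_2 < min_i d_i
margin = d.min() - l * np.linalg.norm(W, 2)
print(margin > 0)  # True: the criterion holds for this network
```

When the margin is positive, every trajectory converges exponentially to the unique equilibrium regardless of the input u, which is the kind of guarantee the LMI-based results in this issue extend to delayed and discontinuous settings.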

The neural network approach has become an important means of providing real-time solutions to optimization problems, especially large-scale ones. Compared with classical optimization approaches, the prominent advantage of neural computing is that it can converge to the optimal solution rapidly, and this advantage motivates researchers to propose efficient algorithms based on the neural network approach for nonlinear programming problems. It should be noted that nonsmooth optimization plays an important role in many engineering applications. For example, in manipulator control or signal processing, the norm of the decision variable usually needs to be optimized, which results in a nonsmooth objective function [1–4]. In this special issue, based on a smoothing approximation technique and the projected gradient method, a neural network is constructed to solve a class of nonsmooth sparse reconstruction optimization problems.
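The smoothing idea can be illustrated with a minimal sketch: the nonsmooth l1 term in a sparse reconstruction objective ‖Ax − b‖² + λ‖x‖₁ is replaced by the smooth surrogate Σᵢ √(xᵢ² + μ²), and the smoothed objective is driven down by plain gradient descent (an Euler discretization of the gradient flow). All data and parameters here are illustrative choices, not taken from the paper in this issue.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 12                              # signal length, number of measurements
x_true = np.zeros(n)
x_true[[2, 7, 13]] = [1.0, -0.5, 2.0]      # a sparse signal to reconstruct
A = rng.standard_normal((m, n))
b = A @ x_true

lam, mu, step = 0.05, 1e-3, 2e-3           # l1 weight, smoothing parameter, Euler step

def smoothed_objective(x):
    # ||A x - b||^2 with the l1 norm smoothed as sum_i sqrt(x_i^2 + mu^2)
    return np.sum((A @ x - b) ** 2) + lam * np.sum(np.sqrt(x**2 + mu**2))

x = np.zeros(n)
obj0 = smoothed_objective(x)
for _ in range(5000):
    grad_fit = 2 * A.T @ (A @ x - b)               # gradient of the data-fit term
    grad_l1 = lam * x / np.sqrt(x**2 + mu**2)      # gradient of the smoothed l1 term
    x -= step * (grad_fit + grad_l1)
obj = smoothed_objective(x)
print(obj < obj0)  # True: the smoothed objective decreases along the trajectory
```

As the smoothing parameter μ shrinks, the surrogate approaches the true l1 norm, which is the mechanism the smoothing-approximation network in this issue exploits; the projection step for constrained problems is omitted here for brevity.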

Most neural network approaches to optimization focus on convex optimization; as a result, nonconvex optimization is rarely investigated. Many nonlinear programming problems can, however, be formulated as nonconvex optimization problems, and among these, pseudoconvex programming problems are found to be more prevalent than other classes. Pseudoconvex optimization has many applications in practice, such as fractional programming, computer vision, and production planning. Recently, neural networks for pseudoconvex optimization have been studied extensively, and many good results have been obtained in the literature [5–9]. In this special issue, a one-layer recurrent neural network is developed to solve pseudoconvex optimization problems with box constraints. Compared with existing neural networks for pseudoconvex optimization, the proposed network has a wider domain of implementation.
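A common template for such networks is the projected dynamics dx/dt = −x + P_Ω(x − α∇f(x)), where P_Ω clips the state onto the box Ω; its equilibria are the KKT points, which for a pseudoconvex f are global minimizers. The sketch below simulates these dynamics with explicit Euler steps for a linear fractional objective, which is pseudoconvex on the box because its denominator stays positive there. The objective, box, and parameters are illustrative choices, not taken from the paper in this issue.

```python
import numpy as np

# Pseudoconvex objective: f(x) = (a.x + b) / (c.x + d), a linear fractional
# function; the denominator c.x + d is positive on the box [0, 2]^2.
a, b = np.array([1.0, 2.0]), 1.0
c, d = np.array([1.0, 1.0]), 4.0
lo, hi = 0.0, 2.0

def f(x):
    return (a @ x + b) / (c @ x + d)

def grad_f(x):
    den = c @ x + d
    return (a * den - c * (a @ x + b)) / den**2

def project(x):
    # P_Omega: componentwise clipping onto the box [lo, hi]^n
    return np.clip(x, lo, hi)

alpha, h = 1.0, 0.1                  # projection gain and Euler step size
x = np.array([1.0, 1.0])             # initial state in the interior of the box
for _ in range(500):
    x = x + h * (-x + project(x - alpha * grad_f(x)))

print(x, f(x))  # converges to the vertex (0, 0), where f attains its minimum 0.25
```

Since the gradient of this f is strictly positive everywhere on the box, the trajectory is driven to the vertex (0, 0), where f = 1/4 is the global minimum over Ω; for a pseudoconvex objective, any equilibrium of these dynamics enjoys the same global optimality.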

This special issue also introduces other applications of neural networks. For example, a hybrid partial least squares backpropagation neural network is presented to predict alpha radioactivity inside decommissioned pipes during the disassembly of nuclear facilities; a novel adaptive linear and normalized combination method is given to effectively fuse component neural networks with sample-varying normalized weights; an intelligent model based on the backpropagation algorithm of artificial neural networks is established to predict the maximum lateral displacement of a composite soil-nailed wall; and a dynamic fuzzy neural network is adopted to model the PVC stripping process from actual process operation data, with the process decoupled by a distributed neural network decoupling module into two single-input single-output subsystems.

This special issue also includes some applications outside the neural network area. For example, the Milk-run scheme is introduced into express distribution logistics through a feasibility analysis of the cyclic goods-collection scheme in the express industry, effectively shortening distances and lowering costs by means of reasonable route planning; a support vector regression model for wind speed prediction is proposed based on standard support vector regression, and a method of estimating the wind speed probability distribution is given using Bernoulli's law of large numbers; a new improved extreme learning machine algorithm is proposed to handle the multicollinearity problem arising in the computation of the extreme learning machine and is applied to bearing fault detection using stator current monitoring; and a gas-liquid phase flow velocity measurement method is presented to demodulate the oil-gas-water three-phase flow signal based on an energy demodulation algorithm combined with time-delay estimation.

It is certainly impossible to provide in this short editorial a comprehensive description of all articles in this special issue. However, we sincerely hope that our efforts in compiling these articles can enrich our readers and inspire researchers with regard to the seemingly common but actually important topic of the dynamics of neural networks and their applications in optimization.


We would like to thank the authors who submitted papers for consideration and the reviewers, whose comments were essential to our decisions. All the participants have made possible a very stimulating interchange of ideas. Many thanks are also due to the editorial board members of this journal for their great support of and help with this special issue.

Huaiqin Wu
Wei Bian
Qintao Gan
Ramachandran Raja


  1. Q. Liu and J. Wang, “A one-layer projection neural network for nonsmooth optimization subject to linear equalities and bound constraints,” IEEE Transactions on Neural Networks and Learning Systems, no. 5, pp. 812–824, 2013.
  2. Q. Liu and J. Wang, “A one-layer recurrent neural network for constrained nonsmooth optimization,” IEEE Transactions on Systems, Man, and Cybernetics B: Cybernetics, vol. 41, no. 5, pp. 1323–1333, 2011.
  3. Q. Li and Y. Liu, “Neural network for nonsmooth pseudoconvex optimization with general constraints,” vol. 131, pp. 336–347, 2014.
  4. S. Qin, W. Bian, and X. Xue, “A new one-layer recurrent neural network for nonsmooth pseudoconvex optimization,” Neurocomputing, vol. 120, pp. 655–662, 2013.
  5. W. Bian and X. Xue, “Neural network for solving constrained convex optimization problems with global attractivity,” IEEE Transactions on Circuits and Systems, vol. 60, no. 3, pp. 710–723, 2013.
  6. Q. Liu, C. Dang, and T. Huang, “A one-layer recurrent neural network for real-time portfolio optimization with probability criterion,” IEEE Transactions on Systems, Man, and Cybernetics B: Cybernetics, vol. 43, no. 1, pp. 14–23, 2013.
  7. A. Hosseini, J. Wang, and S. M. Hosseini, “A recurrent neural network for solving a class of generalized convex optimization problems,” Neural Networks, vol. 44, pp. 78–86, 2013.
  8. Q. Liu, Z. Guo, and J. Wang, “A one-layer recurrent neural network for constrained pseudoconvex optimization and its application for dynamic portfolio optimization,” Neural Networks, vol. 26, pp. 99–109, 2012.
  9. S. Qin, D. Fan, P. Su, and Q. Liu, “A simplified recurrent neural network for pseudoconvex optimization subject to linear equality constraints,” Communications in Nonlinear Science and Numerical Simulation, vol. 19, no. 4, pp. 789–798, 2014.