Discrete Dynamics in Nature and Society
Volume 2009, Article ID 329173, 16 pages
Research Article

Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks

1Applied Mathematics Department, Dalian University of Technology, Dalian 116024, China
2Department of Mathematics, Dalian Maritime University, Dalian 116026, China

Received 21 September 2008; Revised 5 January 2009; Accepted 31 January 2009

Academic Editor: Manuel de La Sen

Copyright © 2009 Huisheng Zhang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during the training iteration process and that the gradient of the error function tends to zero. Under a mild additional condition, the weight sequence itself is also proved to converge. A numerical example is given to support the theoretical analysis.
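To illustrate the setting, the following is a minimal NumPy sketch (not the authors' code) of one batch split-complex backpropagation step for a single complex-valued neuron. In the split-complex approach, a real activation function is applied separately to the real and imaginary parts of the net input, and the real-valued error E = (1/2) Σ |o − t|² is differentiated with respect to the real and imaginary parts of each weight. The activation choice (tanh), network size, and learning rate below are illustrative assumptions.

```python
import numpy as np

def split_tanh(z):
    # Split-complex activation: real tanh applied to the real and
    # imaginary parts of the complex net input separately.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def batch_scbp_step(w, X, T, eta):
    """One batch gradient step for a single complex-valued neuron.

    w   : (n,) complex weight vector
    X   : (m, n) complex input samples (one row per sample)
    T   : (m,) complex targets
    eta : constant learning rate

    Returns (updated weights, batch error E = 1/2 * sum |o - t|^2).
    """
    net = X @ w                      # complex net inputs, shape (m,)
    o = split_tanh(net)              # split-complex outputs
    e = o - T
    E = 0.5 * np.sum(np.abs(e) ** 2)
    # dE/d(Re net) and dE/d(Im net), packed as one complex vector:
    d = e.real * (1 - np.tanh(net.real) ** 2) \
        + 1j * e.imag * (1 - np.tanh(net.imag) ** 2)
    # Batch gradient in complex form: for each weight,
    # dE/dwR + i*dE/dwI = sum_m d_m * conj(x_m).
    grad = np.conj(X).T @ d
    return w - eta * grad, E
```

With a sufficiently small constant learning rate, the batch error computed by repeated calls to `batch_scbp_step` decreases monotonically, which is the behavior the paper's convergence analysis establishes.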