Advances in Artificial Neural Systems
Volume 2009, Article ID 846040, 11 pages
http://dx.doi.org/10.1155/2009/846040
Research Article

Building Recurrent Neural Networks to Implement Multiple Attractor Dynamics Using the Gradient Descent Method

Jun Namikawa and Jun Tani

Brain Science Institute, RIKEN, 2-1 Hirosawa, Wako City, Saitama 351-0198, Japan

Received 31 March 2008; Accepted 22 August 2008

Academic Editor: Akira Imada

Copyright © 2009 Jun Namikawa and Jun Tani. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The present paper proposes a recurrent neural network model and learning algorithm that can acquire the ability to generate multiple desired sequences. The network model is a dynamical system whose transition function is a contraction mapping, and the learning algorithm is based on the gradient descent method. We present numerical simulations in which a recurrent neural network acquires a multiple periodic attractor consisting of five Lissajous curves, or of a Van der Pol oscillator with twelve different parameter settings. The analysis clarifies that the model contains many stable regions acting as attractors, and that multiple time series can be embedded into these regions by the proposed learning method.
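To make the general idea concrete, the following is a minimal, hypothetical sketch, not the authors' exact model or parameter settings: a leaky-integrator recurrent network whose state update stays contractive for small recurrent weights and a leak rate below one, conditioned on a fixed pattern input that selects which sequence to generate, and trained by plain gradient descent (backpropagation through time) to reproduce several target sequences. All dimensions, the learning rate, and the circle-shaped targets are illustrative assumptions.

# Hypothetical sketch: embedding multiple sequences into a contractive RNN by gradient descent.
# Not the paper's exact architecture; dimensions, targets, and hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_x, n_out, n_pat = 20, 2, 2   # hidden units, output units, pattern-input units (assumed)
alpha = 0.3                    # leak rate < 1; with small recurrent weights the update is contractive
T = 100                        # sequence length

# Two target sequences (circles of different frequency, stand-ins for Lissajous curves)
t = np.arange(T)
targets = np.stack([
    np.stack([np.sin(2 * np.pi * t / 50), np.cos(2 * np.pi * t / 50)], axis=1),
    np.stack([np.sin(2 * np.pi * t / 25), np.cos(2 * np.pi * t / 25)], axis=1),
])                             # shape (n_pat, T, n_out)
patterns = np.eye(n_pat)       # one-hot pattern input selecting which sequence to generate

# Parameters, initialized small to keep the map contractive at the start of learning
W  = rng.normal(0, 0.1, (n_x, n_x))
Wp = rng.normal(0, 0.1, (n_x, n_pat))
Wo = rng.normal(0, 0.1, (n_out, n_x))
b  = np.zeros(n_x)

lr = 0.1
for epoch in range(2000):
    grads = {k: 0.0 for k in "W Wp Wo b".split()}
    loss = 0.0
    for p in range(n_pat):
        x = np.zeros(n_x)
        xs, errs = [], []
        # Forward pass: leaky (contractive) state update conditioned on the pattern input
        for step in range(T):
            pre = W @ x + Wp @ patterns[p] + b
            x = (1 - alpha) * x + alpha * np.tanh(pre)
            y = Wo @ x
            e = y - targets[p, step]
            loss += 0.5 * np.sum(e ** 2)
            xs.append((x.copy(), pre.copy()))
            errs.append(e)
        # Backward pass (backpropagation through time) for this sequence
        dx_next = np.zeros(n_x)
        for step in reversed(range(T)):
            x_t, pre_t = xs[step]
            x_prev = xs[step - 1][0] if step > 0 else np.zeros(n_x)
            dy = errs[step]
            dx = Wo.T @ dy + dx_next                      # dL/dx_t
            dpre = dx * alpha * (1 - np.tanh(pre_t) ** 2) # dL/dpre_t
            grads["Wo"] += np.outer(dy, x_t)
            grads["W"]  += np.outer(dpre, x_prev)
            grads["Wp"] += np.outer(dpre, patterns[p])
            grads["b"]  += dpre
            dx_next = W.T @ dpre + (1 - alpha) * dx       # dL/dx_{t-1} via x_t
    # Average gradients over sequences and time steps, then take a gradient descent step
    for k in grads:
        grads[k] /= (n_pat * T)
    W  -= lr * grads["W"]
    Wp -= lr * grads["Wp"]
    Wo -= lr * grads["Wo"]
    b  -= lr * grads["b"]
    if epoch % 500 == 0:
        print(f"epoch {epoch}: mean loss {loss / (n_pat * T):.4f}")

After training, clamping the pattern input to a fixed one-hot vector and iterating the state update from an arbitrary initial state lets the network settle into the corresponding periodic orbit, which is the sense in which multiple attractors are embedded into distinct stable regions of the dynamics.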