Mathematical Problems in Engineering
Volume 2015 (2015), Article ID 350102, 9 pages
http://dx.doi.org/10.1155/2015/350102
Research Article

Convergence Analysis of Contrastive Divergence Algorithm Based on Gradient Method with Errors
Xuesi Ma and Xiaojie Wang

1Center for Intelligence Science and Technology, Beijing University of Posts and Telecommunications, Beijing 100876, China
2School of Mathematics and Information Science, Henan Polytechnic University, Jiaozuo, Henan 454000, China

Received 12 May 2015; Accepted 1 July 2015

Academic Editor: Julien Bruchon

Copyright © 2015 Xuesi Ma and Xiaojie Wang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Contrastive Divergence has become a common way to train Restricted Boltzmann Machines; however, its convergence is still not well understood. This paper studies the convergence of the Contrastive Divergence algorithm. We relate the Contrastive Divergence algorithm to the gradient method with errors and derive convergence conditions for Contrastive Divergence using the convergence theorem of the gradient method with errors. We give specific convergence conditions of Contrastive Divergence learning for Restricted Boltzmann Machines in which both the visible units and the hidden units can take only a finite number of values. Two new convergence conditions are obtained by specifying the learning rate. Finally, we give specific conditions that the number of Gibbs sampling steps must satisfy in order to guarantee convergence of the Contrastive Divergence algorithm.
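For readers unfamiliar with the algorithm under analysis, the following is a minimal sketch of a single CD-k parameter update for a binary RBM, written in Python with NumPy. The function name cd_k_step, the single-sample update, and the fixed learning rate lr are illustrative assumptions, not the paper's exact formulation; the paper treats RBMs whose units take finitely many values and derives conditions on the learning rate and on the Gibbs step number k under which such updates converge.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd_k_step(v0, W, b, c, k=1, lr=0.01):
    """One CD-k update for a binary RBM with weight matrix W,
    visible bias b, and hidden bias c (all names illustrative)."""
    # Positive phase: hidden unit probabilities given the data vector v0.
    ph0 = sigmoid(v0 @ W + c)
    # k steps of block Gibbs sampling, starting from the data.
    v, ph = v0, ph0
    for _ in range(k):
        h = (rng.random(ph.shape) < ph).astype(float)   # sample hidden units
        pv = sigmoid(h @ W.T + b)
        v = (rng.random(pv.shape) < pv).astype(float)   # sample visible units
        ph = sigmoid(v @ W + c)
    # CD-k gradient estimate: data statistics minus k-step sample statistics.
    dW = np.outer(v0, ph0) - np.outer(v, ph)
    db = v0 - v
    dc = ph0 - ph
    return W + lr * dW, b + lr * db, c + lr * dc
```

Because the k-step Gibbs chain only approximates the model distribution, dW, db, and dc are gradient estimates with errors, which is what motivates analyzing CD through the convergence theory of the gradient method with errors.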