Research Article

Semisupervised Deep Embedded Clustering with Adaptive Labels

Algorithm 1

Deep semisupervised clustering with adaptive labels.
Input: the training dataset, the number of clusters k, the maximum number of iterations maxiter, and the training threshold.
Output: the cluster assignment Q, the cluster centroids, and the nonlinear mapping.
Begin
Pretraining stage:
Construct the deep code network.
Initialize the network parameters from a normal distribution.
Train each layer of the deep code network with the denoising autoencoder strategy.
Stack the pretrained layers and fine-tune the network parameters end-to-end.
Map the raw data into the latent space with the pretrained deep code network to obtain the latent features.
Apply k-means to the latent features to initialize the cluster centroids.
Clustering stage with adaptive labels:
Use equations (7) and (8) to compute the cluster assignment Q and the target assignment P.
Compute the predicted cluster label of each sample from Q.
Use equation (10) to construct the label list.
Dynamically rectify the labels with the adaptive label algorithm.
Compute the loss based on equation (11).
Update the network parameters and the cluster centroids.
End
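The pretraining stage of Algorithm 1 follows the usual stacked denoising autoencoder recipe from deep embedded clustering. The PyTorch sketch below is only an illustration of that stage, not the paper's implementation: the function name pretrain, the layer sizes (784, 500, 500, 2000, 10), the noise level, and the optimizer settings are assumed placeholders.

# Minimal sketch of the pretraining stage: greedy layer-wise denoising
# autoencoder training, end-to-end fine-tuning, and k-means initialization
# of the cluster centroids. Layer sizes, noise level, and optimizer settings
# are illustrative placeholders only.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

def pretrain(data, dims=(784, 500, 500, 2000, 10), n_clusters=10,
             noise_std=0.2, epochs=50, lr=1e-3):
    # Build symmetric encoder/decoder stacks, one Linear layer per step of dims.
    enc = [nn.Linear(dims[i], dims[i + 1]) for i in range(len(dims) - 1)]
    dec = [nn.Linear(dims[i + 1], dims[i]) for i in range(len(dims) - 1)]

    # Layer-wise denoising pretraining: each layer reconstructs its clean
    # input from a noise-corrupted copy.
    x = data
    for e, d in zip(enc, dec):
        opt = torch.optim.Adam(list(e.parameters()) + list(d.parameters()), lr=lr)
        for _ in range(epochs):
            noisy = x + noise_std * torch.randn_like(x)
            loss = nn.functional.mse_loss(d(torch.relu(e(noisy))), x)
            opt.zero_grad(); loss.backward(); opt.step()
        x = torch.relu(e(x)).detach()  # input for the next layer

    # Stack the pretrained layers and fine-tune the autoencoder end-to-end.
    encoder = nn.Sequential(*sum([[l, nn.ReLU()] for l in enc[:-1]], []), enc[-1])
    decoder = nn.Sequential(*sum([[l, nn.ReLU()] for l in reversed(dec[1:])], []), dec[0])
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=lr)
    for _ in range(epochs):
        loss = nn.functional.mse_loss(decoder(encoder(data)), data)
        opt.zero_grad(); loss.backward(); opt.step()

    # Map the raw data into the latent space and initialize centroids with k-means.
    with torch.no_grad():
        z = encoder(data)
    km = KMeans(n_clusters=n_clusters, n_init=20).fit(z.numpy())
    centroids = torch.tensor(km.cluster_centers_, dtype=torch.float32)
    return encoder, centroids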
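Equations (7) and (8) are not reproduced in this excerpt; the sketch below assumes the standard deep embedded clustering (DEC) formulation, in which q_ij is a Student's t kernel between the embedded point z_i and centroid mu_j, and p_ij is the sharpened target obtained by squaring and renormalizing q_ij. The helper names soft_assignment, target_distribution, and clustering_loss are ours, and alpha = 1 is the conventional default.

# Soft assignment Q (Student's t kernel) and sharpened target P, written in
# the standard DEC form; they may differ in detail from equations (7) and (8).
import torch

def soft_assignment(z, centroids, alpha=1.0):
    # q_ij proportional to (1 + ||z_i - mu_j||^2 / alpha)^(-(alpha + 1) / 2),
    # normalized over the clusters j.
    dist2 = torch.cdist(z, centroids).pow(2)
    q = (1.0 + dist2 / alpha).pow(-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def target_distribution(q):
    # p_ij proportional to q_ij^2 / f_j with cluster frequency f_j = sum_i q_ij,
    # normalized over the clusters j.
    weight = q.pow(2) / q.sum(dim=0, keepdim=True)
    return weight / weight.sum(dim=1, keepdim=True)

def clustering_loss(q, p, eps=1e-10):
    # KL(P || Q): the self-training objective that pulls Q toward the sharper P.
    return torch.sum(p * (torch.log(p + eps) - torch.log(q + eps)))

In DEC-style training, the target P is usually recomputed only every few iterations and detached from the computation graph, so that only Q is differentiated through the encoder and the centroids.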
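Equations (10) and (11) and the adaptive label algorithm are likewise not shown in this excerpt. Purely as an illustration of how a rectified label list can enter the objective, the sketch below adds a weighted cross-entropy term over the labeled samples to the clustering loss; the weight lam, the masking convention (label -1 for unlabeled samples), and the function name semisupervised_loss are assumptions, not the paper's equation (11).

# Illustrative combination of the clustering objective with a supervised term
# over the rectified label list. This is one plausible pattern only; the exact
# form of equations (10) and (11) is not reproduced here.
import torch

def semisupervised_loss(q, p, labels, lam=0.1, eps=1e-10):
    # q, p: (n, k) soft and target assignments; labels: (n,) long tensor of
    # rectified cluster labels, with -1 marking samples that carry no label.
    kl = torch.sum(p * (torch.log(p + eps) - torch.log(q + eps)))
    mask = labels >= 0
    if mask.any():
        ce = torch.nn.functional.nll_loss(torch.log(q[mask] + eps), labels[mask])
    else:
        ce = torch.zeros((), device=q.device)
    return kl + lam * ce  # lam balances the supervised term (an assumed weight)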
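Finally, one possible training loop for the clustering stage, reusing the helpers sketched above; the commented rectify_labels call is a placeholder for the paper's adaptive label algorithm, and the learning rate, update interval, and choice of Adam are assumptions.

# A possible training loop for the clustering stage, reusing the helper
# sketches above (soft_assignment, target_distribution, semisupervised_loss).
import torch

def cluster_train(encoder, centroids, data, labels, maxiter=1000,
                  update_interval=100, lr=1e-4):
    centroids = torch.nn.Parameter(centroids.clone())  # centroids learned jointly
    opt = torch.optim.Adam(list(encoder.parameters()) + [centroids], lr=lr)
    for it in range(maxiter):
        q = soft_assignment(encoder(data), centroids)
        if it % update_interval == 0:
            p = target_distribution(q).detach()   # refresh and freeze the target
            # labels = rectify_labels(q, labels)  # placeholder for the adaptive
            #                                     # label algorithm of the paper
        loss = semisupervised_loss(q, p, labels)
        opt.zero_grad(); loss.backward(); opt.step()
        # The training threshold from Algorithm 1 could be checked here to stop
        # early once the assignments stabilize.
    return q.detach(), centroids.detach()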