Abstract

Quantum particle swarm optimization (QPSO) is a population-based optimization algorithm that is inspired by the social behavior of bird flocking and combines ideas from quantum computing. For many optimization problems, the traditional QPSO algorithm can produce high-quality solutions within a reasonable computation time and with relatively stable convergence characteristics. However, QPSO also shows some unsatisfactory behavior in practical applications, such as premature convergence and a poor ability in global optimization. To address these problems, an improved quantum particle swarm optimization algorithm is proposed and implemented in this paper. This paper makes three main contributions. Firstly, an improved QPSO algorithm is introduced that enhances the decision making ability of the model. Secondly, we introduce the synergetic neural network model to mangrove classification for the first time, which can better handle the fuzzy matching of remote sensing images. Finally, the improved QPSO algorithm is used to optimize the network parameters. Experiments on mangrove classification show that the improved algorithm has a more powerful global exploration ability and a faster convergence speed.

1. Introduction

Quantum particle swarm optimization (QPSO) is an evolutionary algorithm proposed in 2005 [1, 2]. Researchers found that the human learning process has great uncertainty, which is very similar to the quantum behaviour of a particle, so each individual can be described as a particle in a quantum space. In recent years, a series of papers have focused on applications of QPSO, such as financial forecasting [3], sensor arrays [4], clinical disease diagnosis [5], classification and clustering [6], fuel management optimization [7], feature selection [8], and other areas [9, 10]. However, traditional QPSO easily falls into local optima and its convergence rate is slow. How to avoid premature convergence while maintaining a fast convergence rate is a major problem. In order to improve the performance of QPSO, researchers have made some attempts. Sun et al. [11] reported a global search strategy for quantum-behaved particle swarm optimization. In [12], a chaotic mutation operator is introduced into quantum particle swarm optimization in place of the random sequences of QPSO; the chaotic mutation operator is a powerful strategy for diversifying the QPSO population and can help prevent premature convergence to local minima. In [13], an improved quantum particle swarm optimization algorithm based on a real coding method is presented, which improves the performance of QPSO.

Mangrove ecosystems, as unique ecosystems at the interface of land and sea, play an irreplaceable role in the stabilization and equilibrium of coastal estuaries [14–17]. Grasping the status of mangrove communities in a timely and accurate manner can provide important information for the protection and restoration of mangrove ecosystems. However, the spatial resolution of TM images is not high and the spectral similarity between mangrove communities is strong [18–20]. Because of these problems, it is necessary to adopt a more intelligent approach to improve the accuracy of mangrove classification.

We use the synergetic neural network (SNN) proposed by Haken [21] to realize intelligent mangrove classification. The basic principle of the synergetic neural network is that the pattern recognition procedure can be viewed as a competition process among many order parameters. One advantage of the synergetic neural network method is its robustness against noise and occlusion; using this method we are able to better handle the fuzzy matching of mangrove classification when the contextual information is incomplete. The synergetic pattern recognition method has been successfully used in face recognition [22], automatic control [23, 24], and exon recognition [25]. Mangrove classification can also be considered a pattern recognition problem, so it is entirely possible to use this method to solve it. At present, the mainstream studies of SNN focus on the selection of the prototype pattern vectors [26], the setting of the attention parameters [27–29], the reconstruction algorithm of the order parameters [30], and so on. The network parameters directly influence the synergetic recognition performance. The adjustment of the network parameters is a global behavior, and at present there is no general theory to control the parameters during the recognition process. In this paper, we propose an improved algorithm which can effectively choose the network parameters of the SNN.

This paper is organized as follows. An improved QPSO algorithm is presented in Section 2. In Section 3, the mangrove classification model based on the synergetic neural network is introduced, together with the improved SNN model whose parameters are optimized by the improved QPSO. Experimental results are reported in Section 4, and conclusions are given in Section 5.

2. Improved QPSO Based on Adaptive Behavior Selection (AQPSO)

In this section, we present an improved QPSO. First, a diversity function is employed as a feature function. Second, an adaptive movement behavior is introduced into quantum particle swarm optimization for the first time.

2.1. Diversity Function

The diversity function is used to describe the dispersion degree of the particles. The diversity function d(t) describes the adaptive diversity, where t refers to the current iteration number. Consider

d(t) = |f_avg(t) - f_gbest| / f_avg(t),  (1)

where f_avg(t) denotes the average fitness value of the t-th iteration and f_gbest is the current global optimal fitness. A small d(t) means the diversity is poor, and a large d(t) means the diversity is good.
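As an illustration, the following short Python snippet computes a fitness-based diversity measure of the form in (1) for a whole swarm; the function name, the array layout, and the small epsilon guard are our own assumptions, not part of the original algorithm description.

```python
import numpy as np

def diversity(fitness_values, f_gbest):
    """One plausible fitness-based diversity measure d(t), see eq. (1).

    fitness_values: 1-D array with the current fitness of every particle.
    f_gbest:        fitness of the global best position found so far.
    Returns a value near 0 when the swarm has collapsed onto the best
    solution (poor diversity) and a larger value otherwise.
    """
    f_avg = np.mean(fitness_values)
    return abs(f_avg - f_gbest) / (abs(f_avg) + 1e-12)
```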

2.2. Adaptive Movement Behavior

From the QPSO algorithm, we know that the local attractor p_i is determined by the local optimum pbest_i and the global optimum gbest:

p_i = φ pbest_i + (1 - φ) gbest,   φ = c1 r1 / (c1 r1 + c2 r2),  (2)

where c1 and c2 are the acceleration factors and r1, r2 are random numbers uniformly distributed on [0, 1].

In the standard QPSO algorithm, c1 and c2 are employed as two acceleration factors, which not only affect the convergence speed but may also lead to premature convergence. In the early iterations, the current positions of the particles are far away from the optimal location; at a later stage, the current positions and the optimal location are closer. To effectively avoid premature convergence, the diversity function d(t) can be used to describe how close the particles are to the global optimal position, so that the corresponding acceleration factors can be selected adaptively:

(c1, c2) = (c1', c2') if d(t) < δ(t), and (c1, c2) = (c1'', c2'') otherwise,  (3)

where (c1', c2') is a pair of values that favors global exploration, (c1'', c2'') is a pair that favors local exploitation, and δ(t) is a threshold that is gradually reduced during the iterations.
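To make the adaptive selection concrete, the sketch below computes the local attractor of (2) for one particle and switches between two candidate pairs of acceleration factors according to the diversity value, in the spirit of (3). The threshold and the candidate pairs are illustrative assumptions rather than the settings used in the paper.

```python
import numpy as np

def local_attractor(pbest_i, gbest, d_t, threshold=0.1,
                    explore=(2.5, 1.5), exploit=(1.5, 2.5)):
    """Adaptive local attractor p_i of one particle (sketch).

    When the diversity d_t is low, the 'explore' pair (larger c1) pulls the
    particle back towards its own experience to diversify the swarm;
    otherwise the 'exploit' pair is used.  Pairs and threshold are assumed.
    """
    c1, c2 = explore if d_t < threshold else exploit
    r1, r2 = np.random.rand(), np.random.rand()
    phi = (c1 * r1) / (c1 * r1 + c2 * r2 + 1e-12)
    return phi * np.asarray(pbest_i) + (1.0 - phi) * np.asarray(gbest)
```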

2.3. The Improved Algorithm

The improved QPSO algorithm is shown as follows; a runnable sketch is given after the list.

(1) Initialize the variables.
(2) Generate the initial particle swarm.
(3) Update the local optimum positions pbest and the global best position gbest.
(4) Each particle updates its location through a different behavior.
(5) Perform the adaptive movement behavior according to (3).
(6) Gradually reduce the threshold, which leads to a decreasing dispersion of the particles.
(7) Record the optimal value.
(8) If the termination condition is met, output the optimal value; otherwise return to Step (3).
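The listing below is a compact, self-contained sketch of the eight steps above for a generic minimization problem, written in Python with NumPy. The contraction-expansion coefficient beta, the candidate acceleration pairs, the threshold schedule, and all default values are assumptions for illustration, not the paper's actual settings.

```python
import numpy as np

def aqpso(objective, dim, n_particles=30, n_iter=200,
          lower=-10.0, upper=10.0, threshold0=0.1, seed=0):
    """Sketch of the improved QPSO (AQPSO) loop described above."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, size=(n_particles, dim))       # step (2)
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    g = np.argmin(pbest_f)
    gbest, gbest_f = pbest[g].copy(), pbest_f[g]

    for t in range(n_iter):
        threshold = threshold0 * (1.0 - t / n_iter)               # step (6)
        f = np.array([objective(p) for p in x])
        d_t = abs(f.mean() - gbest_f) / (abs(f.mean()) + 1e-12)   # eq. (1)

        better = f < pbest_f                                      # step (3)
        pbest[better], pbest_f[better] = x[better], f[better]
        g = np.argmin(pbest_f)
        if pbest_f[g] < gbest_f:
            gbest, gbest_f = pbest[g].copy(), pbest_f[g]

        # steps (4)-(5): adaptive movement behaviour (assumed candidate pairs)
        c1, c2 = (2.5, 1.5) if d_t < threshold else (1.5, 2.5)
        beta = 1.0 - 0.5 * t / n_iter        # contraction-expansion coefficient
        mbest = pbest.mean(axis=0)           # mean best position
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        phi = c1 * r1 / (c1 * r1 + c2 * r2 + 1e-12)
        p = phi * pbest + (1.0 - phi) * gbest           # local attractor, eq. (2)
        u = np.maximum(rng.random((n_particles, dim)), 1e-12)
        sign = np.where(rng.random((n_particles, dim)) < 0.5, 1.0, -1.0)
        x = np.clip(p + sign * beta * np.abs(mbest - x) * np.log(1.0 / u),
                    lower, upper)
    return gbest, gbest_f                                         # steps (7)-(8)
```

For example, `aqpso(lambda v: float(np.sum(v ** 2)), dim=5)` minimizes a simple sphere function; in Section 3.2 the objective is replaced by the classification error of the SNN.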

3. An Improved SNN Model Based on AQPSO

3.1. Mangroves Classification Based on SNN Model

The SNN model is a top-down network constructed from synergetics, different from traditional neural networks, and it does not produce pseudostates. An unrecognized pattern q is pulled by a dynamic process, through the status q(t), into one of the prototype pattern vectors v_k, namely, the prototype pattern that is closest to q. The process can be described as

q(0) = q  →  q(t)  →  v_k.

A dynamic equation can be given for the unrecognized pattern:

dq/dt = ∑_k λ_k (v_k^+ q) v_k - B ∑_{k'≠k} (v_{k'}^+ q)^2 (v_k^+ q) v_k - C (q^+ q) q,  (4)

where q is the status vector of the input pattern with initial value q(0), λ_k is the attention parameter, v_k is the prototype pattern vector, and v_k^+ is the adjoint vector of v_k.

The corresponding dynamic equation of the order parameters ξ_k is

dξ_k/dt = λ_k ξ_k - B ∑_{k'≠k} ξ_{k'}^2 ξ_k - C (∑_{k'} ξ_{k'}^2) ξ_k,  (5)

where ξ_k satisfies the initial condition

ξ_k(0) = v_k^+ q(0).  (6)
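Because the order-parameter dynamics (5)-(6) drive the whole classification step, a simple explicit Euler discretization is sketched below. The step size gamma, the default values of B and C, and the stopping tolerance are assumptions; the paper does not specify them here.

```python
import numpy as np

def evolve_order_parameters(xi0, lam, B=1.0, C=1.0,
                            gamma=0.05, n_iter=500, tol=1e-6):
    """Euler iteration of the order-parameter equation (5).

    xi0 : initial order parameters xi_k(0) = v_k^+ q(0), as in (6).
    lam : attention parameters lambda_k, one per prototype.
    Returns the trajectory of the order parameters.
    """
    xi = np.asarray(xi0, dtype=float).copy()
    lam = np.asarray(lam, dtype=float)
    history = [xi.copy()]
    for _ in range(n_iter):
        total = np.sum(xi ** 2)
        # d(xi_k)/dt = lam_k*xi_k - B*(sum_{k'!=k} xi_{k'}^2)*xi_k - C*total*xi_k
        dxi = lam * xi - B * (total - xi ** 2) * xi - C * total * xi
        xi_next = xi + gamma * dxi
        history.append(xi_next.copy())
        if np.max(np.abs(xi_next - xi)) < tol:   # competition has settled
            xi = xi_next
            break
        xi = xi_next
    return np.array(history)
```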

A method for mangrove classification using the synergetic neural network technique is presented, as shown in Figure 1.

The mangrove classification model based on SNN mainly includes the following steps; a short code sketch of the whole procedure is given after Step 6.

Step 1. Extract features from the training corpus and calculate the prototype pattern vectors, which satisfy the conditions of normalization and zero mean.

Step 2. Obtain the adjoint pattern of the prototype patterns.

Step 3. Extract features from the test corpus and calculate the test pattern vectors, which satisfy the conditions of normalization and zero mean.

Step 4. Calculate the initial values of order parameters according to (6).

Step 5. The evolution equation (5) is iterated to carry out the competition among the order parameters.

Step 6. If the values of the order parameters ξ_k reach a stable condition during the evolution, output the final result; otherwise continue iterating the evolution equation of Step 5.
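The six steps can be put together in a few lines of Python: prototypes and test vectors are normalized to zero mean and unit length, the adjoint vectors are taken from the Moore-Penrose pseudoinverse of the prototype matrix, and the winning category is the order parameter that survives the competition of (5). The use of `np.linalg.pinv` for the adjoint vectors, the Euler step, and all default constants are our own assumptions about one reasonable realization.

```python
import numpy as np

def preprocess(vec):
    """Steps 1 and 3: zero-mean, unit-norm feature vector."""
    v = np.asarray(vec, dtype=float)
    v = v - v.mean()
    return v / (np.linalg.norm(v) + 1e-12)

def classify(prototypes, test_vec, lam, B=1.0, C=1.0, gamma=0.05, n_iter=500):
    """Steps 2-6: SNN classification of one test pattern (sketch)."""
    V = np.stack([preprocess(p) for p in prototypes])   # M x N prototype matrix
    V_adj = np.linalg.pinv(V)                           # step 2: adjoint vectors as columns
    q = preprocess(test_vec)                            # step 3
    xi = V_adj.T @ q                                    # step 4: xi_k(0) = v_k^+ q, eq. (6)
    lam = np.asarray(lam, dtype=float)
    for _ in range(n_iter):                             # step 5: competition, eq. (5)
        total = np.sum(xi ** 2)
        xi = xi + gamma * (lam * xi - B * (total - xi ** 2) * xi - C * total * xi)
    return int(np.argmax(np.abs(xi)))                   # step 6: winning category index
```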

3.2. Parameters Optimization of SNN Based on AQPSO

The network parameters of the synergetic neural network directly influence the synergetic recognition performance, and there is at present no general theory to control the parameters during the recognition process. How to construct the network parameters and choose an efficient optimization method is therefore an important task. To solve these problems, AQPSO is used in this section to select the network parameters effectively.

A parameter optimization scheme for the synergetic neural network based on AQPSO is shown in Figure 2. Firstly, we reconstruct the attention parameter λ_k based on the similarity between the prototype pattern v_k and the test pattern q, as given in (7).

Secondly, the AQPSO algorithm is used to search for the globally optimal network parameters in the corresponding parameter space.

The reconstruction and optimization of the network parameters based on AQPSO can be described as follows (see the code sketch after this list).

(1) Obtain feature vectors from the training corpus and the test corpus and construct the prototype patterns v_k and the test pattern q.
(2) Calculate the initial order parameters according to (6).
(3) Set the attention parameters λ_k according to (7).
(4) Use the AQPSO algorithm to search for the globally optimal parameters of the SNN in the corresponding parameter space.
(5) Obtain the best mangrove category through the evolution of the order parameter equation (5).
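The coupling between AQPSO and the SNN can be expressed as a single fitness function: a candidate parameter vector, assumed here to hold one attention parameter per class, is scored by the classification accuracy it yields on a labelled validation set, and that score is handed to the AQPSO routine. The sketch below reuses the `aqpso` and `classify` functions from the earlier sketches; the parameter encoding and the validation-set interface are illustrative assumptions.

```python
import numpy as np

def snn_fitness(params, prototypes, val_vecs, val_labels):
    """Negative validation accuracy of the SNN for one candidate
    attention-parameter vector (to be minimised by AQPSO)."""
    lam = np.abs(params)                       # assumed encoding: one lambda_k per class
    preds = [classify(prototypes, v, lam) for v in val_vecs]
    accuracy = np.mean(np.array(preds) == np.array(val_labels))
    return -accuracy                           # AQPSO minimises, so negate accuracy

# Assumed usage, reusing aqpso() from Section 2 and classify() from Section 3.1:
# best_lam, best_f = aqpso(
#     lambda p: snn_fitness(p, prototypes, val_vecs, val_labels),
#     dim=len(prototypes))
```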

4. Experiment

In the experiment, we take 10 mangrove images as training samples. The size of the images is . The categories of the mangrove images are shown in Table 1.

The prototype pattern of training samples is shown in Figure 3.

The corresponding adjoint mode is shown in Figure 4.

In the next section, we use different mangrove images to test the performance of our proposed model.

4.1. Recognition of Single Mangrove Image
4.1.1. Recognition of Noisy Images

The samples to be identified are formed by adding noise to the original image, without translation, rotation, or scaling, as shown in Figure 5.

According to the stability analysis method, fast and stable convergence curves can be obtained with a suitable setting of the network parameters. The recognition process is shown in Figure 6.

From Figure 7, we can see that category C1 eventually wins the competition. The evolution curve of the competitive process converges quickly and becomes stable after about the 95th iteration.

4.1.2. Recognition of Incomplete Image

The incomplete image to be identified is shown in Figure 8.

The evolution curve is shown in Figure 9.

From Figure 9, we can see that the initial order parameter of category C3 is not the largest at the beginning (the largest belongs to C7); however, C3 eventually wins the competition and the desired pattern is recognized. In this process, the category of the ambiguous mangrove image is determined. Meanwhile, the competitive process converges quickly and becomes stable after about the 85th iteration.

4.2. Recognition of Large-Scale Images

We take 10 mangrove images as training samples. The size of the images is . We obtain 120 test images by applying various processing methods to the training samples, such as adding noise, rotation, scaling, cropping, and blurring.

For comparison, we use four strategies:

SNN: mangrove classification based on SNN.
SNN + PSO: mangrove classification based on SNN and PSO.
SNN + QPSO: mangrove classification based on SNN and QPSO.
SNN + AQPSO: mangrove classification based on SNN and AQPSO.

The experiments are run on a notebook PC (Lenovo ThinkPad T430) with a 2.5 GHz CPU and 4 GB of RAM. The operating system is Windows 7.

The parameter settings of PSO are shown in Table 2.

The parameter settings of QPSO and AQPSO are shown in Table 3.

The results are shown in Table 4. Each value is the average over 30 repetitions.

From Tables 4 and 5, we can see that no single model performs best on every evaluation indicator, but the accuracy rates of all three parameter optimization models (SNN + PSO, SNN + QPSO, and SNN + AQPSO) are superior to that of the plain SNN model. The reason is that the attention parameters are very important for the SNN, and an optimization algorithm is essential for better performance.

SNN + AQPSO effectively improves the accuracy rate compared with the standard PSO and QPSO algorithms. The new particle swarm behaviors presented in this paper are essential for this better performance, and the adaptive behavior selection enhances the decision making ability of the particles. In terms of convergence time, there is a significant improvement, as expected. In conclusion, the improved algorithm has a better global search ability and a faster convergence speed.

The convergence curves are shown in Figure 10. We can see a significant improvement in convergence accuracy, as expected. The experimental results show that the improved QPSO has better global and local parameter searching abilities.

5. Conclusions

In this paper, we construct an improved quantum particle swarm optimization algorithm. Experiments on mangrove classification show that the improved algorithm has a more powerful global exploration ability and a faster convergence speed.

We draw the following conclusions.

(1) The mangrove classification procedure can be viewed as a competition process among many order parameters. The order parameters reflect the similarity between the prototype patterns and the test pattern well.
(2) The AQPSO has both global and local search ability and can effectively choose the network parameters of the SNN.

In the future, we will focus on the following two aspects.

(1) The network parameters of the SNN are very important for good recognition performance, and a change of the attention parameters can lead to completely different recognition results. We will use other optimization algorithms to search for the globally optimal parameters of the SNN in the corresponding parameter space.
(2) The behavior of the particle swarm has a critical influence on performance. In the future, we will introduce new behaviors and apply this idea to other optimization tasks.

Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.