Abstract

The Gravitational Search Algorithm (GSA) is a widely used metaheuristic algorithm. Although GSA has few parameters to tune, it suffers from a slow convergence rate. In this paper, we change the constant acceleration coefficients of the combination of PSO and GSA (PSO-GSA) into exponential functions of the iteration number and propose an improved PSO-GSA algorithm (written as I-PSO-GSA) for solving two kinds of classification problems: surface water quality and the moving direction of robots. I-PSO-GSA is employed to optimize the weights and biases of a backpropagation (BP) neural network. The experimental results show that, compared with PSO-GSA, single PSO, and single GSA for optimizing the parameters of the BP neural network, I-PSO-GSA outperforms all three and achieves better classification accuracy on these two real-world problems.

1. Introduction

The backpropagation (BP) algorithm [1] was proposed by Werbos and has been widely applied for classification and prediction in various areas, such as classification of hand manipulation [2], temperature prediction [3], wind speed forecasting [4], campus traffic congestion detection [5], and stock market prediction [6]. A BP neural network consists of one input layer, one or more hidden layers, and one output layer. The BP algorithm is composed of forward information propagation and backward error adjustment. In forward propagation, input information is computed from the input layer through the hidden layers to the output layer, and the status of the neural nodes in each layer only influences the status of the neural nodes of the next layer. If the error between the outputs of the output layer and the expected outputs is larger than the error goal, backward error adjustment is performed from the output layer to the input layer along the original path to modify the weights and biases of each neuron until the error goal is met. However, BP has a slow convergence rate and a strong tendency to become trapped in local minima.

Population-based metaheuristic optimization algorithms have two intrinsic features: exploration and exploitation. Exploration is a very effective way to search the entire problem space, and exploitation is able to make the search converge to an optimal solution. The most popular metaheuristic algorithms include the Genetic Algorithm (GA) [4, 6-8], Particle Swarm Optimization (PSO) [9-20], the Artificial Bee Colony (ABC) Algorithm [21], the Fruit Fly Optimization Algorithm (FOA) [22], and the Gravitational Search Algorithm (GSA) [23-25]. They have attracted the attention of more and more scholars and can effectively solve many scientific problems, such as classification and prediction. Kavoosi et al. used GA to forecast global carbon dioxide emissions [8], Agrawal and Bawane used PSO to classify multispectral satellite images [16], Chen and Liu used PSO to forecast coal logistics [26], Vazquez and Garro used ABC to classify crops [27], Kang et al. used FOA to classify rolling bearing faults [28], and Behrang et al. used GSA to forecast oil demand in Iran [29].

Recently, many heuristic optimization methods have been utilized to optimize the weights and biases of neural networks in order to improve the accuracy of classification and prediction. For example, GA was used to optimize the parameters of BP neural networks for stock market prediction and wind speed forecasting [4, 6], and PSO was integrated with a BP neural network to construct a model for estimating the cost of plastic injection molding parts [9].

In this paper, we combine the Gravitational Search Algorithm (GSA) [23] with the PSO algorithm to propose a new algorithm (written as I-PSO-GSA) for solving two kinds of classification problems: surface water quality and the moving direction of robots. When the acceleration coefficients in I-PSO-GSA are constant, the algorithm reduces to PSO-GSA. I-PSO-GSA is employed to optimize the weights and biases of a backpropagation (BP) neural network. Compared with PSO-GSA, single PSO, and single GSA for optimizing the parameters of the BP neural network, I-PSO-GSA outperforms all three and has the highest accuracy. The rest of the paper is structured as follows. Section 2 reviews prior work. Section 3 introduces the PSO, GSA, and I-PSO-GSA algorithms and the proposed classification model. Section 4 presents the experimental results for the surface water quality and robot moving direction classification problems. Finally, Section 5 outlines our conclusions.

2. The Prior Work

We use a BP neural network to perform the classification problems. Because the initial weights and biases of a BP neural network are generated randomly, its classification accuracy is not stable. Therefore, many intelligence algorithms are used to optimize the initial weights and biases of the BP neural network in order to obtain a higher classification rate.

The Gravitational Search Algorithm (GSA), based on the law of gravity and mass interactions, was proposed in 2009 by Rashedi et al. [23] for solving various nonlinear functions. GSA was compared with PSO and Central Force Optimization (CFO), and it was shown that GSA outperforms PSO and the Real Genetic Algorithm (RGA) on nonlinear functions.

Rashedi et al. [30] used GSA for filter modeling and compared it with PSO and GA; the obtained results showed that GSA was superior to PSO and GA.

Sarafrazi et al. [31] introduced a novel operator called "Disruption," originating from astrophysics, into GSA and showed, by comparison with PSO and GA on nonlinear functions, that the improved GSA is a valid method.

Farivar and Shoorehdeli [32] used Lyapunov stability theorem and the system dynamics concept to analyze the stability of particle dynamics in GSA by adapting parameters. On the basis of stability analysis, the modified GSA outperforms the standard GSA, PSO, RGA, and two methods of improved GSA in solving the optimization problem of various nonlinear functions and has a high performance.

Besides, GSA was used in various actual problems, such as economic optimization design of a shell and tube heat exchanger [33], multimodal optimization [34], and a Linear Array Antenna (LAA) optimized subsystem [35].

Many researchers have combined other intelligence algorithms with GSA and obtained promising results. For example, Mirjalili et al. [24] combined PSO with GSA to train feedforward neural networks, improving convergence speed and the avoidance of local minima. Wang and Song [36] introduced a small constant into the update strategy of the hybrid algorithm based on GSA and PSO to obtain four improved hybrid algorithms for nonlinear function optimization problems. Das et al. [37] used a new methodology to determine the optimal path trajectory for multirobots in a cluttered environment using a hybrid approach of improved PSO with the local best value and improved GSA with the velocity update augmented by a random constant.

In this paper, we change the two acceleration coefficients from constants into functions of the iteration number for the two kinds of classification. In the next section, we give a brief overview of GSA, PSO, and the hybrid algorithm of GSA and PSO.

3. Classification Models

3.1. Particle Swarm Optimization

Particle Swarm Optimization (PSO) is an optimization algorithm inspired by the foraging behavior of bird flocks, based on random initialization of the population and iterative updates during the search process [38]. In the search for the optimal solution, each bird is considered as a particle without mass and volume. During the search process, each particle records its own best position (pbest) and the global best position (gbest). The velocity and position of each particle are calculated as follows:

v_i(t + 1) = w v_i(t) + c_1 r_1 (pbest_i(t) - x_i(t)) + c_2 r_2 (gbest(t) - x_i(t)), (1)
x_i(t + 1) = x_i(t) + v_i(t + 1), (2)

where v_i(t) and x_i(t) are the current velocity and position of the ith particle at the tth iteration, c_1 and c_2 are acceleration coefficients that control the influence of pbest and gbest on the search process, respectively, r_1 and r_2 are random numbers in [0, 1], pbest_i(t) is the best position found by the ith particle up to the tth iteration, gbest(t) is the best position found among all the particles over all iterations, and w is the inertia weight.
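The update rule above can be sketched as a single vectorized NumPy step; the default parameter values here are illustrative, not the paper's settings:

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.9, c1=2.0, c2=2.0, rng=None):
    """One PSO iteration: update velocities, then positions.

    x, v, pbest have shape (n_particles, dim); gbest has shape (dim,).
    Fresh random factors r1, r2 are drawn per particle and per dimension.
    """
    rng = np.random.default_rng() if rng is None else rng
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    # velocity: inertia + cognitive (pbest) pull + social (gbest) pull
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    return x, v
```

If the swarm has already converged (all particles at pbest = gbest with zero velocity), the step leaves it unchanged, as expected from the update equations.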

3.2. Gravitational Search Algorithm

The Gravitational Search Algorithm (GSA), proposed by Rashedi et al., is an algorithm based on the law of gravity. The agents in GSA are considered as objects with masses, and agents attract each other through the gravitational force. The greater the mass, the stronger the gravity. Therefore, the location of the agent with the largest mass is the optimal solution [23].

Suppose that there are N agents in a d-dimensional search space. The position of the ith agent is

X_i = (x_i^1, ..., x_i^k, ..., x_i^d), i = 1, 2, ..., N. (3)

At time t, the force acting on the ith agent from the jth agent in the kth dimension is defined as follows:

F_ij^k(t) = G(t) (M_i(t) M_j(t)) / (R_ij(t) + ε) (x_j^k(t) - x_i^k(t)), (4)

where M_i(t) and M_j(t) are the masses of the ith agent and the jth agent, respectively, G(t) is the gravitational constant at time t, ε is a small constant, and R_ij(t) is the Euclidian distance between the ith agent and the jth agent [17].

At time t, the total force acting on the ith agent in the kth dimension is defined as follows:

F_i^k(t) = Σ_{j=1, j≠i}^{N} rand_j F_ij^k(t), (5)

where rand_j is a uniform random variable in the interval [0, 1].

According to the law of motion, the acceleration of the ith agent at time t can be defined as follows:

a_i^k(t) = F_i^k(t) / M_i(t). (6)

In each iteration, the velocity and position of the ith agent are updated by the following two equations:

v_i^k(t + 1) = rand_i v_i^k(t) + a_i^k(t), (7)
x_i^k(t + 1) = x_i^k(t) + v_i^k(t + 1), (8)

where rand_i is a uniform random variable in the interval [0, 1] and x_i^k(t) and v_i^k(t) are the agent's current position and velocity, respectively.
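A minimal NumPy sketch of one GSA iteration for minimisation follows. It is a simplification: masses are derived linearly from fitness, and every agent attracts every other agent, whereas the original algorithm shrinks the attracting set (Kbest) over time.

```python
import numpy as np

def gsa_step(x, v, fitness, g, eps=1e-9, rng=None):
    """One simplified GSA iteration (minimisation).

    x, v: (n_agents, dim) positions and velocities.
    fitness: (n_agents,) fitness values (lower is better).
    g: current gravitational constant G(t).
    """
    rng = np.random.default_rng() if rng is None else rng
    n, _ = x.shape
    worst, best = fitness.max(), fitness.min()
    m = (worst - fitness) / (worst - best + eps)   # raw masses in [0, 1]
    M = m / m.sum()                                # normalised masses
    a = np.zeros_like(x)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            diff = x[j] - x[i]
            R = np.linalg.norm(diff)
            # a_i = F_i / M_i, so M_i cancels: G * M_j * (x_j - x_i) / (R + eps)
            a[i] += rng.random() * g * M[j] * diff / (R + eps)
    v = rng.random((n, 1)) * v + a   # random inertia factor per agent
    x = x + v
    return x, v
```

Note that dividing the pairwise force by the agent's own mass makes M_i cancel out of the acceleration, which is why only the attracting mass M_j appears in the loop.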

3.3. The Combination Approach of PSO and GSA

In GSA, agents do not share population information with each other and have a weak exploitation capability. By combining the global searching ability of PSO with the local searching ability of GSA, each agent is updated using the velocity of PSO and the acceleration of GSA. This approach is called PSO-GSA. In this way, the exploration ability and the exploitation ability are better balanced through the continual change of the parameters.

The velocity and the position of the ith agent are updated by the following two equations:

V_i(t + 1) = w V_i(t) + c_1' rand a_i(t) + c_2' rand (gbest - X_i(t)), (9)
X_i(t + 1) = X_i(t) + V_i(t + 1), (10)

where w is the inertia weight, V_i(t), X_i(t), and a_i(t) are the velocity, current position, and acceleration of the ith particle at the tth iteration, respectively, rand is a uniform random variable in [0, 1], and c_1' and c_2' are constant acceleration coefficients.

In this paper, we take c_1' and c_2' to be exponential functions of the iteration number, defined as follows:

c_j'(t) = c_0 (c_f / c_0)^(t / t_max), j = 1, 2, (11)

where c_0 is the initial value, c_f is the final value, t_max is the maximum iteration number, and t is the current iteration number. In order to distinguish it from PSO-GSA, PSO-GSA with the functional acceleration coefficients (11) is named I-PSO-GSA.
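The hybrid update and the exponential coefficient schedule can be sketched as follows. The particular exponential form used here, an interpolation from an initial value at t = 0 to a final value at t = t_max, is an assumption for illustration; the paper specifies only that the coefficients are exponential functions of the iteration number.

```python
import numpy as np

def exp_coeff(c0, cf, t, t_max):
    """Exponential schedule: equals c0 at t = 0 and cf at t = t_max.
    Assumed form, since the exact expression is not reproduced here."""
    return c0 * (cf / c0) ** (t / t_max)

def psogsa_velocity(v, x, gbest, acc, w, c1p, c2p, rng=None):
    """PSO-GSA velocity update: the GSA acceleration term replaces the
    pbest term of plain PSO, while the gbest term drives exploitation.

    v, x, acc: (n_agents, dim); gbest: (dim,); c1p, c2p: coefficients.
    """
    rng = np.random.default_rng() if rng is None else rng
    return (w * v
            + c1p * rng.random(x.shape) * acc
            + c2p * rng.random(x.shape) * (gbest - x))
```

A typical choice would let c_1' (the weight on the GSA acceleration) decay and c_2' (the weight on gbest) grow, shifting the search from exploration to exploitation over the run.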

3.4. The Proposed Classification Model

The I-PSO-GSA algorithm is employed to optimize the weights and biases of the BP neural network, and the mean square error (MSE) is the fitness function of I-PSO-GSA. The fitness function over the training samples can be defined as follows:

fitness = (1/Q) Σ_{k=1}^{Q} Σ_{i=1}^{m} (d_i^k - y_i^k)^2, (12)

where Q is the number of training samples, m is the number of output units, d_i^k is the desired output of the ith output unit for the kth training sample, and y_i^k is the actual output of the ith output unit for the kth training sample.
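The MSE fitness is a one-liner in NumPy, summing the squared error over output units and averaging over training samples:

```python
import numpy as np

def mse_fitness(desired, actual):
    """Mean squared error fitness: for each sample, sum squared errors
    over the output units, then average over all training samples."""
    desired = np.asarray(desired, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return float(np.mean(np.sum((desired - actual) ** 2, axis=1)))
```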

Suppose that the structure of the BP neural network is n-S-m, where n is the number of nodes in the input layer, S is the number of nodes in the hidden layer, and m is the number of nodes in the output layer. And suppose that there are N agents in the population, where every agent is a D-dimensional vector X = (x_1, x_2, ..., x_D) with D = nS + S + Sm + m. We map X into the weights and biases of the BP neural network: the first nS components are the weights between the input layer and the hidden layer, the next S components are the biases of the hidden layer, the following Sm components are the weights between the hidden layer and the output layer, and the last m components are the biases of the output layer.
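The mapping from a flat agent vector to network parameters amounts to slicing and reshaping; the layer ordering below follows the description above, and the weight-matrix orientation (inputs x hidden) is a convention chosen for the sketch:

```python
import numpy as np

def unpack_agent(agent, n_in, n_hid, n_out):
    """Split a flat agent vector of length
    n_in*n_hid + n_hid + n_hid*n_out + n_out into
    W1 (n_in x n_hid), b1 (n_hid,), W2 (n_hid x n_out), b2 (n_out,)."""
    agent = np.asarray(agent, dtype=float)
    sizes = [n_in * n_hid, n_hid, n_hid * n_out, n_out]
    assert agent.size == sum(sizes), "agent dimension mismatch"
    parts, i = [], 0
    for s in sizes:
        parts.append(agent[i:i + s])
        i += s
    W1 = parts[0].reshape(n_in, n_hid)   # input -> hidden weights
    b1 = parts[1]                        # hidden biases
    W2 = parts[2].reshape(n_hid, n_out)  # hidden -> output weights
    b2 = parts[3]                        # output biases
    return W1, b1, W2, b2
```

For the 4-S-5 network of Example 1 with S = 9, each agent has dimension 4*9 + 9 + 9*5 + 5 = 95.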

The algorithm’s steps are summarized in detail as follows.

Step 1. Set the control parameters of the proposed I-PSO-GSA algorithm.

Step 2. Initialize the population of N agents randomly.

Step 3. Map each agent into the weights and biases of the BP neural network and calculate its fitness evaluation by use of (12).

Step 4. Compute the gravitational constant G(t) and the masses of the agents.

Step 5. Calculate the total forces acting on the agents by (5) and the accelerations of the agents by (6).

Step 6. Update the acceleration coefficients c_1' and c_2' by (11).

Step 7. Update the velocities and positions of the agents according to (9) and (10).

Step 8. If the stopping criterion is met, then go to Step 9. Otherwise, go to Step 3.

Step 9. Output the best agent, which is mapped into the weights and biases of the BP neural network. The initial parameters of the BP neural network are thus obtained. Then train and test the BP neural network.
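Steps 1-9 can be sketched as the following loop. To keep the sketch self-contained it minimises an arbitrary fitness function rather than the BP-network MSE, every agent attracts every other (no Kbest shrinking), and all constants (G_0, the decay rate, w, and the coefficient endpoints) are illustrative assumptions:

```python
import numpy as np

def i_psogsa_minimise(fitness, dim, n_agents=30, t_max=100, seed=0):
    """Skeleton of the I-PSO-GSA loop on a generic fitness function."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n_agents, dim))        # Steps 1-2: init
    v = np.zeros((n_agents, dim))
    gbest, gbest_f = None, np.inf
    for t in range(t_max):
        f = np.apply_along_axis(fitness, 1, x)     # Step 3: evaluate
        if f.min() < gbest_f:
            gbest_f, gbest = f.min(), x[f.argmin()].copy()
        g = 1.0 * np.exp(-20.0 * t / t_max)        # Step 4: G(t) decay
        worst, best = f.max(), f.min()
        M = (worst - f + 1e-12) / (worst - best + 1e-12)
        M = M / M.sum()                            # Step 4: masses
        acc = np.zeros_like(x)                     # Step 5: forces/accel
        for i in range(n_agents):
            for j in range(n_agents):
                if i != j:
                    diff = x[j] - x[i]
                    R = np.linalg.norm(diff)
                    acc[i] += rng.random() * g * M[j] * diff / (R + 1e-12)
        # Step 6: exponential acceleration coefficients (assumed form):
        # c1 decays 2.0 -> 0.5, c2 grows 0.5 -> 2.0 over the run
        c1 = 2.0 * (0.5 / 2.0) ** (t / t_max)
        c2 = 0.5 * (2.0 / 0.5) ** (t / t_max)
        # Step 7: hybrid velocity and position update
        v = (0.9 * v + c1 * rng.random(x.shape) * acc
             + c2 * rng.random(x.shape) * (gbest - x))
        x = x + v
    return gbest, gbest_f                          # Step 9: best agent
```

To apply it to the classification model, `fitness` would map an agent into the network parameters (as in the mapping described above) and return the MSE of (12) on the training set.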

4. Experimental Results

In this section, we compare I-PSO-GSA with PSO-GSA, PSO, and GSA by optimizing the parameters of BP neural network for two kinds of classification problems: surface water quality classification problem and robot moving direction classification problem.

In the experiments, the population size is set to 50, the maximum number of iterations is set to 200, and the particle positions and velocities are initialized randomly within a fixed interval. In I-PSO-GSA, c_1' and c_2' follow the exponential functions in (11); in PSO-GSA, c_1' and c_2' are constants. In PSO, the inertia weight w decreases linearly from 0.9 to 0.3. In GSA, the initial gravitational constant G_0 is set to 1, and the initial values of acceleration and mass are set to 0.

4.1. Example  1: Surface Water Quality Classification

Water is the lifeblood of the whole national economy, and surface water quality has an increasingly important impact on human life and health. With the progress of the times and society, there is growing awareness of surface water quality. The classification of surface water is a qualitative description of water quality, which is the necessary means and tool for people to understand surface water quality and identify existing problems. Only with a full understanding of the main pollution factors and pollution sources can reasonable treatment schemes and preventive measures be made.

The surface water quality standard specifies clear limits for indicators including water temperature, pH value, dissolved oxygen, ammonia nitrogen, lead, and sulfur. In recent years, scholars have paid increasing attention to the surface water quality classification problem, and many methods have been widely studied and applied, such as Particle Swarm Optimization, Ant Colony Optimization, single-factor and comprehensive-factor discriminant evaluation methods, and regression analysis.

4.1.1. Data Sources

The data are derived from the weekly report on automatically monitored sections of the Yellow River basin, one of the main river basins tracked by the Ministry of Environmental Protection of the People's Republic of China. On the basis of surface water environmental function and the protection target, water is divided into five categories according to function, from high to low. Class I is mainly used for source water and national nature reserves. Class II is mainly suitable for the primary protection zones of centralized domestic surface drinking water sources, habitats of rare aquatic species, spawning grounds for fish and shrimp, feeding grounds of larval and juvenile fish, and so forth. Class III is mainly fit for the secondary protection zones of centralized domestic surface drinking water sources, wintering grounds and migration pathways of fish and shrimp, aquaculture regions in fishing areas, and swimming areas. Class IV is mainly suitable for general industrial water areas and noncontact recreational water areas. Class V is mainly fit for agricultural water areas and water for general landscape requirements. Water inferior to Class V is more heavily polluted than Class V water.

In this paper, we choose 306 pieces of data from 2015 to 2016 at three monitoring points, the Lanzhou new town bridge, Xinzhou Wanjiazhai Reservoir, and Weinan Tongguan suspension bridge, located in the upper, middle, and lower reaches of the Yellow River Basin, respectively. According to the standard of surface water environment, records from blocked (cut-off) sections are first eliminated, leaving 276 pieces of data, whose distribution is only 1 piece of data in Class I, 155 in Class II, 33 in Class III, 76 in Class IV, 10 in Class V, and only 1 in inferior V, as shown in Figure 1. Owing to its slight influence on surface water quality classification, the single inferior-V sample is also eliminated. Therefore, 275 pieces of data remain, that is, 275 samples. Each sample has four features: pH value, dissolved oxygen (DO (mg/l)), chemical oxygen demand (COD (mg/l)), and ammonia (NH3-N (mg/l)). In this paper, 140 samples are chosen for training, and the remaining 135 samples are used for testing.

4.1.2. Classification Results

In this section, we utilize I-PSO-GSA, PSO-GSA, PSO, and GSA to optimize the parameters of a BP neural network with only one hidden layer to solve the surface water quality classification problem. The BP neural network is a 4-S-5 network, where 4 is the number of input nodes, S is the number of neural nodes in the hidden layer, and 5 is the number of neural nodes in the output layer. We test each value S = 7, 8, ..., 15 in turn. The accuracy rates of the classification are shown in Table 1. From Table 1, the accuracy rates of I-PSO-GSA are all above 91%, and the best accuracy rate of I-PSO-GSA, 96.30%, is attained at two of the tested values of S. Every accuracy rate of I-PSO-GSA exceeds those of PSO-GSA, PSO, and GSA. Therefore, I-PSO-GSA outperforms PSO-GSA, PSO, and GSA on surface water quality classification.

4.2. Example  2: Robot Moving Direction Classification

In the field of robot research, mobile robot has been widely studied because of its strong usability. Autonomous navigation is the most basic function of the mobile robot, which requires that the robot can reach its destination in an unknown environment. In general, some sensors are often installed into the robot to achieve autonomous navigation. The moving direction can be divided into four classes: forward, slight right, slight left, and sharp turn.

4.2.1. Data Sources

The samples are derived from the UCI Machine Learning Repository, provided by Ananda Freire and Barreto. Each sample has 24 features collected by 24 sensors arranged at every angle around the waist of the robot. In this paper, 150 samples are selected from each of the 4 classes, giving 600 samples in total. In this subsection, 500 samples are chosen to train the BP neural network, and the remaining 100 samples are used to test it.

4.2.2. Classification Results

In this section, we utilize I-PSO-GSA, PSO-GSA, PSO, and GSA to optimize the parameters of a 24-S-4 BP neural network to solve the robot moving direction classification problem, where 24 is the number of input nodes, S is the number of neural nodes in the hidden layer, and 4 is the number of neural nodes in the output layer. We compare the performance of the BP neural network for S = 45, 46, ..., 51. The accuracies of the classification are shown in Table 2. From Table 2, the accuracy rates of I-PSO-GSA are all above 89% and larger than those of PSO-GSA, PSO, and GSA. The best accuracy rate of I-PSO-GSA, 95%, is attained at one of the tested values of S. Therefore, the results also show that I-PSO-GSA outperforms PSO-GSA, PSO, and GSA for robot moving direction classification. We note that the classification accuracy rates of GSA and PSO alone are the worst, and the combination of GSA and PSO improves the classification accuracy rate.

4.3. Discussion

The number of nodes S in the hidden layer of the BP neural network is varied from 7 to 15 in Example 1 and from 45 to 51 in Example 2. The results show that the classification accuracy rate changes with S but neither monotonically increases nor decreases; therefore, the choice of S is connected with the classification accuracy rate. From Tables 1 and 2, I-PSO-GSA has the highest accuracy rate among the four models, I-PSO-GSA, PSO-GSA, PSO, and GSA, while PSO and GSA are the worst. Comparing across hidden-layer sizes, the accuracy of I-PSO-GSA reaches its highest value of 96.30% at two values of S in Table 1; at those sizes, the accuracy rates of PSO-GSA, PSO, and GSA are 85.93%, 60.74%, and 85.19% at the first and 86.67%, 85.19%, and 86.67% at the second, respectively. At the best size in Table 2, the accuracy of I-PSO-GSA reaches 95%, while the accuracy rates of PSO-GSA, PSO, and GSA are 88%, 27%, and 27%, respectively. Therefore, I-PSO-GSA is very reliable and achieves very high accuracy for solving the classification problem.

By comparison, the results of the four algorithms optimizing the BP neural network show that the ranking in Example 1 from worst to best is PSO, PSO-GSA, I-PSO-GSA, although GSA sometimes achieves a higher accuracy rate than PSO-GSA and PSO, and the ranking in Example 2 from worst to best is GSA, PSO, PSO-GSA, I-PSO-GSA. Therefore, I-PSO-GSA outperforms PSO-GSA, PSO, and GSA, and the results show that the cooperative behavior of the particle velocity update in PSO and the acceleration in GSA improves the classification accuracy rate.

5. Conclusion

GSA has a slow search process, which limits its exploitation ability, but it has very strong exploration ability. PSO has precise exploitation ability, but it easily falls into local minima. In this paper, the advantages of PSO and GSA are combined to improve the hybrid algorithm PSO-GSA for solving the classification problems of surface water quality and robot moving direction. Thus I-PSO-GSA, with the strong exploitation ability of PSO and the exploration ability of GSA, is utilized to optimize the parameters of the BP neural network. From Tables 1 and 2, the results show that the proposed I-PSO-GSA alleviates the problem of trapping in local minima and has very good convergence speed. We take the acceleration coefficients c_1' and c_2' to be the same exponential function in I-PSO-GSA for solving the actual problems, and the classification results show that I-PSO-GSA outperforms PSO-GSA, PSO, and GSA. Hence I-PSO-GSA can be adopted as a classification tool for surface water quality and robot moving direction.

In order to further improve the classification accuracy on actual problems, the acceleration coefficients c_1' and c_2' can be refined by using other functions, whether c_1' and c_2' are the same or different. Other population-based metaheuristic optimization algorithms can also be combined with each other to optimize the parameters of BP neural networks and other neural networks.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work is fully supported by the National Natural Science Foundation of China (no. 61774137).