Abstract

We present a novel hybrid algorithm based on particle swarm optimization (PSO) and simulated annealing (SA) for the design of two-dimensional recursive digital filters. The proposed method, known as SA-PSO, integrates the global search ability of PSO with the local search ability of SA so that each compensates for the weaknesses of the other. The Metropolis acceptance criterion is incorporated into the basic PSO algorithm to increase the swarm’s diversity by occasionally accepting weaker solutions. The experimental results reveal that the optimal filter designed by the proposed SA-PSO method offers improved performance. Furthermore, the convergence behavior and optimization accuracy of the proposed method are improved significantly while the computational time is reduced. In addition, the proposed SA-PSO method produces the best optimal solution with lower mean and variance, which indicates that the algorithm can be used more efficiently for realizing two-dimensional recursive digital filters.

1. Introduction

Design of two-dimensional (2D) filters has been studied extensively over the past two decades, as it plays a very significant role in biomedical image processing, satellite imaging, seismic data processing, and so forth [1]. It is well known that digital filters are typically classified into two groups: recursive or infinite impulse response (IIR) filters and nonrecursive or finite impulse response (FIR) filters. The design of IIR filters has received much more attention, because an IIR filter can provide better performance than an FIR filter with the same number of coefficients. The main drawbacks of IIR filters, however, are that they have a multimodal error surface and that they may be unstable in some cases. To overcome the problem of a multimodal error surface, a global optimization method can be adopted. The stability problem can be tackled by constraining the parameter space with appropriate conditions from the beginning of the optimization routine [1, 2]. As in the 1D case, a 2D IIR filter can meet the same desired specifications with fewer coefficients than an equivalent 2D FIR filter. Design methodologies for 2D filters fall into two groups: McClellan-transformation-based design of 2D filters from 1D prototypes and design based on appropriate optimization techniques [1–6]. In optimization-based methods, the design problem is formulated as a constrained minimization problem and solved by a global optimization technique. Previously reported work on this problem has applied different optimization techniques, such as neural networks (NN) [1], genetic algorithms (GA) [2], the computer language GENETICA [3], a Taguchi-based immune algorithm [4], the Bees algorithm [5], and particle swarm optimization (PSO) [6]. Most of these algorithms exhibit slow convergence towards a good near-optimum solution and are easily trapped in local optima. These shortcomings can be avoided by combining SA with PSO, because SA has strong local exploitation capabilities while PSO exhibits fast global searching ability.

Earlier reported works demonstrated that combining SA with PSO yields significant improvements for different engineering problems, such as partner selection in virtual enterprises [7], the job shop scheduling problem [8], a holon task allocation scheme [9], energy consumption reduction in embedded systems [10], and numerical verification on several benchmark functions [11]. Zhao et al. [7, 9] and Jamili et al. [8] combined SA with PSO for solving different practical problems, where SA starts from the global best solution generated by PSO. These hybrid algorithms [7–9] can easily be converted to basic PSO by ignoring the SA subroutine and, at the same time, can be used as a conventional SA by assuming a population size of one particle. It is demonstrated in [7–9] that the hybrid method outperforms conventional GA, SA, and PSO not only by providing better-quality solutions but also by exhibiting greater convergence speed. Idoumghar et al. [10] and Shieh et al. [11] applied this hybrid algorithm to several benchmark functions and to reducing the energy consumption of embedded system memories. It is shown in [10] that the hybrid SA-PSO outperforms most of the recently introduced PSO-based methods, such as QIPSO (quadratic interpolation based PSO) and GMPSO (Gaussian mutation based PSO), in terms of accuracy, robustness, and convergence speed. Motivated by the improved features of hybrid SA-PSO, the design problem of 2D IIR filters is implemented here more efficiently. Simulation results using this approach ensure better-quality solutions with less computational time. The novelty of the proposed work is as follows: (i) a new hybrid algorithm for designing 2D recursive filters, (ii) the effective application of the Metropolis criterion to escape from local optima, and (iii) adaptive simulated annealing (ASA) for better control of convergence speed and accuracy.

The rest of the paper is organized as follows. The design problem of 2D IIR filters and the stability condition are presented in Section 2. Section 3 introduces the preliminary concept of PSO. The proposed design method using hybrid SA-PSO is described in Section 4. The simulation results indicating the comparisons with different design examples are shown in Section 5. Finally, conclusions and future work are given in Section 6.

2. Problem Formulation of 2D IIR Filter

Here we consider the design problem of an $N$th order 2D IIR filter with transfer function [6]:

$$H(z_1, z_2) = H_0\,\frac{\sum_{i=0}^{N}\sum_{l=0}^{N} a_{il}\, z_1^{i} z_2^{l}}{\prod_{k=1}^{N}\left(1 + b_k z_1 + c_k z_2 + d_k z_1 z_2\right)}, \qquad a_{00} = 1, \tag{1}$$

where $a_{il}$, $b_k$, $c_k$, $d_k$, and $H_0$ are the filter coefficients. Let the frequencies be $\omega_1 = (\pi/N_1)\,n_1$ and $\omega_2 = (\pi/N_2)\,n_2$, with $n_1 = 0, 1, \ldots, N_1$ and $n_2 = 0, 1, \ldots, N_2$. The design task of 2D filters is to find a transfer function as in (1) such that the magnitude function approximates the desired amplitude response in some optimal sense. This approximation can be achieved by minimizing $J$, where

$$J = \sum_{n_1=0}^{N_1}\sum_{n_2=0}^{N_2} \left[\, M(\omega_1, \omega_2) - M_d(\omega_1, \omega_2) \,\right]^{p}. \tag{2}$$

Here $M(\omega_1, \omega_2)$ is the amplitude response of the designed filter, $M_d(\omega_1, \omega_2)$ is the desired amplitude response, $N_1 = N_2 = 50$, and $p$ is a positive integer (e.g., $p = 2$, 4, or 8). Hence, the primary aim is to minimize the difference between the actual and desired amplitude responses of the filter at the grid points. Since the denominator consists only of first-degree factors, the stability conditions are given by

$$\left|b_k + c_k\right| - 1 < d_k < 1 - \left|b_k - c_k\right|, \qquad k = 1, 2, \ldots, N. \tag{3}$$

Therefore, the design task of 2D recursive filters reduces to the following constrained minimization problem: minimize $J$ subject to the constraints given in (3).
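As an illustration, the following minimal Python sketch shows how the objective (2) and the stability conditions (3) could be evaluated for a candidate coefficient set; the helper names (error_J, is_stable) and the callable interface for M and M_d are illustrative assumptions, not part of the original design.

```python
import numpy as np

def error_J(M, M_d, N1=50, N2=50, p=2):
    """Objective (2): deviation of the amplitude response from the desired
    response, summed over the (N1+1) x (N2+1) frequency grid.
    M and M_d are callables taking (w1, w2) frequency arrays."""
    n1 = np.arange(N1 + 1)
    n2 = np.arange(N2 + 1)
    w1, w2 = np.meshgrid(np.pi * n1 / N1, np.pi * n2 / N2, indexing="ij")
    return np.sum((M(w1, w2) - M_d(w1, w2)) ** p)

def is_stable(b, c, d):
    """Stability conditions (3), checked for every first-degree factor."""
    b, c, d = map(np.asarray, (b, c, d))
    return bool(np.all((np.abs(b + c) - 1 < d) & (d < 1 - np.abs(b - c))))
```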

Assuming that $N = 2$, the transfer function of the second-order 2D filter can be written as follows:

$$H(z_1, z_2) = H_0\,\frac{\sum_{i=0}^{2}\sum_{l=0}^{2} a_{il}\, z_1^{i} z_2^{l}}{\left(1 + b_1 z_1 + c_1 z_2 + d_1 z_1 z_2\right)\left(1 + b_2 z_1 + c_2 z_2 + d_2 z_1 z_2\right)}, \qquad a_{00} = 1. \tag{4}$$

Now substituting $z_1 = e^{-j\omega_1}$ and $z_2 = e^{-j\omega_2}$ in (4), $H(\omega_1, \omega_2)$ can be further written as follows:

$$H(\omega_1, \omega_2) = H_0\,\frac{\sum_{i=0}^{2}\sum_{l=0}^{2} a_{il}\, e^{-j\left(i\omega_1 + l\omega_2\right)}}{\prod_{k=1}^{2}\left(1 + b_k e^{-j\omega_1} + c_k e^{-j\omega_2} + d_k e^{-j\left(\omega_1 + \omega_2\right)}\right)}. \tag{5}$$

From (5), $H(\omega_1, \omega_2)$ can be written as follows:

$$H(\omega_1, \omega_2) = H_0\,\frac{N_R(\omega_1, \omega_2) + j\,N_I(\omega_1, \omega_2)}{D_R(\omega_1, \omega_2) + j\,D_I(\omega_1, \omega_2)}, \tag{6}$$

where $N_R$, $N_I$ and $D_R$, $D_I$ denote the real and imaginary parts of the numerator and denominator of (5), respectively. Hence the corresponding magnitude response of the 2D filter can be compactly written as follows:

$$M(\omega_1, \omega_2) = \left|H(\omega_1, \omega_2)\right| = \left|H_0\right| \sqrt{\frac{N_R^2 + N_I^2}{D_R^2 + D_I^2}}. \tag{7}$$

To compare the performance of the proposed SA-PSO method, the same design specifications as in [1–6] have been used for the desired amplitude response:

$$M_d(\omega_1, \omega_2) = \begin{cases} 1, & \text{if } \sqrt{\omega_1^2 + \omega_2^2} \le 0.08\pi, \\ 0.5, & \text{if } 0.08\pi < \sqrt{\omega_1^2 + \omega_2^2} \le 0.12\pi, \\ 0, & \text{otherwise.} \end{cases} \tag{8}$$

Consequently, the desired amplitude response of the 2D filter satisfying (8) is depicted in Figure 1.
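For concreteness, a minimal Python sketch of the second-order magnitude response (7) and the desired response (8) evaluated on the frequency grid is given below; the function names and array layout are illustrative assumptions.

```python
import numpy as np

def magnitude_response(a, b, c, d, H0, w1, w2):
    """Magnitude response (7) of the second-order filter (4).
    a: 3x3 numerator coefficients (a[0, 0] fixed to 1);
    b, c, d: length-2 denominator coefficient arrays; H0: gain."""
    z1 = np.exp(-1j * w1)
    z2 = np.exp(-1j * w2)
    num = sum(a[i, k] * z1**i * z2**k for i in range(3) for k in range(3))
    den = np.ones_like(z1)
    for k in range(2):
        den = den * (1 + b[k] * z1 + c[k] * z2 + d[k] * z1 * z2)
    return np.abs(H0 * num / den)

def desired_response(w1, w2):
    """Desired circularly symmetric lowpass response (8)."""
    r = np.sqrt(w1**2 + w2**2)
    return np.where(r <= 0.08 * np.pi, 1.0,
                    np.where(r <= 0.12 * np.pi, 0.5, 0.0))
```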

3. An Overview of Particle Swarm Optimization

Particle swarm optimization (PSO) is a population based evolutionary algorithm which mimics biological mechanisms such as bird flocking or fish schooling [12]. During the evolutionary process, the swarm particles are attracted towards the location of the best fitness achieved by the particle itself (local best) and the location of the best fitness achieved by the whole population (global best). The position and velocity of the $i$th particle are represented as $X_i = (x_{i1}, x_{i2}, \ldots, x_{iD})$ and $V_i = (v_{i1}, v_{i2}, \ldots, v_{iD})$, respectively, where $D$ is the dimension of the search space. Particle velocities on each dimension are limited to $[-V_{\max}, V_{\max}]$, which controls the local and global exploration capability of a particle. The best previous position of each particle is denoted by $pbest_i$ and the position of the fittest particle in the group is $gbest$. Then, the new velocities and positions for the next evaluations are calculated by [12–14]

$$v_{id}^{k+1} = w\, v_{id}^{k} + c_1 r_1 \left(pbest_{id} - x_{id}^{k}\right) + c_2 r_2 \left(gbest_{d} - x_{id}^{k}\right), \tag{9}$$

$$x_{id}^{k+1} = x_{id}^{k} + v_{id}^{k+1}, \tag{10}$$

where $w$ is the inertia weight, $c_1$ and $c_2$ are the cognitive and social acceleration control parameters, and $r_1$ and $r_2$ are two random numbers uniformly distributed in $[0, 1]$. One important variant of standard PSO is the constriction factor based PSO (CPSO), which was proposed by Clerc and Kennedy [14]. The CPSO can generate higher-quality solutions than the standard PSO with inertia weight and it guarantees the convergence of the search procedure. In CPSO, the particle velocity is updated by the following equation:

$$v_{id}^{k+1} = \chi \left[ v_{id}^{k} + c_1 r_1 \left(pbest_{id} - x_{id}^{k}\right) + c_2 r_2 \left(gbest_{d} - x_{id}^{k}\right) \right], \tag{11}$$

where $\chi$ is called the constriction factor, given by $\chi = 2 / \left| 2 - \varphi - \sqrt{\varphi^2 - 4\varphi} \right|$, with $\varphi = c_1 + c_2$, $\varphi > 4$.
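The following minimal Python sketch illustrates one update of the form (9)-(10), with velocities clipped to $[-V_{\max}, V_{\max}]$; the function name pso_step, the default parameter values, and the vectorized layout are illustrative assumptions.

```python
import numpy as np

def pso_step(X, V, pbest, gbest, w, c1=2.0, c2=2.0, v_max=1.0):
    """One PSO iteration: velocity update (9) and position update (10).
    X, V, pbest: arrays of shape (swarm_size, D); gbest: array of shape (D,)."""
    r1 = np.random.rand(*X.shape)
    r2 = np.random.rand(*X.shape)
    V_new = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
    V_new = np.clip(V_new, -v_max, v_max)   # limit velocities to [-Vmax, Vmax]
    X_new = X + V_new
    return X_new, V_new
```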

It is well known that in the PSO process the inertia weight plays an important role in balancing the exploration-exploitation trade-off. In [15, 16], it has been demonstrated that chaotic inertia weight based PSO (PSO-CIW) [17] is the best strategy for accuracy, whereas PSO with random inertia weight (PSO-RANDIW) produces optimal solutions with better efficiency. Therefore, these two best performing inertia weight strategies, together with time varying inertia weight (PSO-TVIW), are considered here to compare the performance of the proposed SA-PSO. A summary of the different inertia weight strategies reported earlier is given in Table 1 along with the required constraints.
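For reference, a minimal sketch of the three inertia weight schedules mentioned above is given below; the TVIW bounds (0.9 to 0.4) and the RANDIW form (0.5 + rand()/2) follow Section 5 and commonly reported strategies, while the chaotic (logistic-map) variant and its parameters are assumptions to be checked against Table 1.

```python
import numpy as np

def w_tviw(k, k_max, w_max=0.9, w_min=0.4):
    """Time varying inertia weight: linearly decreasing from w_max to w_min."""
    return (w_max - w_min) * (k_max - k) / k_max + w_min

def w_randiw():
    """Random inertia weight, uniformly distributed in [0.5, 1.0]."""
    return 0.5 + np.random.rand() / 2.0

def w_ciw(k, k_max, z, w1=0.9, w2=0.4):
    """Chaotic inertia weight; returns (w, updated chaotic state z)."""
    z = 4.0 * z * (1.0 - z)                  # logistic map, z in (0, 1)
    w = (w1 - w2) * (k_max - k) / k_max + w2 * z
    return w, z
```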

4. The Proposed SA-PSO Algorithm

The main benefit of PSO is that it is a problem independent, stochastic search optimization method. Due to its stochastic nature, it reveals insufficient global searching ability at the end of a run, particularly on multimodal functions. Hence, to escape from local minima and increase the diversity of the particles, SA is integrated with PSO. Similar to PSO, SA is also a heuristic based random search global optimization method, proposed by Kirkpatrick et al. [18]. During the search process, SA accepts not only better candidate solutions but also, to a certain degree, worse solutions according to the Metropolis criterion [19], which can be mathematically represented by

$$P = \exp\!\left(-\frac{\Delta E}{T}\right), \tag{12}$$

where $\Delta E$ denotes the change in energy due to the perturbation of parameters and $T$ is the current temperature of the system. To apply (12), a random number $r \in (0, 1)$ is generated and it is checked whether $r < P$ or not. A suitable cooling schedule, based on adaptive simulated annealing (ASA), is introduced here to update the temperature of the system [20]. The annealing schedule for the $k$th iteration is written as follows:

$$T_k = T_0 \exp\!\left(-k^{Q/D}\right), \tag{13}$$

where $T_0$ is the starting temperature, $Q$ represents the quenching factor, and $D$ is the dimension of the search space. Finally, in our proposed method, a well constructed stopping condition is included to minimize the execution time and computational effort. The proposed algorithm terminates when either the minimum value of the objective function, the final temperature, or the maximum number of iterations is reached [21].
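A minimal Python sketch of the Metropolis acceptance test (12) and the ASA-style cooling schedule (13), as reconstructed above, is given below; the helper names and the convention of always accepting improving moves are illustrative assumptions.

```python
import math
import random

def metropolis_accept(delta_E, T):
    """Metropolis criterion (12): improvements (delta_E <= 0) are always
    accepted; worse solutions are accepted with probability exp(-delta_E/T)."""
    if delta_E <= 0:
        return True
    return random.random() < math.exp(-delta_E / T)

def asa_temperature(T0, k, Q, D):
    """ASA-style cooling schedule (13) for the k-th iteration."""
    return T0 * math.exp(-(k ** (Q / D)))
```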

The design of the 2D recursive filter starts by assuming a population of random solutions in the multidimensional search space. Here, each particle consists of 15 positional coordinates, which are represented by

$$x = \left(a_{01}, a_{02}, a_{10}, a_{11}, a_{12}, a_{20}, a_{21}, a_{22}, b_1, b_2, c_1, c_2, d_1, d_2, H_0\right). \tag{14}$$

Therefore, the 2D filter coefficients are represented as a vector $x$, with each entry selected in the interval [−3, 3]. Each entry of $x$ is optimized based on the flowchart presented in Figure 2, and the proposed design flow is given below; a compact sketch of the whole loop follows Step 10.

Step 1. Specify the desired filter specifications as given in (8). Assume the initial temperature is $T_0$ and the minimum temperature level is $T_{\min}$; initialize $p$ ($p = 2$, 4, or 8) and $N_1 = N_2 = 50$; and calculate $\omega_1$ and $\omega_2$ for $n_1, n_2 = 0, 1, \ldots, 50$, as used in (5), before the optimization starts.

Step 2. Initialize the swarm size, the minimum value of the objective function (min_E), a random population of coefficient vectors $x$, and the maximum number of evaluations.

Step 3. Set the iteration number $k = 1$, assign the best solution to $gbest$, and use (2) to compute the fitness of the best solution.

Step 4 (calculate fitness). Compute fitness values for each particle in the swarm.

Step 5 (update $pbest$). The particle best position $pbest$ is updated based on the Metropolis criterion (12) as follows: if the current fitness of the selected particle is less than the fitness of its $pbest$, then the current position of the particle is accepted as the new $pbest$ with probability 1; otherwise, the current particle is accepted as $pbest$ with probability $P = \exp(-\Delta E / T)$, where $\Delta E$ is the difference between the fitness of the current particle and the fitness of its $pbest$ and $T$ is the current temperature.

Step 6 (update $gbest$). Now assign the value of the best particle to $gbest$ and calculate the new velocities and positions of all particles by (9) and (10).

Step 7 (reduce temperature). Calculate the new temperature as specified in the cooling schedule (13). If $T < T_{\min}$, then terminate the search for the best solution; otherwise go to Step 8.

Step 8 (update velocities and positions). Calculate velocities using (9) and positions for each particle by (10).

Step 9. Update the iteration number $k = k + 1$ and check whether the maximum number of evaluations has been reached or the minimum value of the cost function has been achieved by any particle. If yes, then assign the best solution found so far to $gbest$ and stop; otherwise go back to Step 4.

Step 10. Design the 2D filter based on the coefficients of $gbest$ and compute the magnitude response.
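To summarize Steps 3–9, a compact Python sketch of the main loop is given below. It assumes the illustrative helpers sketched earlier (magnitude_response, desired_response, pso_step, metropolis_accept, asa_temperature, w_tviw) are in scope; the parameter defaults, bounds handling, and stopping thresholds are assumptions rather than the authors' exact implementation, and the stability conditions (3) would additionally have to be enforced, for example by re-sampling or penalizing infeasible particles.

```python
import numpy as np

def sa_pso_design(p=2, swarm_size=100, max_fe=40000, T0=100.0, T_min=1e-4,
                  Q=1.0, D=15, v_max=1.0, bounds=(-3.0, 3.0)):
    """Sketch of the SA-PSO design loop (Steps 3-9) for the second-order filter."""
    n1, n2 = np.meshgrid(np.arange(51), np.arange(51), indexing="ij")
    w1, w2 = np.pi * n1 / 50.0, np.pi * n2 / 50.0
    Md = desired_response(w1, w2)

    def fitness(x):                              # objective (2) for one vector (14)
        a = np.ones((3, 3))
        a.flat[1:] = x[:8]                       # a00 = 1, then a01 ... a22
        M = magnitude_response(a, x[8:10], x[10:12], x[12:14], x[14], w1, w2)
        return np.sum((M - Md) ** p)

    X = np.random.uniform(*bounds, size=(swarm_size, D))
    V = np.zeros((swarm_size, D))
    pbest = X.copy()
    f_pbest = np.array([fitness(x) for x in X])
    gbest = pbest[np.argmin(f_pbest)].copy()
    T, k, k_max = T0, 0, max_fe // swarm_size
    while k < k_max and T > T_min and f_pbest.min() > 1e-12:   # min_E threshold
        f = np.array([fitness(x) for x in X])                  # Step 4
        for i in range(swarm_size):                            # Step 5 (Metropolis)
            if metropolis_accept(f[i] - f_pbest[i], T):
                pbest[i], f_pbest[i] = X[i].copy(), f[i]
        gbest = pbest[np.argmin(f_pbest)].copy()               # Step 6
        T = asa_temperature(T0, k + 1, Q, D)                   # Step 7
        w = w_tviw(k, k_max)                                   # SA-PSO-TVIW variant
        X, V = pso_step(X, V, pbest, gbest, w, v_max=v_max)    # Step 8
        X = np.clip(X, *bounds)                                # keep within [-3, 3]
        k += 1                                                 # Step 9
    return gbest, f_pbest.min()
```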

5. Simulation Results and Discussion

In order to validate the performance of the proposed SA-PSO, we implemented the algorithm in MATLAB 2009 on a Genuine Intel(R) Core 2 Duo CPU E7300 @ 2.66 GHz with 2 GB RAM. All the associated algorithms are executed for 20 independent runs with 40,000 function evaluations (FE). Table 2 shows the choice of the required parameters for the GA and SA-PSO based methods. In the case of the proposed SA-PSO based methods, the initial population size is chosen as 100; it is observed that a further increase in population size leaves the solution quality unchanged. In SA-PSO-TVIW, the inertia weight is decreased from 0.9 to 0.4 to obtain the best possible solutions, whereas in SA-PSO-RANDIW it is varied randomly between 0.5 and 1. The starting temperature for annealing is chosen to be very high so that the algorithm can escape from local optima easily. The parameters of the cooling schedule are chosen to be very small to assure global convergence; otherwise the search can skip the true global solutions [20, 21].

Table 3 compares the experimental results of the proposed method with NN (2001), GA (2003), GENETICA (2006), TBIA (2008), QPSO-DGM (2010), ASA, and PSO-TVIW with respect to the best known solutions. The best results among the different methods are highlighted in bold, and it can be easily verified that the experimental results of the proposed methods, that is, SA-PSO-TVIW and SA-PSO-RANDIW, are better than those of all other reported methods [1–6]. For example, the percentage improvement in $J$ provided by the proposed SA-PSO-TVIW method in comparison with the earlier reported methods [1–4, 6] is 23.4%, 52%, 5.23%, 2.54%, and 23.32%, respectively.

Further, in Table 4, the best (best) and worst (worst) solutions obtained by the different methods are listed for different values of $p$, which clearly reveals that the best/worst solutions obtained by the proposed SA-PSO based algorithms are the best over all runs. Furthermore, Table 5 presents the statistics for the average (mean) and variance (VAR) of the optimal solutions for different values of $p$. From Table 5, it can be observed that the proposed SA-PSO algorithm exhibits better performance with less computational time (i.e., less than half the time required by GA or ASA) and produces optimal solutions with much lower variance. Hence, it can be implemented more efficiently in the real-time design of 2D filters.

Figures 3(a)–3(f) show the calculated amplitude response of the 2D filter for $p = 2$. The best results obtained by SA-PSO-TVIW and SA-PSO-RANDIW are shown in Figures 3(a) and 3(b), respectively. For comparison purposes, the results obtained by NN, GA, GENETICA, and TBIA are also included in Figures 3(c) to 3(f). A closer look at these figures reveals that the SA-PSO based methods, that is, SA-PSO-TVIW and SA-PSO-RANDIW, yield a better approximation to the desired response, and the stop-band ripple is much smaller than that of the other competitive methods [1–6]. Figures 4(a) and 4(b) show the magnitude response obtained using the SA-PSO based methods for $p = 4$, whereas Figures 5(a) and 5(b) represent the magnitude response for $p = 8$.

Figures 6(a) and 6(b) show the best convergence profiles of the SA-PSO based methods for $p = 2$ and 4, respectively. Among the 20 independent executions of the four algorithms (the two existing algorithms, GA and PSO-RANDIW, and the two proposed algorithms, SA-PSO-TVIW and SA-PSO-RANDIW), the best performing runs are plotted. Initially, for the first few thousand FEs, PSO-RANDIW converges faster than GA, SA-PSO-RANDIW, and SA-PSO-TVIW. After a certain number of iterations, both PSO-RANDIW and GA exhibit premature convergence and settle to near-optimal solutions. After 30,000 FEs, the curves of the SA-PSO based methods fall below those of both GA and PSO, because SA helps particles jump out of local optima from the beginning of the search. Consequently, at the end of the search procedure, the proposed hybrid method provides the best candidate solutions. Hence, the proposed hybrid algorithm produces better optimal solutions during the search process for 2D recursive filters.

6. Conclusions

In this paper, a novel hybrid evolutionary algorithm based on PSO and SA is proposed for finding the globally best design of second-order, two-dimensional recursive digital filters. The proposed hybrid method integrates the global search capability of PSO with SA to escape from local minima. The experimental results are compared with earlier reported algorithms, namely, NN, GA, GENETICA, and TBIA, in addition to different variants of PSO. The comparison indicates that the SA-PSO based methods exhibit better performance in all experiments and provide the best optimum solutions during the search. The proposed method also produces the best solution with lower mean and variance. Hence, the proposed approach could be an alternative for implementing two-dimensional digital filters in real-time applications. Higher order two-dimensional recursive filter design with guaranteed stability, along with further reduction in computational complexity, is left as future work.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.