Research Article  Open Access
Construction Example for Algebra System Using Harmony Search Algorithm
Abstract
Constructing an example of an algebra system verifies the existence of that algebra system, and it is an NP-hard problem. In this paper, to solve this kind of problem, a mathematical optimization model for the construction example of an algebra system is first established. Second, an improved harmony search algorithm based on the NGHS algorithm (INGHS) is proposed to find as many solutions of the optimization model as possible; in the proposed INGHS algorithm, a global best strategy and a dynamic parameter adjustment method are presented to balance exploration power and exploitation power during the search. Finally, nine construction examples of algebra systems are used to evaluate the optimization model and the performance of INGHS. The experimental results show that the proposed algorithm performs strongly on complex construction example problems of algebra systems.
1. Introduction
Algebra is one of the broad parts of mathematics, together with number theory, geometry, and analysis. It has wide applications in astronomy, biology, construction, computer science, and so on, and many researchers focus on its study. However, with the rapid development of science and technology, more and more complex algebra systems are emerging, so researchers in algebra face many difficult problems. One of them is to construct appropriate examples for an algebra system so as to prove its existence, because the computational workload of constructing examples that satisfy the operations of the algebra system is tremendous. In current studies of algebra systems, construction examples are usually built manually. However, for a multielement complex algebra system, constructing examples manually is very difficult. In order to better understand the construction of examples for an algebra system, we introduce the construction via an example of the N(2,2,0) algebra system.
We begin this section by introducing a simplified definition of the N(2,2,0) algebra system [1, 2].
Definition 1. An N(2,2,0) algebra is a system where is a nonempty set, 0 is a constant element of , and * and Ī are two binary operations on , obeying the following axioms for all : (F_{1}), (F_{2}), (F_{3}).
For Definition 1, there are two tasks for researchers: (1) give some examples to prove the existence of this algebra; (2) determine how many solutions satisfy the three conditions (F_{1}), (F_{2}), and (F_{3}) simultaneously.
Example 2. Let be a finite set of distinct elements. We define operations * and Ī by the following equations, respectively:
In (1) and (2), , .
Next we consider(F_{1}),(F_{2}),(F_{3}),where ; ; .
For this algebra system, researchers hope to construct some examples that satisfy conditions (F_{1}), (F_{2}), and (F_{3}) and thereby prove the existence of the algebra. However, the amount of computation involved is so large that the cost is typically prohibitive. As in Example 2, each variable can take different values; the combinatorial number with repetition over all variables is . This is clearly an NP-hard problem, and it leads to a combinatorial explosion as the dimension increases. So the enumeration method can hardly be used to construct an example for the algebra system ().
Therefore, for this kind of NP-hard problem, this paper proposes an intelligent metaheuristic algorithm to quickly obtain a set of and that satisfy conditions (F_{1}), (F_{2}), and (F_{3}).
2. The Optimization Model of Algebra System
The construction example of an algebra system can be regarded as an optimized constraint satisfaction problem (OCSP) subject to conditions (F_{1}), (F_{2}), and (F_{3}), which can then be solved by an intelligent optimization algorithm.
To simplify the solving process and improve practicability, we turn this problem into an unconstrained optimization problem. The optimization model is as follows: where for any , , and in ,
The functions , , and transform the constraint satisfaction problem into an unconstrained optimization problem by adding penalty factors. When , , and meet conditions (F_{1}), (F_{2}), and (F_{3}), respectively, the objective value of each function is 0; otherwise, a penalty factor of 1 is taken as the objective value.
For this problem, there are multiple solutions that meet conditions (F_{1}), (F_{2}), and (F_{3}), and researchers sometimes hope to obtain as many of them as possible. To handle this multisolution setting, we introduce a tube table (TT) to store the obtained solutions and avoid duplicate solutions. The details are as follows: where TT is the tube table and is defined as a penalty function. If is not in the tube table TT, equals 0; otherwise, is assigned a penalty value .
Under this condition, the objective function can be expressed as
It is clear that, for this optimization problem, the objective value of optimal solution is .
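Since the axioms (F_{1})–(F_{3}) and the exact symbols are not reproduced in this text, the objective function can only be sketched in outline. The following Python sketch (all names are illustrative assumptions, not the paper's notation) encodes the two operation tables of an n-element system as a flat vector of integers in 1..n, adds a penalty of 1 for every violated (axiom, triple) pair, and adds the tube-table penalty P(X) for solutions that have already been found:

```python
def fitness(x, n, axioms, tube_table, penalty=1000):
    """Objective value for a candidate solution x: a tuple of 2*n*n
    integers in 1..n encoding the two operation tables row by row.

    `axioms` is a list of placeholder predicates; each takes the two
    tables and a triple (a, b, c) and returns True when the axiom holds.
    Every violated (axiom, triple) pair contributes a penalty factor of
    1, and a solution already stored in `tube_table` receives a large
    extra penalty so the search is steered toward new solutions.
    """
    # Split the flat vector into the * table and the second operation table.
    star = [x[i * n:(i + 1) * n] for i in range(n)]
    circ = [x[n * n + i * n:n * n + (i + 1) * n] for i in range(n)]

    value = 0
    for axiom in axioms:
        for a in range(1, n + 1):
            for b in range(1, n + 1):
                for c in range(1, n + 1):
                    if not axiom(star, circ, a, b, c):
                        value += 1      # penalty factor 1 per violation

    if tuple(x) in tube_table:          # tube-table penalty P(X)
        value += penalty
    return value
```

A solution with objective value 0 satisfies every axiom on every triple and is not yet in the tube table.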
In order to solve this unconstrained optimization problem, we propose a new harmony search (HS) algorithm that substantially improves on the classical HS algorithm [4] and the NGHS method [9]. We call it improved NGHS (INGHS).
3. Classical Harmony Search (HS) and NGHS
Classical harmony search (HS) is a derivative-free metaheuristic algorithm [4, 5]. It mimics the improvisation process of music players, attaining an ideal harmony by adjusting the harmony memory (HM). HS shares features with the genetic algorithm (GA): it is good at identifying new regions of the search space in a reasonable time, but it has difficulty performing a local search in numerical applications [3, 6–8].
Several variants of HS have therefore been proposed to improve the performance of the HS algorithm [3, 6–14], such as the local-best harmony search algorithm with dynamic subpopulations (DLHS) [6], the self-adaptive global best harmony search algorithm (SGHS) [7], the intelligent tuned harmony search (ITHS) algorithm [8], and the exploratory power of harmony search (EHS) algorithm [3]. In addition, Zou et al. presented a novel global harmony search algorithm for unconstrained problems (NGHS) [9, 10]. The literature [11] surveys recent advances in the HS algorithm. Tuo and Yong proposed the HSTL method for large-scale optimization problems and presented an improved harmony search with chaos (HSCH) [12]; there are also other variants [13–15] of HS and applications [16–19] of HS.
Consider an optimization model as follows: where is the optimization objective function, is the solution vector consisting of decision variables , and is the feasible range of values for decision variable .
In the HS algorithm, represents a harmony and denotes a melody (component) of harmony . HS uses three rules (memory consideration, pitch adjustment, and randomization) to improve the harmony memory (HM), which is composed of HMS harmony vectors.
3.1. The Classical Harmony Search (HS) Algorithm
The steps in the procedure of classical harmony search algorithm are as follows.
Step 1 (initialize the harmony memory). The harmony memory (HM) consists of HMS harmony vectors. Each harmony vector is generated from a uniform distribution over the feasible space, as where and HMS denote the number of decision variables and the size of the harmony memory, respectively, and denotes a uniformly distributed random number between 0 and 1, as
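The initialization of Step 1 can be sketched as follows; the function and parameter names are illustrative, not taken from the paper:

```python
import random

def initialize_hm(hms, dim, lower, upper):
    """Step 1: fill the harmony memory with `hms` (HMS) random vectors.

    Each component j is drawn as lower[j] + r * (upper[j] - lower[j])
    with r ~ U(0, 1), i.e. uniformly over the feasible space.
    """
    return [[lower[j] + random.random() * (upper[j] - lower[j])
             for j in range(dim)]
            for _ in range(hms)]
```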
Step 2 (improvise a new harmony via three rules). There are three rules that can be used to improvise a new harmony vector .
(a) Memory Consideration. A decision variable value of the harmony vector will be adopted by choosing from the harmony memory with probability HMCR.
(b) Pitch Adjustment. With probability PAR, adjust a chosen component to an adjacent value of the corresponding decision variable of a harmony vector.
(c) Random Generation. Generate a component randomly in the feasible region with probability 1 - HMCR.
The improvisation procedure of a new harmony vector works as Algorithm 2.
A trial harmony vector is generated in Step 2. Next, in Step 3, it is decided whether the trial vector survives.
Step 3 (select operator). Find the worst harmony vector in the HM and replace it with the trial vector if the trial vector is better (see Algorithm 1).


Step 4 (check stopping criterion). If the stopping criterion (maximum number of function evaluations, maxFEs) is satisfied, the computation terminates; otherwise, Steps 2 and 3 are repeated.
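Steps 2 and 3 can be sketched as one iteration of the classical HS algorithm. This is a generic textbook rendering under assumed parameter names (HMCR, PAR, FW), not the paper's exact Algorithms 1 and 2:

```python
import random

def hs_step(hm, fitness, lower, upper, hmcr=0.9, par=0.3, fw=0.05):
    """One HS iteration: improvise a new harmony (Step 2) and replace
    the worst member of the HM if the new one is better (Step 3)."""
    dim = len(hm[0])
    new = [0.0] * dim
    for j in range(dim):
        if random.random() < hmcr:
            # (a) memory consideration: copy from a random stored harmony
            new[j] = hm[random.randrange(len(hm))][j]
            if random.random() < par:
                # (b) pitch adjustment: perturb by up to one fret width
                new[j] += fw * (2.0 * random.random() - 1.0)
        else:
            # (c) random generation with probability 1 - HMCR
            new[j] = lower[j] + random.random() * (upper[j] - lower[j])
        new[j] = min(max(new[j], lower[j]), upper[j])
    # Step 3: greedy selection against the worst harmony (minimization)
    worst = max(range(len(hm)), key=lambda i: fitness(hm[i]))
    if fitness(new) < fitness(hm[worst]):
        hm[worst] = new
    return hm
```

Because the worst member is replaced only by a better trial, the best objective value in the HM never worsens from one iteration to the next.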
3.2. The NGHS Algorithm
In the novel global harmony search (NGHS) algorithm [9], three significant parameters, the harmony memory considering rate (HMCR), fret width (FW), and pitch adjusting rate (PAR), are excluded, and a random selection rate is included instead. In Step 3, NGHS works as in Algorithm 3, where and , respectively, denote the indexes of the best and worst harmonies in the HM, and is a uniformly generated random number in . The parameter should be set to for the 0-1 knapsack problem and to for continuous optimization problems.
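Since Algorithm 3 itself is not reproduced in this text, the following sketch shows the NGHS improvisation as commonly described by Zou et al. [9]; the mutation rate `pm` stands in for the random selection rate, and the exact update may differ in detail from the paper's listing:

```python
import random

def nghs_improvise(hm, fitness, lower, upper, pm=0.05):
    """One NGHS step (sketch): each component is pulled from the worst
    harmony toward a point mirrored through the best harmony, and with
    probability `pm` (the random selection rate) is regenerated at
    random. The new harmony always replaces the worst one."""
    dim = len(hm[0])
    best = min(range(len(hm)), key=lambda i: fitness(hm[i]))
    worst = max(range(len(hm)), key=lambda i: fitness(hm[i]))
    new = [0.0] * dim
    for j in range(dim):
        # position updating: target 2*best - worst, clamped to the bounds
        step = 2.0 * hm[best][j] - hm[worst][j]
        step = min(max(step, lower[j]), upper[j])
        new[j] = hm[worst][j] + random.random() * (step - hm[worst][j])
        if random.random() < pm:
            # random selection: restart this component uniformly
            new[j] = lower[j] + random.random() * (upper[j] - lower[j])
    hm[worst] = new
    return hm
```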

3.3. The Proposed HS Method
In this section, we propose a new harmony search algorithm that improves on the classical HS algorithm and the NGHS method; we call it INGHS.
Since its origination, the HS algorithm has been applied to many practical optimization problems. However, for large-scale optimization problems, classical HS converges slowly and with low precision. This is because a new decision variable value can enter the harmony memory (HM) only through the pitch adjustment and randomization strategies; the memory consideration rule merely reuses existing decision variable values from the HM. Thus HS maintains strong exploration but weak exploitation, and in the later stage of the search it is characterized by slow convergence. Therefore, for solving large-scale optimization problems, the key is to balance global exploration performance and local exploitation ability.
Because the construction example of the algebra is a large-scale, high-dimensional optimization problem, to achieve satisfactory optimization performance when applying the HS algorithm to it, we adopt four optimization strategies and a dynamic parameter control method to balance global exploration power and local exploitation ability.
Balancing convergence and diversity is very important for improving search efficiency. In the classical HS algorithm, a new harmony is generated in Step 2; after the selection operation in Step 3, the population variance may increase or decrease. With a high population variance, diversity and exploration power increase while convergence and exploitation power decrease accordingly; conversely, with a low population variance, convergence and exploitation power increase and diversity and exploration power decrease [8]. So how to keep the balance between convergence and diversity is significant. The classical HS algorithm easily loses this balance in the later stage of evolution [3], because it improvises new harmonies from the HM with a high HMCR and performs only local adjustment with PAR, so the HM diversity decreases gradually from the early iterations to the last. A low HMCR, on the other hand, increases the probability (1 - HMCR) of random selection in the search space, which improves exploration power, but local search ability and exploitation accuracy cannot be improved by the pitch adjusting strategy alone.
To overcome these inherent weaknesses of HS, in this section we present the INGHS method to construct examples for the algebra. The INGHS algorithm works as in Algorithm 4 and Figure 1.

In Algorithm 4, to handle discrete optimization problems, the function Round() rounds each element of to the nearest integer.
(1) Novel Global (NG) Best Strategy. In the NGHS algorithm, the novel global best strategy is very effective for exploring the global best solution, so the proposed method adopts it (see Algorithm 5), where best and worst, respectively, denote the indexes of the best and worst harmonies in the HM.

(2) Dynamically Changed Parameters. To balance the exploration and exploitation power of the INGHS algorithm efficiently, the HMCR, PAR, FW, and NGP parameters are dynamically adapted within suitable ranges as the generations increase. Equation (11) shows the dynamic change of HMCR, PAR, and NGP, respectively (see [8]).
It can be seen that the parameter HMCR increases linearly from to , and the parameter NGP increases slowly in the early stage and sharply in the final stage. This is because, in the beginning, in order to explore the global optimal solution, the harmony consideration rule and the NG strategy are applied with a smaller probability, while in the later stage INGHS focuses on local exploitation, which requires applying the NG strategy and the harmony consideration rule with a high probability. In this way the algorithm gains more opportunities to reinforce global exploration by strengthening the disturbance in the early stage, and acquires high-precision solutions by intensifying the local search in the later stage. For the same reason, FW is decreased gradually to reduce the perturbation step size, and PAR is varied from 0.55 to 0.3 to reduce the probability of pitch adjustment.
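Because equation (11) is not reproduced in this text, the following sketch shows one plausible realization of the schedules just described; the exact functional forms and parameter bounds are assumptions, chosen only to match the stated qualitative behavior:

```python
import math

def dynamic_parameters(t, max_t,
                       hmcr_min=0.7, hmcr_max=0.99,
                       par_max=0.55, par_min=0.3,
                       fw_max=0.5, fw_min=1e-4,
                       ngp_min=0.1, ngp_max=0.9):
    """Parameter schedules at generation t of max_t (all forms assumed):
    HMCR grows linearly, PAR shrinks linearly from 0.55 to 0.3, FW
    decays exponentially, and NGP grows slowly at first and sharply
    near the end via a cubic curve."""
    r = t / max_t
    hmcr = hmcr_min + (hmcr_max - hmcr_min) * r
    par = par_max - (par_max - par_min) * r
    fw = fw_max * math.exp(math.log(fw_min / fw_max) * r)
    ngp = ngp_min + (ngp_max - ngp_min) * r ** 3
    return hmcr, par, fw, ngp
```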
3.4. The Construction Example for Algebra System with the Proposed HS Method
For an N(2,2,0) algebra system, there are multiple solutions that meet the conditions, and sometimes we need to obtain several of them. So we combine the proposed HS algorithm with the tube table technique to resolve the multisolution problem. Its flow chart is shown in Figure 2 and in Pseudocode 1.
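The multisolution flow of Figure 2 and Pseudocode 1 can be sketched as an outer loop around any single-run optimizer; here `solve_once` is a hypothetical stand-in for one INGHS run that sees the current tube table and returns its best vector and objective value:

```python
def find_solutions(solve_once, max_loops):
    """Multisolution driver (sketch): repeatedly run the optimizer and
    store each exact solution (objective value 0) in the tube table, so
    that the penalty P(X) steers later runs away from solutions that
    have already been found."""
    tube_table = set()
    for _ in range(max_loops):
        best, value = solve_once(tube_table)
        if value == 0:
            tube_table.add(tuple(best))
    return tube_table
```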

3.5. Effect of Parameters HMCR and NGP on INGHS Performance
In this section, we examine the effect of the parameters , , , and on the performance of the proposed INGHS algorithm. The experiments are carried out on the algebra system , where .
(1) Investigation of and (initialize , ). Let change from 0.1 to 0.6; then, for each , let range to 1, giving pairs of parameters (, ). For each pair (, ), we run the proposed algorithm (INGHS) repeatedly and record the number of solutions obtained. The result is shown in Figure 3.
(2) Investigation of and (initialize , ). Let range to 0.7 and let range to 1, respectively. For each pair of parameters (, ), the proposed INGHS algorithm is run repeatedly and the number of solutions obtained is recorded; it is shown in Figure 4.
It can be seen from Figure 3 that most solutions can be obtained when and . From Figure 4, we find that most solutions are found when and .
From the above, the proposed algorithm has better performance when it is set as , , , and .
4. Computational Experiments and Results
In this section, we test the proposed algorithm on a set of N(2,2,0) algebra system construction example problems.
4.1. The Proposed Algorithm for Solving the Construction Examples of Algebra System
Definition 1 is chosen as the optimization objective algebra system. For Definition 1, we, respectively, set the following:(1); (2); (3); (4); (5); (6); (7); (8);(9); .
Then we use the proposed algorithm to construct the examples for the algebra system on .
In order to handle the optimization problems easily, we replace the elements , , , , , , , , , and with 1, 2, 3, 4, 5, 6, 7, 8, 9, and 10, respectively. So, for each , the upper limit of the optimization variables is set to and the lower limit is set to 1.
4.2. The Parameters Setting for Algorithms
In the experimental test, we set the following parameter values: ; for the NGHS algorithm, , ; for the proposed algorithm (INGHS), , , , , , , , , , and .
When and , there are 3 and 16 solutions for the N(2,2,0) algebra system, respectively, so we set the maximum looping times to and 16, respectively. When and , we do not know how many solutions the algebra system has, so we set . However, as increases, there are many more solutions and the computer's running time becomes very long, so we set and 5000 when and .
4.3. The Experiment Results and Analysis
In order to evaluate the performance of the proposed INGHS algorithm, we compare its success rate in finding the solutions with those of the HS, HSTL, and NGHS algorithms under the same conditions. The results are shown in Table 1.

When , the solutions, as shown in (12a)–(12c), are obtained by the proposed algorithm, and when , the solutions obtained by the INGHS method are shown in (13a)–(13p). For the other instances , since many solutions exist, we present only one solution per instance, as shown in (14)–(20).
The first solution on is as follows: The second solution on is as follows: The third solution on is as follows:
The first solution on is as follows: The second solution on is as follows: The third solution on is as follows: The fourth solution on is as follows: The fifth solution on is as follows: The sixth solution on is as follows: The seventh solution on is as follows: The eighth solution on is as follows: The ninth solution on is as follows: The tenth solution on is as follows: The eleventh solution on is as follows: The twelfth solution on is as follows: The thirteenth solution on is as follows: The fourteenth solution on is as follows: The fifteenth solution on is as follows: The sixteenth solution on is as follows:
One of the solutions on () is as follows: One of the solutions on () is as follows: One of the solutions on () is as follows: One of the solutions on () is as follows: One of the solutions on () is as follows: One of the solutions on () is as follows: One of the solutions on () is as follows:
From Table 1, it can be seen clearly that the success rate of the proposed method is higher than those of HS, HSTL, and NGHS on all instances.
For the instances , and 10, Figures 5, 6, 7, 8, 9, and 10 show the convergence curves of the three algorithms (HS, NGHS, and INGHS). It is evident from Figures 5–10 that the proposed algorithm outperforms HS and NGHS on all instances: its convergence curve keeps falling until the best solution is found. When , the convergence curve of the INGHS method falls faster than those of the HS, HSTL, and NGHS algorithms and keeps declining actively until the best solution is found.
5. Conclusions
This paper has investigated nine construction example problems of algebra systems and converted them into optimization problems. A novel harmony search algorithm (INGHS), employing a global best strategy and dynamic parameter adjustment, is proposed to solve these problems. The experimental results on the 9 instances of the algebra system demonstrate that the proposed algorithm is more effective than the HS, HSTL, and NGHS algorithms. Future research will apply the INGHS algorithm to combinatorial optimization problems and practical optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
The authors would like to thank all the editors and referees for their valuable comments. This work was supported by the National Natural Science Foundation of China under Grant no. 11401357 and no. 81160183, the Scientific Research Program funded by Shaanxi Provincial Education Department under Grant no. 14JK1141, and Hanzhong Administration of Science & Technology under Grant no. 2013hzzx39.
References
[1] F. Deng and Y. Xu, "On the N(2,2,0) algebra," Journal of Southwest Jiao Tong University, vol. 31, no. 4, pp. 457–653, 1996.
[2] F. Deng and L. Yong, "Regular semigroups of an N(2,2,0) algebra," Journal of Advances in Mathematics, vol. 41, no. 6, pp. 665–671, 2012.
[3] S. Das, A. Mukhopadhyay, A. Roy, A. Abraham, and B. K. Panigrahi, "Exploratory power of the harmony search algorithm: analysis and improvements for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 41, no. 1, pp. 89–106, 2011.
[4] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.
[5] K. S. Lee and Z. W. Geem, "A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice," Computer Methods in Applied Mechanics and Engineering, vol. 194, no. 36–38, pp. 3902–3933, 2005.
[6] Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, "A local-best harmony search algorithm with dynamic subpopulations," Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.
[7] Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, "A self-adaptive global best harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.
[8] P. Yadav, R. Kumar, S. K. Panda, and C. S. Chang, "An intelligent tuned harmony search algorithm for optimisation," Information Sciences, vol. 196, pp. 47–72, 2012.
[9] D. Zou, L. Gao, J. Wu, and S. Li, "Novel global harmony search algorithm for unconstrained problems," Neurocomputing, vol. 73, no. 16–18, pp. 3308–3318, 2010.
[10] D. Zou, L. Gao, J. Wu, S. Li, and Y. Li, "A novel global harmony search algorithm for reliability problems," Computers & Industrial Engineering, vol. 58, no. 2, pp. 307–316, 2010.
[11] Z. W. Geem, Recent Advances in Harmony Search Algorithm, Springer, New York, NY, USA, 2010, https://sites.google.com/a/hydroteq.com/www/HS_Structure.pdf.
[12] S. Tuo and L. Yong, "Improved harmony search algorithm with chaos," Journal of Computational Information Systems, vol. 8, no. 10, pp. 4269–4276, 2012.
[13] M. G. H. Omran and M. Mahdavi, "Global-best harmony search," Applied Mathematics and Computation, vol. 198, no. 2, pp. 643–656, 2008.
[14] A. Kattan and R. Abdullah, "A dynamic self-adaptive harmony search algorithm for continuous optimization problems," Applied Mathematics and Computation, vol. 219, no. 16, pp. 8542–8567, 2013.
[15] L. Wang, R. Yang, Y. Xu, Q. Niu, P. M. Pardalos, and M. Fei, "An improved adaptive binary harmony search algorithm," Information Sciences, vol. 232, pp. 58–87, 2013.
[16] H. Sarvari and K. Zamanifar, "Improvement of harmony search algorithm by using statistical analysis," Artificial Intelligence Review, vol. 37, no. 3, pp. 181–215, 2012.
[17] M. Hadwan, M. Ayob, N. R. Sabar, and R. Qu, "A harmony search algorithm for nurse rostering problems," Information Sciences, vol. 233, pp. 126–140, 2013.
[18] N. Sinsuphan, U. Leeton, and T. Kulworawanichpong, "Optimal power flow solution using improved harmony search method," Applied Soft Computing, vol. 13, no. 5, pp. 2364–2374, 2013.
[19] H. Wang, X. Yuan, Y. Wang, and Y. Yang, "Harmony search algorithm-based fuzzy-PID controller for electronic throttle valve," Neural Computing and Applications, vol. 22, no. 2, pp. 329–336, 2013.
Copyright
Copyright © 2015 Fang-An Deng et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.