Mathematical Problems in Engineering

Special Issue

Mathematical Modeling and Analysis of Soft Computing


Research Article | Open Access

Volume 2015 | Article ID 836925 | 15 pages | https://doi.org/10.1155/2015/836925

Construction Example for Algebra System Using Harmony Search Algorithm

Academic Editor: Shifei Ding
Received: 24 Jun 2014
Accepted: 15 Sep 2014
Published: 19 Jan 2015

Abstract

Constructing an example of an algebra system is a way to verify the existence of a complex algebra system, and it is an NP-hard problem. In this paper, to solve this kind of problem, firstly, a mathematical optimization model for the construction example of an algebra system is established. Secondly, an improved harmony search algorithm based on the NGHS algorithm (INGHS) is proposed to find as many solutions as possible for the optimization model; in the proposed INGHS algorithm, to achieve a balance between exploration power and exploitation power in the search process, a global best strategy and a dynamic parameter adjustment method are presented. Finally, nine construction examples of algebra systems are used to evaluate the optimization model and the performance of INGHS. The experimental results show that the proposed algorithm has strong performance for solving complex construction example problems of algebra systems.

1. Introduction

Algebra is one of the broad parts of mathematics, together with number theory, geometry, and analysis. It has wide applications in astronomy, biology, construction, computer science, and so on, and many researchers focus on its study. However, with the rapid development of science and technology, more and more complex algebra systems emerge in large numbers, so researchers of algebra systems meet many difficult problems. One of them is to construct appropriate examples of an algebra system so as to prove its existence, because the computing workload of constructing examples that satisfy the operations of the algebra system is tremendous. In current studies of algebra systems, construction examples are often produced manually. However, for a multielement complex algebra system, constructing examples manually is very difficult. To understand the construction example of an algebra system better, we introduce its construction by an example of the N(2,2,0) algebra system.

We begin this section by introducing a simplified definition of the N(2,2,0) algebra system [1, 2].

Definition 1. An N(2,2,0) algebra is a system in which the carrier is a nonempty set, 0 is a constant element of it, and * and Δ are two binary operations on it, obeying the following axioms (F1), (F2), and (F3) for all elements.
For Definition 1, there are two tasks for researchers:
(1) give some examples to prove the existence of this algebra;
(2) determine how many solutions satisfy the three conditions (F1), (F2), and (F3) simultaneously.

Example 2. Let be a finite set of distinct elements. We define operations * and Δ by the following equations, respectively:
In (1) and (2), , .
Next we consider (F1), (F2), and (F3), where ; ; .

For this algebra system, researchers hope to construct some examples that satisfy the conditions (F1), (F2), and (F3) and thereby prove the existence of the algebra. However, the amount of calculation in this work is so large that the computing cost is typically high. As in Example 2, each variable can take different values, and the number of combinations with repetition over all entries of the two operation tables grows exponentially. Obviously it is an NP-hard problem, and a combinatorial explosion results as the dimension increases. So the enumeration method can hardly be used to construct an example of the algebra system.
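To make the scale concrete, the following sketch computes the size of the brute-force search space for small set sizes, assuming (as the example above describes) two n-by-n operation tables whose cells each take one of the n elements; the closed-form exponent is an illustrative reading of the elided formula in the text.

```python
# Size of the search space for constructing an algebra example on an
# n-element set: each of the two operation tables (* and Delta) has n*n
# cells, and every cell can hold any of the n elements, giving
# n ** (2 * n * n) candidate table pairs. The exponent is an assumption
# consistent with the description in the text, not a formula quoted
# from the paper.

def search_space_size(n: int) -> int:
    """Number of candidate (*, Delta) table pairs on an n-element set."""
    return n ** (2 * n * n)

for n in range(2, 6):
    print(n, search_space_size(n))
```

Even at n = 3 there are 3^18 (about 387 million) candidate table pairs, which is why exhaustive enumeration quickly becomes infeasible.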

Therefore, for this kind of NP-hard problem, this paper proposes an intelligent metaheuristic algorithm to quickly obtain solutions that satisfy conditions (F1), (F2), and (F3).

2. The Optimization Model of Algebra System

The construction example of an algebra system can be considered as an optimized constraint satisfaction problem (OCSP) that satisfies conditions (F1), (F2), and (F3). It can then be solved by an intelligent optimization algorithm.

To simplify the solving process and improve its practicability, we turn this problem into a nonconstrained optimization problem. The optimization model is as follows, where for any , , and in ,

The functions , , and transform the constraint satisfaction problem into a nonconstrained optimization problem by adding penalty factors. When , , and meet conditions (F1), (F2), and (F3), respectively, the objective value of each function is 0; otherwise, a penalty factor of 1 is taken as the objective value.

For this problem, there are multiple solutions that meet conditions (F1), (F2), and (F3), and researchers sometimes hope to get as many solutions as possible. To resolve this multisolution problem, we introduce a tube table (TT) to store the obtained solutions and to avoid duplicate solutions. The definition is as follows, where TT is the tube table and is defined as a penalty function: if is not in the tube table TT, is equal to 0; otherwise, is assigned a penalty value .

Under this condition, the objective function can be expressed as

It is clear that, for this optimization problem, the objective value of optimal solution is .
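Since the concrete axioms (F1)-(F3) are problem-specific, the penalty objective can be sketched with the axioms passed in as predicates. In this hedged sketch the names `axioms` and `W` are illustrative, not from the paper; the structure (penalty 1 per violated axiom instance, plus a tube-table penalty for already-found solutions) follows the model described above.

```python
from itertools import product

def objective(tables, axioms, n, TT, W=100):
    """Penalty-based objective for an n-element algebra candidate.

    tables: hashable encoding of the (*, Delta) operation tables.
    axioms: predicates taking (tables, x, y, z), True when satisfied.
    TT:     tube table of previously found solutions.
    W:      extra penalty for a solution already stored in TT (assumed
            value; the paper leaves the penalty symbol unspecified).
    """
    penalty = 0
    for x, y, z in product(range(n), repeat=3):
        for axiom in axioms:
            if not axiom(tables, x, y, z):
                penalty += 1        # penalty factor 1 per violation
    if tables in TT:                # tube-table penalty: already found
        penalty += W
    return penalty                  # fresh exact solutions score 0
```

A candidate scores 0 exactly when it satisfies every axiom instance and is not yet in the tube table, which is the optimality condition stated above.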

In order to solve the nonconstrained optimization problem, we propose a new harmony search (HS) algorithm that substantially improves on the classical HS algorithm [4] and the NGHS method [9]. We call it improved NGHS (INGHS).

3. Classical Harmony Search (HS) and NGHS

Classical harmony search (HS) is a derivative-free metaheuristic algorithm [4, 5]. It mimics the improvisation process of music players and attains the ideal harmony by adjusting the harmony memory (HM). The HS algorithm shares features with the genetic algorithm (GA). It is good at identifying new regions of the search space in a reasonable time; however, it has difficulty performing a local search for numerical applications [3, 6–8].

So several variants of HS have been proposed to improve the performance of the HS algorithm [3, 6–14], such as the local-best harmony search algorithm with dynamic subpopulations (DLHS) [6], the self-adaptive global best harmony search algorithm (SGHS) [7], the intelligent tuned harmony search (ITHS) algorithm [8], and the exploratory power of the harmony search (EHS) algorithm [3]. In addition, Zou et al. presented a novel global harmony search algorithm for unconstrained problems (NGHS) [9, 10]. The literature [11] surveys recent advances in the HS algorithm. Tuo and Yong proposed the HSTL method for large-scale optimization problems and presented an improved harmony search with chaos (HSCH) [12]; there are also other variants of HS [13–15] and applications of HS [16–19].

Consider an optimization model as follows: where is the optimization objective function, is the solution vector that consists of decision variables , and is feasible range of values for decision variable .

In the HS algorithm, represents the harmony and denotes the melody of harmony . HS uses three rules (memory consideration, pitch adjustment, and randomization) to optimize the harmony memory (HM), which is composed of HMS harmony vectors.

3.1. The Classical Harmony Search (HS) Algorithm

The steps in the procedure of classical harmony search algorithm are as follows.

Step 1 (initialize the harmony memory). The harmony memory (HM) consists of HMS harmony vectors. Each harmony vector is generated from a uniform distribution in the feasible space, as where and HMS represent the number of decision variables and the size of the harmony memory, respectively, and denotes a uniformly distributed random number between 0 and 1, as
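Step 1 can be sketched directly: each of the HMS harmony vectors is filled with uniform random values inside the per-variable bounds. The bound arrays `L` and `U` are generic names for the feasible ranges mentioned above.

```python
import random

# Step 1 of classical HS: initialize the harmony memory with HMS
# uniformly random vectors. L[i] and U[i] delimit the feasible range
# of decision variable i.

def init_harmony_memory(HMS, L, U):
    D = len(L)  # number of decision variables
    return [[L[i] + random.random() * (U[i] - L[i]) for i in range(D)]
            for _ in range(HMS)]
```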

Step 2 (improvise a new harmony via three rules). There are three rules that can be used to improvise a new harmony vector .
(a) Memory Consideration. A decision variable value of the harmony vector will be adopted by choosing from the harmony memory with probability HMCR.
(b) Pitch Adjustment. Get a component randomly from an adjacent value of one decision variable of a harmony vector with probability PAR.
(c) Random Generation. Generate a component randomly in the feasible region with probability 1-HMCR.
The improvisation procedure of a new harmony vector works as Algorithm 2.
A trial harmony vector is generated in Step 2. Next, in Step 3, whether it survives is decided.

Step 3 (select operator). Get the worst harmony vector from the HM (see Algorithm 1).

If fitness is better than fitness   then
  ; ( is the worst harmony vector in HM).
EndIf

For   to   do
  If  rand(0, 1) < HMCR then
    
    If  rand(0, 1) < PAR then
      ; //FW: fretwidth
      
    Endif
  Else
   
  Endif
End For
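The improvisation and selection pseudocode above can be sketched in Python as follows; HMCR, PAR, and FW are the standard HS parameters, and the fitness function `f` is assumed to be minimized.

```python
import random

# Step 2 of classical HS: improvise one new harmony from the memory HM.
# HMCR: probability of taking a value from memory; PAR: probability of
# pitch-adjusting that value by up to +/- FW (the fret width).

def improvise(HM, L, U, HMCR=0.9, PAR=0.3, FW=0.01):
    D = len(L)
    new = [0.0] * D
    for i in range(D):
        if random.random() < HMCR:
            new[i] = random.choice(HM)[i]          # memory consideration
            if random.random() < PAR:              # pitch adjustment
                new[i] += (2 * random.random() - 1) * FW
                new[i] = min(max(new[i], L[i]), U[i])
        else:                                      # random generation
            new[i] = L[i] + random.random() * (U[i] - L[i])
    return new

# Step 3 of classical HS: the trial harmony replaces the worst member
# of HM only if it has a better (smaller) fitness value.

def select(HM, new, f):
    worst = max(range(len(HM)), key=lambda k: f(HM[k]))
    if f(new) < f(HM[worst]):
        HM[worst] = new
```

Repeating `improvise` and `select` until the maximum number of function evaluations (maxFEs) is reached reproduces Steps 2-4 of the procedure above.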

Step 4 (check stopping criterion). If the stopping criterion (maximum function evaluation times: maxFEs) is satisfied, computation is terminated. Otherwise, Steps 2 and 3 are repeated.

3.2. The NGHS Algorithm

In the novel global harmony search (NGHS) algorithm [9], three significant parameters, the harmony memory considering rate (HMCR), fret width (FW), and pitch adjusting rate (PAR), are excluded, and a random selection rate is included. In Step 3, NGHS works as Algorithm 3, where and , respectively, represent the indexes of the best harmony and the worst harmony in HM; is a uniformly generated random number in , and the parameter should be set to for the 0-1 knapsack problem and to for continuous optimization problems.

For   to    do
  
  
  .
   If  %random mutation
   
  EndIf
EndFor
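The NGHS position-updating rule of Algorithm 3 can be sketched as below, following Zou et al.'s description: each component moves from the worst harmony toward a point mirrored through the best harmony, and with probability `pm` (the random selection rate) it is replaced by a fresh random value. The fitness function is assumed to be minimized.

```python
import random

# NGHS improvisation (after Zou et al.): no HMCR, PAR, or FW; instead
# an adaptive step toward the best harmony plus a random mutation.

def nghs_improvise(HM, L, U, fitness, pm=0.05):
    D = len(L)
    best = min(HM, key=fitness)                    # best harmony in HM
    worst = max(HM, key=fitness)                   # worst harmony in HM
    new = [0.0] * D
    for i in range(D):
        step = 2.0 * best[i] - worst[i]            # mirrored step
        step = min(max(step, L[i]), U[i])          # keep inside bounds
        new[i] = worst[i] + random.random() * (step - worst[i])
        if random.random() < pm:                   # random mutation
            new[i] = L[i] + random.random() * (U[i] - L[i])
    return new
```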

3.3. The Proposed HS Method

In this section, we propose a new harmony search algorithm improved on the basis of the classical HS algorithm and the NGHS method, so we call the proposed algorithm INGHS.

Since its origination, the HS algorithm has been applied to many practical optimization problems. However, for large-scale optimization problems, classical HS has slow convergence and low precision. This is because a new decision variable value can be generated only by the pitch adjustment and randomization strategies during the search procedure; the memory consideration rule only reuses existing decision variable values from the harmony memory (HM). Thus HS maintains strong exploration performance but not good exploitation performance, and in the later stage of the search it is characterized by slow convergence. Therefore, for solving large-scale optimization problems, the key is to balance the global exploration performance and the local exploitation ability.

Because the construction example of the algebra is a large-scale, high-dimensional optimization problem, to achieve the most satisfactory optimization performance when applying the HS algorithm to a given problem, we adopt four optimization strategies and a dynamic parameter control method to balance the global exploration power and the local exploitation ability.

Balancing convergence and diversity is very important for improving the efficiency of the search. In the classical HS algorithm, a new harmony is generated in Step 2. After the selection operation in Step 3, the population variance may increase or decrease. With a high population variance, the diversity and exploration power increase, and at the same time the convergence and the exploitation power decrease accordingly. Conversely, with a low population variance, the convergence and the exploitation power increase and the diversity and the exploration power decrease [8]. So how to keep the balance between convergence and diversity is significant. The classical HS algorithm easily loses its search ability in the later stage of evolution [3], because it improvises a new harmony from HM with a high HMCR and performs local adjustment with PAR, and HM diversity decreases gradually from the early iterations to the last. On the other hand, employing a low HMCR increases the probability (1-HMCR) of random selection in the search space; the exploration power improves, but the local search ability and the exploitation accuracy cannot be improved by the single pitch adjusting strategy.

To overcome the inherent weaknesses of HS, in this section, we present INGHS method to construct example for algebra. The INGHS algorithm works as Algorithm 4 and Figure 1.

  %%% worst is the index of the worst harmony in HM
For to D
  If rand(0, 1)  <  
    % Random playing: randomly select any pitch within bounds
    
  Else
    If rand(0, 1)  <  HMCR
      % Memory considering: randomly select a note stored in HM
      
      If rand(0, 1)  <  PAR
       % Pitch adjusting: randomly adjust the pitch slightly
       
      EndIf
    ElseIf rand(0, 1)  <  NGP %
       
       
    EndIf
  EndIf
EndFor% Finished improvising a new harmony

In Algorithm 4, to resolve the discrete optimization problems, the function Round() is used to round each element of to the nearest integer.
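Algorithm 4 can be sketched as below. The probability of the first "random playing" branch is not legible in the source, so it appears here as the assumed parameter `p_rand`; the remaining branches follow the pseudocode (memory consideration with pitch adjustment, otherwise the NG best strategy with probability NGP), and components not set by any branch keep the value copied from a random memory vector, which is also an assumption. Round() makes the harmony discrete, as required for the algebra tables.

```python
import random

def ingh_improvise(HM, L, U, fitness, p_rand, HMCR, PAR, FW, NGP):
    D = len(L)
    best = min(HM, key=fitness)                    # best harmony in HM
    worst = max(HM, key=fitness)                   # worst harmony in HM
    new = list(random.choice(HM))                  # assumed default values
    for i in range(D):
        if random.random() < p_rand:               # random playing
            new[i] = random.uniform(L[i], U[i])
        elif random.random() < HMCR:               # memory considering
            new[i] = random.choice(HM)[i]
            if random.random() < PAR:              # pitch adjusting
                new[i] += (2 * random.random() - 1) * FW
        elif random.random() < NGP:                # NG best strategy
            step = min(max(2 * best[i] - worst[i], L[i]), U[i])
            new[i] = worst[i] + random.random() * (step - worst[i])
        new[i] = min(max(new[i], L[i]), U[i])      # clamp to bounds
    return [round(v) for v in new]                 # discrete variables
```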

(1) Novel Global (NG) Best Strategy. In NGHS algorithm, the novel global best strategy is very effective to explore the global best solution. So, in the proposed method, the novel global best strategy is adopted (see Algorithm 5), where the best and worst, respectively, represent indexes of the best harmony and the worst harmony in HM.

If rand(0, 1) < NGP (NGP is the rate of choosing the novel global best strategy)
  
  
End

(2) Parameters Dynamically Changed. To balance the exploration and exploitation power of the INGHS algorithm efficiently, the HMCR, PAR, FW, and NGP parameters are dynamically adapted within suitable ranges as the generations increase. Equation (11) shows the dynamic change of HMCR, PAR, and NGP, respectively (see [8]).

It can be seen that the parameter HMCR increases gradually and linearly from to , while the parameter NGP increases slowly in the early stage and sharply in the final stage. This is because, in the beginning, in order to explore the global optimal solution, the harmony consideration rule and the NG strategy are carried out with a smaller probability, whereas in the later stage INGHS begins to focus on local exploitation, which needs a high probability of employing the NG strategy and the harmony consideration rule. The benefit is that the algorithm gets more opportunities to reinforce global exploration by strengthening the disturbance in the early stage and can acquire high-precision solutions by carrying out a local intensification search in the later stage. For the same reason, FW is decreased gradually to reduce the perturbation step size step by step, and PAR is varied from 0.55 to 0.3 to reduce the probability of pitch adjustment.
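Since the exact schedules of equation (11) are not reproduced above, the sketch below uses assumed formulas that match the described behavior: HMCR grows linearly, PAR falls linearly from 0.55 to 0.3, FW decays toward a small final width, and NGP grows slowly at first and sharply near the end (modeled here with a cubic ramp). All endpoint values except PAR's are placeholders.

```python
def schedule(t, T, HMCR_min=0.6, HMCR_max=0.9,
             PAR_max=0.55, PAR_min=0.3,
             FW_max=1.0, FW_min=0.01,
             NGP_min=0.1, NGP_max=0.8):
    """Return (HMCR, PAR, FW, NGP) at generation t of T."""
    r = t / T                                      # progress in [0, 1]
    HMCR = HMCR_min + (HMCR_max - HMCR_min) * r    # linear increase
    PAR = PAR_max - (PAR_max - PAR_min) * r        # linear decrease
    FW = FW_max - (FW_max - FW_min) * r            # shrinking step size
    NGP = NGP_min + (NGP_max - NGP_min) * r ** 3   # slow early, sharp late
    return HMCR, PAR, FW, NGP
```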

3.4. The Construction Example for Algebra System with the Proposed HS Method

For an N(2,2,0) algebra system, there are multiple solutions that meet the conditions, and sometimes we need to obtain several of them. So we adopt the proposed HS algorithm together with the tube table to resolve the multisolution problem. The flow chart is shown in Figure 2 and in Pseudocode 1.

Set TT as empty
For   to Max Loop times (ML)
  Run the proposed HS algorithm to get a best solution ;
  If the objective function value
    Put into tube table TT.
  EndIf
EndFor
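Pseudocode 1 can be sketched as the driver loop below: the optimizer is run repeatedly, and every exact solution (objective value 0) is stored in the tube table TT so that the tube-table penalty in the objective steers later runs toward unseen solutions. Here `optimize` stands in for one full INGHS run and `objective` for the penalized objective function; both names are illustrative.

```python
def collect_solutions(optimize, objective, max_loops):
    """Run the optimizer max_loops times, collecting distinct exact
    solutions in the tube table TT.

    optimize(TT)        -> best (hashable) solution found in one run
    objective(best, TT) -> penalized objective value of that solution
    """
    TT = set()                         # tube table of found solutions
    for _ in range(max_loops):
        best = optimize(TT)            # one full INGHS run
        if objective(best, TT) == 0:   # exact, previously unseen example
            TT.add(best)
    return TT
```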

3.5. Effect of Parameters HMCR and NGP on INGHS Performance

In this section, we determine the effect of parameters , , , and on the performance of the proposed algorithm INGHS. The experiments are investigated for algebra system , where .

(1) Investigation for and (initialize , ). Let change from 0.1 to 0.6; then, for each , let to 1. So there are pairs of parameters (, ). For each pair of parameters (, ), we execute the proposed algorithm (INGHS)    times and then record the number of solutions obtained. The result is shown in Figure 3.

(2) Investigation for and (initialize , ). Let to 0.7 and let to 1, respectively. For each pair of parameters (, ), the proposed algorithm INGHS is performed    times and then we record the number of solutions obtained. The number of solutions obtained is shown in Figure 4.

It can be seen from Figure 3 that most solutions can be obtained when and . From Figure 4, we find that most solutions are found when and .

From the above, the proposed algorithm has better performance when it is set as , , , and .

4. Computational Experiments and Results

In this section, we test the proposed algorithm on a set of N(2,2,0) algebra system construction example problems.

4.1. The Proposed Algorithm for Solving the Construction Examples of Algebra System

Definition 1 is chosen as the optimization objective algebra system. For Definition 1, we, respectively, set the following:(1); (2); (3); (4); (5); (6); (7); (8);(9); .

Then we use the proposed algorithm to construct the examples for the algebra system on .

In order to deal with the optimization problems easily, we replace the elements , , , , , , , , , and with 1, 2, 3, 4, 5, 6, 7, 8, 9, and 10, respectively. So, for each , the upper limit of variables about optimization problems is set as and the lower limit is set as 1.

4.2. The Parameters Setting for Algorithms

In the experiment test, we set the following parameter values: ; for NGHS algorithm, , ; for the proposed algorithm (INGHS), , ,, , , , , , , and .

When and , there are 3 and 16 solutions for the N(2,2,0) algebra system, respectively, so we set the maximum looping times and 16, respectively. When and , we do not know how many solutions there are for the algebra system, so we set . However, as increases, there are many more solutions and the running time becomes very long, so we set and 5000 when and .

4.3. The Experiment Results and Analysis

In order to evaluate the performance of the proposed algorithm INGHS, we compared its success rate of finding the solutions with HS, HSTL, and NGHS algorithms in the same conditions. The results are shown in Table 1.


Table 1: Success rates of HS, HSTL, NGHS, and INGHS. Each algorithm cell lists max looping times (ML) / success times / success rate.

n  | Number of solutions | HS               | HSTL             | NGHS             | INGHS
2  | 3       | 3/3/100.00%      | 3/3/100.00%      | 3/3/100.00%      | 3/3/100.00%
3  | 16      | 16/15/93.75%     | 16/16/100.00%    | 16/16/100.00%    | 16/16/100.00%
4  | Unknown | 256/127/49.61%   | 256/148/57.81%   | 256/127/49.61%   | 256/162/63.28%
5  | Unknown | 625/485/77.60%   | 625/600/96.00%   | 625/579/92.64%   | 625/601/96.16%
6  | Unknown | 1296/1016/78.40% | 1296/1258/97.07% | 1296/1243/95.91% | 1296/1271/98.07%
7  | Unknown | 2401/1834/76.38% | 2401/2212/92.13% | 2401/2119/88.25% | 2401/2217/92.34%
8  | Unknown | 4096/79/1.93%    | 4096/3364/82.13% | 4096/3174/77.49% | 4096/3383/82.59%
9  | Unknown | 8000/0/0.00%     | 8000/5763/72.04% | 8000/4768/59.60% | 8000/6151/76.89%
10 | Unknown | 5000/0/0.00%     | 5000/3451/69.02% | 5000/3251/65.02% | 5000/3749/74.98%

When , the solutions obtained by the proposed algorithm are shown in (12a)–(12c), and when , the solutions obtained by the INGHS method are shown in (13a)–(13p). For the other instances, due to the existence of many solutions, we present only one solution for each instance, as shown in (14)–(20).

The first solution on is as follows: The second solution on is as follows: The third solution on is as follows:

The first solution on is as follows: The second solution on is as follows: The third solution on is as follows: The fourth solution on is as follows: The fifth solution on is as follows: The sixth solution on is as follows: The seventh solution on is as follows: The eighth solution on is as follows: The ninth solution on is as follows: The tenth solution on is as follows: The eleventh solution on is as follows: The twelfth solution on is as follows: The thirteenth solution on is as follows: The fourteenth solution on is as follows: The fifteenth solution on is as follows: The sixteenth solution on is as follows:

One of the solutions on () is as follows: One of the solutions on () is as follows: One of the solutions on () is as follows: One of the solutions on () is as follows: One of the solutions on () is as follows: One of the solutions on () is as follows: One of the solutions on () is as follows:

From Table 1, it can be seen clearly that the success rate of the proposed method is higher than that of HS, HSTL, and NGHS on all instances.

Figures 5, 6, 7, 8, 9, and 10 show the convergence curves of the algorithms (HS, NGHS, and INGHS) on six instances. It is evident from Figures 5–10 that the proposed algorithm is better than HS and NGHS for all instances: its convergence curve keeps falling until it finds the best solution. When , the convergence curve of the INGHS method falls faster than those of the HS, HSTL, and NGHS algorithms, and it keeps declining actively until the best solution is found.

5. Conclusions

This paper has investigated nine construction example problems of algebra systems and converted them into optimization problems. A novel harmony search algorithm (INGHS) is proposed to solve these problems; a global best strategy and dynamic parameter adjustment are employed in INGHS. The experimental results on nine instances of the algebra system demonstrate that the proposed algorithm is more effective than the HS, HSTL, and NGHS algorithms. Further research will apply the INGHS algorithm to combinatorial optimization problems and practical optimization problems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors would like to thank all the editors and referees for their valuable comments. This work was supported by the National Natural Science Foundation of China under Grant no. 11401357 and no. 81160183, the Scientific Research Program funded by Shaanxi Provincial Education Department under Grant no. 14JK1141, and Hanzhong Administration of Science & Technology under Grant no. 2013hzzx-39.

References

  1. F. Deng and Y. Xu, “On the N(2,2,0) algebra,” Journal of Southwest Jiao Tong University, vol. 31, no. 4, pp. 457–653, 1996.
  2. F. Deng and L. Yong, “Regular semigroups of an N(2,2,0) algebra,” Journal of Advances in Mathematics, vol. 41, no. 6, pp. 665–671, 2012.
  3. S. Das, A. Mukhopadhyay, A. Roy, A. Abraham, and B. K. Panigrahi, “Exploratory power of the harmony search algorithm: analysis and improvements for global numerical optimization,” IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 41, no. 1, pp. 89–106, 2011.
  4. Z. W. Geem, J. H. Kim, and G. V. Loganathan, “A new heuristic optimization algorithm: harmony search,” Simulation, vol. 76, no. 2, pp. 60–68, 2001.
  5. K. S. Lee and Z. W. Geem, “A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice,” Computer Methods in Applied Mechanics and Engineering, vol. 194, no. 36–38, pp. 3902–3933, 2005.
  6. Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, “A local-best harmony search algorithm with dynamic subpopulations,” Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.
  7. Q.-K. Pan, P. N. Suganthan, M. F. Tasgetiren, and J. J. Liang, “A self-adaptive global best harmony search algorithm for continuous optimization problems,” Applied Mathematics and Computation, vol. 216, no. 3, pp. 830–848, 2010.
  8. P. Yadav, R. Kumar, S. K. Panda, and C. S. Chang, “An intelligent tuned harmony search algorithm for optimisation,” Information Sciences, vol. 196, pp. 47–72, 2012.
  9. D. Zou, L. Gao, J. Wu, and S. Li, “Novel global harmony search algorithm for unconstrained problems,” Neurocomputing, vol. 73, no. 16–18, pp. 3308–3318, 2010.
  10. D. Zou, L. Gao, J. Wu, S. Li, and Y. Li, “A novel global harmony search algorithm for reliability problems,” Computers & Industrial Engineering, vol. 58, no. 2, pp. 307–316, 2010.
  11. Z. W. Geem, Recent Advances in Harmony Search Algorithm, Springer, New York, NY, USA, 2010, https://sites.google.com/a/hydroteq.com/www/HS_Structure.pdf.
  12. S. Tuo and L. Yong, “Improved harmony search algorithm with chaos,” Journal of Computational Information Systems, vol. 8, no. 10, pp. 4269–4276, 2012.
  13. M. G. H. Omran and M. Mahdavi, “Global-best harmony search,” Applied Mathematics and Computation, vol. 198, no. 2, pp. 643–656, 2008.
  14. A. Kattan and R. Abdullah, “A dynamic self-adaptive harmony search algorithm for continuous optimization problems,” Applied Mathematics and Computation, vol. 219, no. 16, pp. 8542–8567, 2013.
  15. L. Wang, R. Yang, Y. Xu, Q. Niu, P. M. Pardalos, and M. Fei, “An improved adaptive binary harmony search algorithm,” Information Sciences, vol. 232, pp. 58–87, 2013.
  16. H. Sarvari and K. Zamanifar, “Improvement of harmony search algorithm by using statistical analysis,” Artificial Intelligence Review, vol. 37, no. 3, pp. 181–215, 2012.
  17. M. Hadwan, M. Ayob, N. R. Sabar, and R. Qu, “A harmony search algorithm for nurse rostering problems,” Information Sciences, vol. 233, pp. 126–140, 2013.
  18. N. Sinsuphan, U. Leeton, and T. Kulworawanichpong, “Optimal power flow solution using improved harmony search method,” Applied Soft Computing Journal, vol. 13, no. 5, pp. 2364–2374, 2013.
  19. H. Wang, X. Yuan, Y. Wang, and Y. Yang, “Harmony search algorithm-based fuzzy-PID controller for electronic throttle valve,” Neural Computing and Applications, vol. 22, no. 2, pp. 329–336, 2013.

Copyright © 2015 FangAn Deng et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
