
Hybrid Multi-Evolutionary Algorithm to Solve Optimization Problems


ABSTRACT

The article presents a Hybrid Multi-Evolutionary Algorithm designed to solve optimization problems. A Genetic Algorithm and an Evolutionary Strategy work together to improve the efficiency of optimization and to increase resistance to getting stuck in sub-optimal solutions; they periodically exchange their best individuals with each other. The algorithm combines the Genetic Algorithm's ability to explore the search space with the Evolutionary Strategy's ability to exploit it, and maintains the right balance between exploration and exploitation. The results of the experiments suggest that the proposed algorithm is more effective than a Genetic Algorithm or an Evolutionary Strategy used separately, and can be an effective tool for solving complex optimization problems.

Introduction

Evolutionary Algorithms (EA) are a group of methods that mimic natural evolution as described by Charles Darwin's theory of evolution. They simulate, in computer programs, the processes of evolution in the world of living organisms – natural selection, mutation, and recombination of genes, and especially the rule of the "survival of the fittest." All EAs share the same basic idea: they work with a population of individuals in a given environment, which is defined by the problem the algorithm solves. Every individual (a potential solution to the problem) is encoded as a chromosome – a binary string, a vector of real numbers, or a composite data structure. The population is processed in a loop by operators of fitness evaluation, reproduction, and mutation; one iteration of the loop is called a generation. Every individual is assigned a fitness value (a numeric value that measures how well the individual is adapted to the given environment). This value determines the individual's chances of acting as a parent of the next generation. Environmental pressure means that well-adapted individuals have a better chance of surviving. Based on fitness values, individuals are selected into the parent pool and, as a result, can transfer their genetic material to the offspring. Evolutionary Algorithms search for the optimum through exploration and exploitation of the search space. Exploration is the process of searching new regions where an optimum may exist: the search space is probed in the hope of finding other promising solutions that can then be refined. Exploitation is the process of searching the neighborhood of previously visited points, in the hope of improving the solutions found so far. One of the most important problems in EA design is keeping the right balance between exploration and exploitation.
Evolutionary Algorithms are usually used to solve complex optimization problems for which no specialized techniques exist. They generally return a satisfactory solution in an acceptable time, but do not guarantee finding the global optimum. There are three basic types of EA: Genetic Algorithms (GA), Evolutionary Strategies (ES), and Genetic Programming (GP).

The Genetic Algorithm is the best-known type of Evolutionary Algorithm. Individuals in a GA are, by default, encoded as binary strings (binary representation), but other representations, e.g. real numbers or composite gene structures, are also possible. During optimization, a GA should keep the right balance between exploration and exploitation, to avoid premature convergence to a local optimum. The main parameters of the GA (the crossover and mutation probabilities) control its ability to explore and exploit the search space, and maintain the diversity of the population and the selection pressure. If exploration is too weak, the algorithm can get stuck in a local optimum; if exploitation is too weak, it can waste time on poor solutions and fail to focus on the good solutions in a neighborhood. The selection process is the main mechanism of exploitation, while crossover and mutation are methods of exploring the search space. More information on GAs can be found in (Goldberg 1989; Michalewicz 1992).
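As a hedged illustration of the loop described above, a minimal real-coded GA for function maximization might look as follows. The tournament selection, single-point crossover, and uniform resetting mutation are illustrative assumptions, not the exact configuration used in the paper:

```python
import random

def genetic_algorithm(fitness, n_genes, bounds, pop_size=25,
                      p_cross=0.8, p_mut=0.15, generations=200, seed=0):
    """Maximize `fitness` over real-valued chromosomes within `bounds`."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: exploitation pressure toward fitter parents.
        def select():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < p_cross:
                # Single-point crossover recombines parental material.
                cut = rng.randrange(1, n_genes) if n_genes > 1 else 0
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Uniform resetting mutation keeps population diversity.
            child = [rng.uniform(lo, hi) if rng.random() < p_mut else g
                     for g in child]
            children.append(child)
        pop = children
    return max(pop, key=fitness)
```

For example, maximizing f(x) = −(x − 2)² on [−5, 5] should return a chromosome close to x = 2.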

Evolutionary Strategies, like Genetic Algorithms, process a population of individuals in a loop until a termination criterion is met. In contrast to GAs, ESs use primarily mutation and selection as search operators, following the principle that small changes have small effects. During mutation, by default, a normally distributed random value is added to each gene of an individual. The size of the mutation (i.e. the standard deviation of the normal distribution) can be governed by a self-adaptation process. After a predetermined number of fitness function calls or generations, the mutation size can be adjusted to satisfy the 1/5 success rule (based on the ratio of successful mutations to all mutations). If too many mutations are successful, indicating that the search is too local, the mutation size should be increased, to obtain greater diversity of individuals. If too few mutations are successful, indicating that the mutation size is too large, it should be decreased, to increase the accuracy of the search and accelerate the convergence of the algorithm. The simplest Evolutionary Strategy, (1+1)-ES, operates on a population of two individuals: the parent (the current point) and the child (the result of the parent's mutation). If the child's fitness is equal to or better than the parent's, the child becomes the parent in the next generation; otherwise, a new child is created in the next loop. In the (1+λ)-ES, λ children are created and compete with the parent. In the (1,λ)-ES, the best child becomes the parent in the next generation while the current parent dies. The (μ+1)-ES works with a population of μ individuals: a child is created as a mutation of a randomly selected parent and, together with the μ individuals, forms a temporary population of size μ+1, from which the μ best individuals are selected as the next population (the worst is rejected). The (μ+λ)-ES uses a population of μ parents; λ children are created by mutation and recombination of the parents, which makes the strategy less prone to getting stuck in local optima. More information on Evolutionary Strategies can be found in (Bäck, Hoffmeister, and Schwefel 1991; Beyer and Schwefel 2002).
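A minimal (1+1)-ES with step-size adaptation based on the 1/5 success rule could be sketched as below. The adaptation factors 1.22 and 0.82 and the 10-generation adaptation interval are common textbook choices, not values prescribed by this paper:

```python
import random

def one_plus_one_es(fitness, x0, sigma=1.0, generations=500,
                    adapt_every=10, seed=0):
    """(1+1)-ES maximizing `fitness`; sigma adapted by the 1/5 success rule."""
    rng = random.Random(seed)
    parent, f_parent = list(x0), fitness(x0)
    successes = 0
    for g in range(1, generations + 1):
        # Gaussian mutation: add a normally distributed value to each gene.
        child = [xi + rng.gauss(0.0, sigma) for xi in parent]
        f_child = fitness(child)
        if f_child >= f_parent:          # child replaces parent on ties as well
            parent, f_parent = child, f_child
            successes += 1
        if g % adapt_every == 0:         # 1/5 success rule adaptation
            rate = successes / adapt_every
            if rate > 0.2:
                sigma *= 1.22            # search too local: widen the step
            elif rate < 0.2:
                sigma *= 0.82            # step too large: refine the search
            successes = 0
    return parent, f_parent
```

On a smooth unimodal function such as f(x) = −(x₁² + x₂²), this sketch converges from a distant starting point toward the optimum at the origin.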

A hybrid intelligent algorithm combines methods and techniques of artificial intelligence, e.g. Genetic Algorithms, Fuzzy Logic, Artificial Neural Networks, etc. Such an algorithm is able to exploit the advantages of these methods and techniques while avoiding their disadvantages. Hybrid algorithms can be used to improve efficiency, or where simple methods do not produce satisfactory results in the expected time. Hybrid intelligent algorithms that use artificial intelligence methods have been applied to different optimization problems, for example multiobjective optimization (Pytel 2011; Pytel and Nawarycz 2012, 2010), the Connected Facility Location Problem (Pytel and Nawarycz 2013), the Clustering Problem (Pytel 2016), and the Function Optimization Problem (Pytel 2017).

In this paper, a new Hybrid Multi-Evolutionary Algorithm that combines a Genetic Algorithm and an Evolutionary Strategy is proposed; it is able to exploit the advantages of the GA and the ES while avoiding their disadvantages. The GA is responsible for finding new areas where an optimum may exist and for protection against getting stuck in a local optimum. The operation of the ES is limited to the local area found by the GA; the ES is responsible for quick convergence to the optimum value. The GA and the ES can exchange information about the optimization process and transfer the best individuals between them. The proposed algorithm improves search efficiency in terms of both the time needed to reach the global solution and the number of calls to the fitness function.

Many types of hybrid evolutionary algorithms have been proposed (e.g. Grosan and Abraham 2007; Qian et al. 2018). However, the presented approach differs from them in several aspects, as will become clear in the following. The main contributions of the paper can be summarized as follows:

  • A new idea of a hybrid multi-evolutionary algorithm, consisting of a Genetic Algorithm and an Evolutionary Strategy, is presented.

  • The GA and the ES work in parallel, so the algorithm can use multiple processors and/or cores.

  • A new protocol for exchanging information and the best individuals between GA and ES was developed.

  • Extensive experiments on test functions show that the proposed algorithm achieves high-quality solutions and very good performance in comparison with other evolutionary methods.

The paper is organized as follows: Section "Problem Formulation" defines the problem to be solved and formalizes it as the Function Optimization Problem. Section "The Proposed Hybrid Multi-Evolutionary Algorithm" reviews recent proposals on function optimization and introduces the new hybrid multi-evolutionary algorithm. Section "Computational Experiment" describes the experiments in detail: in the first stage, different models of the algorithm were tested, and the most effective model was chosen for stage 2; in the second stage, the proposed algorithm was compared with Genetic Algorithms and Evolutionary Strategies. Finally, Section "Conclusions" concludes the paper and suggests future developments.

Problem Formulation

Optimization is the selection of the best element from a set of elements with regard to some criterion, without violating given constraints. The Function Optimization Problem (FOP) is a problem in which the parameters of a function (variables) must be adjusted to achieve the best (maximal or minimal) value of an objective function under given constraints. The function optimization problem (here, a function maximization problem) can be stated as follows:

(1)  max f(x)  subject to:  c_i(x) ≥ 0 for i = 1, 2, …, k,  x ∈ S

where: x = (x1, x2, x3, …, xn), n ∈ N, is an n-dimensional vector of decision variables; f(x) is the objective function of the variables x; c_i(x) are the constraints; S is the search area.

The FOP can be used as a benchmark for various optimization methods, including Genetic Algorithms and Evolutionary Strategies. Various methods of solving the FOP are discussed in the literature, for example (Jensi and Wiselin 2016; Karaboga and Basturk 2007; Potter and De Jong 1994).

The Proposed Hybrid Multi-Evolutionary Algorithm

Genetic Algorithms search the space of possible solutions using crossover and mutation operators, and can solve sophisticated optimization problems. A binary representation of individuals is usually used in a GA, but real numbers or complex data structures can also be used. The convergence of a GA depends strongly on the initial population. GAs are most useful for discrete problems; one of their disadvantages is low efficiency in the final stage of the search. Evolutionary Strategies, by default, use real numbers to represent individuals and rely primarily on mutation and selection as evolutionary operators. Competition between parents and descendants for selection as parents of the next generation creates evolutionary pressure. ESs are prone to getting stuck in sub-optimal solutions; their advantage is rapid convergence to the optimum value. ESs are best applied to continuous problems. Hybridizing both algorithms can therefore be a promising method for solving many continuous and discrete optimization problems.

The proposed algorithm (GA-ES) combines the GA's ability to explore the search space and find areas of possible optima with the ES's ability to converge quickly to an optimum. The GA and the ES can use the same operators of selection and mutation, the same representation of individuals, and can start with the same initial population. After a predetermined number of generations, the best individuals of both algorithms are compared. Depending on the result of this comparison, a transfer of individuals between the algorithms may be performed:

  • If the best individual in the GA is better than the best individual in the ES, the GA has found a new promising area. The best individual of the GA replaces the parent in the ES (or the worst parent, if the ES has more than one parent).

  • If the best individual in the ES is better than the best individual in the GA, the ES has refined the current best solution. The best individual of the ES replaces the worst individual in the GA population.
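The two exchange rules above might be sketched as follows. This is a simplified sketch: `ga_pop` and `es_parents` are assumed to be lists of chromosomes, and a tie is assumed to leave both populations unchanged:

```python
def exchange_best(ga_pop, es_parents, fitness):
    """Transfer the best individual between the GA and ES populations in place."""
    best_ga = max(ga_pop, key=fitness)
    best_es = max(es_parents, key=fitness)
    if fitness(best_ga) > fitness(best_es):
        # GA found a more promising area: replace the worst ES parent.
        worst = min(range(len(es_parents)),
                    key=lambda i: fitness(es_parents[i]))
        es_parents[worst] = list(best_ga)
    elif fitness(best_es) > fitness(best_ga):
        # ES refined the optimum: replace the worst GA individual.
        worst = min(range(len(ga_pop)),
                    key=lambda i: fitness(ga_pop[i]))
        ga_pop[worst] = list(best_es)
    return ga_pop, es_parents
```

In the proposed algorithm this comparison is performed periodically, after a predetermined number of generations.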

The block diagram of the proposed hybrid algorithm is shown in Figure 1.

Figure 1. The hybrid algorithm block diagram.

Computational Experiment

The proposed algorithm can be used for different optimization problems, including the Function Optimization Problem. The experiments were divided into two stages. In the first stage, variants of the algorithm using different types of evolutionary strategies – (1+1)-ES, (μ+1)-ES, (μ,λ)-ES, and (μ+λ)-ES – were compared. The aim of this stage was to determine the type of Evolutionary Strategy best suited to solving the test problems. In the second stage, the results of the proposed hybrid algorithm were compared with those of an Evolutionary Strategy and a Genetic Algorithm.

A wide range of functions of different complexity was used during the research:

  • f1(x1, x2) – an easy function of two variables. The function has many local optima and was used to test the algorithm's ability to find the global optimum. The function is given by the formula:

(2)  f1(x1, x2) = sin(x1) + 0.6·sin(20·x1)·sin(x2)

where: x1, x2 ∈ [0, π]

The maximum value of 1.6 is at the point (π/2, π/2).

  • f2(x1, x2, …, x10) – a function of many variables with a low slope, used to test the ability of the algorithm to determine the exact solution. The function is given by the formula:

    (3)  f2(x1, x2, …, x10) = ∏_{i=1}^{10} sin(x_i)

where: x1, x2, …, x10 ∈ [0, π]

The maximum value of 1 is at the point (π/2, π/2, …, π/2).

  • f3 – one of the functions proposed in (Kwasnicka 1999). It is a very sophisticated function with many local optima of different values, which makes it possible to assess the algorithm's ability to solve difficult optimization problems. The first generation of individuals was placed in a local optimum (the point (5, 5) with value 1.5). The algorithm should find the global optimum (the point (50, 50) with value 2.5) while avoiding the local optima (with values 1.0 and 2.0). The function is given by the formula:

    (4)  f3(x1, x2) = Σ_{i=1}^{7} h_i·exp(−μ_i·[(x1 − x_{i1})² + (x2 − x_{i2})²])

where: h_1 = 1.5, h_2 = 1, h_3 = 1, h_4 = 1, h_5 = 2, h_6 = 2, h_7 = 2.5,

μ_1 = μ_2 = μ_3 = μ_4 = μ_5 = μ_6 = μ_7 = 0.01,

(x_{11}, x_{12}) = (5, 5), (x_{21}, x_{22}) = (5, 30), (x_{31}, x_{32}) = (25, 25),
(x_{41}, x_{42}) = (30, 5), (x_{51}, x_{52}) = (50, 20), (x_{61}, x_{62}) = (20, 50),
(x_{71}, x_{72}) = (50, 50)
  • The Rastrigin function – a non-convex, non-linear multimodal function. It was first proposed as a two-dimensional function and has been generalized to an n-dimensional domain (in the experiments, 2, 5, and 10 dimensions were used). Finding the minimum of this function is very difficult, due to the huge search space and the large number of local minima. The function is usually evaluated on the hypercube x_i ∈ [−5.12, 5.12] for all i = 1, …, d; the minimum f(x*) = 0 is at the point x* = (0, …, 0). The function is given by the formula:

(5)  f_Ra(x) = 10·d + Σ_{i=1}^{d} [x_i² − 10·cos(2π·x_i)]
  • The Styblinski-Tang function – a continuous, multimodal, non-convex function, generalized to an n-dimensional domain (in the experiments, 2, 5, and 10 dimensions were used). Finding the minimum of this function is very difficult, due to its huge search space and its shape. The function is usually evaluated on the hypercube x_i ∈ [−5, 5] for all i = 1, …, d. The minimum is at the point x* = (−2.903534, …, −2.903534), and its value is f(x*) = −39.16599·d. The function is given by the formula:

(6)  f_ST(x) = ½·Σ_{i=1}^{d} (x_i⁴ − 16·x_i² + 5·x_i)
  • The Rosenbrock function (also referred to as the Valley or Banana function) – a popular test problem for gradient-based optimization algorithms. The function is generalized to an n-dimensional domain (in the experiments, 2 dimensions were used). The function is unimodal, and the global minimum lies in a narrow, parabolic valley. Although this valley is easy to find, convergence to the minimum is difficult. The function is usually evaluated on the hypercube x_i ∈ [−5, 10] for all i = 1, …, d; the minimum f(x*) = 0 is at the point x* = (1, …, 1). The function is given by the formula:

(7)  f_Ro(x) = Σ_{i=1}^{d−1} [100·(x_{i+1} − x_i²)² + (x_i − 1)²]
  • The Shubert function – a two-dimensional function with several local minima and many global minima. The function is usually evaluated on the square x_i ∈ [−10, 10] for i = 1, 2. The minimum value is f(x*) = −186.7309. The function is given by the formula:

(8)  f_Sh(x) = [Σ_{i=1}^{5} i·cos((i+1)·x1 + i)] · [Σ_{i=1}^{5} i·cos((i+1)·x2 + i)]
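For reference, the benchmark functions might be implemented as follows. This is a sketch under stated assumptions: the product form of f2 and the negative exponent in f3 are inferred from the optima reported in the text, since the source formulas are ambiguous:

```python
import math

def f1(x1, x2):
    # f1(x1, x2) = sin(x1) + 0.6*sin(20*x1)*sin(x2) on [0, pi]^2
    return math.sin(x1) + 0.6 * math.sin(20 * x1) * math.sin(x2)

def f2(x):
    # Product of sin(x_i); maximum value 1 at (pi/2, ..., pi/2).
    p = 1.0
    for xi in x:
        p *= math.sin(xi)
    return p

# Peak heights, widths, and centers of f3 (Kwasnicka 1999), as listed above.
H = [1.5, 1, 1, 1, 2, 2, 2.5]
MU = [0.01] * 7
CENTERS = [(5, 5), (5, 30), (25, 25), (30, 5), (50, 20), (20, 50), (50, 50)]

def f3(x1, x2):
    # Sum of seven Gaussian-like peaks; global maximum about 2.5 near (50, 50).
    return sum(h * math.exp(-mu * ((x1 - c1) ** 2 + (x2 - c2) ** 2))
               for h, mu, (c1, c2) in zip(H, MU, CENTERS))

def rastrigin(x):
    d = len(x)
    return 10 * d + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def styblinski_tang(x):
    return 0.5 * sum(xi ** 4 - 16 * xi ** 2 + 5 * xi for xi in x)

def rosenbrock(x):
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))

def shubert(x1, x2):
    s1 = sum(i * math.cos((i + 1) * x1 + i) for i in range(1, 6))
    s2 = sum(i * math.cos((i + 1) * x2 + i) for i in range(1, 6))
    return s1 * s2
```

Evaluating these implementations at the known optima (e.g. the Rastrigin minimum at the origin, or the Rosenbrock minimum at (1, …, 1)) provides a quick sanity check of the formulas.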

Some of the chosen test functions are minimization problems. In these cases, they were converted into maximization problems by negating the function and adding a constant C. The constant C was determined for each function during the initial experiments, as were the values of the Genetic Algorithm's parameters. In the experiments, the following parameter values were used for the Genetic Algorithm:

  • the genes of individuals are represented by real numbers,

  • the probability of crossover = 0.8,

  • the probability of mutation = 0.15,

  • the number of individuals in the population = 25.
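The conversion of a minimization problem into a maximization one, as described above, can be sketched as follows (the constant C is problem-specific; the value used below is only illustrative):

```python
def to_maximization(f, C):
    """Convert a minimization objective f into a maximization one: g(x) = C - f(x).

    The minimizer of f is the maximizer of g, and C shifts the fitness
    values into a convenient (e.g. non-negative) range.
    """
    return lambda x: C - f(x)
```

For example, with f(x) = (x − 1)² and C = 10, the converted objective reaches its maximum value of 10 at x = 1, the minimizer of f.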

Various types of Evolutionary Strategies were used in the experiments. For all of them, the parameters were set during initial experiments. The mutation model used adds a normally distributed random number to each gene. To maintain the 1/5 success rule, if there was no improvement for 10 consecutive generations, the size of the mutation was enlarged by increasing the σ parameter. The numbers of parents and descendants in the various types of ES were set based on recommendations from the literature and are presented in Table 1.

Table 1. The number of parents and descendants in different types of ESs.

The fitness of the best individuals found by the GA and the ES was compared every 50 generations. Depending on the result of this comparison, the best individuals were transferred between the GA and the ES. The algorithm was stopped when the best individual reached the predetermined value of the optimized function. The value of the stop criterion and the value of the constant C used for each function are presented in Table 2.

Table 2. The value of the algorithm stop criterion and the value of the constant C used for each function.

Each algorithm was executed 10 times on a standard PC (CPU: Intel i3, RAM: 8 GB, Windows 10 operating system). Table 3 shows the average time and the average number of fitness function calls needed to reach the predetermined value for the test functions in stage 1 of the experiment.

Table 3. The average running time and the number of fitness function calls needed to reach the predetermined value of the optimized function.

Based on the results of the stage 1 experiments, the combination of the GA with (1+1)-ES has the highest efficiency. In stage 2, this algorithm was compared with the Evolutionary Strategy (ES) and the Standard Genetic Algorithm (SGA) described in (Michalewicz 1992) and adapted to optimize the test functions. Each algorithm was executed 10 times on a standard PC (CPU: Intel i3, RAM: 8 GB, Windows 10 operating system). Table 4 shows the average time and the average number of fitness function calls needed to reach the predetermined value for the test functions in stage 2 of the experiment. The graph in Figure 2 shows the average running time and the graph in Figure 3 shows the number of fitness function calls for the Genetic Algorithm (SGA), the Evolutionary Strategy, and the proposed GA-ES algorithm.

Table 4. The average time and number of fitness function calls needed to reach the predetermined value of the optimized function.

Figure 2. The average running time in the Genetic Algorithm (SGA), the Evolutionary Strategy and the proposed (GA-ES) algorithm.

Figure 3. The number of fitness function calls in the Genetic Algorithm (SGA), the Evolutionary Strategy, and the proposed (GA-ES) algorithm.

Conclusions

The proposed hybrid Genetic Algorithm-Evolutionary Strategy algorithm was able to find an acceptable solution for all tested functions.

In the GA-ES algorithm, the GA is responsible for finding new areas where an optimum may exist and for protecting against getting stuck in local optima. The ES's operation can be limited to the local area only; it is responsible for searching the local area found by the GA.

The (1+1)-ES strategy is the most vulnerable to getting stuck in local optima. The (μ+1)-ES, (μ,λ)-ES, and (μ+λ)-ES strategies can increase the algorithm's resilience to getting stuck in local optima, but they require additional computing effort. The combination of the GA and (1+1)-ES proved the most efficient.

In all tasks, the GA-ES algorithm needed fewer fitness function calls than the SGA. The standard deviation of the found solutions was also lower, which may indicate greater stability of the algorithm. In one task, the running time of the GA-ES algorithm was longer than that of the SGA; in all other tasks, the running time of GA-ES was shorter.

The ES gives the best results when optimizing easy functions such as f1. However, its performance is not satisfactory for the f2 function with many local optima, or when it has to overcome a valley, as in function f3. In such cases, it may get stuck in a local optimum or require a large number of fitness function calls. The ES was unable to solve four of the test tasks, and in one task it got stuck in a local optimum at least once. In four tasks, the GA-ES algorithm needed fewer fitness function calls than the ES. In two tasks, the running time of the GA-ES algorithm was longer than that of the ES; in all other tasks, the running time of GA-ES was shorter.

The optimization of the test functions has shown that the proposed algorithm converges faster and is more accurate than the SGA, and is more resistant to premature convergence to a local optimum than the ES.

In the proposed algorithm, the Genetic Algorithm and the Evolutionary Strategy can perform their calculations in parallel, e.g. on multiple processors or processor cores.

The proposed algorithm is an efficient tool for solving function optimization problems and could be applied to a very wide range of optimization problems.

In single-objective optimization, for example the Function Optimization Problem, there is only one objective function to optimize. This problem is often used to test various optimization methods. In real life, multi-criteria optimization problems are very common: there are at least two functions that have to be optimized at the same time, and no single solution exists that simultaneously optimizes every objective. This task is much more difficult and results in a set of optimal solutions (Pareto-optimal solutions). Future work will try to extend the approach to the Multi-objective Optimization Problem.

Statement

This manuscript has not been published elsewhere and has not been submitted simultaneously for publication elsewhere.

References

  • Bäck, T., F. Hoffmeister, and H.-P. Schwefel. 1991. A survey of evolution strategies. In Proceedings of the Fourth International Conference on Genetic Algorithms, San Diego, CA, USA, 2–9. Morgan Kaufmann.
  • Beyer, H.-G., and H.-P. Schwefel. 2002. Evolution strategies: A comprehensive introduction. Natural Computing 1 (1):3–52. doi:10.1023/A:1015059928466.
  • Goldberg, D. E. 1989. Genetic algorithms in search, optimization, and machine learning. Reading, MA: Addison-Wesley.
  • Grosan, C., and A. Abraham. 2007. Hybrid evolutionary algorithms: Methodologies, architectures, and reviews. In Hybrid evolutionary algorithms, ed. A. Abraham, C. Grosan, and H. Ishibuchi, Studies in Computational Intelligence, vol. 75, 1–17. Berlin, Heidelberg: Springer. doi:10.1007/978-3-540-73297-6_1.
  • Jensi, R., and J. G. Wiselin. 2016. An improved krill herd algorithm with global exploration capability for solving numerical function optimization problems and its application to data clustering. Applied Soft Computing 46:230–45. doi:10.1016/j.asoc.2016.04.026.
  • Karaboga, D., and B. Basturk. 2007. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. Journal of Global Optimization 39 (3):459–71. doi:10.1007/s10898-007-9149-x.
  • Kwasnicka, H. 1999. Obliczenia ewolucyjne w sztucznej inteligencji, (in Polish). Wroclaw: Oficyna Wydawnicza Politechniki Wroclawskiej.
  • Michalewicz, Z. 1992. Genetic algorithms + data structures = evolution programs. Berlin: Springer Verlag.
  • Potter, M. A., and K. A. De Jong. 1994. A cooperative coevolutionary approach to function optimization. In Parallel problem solving from nature – PPSN III, ed. Y. Davidor, H.-P. Schwefel, and R. Männer, Lecture Notes in Computer Science, vol. 866, 249–57. Berlin, Heidelberg: Springer. doi:10.1007/3-540-58484-6_269.
  • Pytel, K. 2011. The fuzzy genetic strategy for multiobjective optimization. In Proceedings of the federated conference on computer science and information systems, Szczecin.
  • Pytel, K. 2016. Hybrid fuzzy-genetic algorithm applied to clustering problem. In Proceedings of the 2016 federated conference on computer science and information systems, Gdansk.
  • Pytel, K. 2017. Hybrid multievolutionary system to solve function optimization problems. In Proceedings of the federated conference on computer science and information systems, Prague.
  • Pytel, K., and T. Nawarycz. 2010. Analysis of the distribution of individuals in modified genetic algorithms. In Rutkowski L., Scherer R., Tadeusiewicz R., Zadeh L. A., Zurada J. M. (eds.), Artificial intelligence and soft computing, ICAISC 2010. Lecture Notes in Computer Science, vol 6114, 197-204, Springer-Verlag Berlin Heidelberg. doi:10.1007/978-3-642-13232-2_24
  • Pytel, K., and T. Nawarycz. 2012. The fuzzy-genetic system for multiobjective optimization. In Rutkowski L., Korytkowski M., Scherer R., Tadeusiewicz R., Zadeh L.A., Zurada J.M. (eds.), Swarm and evolutionary computation, Lecture Notes in Computer Science, vol 7269, 325-332, Springer-Verlag Berlin Heidelberg. doi:10.1007/978-3-642-29353-5_38
  • Pytel, K., and T. Nawarycz. 2013. A fuzzy-genetic system for ConFLP problem. In Andrzej M. J. Skulimowski (ed.), Advances in decision sciences and future studies, vol. 2, 575-586, Krakow: Progress & Business Publishers.
  • Qian, X., X. Wang, Y. Su, and L. He. 2018. An effective hybrid evolutionary algorithm for solving the numerical optimization problems. Journal of Physics: Conference Series 1004 (1): article id. 012020. doi:10.1088/1742-6596/1004/1/012020.
