A novel metaheuristic algorithm for numerical function optimization : Blind , naked mole-rats ( BNMR ) algorithm

Optimization algorithms inspired by the world of nature have turned into powerful tools for solving the complicated problems. However, they have still some drawbacks need the investigation of new and better optimization algorithms. In this paper, we propose a new meta-heuristic algorithm called blind naked mole-rats (BNMR) algorithm. This algorithm has been developed based on the social behavior of the blind naked mole-rats colony in searching the food and protecting the colony against invasions. By introducing this algorithm, we have tried to overcome many disadvantages of the usual optimization algorithms including getting trapped in local minimums or having low rate of convergence. Using several benchmark functions, we demonstrate the superior performance of the proposed algorithm in comparison with some other well-known optimization algorithms.


INTRODUCTION
The meta-heuristic algorithms have been mostly derived from the behavior of biological systems (for example, genetic algorithm (GA), particle swarm optimization (PSO), stem cells algorithm (SCA)) or physical systems (for example, simulated annealing (SA)).Perhaps the main reason for selecting and developing these algorithms is the simplicity in formulating and understanding their development.
Among the well-known optimization algorithms, we can mention the genetic algorithm (GA).This algorithm is a technique of optimization firstly proposed by Holland (Holland, 1975).This algorithm is based on the idea of evolution in the nature and it looks into the problem on a fully random basis.
This method is based on some biological techniques *Corresponding author.E-mail: mtaherdangkoo@yahoo.com or taherdangkoo@shirazu.ac.ir.Tel: +98-9128243251.like genetic and mutation; the search is carried out in order to find better responses in each generation compared to the previous one.Among the features of the genetic algorithms are its ability to run in parallel way and its capability for searching very large and complicated spaces (Goldberg, 1989;Chang et al., 2010).Another well-known optimization algorithm is particle swarm optimization (PSO) (Clerc and Kennedy, 2002), which was developed for the optimization problems, and its features have so far been proved to be useful for both continuous and discrete functions.In this algorithm each particle is considered as a member of the society using the experiences of its previous particle as well as that of others in order to reach the ultimate goal.This algorithm is able to find the global optimum of the function in question in consequent iterations (Chakraborty et al., 2010).
Simulated annealing (SA) algorithm was presented basically intending to simulate between minimizing the target function of a problem and cooling until the time it gets to a state of basic energy (Kirkpatrick et al., 1983).In this algorithm, the first response is an important parameter and plays an important role and changes according to the type of problem.
Stem cells algorithm (SCA) has been designed based on the biological behavior of stem cells in order to restore or evolve the damaged organ in the matured human body (Taherdangkoo et al., 2011;Taherdangkoo et al., 2012b).This algorithm uses the features of stem cells like selfrenewal and the ability to evolve in a complete organ.
The artificial bee colony (ABC) algorithm has been designed based on the social behavior of honey bees in their colonies looking for food sources.This algorithm has many advantages and some disadvantages caused that some constraints and parameters have been defined in order to resolve the disadvantages of this method and to increase the convergence rate in different problems, yielding a better performance (Karaboga and Basturk, 2007;Akay and Karaboga, 2010).
Among key components of swarm intelligence we can mention self-organization and division of labor.Group cooperation is the key to reach the optimal and ideal response within shortest period of time.
In this paper, we have used BNMR algorithm for numerical functions optimization, and we use some Benchmark functions with high dimensions in order to prove better performance of this algorithm compared with other well-known optimization algorithms.The comparison part indicates better convergence rate and more precision in achieving the optimum response compared with other optimization algorithms.

BLIND, NAKED MOLE-RATS COLONY IN REAL WORLD
The blind naked mole-rats live in underground tunnels in some African countries and in green areas with almost fixed temperature and humidity.The temperature in the tunnels where the blind naked mole-rats live must always be between 30 to 35°C.Each tunnel has nearly a length of three kilometers (Brett, 1991;Bennett and Jarvis, 1988).These animals are the only small mammals that live in community and in large colonies, just like honey bees or ant colonies.Each colony consists of three groups of moles as follows: 1.A queen accompanied by one to three males that carry out the reproduction process.2. Employed moles that search for food sources and build suitable nests for the queen and the offspring's.3. Soldier moles that are responsible for cleaning the tunnels and protect the colony against the invasions.
In a real colony, the number of soldier moles is much higher than the employed moles.The tunnels where the blind naked mole-rats live are built with a unique regularity.These tunnels have chambers built for special Taherdangkoo et al. 3567 purposes: kitchen room is used for collecting the food, toilet room is used for placing the wastes, and when they are filled up, another chamber is dug for this purpose.
There are also some labyrinthine rooms.When invaders enter the tunnel, the soldiers quickly block their break into the tunnel with their wastes.Where their speed for this process is low, one of the soldiers sacrifices his life in order to help other soldiers find adequate time to stop the invaders breakthrough.

BLIND, NAKED MOLE-RATS (BNMR) ALGORITHM
The BNMR algorithm is an optimization algorithm, designed based on the social behaviour of blind naked mole-rats in a large colony.The research process starts from the centre of the colony, where the queen and offspring live.Note that for simplification purposes, we have placed the employed moles and soldier moles in one single group, which hereby is called the employed moles.
In the beginning with the production of the initial population of the blind naked mole-rats colony starts working in the whole problem space on a completely random way.Note that the number of the population is two times of the number of food sources and each of the food sources represents a response for problem space.
Let us define some parameters as follows: Where N is the number of members related to the number of problem's unknown parameters.Initial production of food sources within the parameter's borders is defined as follows: Where x i represents the i th food source and β is a random variable in the interval of [0, 1] and S represents the number of food sources.Hereby, the food sources (responses) in the search process are considered as targets to be found by employed moles (for example, finding the location of food sources and their neighbours, determining the volume of enrichment, discharging food sources and storing them in the kitchen room).
The random movement of employed moles starts from the centre of the colony towards food sources and their neighbours.Moreover, the employed moles carry out the process of digging labyrinthine tunnels to find food sources.
The conditions of food sources in terms of temperature and humidity must always be suitable and almost stable.

Sci. Res. Essays
These conditions are considered as an attenuation coefficient in the form of a random variable in the interval of [0, 1] in our algorithm in order to determine the movement of the employed moles from the target food sources towards their neighbors.In addition, we should consider the underground temperature in our algorithm defined as follows: Where H represents the soil temperature changing with the depth x as its variable.ρ (x) and C(x) are the thermal properties of the soil, i.e. the density and specific heat capacity respectively.ρ and C are variable related to x which means that during the movement, they vary with area's changes, however, due to the simplification of the algorithm, we considered them as constant.
Empirically, the multiplication of ρ and C should be a value in the range of [2 4].∆ T(x, t)   ∆ t shows the rate of the soil temperature changing with the time and it will be updated at each iteration in the proposed algorithm.The symbol f represents the volumetric contribution (in percent) of each element in the compound, while, the subscripts s, a, w indicate the components of the soil, for example sand, air and water, respectively.The sum of these volumetric percents is equal to unity in order to have a weighted averaging sum to obtain ρ C. In moving towards searching the neighbors of food sources, the attenuation coefficient A must be updated in each iteration.When the temperature arrives at zero (for example, in real conditions where temperature is inclined towards 30°C), the search of neighborhoods of food sources is carried out with lower intensity and while the temperature is close to average volume in the interval of [0, 1], the search of the neighborhoods of food sources is carried out with higher intensity.We consider this fact by the following equation: Where T is derived from equation (3) and α is a random variable in the interval of [0, 1], but for the sake of simplification in algorithm's implementation and calculation, we consider α = 0.95as a fixed parameter, and t is the iteration step.
Next, for each food source, two employed moles are sent, so the number of food sources is the half of the number of employed moles in the colony.After finding foods, their original quality and the path to reach them is saved in the memory of employed moles.In returning to the colony, the employed moles that carry information regarding the food sources share the information with others and with the queen.The obtained food sources are categorized with the probability P by the queen within a table sorted from the richest to lowest values, based on both the original quality of those food sources and the shorter paths to reach the food sources from the centre of the colony.The probability P is computed as follows: Where Fitness i is evaluated by its employed moles, FS i is related to the richest food source, R i is related to the route for reaching food source and N is the number of food sources which is the half of the number of employed moles.
The first food source with highest probability is select by the queen.Then, the two employed moles that carry the information of this food source are selected to do the recruitment process, for moving toward the food source.After reaching this food source, one of the two employed moles, which is the head of the group, randomly chooses some of the soldiers and leads the process of collecting the food.The other employed mole becomes the head of another group and by choosing some other soldiers starts searching the neighborhood area of the food source.All accompanying moles will carry information regarding the existence of food resources in the area adjacent to the main food source.The process for collecting the food from the food resources and concurrently searching their neighbors will reduce significantly the time to get to an optimum response in the implementation of the proposed algorithm and consequently increase the convergence rate.This process is repeated for all food resources along with their neighborhoods until no food resource is available.
Should be note, the search for neighbors of each food source is done in different directions starting from the food source.The directions are in 45 degrees from each other.So, there are 8 directions for searching the neighbors of a food source.
The next part of implementing the BNMR algorithm includes the consideration of defending the colony and preventing the invaders' break-in into the tunnels.In implementing our proposed algorithm, those points with low cost function are determined in each iteration and considered as invaders and they are eliminated from the implementation process (that is, they are not considered as members of the whole population involved in the process).The number of eliminated points in each iteration is augmented related with respect to that in the previous iteration by a factor, which is fixed by the designer.It is computed by the following equation: Where ζ ≥1 is a factor, which is set by the designer and B is the number of eliminated points for i th food source Taherdangkoo et al. 3569 in iteration t.Note that we replace the eliminated points in each iteration by new ones randomly selected from the space of selection.Using this idea, we apply some kinds of mutation process in the proposed algorithm, so that the new members having new information will prevent the algorithm being trapped in local minimums.
The pseudo code of the BNMR algorithm as follows: Sort the initial population based on their objective function value.

Test Benchmark Function
In order to assess the performance of our proposed algorithm that of other introduced optimization algorithms, we have compared the efficiency and the accuracy of BNMR algorithm using the 20 Benchmark functions (Suganthan, 2005) and the accuracy of all the mentioned algorithms using eight Benchmark functions (Suganthan, 2005).For all Benchmark functions, 50 independent runs were applied with different random seeds for generating the random variables, each contains 5000 iterations, and the population size was set to 100 for all optimization algorithms mentioned before except the BNMR algorithm.Note that when the space size goes up (for example more that 50 (Dimension>50)) the performance of optimization algorithms drops dramatically because the algorithm will face with several optimal answers.In this case of distribution, the selected answer in that considered space also adds complexity to the subject randomly.Table 1 shows the list of Benchmark functions.

Settings for algorithms
The number of Maximum generation and population size are common control parameters of the algorithms.In the experiments, maximum number of generation for the dimension of 10, 20, 30 and 40 are considered.Table 2 shows other control parameters of the algorithms and the schemes, with the value of these control parameters applied for GA, PSO, SA, SCA, ABC and BNMR algorithm.

Results comparison
We considered the performance of the BNMR algorithm and comparison between all mentioned optimization algorithms using 24 benchmark functions (Suganthan et al., 2005).These 24 functions consist of two categories.The first one is Unimodal functions: { F 1 (Shifted Sphere Function), F 2 (Shifted Schwefel's problem 1.2), F 3 (Shifted Rotated High Conditioned Elliptic Function), F 4 (Shifted   -32.7, 32.7] [-32.7, 16] 20  the properties of these 24 functions are available in (Suganthan et al., 2005).Each algorithm has been executed 50 independent times and every time with different random seeds for generating the random variables for each Benchmark function.We want to show the comparison of the convergence of all the mentioned optimization algorithms by the number of fitness evaluations and the number of iterations.As can be seen, the BNMR algorithm has better convergence than the other optimization algorithms.In the other words, this proves that BNMR algorithm has the           For different dimensions such as 10, 20, 30 and 40 the mean and standard derivations (SD) function values of the best solution found by the algorithms have been saved by GA, PSO, SA, SCA, ABC and BNMR that are shown in Tables 3 to 10.
Meanwhile, we have studied the effect of scalability on the computational complexity of the BNMR algorithm for Weierstrass function as described in (Suganthan et al., 2005).We also have computed the CPU Time; the time when the algorithm achieves its best results, of each algorithm on different mentioned Benchmark functions.Tables 11 to 18 show these times.The characteristics of the computer on which the algorithms were run are: Macintosh OS, Two 2.93 GHz 6-Core Intel, 64 GB Ram, and the graphic card is ATI Radeon HD 5870 1 GB.
Moreover, code execution time ( T 0 ), execution time of Weierstrass function for 200,000 evaluations ( T 1 ) and for five runs, mean of the BNMR algorithm execution times on Weierstrass function for 200,000 evaluations ( ˆ T 2 ) were computed.The complexity of algorithm was computed by {( ˆ T 2 − T 1 ) /T 0 } and given in Table 19.
Most of the mentioned optimization algorithms in this paper achieved good results but not as well as the proposed method (BNMR algorithm) since the other algorithms are intensively depended on the adjustment of their parameters.For instance, in ABC algorithm, we need to change its parameters (for example, MR, SF and etc.) for each function to obtain a good result for that function and due to this fact the algorithm cannot be an ideal one.

DISCUSSION
The genetic algorithm implements the law of survival with a focus on the existing solutions in order to reach a better solution.The genetic algorithm suffers a lot from too much dependency on such techniques such as selection, mutation, crossover, etc. with the conditions of the problem and the initial conditions and a weak selection of these constraints and parameters have got a remarkable influence on the function of genetic algorithm (GA).Of course, despite the existence of optimum methods presented to complete the GA, it is still damaged in the  problems with continuous and discontinuous spaces with high dimensions of early convergence or repeated interruptions.For instance, where some of weak people join the referenced set and form it and the continuation of  Sci.Res.Essays examinations in the sample space is stopped, one of the reasons could be this issue that a very appropriate chromosome is selected for reproduction.As a result, the offspring becomes similar to its parent and this is when the chromosomes are very similar to each other.Thus, before reaching the optimal solution, the early convergence is formed.Of course, through applying constraints, we can prevent it, but this is when the dimensions of the issue do not rise.The particle swarm optimization (PSO) has got a very simple implementation, but it suffers from early convergence.Although PSO algorithm has got a more reasonable speed than other optimization algorithms (Deep and Bansal, 2009), but it cannot optimize the quality of solutions by increasing the number of repetitions.This issue is more visible when examining and optimizing the multi-model problems.The reason for its occurrence in gbest PSO is that the particles become convergent in one specific point, while this point is located on one line, between the best global position and the best individual position.The other problem arises from too much dependency to regulating the PSO algorithm parameters (Kang et al., 2012b).
The simulated annealing (SA) algorithm (Oliveira et al., 2012), Initial solution and neighboring formation mechanism are important parameters of the SA algorithm that play significant roles in reaching into the optimum solution.
After selecting these two parameters, an initial temperature analogous to the kinetic energy is attributed to the system.Selecting initial value is arbitrary, but is applied depending on the behavior of the function at the start point.This SA algorithm process raises the time needed to reach the optimum solution and also increases the repetitions.SA is scrutinized in different problems and by many researchers but one of its main deficiencies is the large time in reaching into the optimum solution that is still unsolved.
The self-renewal process plays an important role in the SCA algorithm (Taherdangkoo et al., 2012a), but the presence of symmetry (symmetric and asymmetric propagation) has increased the time of obtain the optimum solution and it practically reaches the final solution by high repetitions.This is considered as one of the main problems of this algorithm.
The artificial bee colony (ABC) algorithm has got more accuracy and speed, than other optimization algorithms, but the use of roulette wheel in selecting the employed bees and the relation of employed bees and onlooker bees has not been turned into a model appropriately, and this issue does not make this algorithm distinguished from other optimization algorithms, and finally a research for modified of basic of artificial bee colony algorithm, was able to solve the main problem of ABC algorithm by using controlled of phase and magnitude and could achieve reasonable results (that should be note that in our proposed method, the same algorithm has been used for comparison with proposed methods, but they have been included within the context as ABC algorithm) (Kang et al., 2011(Kang et al., , 2012a)).
Other optimization algorithms exist such as Bat algorithm (Yang, 2010), and etc.Although these almost new algorithms have solved the problems arising from other optimization algorithms, but through implementing relatively more constraints and conditions, they have more complexity compared to them in implementation.
Although the Bat algorithm has been able to combine and use the advantages of other optimization algorithms with echolocation of the bat, but it has given much complexity to the algorithm and we should not neglect its high dependency on the initial conditions and constraints of the problem.In this paper, we have tried to cover all problems extracted from other optimization algorithms, while no additional constraint has been used in implementing the algorithm and flexibility of the algorithm is maintained.The observation of results is a good evidence of the aforesaid issue.

Conclusion
We have introduced a novel meta-heuristic algorithm for addressing the global optimization problems.The new algorithm is based on social behavior of blind naked mole-rats in finding food sources and protecting the colony against invasion.We have tested the proposed algorithm on several Benchmark functions and the obtained results compared with the other optimization algorithms have demonstrated the superior performance of the proposed algorithm.
Figures 1, 2, 3 and 4 show the convergence of the BNMR algorithm on the Benchmark functions F 1 to F 20 by the number of fitness evaluations, and Figures 5, 6, 7 and 8 show the convergence of all the mentioned algorithms on the Benchmark functions F 21 , F 22 , F 23 , F 24 by the number of fitness evaluations, and Figures 9, 10, 11 and 12 show the performance of all the mentioned algorithms on Ackley, Rosenbrock, Schwefel and Weierstrass functions by the number of Iteration.

Figure 3 .
Figure 3.The results of applying the BNMR on F 11 −F 15 (Convergence of Functions F 11 −F 15 ).

Figure 4 .
Figure 4.The results of applying the BNMR on F 16 −F 20 (Convergence of Functions F 16 −F 20 ).

Figure 5 .
Figure 5.The results of applying the different algorithms on F 21 (Convergence of Function F 21 for all algorithms).

Figure 6 .
Figure 6.The results of applying the different algorithms on F 22 (Convergence of Function F 22 for all algorithms).

Figure 7 .
Figure 7.The results of applying the different algorithms on F 23 (Convergence of Function F 23 for all algorithms).

Figure 8 .
Figure 8.The results of applying the different algorithms on F 24 (Convergence of Function F 24 for all algorithms).

Figure 9 .
Figure 9.The results of applying different algorithms on Ackley function.

Figure 10 .
Figure 10.The results of applying different algorithms on Rosenbrock function.

Figure 11 .
Figure 11.The results of applying different algorithms on Schwefel function.

Figure 12 .
Figure 12.The results of applying different algorithms on Weierstrass function.

Table 1 .
The list of Benchmark functions.

Table 2 .
Values of parameters of each of six algorithms.

Table 3 .
The final results of mean and standard derivations (SD) of different algorithms for F 21 function.

Table 4 .
The final results of Mean and Standard Derivations (SD) of different algorithms for F 22 function.

Table 5 .
The final results of Mean and Standard Derivations (SD) of different algorithms for F 23 function.

Table 6 .
The final results of Mean and Standard Derivations (SD) of different algorithms for F 24 function.

Table 7 .
The final results of Mean and Standard Derivations (SD) of different algorithms for Ackley function.

Table 8 .
The final results of Mean and Standard Derivations (SD) of different algorithms for Rosenbrock function.

Table 9 .
The final results of Mean and Standard Derivations (SD) of different algorithms for Schwefel function.

Table 10 .
The final results of Mean and Standard Derivations (SD) of different algorithms for Weierstrass function.

Table 11 .
CPU time obtained by applying the different algorithms on F 21 function.

Table 12 .
CPU time obtained by applying the different algorithms on F 22 function.

Table 13 .
CPU time obtained by applying the different algorithms on F 23 function.

Table 14 .
CPU time obtained by applying the different algorithms on F 24 function.

Table 15 .
CPU time obtained by applying the different algorithms on Ackley function.

Table 16 .
CPU time obtained by applying the different algorithms on Rosenbrock function.

Table 17 .
CPU time obtained by applying the different algorithms on Schwefel function.

Table 18 .
CPU time obtained by applying the different algorithms on Weierstrass function.

Table 19 .
Time complexity of the BNMR algorithm on Weierstrass function.