Abstract
This study proposes the Fire Hawk Optimizer (FHO) as a novel metaheuristic algorithm based on the foraging behavior of whistling kites, black kites, and brown falcons. These birds are termed Fire Hawks owing to the specific actions they perform to catch prey in nature, namely by setting fires. Utilizing the proposed algorithm, a numerical investigation was conducted on 233 mathematical test functions with dimensions of 2–100, and 150,000 function evaluations were performed for optimization purposes. For comparison, a total of ten different classical and new metaheuristic algorithms were utilized as alternative approaches. The statistical measurements include the best, mean, median, and standard deviation of 100 independent optimization runs, while well-known statistical analyses, such as the Kolmogorov–Smirnov, Wilcoxon, Mann–Whitney, Kruskal–Wallis, and Post-Hoc analyses, were also conducted. The obtained results show that the FHO algorithm exhibits better performance than the compared algorithms from the literature. In addition, two of the latest Competitions on Evolutionary Computation (CEC), namely CEC 2020 on bound-constrained problems and CEC 2020 on real-world optimization problems including the well-known mechanical engineering design problems, were considered for performance evaluation of the FHO algorithm, which further demonstrated the superior capability of the optimizer over other metaheuristic algorithms in the literature. The capability of FHO is also evaluated on two real-size structural frames with 15 and 24 stories, in which the new method outperforms the previously developed metaheuristics.
1 Introduction
Optimization is the process of decision-making between multiple approaches to achieve the best performance for dealing with a specific system problem. In recent decades, the importance of optimization in the performance improvement of different engineering and economic design problems has gained increasing awareness. Specifically, the best decision or solution for a predefined design problem is identified by evaluating different alternative approaches. The quality of a decision is measured by an objective function, which in most cases serves as a performance evaluation index. In other words, optimization concerns the process of selecting the best decision among multiple alternative choices by considering the satisfaction of an objective function.
Regarding the rapid progression of various software programs and high-speed parallel processors in the computer science and technology fields, optimization has received heightened attention, especially from engineering and economic experts. However, most calculus-based optimization algorithms are incapable of finding global optimum solutions, which is considered the main deficiency of these algorithms. For instance, gradient-based algorithms require differentiable objective functions, which are not available in complex optimization problems. In this regard, metaheuristic algorithms have been proposed as successful practical methods that provide acceptable accuracy for different optimization purposes. The history of metaheuristic algorithms (Sörensen et al. 2018) can be broken down into five time-periods: (1) the “Pre-Theoretical Period” (before 1940) had limited formal presentation of metaheuristics; (2) the “Early Period” (1940–1980) witnessed the formal presentation of the mathematical formulation of metaheuristics; (3) the “Method-Centric Period” (1980–2000) saw the introduction and application of many metaheuristic algorithms in different fields; (4) the “Framework-Centric Period” (2000–present) observed the utilization of metaheuristics as strong frameworks in different optimization fields; and (5) the “Scientific or Future Period,” in which metaheuristics are expected to turn into a matter of science rather than an art.
Based on the development of different metaheuristic algorithms in recent decades, four categories can be identified by considering the main concepts of these algorithms. The first category, “Evolutionary Algorithms,” represents the algorithms that are developed based on biological reproduction and evolution, such as the Genetic Algorithm (GA) (Holland 1984), Differential Evolution (DE) (Storn and Price 1997), and Biogeography-Based Optimizer (BBO) (Simon 2008). The second category includes the algorithms that are developed based on “Swarm Intelligence,” such as Particle Swarm Optimization (PSO) (Eberhart and Kennedy 1995), Ant Colony Optimization (ACO) (Dorigo et al. 1996), and the Firefly Algorithm (FA) (Yang 2012). The third category, “Physics-Inspired Algorithms,” includes the Harmony Search (HS) (Geem et al. 2001), Gravitational Search Algorithm (GSA) (Rashedi et al. 2009), Big-Bang Big-Crunch (BBBC) (Erol and Eksin 2006), Charged System Search (CSS) (Kaveh and Talatahari 2010a, b, c, d), Wind Driven Optimization (WDO) (Bayraktar et al. 2010), Multi-verse Algorithm (MVO) (Mirjalili et al. 2016), Rain Fall Optimization (RFO) algorithm (Aghay Kaboli et al. 2017), Chaos Game Optimization (CGO) algorithm (Talatahari and Azizi 2020b, 2021a), Crystal Structure Algorithm (Talatahari et al. 2021a, b, c, d, e), Material Generation Algorithm (Talatahari et al. 2021a), and Atomic Orbital Search (Azizi 2021). In the last category, the algorithms are developed based on the lifestyle of humans and animals and include the Bees Algorithm (BA) (Pham et al. 2006), Imperialistic Competitive Algorithm (ICA) (Atashpaz-Gargari and Lucas 2007), Bat Inspired Algorithm (BIA) (Yang 2010a), Sine Cosine Algorithm (SCA) (Mirjalili 2016), Jaya Algorithm (JA) (Rao 2016), Whale Optimization Algorithm (WOA) (Mirjalili and Lewis 2016), Grey Wolf Optimizer (GWO) (Mirjalili et al. 2014), Harris Hawks Optimization (HHO) (Heidari et al.
2019), Butterfly Optimization Algorithm (BOA) (Arora and Singh 2019), Pity Beetle Algorithm (PBA) (Kallioras et al. 2018), Arithmetic Optimization Algorithm (AOA) (Abualigah et al. 2021a), Aquila Optimizer (AO) (Abualigah et al. 2021b), Interior search algorithm (ISA) (Gandomi 2014), and Drone Squadron Optimization (DSO) (de Melo and Banzhaf 2018). It should be noted that some of the standard algorithms have been improved or hybridized for specific applications (Azizi et al. 2019a, b, 2020a, b; Sadollah et al. 2018; Talatahari and Azizi 2020a, c, 2021b; Talatahari et al. 2021b, d, e).
In recent years, some newer metaheuristics have not yet been specifically categorized into the mentioned classifications. For example, Hayyolalam and Pourhaji Kazem (2020) proposed the Black Widow Optimization (BWO) algorithm, which mimics the unique mating behavior of black widow spiders in nature. Nematollahi et al. (2020) developed the Golden Ratio Optimization Method (GROM) as a novel metaheuristic algorithm inspired by the golden ratio of plant and animal growth in nature. Zhang and Jin (2020) presented the Group Teaching Optimization Algorithm (GTOA) that mimics the group teaching mechanism of humans. Li and Tam (2020) developed the Virus Spread Optimization (VSO) as a novel metaheuristic algorithm inspired by the spread of viruses among hosts. Alsattar et al. (2020) proposed the Bald Eagle Search (BES) algorithm for optimum design purposes in which the hunting intelligence and strategy of bald eagles in searching fishes is followed. Feng et al. (2021) presented the Cooperation Search Algorithm (CSA), which is motivated by the team cooperation behaviors in modern enterprises. Ghasemian et al. (2020) developed the Human Urbanization Algorithm (HUA) as a new metaheuristic algorithm that mimics the human behaviors and actions for improving life situations and urbanization. Kaveh et al. (2020) proposed the Black Hole Mechanics Optimization (BHMO) for optimization purposes, which is based on the mechanics of black holes in space. Braik et al. (2021) presented Capuchin Search (CSA) as a novel metaheuristic algorithm for optimum design purposes based on the dynamic behavior of capuchin monkeys. Ahmia and Aider (2019) developed Monarchy Metaheuristic (MN) for optimization purposes that is inspired by the monarchy government system. Brammya et al. (2019) proposed the Deer Hunting Optimization Algorithm (DHOA) as an optimization algorithm, which considers the behavior humans exhibit when hunting deer. 
Besides, the most important application of these algorithms can be in dealing with engineering design problems including the stress-based topology optimization by Xia et al. (2018), material and shape optimization of structures by Wang et al. (2020), structural engineering design by Zhao et al. (2018), optimization of phase change material and insulation layer thickness (Daqiqnia et al. 2021), performance-based structural design optimization by Gholizadeh et al. (2020), and Optimal design of structures by Gholizadeh and Salajegheh (2009).
In this paper, the Fire Hawk Optimizer (FHO) is proposed as a novel metaheuristic algorithm inspired by the foraging behavior of whistling kites, black kites, and brown falcons. These birds catch prey in nature by means of setting fires and, thus, are aptly called Fire Hawks. A numerical investigation was conducted to evaluate the performance of the FHO algorithm by considering 233 mathematical test functions with dimensions of 2–100 and completing 150,000 function evaluations for optimization purposes. For comparative purposes, a total of 10 different classical and new metaheuristic algorithms were analyzed as alternative approaches. The best, mean, median, and standard deviation results of 100 independent optimization runs were obtained for comparison, and the Kolmogorov–Smirnov (KS) test, Wilcoxon (W) signed rank test, Mann–Whitney (MW) test, Kruskal–Wallis (KW) test, and Post Hoc (PH) analyses were conducted accordingly. Moreover, two of the latest Competitions on Evolutionary Computation (CEC), including CEC 2020 on bound-constrained problems and CEC 2020 on real-world optimization problems, were utilized for further performance evaluation of the FHO algorithm and comparison to other metaheuristic algorithms in the literature. The capability of FHO is also evaluated on two real-size structural frames with 15 and 24 stories, in which the new method outperforms the previously developed metaheuristics.
This research’s novelty can be seen from both inspirational and computational points of view. The foraging behavior of fire hawks is utilized for the first time in this paper to develop a novel metaheuristic algorithm. Besides, the complexity level of the utilized test functions is another novel aspect of this paper. It should be noted that the overall performance of different algorithms must be evaluated under the same conditions and on the same problems; when diverse cases are used, the superiority of an algorithm cannot be proved or disproved by means of different examples and datasets. In this regard, the benchmark test functions of the well-known competitions on evolutionary computation should be utilized for a fair judgment, so the capability of FHO as a novel algorithm has been evaluated by utilizing different sets of CEC test problems. This level of complexity in choosing test functions has been utilized for the first time in evaluating a novel algorithm. The FHO algorithm’s advantages include being parameter-free, having quick convergence behaviour, and requiring the lowest possible number of objective function evaluations in dealing with different design examples. On the other hand, it cannot guarantee exact answers; in other words, the FHO algorithm is an approximation algorithm, like other metaheuristic algorithms. Nonetheless, although a plethora of metaheuristic algorithms has been proposed based on miscellaneous inspirational concepts from nature, the mathematical model and the specific search mechanisms of a new algorithm should be distinct and novel in order to place the research on a sound footing. PSO, for example, is one of the pioneer algorithms in the metaheuristic area, in which the position updating process of the solution candidates is conducted using the global best and the local best of each particle.
In stark contrast, in the FHO algorithm, the position updating process is carried out by utilizing a set of better solutions rather than only the global best, together with the mean of the solution candidates, which helps the searching process avoid entrapment in local optimum points. Furthermore, in the Genetic Algorithm (GA), a new solution candidate is created by combining two parent solutions, so there is a possibility of reaching a solution that is entrapped in a local optimum point. In the FHO algorithm, however, the mean of the solution candidates in a specific territory is utilized to avoid entrapment in local optima and provide solutions that can finally reach the global optimum.
The main contributions of this paper are as follows:

- The bizarre behaviour of Fire Hawks, which intentionally spread fire by carrying burning sticks in their beaks and talons, is examined and analysed to develop a mathematical model.

- A unique nature-inspired FHO algorithm is developed using this model.

- The FHO algorithm’s solution updating depends on the preys’ new positions and on safe places under/outside the fire.

- FHO’s performance is extensively evaluated against a set of 233 benchmark functions and well-known CEC design examples, and it is compared to a plethora of state-of-the-art metaheuristic algorithms.
The rest of the paper is divided into the following sections. In Sect. 2, the inspiration and mathematical model of the proposed FHO algorithm are presented. The numerical investigations, including the mathematical test functions, are indicated in Sect. 3. Alternative Metaheuristic algorithms used in this paper are described in Sect. 4. Furthermore, numerical results of the mathematical functions, statistical analysis, bound constraint benchmark problems of CEC 2020, and computational complexity and cost analysis are demonstrated in Sects. 5, 6, 7, and 8, respectively. Sections 9 and 10 indicate real-world constrained optimization problems of CEC 2020 and structural optimization, respectively. Finally, in Sect. 11, the core findings of this study are presented as concluding remarks.
2 Fire Hawk optimizer (FHO)
In this section, the inspiration concept of the FHO alongside the mathematical model of the proposed metaheuristic algorithm are illustrated.
2.1 Inspiration
Native Australians utilize fire as an effective tool to control and maintain the balance of the local ecosystem and landscape, a practice that has been part of their cultural and ethnic traditions for many years. Fires that are started on purpose, or occur naturally due to lightning, can be spread by people and other factors, increasing the vulnerability of the native landscape and wildlife. Moreover, whistling kites, black kites, and brown falcons are also responsible for spreading fires across the country; this alternative cause has only been realized recently. These birds, known as Fire Hawks, spread fire intentionally by carrying burning sticks in their beaks and talons, which is reported as a destructive phenomenon in nature. Figure 1 provides images showing the behavior of these birds around fires.
As a mechanism to control and capture their prey, the birds pick up burning sticks and drop them in unburned places in order to set small fires. These small fires scare the prey, including rodents, snakes, and other animals, and force them to flee in haste, making them much easier for the hawks to catch.
2.2 Mathematical model
The FHO metaheuristic algorithm mimics the foraging behavior of fire hawks, considering the process of setting and spreading fires and catching prey. At first, a number of solution candidates (\(\mathrm{X}\)) are determined as the position vectors of the fire hawks and prey. A random initialization process is utilized to identify the initial positions of these vectors in the search space.
$$\mathrm{X}=\left[\begin{array}{c}{\mathrm{X}}_{1}\\ {\mathrm{X}}_{2}\\ \vdots \\ {\mathrm{X}}_{\mathrm{N}}\end{array}\right],\quad {\mathrm{X}}_{\mathrm{i}}=\left[{\mathrm{x}}_{\mathrm{i}}^{1},{\mathrm{x}}_{\mathrm{i}}^{2},\ldots ,{\mathrm{x}}_{\mathrm{i}}^{\mathrm{d}}\right],\quad \mathrm{i}=1,2,\ldots ,\mathrm{N}$$

$${\mathrm{x}}_{\mathrm{i}}^{\mathrm{j}}\left(0\right)={\mathrm{x}}_{\mathrm{i},\mathrm{min}}^{\mathrm{j}}+\mathrm{rand}\cdot \left({\mathrm{x}}_{\mathrm{i},\mathrm{max}}^{\mathrm{j}}-{\mathrm{x}}_{\mathrm{i},\mathrm{min}}^{\mathrm{j}}\right),\quad \mathrm{j}=1,2,\ldots ,\mathrm{d}$$

where \({\mathrm{X}}_{\mathrm{i}}\) represents the ith solution candidate in the search space; \(\mathrm{d}\) represents the dimension of the considered problem; \(\mathrm{N}\) is the total number of solution candidates in the search space; \({\mathrm{x}}_{\mathrm{i}}^{\mathrm{j}}\) is the jth decision variable of the ith solution candidate; \({\mathrm{x}}_{\mathrm{i}}^{\mathrm{j}}(0)\) represents the initial position of the solution candidates; \({\mathrm{x}}_{\mathrm{i},\mathrm{min}}^{\mathrm{j}}\) and \({\mathrm{x}}_{\mathrm{i},\mathrm{max}}^{\mathrm{j}}\) are the minimum and maximum bounds of the jth decision variable for the ith solution candidate; and \(\mathrm{rand}\) is a uniformly distributed random number in the range of [0, 1].
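The initialization step above can be sketched in a few lines of NumPy. This is an illustrative implementation, not the authors' code; the function name, the seeded generator, and the array-based bounds are assumptions:

```python
import numpy as np

def initialize_population(N, d, x_min, x_max, seed=None):
    """Uniform random initialization of N candidate vectors in [x_min, x_max]^d,
    i.e. x_i^j(0) = x_min^j + rand * (x_max^j - x_min^j)."""
    rng = np.random.default_rng(seed)
    # one uniform draw in [0, 1) per decision variable of each candidate
    return x_min + rng.random((N, d)) * (x_max - x_min)
```

With per-variable bounds passed as length-d arrays, NumPy broadcasting applies each bound to its own column.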
In order to determine the locations of the Fire Hawks in the search space, the objective function is evaluated for the solution candidates with respect to the selected optimization problem. Some of the solution candidates with the best objective function values are designated as Fire Hawks, while the rest of the solution candidates are the prey. The selected Fire Hawks are utilized for spreading fires around the prey in the search space to make the hunting easier. Besides, the global best solution is assumed to be the main fire that is first utilized by the Fire Hawks to spread fires through the search space (nature). In Fig. 2a, b, the schematic presentation of these aspects is provided, which are mathematically presented as follows:
$$\mathrm{PR}=\left[{\mathrm{PR}}_{1},{\mathrm{PR}}_{2},\ldots ,{\mathrm{PR}}_{\mathrm{k}},\ldots ,{\mathrm{PR}}_{\mathrm{m}}\right],\quad \mathrm{FH}=\left[{\mathrm{FH}}_{1},{\mathrm{FH}}_{2},\ldots ,{\mathrm{FH}}_{\mathrm{l}},\ldots ,{\mathrm{FH}}_{\mathrm{n}}\right]$$

where \({\mathrm{PR}}_{\mathrm{k}}\) is the kth prey in the search space regarding the total number of m prey; and \({\mathrm{FH}}_{\mathrm{l}}\) is the lth fire hawk considering a total number of n fire hawks in the search space.
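The split between Fire Hawks and prey can be sketched as follows (a hypothetical helper, assuming minimization and a chosen number of hawks `n_hawks`):

```python
import numpy as np

def split_hawks_and_prey(X, objective, n_hawks):
    """Rank candidates by objective value; the best n_hawks rows become
    Fire Hawks, the remaining rows are prey (minimization assumed)."""
    f = np.apply_along_axis(objective, 1, X)
    order = np.argsort(f)           # best (smallest) objective first
    X_sorted = X[order]
    return X_sorted[:n_hawks], X_sorted[n_hawks:]
```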
In the next phase of the algorithm, the total distance between the Fire Hawks and the prey is calculated. As a result, the nearest prey to each bird is determined so that the effective territory of each bird can be distinguished. It should be noted that the nearest prey to the first Fire Hawk, with the best objective function value, is determined first, while the territories of the other birds are determined by means of the remaining prey. Figure 3 provides an illustration of this perspective, where \({\mathrm{D}}_{\mathrm{k}}^{\mathrm{l}}\) is determined by means of the following equation:
$${\mathrm{D}}_{\mathrm{k}}^{\mathrm{l}}=\sqrt{{\left({\mathrm{x}}_{2}-{\mathrm{x}}_{1}\right)}^{2}+{\left({\mathrm{y}}_{2}-{\mathrm{y}}_{1}\right)}^{2}},\quad \mathrm{l}=1,2,\ldots ,\mathrm{n},\quad \mathrm{k}=1,2,\ldots ,\mathrm{m}$$

where \({\mathrm{D}}_{\mathrm{k}}^{\mathrm{l}}\) is the total distance between the lth fire hawk and the kth prey; \(\mathrm{m}\) is the total number of prey in the search space; \(\mathrm{n}\) is the total number of fire hawks in the search space; and (\({\mathrm{x}}_{1},{\mathrm{y}}_{1}\)) and (\({\mathrm{x}}_{2},{\mathrm{y}}_{2}\)) represent the coordinates of the Fire Hawks and prey in the search space.
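The distance computation can be vectorized over all hawk–prey pairs at once; this sketch generalizes the 2D formula to a d-dimensional search space (an assumption, since the formula above is written for plane coordinates):

```python
import numpy as np

def pairwise_distances(hawks, prey):
    """Euclidean distance D_k^l between every fire hawk l (rows of `hawks`)
    and every prey k (rows of `prey`); returns an (n, m) matrix."""
    diff = hawks[:, None, :] - prey[None, :, :]   # shape (n, m, d)
    return np.sqrt((diff ** 2).sum(axis=2))
```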
After conducting the mentioned procedure for measuring the total distance between the Fire Hawks and prey, the territory of each bird is distinguished by means of the nearest prey around it. With the Fire Hawks and prey classified in this way, the searching process of the algorithm is configured. It should be noted that the Fire Hawk with the best objective function value selects the nearest prey in the search space for its specific territory. The other Fire Hawks then select the next nearest prey in the search space, reflecting the fact that the strongest Fire Hawks perform more successful hunts than the weaker birds. In Fig. 4, the schematic presentation of determining the Fire Hawks’ territories in the search space is provided.
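One way to realize this greedy claim order in code is sketched below. The round-robin claiming loop and the tie-breaking rule are assumptions of this sketch, not details spelled out in the text:

```python
import numpy as np

def assign_territories(hawks, prey):
    """Greedy territory assignment: hawks are assumed sorted best-first, so
    the best hawk (row 0) claims its nearest unclaimed prey before the rest.
    Returns one index array into `prey` per hawk; every prey is claimed."""
    D = np.sqrt(((hawks[:, None, :] - prey[None, :, :]) ** 2).sum(axis=2))
    remaining = list(range(prey.shape[0]))
    territories = [[] for _ in range(hawks.shape[0])]
    while remaining:
        for l in range(hawks.shape[0]):
            if not remaining:
                break
            k = min(remaining, key=lambda j: D[l, j])  # nearest unclaimed prey
            territories[l].append(k)
            remaining.remove(k)
    return [np.array(t, dtype=int) for t in territories]
```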
In the next phase of the algorithm, the Fire Hawks collect burning sticks from the main fire in order to set fire in the selected area. In this stage, each bird picks up a burning stick then drops it in its specific territory to force the prey to hastily flee. Meanwhile, some birds are eager to use the burning sticks from other Fire Hawks’ territories; therefore, these two behaviors can be utilized as position updating procedures in the main search loop of FHO, as indicated in the following equation:
$${\mathrm{FH}}_{\mathrm{l}}^{\mathrm{new}}={\mathrm{FH}}_{\mathrm{l}}+\left({\mathrm{r}}_{1}\times \mathrm{GB}-{\mathrm{r}}_{2}\times {\mathrm{FH}}_{\mathrm{Near}}\right),\quad \mathrm{l}=1,2,\ldots ,\mathrm{n}$$

where \({\mathrm{FH}}_{\mathrm{l}}^{\mathrm{new}}\) is the new position vector of the lth Fire Hawk (\({\mathrm{FH}}_{\mathrm{l}}\)); \(\mathrm{GB}\) is the global best solution in the search space considered as the main fire; \({\mathrm{FH}}_{\mathrm{Near}}\) is one of the other Fire Hawks in the search space; and \({\mathrm{r}}_{1}\) and \({\mathrm{r}}_{2}\) are uniformly distributed random numbers in the range of (0, 1) for determining the movements of Fire Hawks toward the main fire and the other Fire Hawks’ territories (see Fig. 5).
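This update rule translates directly into code; the sketch below uses a seeded generator for reproducibility, which is an assumption of this illustration:

```python
import numpy as np

def update_hawk(FH_l, GB, FH_near, seed=None):
    """FH_l^new = FH_l + (r1 * GB - r2 * FH_near), with r1, r2 ~ U(0, 1):
    a move driven by the main fire GB and by another hawk's territory."""
    rng = np.random.default_rng(seed)
    r1, r2 = rng.random(2)
    return FH_l + (r1 * GB - r2 * FH_near)
```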
In the next phase of the algorithm, the movement of prey inside the territory of each Fire Hawk is considered, as a key aspect of animal behavior, in the position updating process. When a burning stick is dropped by a Fire Hawk, the prey may decide to hide, run away, or even run towards the Fire Hawk by mistake. These actions can be considered in the position updating process by using the following equation:
$${\mathrm{PR}}_{\mathrm{q}}^{\mathrm{new}}={\mathrm{PR}}_{\mathrm{q}}+\left({\mathrm{r}}_{3}\times {\mathrm{FH}}_{\mathrm{l}}-{\mathrm{r}}_{4}\times {\mathrm{SP}}_{\mathrm{l}}\right),\quad \mathrm{q}=1,2,\ldots ,\mathrm{r}$$

where \({\mathrm{PR}}_{\mathrm{q}}^{\mathrm{new}}\) is the new position vector of the qth prey (\({\mathrm{PR}}_{\mathrm{q}}\)) surrounded by the lth Fire Hawk (\({\mathrm{FH}}_{\mathrm{l}}\)); \(\mathrm{r}\) is the total number of prey inside the territory of the lth Fire Hawk; \(\mathrm{GB}\) is the global best solution in the search space considered as the main fire; \({\mathrm{SP}}_{\mathrm{l}}\) is a safe place under the lth Fire Hawk territory; and \({\mathrm{r}}_{3}\) and \({\mathrm{r}}_{4}\) are uniformly distributed random numbers in the range of (0, 1) for determining the movements of prey toward the Fire Hawks and the safe place (see Fig. 6).
Besides, the prey may move toward the territories of the other Fire Hawks: a prey may get closer to a Fire Hawk lying in ambush nearby, or even try to hide in a safer place outside the territory in which it is entrapped. These actions can be considered in the position updating process by using the following equation (Fig. 7):
$${\mathrm{PR}}_{\mathrm{q}}^{\mathrm{new}}={\mathrm{PR}}_{\mathrm{q}}+\left({\mathrm{r}}_{5}\times {\mathrm{FH}}_{\mathrm{Alter}}-{\mathrm{r}}_{6}\times \mathrm{SP}\right),\quad \mathrm{q}=1,2,\ldots ,\mathrm{r}$$

where \({\mathrm{PR}}_{\mathrm{q}}^{\mathrm{new}}\) is the new position vector of the qth prey (\({\mathrm{PR}}_{\mathrm{q}}\)) surrounded by the lth fire hawk (\({\mathrm{FH}}_{\mathrm{l}}\)); \({\mathrm{FH}}_{\mathrm{Alter}}\) is one of the other fire hawks in the search space; \(\mathrm{SP}\) is a safe place outside the lth Fire Hawk’s territory; and \({\mathrm{r}}_{5}\) and \({\mathrm{r}}_{6}\) are uniformly distributed random numbers in the range of (0, 1) for determining the movements of prey toward the other Fire Hawks and the safe place outside the territory.
Given that a safe place in nature is a place where most animals gather in order to remain safe and sound during a hazard, the mathematical presentations of \({\mathrm{SP}}_{\mathrm{l}}\) and \(\mathrm{SP}\) are formulated as follows:
$${\mathrm{SP}}_{\mathrm{l}}=\frac{\sum_{\mathrm{q}=1}^{\mathrm{r}}{\mathrm{PR}}_{\mathrm{q}}}{\mathrm{r}},\qquad \mathrm{SP}=\frac{\sum_{\mathrm{k}=1}^{\mathrm{m}}{\mathrm{PR}}_{\mathrm{k}}}{\mathrm{m}}$$

where \({\mathrm{PR}}_{\mathrm{q}}\) is the qth prey surrounded by the lth fire hawk (\({\mathrm{FH}}_{\mathrm{l}}\)); \({\mathrm{PR}}_{\mathrm{k}}\) is the kth prey in the search space; \(\mathrm{r}\) is the total number of prey inside the territory of the lth fire hawk; and \(\mathrm{m}\) is the total number of prey in the search space.
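The two safe places and the two prey moves can be sketched together as follows. This is illustrative only; in particular, the 50/50 random choice between the inside-territory and outside-territory moves is an assumption of this sketch:

```python
import numpy as np

def safe_place_local(prey_in_territory):
    """SP_l: mean position of the prey inside the l-th fire hawk's territory."""
    return prey_in_territory.mean(axis=0)

def safe_place_global(all_prey):
    """SP: mean position of all prey in the search space."""
    return all_prey.mean(axis=0)

def update_prey(PR_q, FH_l, SP_l, FH_alter, SP, seed=None):
    """Move a prey either within its own territory or toward another one:
       inside : PR_q + (r3 * FH_l     - r4 * SP_l)
       outside: PR_q + (r5 * FH_alter - r6 * SP)
    """
    rng = np.random.default_rng(seed)
    ra, rb = rng.random(2)
    if rng.random() < 0.5:                 # assumed 50/50 behavior choice
        return PR_q + (ra * FH_l - rb * SP_l)
    return PR_q + (ra * FH_alter - rb * SP)
```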
It should be noted that the territory of each fire hawk is assumed to be a circular area for schematic presentation purposes only; the exact definition of a territory depends on the overall distances between the prey and the considered fire hawk. In other words, prey positioned in a specific fire hawk’s territory are assumed to be affected by that fire hawk and not the others, so the number of prey and their distances to the considered fire hawk determine the limits of its territory. Meanwhile, the possibility of prey moving outside their own territory is also considered in the position updating process, reflecting the fact that prey can be affected by fire hawks from other territories. The number of prey in each search loop is the total number of solution candidates minus the number of fire hawks, which is determined randomly through Brownian motion with a Gaussian distribution, one of the well-known distributions utilized in randomization procedures.
The general aspects of FHO, including the handling of boundary violations by solution candidates and the termination criterion, are also considered in the mathematical model of this algorithm. In this regard, a mathematical flag is implemented in FHO to control boundary violations of decision variables, while a predefined number of objective function evaluations or iterations can be utilized as the termination criterion. In Fig. 8, the pseudo-code of the FHO algorithm is provided, and Fig. 9 presents the flowchart of this algorithm.
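The boundary flag and evaluation-budget termination can be sketched as follows. These are illustrative helpers; clamping to the bounds is one common boundary-control choice, not necessarily the authors' exact scheme:

```python
import numpy as np

def enforce_bounds(X, x_min, x_max):
    """Clamp any violating decision variables back into [x_min, x_max]."""
    return np.clip(X, x_min, x_max)

def terminated(n_evals, max_evals=150_000):
    """Stop once the objective-function evaluation budget is exhausted
    (150,000 evaluations in this study)."""
    return n_evals >= max_evals
```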
3 Mathematical test functions
In order to conduct a comprehensive investigation of FHO, a total of 233 mathematical test functions were collected, which are briefly described in this section. All tests to evaluate the performance of the FHO algorithm were conducted using a PC with the detailed parameters shown in Table 1. These functions have dimensions varying from 2 to 100 and are among the most well-known unconstrained mathematical test functions in the field of global optimization. In this selection, 117 test functions have a minimum dimension of 2 and a maximum dimension of 4 (F1–F117) and are denoted “Small Scale (SS) functions” in this paper. In addition, 58 functions with a dimension of 50 (F118–F175) were named “50D functions,” and 58 functions (F176–F233) with a dimension of 100 were termed “100D functions.” The general description of these test functions is provided in Table 2, and the 3D plots are provided in Figs. 10, 11, 12, 13. The fully-detailed mathematical presentations of these test functions were obtained from Jamil and Yang (2013), Jamil et al. (2013), Yang (2010b), and Liang et al. (2005).
4 Alternative metaheuristics
Regarding the fact that the overall performance of FHO should be evaluated by comparative means, a set of metaheuristic algorithms, including some of the latest and most important algorithms in the fields of optimization and metaheuristics, was chosen for this purpose. These include the Bat-Inspired Algorithm (BIA), Big-Bang Big-Crunch (BBBC) algorithm, Butterfly Optimization Algorithm (BOA), Cyclical Parthenogenesis Algorithm (CPA), Charged System Search (CSS) algorithm, Galactic Swarm Optimization (GSO), Jaya Algorithm (JA), Rain Fall Optimization (RFO) algorithm, Tug of War Optimization (TWO), Wind Driven Optimization (WDO) algorithm, Whale Optimization Algorithm (WOA), Grey Wolf Optimizer (GWO), and the Ray Optimization Algorithm (ROA). It should be noted that some of these algorithms require a number of parameters to be tuned to improve their performance. Therefore, a parameter investigation process was conducted for this purpose, in which the most successful sets of parameters were derived from the literature. Table 3 specifies the parameters of these algorithms.
5 Numerical results of the mathematical functions
The results of FHO alongside the other 10 metaheuristic algorithms in dealing with the considered 233 mathematical test functions are described in this section. A total of 150,000 objective function evaluations was allowed, while a tolerance of \(1\times {10}^{-12}\) was considered as the stopping criterion for FHO and the selected alternatives. A total of 100 optimization runs was considered for statistical purposes in order to calculate the mean and standard deviation of the optimization results. Besides, a fixed random state was used for FHO and the other alternatives in order to allow a comparative investigation under equal conditions. Since all of the alternative algorithms were implemented individually in this study, and none of the previously published results were reused, a tolerance of \(1\times {10}^{-12}\) is an achievable stopping criterion across the 233 test functions, which increases the complexity level of the study. It should also be noted that this number of test functions is utilized for the first time for testing a metaheuristic algorithm; for a smaller number of test functions, a smaller tolerance such as \({10}^{-30}\) or \({10}^{-50}\) could be utilized.
The best values of the considered 100 optimization runs are all presented in Table 4 for the SS, 50D and 100D mathematical test functions. Based on the results, FHO outranks the other metaheuristic algorithms in most of the cases. The mean values of the Function Evaluations (Fun. Evl.) in each of the conducted optimization runs regarding the alternative metaheuristics are presented in Table 27 in the Appendix. It is obvious that FHO is able to converge to better results than the other metaheuristics with a minimum number of required function evaluations which represents the ability of this algorithm to deal with computational complexity issues.
The convergence curves of the 100 optimization runs for FHO in Fig. 14 help to clarify the algorithm’s ability to deal with the mathematical test functions; the best, median, and worst runs are highlighted for clarity. It can be concluded that FHO performs a fast optimization procedure in most of the cases, while completing the predefined 150,000 objective function evaluations is not required to achieve the tolerance of \(1\times {10}^{-12}\).
6 Statistical analysis
As the mean and standard deviation of the results alone do not fully represent the overall performance of the algorithms in dealing with the test functions, a complete statistical analysis was performed using several conclusive statistical tests. Specifically, the Kolmogorov–Smirnov (KS) test was used for evaluating the normality of the results, the Mann–Whitney (MW) test for two-by-two comparison of the summation of different metaheuristics’ ranks, the Wilcoxon (W) signed ranks test for two-by-two comparison of the mean of different algorithms’ ranks, and the Kruskal–Wallis (KW) test for comparing the mean ranks of the metaheuristic algorithms to obtain their overall rankings. A Post Hoc (PH) analysis was also conducted for comprehensive comparative purposes.
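All four tests are available in SciPy. The snippet below is a sketch on synthetic data; the arrays merely stand in for the 100-run results of two algorithms and are not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
runs_a = rng.normal(1.0, 0.1, 100)  # stand-in: 100 run results of algorithm A
runs_b = rng.normal(1.5, 0.1, 100)  # stand-in: 100 run results of algorithm B

# KS normality check on standardized data: p < 0.05 suggests non-normality,
# pointing to the non-parametric tests below
_, p_ks = stats.kstest((runs_a - runs_a.mean()) / runs_a.std(), "norm")

# Two-by-two non-parametric comparisons
_, p_mw = stats.mannwhitneyu(runs_a, runs_b)  # Mann-Whitney rank-sum test
_, p_w = stats.wilcoxon(runs_a, runs_b)       # Wilcoxon signed-rank (paired)

# Kruskal-Wallis compares more than two algorithms simultaneously
_, p_kw = stats.kruskal(runs_a, runs_b, rng.normal(1.2, 0.1, 100))
```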
6.1 Kolmogorov Smirnov (KS) test
The KS test was utilized to determine whether parametric or non-parametric statistical tests should be conducted on the collected dataset, considering the mathematical test functions and alternative metaheuristic algorithms. Specifically, if the p value of this statistical test is less than 0.05, the dataset is not normally distributed and non-parametric tests should be used; if the p value is greater than 0.05, parametric tests should be conducted. Table 5 provides the results of this test for the dataset collected in the previous section regarding the SS, 50D, and 100D test functions. Based on the results, it is obvious that the non-parametric statistical tests, including the W, MW, and KW tests, should be used for further investigation.
6.2 Mann Whitney (MW) test
In this statistical test, the null hypothesis states that there is no difference between two randomly selected variables from different datasets, judged by the summation of the variables’ ranks. The statistical dataset with the smaller summation of ranks exhibits better statistical behavior. In Table 6, the results of the MW test, including the summation of ranks for different metaheuristics in dealing with the SS, 50D, and 100D functions, are presented in a two-by-two manner. In this table, if the summation of ranks regarding the MW test in the second row is lower than the value in the first row, FHO is superior to the other metaheuristic. It should be noted that the proposed FHO has the lower summation of ranks in most of the cases, which verifies its superiority over the other algorithms.
6.3 Wilcoxon (W) signed ranks test
In this section, the results of the W test are presented, in which the means of ranks are compared in a two-by-two manner. The null hypothesis of this test states that there is no difference between two randomly selected variables from two datasets, judged by the mean of the variables' ranks; the dataset with the lower mean of ranks represents better statistical behavior. The results of this test in dealing with the SS, 50D, and 100D functions are provided in Table 7 in a two-by-two manner. In this table, if the rank values in the second row of each pair are lower than those in the first row, FHO is superior to the other metaheuristic. It is evident that FHO provides a lower mean of ranks in most of the cases and, thus, outranks the other algorithms.
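Unlike the MW rank sum, the signed-rank idea works on paired differences between two algorithms' runs on the same function. A simplified sketch (zero differences dropped; ties in absolute differences assumed absent for brevity):

```python
def signed_rank_sums(a, b):
    """Wilcoxon signed-rank sums for paired results: rank the absolute
    differences and total the ranks of positive and negative differences."""
    # Paired differences; zeros are dropped per the standard procedure.
    diffs = sorted((x - y for x, y in zip(a, b) if x != y), key=abs)
    w_pos = sum(r for r, d in enumerate(diffs, start=1) if d > 0)
    w_neg = sum(r for r, d in enumerate(diffs, start=1) if d < 0)
    return w_pos, w_neg
```

When minimizing, a small positive sum (few and small cases where `a` is worse than `b`) favors the first algorithm.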
6.4 Kruskal–Wallis (KW) test
The KW test is a well-known statistical test in which the overall rankings of multiple variables in different datasets are considered. Since the previously conducted MW and W tests are applied in a two-by-two manner based on the summation and mean of ranks, respectively, the KW test is utilized to compare the mean of ranks across all datasets simultaneously. In this test, the dataset with the lower mean of ranks has better statistical behavior. The results of the KW test for the different metaheuristics in dealing with the SS, 50D, and 100D mathematical test functions are presented in Figs. 15, 16, and 17, respectively, in which the results of the algorithm with the lowest mean of ranks (superior to the others) are bolded. Again, FHO provides the lowest mean of ranks in all of the cases and accordingly outranks the other algorithms.
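The KW mean-of-ranks comparison generalizes the two-sample rank sum to several algorithms at once via the H statistic, H = 12/(N(N+1)) · Σ R_i²/n_i − 3(N+1), where R_i is the rank sum of group i. A minimal sketch without tie correction (a statistics library would normally handle ties and p values):

```python
def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic (no tie correction, values assumed
    distinct): compares the mean of ranks across several result sets."""
    # Pool all values, tagging each with its group index, and rank them.
    pooled = sorted((v, g) for g, data in enumerate(groups) for v in data)
    n = len(pooled)
    rank_sum = [0.0] * len(groups)
    for rank, (_, g) in enumerate(pooled, start=1):
        rank_sum[g] += rank
    return 12.0 / (n * (n + 1)) * sum(
        r * r / len(grp) for r, grp in zip(rank_sum, groups)
    ) - 3.0 * (n + 1)
```

H near zero means the groups share the same mean of ranks; a large H indicates at least one algorithm's rank distribution differs.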
6.5 Post Hoc (PH) analysis
In most statistical investigations, additional analyses are conducted for further evaluation and comparison of the statistical data from a different point of view. In this work, the results of the KW test were used as the base data for the PH analysis, in which the overall rankings of the different metaheuristics were determined by considering the results of all 233 mathematical functions simultaneously. According to the overall rankings of the metaheuristics in Fig. 18, FHO achieves a better ranking (bolded values) in all cases.
7 Bound constraint benchmark problems of CEC 2020
Since proposed metaheuristic algorithms should be evaluated using difficult optimization problems that have been benchmarked in competitions, such as the Competitions on Evolutionary Computation (CEC), the recent CEC 2020 suite on single objective bound constrained numerical optimization proposed by Yue et al. (2020) was adopted here. Ten of its well-established benchmark mathematical test problems were utilized to thoroughly investigate the overall behavior of FHO and demonstrate its capability in dealing with complex problems via comparison to state-of-the-art algorithms. A brief description of these problems is presented in Table 8, while their complete mathematical formulations are given in (Sallam et al. 2020; Yue et al. 2020).
The comparative results of FHO and the alternative metaheuristic algorithms on CEC 2020 are presented in Tables 9, 10, and 11, based on the 30 optimization runs conducted for statistical purposes. Considering the best, mean, and standard deviation of these runs for FHO and the alternative methods, it can be concluded that FHO competes on equal terms with these algorithms. The values in Table 9 are the errors between the global optimum solutions and the optimum values calculated by each algorithm, while Tables 10 and 11 report the corresponding mean and standard deviation of these error values.
Four well-known statistical analyses, including Kolmogorov–Smirnov (KS), Wilcoxon (W), Mann–Whitney (MW), and Kruskal–Wallis (KW), were used to better judge the performance of FHO in dealing with the CEC 2020 bound-constrained test functions. In Table 12, the p values of the KS, MW, and W statistical tests are presented for comparative purposes, while Table 13 presents the maximum difference between the different approaches for the KS, MW, and W tests, with the capability of FHO in perspective. The summation (MW test) and mean (W test) of ranks for FHO are lower than those of the other approaches in most of the cases, which demonstrates the algorithm's ability to handle these kinds of difficult problems. In Table 14, the overall rankings of the algorithms in dealing with the CEC 2020 benchmark test functions are presented based on the mean of ranks. It should be noted that FHO ranks third, which is acceptable given that the first two are among the most powerful and well-formulated algorithms and represent a challenge for improved versions of FHO to compete with in the future.
8 Computational complexity and cost analysis
In recent decades, complexity has been a major concern for new algorithms due to its importance in conducting time- and complexity-efficient optimization procedures. For this purpose, the computational complexity procedures of the CEC 2020 benchmark suite on bound constrained optimization were utilized. In these procedures, T0 denotes the run time of a specific mathematical procedure, presented in detail by Yue et al. (2020), which should be repeated 1,000,000 times.
The term T1 refers to the computational time for 200,000 function evaluations of the G1 function; T2 denotes the computational time of the considered algorithm (FHO in this paper) for 200,000 function evaluations of the G1 function; and \(\widehat{T}_2\) is the mean value of five calculations of T2. Table 15 presents the computational time complexity results of FHO and the other algorithms obtained by means of these procedures, in which the superiority of FHO is demonstrated.
The computational complexity of a metaheuristic algorithm can also be analyzed using "Big O notation", a well-known mathematical notation in computer science that describes the required run time and memory usage of an algorithm for comparative purposes, regardless of the type of computer or hardware restrictions. Although it is common to measure the complexity of an algorithm numerically, such measurements depend on the run-time environment, so a hardware-independent description is also required. First, NP is defined as the total number of initial solution candidates, and D as the dimension of the considered problem. The initialization phase of FHO has a computational complexity of O(NP × D), and the objective function evaluation in this phase costs O(NP) × O(F(x)), where F(x) is the objective function of the considered problem. Each line in the main search loop of FHO is executed up to the total number of iterations (MaxIter). The position updating process for the solution candidates has a computational complexity of O(MaxIter × NP × D × 3), accounting for the movements of the Fire Hawks and the prey. Finally, the objective function evaluation in the main search loop of FHO has a computational complexity of O(MaxIter × NP × D × 3) × O(F(x)).
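These complexity terms can be made concrete with a schematic population loop (a placeholder update rule, not FHO's actual movement equations): initialization touches NP × D coordinates, and each iteration performs NP position updates of O(D) plus NP objective evaluations:

```python
import random

def population_search(objective, pop_size, dim, max_iter):
    """Schematic population-based search loop used only to illustrate
    the complexity terms; returns the total objective-evaluation count."""
    evals = 0
    # Initialization: O(NP x D) coordinate assignments.
    pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
    fitness = [objective(x) for x in pop]  # O(NP) x O(F(x))
    evals += pop_size
    for _ in range(max_iter):  # main loop: MaxIter iterations
        for i in range(pop_size):
            # Position update touches every coordinate: O(D) per candidate.
            pop[i] = [c + random.uniform(-0.01, 0.01) for c in pop[i]]
            fitness[i] = objective(pop[i])  # one evaluation per candidate
            evals += 1
    return evals  # NP + MaxIter x NP evaluations in total
```

Counting evaluations this way makes the O(MaxIter × NP) × O(F(x)) term explicit, which usually dominates the run time when F(x) is expensive.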
To further investigate the complexity of FHO, the balance between exploration and exploitation was considered as a key factor that lends insight into the convergence speed, one of the critical properties of metaheuristic algorithms. Therefore, diversity graphs for the FHO algorithm in dealing with the F2, F8, F24, F47, F50, F61, F66, and F78 mathematical test functions were obtained and are depicted in Figs. 19, 20, 21, 22, 23, 24, 25, and 26, respectively. It is evident that the population of FHO tends to localize the search during the optimization process to provide optimum results.
9 Real-world constrained optimization problems of CEC 2020
The capability of a metaheuristic algorithm is often judged on real-world optimization problems, in which design constraints and bound constraints must be satisfied to obtain feasible solutions. In this regard, the capability of FHO was evaluated on such problems to make a valid judgment about its overall performance. To do so, the real-world constrained optimization problems from CEC 2020 (Kumar et al. 2020) and some structural engineering problems from the literature were selected for numerical investigation. A brief description of these problems is presented in Table 16, while their complete mathematical formulations can be found in the literature. It is pertinent to mention that 25 independent optimization runs were conducted using 20,000 function evaluations each, and a simple penalty approach was utilized for constraint handling purposes.
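The "simple penalty approach" is not specified further in the paper; one common static-penalty form, shown here as an assumption rather than the authors' exact scheme, adds a large multiple of the total constraint violation to the objective so that infeasible candidates rank behind feasible ones:

```python
def penalized(objective, constraints, x, factor=1e6):
    """Static penalty (an assumed scheme, not necessarily the paper's):
    each constraint g is feasible when g(x) <= 0; violations are summed
    and scaled so infeasible designs rank behind feasible ones."""
    violation = sum(max(0.0, g(x)) for g in constraints)
    return objective(x) + factor * violation
```

Any metaheuristic then minimizes `penalized` instead of the raw objective; `factor` is a hypothetical constant that in practice must dominate the objective's scale.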
In Tables 17 and 18, the comparative results of FHO and other algorithms are presented, including the design variables of the best optimization run and the statistical results of the 25 optimization runs. It is evident that FHO provides very competitive results in this case.
In Tables 19 and 20, the best and statistical results of the 25 optimization runs conducted by FHO are compared with those of other algorithms reported in the literature. It can be seen that the best result over multiple runs was achieved by FHO (2994.44), compared to 2994.47 for another algorithm, verifying the acceptable performance of FHO. Meanwhile, FHO also provides better statistical results, including a mean of 2994.46 over multiple runs.
Tables 21 and 22 present the comparative and statistical results of multiple metaheuristics in dealing with the step-cone pulley constraint problem. It is apparent that the results of FHO are superior, achieving the best optimum value of 16.07 for the objective function, while the other algorithms could only reach 16.63.
In Tables 23 and 24, the comparative and statistical results of multiple metaheuristics in dealing with the multiple disk clutch brake constraint problem are presented. It can be seen that FHO achieved the best optimum value of 0.26 for the objective function, which is superior to the 0.31 provided by the other algorithms.
Comparison of the results of FHO with other metaheuristics from the literature demonstrates that this algorithm is capable of yielding acceptable results in dealing with the complex optimization problems of CEC 2020 and other challenges. Besides, the statistical results of FHO over the 25 independent optimization runs, including the mean, worst, and standard deviation, were also outstanding compared to the other approaches.
10 Structural optimization
Metaheuristic algorithms can solve optimization problems in different fields regardless of the possible multi-level complexity of these problems, which presents great challenges. Accordingly, the capability of these algorithms should be evaluated by means of difficult optimization problems, including the optimization of real-size structural frame designs of buildings. To perform a comparative investigation of such challenges, two benchmark structural frame design optimization problems, namely 15-story and 24-story buildings with steel wide-flange sections as structural elements, were selected for further investigation. The first design example has 3 bays and 15 stories with a total of 105 structural elements combined into 11 design groups. The second design problem has 3 bays and 24 stories with a total of 268 structural elements combined into 20 design groups (Davison and Adams 1974).
The results of the weight optimization procedure for the 15- and 24-story benchmark structural frames utilizing FHO are provided in Tables 25 and 26, in which the results of the other algorithms are also presented for comparative purposes. It is evident that FHO again produced outstanding results in dealing with these complex problems.
11 Conclusions
In this study, the FHO is proposed as a novel metaheuristic algorithm motivated by the foraging behavior of whistling kites, black kites, and brown falcons. These birds follow specific actions to catch prey, including setting fire to prey-dominant areas, and thus are called Fire Hawks. To evaluate the performance of this algorithm, a total of 233 mathematical functions with dimensions varying from 2 to 100 were utilized as test functions, while a maximum number of 150,000 function evaluations and 100 optimization runs were considered in the optimization process. A total of 10 alternative metaheuristic algorithms were utilized for comparative purposes, and well-known statistical analyses were also conducted to further verify the capability of FHO. In addition, two of the latest Competitions on Evolutionary Computation (CEC), i.e. CEC 2020 on bound constraint problems and CEC 2020 on real-world optimization problems, were considered for the performance evaluation of FHO in addition to complete computational cost and complexity analyses. The results of this research are summarized as follows:
-
FHO outranks the compared algorithms in the selected mathematical test functions, by converging to the predefined tolerance of the global best in a faster and more efficient way.
-
FHO is able to converge to the global best of the mathematical test functions by requiring a lower number of objective function evaluations, which proves its efficiency from a computational point-of-view.
-
The W and MW statistical test results prove that FHO provides lower values for the mean and summation of ranks in two-by-two comparisons, which demonstrates the capability of this algorithm from a statistical viewpoint.
-
FHO outranks the other algorithms in terms of the mean of ranks in the KW test, where a lower mean proves its capability compared with the alternative methods.
-
FHO ranks third among seven well-established metaheuristic algorithms in the CEC 2020 competition on bound constrained problems, which is a step forward for future challenges, while further study is suggested to improve FHO toward a better ranking.
-
The results of FHO in dealing with the CEC 2020 real-world optimization problems prove that FHO is able to yield acceptable results in these cases.
-
FHO provides better and even outstanding results in dealing with the structural engineering design problems compared to the state-of-the-art algorithms from the CEC 2020 competitions.
For future research, FHO can be implemented in different fields of optimization by considering complex constraint problems, while improving the general formulation of this algorithm.
References
Abualigah L, Diabat A, Mirjalili S, Abd Elaziz M, Gandomi AH (2021a) The arithmetic optimization algorithm. Comput Methods Appl Mech Eng 376:113609. https://doi.org/10.1016/j.cma.2020.113609
Abualigah L, Yousri D, AbdElaziz M, Ewees AA, Al-qaness MAA, Gandomi AH (2021b) Aquila optimizer: a novel meta-heuristic optimization algorithm. Comput Ind Eng 157:107250. https://doi.org/10.1016/j.cie.2021.107250
Aghay Kaboli SH, Selvaraj J, Rahim NA (2017) Rain-fall optimization algorithm: a population based algorithm for solving constrained optimization problems. J Comput Sci 19:31–42. https://doi.org/10.1016/j.jocs.2016.12.010
Ahmia I, Aider M (2019) A novel metaheuristic optimization algorithm: the monarchy metaheuristic. Turk J Electr Eng Comput Sci 27(1):362–376
Akhtar S, Tai K, Ray T (2002) A socio-behavioural simulation model for engineering design optimization. Eng Optim 34(4):341–354. https://doi.org/10.1080/03052150212723
Alsattar HA, Zaidan AA, Zaidan BB (2020) Novel meta-heuristic bald eagle search optimisation algorithm. Artif Intell Rev 53(3):2237–2264. https://doi.org/10.1007/s10462-019-09732-5
Arora S, Singh S (2019) Butterfly optimization algorithm: a novel approach for global optimization. Soft Comput 23(3):715–734
Atashpaz-Gargari E, Lucas C (2007) Imperialist competitive algorithm: an algorithm for optimization inspired by imperialistic competition. In 2007 IEEE congress on evolutionary computation (pp 4661–4667). IEEE. https://doi.org/10.1109/CEC.2007.4425083
Azizi M (2021) Atomic orbital search: a novel metaheuristic algorithm. Appl Math Modell 93:657–683. https://doi.org/10.1016/j.apm.2020.12.021
Azizi M, Ejlali RG, Mousavi Ghasemi SA, Talatahari S (2019a) Upgraded whale optimization algorithm for fuzzy logic based vibration control of nonlinear steel structure. Eng Struct 192:53–70. https://doi.org/10.1016/j.engstruct.2019.05.007
Azizi M, Mousavi A, Ejlali R, Talatahari S (2019b) Optimal tuning of fuzzy parameters for structural motion control using multiverse optimizer. Struct Design Tall Spec Build. https://doi.org/10.1002/tal.1652
Azizi M, Ghasemi Seyyed Arash M, Ejlali Reza G, Talatahari S (2020a) Optimization of fuzzy controller for nonlinear buildings with improved charged system search. Struct Eng Mech 76(6):781–797. https://doi.org/10.12989/SEM.2020.76.6.781
Azizi M, Mousavi A, Ejlali R, Talatahari S (2020b) Optimum design of fuzzy controller using hybrid ant lion optimizer and Jaya algorithm. Artif Intell Rev 53:1–32. https://doi.org/10.1007/s10462-019-09713-8
Bayraktar Z, Komurcu M, Werner DH (2010) Wind Driven Optimization (WDO): a novel nature-inspired optimization algorithm and its application to electromagnetics. 2010 IEEE Antennas and Propagation Society International Symposium, 11–17 July 2010
Braik M, Sheta A, Al-Hiary H (2021) A novel meta-heuristic search algorithm for solving optimization problems: capuchin search algorithm. Neural Comput Appl 33(7):2515–2547. https://doi.org/10.1007/s00521-020-05145-6
Brammya G, Praveena S, Ninu Preetha N, Ramya R, Rajakumar B, Binu D (2019) Deer hunting optimization algorithm: a new nature-inspired meta-heuristic paradigm. Comput J. https://doi.org/10.1093/comjnl/bxy133
Brest J, Maučec MS, Bošković B (2020) Differential Evolution algorithm for single objective bound-constrained optimization: algorithm j2020. 2020 IEEE Congress on Evolutionary Computation (CEC), 19–24 July 2020
Camp CV, Bichon BJ, Stovall SP (2005) Design of steel frames using ant colony optimization. J Struct Eng 131(3):369–379. https://doi.org/10.1061/(ASCE)0733-9445(2005)131:3(369)
Coello Coello CA (2000) Use of a self-adaptive penalty approach for engineering optimization problems. Comput Ind 41(2):113–127. https://doi.org/10.1016/S0166-3615(99)00046-9
Daqiqnia AH, Fard Moradinia S, Baghalzadeh Shishehgarkhaneh M (2021) Toward nearly zero energy building designs: a comparative study of various techniques. AUT J Civil Eng 5(2):12–12. https://doi.org/10.22060/ajce.2021.20458.5771
Davison JH, Adams PF (1974) Stability of braced and unbraced frames. J Struct Div 100(2):319–334
de Melo VV, Banzhaf W (2018) Drone squadron optimization: a novel self-adaptive algorithm for global numerical optimization. Neural Comput Appl 30(10):3117–3144
Deb K, Srinivasan A (2006) Innovization: Innovating design principles through optimization. In Proceedings of the 8th annual conference on Genetic and evolutionary computation (pp 1629–1636). https://doi.org/10.1145/1143997.1144266
Degertekin SO (2008) Optimum design of steel frames using harmony search algorithm. Struct Multidiscip Optim 36(4):393–401. https://doi.org/10.1007/s00158-007-0177-4
Dorigo M, Maniezzo V, Colorni A (1996) Ant system: optimization by a colony of cooperating agents. IEEE Trans Syst Man Cybern Part B 26(1):29–41. https://doi.org/10.1109/3477.484436
Eberhart R, Kennedy J (1995). A new optimizer using particle swarm theory MHS’95. Proceedings of the Sixth International Symposium on Micro Machine and Human Science, (pp 39–43). IEEE. https://doi.org/10.1109/MHS.1995.494215
Erol OK, Eksin I (2006) A new optimization method: big bang-big crunch. Adv Eng Softw 37(2):106–111. https://doi.org/10.1016/j.advengsoft.2005.04.005
Eskandar H, Sadollah A, Bahreininejad A, Hamdi M (2012) Water cycle algorithm—a novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput Struct 110–111:151–166. https://doi.org/10.1016/j.compstruc.2012.07.010
Feng Z-K, Niu W-J, Liu S (2021) Cooperation search algorithm: a novel metaheuristic evolutionary intelligence algorithm for numerical optimization and engineering optimization problems. Appl Soft Comput 98:106734. https://doi.org/10.1016/j.asoc.2020.106734
Ferreira M, Lisboa M, Silva Neto A, Sacco W (2018) A constrained ITGO heuristic applied to engineering optimization. Expert Syst Appl. https://doi.org/10.1016/j.eswa.2018.05.027
Gandomi AH (2014) Interior search algorithm (ISA): a novel approach for global optimization. ISA Trans 53(4):1168–1183
Gandomi AH, Yang X-S, Alavi AH (2013a) Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Eng Comput 29(1):17–35. https://doi.org/10.1007/s00366-011-0241-y
Gandomi AH, Yang X-S, Alavi AH, Talatahari S (2013b) Bat algorithm for constrained optimization tasks. Neural Comput Appl 22(6):1239–1255. https://doi.org/10.1007/s00521-012-1028-9
Geem ZW, Kim JH, Loganathan GV (2001) A new heuristic optimization algorithm: harmony search. SIMULATION 76(2):60–68. https://doi.org/10.1177/003754970107600201
Ghasemian H, Ghasemian F, Vahdat-Nejad H (2020) Human urbanization algorithm: a novel metaheuristic approach. Math Comput Simul 178:1–15
Gholizadeh S, Salajegheh E (2009) Optimal design of structures subjected to time history loading by swarm intelligence and an advanced metamodel. Comput Methods Appl Mech Eng 198(37–40):2936–2949
Gholizadeh S, Danesh M, Gheyratmand C (2020) A new Newton metaheuristic algorithm for discrete performance-based design optimization of steel moment frames. Comput Struct 234:106250. https://doi.org/10.1016/j.compstruc.2020.106250
Han J, Yang C, Zhou X, Gui W (2018) A two-stage state transition algorithm for constrained engineering optimization problems. Int J Control Autom Syst 16(2):522–534. https://doi.org/10.1007/s12555-016-0338-6
Hayyolalam V, Pourhaji Kazem AA (2020) Black widow optimization algorithm: a novel meta-heuristic approach for solving engineering optimization problems. Eng Appl Artif Intell 87:103249. https://doi.org/10.1016/j.engappai.2019.103249
Heidari AA, Mirjalili S, Faris H, Aljarah I, Mafarja M, Chen H (2019) Harris hawks optimization: algorithm and applications. Futur Gener Comput Syst 97:849–872
Holland JH (1984) Genetic algorithms and adaptation. In: Selfridge OG, Rissland EL, Arbib MA (eds) Adaptive control of Ill-defined systems. Springer, Boston, pp 317–333
Jamil M, Yang X-S (2013) A literature survey of benchmark functions for global optimisation problems. Int J Math Modell Num Optim 4(2):150–194
Jamil M, Yang X-S, Zepernick H-J (2013) Test functions for global optimization: a comprehensive survey. In: Yang X-S, Cui Z, Xiao R, HosseinGandomi A, Karamanoglu M (eds) Swarm intelligence and bio-inspired computation. Elsevier, Hoboken, pp 193–222
Kallioras NA, Lagaros ND, Avtzis DN (2018) Pity beetle algorithm—a new metaheuristic inspired by the behavior of bark beetles. Adv Eng Softw 121:147–166
Kaveh A, Seddighian MR, Ghanadpour E (2020) Black hole mechanics optimization: a novel meta-heuristic algorithm. Asian J Civil Eng 21(7):1129–1149. https://doi.org/10.1007/s42107-020-00282-8
Kaveh A, Talatahari S (2009) Hybrid algorithm of harmony search, particle swarm and ant colony for structural design optimization. In: Geem ZW (ed) Harmony search algorithms for structural design optimization. Springer, Berlin, pp 159–198
Kaveh A, Talatahari S (2010a) A discrete big bang-big crunch algorithm for optimal design of skeletal structures. Asian J Civil Eng 11(1):103–122
Kaveh A, Talatahari S (2010b) An improved ant colony optimization for the design of planar steel frames. Eng Struct 32:864–873. https://doi.org/10.1016/j.engstruct.2009.12.012
Kaveh A, Talatahari S (2010c) A novel heuristic optimization method: charged system search. Acta Mech 213(3):267–289. https://doi.org/10.1007/s00707-009-0270-4
Kaveh A, Talatahari S (2010d) Optimum design of skeletal structures using imperialist competitive algorithm. Comput Struct 88:1220–1229. https://doi.org/10.1016/j.compstruc.2010.06.011
Kumar A, Wu G, Ali MZ, Mallipeddi R, Suganthan PN, Das S (2020) A test-suite of non-convex constrained optimization problems from the real-world and some baseline results. Swarm Evol Comput 56:100693. https://doi.org/10.1016/j.swevo.2020.100693
Li Z, Tam V (2020) A novel meta-heuristic optimization algorithm inspired by the spread of viruses. arXiv preprint arXiv:2006.06282. https://doi.org/10.48550/arXiv.2006.06282
Liang JJ, Suganthan PN, Deb K (2005) Novel composition test functions for numerical global optimization. In Proceedings 2005 IEEE Swarm Intelligence Symposium, 2005. SIS 2005. (pp 68–75). IEEE. https://doi.org/10.1109/SIS.2005.1501604
Mezura-Montes E, Coello CC, Landa-Becerra R (2003) Engineering optimization using simple evolutionary algorithm. In Proceedings. 15th IEEE International Conference on Tools with Artificial Intelligence 2003 Nov 5 (pp 149–156). IEEE. https://doi.org/10.1109/TAI.2003.1250183
Mirjalili S (2016) SCA: a sine cosine algorithm for solving optimization problems. Knowl Based Syst 96:120–133. https://doi.org/10.1016/j.knosys.2015.12.022
Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95:51–67. https://doi.org/10.1016/j.advengsoft.2016.01.008
Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61. https://doi.org/10.1016/j.advengsoft.2013.12.007
Mirjalili S, Mirjalili SM, Hatamlou A (2016) Multi-verse optimizer: a nature-inspired algorithm for global optimization. Neural Comput Appl 27(2):495–513
Mohamed AW, Hadi AA, Mohamed AK, Awad NH (2020) Evaluating the performance of adaptive gainingsharing knowledge based algorithm on CEC 2020 benchmark problems. 2020 IEEE Congress on Evolutionary Computation (CEC), 19–24 July 2020
Nematollahi AF, Rahiminejad A, Vahidi B (2020) A novel meta-heuristic optimization method based on golden ratio in nature. Soft Comput 24(2):1117–1151
Pham DT, Ghanbarzadeh A, Koç E, Otri S, Rahim S, Zaidi M (2006) The bees algorithm—a novel tool for complex optimisation problems. In: Pham DT, Eldukhri EE, Soroka AJ (eds) Intelligent production machines and systems. Elsevier Science Ltd, Hoboken, pp 454–459
Rao R (2016) Jaya: a simple and new optimization algorithm for solving constrained and unconstrained optimization problems. Int J Ind Eng Comput 7(1):19–34
Rao RV, Savsani VJ, Vakharia DP (2011) Teaching–learning-based optimization: a novel method for constrained mechanical design optimization problems. Comput Aided Des 43(3):303–315. https://doi.org/10.1016/j.cad.2010.12.015
Rashedi E, Nezamabadi-pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inform Sci 179(13):2232–2248. https://doi.org/10.1016/j.ins.2009.03.004
Ray T, Liew KM (2003) Society and civilization: an optimization algorithm based on the simulation of social behavior. IEEE Trans Evol Comput 7(4):386–396. https://doi.org/10.1109/TEVC.2003.814902
Sadollah A, Sayyaadi H, Lee HM, Kim JH (2018) Mine blast harmony search: a new hybrid optimization method for improving exploration and exploitation capabilities. Appl Soft Comput 68:548–564
Sallam KM, Elsayed SM, Chakrabortty RK, Ryan MJ (2020) Improved Multi-operator differential evolution algorithm for solving unconstrained problems. 2020 IEEE Congress on Evolutionary Computation (CEC), 19–24 July 2020
Simon D (2008) Biogeography-based optimization. IEEE Trans Evol Comput 12(6):702–713. https://doi.org/10.1109/TEVC.2008.919004
Sörensen K, Sevaux M, Glover F (2018) A history of metaheuristics. In: Martí R, Panos P, Resende MGC (eds) Handbook of heuristics. Springer International Publishing, Cham, pp 1–18
Storn R, Price K (1997) Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. J Global Optim 11(4):341–359
Talatahari S, Azizi M (2020a) Optimal design of real-size building structures using quantum-behaved developed swarm optimizer. Struct Design Tall Spec Build 29(11):e1747
Talatahari S, Azizi M (2020b) Optimization of constrained mathematical and engineering design problems using chaos game optimization. Comput Ind Eng 145:106560. https://doi.org/10.1016/j.cie.2020.106560
Talatahari S, Azizi M (2020c) Optimum design of building structures using tribe-interior search algorithm. Structures 28:1616–1633. https://doi.org/10.1016/j.istruc.2020.09.075
Talatahari S, Azizi M (2021a) Chaos game optimization: a novel metaheuristic algorithm. Artif Intell Rev 54(2):917–1004. https://doi.org/10.1007/s10462-020-09867-w
Talatahari S, Azizi M (2021b) Tribe-charged system search for global optimization. Appl Math Model 93:115–133. https://doi.org/10.1016/j.apm.2020.12.007
Talatahari S, Gandomi AH, Yang X-S, Deb S (2015) Optimum design of frame structures using the eagle strategy with differential evolution. Eng Struct 91:16–25. https://doi.org/10.1016/j.engstruct.2015.02.026
Talatahari S, Azizi M, Gandomi AH (2021a) Material generation algorithm: a novel metaheuristic algorithm for optimization of engineering problems. Processes 9(5):859. https://doi.org/10.3390/pr9050859
Talatahari S, Azizi M, Toloo M (2021b) Fuzzy adaptive charged system search for global optimization. Appl Soft Comput 109:107518. https://doi.org/10.1016/j.asoc.2021.107518
Talatahari S, Azizi M, Tolouei M, Talatahari B, Sareh P (2021c) Crystal Structure Algorithm (CryStAl): a metaheuristic optimization method. IEEE Access 9:71244–71261. https://doi.org/10.1109/ACCESS.2021.3079161
Talatahari S, Jalili S, Azizi M (2021d) Optimum design of steel building structures using migration-based vibrating particles system. Structures 33:1394–1413. https://doi.org/10.1016/j.istruc.2021.05.028
Talatahari S, Motamedi P, Farahmand Azar B, Azizi M (2021e) Tribe–charged system search for parameter configuration of nonlinear systems with large search domains. Eng Optim 53(1):18–31. https://doi.org/10.1080/0305215X.2019.1696786
Wang C, Koh JM, Yu T, Xie NG, Cheong KH (2020) Material and shape optimization of bi-directional functionally graded plates by GIGA and an improved multi-objective particle swarm optimization algorithm. Comput Methods Appl Mech Eng 366:113017
Xia L, Zhang L, Xia Q, Shi T (2018) Stress-based topology optimization using bi-directional evolutionary structural optimization method. Comput Methods Appl Mech Eng 333:356–370. https://doi.org/10.1016/j.cma.2018.01.035
Yang X-S (2010a) A new metaheuristic bat-inspired algorithm. In: González JR, Pelta DA, Cruz C, Terrazas G, Krasnogor N (eds) Nature Inspired Cooperative Strategies for Optimization (NICSO 2010). Springer, Berlin, pp 65–74
Yang X-S (2010b) Test problems in optimization. arXiv preprint arXiv:1008.0549
Yang X-S (2012) Nature-inspired metaheuristic algorithms: success and new challenges. J Comput Eng Inform Technol. https://doi.org/10.4172/2324-9307.1000e101
Yildiz AR, Abderazek H, Mirjalili S (2020) A comparative study of recent non-traditional methods for mechanical design optimization. Arch Comput Methods Eng 27(4):1031–1048. https://doi.org/10.1007/s11831-019-09343-x
Liang JJ, Qu BY, Suganthan PN, Ali MZ, Awad NH, Biswas PP (2020) Problem definitions and evaluation criteria for the CEC 2020 special session and competition on single objective bound constrained numerical optimization. Technical Report 201911, Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China and Technical Report, Nanyang Technological University, Singapore
Zhang M, Luo W, Wang X (2008) Differential evolution with dynamic stochastic selection for constrained optimization. Inform Sci 178(15):3043–3074. https://doi.org/10.1016/j.ins.2008.02.014
Zhang Y, Jin Z (2020) Group teaching optimization algorithm: a novel metaheuristic method for solving global optimization problems. Expert Syst Appl 148:113246
Zhao W, Du C, Jiang S (2018) An adaptive multiscale approach for identifying multiple flaws based on XFEM and a discrete artificial fish swarm algorithm. Comput Methods Appl Mech Eng 339:341–357
Acknowledgements
This research is supported by a research grant of the University of Tabriz (Grant No. 1615).
Funding
Open Access funding enabled and organized by CAUL and its Member Institutions.
Appendix
See Table 27.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Azizi, M., Talatahari, S. & Gandomi, A.H. Fire Hawk Optimizer: a novel metaheuristic algorithm. Artif Intell Rev 56, 287–363 (2023). https://doi.org/10.1007/s10462-022-10173-w