A Novel Sarpa Salpa-Inspired Optimization Algorithm: Performance Evaluation and Comparison with Particle Swarm Optimization on Benchmark Functions
DOI: https://doi.org/10.58916/jhas.v10i3.852

Keywords: Sarpa Salpa-inspired optimization, metaheuristic algorithms, Particle Swarm Optimization, benchmark functions, multimodal optimization, global search, high-dimensional optimization, statistical evaluation

Abstract
This paper introduces a novel optimization algorithm inspired by the behavior of the Sarpa Salpa fish, referred to as SSOA. The algorithm mimics the natural exploration and exploitation strategies of Sarpa Salpa, incorporating adaptive mechanisms that improve search efficiency in complex multimodal landscapes. Performance is evaluated on four standard benchmark functions (Rastrigin, Griewank, Sphere, and Ackley) across multiple dimensionalities (2D, 5D, 10D, 20D, and 50D). Statistical analyses over repeated trials show that SSOA outperforms the classical Particle Swarm Optimization (PSO) algorithm in accuracy, robustness, and success rate, especially on higher-dimensional problems. In addition, a sensitivity analysis of the key parameters (alpha, beta, gamma, and inertia weight) demonstrates the algorithm's resilience to parameter variations, while showing that extreme parameter values can degrade performance. Despite a moderate increase in computational cost, the algorithm demonstrates strong potential for solving challenging global optimization problems.
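For readers reproducing the evaluation, the four benchmark functions can be sketched as follows. This is a minimal reference implementation using the conventional formulas from the Surjanovic & Bingham test-function library cited below; the Ackley constants (a = 20, b = 0.2, c = 2π) are the usual defaults, not values stated in this paper.

```python
import numpy as np

def sphere(x):
    # Unimodal bowl: f(x) = sum(x_i^2), global minimum 0 at the origin.
    x = np.asarray(x, dtype=float)
    return float(np.sum(x**2))

def rastrigin(x):
    # Highly multimodal: f(x) = 10n + sum(x_i^2 - 10 cos(2*pi*x_i)).
    x = np.asarray(x, dtype=float)
    return float(10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

def griewank(x):
    # Multimodal with product term: sum(x_i^2)/4000 - prod(cos(x_i/sqrt(i))) + 1.
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return float(np.sum(x**2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1)

def ackley(x, a=20.0, b=0.2, c=2 * np.pi):
    # Nearly flat outer region with a deep central basin at the origin.
    x = np.asarray(x, dtype=float)
    n = x.size
    return float(-a * np.exp(-b * np.sqrt(np.sum(x**2) / n))
                 - np.exp(np.sum(np.cos(c * x)) / n) + a + np.e)

# All four functions have a global minimum of 0 at the origin,
# which is the target the reported success rates are measured against.
origin = np.zeros(10)
print(sphere(origin), rastrigin(origin), griewank(origin), ackley(origin))
```

Evaluating any candidate solution against these definitions lets the reported accuracy and success-rate statistics be checked directly against the known optimum of 0.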
References
Ackley, D. H. (1987). A connectionist machine for genetic hillclimbing. Boston: Kluwer Academic Publishers.
Alvarez, J., Chen, Y., & Wang, L. (2024). Dynamic cooperative search strategies in fish schools: Implications for bio-inspired optimization models. Journal of Bio-inspired Optimization, 12(3), 45–67.
De Jong, K. A. (1975). An analysis of the behavior of a class of genetic adaptive systems (Doctoral dissertation). University of Michigan. ProQuest Dissertations and Theses.
Deb, K. (2000). An efficient constraint handling method for genetic algorithms. Computer Methods in Applied Mechanics and Engineering, 186(2–4), 311–338. https://doi.org/10.1016/S0045-7825(99)00389-8
Dorigo, M., & Stützle, T. (2004). Ant colony optimization. MIT Press.
Eiben, A. E., & Smith, J. E. (2015). Introduction to evolutionary computing (2nd ed.). Springer. https://doi.org/10.1007/978-3-662-44874-8
Fister, I., Yang, X. S., Fister Jr., I., Brest, J., & Fister, D. (2023). A comprehensive review of nature-inspired metaheuristics. Swarm and Evolutionary Computation, 78, 101–120. https://doi.org/10.1016/j.swevo.2023.101120
Griewank, A. O. (1981). Generalized descent for global optimization. Journal of Optimization Theory and Applications, 34(1), 11–39. https://doi.org/10.1007/BF00933304
Kennedy, J., & Eberhart, R. (1995). Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks (Vol. 4, pp. 1942–1948). https://doi.org/10.1109/ICNN.1995.488968
Mirjalili, S., Mirjalili, S. M., & Lewis, A. (2014). Grey wolf optimizer. Advances in Engineering Software, 69, 46–61. https://doi.org/10.1016/j.advengsoft.2013.12.007
Mubarak, A. M. (2025). Bridging data science and big data analytics: Mathematical foundations for innovation and scalable efficiency. Bani Waleed University Journal of Humanities and Applied Sciences, 10(2), 27–38.
Rastrigin, L. A. (1974). Systems of extremal control. Moscow: Nauka.
Shi, Y., & Eberhart, R. (1998). A modified particle swarm optimizer. In Proceedings of the IEEE International Conference on Evolutionary Computation (pp. 69–73). https://doi.org/10.1109/ICEC.1998.699146
Surjanovic, S., & Bingham, D. (2013). Virtual library of simulation experiments: Test functions and datasets. Simon Fraser University. http://www.sfu.ca/~ssurjano
Talbi, E. G. (2009). Metaheuristics: From design to implementation. Wiley. https://doi.org/10.1002/9780470496916
Yang, X. S. (2014). Nature-inspired optimization algorithms. Elsevier. https://doi.org/10.1016/C2013-0-01368-0