See-See Partridge Chicks Optimization (SSPCO) is a swarm intelligence algorithm inspired by the behavior of the See-See Partridge. Although its efficiency has been demonstrated on static optimization problems, it has not yet been tested on dynamic optimization problems. Because of the NP-hard nature of dynamic problems, the basic algorithm alone cannot track a moving optimum; it must be combined with mechanisms that improve its performance in dynamic environments. In this paper, two such mechanisms are combined with SSPCO: (1) a multi-swarm scheme and (2) a memory based on Gaussian density estimation. The weakness of most multi-swarm methods is that, as the number of swarms grows uncontrollably, the speed and efficiency of the algorithm gradually degrade. The multi-swarm scheme presented here is adapted to the problem space: a new swarm is created only when one is actually needed, which alleviates the drawbacks of earlier methods. A common strategy for handling uncertainty is to predict the near future from data of the near past; to preserve such past data, a new memory called the Gaussian density estimation memory is introduced. This memory removes the defects of standard memory schemes and improves the performance of the proposed algorithm. The efficiency of the proposed method is evaluated on the well-known Moving Peaks Benchmark, which simulates the behavior of dynamic problems, and the algorithm is compared with ten popular dynamic optimization algorithms. According to the experimental results, the proposed method substantially reduces offline error compared with the other methods, and the error it produces is very small.
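The abstract names two ingredients, an adaptively growing multi-swarm and a memory built on Gaussian density estimation, without spelling out their mechanics. The Python sketch below is only an illustration of the memory idea under stated assumptions: the GaussianMemory class, its capacity, the uniform fallback, the toy single-peak landscape, and the simplified per-change offline-error measure are hypothetical choices made here, not the paper's implementation. It shows how elite solutions gathered before each environment change can be summarized by a Gaussian density and then sampled to reseed the search after a change, instead of restarting from scratch.

```python
import numpy as np


class GaussianMemory:
    """Toy memory that summarizes stored elite solutions with a Gaussian
    density (mean + covariance) and samples reseeding points from it.
    The class name, capacity, and sampling policy are assumptions made
    for illustration, not the paper's exact scheme."""

    def __init__(self, dim, capacity=20):
        self.dim = dim
        self.capacity = capacity
        self.entries = []  # list of (position, fitness) pairs

    def store(self, position, fitness):
        self.entries.append((np.asarray(position, dtype=float), float(fitness)))
        # keep only the best `capacity` solutions seen so far (maximization)
        self.entries.sort(key=lambda e: e[1], reverse=True)
        self.entries = self.entries[: self.capacity]

    def sample(self, n_samples, rng):
        """Fit a Gaussian to the stored elites and draw new candidates."""
        if len(self.entries) < 2:  # not enough data: fall back to uniform
            return rng.uniform(-50.0, 50.0, size=(n_samples, self.dim))
        points = np.stack([p for p, _ in self.entries])
        mean = points.mean(axis=0)
        cov = np.cov(points, rowvar=False) + 1e-6 * np.eye(self.dim)
        return rng.multivariate_normal(mean, cov, size=n_samples)


def offline_error(best_per_period, optimum_per_period):
    """Simplified, per-change variant of offline error: the average gap
    between the optimum value and the best fitness found in each period."""
    return float(np.mean(np.asarray(optimum_per_period) - np.asarray(best_per_period)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 2
    memory = GaussianMemory(dim)

    # Toy stand-in for a moving peak: one peak whose center drifts at
    # every environment change; its optimum value is always 0.
    center = rng.uniform(-50.0, 50.0, size=dim)
    fitness = lambda x: -np.linalg.norm(x - center)

    best_per_period = []
    for change in range(5):
        population = memory.sample(30, rng)  # reseed from memory (or uniform)
        values = np.array([fitness(x) for x in population])
        best_idx = int(values.argmax())
        memory.store(population[best_idx], values[best_idx])
        best_per_period.append(values[best_idx])

        center += rng.normal(0.0, 1.0, size=dim)  # environment change

    print("per-period offline error:", offline_error(best_per_period, [0.0] * 5))
```

In the real algorithm the reseeded points would feed a multi-swarm SSPCO rather than the single random population used in this toy loop.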
Mohammadpour, M., Minaei, B., & Parvin, H. (2021). Introducing a new meta-heuristic algorithm based on See-See Partridge Chicks Optimization to solve dynamic optimization problems. Soft Computing Journal, 8(2), 38-65. doi: 10.22052/8.2.38