Metaheuristic algorithm selection system for black-box continuous optimization problems based on collaborative filtering

ZHANG Yong-wei1 WANG Lei2

(1.College of Electronics and Information, Jiangsu University of Science and Technology, Zhenjiang, Jiangsu Province, China 212003)
(2.College of Electronics and Information Engineering, Tongji University, Shanghai, China 200092)

【Abstract】Selecting the best algorithm from an algorithm set for a given problem is referred to as the algorithm selection (AS) problem. The importance of the AS problem grows as ever more optimization algorithms emerge. Therefore, a five-star rating system based on clustering is proposed, which maps algorithm performance criteria to integers and reduces the rating space. An algorithm set is prepared, comprising 24 commonly used optimization algorithms and four winners of the CEC 2016 and 2017 competitions. A rating matrix is obtained by testing the performance of the selected algorithms on 219 benchmark problems. This rating matrix serves as the data source of a collaborative filtering algorithm, which yields a prediction model of algorithm ratings. For a new problem instance, the model predicts the ratings of all the algorithms in the algorithm set. The results show that the prediction accuracy is high, and over 90% of the predicted best algorithms are capable of solving the problem instance. Sensitivity analysis shows that the proposed method maintains high prediction accuracy even with limited a priori information.
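The pipeline sketched in the abstract (rate each algorithm on known problems, then predict ratings for a new instance from its similarity to previously tested problems) can be illustrated with a minimal neighborhood-based collaborative filter. The rating matrix, neighborhood size, and similarity measure below are illustrative assumptions, not the paper's actual data or model (the paper uses a listwise, ranking-oriented collaborative filtering approach [19]):

```python
import numpy as np

# Rows: problem instances; columns: algorithms. Entries are 1-5 star
# ratings, 0 = not yet rated. Values are made up for illustration.
R = np.array([
    [5, 4, 1, 2],
    [4, 5, 2, 1],
    [1, 2, 5, 4],
    [2, 1, 4, 5],
], dtype=float)

def cosine(u, v):
    """Cosine similarity computed over positions both vectors have rated."""
    common = (u > 0) & (v > 0)
    if not common.any():
        return 0.0
    u, v = u[common], v[common]
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def predict(new_ratings, R, k=2):
    """Fill the unrated entries of a new problem's rating vector with a
    similarity-weighted average over its k most similar known problems."""
    sims = np.array([cosine(new_ratings, row) for row in R])
    top = np.argsort(sims)[::-1][:k]          # k nearest known problems
    pred = new_ratings.copy()
    for j in np.where(new_ratings == 0)[0]:   # each unrated algorithm
        w, r = sims[top], R[top, j]
        valid = r > 0
        if valid.any() and w[valid].sum() > 0:
            pred[j] = (w[valid] @ r[valid]) / w[valid].sum()
    return pred

# New problem: only the first two algorithms have been tested so far.
new = np.array([5.0, 4.0, 0.0, 0.0])
pred = predict(new, R)
best = int(np.argmax(pred))   # recommended algorithm for the new problem
```

With the observed ratings favoring the first two algorithms, the filter recommends one of them; in the full system the predicted best algorithm would then actually be run on the new instance.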

【Keywords】 algorithm selection; continuous optimization; black-box optimization; collaborative filtering; metaheuristics; recommendation system

【DOI】

【Funds】 National Natural Science Foundation of China (71371142, 71771176); Jiangsu Government Scholarship of Overseas Studies (JS-2015-200); Soft Science Foundation of Zhenjiang, Jiangsu Province, China


    References

    [1] Wolpert D H, Macready W G. No free lunch theorems for optimization [J]. IEEE Trans on Evolutionary Computation, 1997, 1 (1): 67–82.

    [2] Zeng Z L, Zhang H J, Zhang R, et al. Summary of algorithm selection problem based on meta-learning [J]. Control and Decision, 2014, 29 (6): 961–968 (in Chinese).

    [3] Rice J R. The algorithm selection problem [J]. Advances in Computers, 1976, 15 (C): 65–118.

    [4] Mersmann O, Bischl B, Trautmann H, et al. Exploratory landscape analysis [C]. The 13th Annual Conference on Genetic and Evolutionary Computation. Dublin: ACM, 2011: 829–836.

    [5] Muñoz M A, Kirley M, Halgamuge S K. Exploratory landscape analysis of continuous space optimization problems using information content [J]. IEEE Transactions on Evolutionary Computation, 2015, 19 (1): 74–87.

    [6] Kanda J, Carvalho A de, Hruschka E, et al. Meta-learning to select the best metaheuristic for the traveling salesman problem: A comparison of meta-features [J]. Neurocomputing, 2016, 205: 393–406.

    [7] Parmezan A R S, Lee H D, Wu F C. Meta learning for choosing feature selection algorithms in data mining: proposal of a new framework [J]. Expert Systems with Applications, 2017, 75: 1–24.

    [8] Ferrari D G, De Castro L N. Clustering algorithm selection by meta-learning systems: A new distance-based problem characterization and ranking combination methods [J]. Information Sciences, 2015, 301: 181–194.

    [9] Brazdil P B, Soares C. A comparison of ranking methods for classification algorithm selection [C]. European Conference on Machine Learning. Barcelona: Springer, 2000: 63–75.

    [10] Kerschke P, Trautmann H. Automated algorithm selection on continuous black-box problems by combining exploratory landscape analysis and machine learning [J]. Evolutionary Computation, 2019, 27 (1): 99–127.

    [11] Muñoz M A, Kirley M, Halgamuge S K. A meta-learning prediction model of algorithm performance for continuous optimization problems [C]. International Conference on Parallel Problem Solving from Nature. Taormina: Springer, 2012: 226–235.

    [12] Bischl B, Mersmann O, Trautmann H, et al. Algorithm selection based on exploratory landscape analysis and cost-sensitive learning [C]. The 14th International Conference on Genetic and Evolutionary Computation. Philadelphia: ACM, 2012: 313–320.

    [13] Ekstrand M D, Riedl J T, Konstan J A, et al. Collaborative filtering recommender systems [J]. Foundations and Trends in Human-Computer Interaction, 2011, 4 (2): 81–173.

    [14] Misir M, Sebag M. Algorithm selection as a collaborative filtering problem [R]. https://hal.inria.fr/hal-00922840, 2013.

    [15] Stern D, Herbrich R, Graepel T. Matchbox: large-scale online Bayesian recommendations [C]. The 18th International Conference on World Wide Web. Madrid: ACM, 2009: 111–120.

    [16] Stern D, Samulowitz H, Pulina L, et al. Collaborative expert portfolio management [J]. Artificial Intelligence, 2010, 116 (3): 179–184.

    [17] Misir M, Sebag M. Alors: An algorithm recommender system [J]. Artificial Intelligence, 2017, 244: 291–314.

    [18] Misir M. Data sampling through collaborative filtering for algorithm selection [C]. IEEE Congress on Evolutionary Computation. Donostia: IEEE, 2017: 2494–2501.

    [19] Wang S, Liu T Y, et al. Ranking-oriented collaborative filtering: A listwise approach [J]. ACM Transactions on Information Systems, 2016, 35 (2): 343–352.

    [20] Kuhn M, Johnson K. Applied predictive modeling [M]. New York: Springer, 2013: 69–71.

    [21] Molinaro A. Prediction error estimation: A comparison of resampling methods [J]. Bioinformatics, 2005, 21 (15):3301–3307.

    [22] Adorio E, Diliman U. MVF multivariate test functions library in C for unconstrained global optimization [R]. Diliman: University of the Philippines, 2005: 1–56.

    [23] Molga M, Smutnicki C. Test functions for optimization needs [EB/OL]. (2005-04-03)[2019-02-13]. https://www.robertmarks.org/Classes/ENGR5358/Papers/functions.pdf.

    [24] Jamil M, Yang X S. A literature survey of benchmark functions for global optimization problems [J]. International Journal of Mathematical Modelling and Numerical Optimisation, 2013, 4 (2): 150–194.

    [25] Nocedal J, Wright S J. Numerical optimization [M]. New York: Springer, 2006: 46–54.

    [26] Simon D. Evolutionary optimization algorithms [M]. Hoboken: John Wiley & Sons, Inc, 2013: 35–62.

    [27] Hansen N, Ostermeier A. Adapting arbitrary normal mutation distributions in evolution strategies: The covariance matrix adaptation [C]. IEEE International Conference on Evolutionary Computation. Nagoya: IEEE, 1996: 312–317.

    [28] Shi Y, Eberhart R. A modified particle swarm optimizer [C]. IEEE International Conference on Evolutionary Computation. Alaska: IEEE, 1998: 69–73.

    [29] Mehrabian A R, Lucas C. A novel numerical optimization algorithm inspired from weed colonization [J]. Ecological Informatics, 2006, 1 (4): 355–366.

    [30] Atashpaz-Gargari E, Lucas C. Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition [C]. IEEE Congress on Evolutionary Computation. Singapore: IEEE, 2007: 4661–4667.

    [31] Socha K, Dorigo M. Ant colony optimization for continuous domains [J]. European Journal of Operational Research, 2008, 185 (3): 1155–1173.

    [32] Yang X S, Deb S. Cuckoo search via Lévy flights [C]. World Congress on Nature & Biologically Inspired Computing. Coimbatore: IEEE, 2009: 210–214.

    [33] Elsayed S, Hamza N, Sarker R. Testing united multi-operator evolutionary algorithms-II on single objective optimization problems [C]. IEEE Congress on Evolutionary Computation. Vancouver: IEEE, 2016: 2966–2973.

    [34] Awad N H, Ali M Z, Suganthan P N. Ensemble sinusoidal differential covariance matrix adaptation with Euclidean neighborhood for solving CEC2017 benchmark problems [C]. IEEE Congress on Evolutionary Computation. Donostia: IEEE, 2017: 372–379.

    [35] Brest J, Maučec M S, Bošković B. Single objective real-parameter optimization: Algorithm jSO [C]. IEEE Congress on Evolutionary Computation. Donostia: IEEE, 2017: 1311–1318.

    [36] Kumar A, Misra R K, Singh D. Improving the local search capability of effective butterfly optimizer using covariance matrix adapted retreat phase [C]. IEEE Congress on Evolutionary Computation. Donostia: IEEE, 2017: 1835–1842.

This Article

ISSN:1001-0920

CN: 21-1124/TP

Vol 35, No. 06, Pages 1297-1306

June 2020


Article Outline


Abstract

  • 0 Introduction
  • 1 Collaborative filtering based on algorithm rating
  • 2 Performance indexes and parameter optimization of collaborative filtering
  • 3 Performance test of optimization algorithms
  • 4 Results analysis of numerical experiments
  • 5 Conclusions
  • References