

Binary Salp Swarm Optimization Algorithm for Feature Selection with Simulated Annealing

Abstract
Wrapper-based feature selection is a popular dimensionality-reduction approach that searches a large dataset for an optimal feature subset, one that meets the selection objective while still representing the structure of the entire dataset. Swarm-intelligence-based feature selection wrappers are known for their solution exploration and exploitation capability; however, when there are multiple objectives to satisfy, wrapper algorithms suffer from the Pareto optimality problem. The objective of this paper is to overcome the Pareto optimality problem and select relevant features with a hybrid Salp Swarm Optimization wrapper. The issue is approached with a fuzzy-weighted single objective function that embeds the classification rate and the feature selection ratio as selection factors. To choose the next position and improve solution exploration, single-solution Simulated Annealing is combined with the wrapper. Experimental results on 7 UCI benchmark datasets show that the proposed method outperforms the existing hybrid binary Whale Swarm wrapper in terms of the time taken for feature selection, and it also delivers a respectable performance in terms of error rate, feature reduction ratio, and fitness measure.
Keywords: Feature Selection, Machine Learning, Meta-heuristics, Swarm Intelligence, Simulated Annealing
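
To make the two ingredients named in the abstract concrete, the Python sketch below illustrates a weighted single-objective wrapper fitness (combining classification error rate and feature-selection ratio) and a simulated-annealing acceptance test for a candidate next position. It is a minimal illustration, not the paper's implementation: the weight alpha, the 5-nearest-neighbour evaluator, the cross-validation setting, and the temperature handling are all assumptions made for this example.

import math
import random

from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def fitness(mask, X, y, alpha=0.99):
    """Weighted sum of error rate and feature-selection ratio (assumed weighting).

    mask is a binary feature-selection vector; X is a NumPy feature matrix, y the labels.
    """
    selected = [i for i, bit in enumerate(mask) if bit == 1]
    if not selected:
        return 1.0  # penalise the empty subset outright
    error = 1.0 - cross_val_score(
        KNeighborsClassifier(n_neighbors=5), X[:, selected], y, cv=5
    ).mean()
    ratio = len(selected) / len(mask)  # fraction of features retained
    return alpha * error + (1.0 - alpha) * ratio


def sa_accept(candidate_fit, current_fit, temperature):
    """Metropolis rule: always accept improvements; accept a worse subset
    with probability exp(-delta / temperature)."""
    if candidate_fit <= current_fit:
        return True
    return random.random() < math.exp((current_fit - candidate_fit) / temperature)

Collapsing both objectives into one scalar lets the wrapper compare candidate subsets directly instead of maintaining a Pareto front, while the acceptance rule occasionally allows moves to worse subsets early in the run, which is the exploration benefit the abstract attributes to Simulated Annealing.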


Author Information
Vaishali R.
Issue No
9
Volume No
4
Issue Publish Date
05 Sep 2022
Issue Pages
17-28

References
1. Blum, A. L., & Langley, P. (1997). Selection of relevant features and examples in machine learning. Artificial Intelligence, 97(1–2), 245–271. https://doi.org/10.1016/S0004-3702(97)00063-5
2. Blum, C., & Li, X. (2008). Swarm intelligence in optimization. In Swarm Intelligence (pp. 43–85). Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74089-6_2
3. Cunningham, J. P., & Ghahramani, Z. (2014). Linear dimensionality reduction: Survey, insights, and generalizations. Journal of Machine Learning Research, 16, 2859–2900. http://arxiv.org/abs/1406.0873
4. Dorigo, M., & Birattari, M. (2011). Ant colony optimization. In Encyclopedia of Machine Learning (pp. 36–39). Springer.
5. Emary, E., Zawbaa, H. M., & Hassanien, A. E. (2016a). Binary ant lion approaches for feature selection. Neurocomputing, 213, 54–65. https://doi.org/10.1016/j.neucom.2016.03.101
6. Emary, E., Zawbaa, H. M., & Hassanien, A. E. (2016b). Binary grey wolf optimization approaches for feature selection. Neurocomputing, 172, 371–381. https://doi.org/10.1016/j.neucom.2015.06.083
7. Faris, H., Mafarja, M. M., Heidari, A. A., Aljarah, I., Ala'M, A. Z., Mirjalili, S., & Fujita, H. (2018). An efficient binary salp swarm algorithm with crossover scheme for feature selection problems. Knowledge-Based Systems, 154, 43–67. https://doi.org/10.1016/j.knosys.2018.05.009
8. Rodrigues, D., Pereira, L. A. M., Nakamura, R. Y. M., Costa, K. A. P., & Yang, X.-S. (2014). A wrapper approach for feature selection based on Bat Algorithm and Optimum-Path Forest. Expert Systems with Applications, 41(5), 2250–2258. https://doi.org/10.1016/j.eswa.2013.09.023
9. Ibrahim, H. T., Mazher, W. J., Ucan, O. N., & Bayat, O. (2018). A grasshopper optimizer approach for feature selection and optimizing SVM parameters utilizing real biomedical data sets. Neural Computing and Applications, 1–10.
10. Jain, A., & Zongker, D. (1997). Feature selection: Evaluation, application, and small sample performance. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(2), 153–158. https://doi.org/10.1109/34.574797
11. Jiménez, F., Sánchez, G., García, J. M., Sciavicco, G., & Miralles, L. (2017). Multi-objective evolutionary feature selection for online sales forecasting. Neurocomputing, 234, 75–92. https://doi.org/10.1016/j.neucom.2016.12.045
12. Jiménez, F., Sánchez, G., & Juárez, J. M. (2014). Multi-objective evolutionary algorithms for fuzzy classification in survival prediction. Artificial Intelligence in Medicine, 60(3), 197–219. https://doi.org/10.1016/j.artmed.2013.12.006
13. Kanan, H. R., & Faez, K. (2008). An improved feature selection method based on ant colony optimization (ACO) evaluated on face recognition system. Applied Mathematics and Computation, 205(2), 716–725. https://doi.org/10.1016/j.amc.2008.05.115
14. Khamees, M., Albakry, A., & Shaker, K. (2018). Multi-objective feature selection: Hybrid of salp swarm and simulated annealing approach. In International Conference on New Trends in Information and Communications Technology Applications (pp. 129–142). Springer, Cham. https://doi.org/10.1007/978-3-030-01653-1_8
15. Liu, Y., Wang, G., Chen, H., Dong, H., Zhu, X., & Wang, S. (2011). An improved particle swarm optimization for feature selection. Journal of Bionic Engineering, 8(2), 191–200. https://doi.org/10.1016/S1672-6529(11)60020-6
16. Mafarja, M. M., Eleyan, D., Jaber, I., Hammouri, A., & Mirjalili, S. (2017). Binary dragonfly algorithm for feature selection. In 2017 International Conference on New Trends in Computing Sciences (ICTCS) (pp. 12–17).
17. Mafarja, M., & Mirjalili, S. (2018). Whale optimization approaches for wrapper feature selection. Applied Soft Computing, 62, 441–453. https://doi.org/10.1016/j.asoc.2017.11.006
18. Mirjalili, S. (2015). Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowledge-Based Systems, 89, 228–249.
19. Mirjalili, S. (2016). Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems. Neural Computing and Applications, 27(4), 1053–1073.
20. Mirjalili, S., Gandomi, A. H., Mirjalili, S. Z., Saremi, S., Faris, H., & Mirjalili, S. M. (2017). Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Advances in Engineering Software, 114, 163–191. https://doi.org/10.1016/j.advengsoft.2017.07.002
21. Mirjalili, S., & Lewis, A. (2016). The whale optimization algorithm. Advances in Engineering Software, 95, 51–67.
22. Mirjalili, S., Mirjalili, S. M., & Lewis, A. (2014). Grey Wolf Optimizer. Advances in Engineering Software, 69, 46–61. https://doi.org/10.1016/j.advengsoft.2013.12.007
23. Nakamura, R. Y. M., Pereira, L. A. M., Costa, K. A., Rodrigues, D., Papa, J. P., & Yang, X.-S. (2012). BBA: A binary bat algorithm for feature selection. In Brazilian Symposium of Computer Graphics and Image Processing (pp. 291–297). https://doi.org/10.1109/SIBGRAPI.2012.47
24. Raymer, M. L., Punch, W. F., Goodman, E. D., Kuhn, L. A., & Jain, A. K. (2000). Dimensionality reduction using genetic algorithms. IEEE Transactions on Evolutionary Computation, 4(2), 164–171.
25. Ahmad, S., Mafarja, M., Faris, H., & Aljarah, I. (2018). Feature selection using salp swarm algorithm with chaos. http://hdl.handle.net/20.500.11889/5516
26. Unler, A., & Murat, A. (2010). A discrete particle swarm optimization method for feature selection in binary classification problems. European Journal of Operational Research, 206(3), 528–539. https://doi.org/10.1016/j.ejor.2010.02.032
27. Vaishali, R., Sasikala, R., Ramasubbareddy, S., Remya, S., & Nalluri, S. (2017). Genetic algorithm based feature selection and MOE fuzzy classification algorithm on Pima Indians Diabetes dataset. In ICCNI 2017 (pp. 1–5). IEEE. https://doi.org/10.1109/ICCNI.2017.8123815
28. Wang, X., Yang, J., Teng, X., Xia, W., & Jensen, R. (2007). Feature selection based on rough sets and particle swarm optimization. Pattern Recognition Letters, 28(4), 459–471. https://doi.org/10.1016/j.patrec.2006.09.003
29. Yang, X.-S. (2010). Firefly algorithm, stochastic test functions and design optimization. International Journal of Bio-Inspired Computation, 2(2), 78–84. https://doi.org/10.1504/IJBIC.2010.032124
30. Yun, C., Oh, B., Yang, J., & Nang, J. (2011). Feature subset selection based on bio-inspired algorithms. Journal of Information Science and Engineering, 27(5), 1667–1686.
31. Zawbaa, H. M., Emary, E., & Grosan, C. (2016). Feature selection via chaotic antlion optimization. PLoS ONE, 11(3), e0150652.
32. Zawbaa, H. M., Emary, E., Parv, B., & Sharawi, M. (2016). Feature selection approach based on moth-flame optimization algorithm. In 2016 IEEE Congress on Evolutionary Computation (CEC) (pp. 4612–4617).