Feature selection is a challenging multi-objective optimization problem in machine learning: it seeks the smallest subset of features that still yields high classification accuracy. In this study, we design a cost function that jointly optimizes classification accuracy and the number of selected features through linear weighting. We then introduce an enhanced meta-heuristic, the improved bald eagle search (IBES) algorithm, to optimize this cost function effectively. IBES incorporates opposition-based learning, Lévy flight, and a nonlinear control parameter strategy into the stages of the conventional BES algorithm, yielding a wrapper-based feature selection method. Additionally, an efficient S-shaped transfer function is employed to convert the solution space from continuous to binary. To assess effectiveness, we evaluate the proposed method on 24 well-known data sets from the UCI repository and compare its performance with established algorithms from the literature in terms of cost value, classification accuracy, computational time, and number of selected features. The results confirm the superiority of the proposed method on the majority of the tested data sets.
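The linearly weighted cost function described above can be sketched as follows. The weight `alpha` and the wrapper-style `accuracy_fn` hook are illustrative assumptions, not values taken from the abstract.

```python
import numpy as np

def cost(solution, X, y, accuracy_fn, alpha=0.99):
    """Linearly weighted cost: trades classification error against subset size.

    solution    -- binary mask over the feature columns (1 = feature selected)
    accuracy_fn -- wrapper classifier's accuracy on the selected features
    alpha       -- assumed weight favouring accuracy (not from the abstract)
    """
    selected = np.flatnonzero(solution)
    if selected.size == 0:              # an empty subset cannot be evaluated
        return np.inf
    error = 1.0 - accuracy_fn(X[:, selected], y)
    ratio = selected.size / X.shape[1]  # fraction of features kept
    return alpha * error + (1.0 - alpha) * ratio
```

Minimizing this cost therefore rewards both a low classification error and a small feature subset, with `alpha` controlling the trade-off.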
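Binarizing a continuous search position with an S-shaped (sigmoid) transfer function, the standard device for this step in binary meta-heuristics, can be sketched as below; the abstract does not specify which S-shaped variant is used, so the classic logistic form is an assumption here.

```python
import numpy as np

def s_shaped(x):
    """Logistic (S1) transfer function: maps a real value to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-np.asarray(x, dtype=float)))

def binarize(position, rng):
    """Set each dimension to 1 with probability given by its transfer value."""
    probs = s_shaped(position)
    return (rng.random(probs.shape) < probs).astype(int)
```

Each continuous coordinate thus becomes the probability of selecting the corresponding feature, so the search can run in continuous space while solutions are evaluated as binary feature masks.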