Abstract
This study proposes a framework that integrates a genetic algorithm with adaptive freezing (GAAF) to simultaneously optimize the feature subset, network architecture, and hyperparameters of a backpropagation neural network (BPNN) model; the framework is denoted as GAAF–BPNN. In contrast to conventional sequential optimization pipelines, the proposed framework optimizes the aforementioned elements simultaneously by using a multichromosome encoding scheme. Moreover, it incorporates a novel architecture-triggered freezing mechanism that dynamically excludes genes corresponding to inactive layers from genetic operations, enabling the construction of variable-depth network architectures while maintaining chromosome consistency. The GAAF–BPNN framework was evaluated in a vibration-signal-based tool wear prediction task, in which it achieved a substantially lower prediction error than did seven baseline models, including standard BPNN models and widely used tree-based ensemble models. The results of this study demonstrate that the proposed framework can effectively improve modeling accuracy while eliminating the need for manual feature selection and architecture tuning.
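The architecture-triggered freezing mechanism described above can be illustrated with a minimal sketch. The following Python fragment is an assumption-laden toy, not the authors' implementation: it assumes a chromosome that stores an active-layer-count gene followed by one neuron-count gene per potential layer, and it shows how genes belonging to inactive layers can be masked out of mutation so the chromosome keeps a fixed length while the decoded network varies in depth.

```python
import random

MAX_LAYERS = 4  # hypothetical cap on hidden layers (illustrative)

def make_chromosome():
    """Toy chromosome: [n_active_layers, neurons_1, ..., neurons_MAX_LAYERS]."""
    return [random.randint(1, MAX_LAYERS)] + [
        random.randint(4, 64) for _ in range(MAX_LAYERS)
    ]

def freeze_mask(chrom):
    """Architecture-triggered freezing: genes of layers beyond the active
    depth are frozen (False), i.e. excluded from genetic operations."""
    n_active = chrom[0]
    return [True] + [i < n_active for i in range(MAX_LAYERS)]

def mutate(chrom, rate=0.3):
    """Mutate only unfrozen layer genes; frozen genes pass through intact,
    keeping chromosome length and layout consistent across the population."""
    mask = freeze_mask(chrom)
    out = chrom[:]
    for i in range(1, len(out)):
        if mask[i] and random.random() < rate:
            out[i] = random.randint(4, 64)
    return out
```

Under this sketch, a chromosome encoding two active layers out of four keeps its third and fourth layer genes untouched by mutation until a later change to the depth gene reactivates them.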
