Abstract
To address the susceptibility of sensor signals to strong noise interference, this study proposes a high-precision fault diagnosis method based on multi-sensor information fusion. The proposed approach integrates depthwise separable convolution (DSConv) with a multi-pooling attention (MPA) mechanism and an improved Vision Transformer (IVIT). First, the continuous wavelet transform is employed to convert raw time-series signals collected from multiple sensors into two-dimensional feature representations, which are then fed into the model using a multi-sensor data-layer feature fusion strategy. Subsequently, channel shuffle and DSConv are utilized to effectively extract local features, while the MPA mechanism enhances the model's capability to perceive critical features in noisy environments. Finally, the IVIT further strengthens feature extraction and representation, producing the fault diagnosis results. Experimental evaluations on three datasets demonstrate that even at a signal-to-noise ratio of −2 dB, MPAIT-Net outperforms competing models, achieving average accuracies of 96.83%, 95.00%, and 98.92%, respectively. The model's performance is further evaluated on datasets containing complex faults and faults of different severity levels: it remains stable under varying noise levels, demonstrating strong robustness and generalization ability. The corresponding source code is publicly available at https://github.com/xieph001/MPAIT-Net.
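To illustrate the channel-shuffle step mentioned in the pipeline, the following is a minimal NumPy sketch of the standard reshape-transpose-reshape operation (as popularized by ShuffleNet), which mixes information across channel groups before grouped/depthwise convolutions. This is an illustrative assumption, not the authors' implementation; their exact code is at the GitHub repository above.

```python
import numpy as np

def channel_shuffle(x: np.ndarray, groups: int) -> np.ndarray:
    """Shuffle channels of an (N, C, H, W) tensor across `groups`.

    Splits C channels into `groups` groups, then interleaves them so that
    subsequent grouped/depthwise convolutions see features from all groups.
    """
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    x = x.reshape(n, groups, c // groups, h, w)  # split channels into groups
    x = x.transpose(0, 2, 1, 3, 4)               # swap group and within-group axes
    return x.reshape(n, c, h, w)                 # flatten back to (N, C, H, W)

# Example: 6 channels in 2 groups -> channels interleave as 0, 3, 1, 4, 2, 5
x = np.arange(6).reshape(1, 6, 1, 1)
print(channel_shuffle(x, 2).ravel())  # [0 3 1 4 2 5]
```

The shuffle is a pure permutation of channels (no parameters, no information loss), which is why it is cheap to insert between DSConv blocks.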