Abstract
Intelligent fault diagnosis based on multisensor data fusion and two-dimensional convolutional neural networks (CNNs) has been widely studied and has achieved numerous excellent results. Existing studies usually develop multi-input models to facilitate data fusion but lack schemes that realize the fusion during the data-to-image conversion itself. Traditional methods that convert signals to grayscale maps and concatenate them into RGB images lose temporal correlation and are susceptible to interference. Moreover, few studies integrate the favorable features extracted at different stages of a CNN for diagnosis, which limits diagnostic performance. To this end, this article proposes a multisource signal-to-image method called the multidimensional distance matrix (MDM) and a multi-scale adaptive feature fusion convolutional neural network (MAFFCNN). First, MDM images emphasize the interrelationships between points at different moments in time-series data and thus preserve temporal correlation. Then, the convolution block of MAFFCNN extracts features at different scales in the image, and its attention branch better aggregates location information. In addition, MAFFCNN introduces efficient attention and cross-spatial learning to generate learnable weights based on the importance of features at different stages, thereby achieving adaptive feature fusion. Finally, the proposed methods are validated on an established gear fault dataset and a public bearing dataset. The experimental results demonstrate their effectiveness and their excellent robustness in complex environments.
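The abstract does not define the MDM construction in detail. As a rough illustration of the general signal-to-image idea it describes (an image whose pixels encode relationships between pairs of time points, one channel per sensor, similar in spirit to a recurrence plot), the following sketch builds a pairwise absolute-difference matrix for each sensor channel; the function name and the specific distance used here are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def distance_matrix_image(signals):
    """Sketch: turn multisensor 1-D signals into a multi-channel image.

    signals: array of shape (C, N) -- C sensor channels, N time steps.
    Returns an array of shape (C, N, N) where entry [c, i, j] is the
    absolute difference |x_c[i] - x_c[j]|, normalized to [0, 1] per
    channel. Pixel (i, j) thus encodes the relationship between time
    points i and j, so temporal correlation is kept in the image plane.
    NOTE: an assumed construction for illustration, not the paper's MDM.
    """
    signals = np.asarray(signals, dtype=float)
    # Pairwise |x_i - x_j| via broadcasting: (C, N, 1) - (C, 1, N).
    imgs = np.abs(signals[:, :, None] - signals[:, None, :])
    # Normalize each channel so pixel values lie in [0, 1].
    maxes = imgs.reshape(imgs.shape[0], -1).max(axis=1)
    imgs = imgs / np.where(maxes == 0.0, 1.0, maxes)[:, None, None]
    return imgs
```

Each resulting matrix is symmetric with a zero diagonal, and stacking the C channels yields a C-channel image that a 2-D CNN can consume directly.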
