Abstract
Effectively reducing the distribution discrepancy between domains is essential for accurate multichannel fault diagnosis under cross-domain conditions. When extracting domain-invariant features from multichannel data, existing transfer learning methods struggle to minimize the conditional distribution discrepancy between class-level subdomains while preserving intrinsic structural information. To address this issue, this article proposes a novel tensor transfer learning approach termed multilinear joint distribution adaptation (MJDA). Specifically, we extend the joint distribution adaptation (JDA) model into tensor space and formulate a corresponding optimization problem, which is solved efficiently by an alternating iterative algorithm. By estimating a set of transformation matrices without vectorization, MJDA directly minimizes the joint distribution discrepancy to extract domain-invariant features. Furthermore, twin support higher-order tensor machines are embedded as the tensor classifier, which both provides pseudo labels for the target domain and performs fault pattern recognition on the test data. Extensive transfer experiments on planetary gearbox datasets demonstrate that MJDA consistently outperforms other representative tensor transfer learning models.
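To make the joint-distribution-discrepancy idea concrete, the following is a minimal sketch of the JDA-style objective the abstract builds on: the discrepancy is measured as the marginal distribution gap between source and target plus the sum of class-conditional gaps, where target labels are pseudo labels supplied by a classifier. This is a generic illustration using a linear-kernel maximum mean discrepancy (MMD) on vectorized features, not the authors' multilinear (tensor-space) formulation; the function names `linear_mmd` and `joint_mmd` are illustrative choices, not from the article.

```python
import numpy as np

def linear_mmd(Xs, Xt):
    # Squared MMD under a linear kernel: ||mean(Xs) - mean(Xt)||^2,
    # i.e., the distance between the two sample means.
    diff = Xs.mean(axis=0) - Xt.mean(axis=0)
    return float(diff @ diff)

def joint_mmd(Xs, ys, Xt, yt_pseudo, classes):
    # JDA-style joint discrepancy: marginal term plus one conditional
    # term per class; target samples are grouped by pseudo labels.
    total = linear_mmd(Xs, Xt)
    for c in classes:
        Xs_c = Xs[ys == c]
        Xt_c = Xt[yt_pseudo == c]
        if len(Xs_c) and len(Xt_c):  # skip classes absent in either domain
            total += linear_mmd(Xs_c, Xt_c)
    return total
```

In JDA this quantity is minimized over a learned projection, and the pseudo labels are refreshed each iteration; MJDA instead minimizes it in tensor space via a set of mode-wise transformation matrices, avoiding vectorization.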
