Abstract
Multiple source transfer learning (MSTL), which leverages several related source domains to assist the learning task on a target domain, has found an increasing number of applications. However, MSTL algorithms often require solving quadratic programming problems, which can incur a heavy computational burden due to kernel matrix computation. In this paper, a novel common-decision-vector based multiple source transfer classification learning algorithm (CDV-MSTL) is proposed that does not depend on the intrinsic structure of the data. Because the algorithm is built on the structural risk minimization principle and an SVM-like framework, it offers good adaptability and better accuracy. Based on the theory of core vector machines (CVM), CDV-MSTL is further extended to a CVM-based version that enables fast training on large-scale data. Extensive experiments on synthetic and real-world datasets demonstrate that the proposed algorithm achieves significant improvements in classification performance over existing MSTL algorithms.