Abstract
In this paper we address the problem of dimensionality reduction of tensor data. This paper makes three contributions. Local homeomorphism is the intrinsic mathematical property of manifolds and the basis of many manifold learning algorithms. However, these algorithms were developed for vector data and are not directly applicable to tensor data. Our first contribution is to derive a tensor version of dimensionality reduction based on local homeomorphism. Tucker decomposition is widely used for dimensionality reduction of tensor data. However, Tucker decomposition without any regularization is essentially a traditional subspace learning problem. Our second contribution is to propose a local-homeomorphism-regularized Tucker decomposition and apply it to dimensionality reduction of tensor data; we call the method dimensionality reduction of tensor data based on subspace learning and local homeomorphism (SLLH for short). As far as dimensionality reduction is concerned, only the core tensor in Tucker decomposition is the target, while the mode product matrices are merely by-products. Therefore, many algorithms absorb all these mode product matrices into a single large matrix by using the conversion theorem of tensor algebra. However, in Tucker decomposition, each mode product matrix represents dimensionality reduction along a specific dimension of the tensor. Our third contribution is to propose an iterative solution method for SLLH, in which each mode product matrix of the current iteration is calculated from the other mode product matrices and the core tensor of the previous iteration. The core tensor is then updated iteratively from the iteratively calculated mode product matrices. The experimental results presented in this paper show that the proposed SLLH outperforms many state-of-the-art algorithms.
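To make the alternating scheme concrete, the following is a minimal sketch of an unregularized Tucker decomposition solved by alternating mode-wise updates (HOOI-style), the baseline that SLLH regularizes. This is an illustrative NumPy implementation under our own assumptions, not the authors' code; the function names (`unfold`, `mode_product`, `tucker_hooi`) are ours, and the local-homeomorphism regularizer itself is omitted.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front, flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    # Multiply tensor T by matrix M along the given mode.
    out = np.tensordot(M, T, axes=(1, mode))
    return np.moveaxis(out, 0, mode)

def tucker_hooi(T, ranks, n_iter=10):
    # Initialize each mode product matrix from the leading left singular
    # vectors of the corresponding unfolding (HOSVD initialization).
    factors = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
               for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(T.ndim):
            # Each mode product matrix is recomputed from the others:
            # project T by all factors except mode n, then take an SVD.
            Y = T
            for m in range(T.ndim):
                if m != n:
                    Y = mode_product(Y, factors[m].T, m)
            U, _, _ = np.linalg.svd(unfold(Y, n), full_matrices=False)
            factors[n] = U[:, :ranks[n]]
    # The core tensor (the dimensionality-reduced representation) is
    # obtained by projecting T with all mode product matrices.
    core = T
    for m in range(T.ndim):
        core = mode_product(core, factors[m].T, m)
    return core, factors
```

Note how each factor update uses only the other modes' factors, mirroring the iterative structure the abstract describes; SLLH would add a local-homeomorphism regularization term to each such subproblem.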