Abstract
Most feature selection methods determine a single global subset of features onto which all data instances are projected to improve classification accuracy. An attractive alternative is to adaptively find a local subset of features for each data instance, so that each instance is classified according to its own selected subspace. This paper presents a novel application of Gaussian Processes (GPs) that improves classification performance by learning a set of functions that quantify the discriminative power of each feature. Specifically, a GP regression is built for each available feature to estimate its discriminative properties over the entire input space. By locally joining these regressions, a discriminative subspace can then be obtained at any position in the input space. New instances are classified with a K-NN classifier that operates in these local subspaces. Experimental results show that using local discriminative subspaces yields higher classification accuracy than alternative state-of-the-art feature selection approaches.
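The pipeline sketched in the abstract can be illustrated in a few lines. The following is a minimal sketch, not the paper's actual method: the per-feature discriminative score (leave-one-out 1-NN correctness along a single feature), the RBF kernel, and all hyperparameters (`ls`, `noise`, `k`, `top`) are illustrative assumptions. Each feature's score is GP-regressed over the input space; at query time, the highest-scoring features at the query point define the local subspace used by K-NN.

```python
import numpy as np

def rbf_kernel(A, B, ls=1.0):
    # Squared-exponential kernel between row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior_mean(X_train, y_train, X_query, ls=1.0, noise=1e-2):
    # Standard GP regression posterior mean: k(X*,X) (K + sigma^2 I)^-1 y.
    K = rbf_kernel(X_train, X_train, ls) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf_kernel(X_query, X_train, ls) @ alpha

def feature_scores(X, y):
    # Illustrative local score per (instance, feature): is the leave-one-out
    # 1-NN along that single feature of the same class? (1.0 = yes).
    n, d = X.shape
    S = np.zeros((n, d))
    for j in range(d):
        for i in range(n):
            dist = np.abs(X[:, j] - X[i, j])
            dist[i] = np.inf
            S[i, j] = float(y[np.argmin(dist)] == y[i])
    return S

def classify(X_train, y_train, S, x, k=5, top=1):
    # GP-predict each feature's discriminative score at query x,
    # keep the `top` best features, then vote with K-NN in that subspace.
    scores = np.array([gp_posterior_mean(X_train, S[:, j], x[None, :])[0]
                       for j in range(X_train.shape[1])])
    keep = np.argsort(scores)[::-1][:top]
    dist = np.linalg.norm(X_train[:, keep] - x[keep], axis=1)
    votes = y_train[np.argsort(dist)[:k]]
    return np.bincount(votes).argmax()

# Toy data: feature 0 separates the classes, feature 1 is pure noise.
rng = np.random.default_rng(0)
n = 60
y = np.repeat([0, 1], n // 2)
X = np.column_stack([y * 2.0 + 0.3 * rng.standard_normal(n),
                     rng.uniform(-1.0, 1.0, n)])
S = feature_scores(X, y)
pred = classify(X, y, S, np.array([2.0, 0.0]))
```

Here the GP assigns feature 0 a high predicted score near the query (single-feature 1-NN is almost always correct there), so the local subspace reduces to the informative feature and the noise dimension is ignored.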
