Abstract
Feature selection, which reduces the dimensionality of the feature space without sacrificing classifier performance, is an effective technique in text categorization. Because many classifiers cannot handle high-dimensional feature spaces, noisy, irrelevant, and redundant information must be filtered out of the original feature space. To this end, a random-walk-based feature selection method (called RWFS) is proposed in this paper. First, an optimal feature selection method (called OPFS) selects an initial subset of features from the training set. Second, redundant features are filtered by combining a random walk algorithm with a pre-determined threshold. Furthermore, to search for the optimal threshold, an improved artificial bee colony method (called IMABC) is proposed for parameter optimization. In the experiments, support vector machine (SVM) and k-nearest neighbor (KNN) classifiers are applied to four corpora. The experimental results show that the proposed method significantly outperforms six typical feature selection methods and greatly reduces the dimensionality of the vector space while maintaining classification accuracy as measured by the F1 measure.
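The core idea described above, scoring features with a random walk over a feature-similarity graph and then dropping features below a threshold, can be sketched as follows. This is a minimal illustration only, not the paper's exact RWFS procedure: the function names, the PageRank-style damped walk, and the input similarity matrix are all assumptions for the example.

```python
import numpy as np

def random_walk_feature_scores(sim, d=0.85, tol=1e-8, max_iter=200):
    """Score features via a damped random walk (PageRank-style power
    iteration) over a feature-similarity graph. Illustrative sketch,
    not the exact RWFS algorithm from the paper."""
    n = sim.shape[0]
    # Column-normalize similarities into transition probabilities.
    col_sums = sim.sum(axis=0)
    col_sums[col_sums == 0] = 1.0  # avoid division by zero for isolated features
    P = sim / col_sums
    r = np.full(n, 1.0 / n)  # uniform initial distribution
    for _ in range(max_iter):
        r_new = (1 - d) / n + d * (P @ r)
        if np.abs(r_new - r).sum() < tol:  # L1 convergence check
            return r_new
        r = r_new
    return r

def select_features(sim, threshold):
    """Keep indices of features whose walk score reaches the threshold,
    filtering out low-scoring (likely noisy or redundant) features."""
    scores = random_walk_feature_scores(sim)
    return [i for i, s in enumerate(scores) if s >= threshold]
```

In the paper, the threshold passed to a function like `select_features` is not fixed by hand but tuned by the IMABC parameter-optimization step.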
