Abstract
Recent deep-model pruning methods predominantly target large-scale datasets and typically require fine-tuning before deployment. In real-world applications, however, pruning is often needed in scenarios with few classification categories, where fine-tuning must be avoided to preserve the model's generalization ability. To address these challenges, we introduce a novel pruning method called Cluster-based Redundancy Elimination (CRE). Specifically, CRE represents each convolutional kernel as a point in a high-dimensional space. A distance-based strategy then computes a clustering radius for each convolutional layer. Based on these radii, core-point filters are selected for pruning, since the information they carry is also captured by their neighboring filters in that space. This approach eliminates the need for fine-tuning and thus preserves the generalization of deep models. Extensive experiments on five benchmark datasets with limited classification categories, across multiple model architectures, demonstrate the effectiveness of our method and its superiority over several state-of-the-art pruning techniques.
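The pipeline the abstract describes (filters as points, a per-layer clustering radius, core points marked as redundant) can be sketched as follows. This is a minimal illustrative sketch only: the choice of radius (mean nearest-neighbor distance) and the `min_neighbors` threshold are assumptions for demonstration, not the paper's exact formulation.

```python
import numpy as np

def cre_prune_mask(weights, min_neighbors=2):
    """Sketch of Cluster-based Redundancy Elimination for one conv layer.

    weights: array of shape (out_channels, in_channels, kH, kW).
    Returns a boolean mask: True = keep the filter, False = prune it.
    The radius rule and threshold below are illustrative assumptions.
    """
    # Each filter becomes a point in a high-dimensional space.
    points = weights.reshape(weights.shape[0], -1)

    # Pairwise Euclidean distances between filters.
    diff = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)

    # Layer-wise clustering radius: here, the mean nearest-neighbor distance.
    radius = dist.min(axis=1).mean()

    # A filter is a "core point" if enough neighbors lie within the radius;
    # its information is captured by those neighbors, so it is pruned.
    neighbor_counts = (dist <= radius).sum(axis=1)
    keep = neighbor_counts < min_neighbors
    return keep
```

For example, a layer containing three nearly identical filters and two isolated ones would have the duplicated cluster marked for pruning while the isolated filters are kept, which is what removes redundancy without retraining.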