Abstract
Because few retrieval methods target this textile subdomain, general-purpose image retrieval methods struggle to meet the precision requirements of plaid fabric manufacturing enterprises. Drawing on the characteristics of plaid fabrics, this paper presents a novel image retrieval method for plaid fabrics using a convolutional neural network (CNN) with an attention mechanism. Global and local deep features are extracted by combining the attention mechanism with the CNN branches to fully characterize the plaid fabric images, and the features are then fused by an orthogonal fusion module. To reduce the amount of training data required, a novel training strategy is designed to jointly optimize the feature extraction and fusion tasks. The Annoy algorithm is used as the similarity measurement method to balance retrieval precision and efficiency. To verify the proposed scheme, over 44,000 fabric samples are collected from the factory to build a benchmark image database. Experiments show that precision and recall at rank five reach 77.5% and 57.1%, respectively, and the mean average precision reaches 0.758. The results demonstrate that the proposed method is effective and efficient; it can provide references for fabric manufacturing and exploit the advantages of historical production experience.
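The retrieval step summarized above ranks gallery images by the similarity of their fused deep features. A minimal sketch of that idea, using exact cosine similarity in NumPy as a simple stand-in for the paper's approximate Annoy index (feature dimensions, function names, and data below are illustrative, not taken from the paper):

```python
import numpy as np

def build_index(features: np.ndarray) -> np.ndarray:
    """L2-normalize gallery features so dot products equal cosine similarity."""
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    return features / np.clip(norms, 1e-12, None)

def retrieve(index: np.ndarray, query: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the top-k most similar gallery images for one query."""
    q = query / max(np.linalg.norm(query), 1e-12)
    sims = index @ q                # cosine similarity to every gallery item
    return np.argsort(-sims)[:k]   # ranked indices, most similar first

# Toy example: four gallery "fabric" feature vectors and one query vector
gallery = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7], [-1.0, 0.0]])
index = build_index(gallery)
top = retrieve(index, np.array([0.9, 0.1]), k=2)  # → indices [0, 2]
```

In the paper's setting, Annoy replaces the exhaustive dot product with a forest of random-projection trees, trading a small amount of precision for much faster search over the 44,000-image database.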
