Abstract
Pilling is a complex property of textile fabrics, representing, for the final user, an undesired feature that must be controlled and measured by companies working in the textile industry. Traditionally, pilling is assessed by visually comparing fabrics against a set of standard images, often resulting in inconsistent quality control. A number of machine vision methods have been proposed, almost all sharing the idea that pilling can be assessed by determining the number of pills or the area occupied by the pills on the fabric surface. In the present work a different approach is proposed: instead of counting pills, a machine vision-based procedure is devised with the aim of extracting a number of parameters characterizing the fabric. These are then used to train an artificial neural network to automatically grade the fabrics in terms of pilling. Tested against a set of differently pilled fabrics, the method proves effective.
Introduction
As widely known, ‘pilling’ may be defined as a surface defect of textile fabrics consisting of a number of pills (i.e., roughly spherical masses) made of entangled fibres [1].
Since pilling represents an undesired feature for the final user, its control and measurement are among the main issues for textile industries. Standing proud of the fabric surface and unpleasantly perceived by the final user, pills are generally caused by the combined washing and wearing of fabrics. Due to abrasion of the fabric surface, loose fibres entangle into short fine hairs (fuzz) and subsequently develop into spherical bundles anchored to the surface of the fabric. As explained in the D4970/D4970M-10e1 Standard (ASTM, 2010) and in the European Standard EN ISO 12945-2:2000 [2, 3], the tendency of a fabric to form pills is governed by a very complex process closely linked to a number of parameters such as fibre content, yarn title, number of twists, coverage factor of the mesh, type of fibre or blend, fibre dimensions, yarn construction and fabric finishing treatments.
Moreover, the pilling resistance of a specific fabric in actual wear varies markedly with the general conditions of use and with the individual wearer. Consequently, due to the complexity of the phenomenon, resistance to pilling (also referred to as ‘pilling assessment’) is recognized as one of the most important foci of industrial research activity. In fact, the presence of pilling seriously compromises a textile's acceptability to consumers. To date, one of the most widespread methods for pilling assessment consists of subjectively evaluating the fabric quality using appropriate equipment and a set of standard references, as described below.
Obviously, fabrics take a long time to be pilled in normal use; therefore, resistance to pilling is usually tested by simulated accelerated wear, followed by a manual assessment of the degree of pilling based on a visual comparison of the sample to a set of test images [4].
Fabrics are abraded by tumbling, brushing, or rubbing specimens against abrasive materials in machines (such as the Martindale or the pilling box), and then compared by skilled workers with visual standards, which may be actual fabrics or photographs. On the basis of such a comparison, the experts define resistance to pilling using the so-called ‘degree of pilling’, i.e., an index varying on an (arbitrary) scale ranging from 5, meaning no pilling, to 1, meaning very severe pilling [5,6]. Unfortunately, even when this approach is carried out by highly skilled workers, the reliability of pilling evaluation is quite limited and, as stated in [7], the accuracy is less than 80%.
In recent decades, automated visual inspection (
Early work was carried out in 1990 [8]: images of fabric samples abraded with a Martindale tester are captured under near-tangential illumination, thus acquiring images with high pill-to-background contrast. Such images are binarized using two different thresholds and then compared with a set of standard images. In [9] pill regions on fabric samples are localized by combining template-matching techniques and image thresholding. In [10] operations in both the spatial and frequency domains are introduced to segment pills from the textured background of the fabric web; the method calculates the total area occupied by pills in the sample image and assigns a degree of pilling. In [11] statistical features such as the mean, variance and median are employed to detect defects. In several other works (for instance [12], to cite just one) digital image processing, mainly based on thresholding, was used to determine the size, number, total area and mean area of pills on a fabric surface. In [13] an edge-flow-based fabric pilling segmentation algorithm that utilizes image colour, texture and the phase of the edge-flow vector was adopted in order to segment pills in various complex fabrics. This approach determines the total number of pills and the area and volume they occupy.
A remarkable approach to extracting pill features from fabric images was proposed in [14]; using two-dimensional Gaussian fit theory, the authors train a ‘pill template’ on actual pill images, and determine a reasonable threshold for image segmentation using a histogram-fitting technique. Using such an approach, five parameters describing pill properties (pill number, mean area of pills, total area of pills, contrast and density) are defined; from such data a definition of pilling grade is also provided.
Two-dimensional Fourier analysis and wavelet analysis were used in [15] with the purpose of objectively evaluating textile surface changes, including pilling. A more recent approach [16] used frequency domain image processing to separate periodic structures in the image (the fabric weave/knit pattern) from non-periodic ones (the pills). However, frequency domain analysis cannot provide location information. In [17] a CCD camera was used to capture the image of a laser line projected onto the surface of a series of fabric specimens; by means of trigonometric calculations the three-dimensional shapes of the inspected fabrics are then evaluated. Such 3D reconstruction is then used to determine the number, area, and density of pills.
Even if the above-mentioned methods implement different strategies for assessing the pilling of fabrics, almost all of them are focused on pill detection, i.e., on image segmentation. This segmentation is, in turn, aimed at determining parameters such as the number and density of pills and/or the area occupied by the pills on the fabric surface. Once this task is performed, the pilling grade is inferred from the number of pills, or from a comparison between the image of a defect-free fabric and that of the pilled one. Moreover, almost all methods use, at some point, image binarization with one or more thresholds, together with morphological operations on the images.
The present work is meant to propose a different strategy for pilling evaluation based on the combination of image processing techniques and an AI-based approach. Instead of analysing images of pilled fabrics in order to segment the pills from the fabric web, the main idea of the paper is to devise a computer-based method able to extract a number of objective parameters from images of pilled fabric samples, so that a feedforward backpropagation artificial neural network (ANN) can be trained to automatically grade the fabrics in terms of pilling.
Machine vision system and parameter extraction
With the aim of developing a rapid and efficient pilling grading system, the first step is to perform a laboratory pilling classification, according to the standard, to be used as a reference. On the basis of this classification it is then possible to build a machine vision system able to carry out the following tasks: (1) real-time acquisition with a CCD camera; (2) extraction of a number of parameters from the images; and (3) development of an ANN-based grading tool (the ANN is trained to evaluate the pilling degree on the basis of the parameters extracted in task (2)). Figure 1 shows a flow diagram of the above-mentioned tasks.

Flow diagram of the proposed procedure. After pilling, grading is performed by experts, a set of images is acquired, and, using image-processing algorithms, a number of parameters are extracted and used to train a neural network. Finally, a new set of fabrics is graded using the trained network.
The main goal of this task was to create a set of pilled fabrics with different degrees of pilling in the range 1–5 with a 0.5 interval between each degree. Nine different families of fabrics (with different colours and thicknesses) were selected (see Table 1). For each family, a set of samples was obtained by cutting nine specimens (38 mm diameter).
Fabric families, colour and thickness
As a result, a total of 81 samples was obtained. Each sample was then tested with a Martindale pilling tester using an appropriate testing procedure, described below. This allowed the classification of each fabric, from each family, into nine different classes on the basis of visually assessed pilling degree. As depicted in Table 2, the higher the class number, the lower the quality of the fabric surface: class 1 means no pilling, while class 9 means severe pilling.
Classification of fabrics into nine classes
In order to have the same number of samples with the same pilling degree, the following procedure (for the sake of simplicity described for fabric family ‘I’) was carried out.
The first step consisted of archiving a sample classified in class 1. Since class 1 refers to a non-defected sample, this step is performed by archiving a sample before laboratory testing is carried out.
The remaining eight samples were then laundered three times, conditioned in the standard atmosphere for textiles and placed on the eight abrading tables (available on the Martindale machine); afterwards, the samples were rubbed against the rotating abradants (rotational speed equal to 45 rpm) at low pressure (3 kPa) and with continuously changing directions (Shaw, 1954). The machine was run using a 100-cycle evaluation interval, after which the samples were visually evaluated by a panel of three experts in order to assess the pilling grade. Test cycles were carried out until the three experts agreed that a sample, among the eight placed ones, had ‘reached’ class 2. At that moment, one of the samples was removed from the Martindale machine and the test was started again. This step was repeated until each sample was assigned a pilling degree, according to Table 2. A unique identification number (‘id’) was assigned to each sample; id.II.3, for instance, means that the sample taken from family II was graded as class 3.
A machine vision system was devised in order to extract a number of parameters from pilled samples. Such a system consists of a sealed cabin, used in previous work [19]. The cabin hosts a Canon EOS 500D camera (provided with a 22.3 × 14.9 mm² CMOS sensor with a resolution of 4752 × 3168 pixels). A CIE standard illuminant D65 lamp, placed frontally to the fabric samples (i.e., with the camera principal axis approximately parallel to the normal vector of the fabric surface), was chosen in order to perform a repeatable and controlled acquisition. Images were acquired with an exposure time of 1/4 s at f/11, from a distance of approximately 50 cm so that the entire circular sample fills the frame. The spatial resolution is equal to 0.013 mm/pixel, so that the area of a single pill is approximately 50²–100² pixels. Images are acquired in
Each of the 81 samples, classified using the above-described procedure, was acquired with this machine vision system, and each acquired image was stored on a PC. Thus, the result of this acquisition task consists of a database of 81 images.
With the aim of training the ANN system, the following 11 parameters were extracted from the images:
Entropy curve-related parameters (2 parameters);
Total skewness and kurtosis (4 parameters);
CV coefficient (1 parameter);
Brightness-related parameters (4 parameters).
Prior to parameter extraction, the images are pre-processed with the aim of discarding both colour information and possible unevenness of illumination. This procedure is meant to extend the proposed approach to other acquisition setups characterized by different illumination systems, and even to ambient light conditions.
Pre-processing of images
Let

Fabric sample with id.IV.4. Original image
Such an image encompasses both the sample and the (white) background that has to be discarded. As a consequence an automatic procedure for cropping all
First, an Otsu-based thresholding [20] to the image
where
Such vertices were used for cropping the original image, thus resulting in a new image
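The cropping step described above can be sketched in a few lines of numpy; this is a minimal stand-in assuming an 8-bit greyscale image in which the sample is darker than the white background (the function names are illustrative, not taken from the original implementation):

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = gray.size
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var, w0, sum0 = 0, -1.0, 0, 0.0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        w1 = total - w0
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def crop_sample(gray):
    """Binarize (sample darker than the white background) and crop
    the image to the bounding box of the sample pixels."""
    t = otsu_threshold(gray)
    mask = gray <= t                       # sample pixels
    rows, cols = np.any(mask, axis=1), np.any(mask, axis=0)
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return gray[r0:r1 + 1, c0:c1 + 1]
```

On a real acquisition the bounding box would be applied to the original colour image; here the greyscale case suffices to illustrate the idea.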
In order to discard colour information, image
where the matrix
The knowledge of the XYZ values allows the colour transformation in the CIELAB space simply using the XYZ to CIELAB relations [22]. As a consequence the
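The colour-discarding step can be illustrated with a minimal numpy sketch of the sRGB → XYZ → CIELAB chain, keeping only the lightness channel L*; the conversion matrix and reference white are the standard sRGB/D65 values, while the function name is illustrative and not from the paper:

```python
import numpy as np

# sRGB (D65) to CIE XYZ matrix, from the sRGB specification.
M_RGB2XYZ = np.array([[0.4124, 0.3576, 0.1805],
                      [0.2126, 0.7152, 0.0722],
                      [0.0193, 0.1192, 0.9505]])
YN = 1.0  # D65 reference white luminance (normalized)

def rgb_to_lab_L(rgb):
    """Return the CIELAB lightness L* of an sRGB image with values in [0, 1]."""
    rgb = np.asarray(rgb, dtype=float)
    # inverse sRGB gamma (companding)
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    xyz = lin @ M_RGB2XYZ.T
    y = xyz[..., 1] / YN
    # CIELAB f(t) with the standard linear branch below (6/29)^3
    f = np.where(y > (6 / 29) ** 3, np.cbrt(y), y / (3 * (6 / 29) ** 2) + 4 / 29)
    return 116 * f - 16
```

Since only L* is retained, the a* and b* channels are omitted from the sketch.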
Even if the acquisition is performed using the appositely devised equipment mentioned above, homogeneous illumination of the samples is not assured. Accordingly, a method for removing possible unevenness in brightness needs to be used. Moreover, this enables the method to determine the pilling grade even when different illumination systems are used. With this aim, first, the point cloud
Then a least-squares approximation using B-splines [23] of
Let
where
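The brightness-flattening idea can be sketched as follows; note that the paper fits a least-squares B-spline surface [23], whereas this simplified stand-in fits a low-degree polynomial surface, which captures the same idea of estimating and removing a smooth illumination trend:

```python
import numpy as np

def flatten_illumination(gray, degree=2):
    """Fit a smooth polynomial surface to the brightness by least squares
    and remove it, preserving the mean level. A simplified stand-in for
    the B-spline surface fit used in the paper."""
    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x, y = xx.ravel() / w, yy.ravel() / h
    z = gray.ravel().astype(float)
    # design matrix of monomials x^i * y^j with i + j <= degree
    cols = [x ** i * y ** j
            for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    A = np.stack(cols, axis=1)
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    trend = (A @ coef).reshape(h, w)
    return gray - trend + trend.mean()
```

A pure illumination gradient is removed entirely by this correction, while local brightness variations (pills) survive as residuals around the mean.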
In Figure 3, the point cloud

Comparison between the surface
In Figure 4, a comparison between the image

(a) image
The above-described procedure was applied to all 81 fabric samples, thus obtaining a set of corrected images

corrected images corresponding to the nine pilling classes for the fabric samples obtained for family IV
Once each of the original fabric sample images is processed, thus obtaining a set of images [
With this aim, each image
where
By definition, image entropy tends to zero if the image is uniform (flat), while it reaches its maximum value for highly disordered images. In Figure 6 the comparison between the entropy curves for differently graded fabric samples is shown (in the example of Figure 6 these are evaluated for Family IV). As expected, image entropy values tend to rise as the threshold increases (i.e., as the degree of disorder rises). Once the maximum value (equal to 1) is reached, entropy values decrease until the threshold is too high to effectively discriminate objects in the image from the background. Once the entropy curve is computed, its standard deviation

Entropy curves for the 9 classes of family I
Generally speaking, fabrics with fewer defects tend to be characterized by lower
A second meaningful parameter, related to the entropy curve, can be defined by observing that the maximum value generally tends to be shifted towards lower brightness values as the pilling class worsens. As a consequence the ratio
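The two entropy-related parameters can be sketched as below; since the exact definition of the ratio is not reproduced here, the second feature (the relative position of the entropy maximum along the threshold axis) is an assumed, illustrative form:

```python
import numpy as np

def entropy_curve(gray, n_thresholds=255):
    """Binarize the image at each threshold and record the binary entropy
    of the resulting foreground fraction: 0 for a flat split, 1 at p = 0.5."""
    gray = np.asarray(gray, dtype=float)
    curve = []
    for t in np.linspace(gray.min(), gray.max(), n_thresholds):
        p = np.mean(gray > t)
        if p in (0.0, 1.0):
            curve.append(0.0)
        else:
            curve.append(-p * np.log2(p) - (1 - p) * np.log2(1 - p))
    return np.array(curve)

def entropy_features(gray):
    """Two hedged features: the curve's standard deviation and the relative
    position of its maximum (assumed form of the ratio defined in the text)."""
    c = entropy_curve(gray)
    return c.std(), np.argmax(c) / (len(c) - 1)
```

As in the paper, these values are not used to grade fabrics directly, but only as inputs for the ANN.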
In a recent study, the authors of [18] used an analogy between surface roughness definition in mechanics and the bi-dimensional brightness function. This analogy enabled the definition of two statistical parameters usually adopted for analysing mechanical rough surfaces:
Let the generic fabric sample image be described by the grey-level matrix
The result consists of two curves

Once the curves are computed, it is possible to define for each image
In the case of a fabric without pills, these four parameters tend roughly to zero, corresponding to a distribution oscillating close to the mean value (for a distribution of infinitely many accidental values, they would be exactly zero). In the presence of one or more dark (or light) areas (i.e., pills), the parameters assume values remarkably different from zero, thus indicating the occurrence of one or more peaks in the distribution. In other words, the above-defined parameters characterize, to some extent, the fabric-pilling properties. However, as in the case of entropy curves, these are not used as stand-alone parameters for grading the fabrics, but rather as inputs for the ANN-based system.
In addition to those defined above, a parameter experimentally proven to be related to the pilling grade of fabrics is the so-called ‘variation parameter’
where σ and μ are, respectively, the standard deviation and the mean value of the brightness level for the image
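A hedged sketch of the skewness, kurtosis and CV computations follows; since the exact construction of the two curves is not reproduced here, taking the row-wise and column-wise brightness profiles is an assumed reading of the text, and the function names are illustrative:

```python
import numpy as np

def _skew_kurt(v):
    """Skewness and excess kurtosis of a 1-D brightness profile."""
    v = np.asarray(v, dtype=float)
    d = v - v.mean()
    s = d.std()
    return (d ** 3).mean() / s ** 3, (d ** 4).mean() / s ** 4 - 3.0

def roughness_features(gray):
    """Assumed reading of the text: skewness and kurtosis of the row-wise
    and column-wise brightness profiles (4 values), plus the variation
    coefficient CV = sigma / mu of the whole image (1 value)."""
    gray = np.asarray(gray, dtype=float)
    row_profile = gray.mean(axis=1)   # brightness curve along rows
    col_profile = gray.mean(axis=0)   # brightness curve along columns
    sk_r, ku_r = _skew_kurt(row_profile)
    sk_c, ku_c = _skew_kurt(col_profile)
    cv = gray.std() / gray.mean()
    return sk_r, ku_r, sk_c, ku_c, cv
```

For a defect-free fabric the profiles oscillate symmetrically about their mean, so the skewness values stay near zero, consistent with the behaviour described above.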
Finally, four more parameters were extracted from all of the images of the set [
Since in the generic image
The major contribution of this work is to take the parameters described above into account simultaneously in order to perform automatic fabric pilling grading effectively. This is accomplished by using the parameters to train a FFBP ANN, which allows the objectives of this work to be reached without formulating experimental thresholds for each statistical parameter. This is a straightforward approach able to prevent false alarms or unreliable detections and classifications. The use of an ANN-based approach is particularly suitable for the classification problem studied since, although such parameters are recognized as being influenced by the pilling grade, the influence of each of them individually is not known.
Training Set
As described above, from each image
Parameters for family IV
Since 81 fabric images were selected during the laboratory-testing phase, a database of 81×11 parameters was built, consisting of the following matrix
where the generic vector
In order to validate the behaviour of the ANN, only a subset of data was used to train the network i.e., the actual target set is defined as a subset (matrix sized 50×11)
The aim of the ANN is to classify the fabric into one of the nine classes shown in Table 2. For this purpose, for each column
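The construction of the binary target vectors, one 9-element column per sample with a single 1 in the position of the sample's class, can be sketched as follows (the function name is illustrative):

```python
import numpy as np

def one_hot_targets(classes, n_classes=9):
    """Encode pilling classes 1..9 as 9-element binary target vectors,
    one column per sample (matching the nine output neurons)."""
    classes = np.asarray(classes, dtype=int)
    T = np.zeros((n_classes, classes.size))
    T[classes - 1, np.arange(classes.size)] = 1.0
    return T
```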
ANN architecture and training
The network devised for the classification system, whose structure is shown in Figure 8, has three layers (input, hidden and output): 11 input units, h hidden sigmoid units and nine logistic output units. The best network is usually chosen by evaluating, for the given problem, the network architecture and parameters within a set of candidate configurations. For this purpose, the number of hidden units was varied from 12 to 33 in steps of three units, comparing the ANN performance using the training

ANN architecture consisting of three layers: the input layer is composed of 11 neurons; the hidden layer is composed of 15 neurons; the output layer consists of nine neurons, one for each class. In this example the ANN accepts as input a vector of parameters extracted for a ‘class 4’ fabric sample thus trained using a binary output with one value in fourth position.
Training was carried out using a rule based on the Levenberg–Marquardt algorithm [28] with a combination coefficient μ = 0.05. Such a value [28] allows the ANN weights to be updated in a stable and fast way by approximating the Hessian matrix using the Jacobian matrix. During training, the weights and biases of the network are iteratively adjusted to minimize a specified network error function (in this work, the mean square error (MSE) corresponding to the training set elements). The error is monitored during the training process and normally decreases during the initial phase of training.
When the network becomes excessively specialized in reproducing the training data, the validation error (i.e., the MSE evaluated using the set
Training, for the set of 81 training vectors, is performed in 12 seconds using a PC with a Core i5 CPU and 8 GB of RAM. Once trained, the ANN accepts as input any 11-element vector
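The architecture described above can be sketched as a minimal, self-contained feedforward network; note that this stand-in is trained with plain gradient descent on the MSE rather than the Levenberg–Marquardt rule used in the paper, and all names are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlp(X, T, hidden=15, lr=1.0, epochs=5000, seed=0):
    """Minimal feedforward net (sigmoid hidden layer, logistic outputs)
    trained on the MSE by plain gradient descent -- a stand-in for the
    Levenberg-Marquardt rule used in the paper.
    X: (n_features, n_samples); T: (n_classes, n_samples) binary targets."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (hidden, X.shape[0]))
    b1 = np.zeros((hidden, 1))
    W2 = rng.normal(0.0, 0.5, (T.shape[0], hidden))
    b2 = np.zeros((T.shape[0], 1))
    n = X.shape[1]
    for _ in range(epochs):
        H = sigmoid(W1 @ X + b1)          # hidden activations
        Y = sigmoid(W2 @ H + b2)          # network outputs
        dY = (Y - T) * Y * (1.0 - Y)      # MSE gradient through the logistic
        dH = (W2.T @ dY) * H * (1.0 - H)
        W2 -= lr * (dY @ H.T) / n
        b2 -= lr * dY.mean(axis=1, keepdims=True)
        W1 -= lr * (dH @ X.T) / n
        b1 -= lr * dH.mean(axis=1, keepdims=True)
    def predict(X_new):
        """Return the winning class (1..n_classes) for each input column."""
        out = sigmoid(W2 @ sigmoid(W1 @ X_new + b1) + b2)
        return out.argmax(axis=0) + 1
    return predict
```

On the real data, X would be the 11×n matrix of extracted parameters and T the 9×n binary targets; as described above, training would additionally be stopped early by monitoring the validation error.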
In order to test the performance of the devised approach a series of six new fabric families (i.e., families X, XI, XII, XIII, XIV and XV) were collected and graded according to the procedure described in Section 2.2, thus resulting in a set of 54 differently graded fabric samples. Each sample was then acquired using the machine vision system described in Section 2.2 and the resulting images were processed in order to extract the corresponding 11 parameters as described in Sections 2.3–2.6. Finally, the set of parameters was used to simulate the ANN, thus obtaining a pilling grading for the 54 new samples.
With the aim of assessing the performance of the pilling grading system, a reliability index η, given by the following equation, was used:
where
Since a misclassification in class (
In effect, this misclassification is not considered serious by textile experts, since the fabric brought to market is actually even better than declared by the textile laboratory. Obviously, the worse the classification, the lower the weight.
Conversely, if the ANN-based system misclassifies to a better class, i.e., (
In Table 4 the results of ANN simulation are compared with the classification performed by experts. The misclassification in terms of erroneous class is also provided. The reliability index for the 54 tested fabric samples is equal to 88.52%.
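Since the exact weights of the reliability index are not reproduced here, the sketch below uses hypothetical weights (1 for an exact match, a reduced weight decaying with the error size for conservative errors, 0 for optimistic errors) purely to illustrate the structure of the index:

```python
def reliability_index(predicted, actual, conservative_weight=0.8):
    """Hedged sketch of the reliability index (percent). Exact matches
    score 1; 'conservative' errors (predicted class number higher, i.e.
    worse quality, than the experts' class) get a reduced weight decaying
    with the error size; optimistic errors score 0. The actual weights
    used in the paper are not reproduced here."""
    total = 0.0
    for p, a in zip(predicted, actual):
        if p == a:
            total += 1.0
        elif p > a:  # worse class declared: conservative misclassification
            total += conservative_weight ** (p - a)
        # p < a: optimistic misclassification, weight 0
    return 100.0 * total / len(actual)
```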
Comparison between ANN-based classification and experts’ classification, SOFM classification and k-means clustering-based classification, respectively
Since the present paper introduces a metric for pilling classification based on neural networks (that processes the image-based parameters so as to perform a supervised classification) it is useful to compare the obtained results with those obtainable using two other known methods: self-organizing feature map (SOFM) and k-means clustering [31].
SOFM is a type of ANN that produces a low-dimensional, discretized representation of the input space of the training samples; in this case the SOFM should be able to map a transformation from ℜ11 → ℜ9, i.e., from the input space of image parameters to classes. Since SOFM uses a neighbourhood function to preserve the topological properties of the input space, the space is processed in an unsupervised manner.
As a consequence, the classification performed by this method occurs irrespective of the knowledge provided by visual human classification, being based solely on the topological position of the parameters in the input space. Results obtained using a SOFM-based classification with an output map of size 3×3, implemented in a MATLAB® software environment, are listed in Table 4. The reliability index, using SOFM, for the 54 tested fabric samples was 43.52%; this result is quite unsatisfactory, since the SOFM tends to underestimate the pilling grade.
k-means clustering is a technique used to partition n observations into k clusters, in which each observation belongs to the cluster with the nearest mean, which serves as a prototype of the cluster. In this case the aim was to partition the 11-parameter vectors into nine classes.
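A minimal Lloyd's-algorithm sketch of k-means follows, purely to illustrate this unsupervised baseline; it is not the implementation used in the paper:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd's k-means. X: (n_samples, n_features).
    Returns cluster labels and centroids."""
    rng = np.random.default_rng(seed)
    # initialize centroids as k distinct observations
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each observation to the nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centroids (empty clusters keep their previous centroid)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids
```

Applied to the fabric data, X would hold one 11-parameter row per sample with k = 9; the resulting cluster labels are unordered, which is one reason an unsupervised partition struggles to match the expert grading.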
As in the case of SOFM, the classification performed using k-means is unsupervised. Applied to the selected parameters extracted from images as described above, this classifier performs better than SOFM, but the results, shown in Table 4, are still unsatisfactory with respect to those obtained using the ANN-based method.
The reliability index, using k-means, for the 54 tested fabric samples is equal to 65.19%. These results are not completely unexpected; even if each parameter extracted from fabric images describes a different property of the fabric texture, the pilling classification is performed by experts on the basis of the overall aspect of the fabric (in comparison with a standard). In other words, the human-based classification consists of an (unconscious) critical elaboration of visible features in the fabric aimed at performing a classification.
Accordingly, supervised training seems to be more reliable than unsupervised methods, where the parameters are processed only in terms of topological distances in the ℜ9 space (classes).
In order to test the repeatability of the proposed method, with particular reference to the image correction algorithm described in Section 2.3, the entire set of fabrics composing Family XI is acquired by moving the CIE standard illuminant D65 so that the projected light on the fabrics under investigation is deliberately non-homogeneous.
In particular, two different acquisitions are performed: the first is obtained by tilting the illuminant with an angle of approximately 30° with respect to the fabric surface normal vector (in both x and y directions); the second acquisition is performed by tilting the illuminant with an angle of approximately 80° along the x direction and 10° along the y direction.
In Figure 9 the two acquisitions of the fabric belonging to Family XI and classified by the experts in class 5 are depicted: Figure 9a shows an image acquired with the first setup (illuminant tilted by approximately 30° in both the x and y directions), while Figure 9b shows an image acquired with the second setup (approximately 80° along the x direction and 10° along the y direction).

a) example of a fabric image acquired by tilting the illuminant with an angle of approximately 30° with respect to the fabric surface normal vector (in both x and y directions); b) example of fabric image acquired by tilting the illuminant with an angle of approximately 80° along the x direction and 10° along the y direction
The acquired images are corrected, and the 11 fabric parameters to be used as input for the ANN are consequently extracted. In Table 5 the values of the extracted parameters for the fabric taken as an example in Figure 9 are listed.
Comparison between parameters extracted using the proposed method under controlled illumination and, respectively, by tilting the illuminant with an angle of approximately 30° with respect to the fabric surface normal vector (test 1) and by tilting the illuminant with an angle of approximately 80° along the x direction and 10° along the y direction (test 2)
Finally, the ANN is simulated using the 11 parameters. The results of classification, listed in Table 6, show that the method is able to coherently classify the fabrics even using a different illumination setup. Obviously, this simple test is not sufficient to assess the reliability of the method under strongly different conditions and further analysis is recommended for the future.
Classification results for Family XI using different illuminations
In the present work a method for automatically classifying the pilling grade of fabrics is devised and tested. The method integrates hardware and software, based on ANNs, in order to perform the classification.
The proposed method proves effective, as it respects the human-based classification with a reliability of around 87.5%. Referring to the test set of 54 fabrics, the ANN-based software was able to correctly classify 44 fabrics (81.48% of the fabrics) and to classify nine fabrics in a conservative way. This means that if the textile company is willing to accept a conservative misclassification (i.e., a classification in class (
The definition of a reliability index, as discussed in the previous section, is crucial for two reasons: first, the index obviously enables the measurement of the performance of the developed method; second, it allows a comparison between the proposed method and other analogous systems (not referring to pilling grade but to machine vision systems in the textile field) provided by the literature. For instance in [29] an accuracy varying in the range 87.6–97.1% is assessed. In [30] an average classification rate is defined, varying in the range 84.4%–98.2%. Since these results support those provided by the present work, it can be asserted that the proposed method is aligned to the state of the art of machine vision-based systems in the textile field.
It must be noted that the proposed method still has some crucial drawbacks. The most important limitation is a typical weakness of ANN-based methods: to work properly, the system needs to be trained. This means that for any kind of fabric to be inspected, at least a set of specimens has to be processed according to the procedure described in Section 2.1.
In effect, there is no experimental evidence that an ANN trained to perform pilling assessment on a certain type of fabric is reliable for classifying other, different, textiles. Since the experimental procedure is time-consuming, the setup of the system requires a preliminary phase consisting of the construction of a database of fabric typologies.
Moreover, if the textile company using the method starts to produce a new kind of fabric, the characterization of pilling grades has to be performed again in order to train the ANN.
Fortunately, the computational times for extracting image-related information and for training the ANN are modest: image processing is performed in less than 15 seconds using a PC with a Core i5 CPU and 8 GB of RAM, while training, depending on the size of the training set, may take from a few seconds (e.g., a training set sized 50×11) to 2 minutes (e.g., a training set sized 500×11). Another important issue to be taken into account is that the method has been tested only on monochromatic fabrics. As a consequence, it has to be further tested, and improved, for mélange or speckled fabrics where two or more colours coexist in weft and warp (or, generally speaking, in the fibres).
Future work will also address testing the reliability of the method under several non-homogeneous light conditions. In fact, the results of the present work demonstrate that image correction can reduce the impact of uneven lighting on fabrics; further study is needed to assess whether the method remains effective using different kinds of illuminant placed in different positions, and even under the extreme conditions of ambient light. Moreover, since a number of methods for correcting images with non-uniform backgrounds exist [32], it could be useful to test a range of such methods in this particular textile application.
