Wuhan Univ. J. Nat. Sci.
Volume 28, Number 6, December 2023
Page(s): 508-522
DOI: https://doi.org/10.1051/wujns/2023286508
Published online: 15 January 2024
Computer Science
CLC number: TS187; TP399
Automated Density Measurement of Weft Knitted Fabrics Using Backlight Imaging
^{1} School of Textiles and Fashion, Shanghai University of Engineering Science, Shanghai 201620, China
^{2} Office of Academic Research, Shanghai University of Engineering Science, Shanghai 201620, China
^{†} To whom correspondence should be addressed. Email: liushuhua1093003@163.com
Received: 28 July 2023
This paper proposes a new density measurement algorithm to address the poor applicability and inaccurate results of existing automatic density measurement algorithms for weft-knitted fabrics. The algorithm collects a transmitted light image of the knitted fabric, calculates the tilt angle with a skewing correction algorithm, and rotates the image to correct the weft skew present therein. The pre-rotated and post-rotated images are then projected vertically and horizontally in grayscale, and the resulting projection curves represent the distribution of loops along the wales and courses. A wave peak coordinate verification algorithm is proposed to calculate the coursewise and walewise densities of the knitted fabric. In experiments, the proposed density measurement method exhibits an accuracy above 98% compared with the manual method.
Key words: knitted fabric / fabric density / loop identification / image analysis / density measurement
Biography: ZHANG Jing, female, Master candidate, research direction: fabric image analysis. Email: ZhangJing0576@outlook.com
Foundation item: Supported by the National Natural Science Foundation of China (61876106), the Shanghai Local Capacity-Building Project (19030501200), and the Class Ⅲ Peak Discipline of Shanghai—Materials Science and Engineering (High-Energy Beam Intelligent Processing and Green Manufacturing)
© Wuhan University 2023
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
0 Introduction
Fabric density refers to the number of yarns per unit length in a fabric. It significantly determines various textile properties, including appearance, texture, and air permeability. It is critical in evaluating textile quality and improving product design and manufacturing practices. Conventionally, fabric density measurement involves manual disassembly of the fabric to count the individual yarns, which results in irreversible damage to the material. Alternatively, a cloth counting glass can measure the yarns in a fixed length, which is a more convenient and commonly used approach. The traditional method is easy to operate but can quickly cause visual fatigue. Subjective factors, such as the experience and physical condition of the inspector, make it difficult to obtain accurate and rapid test results. A more efficient and precise automated fabric density measurement method must be developed to address these limitations.
In recent years, computer vision technology has been widely applied to quality assessment in the textile industry, achieving good results in fields such as defect detection^{[1]}, pilling evaluation^{[2]}, fiber measurement^{[3]}, yarn inspection^{[4]}, nonwoven structure measurement^{[5]}, fabric surface smoothness measurement^{[6]}, and fabric density measurement. According to the method of fabric image processing, existing density measurement methods can be divided into two categories: frequency domain methods and spatial domain methods.
Frequency domain analysis was employed in early automatic fabric density measurement studies. It uses methods such as the Fourier transform or wavelet transform to analyze fabric images with typical periodic textures. The fabric density is then calculated by reconstructing fabric images using the energy spectrum peaks that characterize the periodicity of the fabric structure. In 1988, Imaoka et al^{[7]} designed an automatic fabric density measurement device using the Fourier method to analyze the density of woven fabrics. The highest test accuracy of 80% was achieved for plain samples. Wood^{[8]} proposed a Fourier transform-based fabric texture measurement method to characterize carpet patterns. Based on the two-dimensional Fourier transform power spectrum of the fabric image, the method measured the average power spectrum values of a circular region centered on the origin and of corner sectors centered on the origin. These values contained texture information that could be used for fabric density measurement, since image texture is the basis for fabric density identification. Hosseini et al^{[9]} collected the values of the two-dimensional power spectrum of a sector centered at the origin to obtain the angular Fourier power spectrum (APS). The fabric density was calculated from the texture orientation information in the APS waveform and the autocorrelation function obtained from the inverse Fourier transform. Sari-Sarraf and Goddard^{[10]} used continuous concentric annular regions of predetermined width to cover the Fourier spectrum of the fabric image. The spectral values within each annular region were summed and normalized to generate one-dimensional features, which were then used to calculate the fabric density. Xu^{[11]} selected peaks in different directions in the energy spectrum obtained by FFT for image reconstruction.
This allowed the periodic structures to be separated and the warp or weft components to be extracted separately to achieve fabric density measurement. Moreover, the Fourier transform technique could obtain the spectral models of three basic weaves, plain, twill, and satin, and establish correspondence between the woven fabric surface texture and the spectrogram models^{[12-14]}.
In 2005, Jeong et al^{[15]} concluded that the grayscale projection method outperformed the Fourier transform approach in density measurements. This is because the main principle of the fast Fourier transform (FFT) is to detect the peaks in the power spectrum that correspond to the warp and weft periodicity of the fabric; however, the fabric's weave pattern and color pattern may produce additional peak points that interfere with the calculation results. A two-sided fusion method^{[16-18]} has been proposed to mitigate the influence of yarn color. This method fuses the front and back images of the fabric using algorithms such as the wavelet transform and then applies the Fourier transform to measure the density of the fused images. This approach has been found to yield highly accurate results. It is worth noting that the Fourier transform method continued to be considered adequate for obtaining fabric density as late as 2015^{[19,20]}, and it is particularly well-suited for real-time monitoring of moving fabrics, making it suitable for online inspection systems^{[21,22]} and portable inspection systems^{[23]}. Moreover, for high-tightness woven fabrics, the Fourier transform method has been found to exhibit good measurement performance^{[24]}.
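The frequency-domain principle described above can be illustrated with a minimal sketch: the FFT power spectrum of a periodic texture has peaks whose positions give the horizontal and vertical yarn periodicity. This is an illustration only, not any of the cited implementations; the function name and synthetic texture are assumptions.

```python
import numpy as np

def fft_periodicity(image):
    """Estimate the dominant horizontal and vertical spatial frequencies
    (cycles per image) of a periodic texture from peaks in its 2-D
    Fourier power spectrum."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    # Profiles through the spectrum centre, excluding the DC term;
    # the largest off-centre peaks mark the warp/weft periodicity.
    row_profile = spectrum[cy, cx + 1:]
    col_profile = spectrum[cy + 1:, cx]
    fx = np.argmax(row_profile) + 1  # horizontal frequency
    fy = np.argmax(col_profile) + 1  # vertical frequency
    return fx, fy
```

A fabric with, say, 10 yarns across the image width would produce a peak at horizontal frequency 10; weave and color patterns add extra peaks, which is exactly the interference discussed above.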
The wavelet transform, an extension of the Fourier transform, represents another frequency domain method widely employed for extracting information on warp and weft density^{[25-27]}. This approach involves decomposing the fabric image into low and high frequencies and reconstructing the horizontal and vertical components to extract weft and warp yarn information, respectively. The reconstructed image can provide insight into yarn orientation. However, yarn segmentation is not consistently precise, limiting the method's performance.
The frequency domain method generally applies to plain and twill fabrics and provides high accuracy in density measurement for solid color fabrics. However, misjudgment may occur in the density measurement of colored woven fabrics.
The spatial domain method relies predominantly on the periodic variation of grayscale values in fabric images to determine their structural parameters. In an ideal scenario, this technique yields a yarn localization image with clear warp and weft segmentation. The approach demonstrates high recognition accuracy when the fabric density is low and the image contrast is high. Lin et al^{[28]} employed feature parameters derived from the Gray-Level Co-occurrence Matrix (GLCM) to measure fabric density. Their results showed that this technique accurately calculates the density of plain fabrics but has limitations in the density detection of twill and satin fabrics. Ajallouian et al^{[29]} binarized gray images below a specific threshold value to generate black and white regions representing the intersections of warp and weft threads. The fabric density was then determined by counting the number of these regions in the horizontal and vertical directions.
The grayscale projection method is the most widely used spatial domain technique for fabric density calculation. Pan et al^{[30,31]} proposed a method that utilized grayscale projection and correlation coefficient analysis to locate warp and weft yarns and determine the number of yarns per unit length, yielding the corresponding warp and weft densities. This algorithm can accurately detect the density of single-system-mélange color fabrics and double-system-mélange yarn-dyed fabrics. Zhang et al^{[32]} proposed a novel density-tissue structure formula for measuring the warp and weft density of high-tightness woven fabrics. The weft density of twill and satin fabrics was determined via the projection method, and the proposed formula was employed to calculate the warp density.
Fabric images can be affected by minor variations such as illumination and yarn deformation, introducing noise that manifests as random extreme points on grayscale projection curves. This interference poses a challenge to yarn segmentation and density calculation. Researchers have proposed several solutions to tackle this issue. Wang et al^{[33]} computed the extreme waves by differentiating the grayscale projection curves and subsequently determined the second-order derivatives of the extreme waves. Local corrections were then made to derive density waves that indicate the position and number of yarns required for fabric density calculation. They also compared the detection efficiency of backlight and reflected light images and demonstrated that images obtained under transmitted light are more effective for the automatic detection of yarn-dyed fabric density. Zhang et al^{[34]} employed a mathematical statistical method and a data smoothing algorithm to process fabric image projection curves and obtain probability distribution maps of peaks. By counting the peaks in the frequency curve, they could accurately detect the number of yarns and calculate the density of color woven fabrics. Yuan et al^{[35]} used the relative total variation model based on structure extraction from texture (RTVSFT) to obscure non-periodic texture information of the fabric while preserving color, brightness, and yarn cross-structure information. A modified Gaussian filtering technique was applied to smooth extreme points in the grayscale projection curve to enable yarn segmentation and density calculation. The Fourier transform^{[36]}, moving average filter^{[37]}, and locally weighted regression (LOESS)^{[38]} have also been used to analyze grayscale projection curves and achieve improved curve fitting.
In the above methodology, it is necessary for the yarns of the woven fabric to be aligned parallel to the boundary of the sample image.
However, for fabrics with twill patterns or tilted yarns, it is imperative to identify the tilt angle to determine yarn spacing and measure yarn density accurately. Various algorithms, including the Fourier transform^{[37]} and Hough transform^{[38-42]}, can be employed to calculate the tilt angle. Aldemir et al^{[43]} demonstrated that the number of peaks in the grayscale projection curve is minimal when there is no yarn tilt in the fabric image. Based on this observation, they identified the fabric image's tilt angle and then applied anisotropic diffusion filtering (ADF) and a Gabor filter to improve the edge information. Finally, they performed grayscale projection to determine yarn density. Zheng et al^{[44]} proposed a multiscale density detection approach using a texture unit detection model to extract texture units from the original macroscopic fabric image and obtain a microscale fabric image. They corrected the yarn skew based on the texture unit detection results and used the lightness gradient projection method to analyze yarn positions in the microscale subimages to determine the fabric density.
Recently, the emergence of deep learning techniques has sparked research interest in training convolutional neural networks (CNNs) to detect fabric density. Meng et al^{[42]} utilized a portable device to capture fabric images and construct fabric datasets for training a multiscale convolutional neural network (MSnet) to locate yarns. Accurate density measurement is the foundation for further identifying weave patterns^{[45]}. Although neural networks have shown good fault tolerance and adaptability, they require sufficient sample data for training to achieve good adaptability across different samples. Furthermore, the mechanism of the training process remains insufficiently understood. As a result, few studies have employed deep learning methods for automated fabric density measurement.
In brief, although there have been significant achievements in automatic fabric density detection, some problems remain unresolved. As far as we know, current research mainly focuses on woven fabrics, and there are few studies on the density detection of knitted fabrics. Knitted fabrics present additional challenges due to their looped structure and yarn characteristics. The reduced twist of the yarn used in knitting promotes greater hairiness on the fabric's surface, and knitted fabrics are susceptible to skewing during production due to structural differences (as shown in Fig.1). The Hough transform is the most commonly used method for yarn skew detection in woven fabrics, extracting linear features for yarn identification. In 2015, Zhang et al^{[46]} employed the Hough transform to determine the coursewise density of knitted fabrics by extracting wales from binary images. They performed geometric analysis to determine the course spacing for walewise density calculation. However, when the coursewise density is high and the gap between the legs of the wale loops is tiny, the curvature of knitted fabric loops can prevent the extraction of an ideal straight line, resulting in incorrect Hough transform recognition and wale division. While the frequency domain method used for woven fabric density detection can also work for knitted fabrics, its applicability needs improvement. When the Fourier transform converts knitted fabric images to the frequency domain, bright spots are generated from both the loop and leg gap regions in the transverse direction of the two-dimensional spectral map. Processing these bright spots is complicated, and the results are not satisfactory.
Fig.1 Illustration of woven fabric and knitted fabric 
A new density measurement algorithm for weft-knitted fabrics is proposed in this paper to overcome the issues of poor applicability and inaccurate results. The algorithm involves collecting a transmitted light image of the fabric, calculating the tilt angle using a loop skewing correction algorithm, rotating the image to correct the skew, and projecting the pre-rotated and post-rotated images vertically and horizontally in grayscale. A wave peak verification algorithm is then used to calculate the coursewise density (wales per length) and walewise density (courses per length) of the fabric. The algorithm's validity is verified by testing different knitted fabric samples and comparing the results with manually calculated ones.
1 Methodology
1.1 Image Acquisition
The study used the KOPPACE 4KCH microscope image acquisition system to capture reflected and transmitted light images of weft-knitted fabric samples in RGB mode. The system has two LED lamps (as shown in Fig.2) for adjusting the illumination to obtain transmitted and reflected light. The acquired images were cropped to 1 600 pixels × 1 600 pixels, corresponding to an actual fabric area of 10 mm × 10 mm.
Fig.2 Composite light source image acquisition system 
We captured transmitted light images of 30 sample knitted fabrics, as shown in Fig.3. Reflected light images of the same areas of the same fabric samples in Fig.3 were also captured to verify the method's effectiveness. The transmitted light images have more apparent holes than the reflected light images, making loop identification and localization easier. Figure 4 shows the distribution of the samples' coursewise and walewise densities.
Fig.3 Images of knitted fabric samples 
Fig.4 The distribution of the coursewise densities and walewise densities (Note: the green bar chart shows the number of samples in different intervals)
1.2 Iterative Threshold Segmentation
Deriving loop numbers from the knitted fabric image entails converting it into a binary representation, followed by row and column projections. The original image is downsampled to a resolution of 512 pixels × 512 pixels to facilitate computation. An edge enhancement operation is performed on the image to achieve better binarization results. In this study, the Laplace operator is employed for this purpose. This second-order derivative operator selectively amplifies the regions of the image with high grayscale discontinuities, thereby enhancing its sharpness.
The Laplace operator finds the partial derivatives along the two spatial axes of the image $f\left(x,y\right)$. The Laplace value at position $\left(x,y\right)$ is defined as
$\begin{array}{l}{\nabla}^{2}f\left(x,y\right)=\frac{{\partial}^{2}f\left(x,y\right)}{\partial {x}^{2}}+\frac{{\partial}^{2}f\left(x,y\right)}{\partial {y}^{2}}\\ =f\left(x+1,y\right)+f\left(x-1,y\right)+f\left(x,y+1\right)\\ +f\left(x,y-1\right)-4f\left(x,y\right)\end{array}$(1)
where $f\left(x,y\right)$ denotes the input image at the actual coordinate position $\left(x,y\right)$.
The original image is then superimposed with the Laplace image to obtain the sharpened image
${f}^{*}\left(x,y\right)=f\left(x,y\right)-{\nabla}^{2}f\left(x,y\right)$(2)
The sharpened knitted fabric image was then converted to grayscale. We noted that the fabric surface was affected by yarn hairiness. To mitigate this issue, a median filter with a window size of 3×3 was applied to the grayscale image. This filtering alleviates the interference arising from inherent flaws of the yarn, such as hairiness, as well as errors introduced during image acquisition.
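The sharpening step of Eqs. (1) and (2) can be sketched as follows. Leaving border pixels unchanged is an assumption; the paper does not specify its border handling.

```python
import numpy as np

def laplacian_sharpen(f):
    """Sharpen a grayscale image with the 4-neighbour Laplacian of
    Eq. (1) and the superposition of Eq. (2): f* = f - lap(f),
    where the Laplacian kernel has a -4 centre. Border pixels are
    left unchanged (an assumption, not from the paper)."""
    f = f.astype(np.float64)
    lap = np.zeros_like(f)
    # Interior Laplacian: right + left + down + up - 4 * centre
    lap[1:-1, 1:-1] = (f[1:-1, 2:] + f[1:-1, :-2]
                       + f[2:, 1:-1] + f[:-2, 1:-1]
                       - 4.0 * f[1:-1, 1:-1])
    return f - lap
```

On a flat region the Laplacian is zero and the image is unchanged; at a grayscale discontinuity the subtraction overshoots on both sides, which is the sharpening effect described above.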
An iterative method is utilized to identify the segmentation threshold that yields the minimum probability of misjudgment to facilitate the conversion of the smoothed image into a suitable binary representation. The gray probability density function of the foreground pixels is denoted as $p\left(x\right)$, while that of the background pixels is denoted as $q\left(x\right)$. The proportion of foreground pixels in the overall number of pixels is ${\theta}_{1}$, while the proportion of background pixels is ${\theta}_{2}=1-{\theta}_{1}$.
Assuming the segmentation threshold is $T$, the probability of foreground pixels being misclassified as the background is given by
${E}_{\mathrm{1}}\left(T\right)={\theta}_{\mathrm{1}}{\int}_{T}^{+\mathrm{\infty}}p\left(x\right)\mathrm{d}x$(3)
while the probability of background pixels being misclassified as the foreground is denoted by
${E}_{2}\left(T\right)={\theta}_{2}{\int}_{-\mathrm{\infty}}^{T}q\left(x\right)\mathrm{d}x$(4)
The probability of incorrect segmentation caused by the threshold $T$ is represented as
$E\left(T\right)={E}_{\mathrm{1}}\left(T\right)+{E}_{\mathrm{2}}\left(T\right)$(5)
The minimum error rate is achieved when $E\left(T\right)$ attains its minimum value, at which point
$\frac{\partial E}{\partial T}={\theta}_{2}q\left(T\right)-{\theta}_{1}p\left(T\right)=0$(6)
This means that when ${\theta}_{\mathrm{2}}q\left(T\right)={\theta}_{\mathrm{1}}p\left(T\right)$, $E\left(T\right)$ is at its minimum value. Assuming that the grayscale values of both the foreground and background pixels in the image follow a normal distribution, their respective mean values and variances are denoted as ${\mu}_{\mathrm{1}}$,${\sigma}_{\mathrm{1}}^{\mathrm{2}}$,${\mu}_{\mathrm{2}}$,${\sigma}_{\mathrm{2}}^{\mathrm{2}}$. The following equation can be obtained
${\theta}_{2}{\mathrm{e}}^{-\frac{{\left({\mu}_{2}-T\right)}^{2}}{2{\sigma}_{2}^{2}}}={\theta}_{1}{\mathrm{e}}^{-\frac{{\left({\mu}_{1}-T\right)}^{2}}{2{\sigma}_{1}^{2}}}$(7)
For the sake of ease of computation, it is assumed that ${\sigma}_{\mathrm{1}}^{\mathrm{2}}={\sigma}_{\mathrm{2}}^{\mathrm{2}}={\sigma}^{\mathrm{2}}$, ${\theta}_{\mathrm{1}}={\theta}_{\mathrm{2}}=\frac{\mathrm{1}}{\mathrm{2}}$. According to Eq.(7), the optimal threshold value $T$ is
$T=\frac{{\mu}_{\mathrm{1}}+{\mu}_{\mathrm{2}}}{\mathrm{2}}$(8)
The process of iterative threshold segmentation is shown in Algorithm 1.
Algorithm 1 Iterative threshold segmentation  

Input: Grayscale image ${I}_{\mathrm{g}}$ Output: Binary image ${I}_{\mathrm{B}}$ 

1. By traversing the whole grayscale image ${I}_{\mathrm{g}}$, the maximum gray ${g}_{\mathrm{max}}$ and minimum gray ${g}_{\mathrm{min}}$ of the image are obtained.  
2. Set $k=0$, the initial threshold $T(0)=\frac{1}{2}\left({g}_{\mathrm{m}\mathrm{i}\mathrm{n}}+{g}_{\mathrm{m}\mathrm{a}\mathrm{x}}\right)$.  
3. The image is partitioned into two subsets using a threshold value $T(k)$: ${G}_{1}\left(k\right)$ comprises pixels with grayscale values greater than $T(k)$, while ${G}_{2}\left(k\right)$ comprises pixels with grayscale values less than or equal to $T(k)$.  
4. Calculate the mean value of all pixels in ${G}_{1}\left(k\right)$ and ${G}_{2}\left(k\right)$, respectively.  
${\mu}_{1}(k)=\frac{1}{\mathrm{c}\mathrm{a}\mathrm{r}\mathrm{d}\text{}({G}_{1}(k))}\sum _{(x,y)\in {G}_{1}(k)}\u200af(x,y)$  (9)  
${\mu}_{2}(k)=\frac{1}{\mathrm{c}\mathrm{a}\mathrm{r}\mathrm{d}\text{}({G}_{2}(k))}\sum _{(x,y)\in {G}_{2}(k)}\u200af(x,y)$  (10)  
5. Calculate the new threshold: $T(k+1)=\frac{1}{2}\left({\mu}_{1}(k)+{\mu}_{2}(k)\right)$.  
6. If $\left|T(k+1)-T(k)\right|<\epsilon $, then the final threshold is $T(k+1)$; otherwise go to step 3.  
7. Set the value of pixels in the image ${I}_{\mathrm{g}}$ whose grayscale value is greater than $T(k+1)$ to 1, and set the value of other pixels to 0, to obtain the binary image ${I}_{\mathrm{B}}$. 
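Algorithm 1 can be sketched as follows; the stopping tolerance `eps` is an assumption, since the paper leaves $\epsilon$ unspecified.

```python
import numpy as np

def iterative_threshold(gray, eps=0.5):
    """Iterative threshold segmentation (Algorithm 1): start from the
    mid-range value, iterate T(k+1) = (mu1(k) + mu2(k)) / 2 until the
    change is below eps, then binarise (pixels above the threshold
    become 1)."""
    gray = gray.astype(np.float64)
    t = 0.5 * (gray.min() + gray.max())           # step 2
    while True:
        g1 = gray[gray > t]                       # step 3: brighter subset
        g2 = gray[gray <= t]                      #         darker subset
        t_new = 0.5 * (g1.mean() + g2.mean())     # steps 4-5
        if abs(t_new - t) < eps:                  # step 6
            break
        t = t_new
    return (gray > t_new).astype(np.uint8)        # step 7
```

For a bimodal image the iteration converges in a few steps to the midpoint of the two class means, which matches the optimal threshold of Eq. (8) under the equal-variance, equal-prior assumption.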
1.3 Binary Image Projection
After binarizing the knitted fabric image, the grayscale projection curve is obtained through statistical analysis. In this study, the wale projection curve is computed based on the average grayscale value of each column pixel in the image
$F\left(x\right)=\frac{1}{N}{\displaystyle \sum _{y=1}^{N}}f\left(x,y\right)$(11)
where $N$ is the number of pixels in the image column, $f\left(x,y\right)$ is the grayscale value, and the value of $f\left(x,y\right)$ is 0 or 1 because the image has been binarized.
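Eq. (11) amounts to a column-wise mean of the binary image; a minimal numpy sketch (the function name is assumed):

```python
import numpy as np

def wale_projection(binary):
    """Eq. (11): average value of each column of a 0/1 binary image,
    giving one projection value per column. Peaks of this curve mark
    the bright, light-transmitting columns."""
    return binary.mean(axis=0)
```

The course projection curve used later for skew correction is the same operation along the other axis (`binary.mean(axis=1)`).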
In the wale projection curve of knitted fabric, there is a loop column or leg gap column between two adjacent wave peaks. Therefore, the leg gap columns can be eliminated by locating the leg columns according to the wave peak coordinates, and an image containing only the information of the loop columns can be obtained. Similarly, getting an image containing only the leg gap column information is possible. This is shown in Fig.5.
Fig.5 Loop information image 
1.4 Skewing Correction Algorithm
The skewing phenomenon is unavoidable during the manufacturing of weft-knitted fabrics, as demonstrated in Fig.6. The extent of tilt significantly impacts the segmentation of the wales of loops. To obtain precise measurements of knitted fabric density, it is imperative to pre-correct the fabric image for any tilt before calculating the coursewise density. To address loop skewing caused by manufacturing or tilt introduced during image acquisition, this study employs a projection-based approach commonly used for detecting skew in scanned documents.
Fig.6 Knitted fabric loop skewing 
Since the skew in knitted fabrics is a weft bias, it is only necessary to perform horizontal projections of the fabric image at different angles. The course projection curve ($\mathrm{CPC}$) of the image is calculated using Eq.(11). The differences between adjacent elements of the course projection curve form the $\mathrm{CPCd}$ data vector, obtained using Eq.(12).
$\mathrm{CPCd}=[\mathrm{CPC}\left(2\right)-\mathrm{CPC}\left(1\right),\mathrm{CPC}\left(3\right)-\mathrm{CPC}\left(2\right),\cdots ,\mathrm{CPC}\left(M\right)-\mathrm{CPC}(M-1)]$(12)
where $M$ is the number of pixel points in an image row.
As the image approaches a nearly horizontal alignment, a higher density of effective pixels will be present within a single row, and the amplitude and frequency of the $\mathrm{CPC}$ reach their maximum. At this point, the difference between the maximum and minimum values within the $\mathrm{CPC}$, as well as between adjacent values of the $\mathrm{CPCd}$ data vector, is maximized. This increase in $\mathrm{CPCd}$ can be quantified by the standard deviation ($\mathrm{SD}$) of the $\mathrm{CPCd}$ values. The mean value $\overline{\mathrm{CPCd}}$ and the $\mathrm{SD}$ of the $\mathrm{CPCd}$ are calculated using Eqs.(13) and (14), respectively.
$\overline{\mathrm{CPCd}}=\frac{1}{n}{\displaystyle \sum _{i=1}^{n}}\mathrm{CPCd}(i)$(13)
$\mathrm{SD}=\sqrt{\frac{1}{n}{\displaystyle \sum _{i=1}^{n}}{\left(\mathrm{CPCd}(i)-\overline{\mathrm{CPCd}}\right)}^{2}}$(14)
where $n$ refers to the number of elements in $\mathrm{CPCd}$.
By rotating the input image between $-15°$ and $15°$ in steps of $1°$ and calculating the $\mathrm{SD}$ for each angle, the angle that corrects the weft bias can be identified by comparing the $\mathrm{SD}$ values obtained at the different rotation angles. Figure 7 shows the $\mathrm{SD}$ values of a fabric image at different rotation angles. The skew of this fabric image is corrected at a rotation of 5°, where the $\mathrm{SD}$ value reaches its maximum.
Fig.7 $\mathbf{S}\mathbf{D}$ values for different rotation angles 
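The angle search above can be sketched as follows, assuming a binarized input image. `scipy.ndimage.rotate` stands in for the rotation step, and the cropping of empty corner regions (discussed below) is omitted for brevity.

```python
import numpy as np
from scipy.ndimage import rotate

def estimate_skew(binary, angles=range(-15, 16)):
    """Rotate the image through -15..15 degrees in 1-degree steps and
    return the angle maximising the standard deviation of the
    differenced course projection curve (Eqs. (12)-(14)). The
    returned angle is the correction to apply."""
    best_angle, best_sd = 0, -1.0
    for a in angles:
        rot = rotate(binary.astype(float), a, reshape=False, order=1)
        cpc = rot.mean(axis=1)   # course projection curve, Eq. (11)
        cpcd = np.diff(cpc)      # adjacent differences, Eq. (12)
        sd = cpcd.std()          # standard deviation, Eq. (14)
        if sd > best_sd:
            best_sd, best_angle = sd, a
    return best_angle
```

When the courses are horizontal, each projection row is almost entirely loop or entirely gap, so adjacent CPC values jump sharply and the SD of CPCd peaks, which is the criterion shown in Fig.7.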
After determining the skew angle, the image is rotated accordingly. As shown in Fig.8, the blue rectangle represents the position of the original image after rotation, while the green rectangle represents the size of the storage matrix of the rotated image. At this point, the image contains areas holding no fabric information, which need to be cropped to avoid interfering with the calculation results. The processing area of the cropped image is thus reduced (the orange rectangle in Fig.8).
Fig.8 Rotation and cropping 
The $\mathrm{CPC}$ recalculated after rotation and cropping reflects the structural information of the knitted fabric more accurately. The $\mathrm{CPC}$ of the image before and after correction is shown in Fig.9.
Fig.9 Image skewing correction 
From Fig.9, it can be seen that the $\mathrm{CPC}$ of the original image is messy: although it shows a certain periodicity, there are a large number of interfering extreme points, and the valley coordinates cannot pinpoint the courses. After skew correction, the $\mathrm{CPC}$ of the image has better periodicity, and the courses can be located after removing the impurity valley points. To facilitate the subsequent calculation, the $\mathrm{CPC}$ is inverted so that its valley points become peak points. Figure 10 displays four sets of fabric images after skew correction, with loops neatly arranged along the horizontal direction.
Fig.10 Skew correction results of some samples 
1.5 Wave Peak Coordinate Verification Algorithm
Due to the even spacing of knitted fabric loops in the transverse direction, the peak point coordinates representing the loop columns and loop leg gap columns are expected to be evenly distributed in this direction. However, the counting of peak point coordinates reveals the presence of impurity peak points on the projection curve, as shown in Fig.11(a). These impurity points do not accurately represent the positions of loops and loop leg gaps, and their inclusion in the analysis leads to significant errors in the results.
Fig.11 Preliminary smooth projection curve 
This paper employs a moving average filter as a preliminary approach to smooth the projection curve. The moving average filter computes the average value of adjacent points to determine the value of a specific point, and the number of neighboring points determines the width of the filtering window. Averaging can reduce the magnitude of points with significant fluctuations and, to some extent, filter out highfrequency noise. The mathematical formulation of the algorithm can be expressed as follows
${y}_{i}=\left\{\begin{array}{cc}\frac{1}{M}{\displaystyle \sum _{j=0}^{M-1}}{s}_{i+j},& \mathrm{if}\ 0\le i\le N-(M-1)\\ 0,& \mathrm{otherwise}\end{array}\right.$(15)
where ${s}_{i+j}$ is the input signal, ${y}_{i}$ is the output signal, $M$ is the window size, and $N$ is the total number of data points.
If the window size is too small, it may retain many impurities, while if it is too large, it may cause the loss of details, which can affect the determination of the loop position. In this paper, a window size of 7 was selected.
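A minimal sketch of the moving average of Eq. (15), using "valid"-mode convolution so the output is shorter than the input by the window size minus one (an implementation choice, not prescribed by the paper):

```python
import numpy as np

def moving_average(signal, window=7):
    """Smooth a projection curve with a length-`window` moving average
    (Eq. (15)); the paper selects window = 7. Each output point is the
    mean of `window` consecutive input points."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="valid")
```

A wider window suppresses more high-frequency noise but also blurs genuine loop peaks, which is the trade-off behind the choice of 7 above.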
There remain some impurity points in the preliminary smoothed curves, as shown in Fig.11(b). The LOESS algorithm has been commonly employed in previous literature to smooth projection curves. However, the focus of these studies has been exclusively on the density calculation of woven fabrics. Due to the dissimilarities between the structure of knitted and woven fabrics, the horizontal projection curve of knitted fabric contains significantly more impurities, making it arduous to select a suitable smoothing window using the LOESS method, consequently impacting the accuracy of the final density calculations. Given the abovementioned reasons, the wave peak coordinate verification algorithm introduced in this paper focuses solely on identifying and removing anomalous points without affecting the positions of other correct points. Firstly, the distances between adjacent wave peaks are calculated,
${L}_{n}=\left({x}_{2}-{x}_{1},\ {x}_{3}-{x}_{2},\cdots ,{x}_{n+1}-{x}_{n}\right)=\left({k}_{1},\ {k}_{2},\cdots ,{k}_{n}\right)$(16)
where $n$ is the number of peak spacings; ${L}_{n}$ is then rearranged in descending order.
The presence of impurity wave peaks introduces abnormally small X-coordinate distances into ${L}_{n}$. To screen the data in ${L}_{n}$, the values greater than or equal to the median are selected, and their average is calculated to determine the screening threshold
${T}_{H}=\frac{{L}_{\mathrm{1}}+{L}_{\mathrm{2}}+\cdots +{L}_{\lceil n/\mathrm{2}\rceil}}{\lceil n/\mathrm{2}\rceil}$(17)
Since both ends of an abnormal spacing correspond to peak points, it suffices to remove only one of them. To accomplish this, the y-values of the two wave crest points are identified and labeled ${y}_{a}$ and ${y}_{b}$, respectively; their magnitudes are compared, and the point with the larger y-value is retained. Similarly, the horizontal distance and y-value between the point at $x=1$ (or $x=N$) and its adjacent point are compared, and the point satisfying the conditions is recorded as a crest point.
The wave peaks remaining after impurity removal with the threshold ${T}_{H}$ are shown in Fig.11(c). The coordinates of these peaks, obtained through the wave peak coordinate verification algorithm proposed in this paper, represent the actual positions used to locate the rows or columns of the loops.
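The screening steps of Eqs. (16) and (17) can be sketched as below. Note that the text does not state the exact rule for comparing a spacing against ${T}_{H}$, so the `factor` parameter (and the function name) is an assumption of this sketch:

```python
import numpy as np

def verify_peaks(xs, ys, factor=0.5):
    """Remove spurious (impurity) peaks from a projection curve.

    xs, ys : x-coordinates and heights of detected wave peaks, xs sorted.
    factor : assumed fraction of the screening threshold T_H below which a
             peak spacing is treated as abnormal (the exact comparison
             rule is not given in the text).
    """
    xs = list(xs); ys = list(ys)
    spacings = np.diff(xs)                       # Eq. (16): L_n = (x2-x1, ...)
    n = len(spacings)
    top_half = sorted(spacings, reverse=True)[:(n + 1) // 2]  # values >= median
    T_H = float(np.mean(top_half))               # Eq. (17): screening threshold
    keep = [True] * len(xs)
    for i, k in enumerate(spacings):
        if k < factor * T_H:                     # abnormal spacing: peaks too close
            # of the two endpoint peaks, keep the higher one
            drop = i if ys[i] < ys[i + 1] else i + 1
            keep[drop] = False
    return [x for x, kept in zip(xs, keep) if kept]
```

For example, a low impurity peak squeezed between two genuine peaks produces two small spacings and is dropped, while the genuine peaks on either side survive.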
1.6 Density Calculation
The number of longitudinal loops ${N}_{v}$ and the number of transverse loops ${N}_{h}$ can be calculated according to the number of crests or the average distance between crests. The density calculation formula is
${\rho}_{v}=\frac{\mathrm{5}\times \mathrm{P}\mathrm{P}\mathrm{I}\times {N}_{v}}{\mathrm{2.54}\times N}$(18a)
${\rho}_{h}=\frac{\mathrm{5}\times \mathrm{P}\mathrm{P}\mathrm{I}\times {N}_{h}}{\mathrm{2.54}\times N}$(18b)
where ${\rho}_{v}$ is the course-wise density of the knitted fabric (wales/5 cm), ${\rho}_{h}$ is the wale-wise density (courses/5 cm), $N$ is the size of the captured image (pixels), and $\mathrm{PPI}$ is the spatial resolution (pixels/inch).
Since the acquired image corresponds to a fabric of 10 mm×10 mm in size, the density calculation formula can be simplified as
${\rho}_{v}={N}_{v}\times \frac{\mathrm{50}}{\mathrm{10}}=\mathrm{5}{N}_{v}$(19a)
${\rho}_{h}={N}_{h}\times \frac{\mathrm{50}}{\mathrm{10}}=\mathrm{5}{N}_{h}$(19b)
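A minimal sketch of the simplified density formula in Eq. (19), assuming the 10 mm × 10 mm imaged area (the function name and the `sample_mm` parameter are illustrative):

```python
def fabric_density(num_loops, sample_mm=10.0):
    """Scale a verified loop count over a sample_mm-wide image to loops
    per 5 cm (50 mm), per Eq. (19): rho = num_loops * 50 / sample_mm."""
    return num_loops * 50.0 / sample_mm
```

With the default 10 mm field this reduces to the factor of 5 in Eq. (19): 14 counted wales, for instance, give a course-wise density of 70 wales/5 cm.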
2 Experiment and Result Analysis
In this paper, we study the density measurement of knitted fabrics and use plain stitch for experimental verification. Multiple pieces of weft-knitted fabric were taken, and each piece was photographed and sampled using the image acquisition system in Section 1.1 to ensure that the loop shapes in the captured images were clearly distinguishable. The number of loops per unit length was counted manually with a cloth counter to provide a reference measurement. The captured images were subjected to density detection in MATLAB 2021a using the proposed method, and the results were compared with the manual measurements to assess the effectiveness of the method. The computer used for the experiments ran Microsoft Windows 10 with an Intel(R) Core(TM) i7 CPU and 16.00 GB of RAM.
2.1 Fabric Density Measurement
To illustrate the reliability of the proposed measurement method, the fabric densities of samples 1-30 obtained by the automatic process were compared with those obtained by the manual mode. The specific formula for the error is
$\mathrm{Er}=\frac{\left|\mathrm{AD}-\mathrm{MD}\right|}{\mathrm{MD}}\times 100\mathrm{\%}$(20)
where AD represents the density measured by the proposed automated method, and MD represents the density measured manually. $\mathrm{E}\mathrm{r}$ represents the percentage value of the relative error.
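Eq. (20) amounts to a one-line computation; a Python sketch (function name illustrative):

```python
def relative_error(ad, md):
    """Percentage relative error between the automated (AD) and manual (MD)
    density measurements, per Eq. (20)."""
    return abs(ad - md) / md * 100.0
```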
A weft-knitted fabric image collected in backlight mode yields, after threshold segmentation, a more prominent image of the loop structure, on which a statistical analysis in the image spatial domain is carried out to obtain the density of the weft-knitted fabric. According to the calculation method in Section 1.6, two sets of course-wise density data and four sets of wale-wise density data can be obtained. CD1 represents the course-wise density obtained from the positions of the wales, and CD2 refers to the course-wise density calculated from the average wale spacing. WD can be divided into two groups, a and b, according to whether loop columns or loop-leg gap columns are located. A total of four groups of wale-wise density can then be calculated from course positioning and average course spacing, respectively. Average1 in the wale-wise density represents the average of the four data sets after eliminating outliers, while Average2 is the mean of the wale-wise densities calculated from the average course spacing of the two groups. The median absolute deviation (MAD) is defined to determine outliers:
$\mathrm{MAD}=\mathrm{median}\left(\left|{x}_{i}-M\right|\right)$(21)
where ${x}_{i}$ is the $i$th value in the dataset and $M$ is the median of the dataset. Normal data generally fluctuate around the median; data beyond a certain range are considered outliers. The judgment formula is as follows:
$\left|\frac{{x}_{i}-M}{\mathrm{MAD}}\right|>3$(22)
Data that satisfy the above equation are judged as outliers, and data that do not satisfy the equation are considered normal.
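The MAD-based outlier screening of Eqs. (21) and (22) can be sketched as follows (function name illustrative):

```python
import numpy as np

def mad_outliers(values, cutoff=3.0):
    """Flag outliers via the median absolute deviation, Eqs. (21)-(22):
    a value x_i is an outlier when |x_i - M| / MAD > cutoff."""
    x = np.asarray(values, dtype=float)
    M = np.median(x)
    mad = np.median(np.abs(x - M))      # Eq. (21)
    return np.abs(x - M) / mad > cutoff  # Eq. (22): boolean mask of outliers
```

Values whose mask entry is True are excluded before the Average1 wale-wise density is computed.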
As shown in Table 1, the maximum error of density CD1, determined directly from the number of wales, is 1.41%, and the average error is 0.38%. The average error of CD2 is 1.04%, with a maximum error of 2.93%, and the density errors of some samples exceed 2%. Our method accounts for the recognition of leg columns in the edge regions of the image, and the edge recognition results are screened using the average spacing of the interior loops. Since the wales of knitted fabric seldom skew during production, and the tilt introduced by image acquisition is controllable, the loop wales can be accurately located. Therefore, we choose CD1 as the final course-wise density.
As shown in Table 2, the average error of Average1 is 1.25%, which is higher than the 0.83% average error of Average2. Because the wale-wise density is generally greater than the course-wise density, and because of skewing, half a course in the edge area after rotation and cropping may be located as a full course (the position marked by the bottom yellow line in Fig.12); this is the main source of loop positioning error. The maximum positioning error in a single row can amount to half a course spacing. In contrast, in course-wise density calculations, because one loop column has two leg columns, the maximum positioning error in a column of leg columns amounts to only one-quarter of a wale spacing. Therefore, we choose Average2, calculated from the average course spacing, as the final wale-wise density.
Fig.12 Marked courses 
The maximum course-wise density error measured by our method is 1.41%, and the average course-wise density error is 0.38%. The maximum wale-wise density error is 1.84%, and the average wale-wise density error is 0.83%. The measurement accuracy of the proposed method thus exceeds 98%. Each density measurement takes no more than 30 s, including image capture and computation, whereas the manual method requires more than ten minutes and is affected by the operator's condition. Therefore, our method can replace the manual measurement method to a certain extent. Because of the weaving process, knitted fabric is prone to deformation and its density is uneven; to reduce the influence of this factor on the measurement results, images of different areas of the fabric can be collected for multiple calculations and the results averaged.
Table 1 Comparison of course-wise density accuracy
Table 2 Comparison of wale-wise density accuracy
2.2 Comparison with Reflected Light Image Density Detection
We performed histogram equalization because some light-colored fabrics do not threshold-segment well under reflection conditions. In Fig.13, the yellow lines mark the loop positions determined by the presented algorithm in both the transmitted and reflected light images; by comparison, the positioning in the reflected light images is not accurate enough. Figure 14 shows the measurement results for the same fabric areas captured under reflection conditions using the proposed method, with average errors of 1.42% and 5.94% for course-wise and wale-wise density, respectively. Most of the errors are within 5%, but samples 5, 15, and 19 have a higher gloss under reflection owing to their material properties, resulting in wale-wise density errors exceeding 20%. Transmitted light effectively reduces the interference of this factor and thus yields more reliable measurement data. Images obtained in the backlight mode are therefore better suited to automated measurement of knitted fabric density than those obtained in reflection mode.
Fig.13 Loop identification comparison of transmitted light and reflected light 
Fig.14 Comparison of measured results of transmitted and reflected light 
2.3 Comparison with MultiScale Image Density Detection
In the experiment, measurements were taken on areas of the fabric with physical dimensions of 10 mm × 10 mm (Group 1), 5 mm × 5 mm (Group 2), and 3.33 mm × 3.33 mm (Group 3), as shown in Fig.15, and the resulting density data were compared. In Group 2, the average error in course-wise density is 3%, and the error in wale-wise density is 10.7%. In Group 3, the average error in course-wise density is 10.79%, while the wale-wise density error is notably higher at 66.84%. The results for Groups 2 and 3 were significantly worse than those for Group 1 and are unsuitable for replacing manual measurements in actual production. The experimental results indicate that the larger the measurement region and the more loops it contains, the smaller the impact of identification errors on loops in the edge region, and the smaller the error in the obtained fabric density values.
Fig.15 Comparison of images at different scales 
3 Conclusion
In this study, we present a new method for automatically measuring the density of knitted fabrics based on backlight illumination. An imaging system consisting of reflective and transmittance lamps positioned above and below the fabric captures reflective and transmittance images of the same region under different illumination conditions. The acquired transmittance images are processed through sequential steps of thresholding, loop skew correction, and projection to obtain the projection curve, which is further smoothed with a moving average filter. The proposed wave peak coordinate verification algorithm then determines the effective peaks accurately and thus yields the density of the knitted fabric. Evaluated against the manual mode, the proposed method produces results in close agreement with manual measurements. The method fully exploits the light transmission property of the fabric and delivers solid density measurement performance. In future work, we will continue to refine the approach to achieve automatic density measurement of patterned knitted fabrics.