Interlock
Wear (mechanical)
Abrasion resistance
Composite material
Materials science
Engineering
Structural engineering
Authors
Azita Asayesh, Fatemeh Mahmoodi
Source
Journal: International Journal of Clothing Science and Technology
[Emerald (MCB UP)]
Date: 2024-03-05
Volume/Issue: 36 (2): 287-303
Citations: 1
Identifier
DOI: 10.1108/ijcst-05-2023-0073
Abstract
Purpose
Pilling and abrasion resistance are two of the most important mechanical properties influencing the appearance and performance of a fabric, particularly in the case of knitted fabrics. Since these properties are affected by the fabric structure, the aim of the present research is to investigate how using miss stitches and tuck stitches in the fabric structure for design purposes influences the pilling and abrasion resistance of interlock weft-knitted fabrics.
Design/methodology/approach
In this research, interlock fabrics with different numbers of miss or tuck stitches on successive wales were produced, and the pilling performance and abrasion resistance of the fabrics were investigated.
Findings
The results revealed that increasing the number of miss/tuck stitches on successive wales decreases the abrasion resistance and increases the pilling tendency of the fabric. The presence of miss/tuck stitches on both sides of the fabric improves the abrasion resistance and pilling performance compared to fabrics containing these stitches on one side only. Furthermore, the resistance to abrasion and pilling is higher in fabrics containing miss stitches than in fabrics containing tuck stitches.
Originality/value
The use of tuck and miss stitches in designing weft-knitted fabrics is a common method for producing fabrics with a variety of knit patterns. Since the pilling and abrasion resistance of a fabric influence its appearance and performance, and no previous research has studied the pilling and abrasion resistance of interlock-knitted fabrics with respect to the presence of tuck and miss stitches on successive wales, this subject is surveyed in the present research.