Article

Multimodal Feature Inputs Enable Improved Automated Textile Identification

by Magken George Enow Gnoupa, Andy T. Augousti, Olga Duran, Olena Lanets and Solomiia Liaskovska
Department of Mechanical Engineering, Faculty of Engineering, Computing and the Environment, Kingston University, Roehampton Vale, London SW15 3DW, UK
* Author to whom correspondence should be addressed.
Textiles 2025, 5(3), 31; https://doi.org/10.3390/textiles5030031
Submission received: 3 June 2025 / Revised: 22 July 2025 / Accepted: 25 July 2025 / Published: 2 August 2025

Abstract

This study presents an advanced framework for fabric texture classification that integrates macro- and micro-texture extraction techniques with deep learning architectures. Co-occurrence histograms, local binary patterns (LBPs), and albedo-dependent feature maps were employed to comprehensively capture the surface properties of fabrics. A late fusion approach was applied using four state-of-the-art convolutional neural networks (CNNs): InceptionV3, ResNet50_V2, DenseNet, and VGG-19. The best-performing model, ResNet50_V2, achieved a precision of 0.929, a recall of 0.914, and an F1 score of 0.913. Notably, the integration of multimodal inputs allowed the models to effectively distinguish challenging fabric types, such as cotton–polyester and satin–silk pairs, which exhibit overlapping texture characteristics. This research not only enhances the accuracy of textile classification but also provides a robust methodology for material analysis, with significant implications for industrial applications in fashion, quality control, and robotics.
Keywords: fabric texture classification; deep learning; convolutional neural networks (CNNs); feature extraction; material recognition
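As a rough illustration of the pipeline summarised in the abstract, the sketch below extracts an LBP histogram (micro-texture) and a grey-level co-occurrence descriptor (macro-texture) from a single fabric image, then averages per-model class probabilities as a simple late-fusion rule. It uses NumPy and scikit-image; the descriptor parameters, the example class probabilities, and the averaging fusion rule are assumptions made for demonstration and do not reproduce the authors' implementation.

# Illustrative sketch (not the authors' code): texture descriptors plus
# a simple late-fusion rule over per-model class probabilities.
# Requires numpy and scikit-image >= 0.19.
import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix

def lbp_histogram(gray, points=8, radius=1):
    """Uniform LBP histogram capturing micro-texture of the fabric surface."""
    lbp = local_binary_pattern(gray, points, radius, method="uniform")
    bins = points + 2  # uniform patterns plus the "non-uniform" bin
    hist, _ = np.histogram(lbp, bins=bins, range=(0, bins), density=True)
    return hist

def cooccurrence_descriptor(gray, levels=32):
    """Normalised grey-level co-occurrence matrix capturing macro-texture."""
    q = (gray // (256 // levels)).astype(np.uint8)  # quantise to `levels` grey levels
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return glcm.mean(axis=(2, 3)).ravel()  # average over distance and angle

def late_fusion(prob_vectors):
    """Average the softmax outputs of several CNNs (one vector per model)."""
    return np.mean(np.stack(prob_vectors), axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gray = rng.integers(0, 256, size=(224, 224), dtype=np.uint8)  # stand-in fabric image

    features = np.concatenate([lbp_histogram(gray), cooccurrence_descriptor(gray)])
    print("texture feature vector length:", features.shape[0])

    # Hypothetical per-model class probabilities, e.g. over
    # (cotton, polyester, satin, silk), standing in for CNN softmax outputs.
    p_model_a = np.array([0.60, 0.25, 0.10, 0.05])
    p_model_b = np.array([0.55, 0.30, 0.10, 0.05])
    fused = late_fusion([p_model_a, p_model_b])
    print("fused class index:", int(fused.argmax()))

In the paper's setting, descriptors of this kind would feed the CNN branches (InceptionV3, ResNet50_V2, DenseNet, VGG-19) whose softmax outputs are then fused at a late stage; here the probability vectors simply stand in for those model outputs.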

Share and Cite

MDPI and ACS Style

Gnoupa, M.G.E.; Augousti, A.T.; Duran, O.; Lanets, O.; Liaskovska, S. Multimodal Feature Inputs Enable Improved Automated Textile Identification. Textiles 2025, 5, 31. https://doi.org/10.3390/textiles5030031

AMA Style

Gnoupa MGE, Augousti AT, Duran O, Lanets O, Liaskovska S. Multimodal Feature Inputs Enable Improved Automated Textile Identification. Textiles. 2025; 5(3):31. https://doi.org/10.3390/textiles5030031

Chicago/Turabian Style

Gnoupa, Magken George Enow, Andy T. Augousti, Olga Duran, Olena Lanets, and Solomiia Liaskovska. 2025. "Multimodal Feature Inputs Enable Improved Automated Textile Identification" Textiles 5, no. 3: 31. https://doi.org/10.3390/textiles5030031

APA Style

Gnoupa, M. G. E., Augousti, A. T., Duran, O., Lanets, O., & Liaskovska, S. (2025). Multimodal Feature Inputs Enable Improved Automated Textile Identification. Textiles, 5(3), 31. https://doi.org/10.3390/textiles5030031
