Predicting Fabric Appearance Through
Thread Scattering and Inversion
Mengqi (Mandy) Xia1,
Zhaoyang Zhang1,
Sumit Chaturvedi1,
Yutong Yi1,
Rundong Wu2, Holly Rushmeier1, Julie Dorsey1
1Yale University, 2Bytedance
ACM Transactions on Graphics (SIGGRAPH 2025)
Abstract
The fashion industry needs to preview fabric designs using the actual threads that will be used in production, ensuring that the designs it envisions can be physically realized. Unfortunately, today's fabric rendering relies on either hand-tuned parameters or parameters acquired from already-fabricated cloth. Furthermore, existing curve-based scattering models are not suitable for this problem: they are either not naturally differentiable due to discrete fiber-count parameters, or they require a more detailed geometry representation, introducing extra complexity.
In this work, we bridge this gap by presenting a novel pipeline that captures and digitizes physical threads and predicts the appearance of the fabric based on the weaving pattern. We develop a practical thread scattering model based on simulations of multiple fiber scattering within a thread. Using a cost-efficient multi-view setup, we capture threads of diverse colors and materials. We apply differentiable rendering to digitize threads, demonstrating that our model significantly improves the reconstruction accuracy compared to existing models, matching both reflection and transmission. We leverage a two-scale rendering technique to efficiently render woven cloth. We validate that our digital threads, combined with simulated woven yarn geometry, can accurately predict the fabric appearance by comparing to real samples. We show how our work can aid designs using diverse thread profiles, woven patterns, and textured design patterns.
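To give a rough sense of the differentiable-rendering inversion step described above, the sketch below fits the parameters of a toy one-lobe scattering function to synthetic multi-view measurements by gradient descent in PyTorch. The scattering function, parameter names, and data here are illustrative stand-ins, not the paper's thread model or capture setup.

```python
import torch

# A toy differentiable "thread reflectance": a diffuse term plus one
# specular lobe. This is NOT the paper's scattering model; it only
# illustrates fitting scattering parameters to measurements by
# differentiating through a renderer.
def render(albedo, shininess, cos_theta):
    return albedo * cos_theta + torch.exp(-shininess * (1.0 - cos_theta))

# Synthetic "captured" measurements from a ground-truth thread,
# standing in for the multi-view photos of a real thread.
angles = torch.linspace(0.05, 1.0, 32)
with torch.no_grad():
    photos = render(torch.tensor(0.55), torch.tensor(12.0), angles)

# Unknown parameters to recover, initialized away from the truth.
albedo = torch.tensor(0.2, requires_grad=True)
shininess = torch.tensor(5.0, requires_grad=True)
opt = torch.optim.Adam([albedo, shininess], lr=0.05)

for step in range(400):
    opt.zero_grad()
    loss = torch.mean((render(albedo, shininess, angles) - photos) ** 2)
    loss.backward()  # gradients flow through the renderer to the parameters
    opt.step()

print(f"recovered albedo={albedo.item():.3f}, shininess={shininess.item():.2f}")
```

The actual pipeline replaces this toy lobe with the paper's fiber-simulation-based thread model and the analytic measurements with photographs from the multi-view capture setup, but the optimization structure is the same: a differentiable forward renderer whose parameters are updated to minimize the difference against captured images.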
Gallery
Acknowledgement
We would like to thank Zhenyuan (Desmond) Liu for providing the embroidery example, and Xiaoji (Joy) Zhang for guidance on the simulation software. This work is supported by research grants from Tata Sons Private Limited, Tata Consultancy Services Limited, and Titan.