Predicting Fabric Appearance Through
Thread Scattering and Inversion

Yale University, Bytedance

ACM Transactions on Graphics (SIGGRAPH 2025)




Abstract

The fashion industry needs to preview fabric designs using the actual threads designers intend to use, ensuring that the envisioned designs can be physically realized. Unfortunately, today's fabric rendering relies on either hand-tuned parameters or parameters acquired from already-fabricated cloth. Furthermore, existing curve-based scattering models are unsuitable for this problem: they are either not naturally differentiable, due to discrete fiber-count parameters, or they require a more detailed geometry representation, introducing extra complexity.

In this work, we bridge this gap with a novel pipeline that captures and digitizes physical threads and predicts the appearance of fabric woven from them according to a given weaving pattern. We develop a practical thread scattering model based on simulations of multiple-fiber scattering within a thread. Using a cost-efficient multi-view setup, we capture threads of diverse colors and materials. We apply differentiable rendering to digitize threads, demonstrating that our model significantly improves reconstruction accuracy compared to existing models, matching both reflection and transmission. We leverage a two-scale rendering technique to efficiently render woven cloth. We validate that our digital threads, combined with simulated woven-yarn geometry, accurately predict fabric appearance by comparing against real samples. Finally, we show how our work can aid design with diverse thread profiles, weave patterns, and textured design patterns.
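To give a rough sense of the inverse-rendering step described above, the toy sketch below fits two parameters of a hypothetical analytic thread-reflectance model to synthetic "captured" measurements by gradient descent on an L2 loss. The `render` function, its parameters `a` and `s`, and the angular samples are all illustrative stand-ins, not the paper's actual scattering model or capture data.

```python
import numpy as np

# Hypothetical differentiable "renderer": predicted radiance as a function
# of two thread parameters (diffuse albedo a, specular weight s).
def render(a, s, cos_theta):
    return a * cos_theta + s * cos_theta**8

# Synthetic "captured" measurements at several viewing angles,
# generated from ground-truth parameters (0.6, 0.3).
cos_theta = np.linspace(0.2, 1.0, 20)
target = render(0.6, 0.3, cos_theta)

# Inverse rendering: recover (a, s) by gradient descent on an L2 loss.
a, s, lr = 0.1, 0.1, 0.5
for _ in range(500):
    resid = render(a, s, cos_theta) - target
    a -= lr * np.mean(2 * resid * cos_theta)     # analytic dL/da
    s -= lr * np.mean(2 * resid * cos_theta**8)  # analytic dL/ds

print(a, s)  # converges toward the ground truth (0.6, 0.3)
```

In the actual pipeline the forward model is far richer (multiple fiber scattering, reflection and transmission), but the optimization structure is the same: a differentiable forward model plus gradient-based fitting to captured images.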


Gallery

This figure shows how our pipeline can aid design by previewing skirts woven from various patterns and thread colors. The second row demonstrates that by arranging four differently colored threads in various ways, we can achieve a similar overall color but with distinct highlights.
Our model can also preview cloth with textured patterns. We leverage differentiable rendering to determine the absorption coefficients that generate a desired pattern. This figure showcases a traditional Chinese wedding dress rendered with this method.
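As a single-pixel illustration of recovering absorption coefficients from a desired color (a hypothetical simplification, not the paper's full differentiable-rendering setup): under Beer-Lambert attenuation, transmittance through thickness `d` is `T = exp(-sigma * d)`, so matching a target transmitted color even has a closed form. The thickness and target values below are arbitrary assumptions.

```python
import numpy as np

d = 0.1                                  # assumed thread thickness (arbitrary units)
target_rgb = np.array([0.8, 0.2, 0.3])   # desired transmitted color

# Beer-Lambert forward model: T = exp(-sigma * d).
# Inverting it per channel gives the absorption coefficients directly.
sigma = -np.log(target_rgb) / d

# Sanity check: the forward model reproduces the target color.
print(np.exp(-sigma * d))
```

For spatially varying patterns this becomes a per-texel optimization driven by rendered-image gradients rather than a closed form, but the underlying quantity being solved for is the same per-channel absorption coefficient.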




Acknowledgements

We would like to thank Zhenyuan (Desmond) Liu for providing the embroidery example, and Xiaoji (Joy) Zhang for guidance on the simulation software. This work is supported by research grants from Tata Sons Private Limited, Tata Consultancy Services Limited, and Titan.