
A multi-scale yarn appearance model with fiber details

Khattar, Apoorv
Zhu, Junqiu
Aubry, Jean-Marie
Padovani, Emiliano
Droske, Marc
Yan, Ling-Qi
Montazeri, Zahra
Abstract
Rendering cloth realistically has always been a challenge due to its intricate structure. Cloth is made up of fibers, plies, and yarns, and previous curve-based models, while detailed, were computationally expensive and inflexible for large pieces of cloth. To address this, we propose a simplified approach. We introduce a geometric aggregation technique that reduces ray-tracing computation by using fewer curves, focusing only on yarn curves. Our model generates ply and fiber shapes implicitly, compensating for the lack of explicit geometry with a novel shadowing component. We also present a shading model that simplifies light interactions between fibers by categorizing them into four components, accurately capturing specular and scattered light in both forward and backward directions. To render large pieces of cloth efficiently, we propose a multi-scale solution based on pixel coverage. For near-field views, our yarn shading model renders 3–5 times faster and uses less memory than fiber-based models. Additionally, our multi-scale solution offers a 20% speed boost for distant cloth observation.
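A multi-scale selection driven by pixel coverage could be sketched as below. This is an illustrative assumption only: the thresholds, the pinhole projection, and the function and level names are made up for the sketch and are not the paper's actual criterion.

```python
def choose_model(yarn_radius_world: float, distance: float, focal_px: float) -> str:
    """Pick a shading level of detail from the yarn's projected width in pixels.

    Illustrative sketch: thresholds and the simple pinhole camera model
    are assumptions, not taken from the paper.
    """
    # Projected yarn diameter in pixels under a pinhole camera:
    # world diameter scaled by focal length over distance.
    coverage_px = (2.0 * yarn_radius_world / distance) * focal_px
    if coverage_px > 4.0:
        return "fiber"        # near field: explicit fiber detail is visible
    elif coverage_px > 0.5:
        return "yarn"         # mid field: aggregated yarn shading suffices
    return "prefiltered"      # far field: coarser aggregate per pixel
```

For example, a 1 mm-radius yarn seen from 10 cm with a 1000 px focal length covers about 20 px and would use the fiber-level model, while the same yarn at 10 m covers 0.2 px and would fall back to the coarsest level.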
Source
Computational Visual Media
Keywords
46 Information and Computing Sciences, 4603 Computer Vision and Multimedia Computation, 4607 Graphics, Augmented Reality and Games, 4611 Machine Learning
Publisher
IEEE