Strands2Cards: Automatic Generation of Hair Cards from Strands
Tojo, Kenji ; Hu, Liwen ; Umetani, Nobuyuki ; Li, Hao
Department
Computer Vision
Type
Conference proceeding
Date
2025
Language
English
Abstract
We present a method for automatically converting strand-based hair models into an efficient mesh-based representation, known as hair cards, for real-time rendering. Our method takes strands as input and outputs polygon strips with semi-transparent textures that preserve the appearance of the original strand-based hairstyle. To achieve this, we first cluster strands into groups, referred to as wisps, and generate hairstyle-preserving texture maps for each wisp via skinning-based alignment of the strands into a normalized pose in UV space. These textures can further be shared among similar wisps to better utilize the limited texture resolution. Next, polygon strips are fitted to the clustered strands via tailored differentiable rendering that optimizes transparent, cluster-colored coverage masks. The proposed method handles a wide range of hair models and outperforms existing approaches in representing volumetric hairstyles such as curly and wavy ones. Furthermore, our strip optimization can convert a full hair model with more than 100,000 strands within 20 seconds. Our method was extensively tested on both a hair database and many complex real-world hairstyles acquired using state-of-the-art hair capture methods.
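The abstract describes a pipeline whose first stage groups strands into wisps. The paper's actual clustering criterion is not specified here, so the sketch below is only a toy illustration of the general idea: each strand is resampled to a fixed number of points so that strands of different lengths become comparable feature vectors, then grouped with plain k-means (Lloyd's algorithm with farthest-point initialization). The function name and all parameters are hypothetical, not from the paper.

```python
import numpy as np

def cluster_strands_into_wisps(strands, n_wisps=8, n_samples=16, iters=20):
    """Group hair strands into wisps by k-means over resampled strand points.

    Toy illustration only -- the paper's clustering method is not described
    in the abstract. Each strand is an (m, 3) array of points along its length.
    Returns an integer wisp label per strand.
    """
    # Resample every strand to n_samples points so strands of different
    # lengths map to feature vectors of equal dimension (n_samples * 3).
    feats = []
    for s in strands:
        t = np.linspace(0.0, 1.0, n_samples)
        u = np.linspace(0.0, 1.0, len(s))
        resampled = np.stack([np.interp(t, u, s[:, d]) for d in range(3)], axis=1)
        feats.append(resampled.ravel())
    feats = np.asarray(feats)

    # Farthest-point initialization: deterministic and robust when the
    # strand groups are well separated in feature space.
    centers = [feats[0]]
    for _ in range(n_wisps - 1):
        d = np.min([np.linalg.norm(feats - c, axis=1) for c in centers], axis=0)
        centers.append(feats[d.argmax()])
    centers = np.array(centers)

    # Lloyd's algorithm: alternate nearest-center assignment and mean update.
    for _ in range(iters):
        dists = np.linalg.norm(feats[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_wisps):
            if np.any(labels == k):
                centers[k] = feats[labels == k].mean(axis=0)
    return labels
```

In the full method, each resulting wisp would then receive a texture map (shareable among similar wisps) and a fitted polygon strip; this sketch covers only the grouping step.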
Citation
K. Tojo, L. Hu, N. Umetani, and H. Li, “Strands2Cards: Automatic Generation of Hair Cards from Strands,” Proceedings of the SIGGRAPH Asia 2025 Conference Papers, pp. 1–11, Dec. 2025, doi: 10.1145/3757377.3763864
Source
Proceedings of the SIGGRAPH Asia 2025 Conference Papers
Conference
SIGGRAPH Asia 2025 Conference
Publisher
Association for Computing Machinery
