Scanning and animating characters dressed in multiple-layer garments

Pengpeng Hu, Taku Komura, Daniel Holden, Yueqi Zhong

Research output: Contribution to journal › Article › peer-review

Abstract

Despite the development of user-friendly interfaces for modeling garments and putting them onto characters, preparing a character dressed in multiple layers of garments can be very time-consuming and tedious. In this paper, we propose a novel scanning-based solution for modeling and animating characters wearing multiple layers of clothes, achieved by making use of real clothes and human bodies. We first scan the naked body of a subject with an RGBD camera and fit a statistical body model to the scanned data, which yields a skinned articulated model of the subject. The subject is then asked to put on one piece of garment after another, and the articulated body model, dressed with the garments added in the previous steps, is fit to the newly scanned data. The new garment is segmented in a semi-automatic fashion and added as an additional layer to the multi-layer garment model. During runtime, the skinned character is controlled by motion capture data, and the multi-layer garment model is driven by blending the movements computed by physical simulation and linear blend skinning, so that the cloth preserves its shape while exhibiting realistic physical motion. We present results in which the character wears multiple layers of garments, including a shirt, a coat, and a skirt. Our framework can be useful for preparing and animating dressed characters for computer games and films.
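As an illustrative sketch only (not the authors' implementation, which is not published with the abstract), the runtime step of combining linear blend skinning with physically simulated cloth positions might look as follows. The function names, the NumPy-based vertex/transform representation, and the single blending weight alpha are assumptions made for clarity; the paper's actual blending scheme may differ.

    import numpy as np

    def linear_blend_skinning(rest_verts, bone_transforms, weights):
        # Deform rest-pose vertices by a weighted sum of bone transforms.
        # rest_verts:      (V, 3) rest-pose vertex positions
        # bone_transforms: (B, 4, 4) current bone transforms relative to the rest pose
        # weights:         (V, B) skinning weights; each row sums to 1
        V = rest_verts.shape[0]
        homo = np.hstack([rest_verts, np.ones((V, 1))])              # (V, 4) homogeneous coords
        per_bone = np.einsum('bij,vj->bvi', bone_transforms, homo)   # each bone applied to every vertex
        blended = np.einsum('vb,bvi->vi', weights, per_bone)         # weighted sum over bones
        return blended[:, :3]

    def blend_cloth(lbs_verts, sim_verts, alpha):
        # Blend the shape-preserving skinned garment with the physically
        # simulated garment. alpha = 1 follows the simulation fully;
        # alpha = 0 keeps the skinned result. (alpha is a hypothetical
        # per-frame or per-garment weight, not taken from the paper.)
        return alpha * sim_verts + (1.0 - alpha) * lbs_verts

In this reading, the skinned positions keep each garment layer attached to the articulated body, while the simulated positions contribute secondary motion such as swinging and wrinkling; the blend trades one off against the other.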
Original language: English
Pages (from-to): 961-969
Number of pages: 9
Journal: The Visual Computer
Volume: 33
Publication status: Published - 9 May 2017
Externally published: Yes

Keywords

  • Cloth animation
  • 3D scanning
  • Dressed character
