RETARGETING MOTION OF CLOTHING
TO NEW CHARACTERS
Yu Lee and Moon-Ryul Jung*
Department of Media Technology, Graduate School of Media, Sogang University, Seoul, Korea
* Corresponding author (moon@sogang.ac.kr)
Keywords: Retargeting, Cloth, Deformation, Body-oriented Cloth, Collision Detection.
Abstract: We show how to transfer the motion of cloth from a source to a target body. The obvious method is to add
the displacements of the source cloth, calculated from its position in the initial frame, to the target cloth;
but this can result in penetration of the target body, because its shape differs from that of the source body.
To overcome this problem, we compute an approximate source cloth motion, which maintains the initial spatial relationship
between the cloth and the source body; then we obtain a detailed set of correction vectors for each frame,
which relate the exact cloth to this approximation. We then compute the approximate target cloth motion in
the same way as the source cloth motion; and finally we apply the detail vectors that we generated for the
source cloth to the approximate target cloth, thus avoiding penetration of the target body. We demonstrate
the retargeting of cloth using figures engaged in dance movements.
1 INTRODUCTION
Clothing simulation is an area in which physically
based techniques pay off, and quite a lot of work
along these lines has been reported (Terzopoulos and
Fleischer, 1988, Breen et al, 1994, Carignan et al,
1992, Volino and Thalmann, 1995, Baraff and
Witkin, 1998, Zhang and Yuen, 2000). Tools
derived from physically based techniques are
available in commercial modeling and animation
packages, such as Maya and Syflex. These tools are
a great help to animators, but physically realistic
cloth simulation remains difficult and time-consuming,
even using commercial packages. So, it
would be useful to be able to capture cloth motion in
the same way as we capture human body motion
(Pritchard and Heidrich, 2003, White et al, 2007). At
the moment, only a few systems for capturing cloth
motion exist, but more are likely to appear in the
future.
Assuming that capturing cloth motion will be feasible
in the near future, we want to devise a method of
reusing, or more precisely retargeting, a cloth
motion from one body to another.
Although a cloth motion obtained this way will
be less physically realistic than the original, in most
cases all that is required is plausible cloth behavior,
not strict physical accuracy.
Motion retargeting techniques have been
developed for the body (Gleicher, 1998, Lee and
Shin, 1999, Choi and Ko, 2000) and the face (Noh
and Neumann, 2001, Pyun and Shin, 2003, Na and
Jung, 2004), but there is no published work on
retargeting cloth motions to new characters.
The basic method of retargeting the motion of
one character to another is to transform the source
motion so that the resulting motion satisfies
constraints imposed by the new character, while
preserving the characteristics of the source motion as
far as possible. To retarget a cloth motion we need to
consider the source body motion, the source cloth
motion, the target body motion, and the target cloth
configuration at the initial frame, as shown in Figure
1. This is a more constrained problem than that of
retargeting facial motions, where only the source
facial motion and the configuration of the target face
at the initial frame are used in a typical
implementation.
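
To make these four inputs concrete, here is a minimal sketch of how they might be collected (the names, the NumPy array layout, and the per-frame vertex-array representation of the meshes are our own assumptions, not the paper's):

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class RetargetInputs:
        # Inputs to cloth-motion retargeting over F frames.
        # Each motion is stored as an (F, V, 3) array of vertex positions.
        source_body_motion: np.ndarray   # (F, Vb, 3) source body vertices per frame
        source_cloth_motion: np.ndarray  # (F, Vc, 3) source cloth vertices per frame
        target_body_motion: np.ndarray   # (F, Vb, 3) target body vertices per frame
        target_cloth_init: np.ndarray    # (Vc, 3) target cloth at the initial frame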
The simplest way to obtain a target cloth motion
is to compute the displacement of the source cloth at
each frame from the initial frame, and add that
displacement to the position of the target cloth at the
initial frame, as in the sketch below.
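
A minimal sketch of this naive displacement transfer, assuming the source and target cloth share the same mesh topology (the function name and array shapes are our own illustration):

    import numpy as np

    def naive_transfer(source_cloth_motion, target_cloth_init):
        # source_cloth_motion: (F, V, 3) source cloth vertex positions over F frames
        # target_cloth_init:   (V, 3) target cloth at the initial frame
        # Per-vertex displacement of the source cloth from its initial frame ...
        displacement = source_cloth_motion - source_cloth_motion[0]
        # ... added verbatim to the target cloth's initial pose, frame by frame.
        return target_cloth_init[None, :, :] + displacement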
But when we see the resulting target cloth motion
together with the target body motion, we find that
the cloth penetrates the body. That is