Performance-based Control Interface for Character Animation



Most game interfaces today are largely symbolic, translating simplified input such as keystrokes into the choreography of full-body character movement. In this paper, we describe a system that directly uses human motion performance to provide a radically different, and much more expressive, interface for controlling virtual characters. Our system takes a data feed from a motion capture system as input and, in real time, translates the performance into corresponding actions in a virtual world. The difficulty with such an approach arises from the need to manage the discrepancy between the real and virtual worlds, leading to two important subproblems: 1) recognizing the user's intention, and 2) simulating the appropriate action based on that intention and the virtual context. We solve this issue by first enabling the virtual world's designer to specify possible activities in terms of prominent features of the world, along with associated motion clips depicting interactions. We then integrate the pre-recorded motions with online performance and dynamic simulation to synthesize seamless interaction of the virtual character in a simulated virtual world. The result is a flexible interface through which a user can make freeform control choices while the resulting character motion maintains both physical realism and the user's personal style.
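To illustrate the idea of integrating a pre-recorded motion clip with the user's online performance, here is a minimal sketch (not the paper's actual implementation): poses are treated as plain joint-angle vectors, and a blend weight `w` (a hypothetical parameter) ramps from the user's live pose toward the stored interaction clip as the character engages a feature of the virtual world.

```python
# Hypothetical sketch of pose blending between an online mocap performance
# and a pre-recorded interaction clip. Poses are flat lists of joint angles;
# w is a blend weight chosen by the controller (0 = pure performance,
# 1 = pure pre-recorded clip).

def blend_pose(performance_pose, clip_pose, w):
    """Per-joint linear blend of two poses of equal length."""
    assert 0.0 <= w <= 1.0
    assert len(performance_pose) == len(clip_pose)
    return [(1.0 - w) * p + w * c
            for p, c in zip(performance_pose, clip_pose)]

# Example: halfway into an interaction, each joint is the average of the
# live performance and the stored clip.
user = [0.0, 0.5, 1.0]   # online performance (joint angles, radians)
clip = [1.0, 0.5, 0.0]   # pre-recorded interaction clip
print(blend_pose(user, clip, 0.5))  # → [0.5, 0.5, 0.5]
```

In the actual system this blending is combined with dynamic simulation so that the result stays physically plausible; the sketch only shows the kinematic-blend step.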


Videos

Adventure (19.5M)

Monkeybar (15.7M)

Trampoline (8.3M)

Rope (11.2M)

Rock climbing (4.8M)

Swimming (4.2M)

Spidy (6.8M)



BibTeX

@article{Ishigaki:2009,
    author = "Satoru Ishigaki and Timothy White and Victor B. Zordan and C. Karen Liu",
    title = "Performance-based Control Interface for Character Animation",
    journal = "ACM Transactions on Graphics (SIGGRAPH)",
    year = 2009,
    volume = 28,
    number = 3,
}

Project Members

Satoru Ishigaki

Timothy White

Victor B. Zordan

C. Karen Liu