
Tangible and Modular Input Device for Character Articulation

Alec Jacobson¹, Daniele Panozzo¹, Oliver Glauser¹, Cédric Pradalier², Otmar Hilliges¹, Olga Sorkine-Hornung¹

¹ ETH Zurich    ² GeorgiaTech Lorraine

Figure 1: Assembled from modular, interchangeable, and hot-pluggable parts (left), our novel device forms a skeletal tree matching the Elephant. As the user manipulates each joint of the device, measured bone rotations animate a skeletal rig, and the Elephant comes to life.

Interactively articulating virtual 3D characters lies at the heart of computer animation and geometric modeling. Expressive articulation requires control over many degrees of freedom: most often the joint angles of an internal skeleton. We introduce a physical input device assembled on the fly to control any character’s skeleton directly. With traditional mouse and keyboard input, animators must rely on indirect methods such as inverse kinematics, or decompose complex and integrated motions into smaller sequential manipulations, for example iteratively positioning each bone of a skeleton hierarchy. While direct manipulation mouse and touch interfaces are successful in 2D [Shneiderman 1997], 3D interactions with 2D input are ill-posed and thus more challenging. Successful commercial products with 2D interfaces, e.g. Autodesk’s MAYA, have notoriously steep learning curves and require interface-specific training.

Mouse and keyboard interfaces fall short because their control spaces do not match the perceptual space of the 3D interaction task [Jacob et al. 1994]. Hence, we propose direct physical manipulation via a tangible interface [Ishii and Ullmer 1997] with degrees of freedom matching the 3D rotations at skeletal joints in the virtual character.

Our novel device is composed of modular, hot-pluggable mechanical parts. The user may quickly assemble measurement joints and branching splitters to create a custom device to control any virtual character with a skeleton of arbitrary topology (see Figure 1). Leveraging modern advances in 3D printing, our parts are compact and comfortably held with one or two hands.
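One way to picture an assembly is as a tree of parts mirroring the character’s skeleton: measurement joints drive bones, and splitters introduce branchings. The following is a minimal, hypothetical sketch of such a mapping; the part kinds, bone names, and Python representation are illustrative assumptions, not taken from the released code.

```python
# Hypothetical sketch: an assembled device as a tree of measurement joints and
# splitters mirroring a character's skeleton. Names are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DevicePart:
    kind: str                      # "joint" (measures a rotation) or "splitter" (branches)
    bone: Optional[str] = None     # skeleton bone this part drives, if any
    children: List["DevicePart"] = field(default_factory=list)

def bones_driven(part: DevicePart) -> List[str]:
    """Collect the skeleton bones controlled by the assembled device (depth-first)."""
    found = [part.bone] if part.kind == "joint" and part.bone else []
    for child in part.children:
        found += bones_driven(child)
    return found

# Example: a tiny elephant-like assembly, a spine joint feeding a splitter that
# branches into a two-joint trunk chain and two ear joints (made-up bone names).
device = DevicePart("joint", "spine", [
    DevicePart("splitter", None, [
        DevicePart("joint", "trunk_1", [DevicePart("joint", "trunk_2")]),
        DevicePart("joint", "ear_left"),
        DevicePart("joint", "ear_right"),
    ])
])

print(bones_driven(device))  # ['spine', 'trunk_1', 'trunk_2', 'ear_left', 'ear_right']
```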

Technical Contributions. Exploiting human proprioception and physical affordances, an assembled device allows interaction with a physical manifestation of the virtual character, without the need for a literal, fixed replication. Pairs of permanent magnets and Hall-effect sensors embedded in each joint measure rotations with an accuracy of ∼1° at a frequency of up to 250 Hz (more details in the corresponding Technical Paper submission). The device is well suited not only for rapid prototyping, but also for precise control tasks such as meticulous keyframe posing and real-time animation capture. Complementary to the physical device, we introduce algorithms to facilitate the device’s employment in standard character rigging and animation pipelines. A novel semi-automatic registration algorithm accounts for the disparity between the device’s physical proportions and the virtual character’s. The user may quickly match the character’s rest pose and immediately begin animating (see accompanying video).
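Concretely, once a registration step has provided per-bone rest offsets, each measured joint rotation can be propagated down the skeleton hierarchy by ordinary forward kinematics. The sketch below is a simplified illustration rather than the actual implementation: it assumes rotations arrive as 3×3 matrices, and the skeleton, offsets, and sensor readings are made up for the example.

```python
# Hypothetical sketch: posing a skeletal rig from per-joint rotations reported by
# the device. Each bone's world transform composes its parent's transform, a
# rest offset (which a registration step would supply), and the measured rotation.
import numpy as np

def rot_z(deg):
    """3x3 rotation about the z-axis (stand-in for a measured 3D joint rotation)."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# bone -> (parent bone, rest offset of the bone origin in the parent's frame)
skeleton = {
    "spine":   (None,      np.array([0.0, 0.0, 0.0])),
    "trunk_1": ("spine",   np.array([0.0, 1.0, 0.0])),
    "trunk_2": ("trunk_1", np.array([0.0, 1.0, 0.0])),
}

def pose(skeleton, measured):
    """Forward kinematics: world-space rotation and origin per bone."""
    world = {}
    for bone, (parent, offset) in skeleton.items():   # assumes parents precede children
        p_rot, p_org = world[parent] if parent else (np.eye(3), np.zeros(3))
        org = p_org + p_rot @ offset                  # place bone at its parent-relative offset
        rot = p_rot @ measured[bone]                  # accumulate rotations down the chain
        world[bone] = (rot, org)
    return world

# Synthetic sensor readings bend the trunk joint by joint.
readings = {"spine": rot_z(0), "trunk_1": rot_z(30), "trunk_2": rot_z(30)}
for bone, (rot, org) in pose(skeleton, readings).items():
    print(bone, np.round(org, 3))
```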

Demo. Visitors may try our device in various 3D character articulation scenarios. Choosing from a catalog of virtual characters, visitors assemble a corresponding device from our kit of over 30 parts. Our system attaches the device to the character’s virtual skeleton. Then the visitor manipulates the device while seeing the character animate on screen in real time. We also prepare more advanced demos where visitors use the device to control physically based simulations and explore variational geometric modeling techniques.
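Behind the demo, the system effectively runs a sense-update-render loop at roughly the sensor rate. The fragment below is a hypothetical sketch of such a loop; read_joint_rotations and update_rig are placeholders for the actual device driver and renderer, which this abstract does not specify.

```python
# Hypothetical sketch of the real-time update loop: poll the device at up to
# ~250 Hz, convert readings to joint rotations, and repose the rig each tick.
import time

def animation_loop(read_joint_rotations, update_rig, hz=250):
    period = 1.0 / hz
    while True:
        start = time.perf_counter()
        rotations = read_joint_rotations()   # one rotation per measurement joint
        update_rig(rotations)                # apply via forward kinematics and redraw
        # Sleep off any remaining time so the loop runs at roughly the target rate.
        time.sleep(max(0.0, period - (time.perf_counter() - start)))
```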

Our mechanical parts rely on internal electronic sensors rather than computer vision. Thus, the unpredictable lighting of the Emerging Technologies hall will emphasize that our device is suitable not only as a desktop tool at artist workstations, but also as a performance instrument in arbitrary environments.

Our input device intensifies immersion and tangibility in the context of posing, designing and animating deformable 3D shapes. As displays make leaping advances toward convincing autostereoscopy and 3D printing becomes commonplace, we see potentially large impact from tangible input devices for virtual 3D content. To this end, we release complete hardware blueprints (OpenHardware) and accompanying source code in the hopes of fostering future exploration in this direction. We will encourage visitors to download our blueprints and construct their own input devices at home.

References

ISHII, H., AND ULLMER, B. 1997. Tangible bits: Towards seamless interfaces between people, bits and atoms. In Proc. CHI.

JACOB, R. J. K., SIBERT, L. E., MCFARLANE, D. C., AND MULLEN, JR., M. P. 1994. Integrality and separability of input devices. ACM Trans. Comput.-Hum. Interact. 1, 1 (Mar.), 3–26.

SHNEIDERMAN, B. 1997. Direct manipulation for comprehensible, predictable and controllable user interfaces. In Proc. IUI, 33–39.