
Gesture Control for ALIVE, SURVIVE

 

Figure 3: (a) Chris Wren playing with Bruce Blumberg's virtual dog in the ALIVE space, and (b) playing SURVIVE. (c) Real-time reading of American Sign Language (with Thad Starner doing the signing). (d) Trevor Darrell demonstrating vision-driven avatars.

In many applications it is desirable to have an interface that is controlled by gesture rather than by a keyboard or mouse. One such application is the Artificial Life IVE (ALIVE) system [9]. ALIVE uses Pfinder's support map polygon to define alpha values for video compositing, placing the user in a scene with artificial life forms in real time. The artificial life forms use Pfinder's gesture tags and feature positions to decide how to interact with the user, as illustrated in Fig. 3(a).
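As a concrete illustration, the compositing step amounts to using the per-pixel support map as an alpha matte. The following sketch assumes NumPy arrays and reduces the support map to a binary user/background mask; the function name and that simplification are illustrative assumptions, not the ALIVE implementation.

    import numpy as np

    def composite_user_into_scene(camera_frame, scene_frame, support_map):
        # camera_frame, scene_frame: HxWx3 uint8 images.
        # support_map: HxW array, nonzero where the tracker labels a
        # pixel as belonging to the user (a simplification; Pfinder's
        # support map actually assigns per-pixel class memberships).
        alpha = (support_map > 0).astype(np.float32)[..., None]  # HxWx1 matte
        out = alpha * camera_frame + (1.0 - alpha) * scene_frame
        return out.astype(np.uint8)

With a soft (fractional) alpha along the silhouette boundary, the same expression blends rather than cuts, reducing visible seams around the user.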

Pfinder's output can also be used in a simpler, more direct manner. The position of the user and the configuration of the user's appendages are mapped into a control space, and sounds made by the user change the operating mode. This allows the user to control an application directly with their body. This interface has been used to navigate a 3-D virtual game environment in SURVIVE (Simulated Urban Recreational Violence IVE) [18], illustrated in Fig. 3(b).
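The mapping from tracked features to control inputs can be as simple as a few comparisons on feature positions. The sketch below is a hypothetical mapping (the lean-to-steer and raised-hand-to-advance conventions, and all names, are assumptions for illustration; it is not SURVIVE's actual control scheme).

    def body_to_navigation(head, torso, left_hand, right_hand):
        # Each argument is an (x, y) feature position in image
        # coordinates; y grows downward, so "above the head" means
        # a smaller y value.
        turn = head[0] - torso[0]  # lean left or right to steer
        hand_raised = left_hand[1] < head[1] or right_hand[1] < head[1]
        speed = 1.0 if hand_raised else 0.0  # raise a hand to move forward
        return turn, speed

Because the mapping consumes only the tracker's feature positions, the same body-as-controller scheme carries over to other applications without changing the vision pipeline.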


