In summary, the Conductor’s Jacket project has yielded promising preliminary results that demonstrate a method for finding meaning in gestural data. A signal-based, quantitative approach has produced a nascent technique for interpretive feature extraction and signal processing; using it, thirty-five expressive features have been identified in the performances of six conductors. The ultimate goal of this work is to model and build systems that automatically recognize the affective and expressive content of live gestures, and thereby to improve upon the state of the art in interactive musical performance and gesture-based musical instruments. This work opens an immensely rich area for investigation and experimentation, one that could yield answers to our most basic questions about the future of musical expression and performance.
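To make the idea of signal-based, interpretive feature extraction concrete, the following Python sketch shows one generic way a single such feature might be computed from gestural sensor data: rectifying a raw EMG-like muscle-tension signal, low-pass filtering it into an amplitude envelope, and taking envelope peaks as crude proxies for beat placement and beat intensity. This is a minimal illustration only, not the project’s actual pipeline; the sampling rate, filter cutoff, and peak thresholds are assumed values chosen for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 1000  # assumed sensor sampling rate in Hz (illustrative)

def emg_envelope(raw, fs=FS, cutoff_hz=5.0):
    """Full-wave rectify the signal, then low-pass filter it
    into a smooth amplitude envelope."""
    rectified = np.abs(raw - np.mean(raw))           # remove DC offset, rectify
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")
    return filtfilt(b, a, rectified)                 # zero-phase smoothing

def beat_features(envelope, fs=FS, min_beat_gap_s=0.25):
    """Return peak times (s) and peak heights -- crude proxies for
    beat placement and beat intensity in a conducting gesture."""
    peaks, props = find_peaks(
        envelope,
        height=0.2 * envelope.max(),                 # illustrative threshold
        distance=int(min_beat_gap_s * fs),           # enforce a minimum beat gap
    )
    return peaks / fs, props["peak_heights"]

# Usage on synthetic data standing in for one channel of muscle tension:
t = np.arange(0, 4, 1 / FS)
raw = np.sin(2 * np.pi * 2 * t) * (1 + np.sin(2 * np.pi * 0.5 * t)) \
      + 0.1 * np.random.randn(t.size)
times, heights = beat_features(emg_envelope(raw))
print(times, heights)
```

In a full system of the kind envisioned above, many such low-level features would be computed in parallel and then mapped onto higher-level expressive interpretations.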