2.6 Wearable interfaces for real-time interactive music

2.6.1 BodySynth

The BodySynth is a wearable, wireless, muscle-activated MIDI controller used to generate music and lighting effects in time to a dancer's movements. The basic system consists of four muscle-tension (electromyogram, or EMG) sensors, a small body unit (1" x 2.5" x 4") for signal amplification and conditioning, a wireless transmission system, and a processor unit. The processor unit runs several real-time filters on an internal DSP chip, including metronomic functions, tempo adjustment (between 50 and 300 beats per minute), peak detectors, and impulse averagers. It can process up to eight channels at a 40-80 Hz sampling rate, with twenty parameters per channel, and sends data out as MIDI note-on, pitch-bend, and continuous-controller messages. Additional programs can be loaded into its onboard RAM via an RS-232 port or changed using its keypad and display screen. Available extensions to the system include four more EMG inputs (for a total of eight), four additional inputs of other kinds, and a cable to replace the wireless link.
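
The BodySynth's filter programs are proprietary, but the general technique of turning an EMG envelope into MIDI events can be sketched. The following Python fragment is a minimal illustration of a threshold-based peak detector of the kind described above; the threshold, hysteresis band, sample data, and send_midi stub are all assumptions, not details of the actual device.

    # Minimal sketch of an EMG peak detector that fires MIDI note-on events,
    # in the spirit of the BodySynth's event triggers. Thresholds, the
    # hysteresis band, the sample data, and send_midi are all invented.

    THRESHOLD = 0.6   # normalized envelope level that counts as a "hit"

    def send_midi(status, data1, data2):
        # Stand-in for a real MIDI output port.
        print(f"MIDI: {status:#04x} {data1} {data2}")

    def detect_peaks(envelope, note=60):
        # Fire one note-on per upward threshold crossing, scaling the MIDI
        # velocity by the height of the envelope at the crossing.
        armed = True
        for sample in envelope:
            if armed and sample >= THRESHOLD:
                send_midi(0x90, note, min(127, int(sample * 127)))  # note-on
                armed = False                     # ignore until release
            elif sample < THRESHOLD * 0.5:        # hysteresis: re-arm on release
                armed = True

    # Fabricated envelope standing in for one EMG channel at 40-80 Hz:
    detect_peaks([0.1, 0.2, 0.7, 0.9, 0.4, 0.2, 0.8, 0.3])

The hysteresis band keeps a single sustained contraction from firing a stream of spurious note-ons, which is one reason simple event triggers of this kind remain manageable for a dancer.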

The BodySynth was built by the independent team of electrical engineer Ed Severinghaus and performance artist Chris Van Raalte. It has been used by performance artist Laurie Anderson on a European tour in 1992, by San Francisco-based composer and performer Pamela Z (including for the Bang On a Can All-Stars concert at Lincoln Center), and by the "Cyberbeat Brothers," the performing duo of Chris Van Raalte and John Zane-Cheong. While the hardware for the BodySynth shows a great deal of careful thought and design, it seems to suffer from the problems discussed in chapter one – that is, it is difficult to see the relationship between gesture and sound. As one reviewer wrote about a performance of Van Raalte's, "it's easy to miss the point of cyberdancing. There are no telltale signs that dancer and orchestra are one and the same – or that the music is moving to the beat of the performer, not the other way around…it'll be a long time before any group using body synthesizers can challenge the New York Philharmonic Orchestra in concert. At Richland College, Mr. Van Raalte struggled to hit the six notes in sequence that make up the main refrain of Happy Birthday. 'The body is not meant to do this stuff. The keyboard is a lot easier,' he said." Even when the notes are easy to trigger, the musical results are essentially limited to simple event triggers and continuous parameter shaping.

2.6.2 BioMuse

The BioMuse, an eight-channel, general-purpose 'biocontroller,' was developed at Stanford's Center for Computer Research in Music and Acoustics (CCRMA) in 1989 by Hugh Lusted and Benjamin Knapp. Originally designed to enable aesthetic and recreational computer use for people with movement impairments and paralysis, it contained sensors for eye movement (electrooculogram, or EOG), muscle tension (EMG), and brain waves (electroencephalogram, or EEG). Lusted and Knapp formed a company, BioControl Systems, in 1989, and introduced the BioMuse as a commercial product in 1992. The device consists of a rack-mountable box containing eight input channels, a programmable-gain amplifier, a 30 kHz, 12-bit A/D converter, a Texas Instruments 320C25 DSP chip for filtering and pattern recognition, and a 19.2-kbaud optoisolated serial output. It outputs data as MIDI controller messages. Unusually for a physiological monitoring system, it samples fast enough to capture the full content of the EMG signal: each channel is sampled at 4 kHz, more than adequate for the signal's bandwidth. The BioMuse also comes with a library of proprietary DSP algorithms, which rely primarily on the energy in the muscle signal rather than its time-domain amplitude. CCRMA doctoral student Bill Putnam also wrote a number of pattern-recognition algorithms to detect and classify dynamic gestures in real time.
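
BioControl's DSP library is proprietary, but the energy measure described above, as opposed to raw time-domain amplitude, can be illustrated with a short-time RMS computation. In this Python sketch the window length, the scaling to a MIDI controller range, and the fabricated signal are assumptions for the example.

    import math
    import random

    WINDOW = 200   # 50 ms at the 4 kHz channel rate; the window size is an assumption

    def rms_energy(samples):
        # Short-time RMS: an energy measure of this kind is less sensitive
        # to sign and momentary spikes than raw time-domain amplitude.
        return math.sqrt(sum(s * s for s in samples) / len(samples))

    def envelope(signal):
        return [rms_energy(signal[i:i + WINDOW])
                for i in range(0, len(signal) - WINDOW + 1, WINDOW)]

    # Fabricated signal: quiet baseline, then a burst of muscle activity.
    signal = ([random.gauss(0, 0.05) for _ in range(800)] +
              [random.gauss(0, 0.5) for _ in range(800)])
    for level in envelope(signal):
        # Each value, scaled to 0-127, could be sent as a MIDI control change.
        print(min(127, int(level * 254)))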

The EMG electrodes are worn on an armband with the electrodes on its inner surface, and a headband holds the EEG sensors. Descriptions of performances suggest that the majority of users do not use the EEG; this suggests that brain waves are not yet understood well enough to be preferred for the volitional control of music. Like all sensor-based interfaces, the BioMuse is subject to practical problems of use. As a 1995 review noted, "the electrodes are covered with a conductive gel that picks up the signals generated by muscle movements and contractions. The gel, as gel will, tends to moisten in contact with perspiration and slide off. This causes the BioMuse to malfunction. These are the problems innovators on the cutting edge of technology often face: an invention that can, with no exaggeration, turn impulses of thought and movement into music, defeated by a slimy glob of blue gelatin."

Figure 7. Atau Tanaka

Media artist Atau Tanaka was the first to compose a concert piece for the BioMuse, and he continues to work intensively with it as a musical instrument. As a doctoral student at Stanford's CCRMA, he developed a platform for his work using Opcode's MAX programming environment, and he later advanced the work at IRCAM and STEIM. Tanaka's performances use the four EMG channels of the BioMuse; a trio of differential gel electrodes feeds each input channel, and armbands hold the sensors over the inner and outer forearms, biceps, and triceps. Tanaka wrote that his software patches mapped "incoming MIDI control data (representing EMG trajectories) to musical gestures. In this way, a physical gesture of the muscles effects melody, rhythm, timbral changes, and combinations." As with the BodySynth, however, it is not clear from the literature how sophisticated the mappings were. Tanaka admitted that:

"there is a certain frustration in directly connecting the BioMuse output to MIDI devices in this way. The source biodata is a rich, continuous signal that is constantly changing. MIDI, on the other hand, is an event-based music control specification. To better suit the nature of the biosignal, I have created Max patches to allow direct control of sound synthesis by sending MIDI System Exclusive to the synthesizer." Tanaka now performs regularly with the BioMuse, particularly as part of a unique group called SensorBand, and uses it to trigger not only sounds but also images.
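
Tanaka's Max patches are not published, but the distinction he draws can be sketched: note events quantize a continuous trajectory, while System Exclusive messages can stream a synthesis parameter sample by sample. In the Python fragment below, the manufacturer ID 0x7D (the MIDI non-commercial ID) and the single-byte parameter address are placeholders, not any real synthesizer's SysEx map.

    # Sketch of streaming a continuous control value via MIDI System
    # Exclusive instead of quantizing it into note events. 0x7D is the
    # MIDI non-commercial manufacturer ID; the address byte 0x01 is a
    # placeholder, not any real synthesizer's parameter map.

    def sysex_for_level(level):
        value = min(127, int(level * 127))        # SysEx data bytes are 7-bit
        return bytes([0xF0, 0x7D, 0x01, value, 0xF7])

    # An EMG trajectory streamed directly to the synthesis engine:
    for level in [0.10, 0.35, 0.62, 0.80, 0.55, 0.20]:
        print(sysex_for_level(level).hex(" "))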

2.6.3 Lady’s Glove

French composer Laetitia Sonami developed the Lady's Glove in collaboration with Bert Bongers at STEIM (Studio for Electro-Instrumental Music) in Amsterdam. After trying earlier glove-based interfaces, which she found bulky and unwieldy, Sonami began investigating other designs using Hall-effect sensors and thin latex work gloves. The final version of the Lady's Glove is for two hands and is made of a thin Lycra mesh with switches in the fingertips, Hall-effect sensors at the joints, and resistive strips running the length of the fingers and metacarpals. The palm of each glove contains an ultrasound receiver that detects the strength of the signal coming from emitters on her shoes; from these signal strengths she can derive the distance between her hands and the distance of each hand from the floor. A motion sensor determines the speed of her gestures. STEIM's SensorLab, a beltpack analog-to-MIDI converter, conditions, converts, and routes the signals to the computer. Sonami writes and performs her own music for the Lady's Glove, using samples, frequency modulation, and additive synthesis. She choreographs her pieces in a kind of dance form that resembles South Asian mudra hand patterns and sign language.
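
Sonami's signal chain is not documented in detail, but the idea of recovering distance from received ultrasound strength can be sketched. The inverse-distance attenuation model and the calibration constant in this Python fragment are assumptions for illustration only; a real system would need calibration against the actual emitters and receivers.

    # Sketch of estimating hand height from received ultrasound strength,
    # with palm-mounted receivers and shoe-mounted emitters as described
    # above. The 1/distance attenuation model and calibration constant
    # are assumptions, not measured properties of the Lady's Glove.

    REF_AMPLITUDE = 1.0    # amplitude measured at 10 cm during calibration (assumed)
    REF_DISTANCE = 10.0    # centimeters

    def distance_cm(amplitude):
        # Assume amplitude falls off roughly as 1/distance, so
        # distance = REF_DISTANCE * (REF_AMPLITUDE / amplitude).
        return REF_DISTANCE * REF_AMPLITUDE / max(amplitude, 1e-6)

    for a in [1.0, 0.5, 0.25, 0.1]:
        print(f"amplitude {a:.2f} -> ~{distance_cm(a):.0f} cm")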

2.6.4 DancingShoes

Professor Joseph Paradiso of the MIT Media Lab built the first pair of DancingShoes in 1997. These instrumented sneakers measure four points of pressure, bend, pitch, roll, 3-axis acceleration, twist, and 3 axes of position in each shoe. Many of these signals are digitized within the sneaker itself and broadcast to a nearby radio receiver; each shoe is powered by its own battery, with a life of about three hours. The shoes have gone through numerous revisions during the past two years and have been used in several public performances. In the most recent version, each sneaker measures 16 different parameters of the user's movement. One of the DancingShoes is pictured below:

Figure 8. The Dancing Shoe

2.6.5 Miburi

Miburi, which means "gesture" in Japanese, is an advanced wearable gesture-sensing instrument that was commercially available until quite recently. Developed by Yamaha in Japan over a nine-year period, this stretchy cotton shirt embedded with sensors was introduced in Japan in 1994 and won the G-mark prize in 1996. It has been used in numerous live musical performances, including Morton Subotnick's "Intimate Immensity" at the Lincoln Center Summer Festival (where it was worn by a Balinese dancer and used to control two Yamaha Disklavier pianos), and by percussionist Hiroshi Chu Okubo (the self-titled 'first professional Miburi player'). The device's basic S3 model consists of a stretchy fabric suit containing bend/flex sensors for the shoulders, elbows, and wrists; two handgrip units, each with two velocity-sensitive buttons for the index, middle, ring, and pinky fingers and one see-saw key for the thumb; and a belt unit that collects the sensor data and sets processing controls. The data is conveyed over a cable to a remote synthesizer unit (which uses the S-VA synthesis architecture). Notes are generated by simultaneously moving a joint and pressing a key; the software automatically maps the joint-bend information to MIDI notes and the hand-controller signals to octave and velocity values.

The more recent R3 model comes with the AWM2 tone-generation system, adds piezoelectric sensors in the performer's shoes that measure toe and heel impact, and includes a wireless transmitter/receiver unit to replace the cable connection to the sound unit. It has 32 sound-producing positions: bend and straighten gestures for each of the six flex sensors (12), taps for each heel and toe (4), and the eight keys on each grip unit (16). Sounds are made by combining arm gestures and key presses, and the faster the movement or keypress, the louder the sound. Effects are made with the see-saw controllers on the grip units; the right thumb automatically controls pitch bend, while the left thumb can control modulation, panning, and other parameters.
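
Yamaha has not published the Miburi's internal mapping, but the description above suggests its rough shape: a joint gesture selects a pitch, a grip key selects the octave, and the speed of the movement or keypress sets the loudness. The note table, octave formula, and velocity scaling in this Python sketch are invented for illustration.

    # Hypothetical sketch of the Miburi-style mapping described above: a
    # joint gesture selects a pitch offset, a grip key selects the octave,
    # and the speed of the movement sets the MIDI velocity. The note table
    # and scaling are invented; Yamaha's actual mapping is unpublished.

    JOINT_NOTES = {                    # semitone offsets within the octave
        ("l_elbow", "bend"): 0,
        ("l_elbow", "straighten"): 2,
        ("r_elbow", "bend"): 4,
        ("r_elbow", "straighten"): 5,
        ("l_wrist", "bend"): 7,
        ("l_wrist", "straighten"): 9,
    }

    def miburi_note(joint, direction, octave_key, speed):
        # speed is normalized 0.0-1.0; faster movement means a louder note.
        base = 12 * (octave_key + 3)             # octave chosen on the grip unit
        note = base + JOINT_NOTES[(joint, direction)]
        velocity = max(1, min(127, int(speed * 127)))
        return note, velocity

    print(miburi_note("l_elbow", "bend", octave_key=2, speed=0.8))   # (60, 101)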

The Miburi, although remarkable in many ways, suffered from its reliance on a simple, semaphore-like gesture language. The combination of joint movements and keypresses seemed stilted and contrived; the gesture vocabulary had no musical precedent and therefore felt somewhat arbitrary, as if it had been chosen to conform to the limits of the sensors rather than to the needs of the music. Even so, the Miburi remains an inspiration, because it was the first attempt by a large company to explore the potential of wearable musical instruments.

2.6.6 Benoit Maubrey’s Electro-Acoustic Clothing

American-born artist Benoît Maubrey has been experimenting with wearable audio performance-art pieces since 1983 with his Berlin-based AUDIO GRUPPE. Their projects involve building electro-acoustic clothes equipped with loudspeakers, amplifiers, and 257K samplers, and performing with them in public.

Figure 9. Audio Ballerina

The technology enables the performers to interact directly with their environment, recording live sounds, voices, or instruments in their proximity and amplifying them as a mobile, multi-acoustic performance. They also wear radio receivers, contact microphones, light sensors, and electronic looping devices in order to produce, mix, and multiply their own sounds and compose them into an environmental concert. The performers are powered by rechargeable batteries and/or solar cells. Maubrey's various projects have included the Audio Jackets, Audio Herd, Audio Steelworkers (created for the Ars Electronica festival), Guitar Monkeys, Audio Subway Controllers, Audio Cyclists, Audio Ballerinas, Audio Guards, Audio Characters in an Audio Drama, Electronic Guys, Cellular Buddies, and Audio Geishas. In general, the sounds of these devices lack musical content – as conceptual art they generate a pleasant effect, but the output of the costumes consists mostly of random noises, which would not interest a traditional musician.

2.6.7 The Musical Jacket

Another important example of a wearable music interface is the Musical Jacket, designed and built in 1997 by a team at the MIT Media Lab including Maggie Orth, Rehmi Post, Josh Smith, Josh Strickon, Emily Cooper, and Tod Machover. The jacket, which was successfully demonstrated at numerous conferences and trade shows, is a stand-alone instrument built into an ordinary Levi's jacket, with speakers in the pockets, a small MIDI synthesizer on one shoulder, and a washable fabric keypad on the other. All the necessary equipment is sewn into the jacket, and data and power are passed around via a conductive fabric bus. The jacket is designed for amateurs to play by tapping on the shoulder keypad to trigger different sounds and sequences; while the interface is a bit awkward, the result is fun and satisfying for the novice. It also points to the possibility of embedding more sophisticated functionality in clothing.
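
The jacket's firmware is not described in the literature, but the keypad-to-synthesizer mapping it implies is simple to sketch: some pads trigger single notes, others launch short sequences on the on-board synthesizer. The pad layout and note assignments in this Python fragment are hypothetical.

    # Sketch of a fabric-keypad mapping for the jacket's on-board MIDI
    # synthesizer: some pads play single notes, others launch short
    # sequences. The pad layout and note assignments are hypothetical.

    PAD_NOTES = {0: 60, 1: 62, 2: 64, 3: 65}     # pads mapped to single notes
    PAD_SEQUENCES = {4: [60, 64, 67, 72]}        # pads mapped to sequences

    def handle_pad(pad, send_note):
        if pad in PAD_NOTES:
            send_note(PAD_NOTES[pad])
        elif pad in PAD_SEQUENCES:
            for note in PAD_SEQUENCES[pad]:
                send_note(note)

    handle_pad(4, lambda n: print(f"note-on {n}"))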

2.6.8 Chris Janney’s HeartBeat

HeartBeat began as a collaborative project during the 1980s between Chris Janney, an Artist/Fellow at the MIT Center for Advanced Visual Studies, and dancer/choreographer Sara Rudner. The dance is a solo piece with a choreographic structure within which the dancer improvises. The dancer wears a wireless device that amplifies and sonifies the natural electrical impulses that stimulate the heart to beat. These impulses form the basis of the musical score, which is then overlaid with sounds of medical text, jazz scat, and the adagio movement of Samuel Barber's String Quartet. The piece was recently revised for Mikhail Baryshnikov, who premiered "HeartBeat:mb" in January 1998 at City Center in New York and later took it on a world tour. Janney said about the piece, "it's the easiest tied to the soul because it's the heart. It makes you think about your own heart, your own mortality." The piece attracted a great deal of attention, aided considerably by the prominence and skill of the performer. However, I have heard first-person accounts of the concerts suggesting that much of the excitement in the audience came from fear at his extremely elevated heart rate; listeners were afraid for his health. Janney, well known for many other urban architectural music projects, openly admits to building the technology first and composing for it afterwards: in "a lot of my projects, I'm building a musical instrument, and then I have to learn how to play it."
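
Janney's hardware is not documented here, but the core operation, turning the heart's electrical impulses into discrete sound events, amounts to peak detection on an ECG-like signal. In this Python sketch the threshold, refractory period, and fabricated trace are assumptions for illustration only.

    # Minimal sketch of heartbeat sonification: detect peaks in an
    # ECG-like signal by thresholding and trigger a sound event on each
    # beat. The threshold, refractory period, and trace are invented.

    THRESHOLD = 0.8
    REFRACTORY = 40          # samples to ignore after a beat (~0.2 s at 200 Hz)

    def sonify(ecg, on_beat):
        holdoff = 0
        for i, v in enumerate(ecg):
            if holdoff > 0:
                holdoff -= 1
            elif v >= THRESHOLD:
                on_beat(i)               # a real system would play a sound here
                holdoff = REFRACTORY

    # Fabricated trace containing two beats:
    ecg = [0.0] * 50 + [1.0] + [0.0] * 150 + [1.0] + [0.0] * 50
    sonify(ecg, lambda i: print(f"beat at sample {i}"))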

2.6.9 Others

David Rosenboom, composer and dean of the School of Music at the California Institute of the Arts, began investigating issues of biofeedback and brain activity in the 1960s, and published two books entitled Biofeedback and the Arts and Extended Musical Interface with the Human Nervous System. He spent many years writing and working with brainwaves and music, particularly with EEG and ERP (event-related potential) signals.

In the early 1990s, Leon Gruenbaum invented the Samchillian TipTipTip CheeePeeeee, a unique, rewired QWERTY computer keyboard that hangs from his suspenders and is used to perform melodies. Sequences of keystrokes are converted to MIDI notes and played on an external synthesizer. The instrument uses a relativistic, intervallic approach, in which each keystroke tells the system what distance and direction to play from the previous note (a mapping sketched below). Modulations and changes of the basic scale can be chosen, chords and patterns can be created or selected in real time, and key mappings can be reassigned. One advantage of this instrument is that one can play at very high speed, since typing on a computer keyboard requires little force; one drawback is that the volume and other values of each note cannot be controlled. Gruenbaum performs with Vernon Reid in the avant-garde downtown New York scene; his performance style has been described as extremely low-key, tapping almost imperceptibly on his keyboard without much external gesture. Other artists who have worked in the area of wearable music devices include Michel Waisvisz of STEIM, Laurie Anderson, Axel Mulder, and the Australian performance artist Stelarc.
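
Gruenbaum's keyboard firmware is not public; the Python sketch below illustrates only the relativistic principle described above, in which each key encodes a scale-degree interval applied to the previous note rather than a fixed pitch. The key assignments, scale, and starting degree are invented for the example.

    # Sketch of the Samchillian's relativistic, intervallic principle:
    # each key encodes a step up or down within the current scale, applied
    # to the previous note. Key assignments and scale are invented.

    C_MAJOR = [0, 2, 4, 5, 7, 9, 11]             # semitone offsets of the scale
    KEY_INTERVALS = {"j": +1, "k": +2, "f": -1, "d": -2, "space": 0}

    class Samchillian:
        def __init__(self, scale=C_MAJOR, start_degree=35):
            self.scale = scale
            self.degree = start_degree           # degree 35 is middle C here

        def press(self, key):
            # Move by the key's interval (in scale degrees) and return
            # the resulting MIDI note number.
            self.degree += KEY_INTERVALS[key]
            octave, step = divmod(self.degree, len(self.scale))
            return 12 * octave + self.scale[step]

    s = Samchillian()
    print([s.press(k) for k in ["j", "j", "k", "f", "space"]])   # [62, 64, 67, 65, 65]

Because every key is an interval rather than an absolute pitch, the same keystroke sequence produces the same melodic contour from any starting note, which is what lets Gruenbaum play rapid lines with almost no visible gesture.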