The Affective DJ is a work in progress that has only begun to be tested. The challenge of assessing multiple users' affective states and learning their preferred responses in different situations is a grand one. The present version of the Affective DJ uses only a portion of the available data; other data, from the electromyogram (which indicates muscle tension) or from changes in heart rate with respect to respiration, may prove more useful for predicting music preference. Ultimately, a learning algorithm should combine several features of these physiological signals with contextual features such as time of day and the person's location (office, commuting, home, etc.) to help determine music preference.
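As a purely illustrative sketch of that idea, not part of the Affective DJ itself, a learner might pool physiological and contextual features into one vector and predict preference from labeled history; the feature set, the nearest-neighbor scheme, and all numbers below are assumptions made for illustration.

    import numpy as np

    # Hypothetical feature vector (names and units assumed):
    # [mean skin conductivity (uS), EMG muscle tension, heart-rate change
    #  relative to respiration, hour of day / 24, location code
    #  (0 = office, 1 = commuting, 2 = home)]

    def predict_preference(features, ratings, query, k=2):
        # k-nearest-neighbor guess at a preference rating in [0, 1],
        # averaging the ratings of the k most similar past situations.
        dists = np.linalg.norm(features - query, axis=1)
        nearest = np.argsort(dists)[:k]
        return ratings[nearest].mean()

    # Toy history of labeled situations and a current query.
    history = np.array([[4.2, 0.3, 0.10, 9 / 24, 0],
                        [6.8, 0.7, 0.25, 18 / 24, 1],
                        [3.1, 0.2, 0.05, 22 / 24, 2]])
    liked = np.array([0.8, 0.4, 0.9])  # past preference ratings, 0..1
    now = np.array([5.0, 0.4, 0.12, 10 / 24, 0])
    print(predict_preference(history, liked, now))

In practice the features would need normalization before computing distances, so that no single scale (here, the location code) dominates the comparison.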
The current system relies on a very simple algorithm based on skin conductivity, which our (limited) tests confirmed to correlate significantly with the perceived excitement level of a song. We have also found certain skin conductivity changes (related to an orientation response) to be useful in controlling a wearable digital camera [HP98]. These two projects are examples of new efforts to understand affect fully in context, and to develop more sophisticated algorithms incorporating multi-modal sensing and pattern recognition for building interfaces that are more human-centered in design.
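One hypothetical reading of such a simple skin-conductivity rule is sketched below: the trend of recent conductivity samples steers song selection along an excitement ordering. The slope heuristic, the excitement-ordered playlist, and the function names are our assumptions, not the algorithm actually deployed.

    import statistics

    def conductivity_slope(samples):
        # Least-squares slope of recent skin-conductivity readings
        # (microsiemens per sample); positive means rising arousal.
        n = len(samples)
        mean_x = (n - 1) / 2
        mean_y = statistics.fmean(samples)
        cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(samples))
        var = sum((x - mean_x) ** 2 for x in range(n))
        return cov / var

    def next_song(samples, playlist_by_excitement, index):
        # Step up the excitement-ordered playlist when arousal is rising,
        # down when it is falling, clamped to the playlist bounds.
        step = 1 if conductivity_slope(samples) > 0 else -1
        index = min(max(index + step, 0), len(playlist_by_excitement) - 1)
        return playlist_by_excitement[index], index

    # Toy usage: rising conductivity steps toward a more exciting song.
    songs = ["ambient", "folk", "pop", "rock", "dance"]
    song, i = next_song([4.1, 4.3, 4.6, 5.0], songs, index=2)
    print(song)  # -> "rock"

Whether rising arousal should select a more exciting song (matching the user's state) or a calmer one (counteracting it) is itself a design choice that the learning approach described above would have to resolve per user.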