Human-computer interface design has become more sensitive to the needs of the user through user-centered design practices and ``intelligent'' assistance programs. However, the computer still has no direct channel for confirming a user's affective reaction to the information or options being presented. People naturally learn from affective reactions: for example, if a human assistant made a suggestion that angered you, they would be hesitant to offer that suggestion again. A computer with the ability to recognize affective responses could do a better job of adapting its behavior to you.
Traditionally, the fields of psychology and psychophysiology have sought features that indicate common affective states across large populations, even though individual response patterns vary widely with gender, personality type, and ethnic background [1]. In this paper, we propose a tailored model based on recognizing the physiological signature of a single user's affective patterns over time.
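To make the notion of a tailored model concrete, one simple person-specific step is to express each physiological measurement relative to that individual's own baseline rather than to population norms. The following sketch is a minimal, hypothetical illustration of this idea; the sensor channels, baseline scheme, and z-score normalization are assumptions for exposition, not the method used in this paper.

\begin{verbatim}
import numpy as np

def personalize(samples, baseline):
    """Z-score a user's physiological samples against that same
    user's baseline session, so features reflect deviations from
    the individual's norm rather than from a population average."""
    mu, sigma = baseline.mean(axis=0), baseline.std(axis=0)
    return (samples - mu) / np.where(sigma > 0, sigma, 1.0)

# Hypothetical data: rows = time points, columns = sensors
# (e.g., heart rate, skin conductance, respiration rate).
rng = np.random.default_rng(0)
baseline = rng.normal([70.0, 5.0, 16.0], [5.0, 1.0, 2.0], size=(300, 3))
session  = rng.normal([85.0, 7.0, 18.0], [5.0, 1.0, 2.0], size=(300, 3))
print(personalize(session, baseline).mean(axis=0))
# Positive values: this user is elevated relative to *their own* baseline.
\end{verbatim}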
This analysis also differs from traditional emotion research in the time window considered. This experiment analyzes how the user's affective state changes over an average period of three minutes, far longer than the one-to-ten-second reactions usually analyzed in emotion research [2]. In other analyses, we have found that affective measurements for a single individual are easily overwhelmed by physiological factors within such a short window [3]; therefore, in this work we look for larger changes that can be detected in an individual over a three-minute window.
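As an illustration of the window choice, the sketch below summarizes a physiological signal over non-overlapping three-minute windows, capturing the slower, larger changes discussed above rather than second-scale reactions. The signal type, sampling rate, and summary features (mean, variability, drift) are hypothetical choices for exposition, not features specified in this paper.

\begin{verbatim}
import numpy as np

def window_features(signal, fs, window_s=180.0):
    """Summarize a 1-D physiological signal (e.g., skin conductance)
    over long windows; window_s = 180 s is the three-minute window
    discussed above. Returns one (mean, std, slope) row per window."""
    n = int(window_s * fs)                     # samples per window
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        w = signal[start:start + n]
        t = np.arange(n) / fs
        slope = np.polyfit(t, w, 1)[0]         # drift across the window
        feats.append((w.mean(), w.std(), slope))
    return np.asarray(feats)

# Example: 30 minutes of simulated 4 Hz skin-conductance data.
fs = 4.0
sig = np.cumsum(np.random.randn(int(30 * 60 * fs))) * 0.01 + 5.0
print(window_features(sig, fs).shape)          # (10, 3): ten 3-min windows
\end{verbatim}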