Human conversational partners are able, at least to a certain extent, to detect the speaker's or listener's emotional state and may attempt to respond to it accordingly. When instead one of the interlocutors is a computer, a number of questions arise, such as the following: To what extent are dialogue systems able to simulate such behaviors? Can we learn the mechanisms of emotional behaviors from observing and analyzing the behavior of human speakers? How can emotions be automatically recognized from a user's facial expressions, gestures, and speech? What possibilities does a dialogue system have to express emotions itself? And, very importantly, would emotional system behavior be desirable at all?

Given the state of ongoing research into incorporating emotions in dialogue systems, we found it timely to organize a Tutorial and Research Workshop on Affective Dialogue Systems (ADS 2004) at Kloster Irsee in Germany during June 14–16, 2004. After two successful ISCA Tutorial and Research Workshops on Multimodal Dialogue Systems at the same location in 1999 and 2002, we felt that a workshop focusing on the role of affect in dialogue would be a valuable continuation of the workshop series.

Due to its interdisciplinary nature, the workshop attracted submissions from researchers with very different backgrounds and from many different research areas, working on, for example, dialogue processing, speech recognition, speech synthesis, embodied conversational agents, computer graphics, animation, user modelling, tutoring systems, cognitive systems, and human-computer interaction.
Author(s): Anton Batliner, Christian Hacker, Stefan Steidl, Elmar Nöth, Jürgen Haas (auth.), Elisabeth André, Laila Dybkjær, Wolfgang Minker, Paul Heisterkamp (eds.)
Series: Lecture Notes in Computer Science 3068 : Lecture Notes in Artificial Intelligence
Edition: 1
Publisher: Springer-Verlag Berlin Heidelberg
Year: 2004
Language: English
Pages: 328
Tags: Artificial Intelligence (incl. Robotics); User Interfaces and Human Computer Interaction; Multimedia Information Systems; Information Systems Applications (incl. Internet); Computer Appl. in Social and Behavioral Sciences; Computer Graphics
Front Matter
From Emotion to Interaction: Lessons from Real Human-Machine-Dialogues....Pages 1-12
Emotions in Short Vowel Segments: Effects of the Glottal Flow as Reflected by the Normalized Amplitude Quotient....Pages 13-24
Towards Real Life Applications in Emotion Recognition....Pages 25-35
Emotion Recognition Using Bio-sensors: First Steps towards an Automatic System....Pages 36-48
Neural Architecture for Temporal Emotion Classification....Pages 49-52
Empathic Embodied Interfaces: Addressing Users’ Affective State....Pages 53-64
Cognitive-Model-Based Interpretation of Emotions in a Multi-modal Dialog System....Pages 65-76
Affective Advice Giving Dialogs....Pages 77-88
A Categorical Annotation Scheme for Emotion in the Linguistic Content of Dialogue....Pages 89-100
Data-Driven Tools for Designing Talking Heads Exploiting Emotional Attitudes....Pages 101-112
Design of a Hungarian Emotional Database for Speech Analysis and Synthesis....Pages 113-116
Emotion and Dialogue in the MRE Virtual Humans....Pages 117-127
Coloring Multi-character Conversations through the Expression of Emotions....Pages 128-141
Domain-Oriented Conversation with H.C. Andersen....Pages 142-153
Simulating the Emotion Dynamics of a Multimodal Conversational Agent....Pages 154-165
Design and First Tests of a Chatter....Pages 166-177
Endowing Spoken Language Dialogue Systems with Emotional Intelligence....Pages 178-187
Do You Want to Talk About It?....Pages 188-192
Application of D-Script Model to Emotional Dialogue Simulation....Pages 193-196
Modeling and Synthesizing Emotional Speech for Catalan Text-to-Speech Synthesis....Pages 197-208
Dimensional Emotion Representation as a Basis for Speech Synthesis with Non-extreme Emotions....Pages 209-220
Extra-Semantic Protocols; Input Requirements for the Synthesis of Dialogue Speech....Pages 221-228
How (Not) to Add Laughter to Synthetic Speech....Pages 229-232
Modifications of Speech Articulatory Characteristics in the Emotive Speech....Pages 233-239
Expressive Animated Agents for Affective Dialogue Systems....Pages 240-243
Affective Feedback in a Tutoring System for Procedural Tasks....Pages 244-253
Generating Socially Appropriate Tutorial Dialog....Pages 254-264
The Role of Affect and Sociality in the Agent-Based Collaborative Learning System....Pages 265-275
Evaluation of Synthetic Faces: Human Recognition of Emotional Facial Displays....Pages 276-287
How to Evaluate Models of User Affect?....Pages 288-300
Preliminary Cross-Cultural Evaluation of Expressiveness in Synthetic Faces....Pages 301-304
Conversational H.C. Andersen First Prototype Description....Pages 305-308
Experiences with an Emotional Sales Agent....Pages 309-312
A Freely Configurable, Multi-modal Sensor System for Affective Computing....Pages 313-318
Gesture Synthesis in a Real-World ECA....Pages 319-322
Back Matter