DOI: 10.1145/1877826
AFFINE '10: Proceedings of the 3rd international workshop on Affective interaction in natural environments
ACM 2010 Proceeding
Publisher:
  • Association for Computing Machinery, New York, NY, United States
Conference:
MM '10: ACM Multimedia Conference, Firenze, Italy, 29 October 2010
ISBN:
978-1-4503-0170-1
Published:
29 October 2010

Abstract

It is our great pleasure to welcome you to the 3rd International Workshop on Affective Interaction in Natural Environments -- AFFINE 2010. AFFINE follows a number of successful workshops and events commencing in 2008.

A key aim of AFFINE is the identification and investigation of significant open issues in real-time, affect-aware applications 'in the wild' and especially in embodied interaction, for example, with robots or virtual agents. AFFINE seeks to bring together researchers working on the real-time interpretation of user behaviour with those who are concerned with social robot and virtual agent interaction frameworks.

The call for papers attracted several submissions from Europe, Asia, Africa, Canada and the United States. The program committee accepted 17 papers that cover a variety of topics, including multimodal human affect recognition, multimedia expression generation in robots and virtual agents, human-computer and human-robot interaction. In addition, the program includes a keynote talk by Prof. Antonio Camurri on the automated analysis of non-verbal expressive gesture and expressive social interaction in groups of users, for applications in novel multimodal interfaces and emerging user-centric media.

SESSION: Keynote address
keynote
Automated analysis of non-verbal affective and social behaviour

This keynote introduces recent research on the automated analysis of non-verbal expressive gesture and of expressive social interaction in groups of users, for applications in novel multimodal interfaces and emerging User-Centric Media.

SESSION: Analysis and recognition
research-article
Enjoyment recognition from physiological data in a car racing game

In this paper we present a case study on The Open Racing Car Simulator (TORCS) video game with the aim of developing a classifier to recognize user enjoyment from physiological signals. Three classes of enjoyment, derived from pairwise comparison of ...

research-article
RANSAC-based training data selection for emotion recognition from spontaneous speech

Training datasets containing spontaneous emotional expressions are often imperfect due to the ambiguities and difficulties of labeling such data by human observers. In this paper, we present a Random Sampling Consensus (RANSAC) based training approach for ...

research-article
Genetic search feature selection for affective modeling: a case study on reported preferences

Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the ...

research-article
Real time labeling of affect in music using the affectbutton

Valid, reliable and quick measurement of emotion and affect is an important challenge for the use of emotion and affect in human-technology interaction. Emotion and affect can be measured in two different ways: explicit, the user is asked for feedback, ...

SESSION: Synthesis and generation
research-article
Synthesizing expressions using facial feature point tracking: how emotion is conveyed

Many approaches to the analysis and synthesis of facial expressions rely on automatically tracking landmark points on human faces. However, this approach is usually chosen because of ease of tracking rather than its ability to convey affect. We have ...

research-article
Selecting appropriate agent responses based on non-content features

This paper describes work-in-progress on a study to create models of responses of virtual agents that are selected only based on non-content features, such as prosody and facial expressions. From a corpus of human-human interactions, in which one person ...

research-article
Interpretation of emotional body language displayed by robots

In order for robots to be socially accepted and generate empathy they must display emotions. For robots such as Nao, body language is the best medium available, as they do not have the ability to display facial expressions. Displaying emotional body ...

SESSION: Interaction (1)
research-article
Closing the loop: from affect recognition to empathic interaction

Empathy is a very important capability in human social relationships. If we aim to build artificial companions (agents or robots) capable of establishing long-term relationships with users, they should be able to understand the user's affective state ...

research-article
Designing affective computing learning companions with teachers as design partners

There is a growing interest in studying the potential of including models of emotion in Embodied Pedagogical Agents (EPA), included in Computer-Assisted Learning (CAL) software. Children's understanding and response to emotions matures alongside their ...

research-article
Ubiquitous social perception abilities for interaction initiation in human-robot interaction

Robots acting as assistants or companions in a social environment must be capable of sensing information about the location of the users and analysing and interpreting their social, affective signals in order to be able to plan and generate an ...

research-article
A motivational health companion in the home as part of an intelligent health monitoring sensor network

This paper describes our work in progress to develop a personal monitoring system that can monitor the physical and emotional condition of a patient by using contextual information from a sensor network, provide the patient with feedback concerning ...

research-article
Interpreting non-linguistic utterances by robots: studying the influence of physical appearance

This paper presents a survey in which participants were asked to interpret non-linguistic utterances made by two different types of robot, one humanoid robot and one pet-like robot. The study set out to answer the question of whether the interpretation ...

SESSION: Interaction (2)
research-article
Facilitative effects of communicative gaze and speech in human-robot cooperation

Human interaction in natural environments relies on a variety of perceptual cues to guide and stabilize the interaction. Humanoid robots are becoming increasingly refined in their sensorimotor capabilities, and thus should be able to manipulate and ...

research-article
Use of nonverbal speech cues in social interaction between human and robot: emotional and interactional markers

We focus on audio cues required for the interaction between a human and a robot. We argue that a multi-level use of different paralinguistic cues is needed to pilot the decisions of the robot. Our challenge is to know how to use them to pilot the human-...

research-article
On the importance of eye gaze in a face-to-face collaborative task

In the present work we observe two subjects interacting in a collaborative task on a shared environment. One goal of the experiment is to measure the change in behavior with respect to gaze when one interactant is wearing dark glasses and hence his/...

research-article
Can polite computers produce better human performance?

We claim that concepts from human-human social interactions can be expanded and utilized to facilitate, inform, and predict human-computer interaction and perceptions. By expanding on a qualitative model of politeness proposed by Brown and Levinson ...

research-article
Augmented photoware interfaces for affective human-human interactions

Watching, sharing and discussing photographs represents an important social experience with profound emotional connotations. Although digital photoware allows multiple opportunities for indexing, retrieval and visualization of captured photographs, it ...

Contributors
  • Uppsala University
  • Panteion University of Social and Political Sciences
  • Paris-Saclay University
  • Carnegie Mellon University
  • KTH Royal Institute of Technology
  • University of California, San Diego
