Welcome to the INGREDIBLE Wiki!

The goal of the INGREDIBLE project is to propose a set of scientific innovations in the domain of human/virtual-agent interaction. The project aims at developing a prototype of an autonomous virtual actor capable of dynamic affective bodily coupling with a human whose gestures are analyzed in real time.

The main focus is on the dynamics of this gestural coupling. These dynamics are considered the key to the feeling of the “presence of the other” and to the “engagement” of users in the virtual reality systems of the future. In human-human interaction, each actor's gesture, or variation of a gesture, changes the other's behavior and vice versa. Depending on the participants’ emotional states, this mutual influence can lead to the emergence of coordination: a singular, gesture-based dialogue composed of initiatives and imitations. It is this singularity, so specific to “real” human interaction, that we want to model in the real/virtual space.

In particular, our approach will combine, in an original manner, several phenomena observed in bodily interaction: the integration of the different channels conveying expressiveness (body position, movement quality, gesture), imitation, postural convergence, and synchrony. The project will focus on the bodily expression of emotions during an interaction, as well as on the dynamics of gesture concatenation.

The project will not tackle linguistic and verbal aspects. This choice is justified by a realistic evaluation of the complexity of the task we aim to accomplish in a limited time (4 years). Real-time gesture-based interaction is already a full-fledged research domain in its own right and, as a first approach, can be studied separately from the analysis of speech. The generality of bodily interaction mechanisms across different contexts allows us to conduct this project without taking verbal behavior into account, and to pave the way for future studies on combined verbal and non-verbal interaction.

The project brings together several scientific domains:

  • Study and formalization of the dynamics of bodily interaction between humans
  • Study of the bodily and postural expression of emotions
  • Real-time gesture recognition and anticipation
  • Modeling of autonomous interactive agents
  • Computer generation of expressive movements and real-time adaptation
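The coupling dynamics listed above can be illustrated with a deliberately minimal sketch. This is our own toy illustration, not the project's actual model: it treats one expressive feature (gesture amplitude) and models postural convergence as a simple relaxation of the agent's amplitude toward the human's, step by step, as a real-time loop would.

```python
def coupling_step(agent_amplitude, human_amplitude, rate=0.3):
    """One update of the agent's gesture amplitude toward the human's.

    Postural convergence is modelled as a relaxation: the agent moves a
    fraction `rate` of the remaining gap toward the observed human value.
    (Hypothetical model for illustration only.)
    """
    return agent_amplitude + rate * (human_amplitude - agent_amplitude)


def simulate_coupling(agent_start, human_signal, rate=0.3):
    """Run the coupling loop over a sequence of observed human amplitudes,
    returning the agent's amplitude after each frame."""
    agent = agent_start
    trace = []
    for human in human_signal:
        agent = coupling_step(agent, human, rate)
        trace.append(agent)
    return trace


# Example: the human holds a large, steady gesture; the agent, starting
# from rest, progressively converges toward it (imitation/synchrony).
trace = simulate_coupling(0.0, [1.0] * 20)
```

A real system would of course track many channels at once (body position, movement quality, gesture) and let the influence run in both directions; this sketch only shows the one-way convergence mechanism in its simplest form.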

The INGREDIBLE project also aims at defining a prototype application by considering:

  • A generic approach applicable to several animation platforms and different scenarios.
  • An evaluation of the performance of various devices (Kinect, Optitrack, Moven …) and animation platforms (MARC, AréVi?/HLIB, SMR).
  • Several scenarios from different contexts (drama, sport, education …).
  • Users’ suggestions, which will help us satisfy their needs and expectations.


Starting Points

For a complete list of local wiki pages, see TitleIndex.