Designing Frameworks for Automatic Affect Prediction and Classification in Dimensional Space

author: Maja Pantic, Intelligent Behaviour Understanding Group (iBUG), Department of Computing, Imperial College London
published: Aug. 24, 2011,   recorded: June 2011,   views: 3382
Description

A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. To realize this prediction, next-generation computing should develop anticipatory user interfaces that are human-centred, built for humans, and based on naturally occurring multimodal human behaviour such as affective and social signalling.

Facial behaviour is our preeminent means of communicating affective and social signals. This talk discusses a number of components of human facial behaviour, how they can be automatically sensed and analysed by computer, the past research in the field conducted by the iBUG group at Imperial College London, and how far we are from enabling computers to understand human facial behaviour.

Disclaimer: There may be mistakes or omissions in the interpretation as the interpreters are not experts in the field of interest and performed a simultaneous translation without comprehensive preparation.
