Face-to-face communication between humans involves emotions, which are often conveyed unconsciously by facial expressions and body gestures. Intelligent human-machine interfaces, for example in cognitive robotics, need to recognize emotions. This paper addresses facial expressions and their neural correlates on the basis of a model of the visual cortex: multi-scale line and edge coding. The recognition model links the cortical representation to Paul Ekman's Action Units, which correspond to the different facial muscles. The model applies a top-down categorization using the trends and magnitudes of displacements of the mouth and eyebrows, based on expected displacements relative to a neutral expression. The happy vs. not-happy categorization yielded a correct recognition rate of 91%, whereas final recognition of the six expressions happy, anger, disgust, fear, sadness and surprise resulted in a rate of 78%.
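As a rough illustration of the top-down, displacement-based categorization described in the abstract, a minimal Python sketch is given below. The feature names, thresholds, and rule ordering are illustrative assumptions only, not the authors' implementation; the paper derives displacements from the cortical line/edge representation, which is not reproduced here.

```python
# Minimal sketch of a two-stage, displacement-based categorization relative to a
# neutral expression. All feature names and thresholds are hypothetical.
from dataclasses import dataclass


@dataclass
class FaceDisplacements:
    """Signed displacements (in pixels) relative to the neutral expression."""
    mouth_corner_dy: float   # positive = mouth corners raised
    mouth_open_dy: float     # positive = mouth opened wider
    eyebrow_inner_dy: float  # positive = inner eyebrows raised


def categorize(d: FaceDisplacements) -> str:
    """Stage 1: happy vs. not-happy; stage 2: coarse split of the remaining
    expressions by expected displacement trends (illustrative rules)."""
    # Stage 1: raised mouth corners dominate the happy decision.
    if d.mouth_corner_dy > 2.0:
        return "happy"
    # Stage 2: remaining expressions, ordered by assumed displacement patterns.
    if d.eyebrow_inner_dy > 2.0 and d.mouth_open_dy > 3.0:
        return "surprise"
    if d.eyebrow_inner_dy > 2.0:
        return "fear"
    if d.eyebrow_inner_dy < -2.0 and d.mouth_corner_dy < -1.0:
        return "anger"
    if d.mouth_corner_dy < -1.0:
        return "sadness"
    return "disgust"


if __name__ == "__main__":
    # Raised mouth corners, slight opening, nearly neutral brows -> "happy".
    print(categorize(FaceDisplacements(4.0, 1.0, 0.5)))
```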
Recognition of Facial Expressions by Cortical Multi-scale Line and Edge Coding
28.05.2013
Conference paper
Electronic resource
English
DDC: 629
Hierarchical multi-stream recognition of facial expressions
British Library Conference Proceedings | 2003
Automatic recognition of human facial expressions
IEEE | 1995
Automatic Recognition of Human Facial Expressions
British Library Conference Proceedings | 1995
Real-time recognition of 6 basic facial expressions
British Library Online Contents | 1996
Generating Realistic Facial Expressions with Wrinkles for Model-Based Coding
British Library Online Contents | 2001