Looking at / Sensing People

Place: Large lecture room.

Affiliation: Queen Mary University of London, U.K.

Recent years have witnessed an unprecedented interest in the recognition of human behaviour, from the analysis of facial expressions and body gestures to neurophysiological signals such as EEG, skin conductivity and heart rate. In this talk, we will present some of our recent work in these three areas. First, we will present our work on face alignment in the wild, that is, face alignment on images taken in unconstrained environments with occlusion, clutter, and large variations in pose, expression and appearance. We will focus on a recent work on this problem ("Mirror, mirror on the wall, tell me, is the error small?", Heng Yang and Ioannis Patras, CVPR 2015). We will show that a number of state-of-the-art methods for the localisation of object parts (facial feature and body part localisation) produce results that are not bilaterally symmetrical, i.e. the results on the mirror image are not mirrored versions of the results on the original. We show that this is not caused by training or testing sample bias: all algorithms are trained on both the original images and their mirrored versions. We then show that the difference between the results on the mirrored image and on the original one is strongly correlated with the ground-truth estimation error. We show two interesting applications: in the first, it is used to guide the selection of difficult samples, and in the second, to give feedback in a popular Cascaded Pose Regression method for face alignment. Finally, we will give a brief overview of our recent activities on the multi-modal analysis of neurophysiological signals with the goal of estimating people's emotional / affective states.
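
To illustrate the mirror-consistency idea, a minimal sketch of one way such a measure could be computed is given below; it is not the paper's exact formulation. The detect_landmarks function and the flip_pairs list are placeholders for an arbitrary face-alignment model and for the left/right landmark index pairs of its annotation scheme.

    import numpy as np

    # Placeholder detector interface: returns an (N, 2) array of (x, y) landmark
    # coordinates for an H x W x 3 image. Any face-alignment model could stand in here.
    def detect_landmarks(image: np.ndarray) -> np.ndarray:
        raise NotImplementedError("plug in a face-alignment model")

    def mirror_error(image: np.ndarray, flip_pairs: list) -> float:
        """Mirror-consistency measure: run the detector on the image and on its
        horizontally flipped copy, map the second result back into the original
        frame, and return the mean point-to-point distance between the two."""
        h, w = image.shape[:2]

        pts = detect_landmarks(image)                 # prediction on the original image
        pts_flip = detect_landmarks(image[:, ::-1])   # prediction on the mirrored image

        # Map the mirrored prediction back: reflect the x-coordinates ...
        pts_back = pts_flip.copy()
        pts_back[:, 0] = (w - 1) - pts_back[:, 0]

        # ... and swap left/right landmark indices (e.g. left eye corner <-> right
        # eye corner), so that landmark i in pts_back corresponds to landmark i in pts.
        for i, j in flip_pairs:
            pts_back[[i, j]] = pts_back[[j, i]]

        # Mean Euclidean distance; zero for a perfectly mirror-symmetric detector.
        return float(np.mean(np.linalg.norm(pts - pts_back, axis=1)))

In this sketch, a large value of mirror_error flags an image on which the detector is likely to have a large (unknown) ground-truth error, which is the kind of signal used in the two applications mentioned above.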