Prof. Emilia Barakova, Social Robotics Lab, Eindhoven University of Technology
Empathy in the interaction between caregivers and persons that cannot self-report: Results of combining physiological and behavioural measurements
In this talk, I will present our research on the emotional well-being of persons with visual impairments and severe intellectual disability, and of persons with dementia. Even trained caregivers cannot always recognise discomfort or pleasure in the expressions of these individuals. We developed a smart sock that measures physiological signals and is connected to a mobile application; it detects and predicts changes in the emotional state of persons with visual impairments and severe intellectual disability and visualises them in a flower application to enhance understanding and communication between patient and caregiver. In another study, we combined physiological measurements and observations to detect positive engagement in persons with dementia while they were involved in activities such as playing with a social robot or cognitive games. We point out the aspects that are common and specific to the different user groups and show how the results of this research line can be applied across user groups and settings.
Frederiks, Kyra, Paula Sterkenburg, Emilia Barakova, and Loe Feijs. "The effects of a bioresponse system on the joint attention behaviour of adults with visual and severe or profound intellectual disabilities and their affective mutuality with their caregivers." Journal of Applied Research in Intellectual Disabilities (2019).
Sterkenburg, P.S., Barakova, E.I., Peters, P.J., Feijs, L.M. and Chen, W., 2017. A bioresponse system for caregivers of adults with severe or profound intellectual disabilities. Journal of Mental Health Research in Intellectual Disabilities, 10(S1), 121-121.
Perugia, G., van Berkel, R., Díaz-Boladeras, M., Català-Mallofré, A., Rauterberg, M. and Barakova, E., 2018. Understanding engagement in dementia through behavior. The ethographic and Laban-inspired coding system of engagement (ELICSE) and the Evidence-based Model of Engagement-related Behavior (EMODEB). Frontiers in Psychology, 9.
Perugia, G., Rodríguez-Martín, D., Boladeras, M.D., Mallofré, A.C., Barakova, E. and Rauterberg, M., 2017, August. Electrodermal activity: explorations in the psychophysiology of engagement with social robots in dementia. In 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) (pp. 1248-1254).
Prof. Hatice Gunes, Department of Computer Science and Technology, University of Cambridge
From one to many: Affect analysis in multi-person settings
Designing intelligent systems and interfaces with socio-emotional skills is a challenging task. Past works have mainly focussed on automatically analysing expressions and affect of people in individual settings. However, when we move from single user settings to multi-user ones, the process of affect analysis calls for new definitions, new datasets with meaningful annotations, and appropriate feature extraction and classification mechanisms in space and time. This talk will question some of the initial assumptions we have made in this area, and will present an overview of the works we have conducted in recent years.
Mou, W., Gunes, H., & Patras, I. (2019) Your Fellows Matter: Affect Analysis across Subjects in Group Videos. Proceedings of the 14th IEEE International Conference on Automatic Face and Gesture Recognition. https://doi.org/10.17863/CAM.37358
Mou, W., Gunes, H., & Patras, I. (2019) Alone vs In-a-group: A Multi-modal Framework for Automatic Affect Recognition. ACM Transactions on Multimedia Computing, Communications and Applications. https://doi.org/10.17863/CAM.37359
Mou, W., Gunes, H., & Patras, I. (2016). Alone versus In-a-group: A Comparative Analysis of Facial Affect Recognition. Proceedings of the ACM Multimedia Conference, 521-525. https://doi.org/10.1145/2964284.2967276
Mou, W., Celiktutan, O., & Gunes, H. (2015). Group-level arousal and valence recognition in static images: Face, body and context. Proceedings of the International Conference on Automatic Face and Gesture Recognition Workshops.
Mou, W., Tzelepis, C., Mezaris, V., Gunes, H., & Patras, I. (2019). A deep generic to specific recognition model for group membership analysis using non-verbal cues. Image and Vision Computing, 81, 42-50. https://doi.org/10.1016/j.imavis.2018.09.005