The Pixels and Sounds of Emotion
What if we could detect emotions in a general, user-agnostic fashion? Is it possible to capture human emotion solely by looking at the pixels on the screen and hearing the sounds of the interaction? No sensors, no access to biometrics, facial expressions, or speech!
We are thrilled to present our new IEEE Transactions on Affective Computing paper (with Georgios Yannakakis and Antonios Liapis from the Institute of Digital Games, University of Malta), in which we transfer the idea of general-purpose deep representations from game playing to affective computing. We show that arousal can be predicted from audiovisual game footage alone across four very different games, with top accuracies as high as 85% under the demanding leave-one-video-out validation scheme.
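For readers unfamiliar with the evaluation protocol: leave-one-video-out validation means the model is trained on all videos except one and tested on the held-out video, repeating the process for every video, so no frames from the test video ever leak into training. The sketch below illustrates the splitting logic in plain Python; the sample data and the threshold "model" are purely hypothetical stand-ins, not the paper's actual features or architecture.

```python
from collections import defaultdict

def leave_one_video_out(samples):
    """Yield (held_out_id, train, test) splits where each test set
    contains every sample from exactly one video."""
    by_video = defaultdict(list)
    for s in samples:
        by_video[s["video"]].append(s)
    for held_out in by_video:
        test = by_video[held_out]
        train = [s for vid, group in by_video.items()
                 if vid != held_out for s in group]
        yield held_out, train, test

# Hypothetical toy data: one scalar feature per frame plus a binary
# (low/high) arousal label -- for illustration only.
samples = [
    {"video": "A", "x": 0.1, "y": 0},
    {"video": "A", "x": 0.2, "y": 0},
    {"video": "B", "x": 0.8, "y": 1},
    {"video": "B", "x": 0.9, "y": 1},
    {"video": "C", "x": 0.3, "y": 0},
]

for held_out, train, test in leave_one_video_out(samples):
    # Stand-in "model": classify by comparing against the mean
    # training feature value (a real system would train a deep net).
    threshold = sum(s["x"] for s in train) / len(train)
    accuracy = sum((s["x"] > threshold) == bool(s["y"])
                   for s in test) / len(test)
    print(f"held out video {held_out}: accuracy {accuracy:.2f}")
```

The key property is that the split is made at the video level, not the frame level: temporally adjacent frames from one video are highly correlated, so a frame-level split would inflate accuracy.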
Exploring the next frontier of AI and ML at the Institute of Digital Games
Dr Konstantinos Makantasis, a post-doctoral researcher in Artificial Intelligence at the Institute of Digital Games, has been awarded a prestigious Marie Skłodowska-Curie Individual Fellowship for his project Tamed (Tensor-bAsed Machine learning towards genEral moDels of affect). Tamed aims to create new methods and algorithms to realise aspects of general emotional intelligence (that is, the ability to understand and manage your own emotions and those of the people around you), one of the core long-term goals of artificial intelligence and artificial psychology.