Tag Archives: robots

Burcu Aysen Urgen Dissertation Defense 9/17 11am – Perception of Human and Robot Actions

Dissertation Defense

Spatio-temporal Neuroimaging of Visual Processing of Human and Robot Actions in Humans


Thursday, Sept 17, 2015, 11 am

Cognitive Science Building, Room 003

Successfully perceiving and recognizing the actions of others is of utmost importance for the survival of many species. For humans, action perception is considered to support important higher-order social skills, such as communication, intention understanding, and empathy, some of which may be uniquely human. Over the last two decades, neurophysiological and neuroimaging studies in primates have identified a network of brain regions in occipito-temporal, parietal, and premotor cortex that are associated with the perception of actions, also known as the Action Observation Network. Despite a growing body of literature, the functional properties and connectivity patterns of this network remain largely unknown.

The goal of this dissertation is to address these general questions about functional properties and connectivity patterns, with a specific focus on whether this system shows specificity for biological agents. To this end, we collaborated with a robotics lab and manipulated the humanlikeness of agents that perform recognizable actions by varying visual appearance and movement kinematics. We then used a range of measurement modalities, including cortical EEG oscillations, event-related brain potentials (ERPs), and fMRI, together with a range of analytical techniques, including pattern classification, representational similarity analysis (RSA), and dynamic causal modeling (DCM), to study the functional properties, temporal dynamics, and connectivity patterns of the Action Observation Network.

While our findings shed light on whether the human brain shows specificity for biological agents, the interdisciplinary work with robotics also allowed us to address questions about human factors in artificial agent design for social robotics and human-robot interaction, such as the uncanny valley, which concerns what kinds of robots we should design so that humans can readily accept them as social partners.

Please join us and find out more about some of Burcu’s exciting and interdisciplinary work in the lab!

New paper accepted!


Saygin, A.P., Chaminade, T., Ishiguro, H., Driver, J. & Frith, C. (2011) The thing that should not be: Predictive coding and the uncanny valley in perceiving human and humanoid robot actions. Social Cognitive and Affective Neuroscience. In Press. PDF

Abstract: Using fMRI repetition suppression, we explored the selectivity of the human action perception system (APS), which consists of temporal, parietal, and frontal areas, for the appearance and/or motion of the perceived agent. Participants watched body movements of a human (biological appearance and movement), a robot (mechanical appearance and movement), or an android (biological appearance, mechanical movement). With the exception of extrastriate body area, which showed more suppression for humanlike appearance, the APS was not selective for appearance or motion per se. Instead, distinctive responses were found to the mismatch between appearance and motion: whereas suppression effects for the human and robot were similar to each other, they were stronger for the android, notably in bilateral anterior intraparietal sulcus, a key node in the APS. These results could reflect increased prediction error as the brain negotiates an agent that appears human, but does not move biologically, and help explain the “uncanny valley” phenomenon.

In addition to the really interesting data, this paper is extra-cool because we have the following citation:

    Lovecraft, H. P. (1936). The Shadow Over Innsmouth. In The Dunwich Horror and Others, S. T. Joshi, ed. Sauk City, WI: Arkham House.

And even better:

    Hetfield, J., Ulrich, L. and Hammett, K. (1986). The Thing That Should Not Be. In Master of Puppets, Elektra Records.

Yes, yes, I cited a song.

I will write a more detailed post about this study soon. Until then, enjoy! |m|

Figure 2
