Category Archives: Science

Burcu Aysen Urgen Dissertation Defense 9/17 11am – Perception of Human and Robot Actions

Dissertation Defense

Spatio-temporal Neuroimaging of Visual Processing of Human and Robot Actions in Humans

BURCU AYSEN URGEN

Thursday, Sept 17, 2015, 11 am

Cognitive Science Building, Room 003

Successfully perceiving and recognizing the actions of others is of utmost importance for the survival of many species. For humans, action perception is considered to support important higher order social skills, such as communication, intention understanding and empathy, some of which may be uniquely human. Over the last two decades, neurophysiological and neuroimaging studies in primates have identified a network of brain regions in occipito-temporal, parietal and premotor cortex that are associated with perception of actions, also known as the Action Observation Network. Despite a growing body of literature, the functional properties and connectivity patterns of this network remain largely unknown.

The goal of this dissertation is to address these general questions about functional properties and connectivity patterns, with a specific focus on whether this system shows specificity for biological agents. To this end, we collaborated with a robotics lab and manipulated the humanlikeness of agents that perform recognizable actions by varying visual appearance and movement kinematics. We then used a range of measurement modalities including cortical EEG oscillations, event-related brain potentials (ERPs), and fMRI, together with a range of analytical techniques including pattern classification, representational similarity analysis (RSA), and dynamic causal modeling (DCM), to study the functional properties, temporal dynamics, and connectivity patterns of the Action Observation Network.

While our findings shed light on whether the human brain shows specificity for biological agents, the interdisciplinary work with robotics also allowed us to address questions regarding human factors in artificial agent design in social robotics and human-robot interaction, such as the uncanny valley: what kinds of robots should we design so that humans can easily accept them as social partners?

Please join us and find out more about some of Burcu’s exciting and interdisciplinary work in the lab!

Luke Miller Dissertation Defense 9/16 11am – Tool Use and Body Representations

Dissertation Defense

The Body in Flux: Tool Use Modulates Multisensory Body Representations

LUKE E. MILLER

Wednesday, Sept 16, 2015, 11 am

Cognitive Science Building, Room 003

Tool use is a hallmark of the human species and an essential aspect of daily life. Tools serve to functionally extend the body, allowing the user to overcome physical limitations and interact with the environment in previously impossible ways. Tool-body interactions lead to significant modulation in the user’s representations of body size, a phenomenon known as tool embodiment. In the present dissertation, I used psychophysics and event-related brain potentials to investigate several aspects of tool embodiment that are otherwise poorly understood.

First, we investigated the sensory boundary conditions of tool embodiment, specifically the role of visual feedback during tool use. In several studies, we demonstrate that visual feedback of tool use is a critical driver of tool embodiment. In one such study, we find that participants can embody a visual illusion of tool use, suggesting that visual feedback may be sufficient for tool-induced plasticity.

Second, we investigated the level of representation modulated by tool use. Is embodiment confined to sensorimotor body representations, as several researchers have claimed, or can it extend to levels of self-representation (often called the body image)? Utilizing well-established psychophysical tasks, we found that using a tool modulated the body image in a similar manner to sensorimotor representations. This finding suggests that similar embodiment mechanisms are involved at multiple levels of body representation.

Third, we used event-related brain potentials to investigate the electrophysiological correlates of tool embodiment. Several studies with tool-trained macaques have implicated multisensory stages of somatosensory processing in embodiment. Whether the same is true for humans is unknown. Consistent with what is found in macaques, we found that using a tool modulates an ERP component (the P100) thought to index the multisensory representation of the body.

The work presented in this dissertation advances our understanding of tool embodiment, both at the behavioral and neural level, and opens up novel avenues of research.

Please join us and find out more about some of Luke’s exciting research in the lab over the past few years!

Why You Should Do Research as an Undergrad

Hello! This is Cindy, and I am a research assistant at the Saygin Lab of Cognitive Neuroscience and Neuropsychology. I’ve been doing this for a few quarters now, and it’s been a great experience. You should join me!

Being a research assistant for the Saygin Lab really solidified why I became a cognitive science major. I switched into it not really knowing what my major was about, but through the weekly lab meetings and conversing with others, I came to realize how lucky I was to have stumbled upon UCSD’s diamond in the rough.

I already knew about the work our lab does because I am a chronic frequenter of geek websites, such as Cracked.com. Although perhaps not the most established forum, it often brings up sci-fi related research, such as the work being done in the lab. Robots are a common topic, and the phenomenon of the Uncanny Valley inevitably came up. Imagine how surprised I was when, reading up on the research opportunities in the CogSci department at UCSD, I found that that research had been done by one of the professors! I sent in an application and was ecstatic when I was accepted.

I didn’t know what I was getting myself into, since my pre-med friends were always going on about how they created buffer solutions in chem research or cut up rats in bio research. It turns out that the experiments I was running for my grad student researcher were most like psychology research. Luke (the grad student) had us running a two-session experiment, two hours each, which was not the most interesting thing I’ve ever done. The exciting part came when I talked to him about what I was collecting data for. The overarching theme was embodied cognition and how we project ourselves in space. For example, how do amputees see and feel about their prosthetics? A prosthetic is not technically a part of them, yet it should ideally function just as well as, or even better than, the limb it replaces. He went on to do some research involving a large plastic hand, similar to Edward Scissorhands. I did some outside reading and thought that one particular invention, a prosthetic arm that biomimics an octopus arm, would yield interesting results when pitted against conventional prosthetic arms in terms of functionality.

How would these affect the self-perception of a human?

 

Lab members attend mini-symposiums and conferences about where advances in cognitive science could lead (with regard to technology). One that I was able to attend concerned the future of brain-computer interfaces, or BCIs. Most of the technical engineering talk was lost on me, but the main thing I came away with was that pretty soon, we will be able to manipulate objects with our “minds.” This led me to do some other outside research, and I found this prototype fashion accessory being demonstrated at conventions around the world: Necomimi Cat Ears. Its claim to fame is that it changes shape when the wearer is focused or relaxed. This may seem like an extraordinarily trivial application of the technology, but the concept is the same as in more serious uses of BCIs: EEG signals produced by the brain are used to control objects that aren’t necessarily part of the body.

An application of BCI technology: Wiggle your cat ears with your brain waves!
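To make that concept concrete, here is a toy sketch of the idea: score "focus" from EEG-like samples and map it onto an actuator command. Everything here is a simplification I made up for illustration (the beta/alpha power ratio as a focus score, the `perk`/`droop` commands, the naive DFT), not Necomimi's actual algorithm.

```python
import math

def attention_index(samples, fs=256.0):
    """Crude 'focus' score: ratio of beta-band (13-30 Hz) to alpha-band
    (8-12 Hz) power, estimated with a naive discrete Fourier transform.
    Purely illustrative -- real BCIs use far more robust signal processing."""
    n = len(samples)

    def band_power(lo, hi):
        power = 0.0
        for k in range(1, n // 2):
            freq = k * fs / n
            if lo <= freq <= hi:
                re = sum(s * math.cos(2 * math.pi * k * i / n)
                         for i, s in enumerate(samples))
                im = sum(s * math.sin(2 * math.pi * k * i / n)
                         for i, s in enumerate(samples))
                power += re * re + im * im
        return power

    alpha = band_power(8, 12)
    beta = band_power(13, 30)
    return beta / (alpha + 1e-12)

def ear_state(samples, threshold=1.0):
    """Map the focus score onto a binary actuator command."""
    return "perk" if attention_index(samples) > threshold else "droop"

fs, n = 256.0, 256
# One second dominated by a 10 Hz alpha rhythm (relaxed wearer)...
relaxed = [math.sin(2 * math.pi * 10 * i / fs) for i in range(n)]
# ...and one dominated by a 20 Hz beta rhythm (focused wearer).
focused = [math.sin(2 * math.pi * 20 * i / fs) for i in range(n)]

print(ear_state(relaxed))  # droop
print(ear_state(focused))  # perk
```

The point is only the pipeline: brain signal in, frequency-domain feature out, device command at the end.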

The coolest thing about researching in this lab while being a CogSci major was probably seeing how this current research was being taught in our classes. The fMRI book we discuss in meetings has direct relevance to what I’m learning in my neuroscience class, and knowing that there’s still much debate about whether cognition is distributed or specialized puts a whole different spin on my distributed cognition class. It’s also opened up a lot of doors. This summer, I advised a couple of high school students getting their first taste of research (we will have another blog post about that soon), and during the school year I will be completing the Cognitive Science Honors Program.

I hope this has convinced you to do research as an undergrad! Use your four years wisely.

Saygin Lab Taco and "Spinal Tap" Viewing Party

Some members of Saygin Lab - Back row: Edward Nguyen (undergraduate), Maria Florendo (undergraduate), Ayse Saygin (professor), Angela Chan (undergraduate, now graduated), Burcu Urgen (graduate student). Front row: Jingwei Li (undergraduate, now graduated), Luke Miller (graduate student), Cindy Ha (undergraduate)

Is Biological Motion Special? Part 1: Multiple Object Tracking (MOT)

Multiple People Tracking?

Biological motion processing is fundamental to many important tasks, from hunting prey to social interaction. Given the evolutionary importance of the domain, does biological motion receive preferential perceptual and neural processing? If so, at which processing levels do we see evidence for special treatment? Researching these questions can not only improve our understanding of biological motion perception, but can also shed light on the nature of computations at different levels of visual processing.

Earlier this year, we published a paper that used the Multiple Object Tracking (MOT) paradigm from vision science along with point-light biological motion stimuli to study whether attentional selection might be specialized for tracking biological targets.

Example point-light biological motion stimulus

“Imagine a primitive hunting party on the savannah stalking four weak gazelle amongst a larger herd” (Tombu & Seiffert, 2008). Or consider playing basketball, trying to remain mindful of the whereabouts of your teammates. Or trying to keep track of your friends at the July sales at Selfridges (shoes on sale!). These examples show that tracking other animate entities is both commonplace and important. We wondered if this ecological importance would be reflected in performance in the MOT paradigm. We hypothesized that people would track biological motion more effectively than inverted or scrambled control stimuli.
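For context on what "tracking more effectively" means in an MOT design, here is a quick sketch of the chance-level baseline such accuracies are measured against. The numbers (8 dots, 4 of them cued as targets) are illustrative stand-ins, not necessarily the exact design of the study:

```python
import random

def simulate_guess_trial(n_items=8, n_targets=4, rng=random):
    """One MOT trial where the 'observer' has lost track entirely and simply
    guesses: picks n_targets of the n_items at random when the trial ends.
    Returns the proportion of true targets recovered."""
    targets = set(range(n_targets))                 # items 0..3 were cued red
    guess = set(rng.sample(range(n_items), n_targets))
    return len(targets & guess) / n_targets

def chance_accuracy(n_trials=20000, seed=0):
    """Monte Carlo estimate of guessing accuracy over many trials."""
    rng = random.Random(seed)
    return sum(simulate_guess_trial(rng=rng) for _ in range(n_trials)) / n_trials

print(round(chance_accuracy(), 2))  # ~0.5 for 4 targets among 8 items
```

With 4 targets among 8 items, pure guessing recovers half the targets on average, so any condition difference of interest has to show up well above that floor.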

Videos depicting the trials for the biological motion experiment. Subjects tracked the targets coloured in red at the start. Video speed may not be exactly the same as the experimental trials.

And this was indeed what we found. But is this advantage due to the biologicalness of the targets per se? Or could it be because biological motion is a structured, coherent object, whereas the control stimuli are perceived as swirling groups of dots? In the next experiment, we added a non-biological stimulus type that also has a canonical orientation: subjects tracked the letter ‘R’, also composed of point-lights, presented upright, inverted or scrambled. Although inversion did not affect MOT performance for tracking letters, scrambling led to reduced tracking performance. Thus overall, while there may be some special role for biological motion in MOT (given the inversion effect), it appears that structured objects, regardless of their biologicalness, are tracked better than unstructured ones.

Video for the letter condition. Subjects tracked the targets coloured in red at the start. Video speed may not be exactly the same as the experimental trials.

We next wondered what other naturalistic aspects of biological stimuli MOT might be sensitive to. Biological objects typically move in a manner that is consistent with the action they are performing. For example, if someone is facing leftward, they will in general be walking in that direction as well. We tested MOT with point-light walkers that walked naturally from one side of the screen to the other, and walkers that faced one direction but moved in the other, i.e., moonwalking. The results showed absolutely no difference in participants’ ability to track natural walkers versus moonwalkers.

Videos for the walker and moonwalker conditions. Subjects tracked the targets coloured in red at the start. Video speed may not be exactly the same as the experimental trials.

We also tested MOT using faces, another class of biological stimuli where inversion effects have been clearly demonstrated. In contrast to the advantage found for upright biological motion figures in MOT, here, inverted faces were more accurately tracked. Thus biological motion and face inversion effects were double dissociated in terms of their effects on MOT.

We discuss these results in detail in the paper, which is freely available online. To summarize, along with other recent studies, we found that the nature of the tracked objects in MOT does matter. MOT has been suggested to operate on ‘proto-objects’ extracted in early vision in a manner that is entirely encapsulated from higher-level representations (Pylyshyn, 2001). Our results, along with other work, argue against full encapsulation. Since MOT shows sensitivity to higher-level aspects of the stimuli, it either has access to computations performed in higher areas, or these computations are fed back to the levels at which MOT operates.

While MOT is sensitive to certain aspects of “biologicalness”, there does not appear to be a strong specialization for tracking biological, naturalistic, or ecologically valid stimuli. Instead, the results can be framed in terms of the extent to which stimuli can be segmented, grouped and selected as targets of object-based attention. Effectively grouped, coherent objects, but not necessarily biological objects, are tracked most successfully.

In terms of attentional selection and tracking, biological motion does receive some preferential treatment, but this seems largely due to an advantage for structured objects in general. In other words, when it comes to MOT, biological motion is not alone in the VIP section.

De-Wit, L., Lefevre, C., Kentridge, R., Rees, G. & Saygin, A.P. (2011). Investigating the status of biological stimuli as objects of attention in multiple object tracking. PLoS ONE, 6(3): e16232.

End of Quarter: More news from the lab

We have reached the end of Spring quarter and the lab has been busy.

On June 6, Cognitive Science graduate students presented their 2nd year research projects. Burcu A. Urgen presented a project from our lab entitled “Temporal Dynamics of Human Cortical Motor Activity during Action Observation: The Effect of Actor Appearance and Movement Kinematics”. On June 9, Angela Chan presented her undergraduate honors thesis entitled “Biological Motion as a Cue for Attention”. Burcu was supported by a research fellowship from Calit2, and Angela has been awarded a summer undergraduate research fellowship also from Calit2. Both students will continue working on their projects this summer. Congratulations to all the students who successfully presented their work this week!

Speaking of undergraduate research, Arthur Vigil has been awarded the Chancellor’s Undergraduate Research Scholarship to work in the lab for the next year. Congratulations Arthur!

We all hope to go on to have a productive summer – but now it’s time for a well-deserved break!

Professor Saygin and Angela Chan after Angela's honors thesis presentation

New paper accepted!

New paper!

Saygin, A.P., Chaminade, T., Ishiguro, H., Driver, J. & Frith, C. (2011) The thing that should not be: Predictive coding and the uncanny valley in perceiving human and humanoid robot actions. Social Cognitive and Affective Neuroscience. In Press. PDF

Abstract: Using fMRI repetition suppression, we explored the selectivity of the human action perception system (APS), which consists of temporal, parietal, and frontal areas, for the appearance and/or motion of the perceived agent. Participants watched body movements of a human (biological appearance and movement), a robot (mechanical appearance and movement), or an android (biological appearance, mechanical movement). With the exception of extrastriate body area, which showed more suppression for humanlike appearance, the APS was not selective for appearance or motion per se. Instead, distinctive responses were found to the mismatch between appearance and motion: whereas suppression effects for the human and robot were similar to each other, they were stronger for the android, notably in bilateral anterior intraparietal sulcus, a key node in the APS. These results could reflect increased prediction error as the brain negotiates an agent that appears human, but does not move biologically, and help explain the “uncanny valley” phenomenon.

In addition to the really interesting data, this paper is extra-cool because we have the following citation:

    Lovecraft, H. P. (1936). The Shadow Over Innsmouth. In The Dunwich Horror and Others, S. T. Joshi, ed. Sauk City, WI: Arkham House.

And even better:

    Hetfield, J., Ulrich, L. and Hammett, K. (1986). The Thing That Should Not Be. In Master of Puppets, Electra Records.

Yes, yes I cited a song:

I will write a more detailed post about this study soon. Until then, enjoy! |m|

Figure 2