
Cognitive dynamics in collaborative spatial tasks

01 August 2019


 

Alexia Galati, Ph.D.
University of North Carolina at Charlotte

In many everyday situations we must consider perspectives distinct from our own, including others’ emotions, perceptions, knowledge, and beliefs. Yet the fundamental cognitive skill of perspective-taking is subject to many underexplored constraints. In this talk, I explored two central debates about perspective-taking by leveraging collaborative spatial tasks.
 
One debate concerns the time-course of perspective-taking: specifically, whether initial processing defaults to the egocentric perspective. I presented work that addresses this question, using eye-tracking and mouse-tracking methods that permit sampling behavior at a fine-grained temporal scale. This work suggests that social and environmental factors predict perspective choices, despite the cognitive cost associated with adopting another’s perspective.
 
Another debate concerns the relationship between perspective-taking and task performance. It is largely unknown which perspective-taking strategies support optimal performance across diverse tasks. For example, although increased behavioral matching (or “alignment”) can benefit performance in tasks that require close monitoring of another’s perspective (e.g., in direction giving), it can be detrimental in tasks that require foraging for information (e.g., in joint visual search). I described ongoing work that assesses, across different tasks, the relationship between interpersonal coordination and performance outcomes.
 
By identifying signatures of interpersonal coordination that are diagnostic of failures in perspective-taking and predictive of joint performance, we can improve the design of technological interfaces and adaptive tools. I discussed potential applications, ranging from immersive analytics tools that support the sensemaking of complex data in teams, to augmented reality interventions that support the effectiveness of high-stakes missions (e.g., search and rescue).
 
 
 
Biosketch of Alexia Galati
Dr. Alexia Galati is an Assistant Professor in Psychological Science at the University of North Carolina at Charlotte. She holds a B.A. (Honors) in Psychology from Stanford University, an M.A. in the Social Sciences from The University of Chicago, and a Ph.D. in Experimental Psychology from SUNY Stony Brook. From 2010 to 2016 she was a post-doctoral researcher at the Experimental Psychology Lab at the University of Cyprus, and from 2016 to 2018 she was a Marie Skłodowska-Curie fellow at UC Merced’s Cognitive and Information Sciences unit.
 
Dr. Galati studies perspective-taking in socially and environmentally situated contexts. Her work examines how language users keep track of their conversational partners’ perspective, how they adapt their linguistic and non-linguistic behavior to coordinate with their partners, and how successful that coordination ultimately is. She uses a variety of methods, including behavioral experiments, discourse analysis, eye-tracking, mouse-tracking, perceptual paradigms, and virtual reality technology.
 
 

Website: www.alexiagalati.com
