Cognitive dynamics in collaborative spatial tasks

01 August 2019


 

Alexia Galati, Ph.D.
University of North Carolina at Charlotte

In many everyday situations we must consider perspectives distinct from our own, including others’ emotions, perceptions, knowledge, and beliefs. Yet the fundamental cognitive skill of perspective-taking is subject to many underexplored constraints. In this talk, I explored two central debates about perspective-taking by leveraging collaborative spatial tasks.
 
One debate concerned the time-course of perspective-taking: specifically, whether initial processing defaults to the egocentric perspective. I presented work that addresses this question, using eye-tracking and mouse-tracking methods that permit sampling behavior at a fine-grained temporal scale. This work suggests that social and environmental factors predict perspective choices, despite the cognitive cost associated with adopting another’s perspective.
 
Another debate concerned the relationship between perspective-taking and task performance. It is largely unknown which perspective-taking strategies support optimal performance across diverse tasks. For example, although increased behavioral matching (or “alignment”) can benefit performance in tasks that require close monitoring of another’s perspective (e.g., in direction giving), it can be detrimental in tasks requiring foraging for information (e.g., in joint visual search). I described ongoing work that assesses, across different tasks, the relationship between interpersonal coordination and performance outcomes.
 
By identifying signatures of interpersonal coordination that are diagnostic of failures in perspective-taking and predictive of joint performance, we can improve the design of technological interfaces and adaptive tools. I discussed potential applications, ranging from immersive analytics tools that support teams in the sensemaking of complex data, to augmented reality interventions that support the effectiveness of high-stakes missions (e.g., search-and-rescue).
 
Biosketch of Alexia Galati
Dr. Alexia Galati is an Assistant Professor in Psychological Science at the University of North Carolina at Charlotte. She holds a B.A. (Honors) in Psychology from Stanford University, an M.A. in the Social Sciences from The University of Chicago, and a Ph.D. in Experimental Psychology from SUNY Stony Brook. From 2010–2016 she was a post-doctoral researcher at the Experimental Psychology Lab at the University of Cyprus, and from 2016–2018 she was a Marie Skłodowska-Curie fellow at UC Merced’s Cognitive and Information Sciences unit.
 
Dr. Galati studies perspective-taking in socially and environmentally situated contexts. Her work examines how language users keep track of their conversational partners’ perspective, how they adapt their linguistic and non-linguistic behavior to coordinate with their partners, and how successful that coordination ultimately is. She uses a variety of methods, including behavioral experiments, discourse analysis, eye-tracking, mouse-tracking, perceptual paradigms, and virtual reality technology.
 
 

Website: www.alexiagalati.com
