
01/30/2023

Neural Imaging Reveals Secret Conversational Cues

Complex signals underpin all human verbal communication

Studying human conversation is no simple challenge. When people talk to one another, they coordinate their speech very tightly: they rarely talk over one another, and they rarely leave long silent gaps. A conversation is like a dance with no choreography and no music, spontaneous but structured. To support this coordination, the people in the conversation begin to align their breathing, eye gaze, speech melody, and gestures.

To understand this complexity, studying research participants in a lab looking at computer screens (the traditional setup of psychology experiments) isn't enough. We need to study how people behave naturally in the real world, using novel measurement techniques that capture their neural and physiological responses. For instance, Antonia Hamilton, a neuroscientist at University College London, has recently used motion capture to identify a pattern of very rapid nods that listeners make to show they are paying attention when someone is speaking. Hamilton's work shows that these subtle signals improve the interaction; what is also fascinating is that although speakers perceive and respond to this information, the nods themselves are too small and fast to be discerned by the naked eye.
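As a rough sketch of how such rapid nods might be pulled out of motion-capture data (this is not Hamilton's actual pipeline; the sampling rate, thresholds, and function below are illustrative assumptions), one could subtract a slow baseline from the head-pitch trace and look for small, closely spaced peaks:

```python
import numpy as np
from scipy.signal import find_peaks

def detect_rapid_nods(pitch_deg, fs=120.0, min_amp_deg=0.5):
    """Flag brief, small-amplitude nods in a head-pitch trace from motion capture.

    pitch_deg   : head pitch angle over time, in degrees
    fs          : sampling rate in Hz (120 Hz is a plausible mocap rate; assumption)
    min_amp_deg : amplitude threshold separating nods from sensor noise (assumption)
    """
    # Remove a ~1-second moving-average baseline so slow posture shifts and
    # large deliberate nods are ignored, leaving only the fast component.
    window = int(fs)
    baseline = np.convolve(pitch_deg, np.ones(window) / window, mode="same")
    fast = pitch_deg - baseline

    # Rapid nods show up as small peaks in the fast component; require at least
    # ~100 ms between peaks so noise isn't double-counted.
    peaks, _ = find_peaks(fast, height=min_amp_deg, distance=int(0.1 * fs))
    return peaks / fs  # peak times in seconds
```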

In 2023, we will also finally be able to start capturing neural data while people are moving and talking to each other. This isn't easy: brain imaging techniques such as functional magnetic resonance imaging (fMRI) involve placing participants inside 12-ton brain scanners. A recent study nevertheless managed to do this with a cohort of autistic participants. That paper represents a terrific achievement, but until fMRI machines become much smaller and more mobile, it will not be possible to see how the neural data relates to the pattern of movements and speech in a conversation, ideally for both participants. On the other hand, a different technique, functional near-infrared spectroscopy (fNIRS), can be used while people move around naturally. fNIRS measures the same index of neural activity as fMRI via optodes, which shine light through the scalp and analyze the light that is reflected back. fNIRS has already been deployed while people performed tasks outdoors in central London, showing that the method can gather neural data in parallel with movement and speech data while people interact naturally.
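For readers curious what analyzing the reflected light involves, fNIRS analyses commonly use the modified Beer-Lambert law to convert changes in light attenuation at two or more wavelengths into changes in oxygenated and deoxygenated hemoglobin concentration, the same hemodynamic signal that fMRI tracks. The sketch below is a generic illustration rather than any particular study's pipeline; the extinction coefficients, optode separation, and path-length factor are placeholder values.

```python
import numpy as np

def hemoglobin_changes(delta_od, ext_coeffs, separation_cm=3.0, dpf=6.0):
    """Modified Beer-Lambert law: optical-density changes -> hemoglobin changes.

    delta_od      : (n_wavelengths,) change in optical density at each wavelength
    ext_coeffs    : (n_wavelengths, 2) extinction coefficients for [HbO, HbR]
                    at each wavelength (placeholder units: 1 / (mM * cm))
    separation_cm : source-detector distance on the scalp (placeholder value)
    dpf           : differential path-length factor for scalp, skull, and brain
                    tissue (placeholder value)
    """
    # Modified Beer-Lambert law:
    #   delta_od[i] = (eps_HbO[i] * dHbO + eps_HbR[i] * dHbR) * separation * dpf
    # Solve the resulting linear system (least squares) for the concentration changes.
    path = separation_cm * dpf
    d_conc, *_ = np.linalg.lstsq(ext_coeffs * path, delta_od, rcond=None)
    return d_conc  # [delta HbO, delta HbR], in mM given the placeholder units

# Example with placeholder numbers for two wavelengths (roughly 760 nm and 850 nm):
ext = np.array([
    [1.4, 3.8],  # HbO, HbR extinction at the shorter wavelength (placeholder)
    [2.5, 1.8],  # HbO, HbR extinction at the longer wavelength (placeholder)
])
print(hemoglobin_changes(np.array([0.012, 0.018]), ext))
```

A rise in oxygenated hemoglobin alongside a smaller fall in deoxygenated hemoglobin is the typical signature of local neural activity, which is why the technique can stand in for fMRI while participants walk and talk.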

Read the complete article at WIRED.
