Holding a human conversation is no simple feat. When people talk to each other, they coordinate their speech very tightly: speakers rarely talk over one another, and they rarely leave long silent gaps between turns. A dialogue is like a dance without music, spontaneous yet structured. To support this coordination, conversational partners align their breathing, their eye contact, their speech intonation, and their gestures.
To understand this coordination, simply studying participants sitting in a laboratory looking at computer screens, the traditional setting of psychological experiments, is not enough. We need to study how people behave naturally in the real world, using new measurement techniques that capture their neural and physiological responses. For example, Antonia Hamilton, a neuroscientist at University College London, recently used motion capture to identify a pattern of very quick nods that listeners make to show they are paying attention when someone is speaking. Hamilton showed that communication is enhanced by these subtle signals; the fascinating thing is that although speakers evidently pick up on this information, the nods are invisible to the naked eye.
In 2023, we’ll start capturing neural data as people move around and talk to each other. This is not easy: brain imaging techniques such as functional magnetic resonance imaging (fMRI) involve placing participants inside brain scanners that weigh around 12 tons. A recent study nevertheless managed it with a group of autistic participants. That paper represents an impressive achievement, but until fMRI machines become much smaller and more mobile, we won’t be able to relate the neural data to the movements and speech patterns of a conversation as it unfolds. A different technique, functional near-infrared spectroscopy (fNIRS), can be used while people move around naturally. fNIRS measures the same index of neural activity as fMRI via optodes, which shine light through the scalp and analyze the reflected light. fNIRS has already been used while people performed tasks outdoors in central London, demonstrating that the method can collect neural data alongside movement and speech data as people interact naturally.
In 2023, we will see for the first time how this works in larger group conversations, which naturally hit their limit at around five people. Because conversations are so flexible and open-ended, this is a major challenge, but it is essential if we want to understand how participants’ brains coordinate these finely timed conversational dances.
These advances will be major steps forward in the scientific study of human conversation, one of the most fascinating areas of cognitive neuroscience and psychology. Of course, I’m a bit biased: I’ve studied human speech perception and production for decades, and I believe our linguistic, social, and emotional brain processes converge in conversation. Conversations are universal, and they are the main means by which humans manage social interactions and connections; they are vitally important for our mental and physical health. If we could fully demystify the science of conversations, we would go a long way toward understanding ourselves.