Our brains can detect the location of a touch even when it's not directly on the body, new research shows. The intriguing study indicates that we can sense how an object we're holding comes into contact with something else – almost as if it were an extension of ourselves.
If you’re holding a stick that you then use to tap something else, for example, the brain appears to activate a special set of neural sensors to work out what just happened using the vibration patterns as they’re sent through our nervous system.
Of course, if something we're holding is touched, we can feel the shift in pressure as it's passed on to our fingers – but this latest study shows that we can also pinpoint exactly where on the object the contact occurred.
“The tool is being treated like a sensory extension of your body,” neuroscientist Luke Miller, from the University of Lyon in France, told Richard Sima at Scientific American.
Across 400 different tests, Miller and his colleagues had 16 study participants hold wooden rods, and asked them to try to determine whether two taps on those rods were made in locations close to each other.
And the volunteers were surprisingly good at it: they correctly recognised two touches in close proximity 96 percent of the time.
During the experiments, the researchers were also using electroencephalography (EEG) equipment to record the participants’ brain activity. These scans showed that the brain uses similar neural mechanisms – specifically in the primary somatosensory cortex and the posterior parietal cortex – to detect touches on both our own skin and on objects we’re holding.
We can probably identify the location of a touch on an object before it even stops vibrating, the researchers suggest; based on computer models the team ran as a follow-up to the main experiment, this could happen in as little as 20 milliseconds.
This isn't a completely new idea – think of visually impaired people using a cane to sense what's around them – but no one had previously examined what's happening in the brain in such detail.
It seems that the brain decodes the vibrations as they arrive through certain nerve endings in our skin, known as Pacinian receptors. Using the information from these receptors in our hands, the relevant brain regions can then work out where an object is being hit – and the researchers think we may even have adapted the way we hold tools to get better feedback on what those tools are doing.
One area where this research might be useful is in changing the way prostheses are designed: if we understand how objects between the body and the rest of the world can pass on information to our brain, we might be able to make them work better as sensors.
The work builds on previous research from the same team into how objects can act as extensions of our body, but we now know more about what's going on inside the brain when this weird phenomenon happens.
“We show that tools are fundamental to human behaviour in a previously underappreciated way: they expand the somatosensory boundaries of our body at the neural level,” write the researchers in their published paper.
“Hence, rather than stopping at the skin, our results suggest that somatosensory processing extends beyond the nervous system to include the tools we use.”
The research has been published in Current Biology.