Imagine feeling touch sensations, such as pressure and tingling, in response to everyday noises, like the sounds of running liquid, laughter and computer beeps. Though it may seem strange, this is the case for a woman who suffered damage to part of her thalamus, a region deep in the brain.
We often think of humans as having five senses: sight, hearing, touch, smell and taste. Yet even within one of these categories, there are different types of senses. For example, within the sense of sight, there are distinct receptors and perceptions for color and brightness. Furthermore, the overall perception we have of our environment is usually a combination of multiple sensory modalities. Take, for example, what we refer to as a food’s flavor, which involves chemicals activating receptors involved in both our sense of smell and our sense of taste. Although most of us do not feel sounds as touch sensations, it is not difficult to appreciate that our brains must integrate various inputs for us to experience the world.
With our sense of touch, we can perceive spatial information, such as the edge of a box or the angle at which a metal key sits in the palm of our hand, as well as temporal information, such as when a smartphone vibrates in our pocket. When he was a postdoctoral fellow in the Johns Hopkins University School of Medicine’s Department of Neurology, Jeffrey Yau and his co-authors published a study in which they used a technique called transcranial direct current stimulation to investigate how brain areas involved in vision and hearing contribute to spatial and temporal touch perception.
Transcranial direct current stimulation is a noninvasive technique that involves placing electrodes on the scalp and passing a very small current between them, which stimulates the brain tissue directly beneath the electrodes. Yau explains that the group wanted to find out whether there was a way to manipulate the sense of touch and improve these separate aspects (spatial and temporal) of it. Beyond that, they also wondered how the senses worked together, asking what the relationship is between the sense of touch and other sensory modalities.
“If we think of what the visual system is very good at, it’s processing shape or spatial information,” says Yau. And, indeed, emerging evidence in the literature suggests there are neural interactions between visual processing and spatial touch information. Similarly, Yau highlights an interaction between the auditory and temporal touch systems. “The auditory system is very good at processing temporal information; for example, the frequency of sounds you hear.” From this, it would make sense that auditory brain systems interact with temporal touch information. Along those lines, Yau and his colleagues found that stimulation over the visual cortex — but not auditory cortex — enhanced touch perception in a test that involved spatial orientation. Also, stimulation over the auditory cortex, not visual cortex, improved touch perception in a temporal frequency test.
In a clinical scenario where a given sensory system is impaired, Yau proposes that rather than focusing only on that one sensory modality, a more powerful rehabilitation approach may be one based on the understanding that the senses are not strictly segregated. For example, to recover visual function, exploiting the intact auditory and tactile systems may provide a complementary strategy.
Findings like these contribute to our understanding of how the brain is organized. In any of our everyday activities, our neural system takes in information from many different stimuli — the brightness of this screen, the temperature around you, your posture — so it follows that there must be neural cross-talk among the different sensory modalities through which we perceive and experience the world.