PHOTOGRAPH BY B. ANTHONY STEWART, NATIONAL GEOGRAPHIC
Published March 12, 2014
Next time something you hear goes in one ear and out the other, you have a built-in excuse. Just blame it on your Achilles' ear—a weakness that lies not in a mythical hero's heel, but in the real-life way the brain processes sound and memory.
That's the suggestion of a University of Iowa study comparing how well we recall something, depending on whether we see it, hear it, or touch it.
Associate professor of psychology and neuroscience Amy Poremba and graduate student James Bigelow asked a hundred undergraduates to participate in two related experiments. In the first, students listened to sounds, looked at images, and held objects. Then, after an interval ranging from one to 32 seconds, they were asked whether various stimuli were the same or different from the originals. In a second experiment, the students were asked to recall sounds, images, and objects after an hour, a day, and then a week. In both instances, the students' auditory recall came in last, lagging far behind the tactile and visual memories, which the students recalled at about the same level. The longer the time that elapsed, the greater the gap became, with auditory memory lagging farther and farther behind the other types of memory.
"Our auditory memory isn't as robust as we might like to think it is," says Poremba. "We think that we are great at integrating all the senses," but the experiment shows that tactile and visual memory easily trumped auditory memory.
The results further suggest that the brain processes tactile and visual memories through a similar mechanism, but that auditory memory is processed differently. This has potential implications for understanding the evolution of the human brain, says Bigelow, since the auditory memory of monkeys and chimpanzees also lags behind their tactile and visual memory.
See Me, Feel Me
As for the here and now, the study holds possible applications for teaching and learning. "This reinforces the importance of multisensory learning and shows that the tactile can be very important," says John Black, Cleveland E. Dodge Professor in the Department of Human Development at Teachers College, Columbia University. Current technology that combines multisensory and multimedia components—such as iPads, tablets, and e-textbooks—requires students to touch and move their fingers over the screen to access videos, voiceovers, and additional text, which can enable multisensory processing. "This is not to underplay the importance of the verbal, but it emphasizes that we should not forget about the other aspects. You need them all."
Indeed, the study is a reminder that we need to engage all the senses "to promote learning and memory," says Janet Brain, a learning disabilities specialist in New York. That approach is already "the hallmark of much of the reading instruction that's done with dyslexic children."
Technology Can Help
Along with Black, she finds that technology provides many possibilities for multisensory learning. Interactive computer graphics and videos that add more senses to the mix can "make visual cues much stronger" and "improve visual memory," she says—and can also increase attention span. In other words, the more varied ways in which you are exposed to and interact with the material, the more likely you will be to remember it.
And if you want a practical example of what can happen when you use primarily one—as opposed to multiple—senses in teaching, Brain points to the once ubiquitous approach to teaching foreign languages known as the audio-lingual method. One reason for its mixed success, Brain suggests, is that the language labs that were central to the approach could make for a kind of auditory vacuum, with students spending hours listening to audio recordings and sentence drills. Less emphasis was given to connecting the words heard to objects that would help endow meaning to the words and sentences in the drill—like passing around an apple when students learn the French word for the fruit, pomme.
The final takeaway may come from a Chinese proverb, say Poremba and Bigelow: "I hear and I forget; I see and I remember."
What implications are there then for the popular language learning method that has learners only listening and not reading or writing, I wonder?
Multi-sensory learning is a fascinating field. I noticed that they didn't touch on scent or aroma in this article, which is interesting because several studies have shown that scents such as rosemary and lavender improve academic performance by boosting relaxed, alpha-wave brain activity, which aids cognition and memory.
As a result, there are several schools (and workplaces) in the United States and Canada that are now using scents to boost student and employee performance.
I am curious about cultures that pass down their history orally - through stories and poetry. This was common in nomadic cultures, and Middle Eastern societies placed great emphasis on poetic traditions.
This article hits the mark! I wish employers would "see" that employees have different learning processes. EVERYONE should read this article.
The flaw in this study is that individual humans digest information in different ways. Had the study been loaded with students whose modality was auditory, the results would've been different. The importance of this is that basing educational strategies on general information, such as this, ignores the need to understand how individuals acquire information and to fit training to their needs.
@Carolyn Nevin The idea behind the communicative approach is that people should be hearing and using language in context. This means they should be listening and talking about things they hear, see, touch, feel, and so on, rather than just listening to and repeating sentences in a vacuum with no additional explanation. Grammar is learned intuitively because the context of the situation helps to provide meaning, and through exposure to a variety of contexts meaning is solidified and/or given additional nuance. This is considered best practice for people who actually want to be able to comprehend and use a foreign language for spoken communication.
@Bryan Darling Do you have any links to the studies? I want to learn more; this is fascinating!