Friday I finally finished reading Proust and the Squid: The Story and Science of the Reading Brain by Maryanne Wolf.
I enjoyed the book, though I don’t think the audio format served it well. Frequently, I wanted to leaf back and recheck something I read, or look at the spelling of a word, or skip the bibliographic notes. Even now, I’d like to be able to go back and recheck terms and concepts.
Though it’s possible to skip around in the audiobook by section, chapter, subsection, and various intervals, it’s hard to do so efficiently.
In addition, there were a lot of spots where I wanted to sit and think for a moment, digesting what I’d just learned. With a print book, these are the moments where you sit with your finger in between the pages, staring off into space while the brain whirs furiously. With an audiobook, these are the moments where either you press Pause, or you forget to press pause and have to backtrack, trying to find the point where you drifted away.
One of the things I found fascinating about the book is that, according to Wolf, reading is not naturally wired into the human brain. This means that in order to learn to read, each human brain must rewire its own circuits, adapting and connecting centers in the brain that are evolved for visual processing, pattern recognition, and verbal language.
Thus, the brain of someone who has learned to read is functionally different from the brain of someone who has not, and the brains of people who have learned to read pictographic languages function differently from the brains of those who have learned syllabic or phonetic writing systems.
As Mr. Spock would say, “Fascinating.”
Wolf’s research, and that of many of her colleagues, is directed toward individuals with dyslexia, and how their brains may function differently from the average reading brain. Instead of the circuits that would normally coordinate the various brain centers used in reading, dyslexic brains may form different, less efficient circuits. They can get the same information, but much less fluently.
Interestingly, dyslexia is also associated with greater spatial awareness and creativity, though no one is sure whether the different wiring causes or is caused by these other differences.
My thoughts, of course, went immediately to blind readers and to braille.
Reading Without Seeing
Does a person who learns to read braille rewire their brain in the same way as a person who learns to read print? Does it make a difference if braille is the first or only form of writing they acquire, or if they come to braille later in life, after acquiring the ability to read print?
Does the process of letter recognition, sound mapping, and word recognition work the same way, even though the letters are tactile rather than visual?
I think the answer must have huge implications for the question of whether or not to teach blind children braille.
One study Wolf referred to found that adults in remote country areas who had never been taught to read were unable to break words down into their component sounds, while adults of the same age and socioeconomic status who had learned to read were able to do so.
Reading and writing also mean being able to retain information without having to commit it all to memory, and to transfer that information far more quickly than by doing so verbally. I’m guessing–and I’m only guessing–that this could pave the way for more brain development, as it frees up more mental capacity for other functions.
If learning to read helps the brain function more effectively, and if learning to read braille affects the brain in the same ways as learning to read print, then teaching blind children braille takes on huge importance, regardless of what other forms of information storage and retrieval may be available.
I’d love to see a brain imaging study of blind children learning braille, comparing their brains to those of sighted children learning print. Then I’d like another comparing the brains of adults who have been blind from an early age and have or have not learned braille.
What About Text-to-Speech?
Another question that occurs to me is whether text-to-speech programs, such as screen readers, activate the same brain circuitry as visual reading in now-blind adults who have previously been fluent print readers.
Another study Wolf referred to found that adults who visualized letters (such as a capital “A”) activated the same brain circuitry as did people actually seeing the letters.
Before my vision loss, I was a highly fluent print reader, easily able to devour a couple of books in a week. I’m intimately acquainted with what text looks like and how it works.
So I wonder: When I interact with text on my computer, am I involving the same neuronal circuits as when I used to look at text onscreen or in a book? After all, I can interact with it in much the same way–moving by line, by word, by letter, forward and back, or by landmarks on the page such as headings, graphics, and paragraphs. It’s slower, but the process feels the same.
Despite the fact that I’m receiving it audibly, it’s a very active kind of reading, whereas listening to an audiobook on the reader or a CD player feels much more passive. Receptive. I still get the information, but I have no way of affecting how I receive it.
I can only guess at what’s going on based on how it feels to me, but I’d love to see a study of former print readers and non-print readers interacting with voice-synthesized text, and whether it differs from their brain function as they listen to a person read aloud.
My next audiobook is The Broken Kingdoms, the second book in N.K. Jemisin’s Inheritance Trilogy.
I read the first book, The Hundred Thousand Kingdoms, several years ago and remember finding it intriguing. And it’s *not* popular science!
Oddly enough, the book is read by the same woman who read the book I just finished! I thought her voice sounded familiar…
The book runs eleven hours and eight minutes, so it may take me a while to get through.
What are you reading? Are you enjoying it? What would you like to read next?