Earlier this month (February 2018), the Globe & Mail published an essay by author Michael Harris titled “I have forgotten how to read.” In it, he recounted how he recently tried to read a single chapter of a book, but failed. Frustrated, he turned instead to TV:
Paragraphs swirled; sentences snapped like twigs; and sentiments bled out. The usual, these days. I drag my vision across the page and process little. Half an hour later, I throw down the book and watch some Netflix.
Which, I think, is the poorer choice of the alternatives. Giving up doesn’t improve the skill set or fix the problem. As the American politician Claude Pepper is alleged to have said, “Life is like riding a bicycle: you don’t fall off unless you stop pedaling.” Harris, it seems, stopped pedalling before he was even through a mere chapter.
If, as Harris also writes, “mind is plastic,” and he believes his reading skills have diminished, then I would think the solution would be to retrain his mind, to relearn those skills, to strengthen the neural pathways associated with reading and comprehension, rather than continue to encourage them to atrophy. Get back on the bike and pedal harder. Read more, not less. As Groucho Marx quipped: *
I find television very educational. The minute somebody turns it on, I go to the library and read a good book.
Harris hadn’t become illiterate or dyslexic: his reading habits had changed as he immersed himself deeper into today’s social-media-driven technology – a medium that encourages short, emotion-filled, reactive, even knee-jerk content; the stuff of immediate response, outburst and instant memes, rather than the stuff of deep thought. It’s a self-inflicted wound:
When we become cynical readers – when we read in the disjointed, goal-oriented way that online life encourages – we stop exercising our attention. We stop reading with a sense of faith that some larger purpose may be served. This doesn’t mean we’re reading less – not at all. In fact, we live in a text-gorged society in which the most fleeting thought is a thumb-dash away from posterity. What’s at stake is not whether we read. It’s how we read… The words I write now filter through a new set of criteria. Do they grab; do they anger? Can this be read without care? Are the sentences brief enough? And the thoughts? It’s tempting to let myself become so cynical a writer because I’m already such a cynical reader.
I think many of us who share part of our lives online and are in constant communication with the social media world through our devices will understand. Even a passing attempt to keep up with the sheer volume of material on a Facebook timeline or Twitter feed runs in opposition to depth and focus. It becomes the Red Queen’s race – you run as fast as you can simply to stay in the same place. But surrendering to it isn’t the answer.
Twitter’s move from a limit of 140 to 280 characters per tweet, last November, was earth-shaking for many people accustomed to (and enjoying) the enforced brevity and grammatical laxity of the old system. Twitter bloggers themselves wrote about an “emotional attachment” to the 140-character limit, but that’s irrelevant. The cultural and psychological attachment to it was much, much deeper: the longer tweet required, in effect, more literacy and therefore more thinking. You may be forgiven for typing ‘ur’ instead of ‘your’ or ‘you’re’ in a text message, but not in proper text. For some users, the increased post capacity was a move towards more literacy. And that was what they disliked.**
Harris is right to be concerned: technology is changing us, evolving our behaviour, and not always in positive ways. In January 2017, Dr. Martin L. Kutscher asked in an article in Psychology Today, “Does reading on a screen interfere with in-depth learning?” and his unequivocal answer was, “Yes!” In his piece, he notes:
There is concern that the reliance upon shallow reading may interfere with the development of deep reading skills such as thoughtful pondering, critical analysis and inferential thinking. It is feared that neurological connections required for deep reading such as brain areas involved in visual processing and phonological processing may not be made in those people who learn primarily via shallow reading.
Shallow is a word that often comes to mind when I venture into Facebook, with its torrent of shared hoaxes, fake news, misattributed quotes, whining responses and saccharine meme images. The “share this if you…” meme (pick a topic: hate violence, agree with this or that cause, disagree with this or that politician, like ice cream, dogs, grandchildren or hummingbirds, have the best wife/husband/grandkids/sister/brother, love to eat or would never eat this, and on and on and on…) isn’t a call to action. It’s a call to reaction. It’s like the pointless “thoughts and prayers” Republicans offer up after every mass shooting – an excuse not to act. A placebo, a substitute for action.
Nicholas Carr – author of The Shallows: What the Internet Is Doing to Our Brains – calls it a “medium based on interruption” that is “changing the way people read and process information.”
We’ve come to associate the acquisition of wisdom with deep reading and solitary concentration, and he says there’s not much of that to be found online.
Which is a bit facile. I often spend hours researching and reading online (my own posts here are often quite lengthy as a result). I’m sure I’m not the only one who dives deep into online content rather than merely skimming the surface. But perhaps that’s something that needs to be taught to a younger generation. Perhaps the problem has less to do with technology than with education and upbringing.
Back in 2015, Naomi Baron, Professor of Linguistics and Executive Director, Center for Teaching, Research & Learning, American University, wrote,
…we want young people to delve into materials with a mindset prepared to take a reasoned, objective stand rather than memorizing or shooting opinions from the hip. What kind of materials? Largely written, and these days, there’s the rub. We teach the next generation to decipher words on a page, but as the form of what constitutes a page shifts, so does the nature of reading… The ways we use technologies lead us to develop particular habits of mind. With print, even though we might skim and scan, the default mindset is continuous reading. It’s also focusing on what we’re reading, even though sometimes our thoughts wander. Digital technologies engender a different set of habits and practices. Their default state is what I call reading on the prowl. Think of how much time you spend on each hit after doing a Google search. A minute? Ten seconds? And how likely are you to be multitasking while reading onscreen?
In the G&M article, Harris commented that,
The deep reading that a novel demands doesn’t come easy and it was never “natural.” Our default state is, if anything, one of distractedness. The gaze shifts, the attention flits; we scour the environment for clues. (Otherwise, that predator in the shadows might eat us.) How primed are we for distraction? One famous study found humans would rather give themselves electric shocks than sit alone with their thoughts for 10 minutes. We disobey those instincts every time we get lost in a book.
But again I disagree. Buddhists call it the “monkey mind” or “monkey brain” – the distracted, uncontrolled consciousness that flits and darts and jumps about. Perfect for social media where even a goldfish’s attention span seems egregious. Yet for millennia, religious and philosophical orders worldwide have encouraged and taught some form of contemplative mindfulness.
From early human prehistory, storytelling has provided a quiet space for our contemplation. Focused attention in that moment has been part of the human condition since we started telling stories around Neanderthal campfires. It was the relief from constant attentiveness against predators. It was the place to go to escape the harsh, daily grind. The invention of writing, some 6,000 years ago, allowed literature to survive longer and be shared, which created a cultural space to which we could all retreat anytime. If you could (and did) read, that is.
Many, many artists have chosen contemplative solitude, or isolated environments to develop their works. Charles Dickens went on lengthy walks to think about books and plots without the distractions of home, often pacing out 20 miles or more each time. Ingmar Bergman moved to a tiny island where he could develop his film ideas and scripts in quiet solitude – and stayed there for 25 years. In his Nobel Prize acceptance speech, Ernest Hemingway spoke of his need to be alone to work.
In the classic work Walden, Henry David Thoreau wrote that, contrary to the monkey mind Harris describes, most people are in fact on mental cruise control:
We must learn to reawaken and keep ourselves awake, not by mechanical aids, but by an infinite expectation of the dawn, which does not forsake us in our soundest sleep. I know of no more encouraging fact than the unquestionable ability of man to elevate his life by a conscious endeavor… Every man is tasked to make his life, even in its details, worthy of the contemplation of his most elevated and critical hour.
Thoreau also, and more critically, added what today seems an appropriate description of the populist, anti-science, anti-intellectual, pro-faith audience (like those who helped elect Donald Trump):
The millions are awake enough for physical labor; but only one in a million is awake enough for effective intellectual exertion, only one in a hundred millions to a poetic or divine life. To be awake is to be alive. I have never yet met a man who was quite awake.
Intellectual somnambulism, it seems, is not simply a product of our own times. But Thoreau was writing a century and a half ago, when he lacked such weapons of mass distraction as we have today. The shift to “immediate access” technologies and indirect communication began in the early 20th century: first with the telephone, then radio, then TV, the internet and now smartphones. Our level of distractedness rose and our attention spans shortened with each development. But books and reading remained the bulwarks against them.
We managed to adapt reasonably well to the earlier technologies, so I suspect we will adapt to the later ones, although not always in the best way possible. It is quite possible to break from evolution’s forward-looking dictates and collectively devolve – especially culturally – into intellectual haves and have-nots. H.G. Wells’ vision of the Eloi and Morlocks seems prescient in today’s intellectually challenged political climate, where the ignorati Morlocks can rise to the White House.***
Sure, the information overload may be confusing, and it can be difficult to keep up with the cascade of trivialities on social media, but that doesn’t apply everywhere and to everyone, especially not to those who still retain a habit of deep reading. The solution for overload is to escape to somewhere you can put all your energy, all your brainpower, to use on a single task. And that task should be reading.
Attention and focus are what people give to an operatic or symphonic performance, to a religious or state ceremony – all performance forms that require focus and absorption to fully appreciate and engage in. Even a movie or TV show can capture our undivided attention (as can an audio book, podcast or radio drama). Reading is more intense simply because of the way it engages the brain, but it isn’t the only way to focus.
Deep reading, as both Jonathan Gottschall (in The Storytelling Animal) and Peter Mendelsund (in What We See When We Read) tell us, is also a performance art. Neurological studies show it engages our brains, our emotional centres, our adrenaline and endorphins, our heart – but most of all our imaginations. It is not by any means a passive act like simply watching TV. The depths of a book are a rich world where we are actors who play, love, dance, soar, fight, live, age and die.
Reading makes us smarter, better people, with bigger vocabularies, better reasoning, better memory – and less likely to fall for the corporate sales pitch or the fake-news politician. Study after study tells us that reading is better for us than almost anything you can find on TV. Anne Cunningham, from the University of California, who wrote What Reading Does for the Mind, found that reading gives you a better vocabulary than talking or TV, gives you more knowledge than TV, and boosts your memory and reasoning abilities. By which I assume she means avid readers are less likely to fall for cons and scams, and more likely to make informed, conscious decisions.
In his book A Splendor of Letters, Nicholas Basbanes described himself as “…a writer who is obsessed with books in every imaginable sense and nuance of the word.” I couldn’t have said it better (and his book is a wonderful tribute to books and reading).
And as a side note, Jessica Stillman wrote a wonderful piece for Inc. magazine titled “Why You Should Surround Yourself With More Books Than You’ll Ever Have Time to Read.” In it she writes that having a vast collection of books is,
…a powerful reminder of your limitations – the vast quantity of things you don’t know, half know, or will one day realize you’re wrong about. By living with that reminder daily you can nudge yourself towards the kind of intellectual humility that improves decision-making and drives learning.
It’s the sort of advice I take to heart; a practice I try to engage in weekly, if not daily (I bought four used books today and got three others in the mail this past week). And I like the guilt-free approach: I don’t always have to read every word in every one of them, but can enjoy their presence in smaller morsels. But the best advice of all, the best practice of all, is simply to read. And then read more.
(Sometime soon I will finish a post I’ve been playing with for a while now, that illustrates the neural magic of reading and helps underscore what it means to call reading a performance art. In the meantime, I’ll read some more about it…)
* It wasn’t just the audience that literacy was losing to new media: it was also the economic support. First, radio began siphoning off advertising revenue. As one writer noted, “Radio liberated advertising from its relationship to literacy by communicating through music, jingles, and the spoken word.” By 1938, radio advertising revenue had surpassed that from print advertising.
Television led the biggest assault against reading: one-third of all American homes had a set by 1952, and by 1969 it had 97% market penetration in the USA. After its introduction in 1939, TV rapidly captured advertising revenue – the first TV ad appeared in 1941. By 1954, CBS was the largest advertising medium in the world.
This combined draining of advertising revenue meant hardship for the newspapers and magazines that depended on advertising to survive. The future of both the glossy magazine and the print newspaper looks dim. Yet curiously, while the book publishing industry is pumping out an increasing number of titles, sales are dwindling (in the USA the peak was in 2007). It has a lot to do with the fragmentation of the market, but that’s a post for another time.
** Twitter calls itself a “micro-blogging” platform, which is like calling a swift walk up a flight of stairs a micro-Olympics.
*** Technically, it began in the mid-19th century with the invention of the telegraph. But it was not a consumer-oriented technology like the others.
And as for the ignorati in political office: Trump, like The Block of Seven on our own council, is a perfect example of the Dunning-Kruger effect, which says, in effect, that “the people who are least competent tend to be most sure of themselves, while those with genuine skills are frequently wracked with doubts about their abilities.” Or, as Wikipedia tells it, it is a,
… cognitive bias wherein people of low ability suffer from illusory superiority, mistakenly assessing their cognitive ability as greater than it is. The cognitive bias of illusory superiority derives from the metacognitive inability of low-ability persons to recognize their own ineptitude