You might be less likely to finish reading this article if you have an iPhone. And it’s not because I’m about to start bashing iPhones. It’s because your phone, and much of the other digital technology you use, is changing your brain, making you more distractible, more forgetful, and less creative. Sound far-fetched? Disturbing? Unlikely? You might want to turn off the TV, ignore the text that just came through, and settle in for a few minutes because what’s at stake is pretty fundamental to who we are.
This is your brain on digital media.
Maybe you have experienced what Nicholas Carr has experienced. You start to read a book – or even a long article – and a short way into it, your mind begins to wander. You weren’t always like this. You used to love reading – you could get lost in a book for hours. Now, after only a few minutes, you feel restless.
Or maybe you’ve misplaced your keys again or the new neighbor’s name has already slipped your mind. Perhaps your brain just feels drained, tired, overwhelmed. When it happened to me, I thought it was new motherhood and the accompanying sleep deprivation. You may have blamed your work stress, your busy schedule, or your age. Carr thinks it’s something else.
“Calm, focused, undistracted, the linear mind is being pushed aside by a new kind of mind that wants and needs to take in and dole out information in short, disjointed, often overlapping bursts – the faster, the better,” Carr laments, not only of his own mind but of all our minds. And he thinks digital media are to blame.
“As our window onto the world, and onto ourselves, a popular medium molds what we see and how we see it – and eventually, if we use it enough, it changes who we are, as individuals and as a society,” writes Carr in The Shallows: What the Internet Is Doing to Our Brains.
The data are out there to show just how many fewer books people read these days, how much time children and adults spend in front of screens, and even the shockingly high number of text messages the average teenager sends each day. But for Carr, and for most of us who use technology, if we are honest about it, the most disturbing findings are personal.
Of his own recent difficulty concentrating while reading lengthy articles and books, Carr says, not entirely aiming to be humorous, “At first, I figured that the problem was a symptom of middle-age mind rot. But my brain, I realized, wasn’t just drifting. It was hungry. It was demanding to be fed the way the Net fed it – and the more it was fed, the hungrier it became.”
What’s perhaps especially disturbing about Carr’s experience is that he ought to be the exception to these sorts of feelings. If anyone should be able to ignore the chiming of his phone and the allure of snippets of mostly irrelevant information flowing through Facebook and Twitter all day long, shouldn’t it be an author who studied literature at Harvard?
Carr’s argument is that it’s not really his fault.
It’s a relatively new notion that our brains are malleable – or “plastic” in the neurological parlance. Scientists once believed that after childhood, our brains and their neural pathways were set. Research has revealed, however, that our brains change throughout our lives, forming new connections and losing old ones, depending on how they are used. A familiar example: when a person loses his or her sight, the brain often rewires some of its neurons to enhance the senses of hearing and touch. But subtler changes happen all the time, and not only as the result of trauma. The brain’s tendency to rewire itself to support needed and repeated activities makes us prone to developing habits – both good and bad.
So, in a sense, Carr’s addiction to rapid-fire information and his aversion to lengthy introspection aren’t his fault. They’re his brain’s.
Attention is fundamental to how we perceive our world, but it is so seemingly automatic that we are often unaware of the effort it takes. We call it “paying attention” because attending to something has a cost attached to it. Just as our finances are limited in quantity, so, too, are our mental resources.
Attention’s “basic mechanism is a process of selection,” according to Winifred Gallagher, author of Rapt: Attention and the Focused Life. Gallagher points out that there is always more potential input surrounding us than our brains can clearly process – what we experience are the things our brains deem important within the din.
For example, in a crowded restaurant, your brain filters out the background noise so you can concentrate on having a conversation with the people immediately around you. Your brain will shift focus in response to certain stimuli, however. If someone across the room calls your name or if a server suddenly drops a tray full of plates that shatter loudly on the ground, your attention will be redirected quickly.
Our minds are attuned to distraction for safety reasons – you wouldn’t want your ability to concentrate on your work report to be so adept that you weren’t distracted by your smoke alarm going off, for instance. Problems arise, however, when our attention is constantly pulled from the task or object at hand to things that are not fires – or really important or relevant at all. When we are distracted by a news headline reporting on the latest celebrity to give a baby an unconventional name; by a barrage of tweets on the college basketball game that just ended; by the cute new photo of a second cousin’s best friend’s puppy on Facebook; by the compulsive need to check our phones for a text we might have missed from… well, anyone really, our attentional bank accounts can be rapidly depleted.
What’s more, many digital media devices exploit our brain’s built-in susceptibility to distraction. The cell phone rings, chimes, or vibrates, much like a smoke alarm, drawing our attention away from whatever we were previously engaged in doing. Online advertisers employ techniques such as pop-up windows, Flash animation, and the use of our names to capture and redirect our attention. With our limited attentional currency, the continual stimulation of the digital realm can be mentally exhausting.
It’s simple: attention requires effort. In a world filled with endless diversions and bountiful but largely irrelevant information, our attention suffers. Our ability to focus declines. But what is unsettling is the notion that our faculties don’t just wane while we are using distracting technology; they are permanently weakened. And some fear that if we do not recognize this peril and take determined steps to win back our attentive powers, our ability to focus, to reflect, and to think critically and deeply might be forever diminished.
“True, insight can travel by Twitter,” notes P.M. Forni in The Thinking Life: How to Thrive in the Age of Distraction, “but there is no substitute for uninterrupted reflection and introspection – not if we want to discover who we really are, check if we are true to our own values, learn from our mistakes, and plan our future.”
Reflection and introspection are only possible when we focus. By now, you have perhaps already heard that multitasking is a myth. We can’t actually concentrate fully on multiple tasks at once. Rather, when we claim to be multitasking, we are actually switching rapidly back and forth between tasks and points of focus. (This is why talking – or worse, texting – while driving is dangerous, even with a hands-free device.) Each switch carries what psychologists call a “switch cost.” The cost may be in productivity or quality. We may be doing more but not as quickly or as well as we think.
“Multitasking also raises serious concerns about the shallow focus on the world that it encourages, reinforced by the flip, casual, rapid-fire style of electronic communication,” Gallagher asserts.
How many emails, text messages, and phone calls will interrupt your reading of this article? And how long will it take you to remember where you were – not just on the page but also in the argument and in your reaction to it?
How many times was I interrupted while writing this piece? How much less cogent is this article because of its writer’s own distractibility?
The questions can’t be answered quantitatively, but I think we sense that interruption and distraction carry real costs in time and attention.
An Ethic of Attention
Though inattention may seem like mostly an annoyance, it also exacts a moral cost.
“Along with the costs to strong learning and deep thinking, hours spent in the thrall of alluring machines exact a toll from your attention to actual human beings,” Gallagher observes.
Forni quotes M. Scott Peck, who said, “The principal form that the work of love takes is attention. When we love another, we give him or her our attention; we attend to that person’s growth.”
With insufficient attention, marriages struggle. Children act out. Friendships fade.
Dr. Ross Oakes Mueller, associate professor of psychology at PLNU, noted, “Unplugging from technology will increase our capacity to be in relationship with the other and our ability to attend to the other – including reading non-verbal cues and making eye contact. It will increase our ability to be compassionate and invest in others. Emotional intelligence plummets with technology in front of us.”
We must give and receive attention to thrive as human beings, as attested by the tragedy of the orphans in post-Communist Romania who were so utterly deprived of attention that many were irreparably stunted intellectually, emotionally, and physically.
“Attention is not only a cognitive faculty,” says Forni, “but a moral imperative.”
It would seem to be a spiritual imperative as well. Spiritual growth requires paying attention to God and focusing on His will and Word. Attention is required for sharing His love with others as well. To show Christian love to the poor, for example, we must attend to their plight. We must take time to recognize it and respond. We must focus our attention outside ourselves.
“If life is valuable, it only makes sense to attend to it constantly,” Forni wisely observes.
Memory and Learning
“Anyone who has ever had to read the same paragraph twice understands why the connections among attention, memory, and learning are especially important for students – the very people most attracted to multitasking,” writes Gallagher.
When we forget a person’s name only moments after being introduced to him or her, we are not experiencing a problem with our memory but with our attention, Gallagher points out. When we focus, she notes, the part of the brain called the hippocampus, which is important to memory, is at work. When we are distracted, another part of the brain, the striatum, takes over. The striatum, she says, is involved in rote activities.
“As a result, even if you get the job done, your recollection of it will be more fragmented, less adaptable, and harder to retrieve than it would be if you had given it your undivided attention,” she writes.
This has important consequences for learning. When we were babies, we were really good at a couple of things: rapid learning and sleeping. Resting and learning are intertwined in ways older children, teenagers, and adults often take for granted. During sleep (especially REM sleep, where babies spend more time than adults do) and other downtime, the brain consolidates, processes, and stores experiences into long-term memory. Just as sleep restores the body and enhances learning and memory, so, too, does waking downtime, according to recent research. The problem for students (and all of us) today is that what used to be downtime is now often screen time. The grocery store line, the walk between classes or from the office to the car, the waiting room at the doctor’s office, the minutes spent waiting for a colleague to arrive for a meeting or a friend for lunch – these former pockets of downtime are now filled with checking email, texts, or social networking sites or with playing games on smartphones.
“Ill at ease with the rare moments of true quiet still gracing our days, we fail to turn them into opportunities to assess who we are, where we have been, and what awaits us,” says Forni.
Our lack of mental downtime makes our brains tired, and fatigued brains are not well-equipped for learning. (If you ever pulled an all-nighter to study for a test in college, you might recall that the costs outweighed the benefits more than you expected, coffee notwithstanding.)
Just in Case or Just in Time?
Some might argue that it doesn’t matter if we are more forgetful in the digital age. The argument goes something like this: It’s not important to know specific facts anymore. It only matters if we know how to find what we need when we need it. In other words, why memorize the periodic table or key dates in American history “just in case” when you can simply Google them “just in time” when you need them?
The problem with this approach is that it works much better for trivia than for subjects with deep content. It also works better for settling quick debates or filling in some small gap of knowledge than for fueling creativity or critical thinking.
“Education is infinitely more than memorizing facts or data,” said PLNU professor of communication Dr. G.L. Forward. “It includes the ability to articulate and defend positions. There is the issue of whether you can hold two ideas in mind without a cognitive dissonance that is debilitating.”
The kind of learning Forward is referencing isn’t Google-able. Critical thinking requires deep thought, reflection, and often dialogue – and not the texting kind. It also requires, perhaps more than we might imagine, attention and memory.
“The accumulation of retained knowledge allows us to trace the connections with which we make sense of the world,” writes Forni.
Psychology professors Drs. Ross and Kendra Oakes Mueller agree.
“Part of what consolidates learning is attention and elaborations that facilitate our memory. Part of how memory works is by linking current information into information you already have,” Kendra said.
“The most creative people have databases of knowledge. Linking things together is what leads to creativity,” Ross added. “Reflection makes a huge difference in terms of creativity and connections. What you need to encode learning for the long term is not only rehearsal but also elaboration; deprived attention takes away both of those opportunities.”
If we rely on obtaining information “just in time,” we may limit our opportunities to form the kinds of connections between known and new information necessary for learning, innovation, and meaningful problem-solving. In this sense, relying on technology to do our remembering for us could pose a major obstacle to both creative and analytical pursuits. By taking over for our memory, Google could also diminish our critical thinking.
Moreover, online searching does not lend itself to the kind of elaboration on a subject that helps embed the information in our memory. So, too, digital conversations that take place through quick tweets and short Facebook exchanges often do not allow for the kind of extended argument and analysis that more formal writing and debating provide. What’s lost? The opportunity to elaborate, and thus, greater opportunities to connect, remember, and learn.
It all comes back to Carr’s concern about digital media actually changing our brains for the long run and not just while we’re online.
“When we are on the Net, our prevailing operational mode is one of retrieval, not retention,” says Forni. “The problem with that is that our brains’ neural pathways are only as good as the cognitive tasks we ask them to perform. When they do not make the effort to retain, they lose their ability to do so.”
Like Carr, Forni is concerned that the amount of time we spend engaged in superficial digital media consumption (scrolling through status updates or surfing from website to website, for example) is changing our brains. Rather than deeply etching into our neural pathways the practices of reflection, connection, and elaboration, we are teaching our brains to shift topics rapidly without pausing to interact with their content in ways that promote memory or learning. We might, in fact, be teaching ourselves to be forgetful. And what we might be forgetting is to think in ways that promote learning while bringing meaning and satisfaction to our lives.
A Cure for What Ails Us?
Like a person with an eating disorder who cannot avoid food but must learn to develop a healthier relationship with it, most consumers of technology do not have the option of going totally unplugged. Our work and social lives have become deeply entwined with technology. So how do we cope?
“One crucial fact often gets overlooked in laments about the electronic assault on your ability to focus: your machines are not in charge of what you attend to – you are. When they prove distracting, you have only to turn them off,” says Gallagher. Her assertion is one commonly made – it’s not the technology; it’s how you use it. But increasingly, both experts and technology users are finding that adage surprisingly insufficient.
“‘Technology is bad because people are not as strong as its pull,’” says a student in Sherry Turkle’s book Alone Together.
Resisting the pull of technology is about more than being able to set aside a shiny device. It’s about being able to establish boundaries and set aside fears, explain the Oakes Muellers. Ross provided a few examples of how a psychologist might help people with media addictions do this.
“On a practical level, they might do something like create out-of-office messages for email and as an automatic response to incoming text messages [that they could use to take a break from technology]. Clinically, they might need some cognitive restructuring – they will need to think of technology differently. We often feel fear about how people will react if we are not immediately accessible, but this is an example of distorted thinking.”
Even for those who don’t consider their use of technology excessive or addictive, certain practices may be able to help counter the effects on our brains of the time we do spend in the digital realm for work and pleasure.
For Christians, extended time spent in prayer or meditating on God’s Word provides respite from the digital realm (of course, this is not the only reason we ought to pray and spend time in Scripture). Focused reflection is the antithesis of the distracted, fractured state of mind that often accompanies technology use.
“The multibillion-dollar entertainment industry of our time is essentially built upon humanity’s addiction to thought avoidance,” Forni observes.
To push back against this, we might purposefully engage in challenging reading or discussions with friends and colleagues. We might choose to immerse ourselves in the difficult instead of only the entertaining.
Forni suggests specifically scheduling time to think by saying no not only to screens but also to other obligations. He suggests, for example, spending lunch breaks and waiting times in thought rather than online. He also suggests writing regularly.
We might also spend time in nature, away from our screens, as advocated by, among others, Richard Louv, author of Last Child in the Woods: Saving our Children from Nature-Deficit Disorder and The Nature Principle: Human Restoration and the End of Nature-Deficit Disorder. Louv has spoken at PLNU twice, sharing his message about the importance of nature to our minds, bodies, and souls – especially in our technology-driven era.
Heather Ross, PLNU professor of philosophy, noted that renewed interest in farming, handmade crafts, and eating locally grown produce is a sign that even young people do not desire a life that is purely digital.
“I find that students are receptive to time slowed down for pondering. It’s refreshing to them,” she said.
Turkle observes that young people “talk wistfully about letters, face-to-face meetings, and the privacy of pay phones. Tethered selves, they try to conjure a future different from the one they see coming by building on a past they never knew. In it, they have time alone, with nature, with each other, and with their families.”
Many young people recognize the limits of technology and the perils of their age. With awareness and guidance, their future need be neither one of disembodied, shallow selves nor a backswing to an era without screens. For adults and youth alike, there is hope for a future where balance is possible, where our brains and our selves are, in fact, much stronger than technology’s pull.