Penn State football coach Joe Paterno was showing his age recently when—referring to Twitter—he described it as “twiddle-dee, twiddle-dum.” He might not have been that far off. All communication is by its very nature an interruption of attention. Paying attention is critical to perceiving reality. Have we entered an age of too many interruptions, making it difficult for people to grasp reality?
This is a difficult consideration in an age when almost everyone is aflutter over Twitter and text messaging. But that’s partly because people no longer understand the nature of communication and how it helps or hurts paying attention. In fact, few people see the need to pay attention at all, preferring to multi-task through life. We need a refresher course on the nature of reality and human nature.
The nature of communication is that it is always an interruption—a break in your attention. When your spouse speaks to you, it breaks whatever train of thought you had going. That’s the upside of communication. “In the beginning…” was an interruption—a disruption of prevailing assumptions at that time. It still is. But here’s the downside.
The nature of interruptions is that too many of them reduce our ability to pay attention. Our human capacity to pay attention is shaped like a bell curve. An increase in communication lengthens our span of attention, but only up to a point. Past the peak, people lose the capacity to pay attention. They go, in effect, ADHD. And there’s a downside to that.
The nature of wisdom is that it only comes to those who pay attention. In Proverbs 1, Wisdom calls out in the streets. Yet people refuse to listen. The parallel rephrasing in verse 24 tells us why: “I stretched out my hand and no one paid attention.” People have to pay attention over a sustained period of time in order to gain wisdom.
Being interrupted is a more acute problem today than at any time in history. Until the early 1800s, communication was conducted through technologies that were polite, waiting until you were ready to pay attention. A note posted on the town square. A letter mailed to you. But beginning in the 1800s, new technologies accelerated the rate of interruptions. Attention spans began to shrink, as did the ability to assemble a complete picture of reality.
The early 1800s was a period of dazzling technological advancement, a “communications revolution,” writes historian Daniel Walker Howe.1 Americans set out to conquer “the first enemy,” distance. Samuel Morse sent his first long-distance telegraph message in 1844. By the 1870s, 650,000 miles of wire and 30,000 miles of submarine cable had been laid. A message could be sent from London to Bombay and back in as little as four minutes.2 More and more messages, faster and faster. But it became harder to pay attention.
You can draw a line from telegraph to telephone to TV to text messaging to Twitter. People today are continually interrupted, and they are used to it. Television, for example, “is in essence an interruption machine,” Maggie Jackson writes in Distracted.3 When the TV is on, children ages one to three exhibit the characteristics of attention-deficit disorder. When adults twitter and text, studies show that they exhibit less ability to pay attention to important yet complex truths. We’re seeing the effects in college students.
Bright, intelligent college students increasingly demonstrate an inability to sort through the onslaught of information so easily accessed by web-based technologies, Mark Bauerlein of Emory University writes. He says only 16 percent of today’s students read the text on a web page line by line, word for word, and can pull together a coherent summary of what the author intended to say. The other 84 percent can only pick out individual words and sentences, “processing them out of sequence,” Bauerlein concludes.4 They lack wisdom because they can no longer pay enough attention.
We now live in an age of what researchers Arien Mack and Irvin Rock call “inattentional blindness.” Sort of a twiddle-dee, twiddle-dum age. In fact, many “no longer accept the possibility of assembling a complete picture of reality,” writes literary critic Sven Birkerts. Now that’s a problem that demands attention.
This is not a diatribe against technology. I own an iPhone. I text message. I’m not a Luddite. But I’m not texting every waking moment. Yes, there are stories of good connections being made by the availability of communications technologies such as Twitter. But do the exceptions prove the rule?
Paul Goodman, in New Reformation, says technology was once considered a branch of moral philosophy, not of science. In the nineteenth century, science usurped philosophy’s role with the attitude that if something can be done, it should be done. Moral philosophy would counter: just because something can be done does not necessarily mean we ought to do it. Because the gospel is no longer considered by many to be a coherent definition of reality, we have no moral boundaries left for the wise use of technologies. Instead, people sleep with their phones on, not wanting to miss any breaking news. The sad part is that these people can’t pay attention to the Big News.
Neil Postman predicted a “thought-world that functions not only without a transcendent narrative to provide moral underpinnings but also without strong social institutions to control the flood of information produced by technology.”5 That’s worth paying attention to as more and more people twitter—or is it twiddle-dee, twiddle-dum?
1 Daniel Walker Howe, What Hath God Wrought: The Transformation of America, 1815-1848 (Oxford, UK: Oxford University Press, 2007), p. 5.
2 Tom Standage, The Victorian Internet: The Remarkable Story of the Telegraph and the Nineteenth Century’s On-Line Pioneers (New York, NY: Walker Publishing Company, 1998), pp. 25, 102.
3 Maggie Jackson, Distracted: The Erosion of Attention and the Coming Dark Age (New York, NY: Prometheus, 2008), p. 72.
4 Mark Bauerlein, The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (New York, NY: Tarcher/Penguin, 2009), p. 143.
5 Neil Postman, Technopoly: The Surrender of Culture to Technology (New York, NY: Random House, 1993), p. 83.