Saturday, July 2, 2011

The Shallows

The Shallows: What the Internet Is Doing to Our Brains

(Kindle Edition, The Shallows from Flipkart.com, my review on Amazon.com)
Misled by a metaphor. "Filled, not with wisdom, but with the conceit of wisdom"

This is an outstanding book that does a superlative job of cogently presenting the history of the written word, the art of reading, the science of memory, and how the internet disrupts the neurological processes at the heart of comprehension.

It is a testament to the incendiary nature of the topic (the suggestion that the internet may be affecting our minds in ways that are not always positive, that it may actually be harming our capacity to focus, and that it may be doing so by altering the way our brains are wired) that even smart and reasonable people like John Battelle, author of The Search: How Google and Its Rivals Rewrote the Rules of Business and Transformed Our Culture, an excellent and bestselling history of search engines on the net, become unhinged when commenting on it. See his post (no, more a hasty, flustered rant), Google: Making Nick Carr Stupid, But It's Made This Guy Smarter, from June 2008, written in response to Nicholas Carr's article, Is Google Making Us Stupid?, in the July/August 2008 issue of The Atlantic.


Nicholas Carr's book, however, is very well written. Even if you disagree, for whatever reason, with its premise, you owe it to yourself to read it. This is not to gainsay the fact that Carr has a predilection for provocative, almost needling, sensationalistic headlines, which can sometimes overwhelm the sound reasoning underneath. A minor broadside against Google (and Google Books) notwithstanding (to sample: "The last thing the company wants is to encourage leisurely reading or slow, concentrated thought. Google is, quite literally, in the business of distraction", pg 157), this is not an incendiary hatchet job against the internet or any company. Whether you are convinced or not is beside the point; this book will surely enlighten you in at least some ways about how we think and how we remember what we remember. And that is worth something, surely.

As we use a medium, be it pencil and paper, a computer, a computer connected to the net, or a tablet, our expression becomes an extension of us, amplified by the medium and, in turn, shaped by it. For those who grew up with paper and pencil and have graduated to the computer, the author's observations will surely feel familiar.
At first I had found it impossible to edit anything on-screen. I’d print out a document, mark it up with a pencil, and type the revisions back into the digital version. Then I’d print it out again and take another pass with the pencil. Sometimes I’d go through the cycle a dozen times a day. But at some point—and abruptly—my editing routine changed. I found I could no longer write or revise anything on paper. I felt lost without the Delete key, the scrollbar, the cut and paste functions, the Undo command. I had to do all my editing on-screen. In using the word processor, I had become something of a word processor myself.

This thought, of becoming a sort of extension of, connected in a very strong way with, the machine we use, is not new. There is, however, a tradeoff here. While the implements we use amplify or extend our natural capabilities, whether physical, sensory, or intellectual (think of a plow or a Geiger counter), they also deaden the extended parts of our bodies. The internet, in all its glory, provides us access to information that most could not have dreamt of a generation ago. The speed with which information can be accessed, the vast amounts of curated and uncurated information available at our fingertips, and the relatively low cost of accessing it would have been unthinkable even two decades ago. While paper encyclopedias cost tens of thousands of rupees, you can have access to an order of magnitude more information (think Wikipedia.org) for practically free, and with mostly the same degree of accuracy. Access to such a cornucopia of information is not, however, without its dark side. Recent neural research into how the brain reacts when surfing, how we respond to this overload of information, and how it affects us has led to a new strand of thought: that the mind loses some of its ability to think, focus, and indulge in deep, contemplative thought as a result.
Marshall McLuhan, who was Culkin’s intellectual mentor, elucidated the ways our technologies at once strengthen and sap us. In one of the most perceptive, if least remarked, passages in Understanding Media, McLuhan wrote that our tools end up “numbing” whatever part of our body they “amplify.” When we extend some part of ourselves artificially, we also distance ourselves from the amplified part and its natural functions.


At some point in the past, we remembered everything we needed to know; the written word did not exist. Then humans started writing: on cave walls, then on tablets, then on papyrus, then on paper. The history of the written word can be traced back several thousand years, to when the Sumerians started to use clay tablets inscribed with a reed. The Egyptians used scrolls made from papyrus (a plant) around 2500 BCE, and the Greeks and Romans adopted these scrolls for their writings. And writing then was hugely different from what we know today. How? Well, it turns out there were no spaces between the words. Right. So a sentence like "the quick brown fox jumps over the lazy dog" would have been written as "thequickbrownfoxjumpsoverthelazydog". Rather painful to read, right? The reason seems to be that the "lack of word separation reflected language’s origins in speech. When we talk, we don’t insert pauses between each word—long stretches of syllables flow unbroken from our lips". Writing was meant not so much to be read as to be read out loud; the practice of reading silently came much later.
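To get a feel for the cognitive work that writing without spaces demanded of the reader, consider a toy word-segmentation exercise. This is a minimal sketch; the tiny dictionary and the greedy longest-match strategy are my own illustrative assumptions, not anything from the book.

```python
# Toy illustration: recovering word boundaries from unspaced text,
# the work an ancient reader's brain did on every line.
WORDS = {"the", "quick", "brown", "fox", "jumps", "over", "lazy", "dog"}

def segment(text):
    """Greedily split unspaced text into known words (may fail or
    mis-split on ambiguous input; real segmentation is harder)."""
    words, i = [], 0
    while i < len(text):
        for j in range(len(text), i, -1):   # longest match first
            if text[i:j] in WORDS:
                words.append(text[i:j])
                i = j
                break
        else:
            return None                     # no known word fits here
    return words

print(segment("thequickbrownfoxjumpsoverthelazydog"))
# ['the', 'quick', 'brown', 'fox', 'jumps', 'over', 'the', 'lazy', 'dog']
```

Putting the spaces back in is exactly the work that, once the convention changed, readers no longer had to do.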

When the practice of placing spaces between words did take hold, it "alleviated the cognitive strain involved in deciphering text, making it possible for people to read quickly, silently, and with greater comprehension. Such fluency had to be learned. It required complex changes in the circuitry of the brain, as contemporary studies of young readers reveal."

Readers didn’t just become more efficient. They also became more attentive. To read a long book silently required an ability to concentrate intently over a long period of time, to “lose oneself” in the pages of a book, as we now say. Developing such mental discipline was not easy. The natural state of the human brain, like that of the brains of most of our relatives in the animal kingdom, is one of distractedness. Our predisposition is to shift our gaze, and hence our attention, from one object to another, to be aware of as much of what’s going on around us as possible.

The practice of deep reading that became popular in the wake of Gutenberg’s invention, in which “the quiet was part of the meaning, part of the mind,” will continue to fade, in all likelihood becoming the province of a small and dwindling elite. We will, in other words, revert to the historical norm.


And herein lies one of the reasons why the internet functions as a distractor, a destroyer of attention. Why? Because the internet presents information in a way that requires us to evaluate everything on the page, the hyperlinks, text boxes, adverts, popups, tooltips, the chrome, everything, and to make assessments as to its utility.

It is about two-fifths of the way through the book (on page 111 or thereabouts) that the question that really forms the title of the book makes an appearance.
Now comes the crucial question: What can science tell us about the actual effects that Internet use is having on the way our minds work? (bold emphasis mine)
Dozens of studies by psychologists, neurobiologists, educators, and Web designers point to the same conclusion: when we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning. It’s possible to think deeply while surfing the Net, just as it’s possible to think shallowly while reading a book, but that’s not the type of thinking the technology encourages and rewards.

“Our senses are finely attuned to change,” explains Maya Pines of the Howard Hughes Medical Institute. “Stationary or unchanging objects become part of the scenery and are mostly unseen.” But as soon as “something in the environment changes, we need to take notice because it might mean danger—or opportunity.” Our fast-paced, reflexive shifts in focus were once crucial to our survival. They reduced the odds that a predator would take us by surprise or that we’d overlook a nearby source of food. For most of history, the normal path of human thought was anything but linear.

Whenever we, as readers, come upon a link, we have to pause, for at least a split second, to allow our prefrontal cortex to evaluate whether or not we should click on it. The redirection of our mental resources, from reading words to making judgments, may be imperceptible to us - our brains are quick - but it’s been shown to impede comprehension and retention, particularly when it’s repeated frequently.
...
Hyperlinks also alter our experience of media. ... Links don’t just point us to related or supplemental works; they propel us toward them. They encourage us to dip in and out of a series of texts rather than devote sustained attention to any one of them. Hyperlinks are designed to grab our attention. Their value as navigational tools is inextricable from the distraction they cause.

The brain becomes better at what it is made to do. Simply put, practice makes perfect: perfect at good things, perfect at not-so-good things, perfect at insane things. Fungibility is a term used mostly in an economic sense, as when money is termed fungible, capable of being spent on interchangeable things. We can use money to buy popcorn or a soda at the movies, or we can use the same money to buy a book and a coffee. The mind is not dissimilar. If we use it for one thing, it is not being used for another. It becomes good at performing task A, and over time it becomes less capable of performing task B. The mind allocates resources, in an almost recursive loop, to the tasks it is made to do most often (a toy numerical sketch of this use-it-or-lose-it dynamic follows the quotes below).
The paradox of neuroplasticity, observes Doidge, is that, for all the mental flexibility it grants us, it can end up locking us into “rigid behaviors.”
...
It comes as no surprise that neuroplasticity has been linked to mental afflictions ranging from depression to obsessive-compulsive disorder to tinnitus.
...
In the worst cases, the mind essentially trains itself to be sick.
...
Experiments show that just as the brain can build new or stronger circuits through physical or mental practice, those circuits can weaken or dissolve with neglect. “If we stop exercising our mental skills,” writes Doidge, “we do not just forget them: the brain map space for those skills is turned over to the skills we practice instead.”
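A crude way to picture Doidge's "brain map space... turned over" is as a fixed budget of circuit strength shared between skills, where practicing one skill pulls strength away from the neglected one. The following toy model is my own illustrative sketch, not a model from the book or from neuroscience.

```python
# Toy model: a fixed budget of "circuit strength" shared by two skills.
# Each practice session shifts a fraction of the neglected skill's
# strength toward the practiced one. Purely illustrative numbers.

def practice(strengths, skill, sessions, rate=0.1):
    """Shift strength toward `skill`, one session at a time."""
    for _ in range(sessions):
        for other in strengths:
            if other != skill:
                transfer = strengths[other] * rate
                strengths[other] -= transfer
                strengths[skill] += transfer
    return strengths

skills = {"deep_reading": 0.5, "skimming": 0.5}
print(practice(skills, "skimming", sessions=20))
# {'deep_reading': ~0.06, 'skimming': ~0.94}: the total is unchanged,
# but the unpracticed skill's share has withered.
```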

Memory and Remembering
Since time immemorial, children have been taught to memorize the things they wanted to remember and understand. Hindu scriptures like the Vedas and the Gita were memorized in their entirety by scholars, with amazing accuracy, and passed down from one generation to the next in this manner. Decade after decade. Century after century. Millennium after millennium. So it was with other scriptures too, where it was vital that one memorized what one intended to remember. Word of mouth was the only way to transmit information.
Sometime during the industrialization of society, with the advent of machines, and calculators, and computers, and recording tape, and so on, the ability to memorize came to be seen as the sign of a primitive mind unwilling to adapt with the times, ridiculed as nothing more than "learning by rote", frowned upon as old-fashioned and out of tune with the direction the modern world was moving in.
by the middle of the twentieth century memorization itself had begun to fall from favor. Progressive educators banished the practice from classrooms, dismissing it as a vestige of a less enlightened time. What had long been viewed as a stimulus for personal insight and creativity came to be seen as a barrier to imagination and then simply as a waste of mental energy.

In trying to understand how we learn, how we understand something, and how we remember stuff, we have to turn to the "science" guys to enlighten us, whether they are psychologists who conduct experiments or researchers who put people into big scanning machines (CAT scanners, for example) to peer inside our brains and see what happens when we read, surf, or do stuff.
In 1885, the German psychologist Hermann Ebbinghaus conducted an exhausting series of experiments, using himself as the sole subject, that involved memorizing two thousand nonsense words. He discovered that his ability to retain a word in memory strengthened the more times he studied the word and that it was much easier to memorize a half dozen words at a sitting than to memorize a dozen. He also found that the process of forgetting had two stages. Most of the words he studied disappeared from his memory very quickly, within an hour after he rehearsed them, but a smaller set stayed put much longer - they slipped away only gradually.
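Ebbinghaus's findings are often summarized as the "forgetting curve", commonly modeled as an exponential decay of retention, R = e^(-t/S), where S is the strength of the memory and repetition increases S. Here is a minimal sketch; the functional form and the doubling of strength per rehearsal are standard textbook simplifications of my own choosing, not numbers from Carr's book.

```python
import math

# Forgetting curve: retention R = exp(-t / S), where t is hours since
# study and S is memory "strength". Each rehearsal is assumed, purely
# for illustration, to double S, so the memory decays more slowly.

def retention(hours_elapsed, rehearsals, base_strength=1.0, boost=2.0):
    strength = base_strength * (boost ** rehearsals)
    return math.exp(-hours_elapsed / strength)

for n in (0, 1, 3):
    print(f"{n} rehearsal(s): {retention(1.0, n):.2f} retained after an hour")
# 0 rehearsal(s): 0.37 retained after an hour
# 1 rehearsal(s): 0.61 retained after an hour
# 3 rehearsal(s): 0.88 retained after an hour
```

The shape matches what Ebbinghaus observed: steep loss within the first hour, flattening out for whatever repetition has managed to anchor.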

Repetition is key to memorization. And we forget stuff unless we make a conscious effort to memorize it. Furthermore...
Müller and Pilzecker concluded that it takes an hour or so for memories to become fixed, or “consolidated,” in the brain. Short-term memories don’t become long-term memories immediately, and the process of their consolidation is delicate. Any disruption, whether a jab to the head or a simple distraction, can sweep the nascent memories from the mind.
The very act of remembering, explains clinical psychologist Sheila Crowell in The Neurobiology of Learning, appears to modify the brain in a way that can make it easier to learn ideas and skills in the future.
The key to memory consolidation is attentiveness. Storing explicit memories and, equally important, forming connections between them requires strong mental concentration, amplified by repetition or by intense intellectual or emotional engagement. The sharper the attention, the sharper the memory. “For a memory to persist,” writes Kandel, “the incoming information must be thoroughly and deeply processed. This is accomplished by attending to the information and associating it meaningfully and systematically with knowledge already well established in memory.” If we’re unable to attend to the information in our working memory, the information lasts only as long as the neurons that hold it maintain their electric charge—a few seconds at best. Then it’s gone, leaving little or no trace in the mind.
Distracted To Distractions
When we are distracted, we are less able to be who we are. And that includes losing, at least temporarily, some of the most human of emotions... which is why some of the most profound philosophical thinking in the history of humanity has come from the minds of people who chose to cut themselves off from distractions and focus on single-minded contemplation.
When our brain is overtaxed, we find “distractions more distracting.” (Some studies link attention deficit disorder, or ADD, to the overloading of working memory.) Experiments indicate that as we reach the limits of our working memory, it becomes harder to distinguish relevant information from irrelevant information, signal from noise. We become mindless consumers of data.
Hyperlinks are, simply put, distractions. They distract from the text we are reading. Visually, mentally, cognitively.
Deciphering hypertext substantially increases readers’ cognitive load and hence weakens their ability to comprehend and retain what they’re reading. A 1989 study showed that readers of hypertext often ended up clicking distractedly “through pages instead of reading them carefully.” A 1990 experiment revealed that hypertext readers often “could not remember what they had and had not read.”

...research continues to show that people who read linear text comprehend more, remember more, and learn more than those who read text peppered with links. (bold-emphasis mine)

The information flowing into our working memory at any given moment is called our “cognitive load.” When the load exceeds our mind’s ability to store and process the information—when the water overflows the thimble—we’re unable to retain the information or to draw connections with the information already stored in our long-term memory.
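Carr's thimble-and-water image maps neatly onto a fixed-capacity buffer: once the buffer is full, each new item displaces an older one before it can be consolidated. A minimal sketch, with the capacity of four and the oldest-out eviction rule as illustrative assumptions only:

```python
from collections import deque

# Working memory as a tiny fixed-capacity buffer: when items arrive
# faster than they can be consolidated into long-term memory, the
# oldest are silently pushed out.
working_memory = deque(maxlen=4)

for item in ["author's thesis", "hyperlink", "advert",
             "popup", "tooltip", "share button"]:
    working_memory.append(item)

print(list(working_memory))
# ['advert', 'popup', 'tooltip', 'share button']
# The thesis we were reading for was displaced before it could stick.
```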

A series of psychological studies over the past twenty years has revealed that after spending time in a quiet rural setting, close to nature, people exhibit greater attentiveness, stronger memory, and generally improved cognition.

The Conceit of Technology

It has long been the pastime, almost an obsession, of experts and wannabe experts and everyone in between to prognosticate the demise of the book in response to every new technology that emerges on the horizon. Some have been quick to be seduced by the allure of something new simply because it is new, while others see in the new technology a real promise of improving what has been a centuries-old invention, the mass-printed book. Memory is like a tape recorder. No, it is like a compact disc. No, it is more like RAM (Random Access Memory). Like a hard disk. Or maybe like flash memory. There you go. All that science needs to do is figure out a way to attach a 64 gig module, no, make that a terabyte, or petabyte, chip to our brains, and then we will not need to remember anything. Because it will all be there, accessible on this chip embedded inside our skulls. Better still, wire some optical, wireless network connection to the internet into our brains, and we will become one with the internet, the large, connected, breathing mass of life that is out there. We will all be infinitely smarter once that happens, and we will all be freed from our human limitations. Utopia will finally be upon us. We will be gods. All of us. Able to recall every tweet, every Facebook post, every poke, every photo ever uploaded that talks about us or mentions us.
In an 1889 essay in the Atlantic Monthly, Philip Hubert predicted that “many books and stories may not see the light of print at all; they will go into the hands of their readers, or hearers rather, as phonograms.”
But not all have fallen for the wiles of every new technology. Witness the indignation expressed here by scientists who saw a computer's multi-tasking abilities as an invitation to distractedness.
A group of prominent computer scientists had been invited to PARC to see a demonstration of a new operating system that made “multitasking” easy.
...
“Why in the world would you want to be interrupted—and distracted—by e-mail while programming?” one of the attending scientists angrily demanded.

Even the analogy comparing human memory to computer memory, the notion that bits and bytes can serve as adequate measures of human memory, is not borne out by the actual composition of the brain.
Governed by highly variable biological signals, chemical, electrical, and genetic, every aspect of human memory - the way it’s formed, maintained, connected, recalled - has almost infinite gradations. Computer memory exists as simple binary bits - ones and zeros - that are processed through fixed circuits, which can be either open or closed but nothing in between.

Those who celebrate the “outsourcing” of memory to the Web have been misled by a metaphor. They overlook the fundamentally organic nature of biological memory. What gives real memory its richness and its character, not to mention its mystery and fragility, is its contingency. It exists in time, changing as the body changes. Indeed, the very act of recalling a memory appears to restart the entire process of consolidation, including the generation of proteins to form new synaptic terminals.

The proponents of the outsourcing idea also confuse working memory with long-term memory. When a person fails to consolidate a fact, an idea, or an experience in long-term memory, he’s not “freeing up” space in his brain for other functions. In contrast to working memory, with its constrained capacity, long-term memory expands and contracts with almost unlimited elasticity, thanks to the brain’s ability to grow and prune synaptic terminals and continually adjust the strength of synaptic connections.


Are you a Luddite? Are you a closet Luddite? Yes. That's what you are. You would have us go back to the Stone Age. You are unable to handle technology.
So the internet is an unmitigated disaster, huh? So you are a Luddite who would send us back to the good old days when there were no computers, no internet, no telephones, no television, no telegraph, no paper? No, the author does not say that. Quite the contrary.
The ability to skim text is every bit as important as the ability to read deeply. What is different, and troubling, is that skimming is becoming our dominant mode of reading. Once a means to an end, a way to identify information for deeper study, scanning is becoming an end in itself...
...
Research shows that certain cognitive skills are strengthened, sometimes substantially, by our use of computers and the Net. These tend to involve lower-level, or more primitive, mental functions such as hand-eye coordination, reflex response, and the processing of visual cues.
...
David Meyer, a University of Michigan neuroscientist and one of the leading experts on multitasking, makes a similar point. As we gain more experience in rapidly shifting our attention, we may “overcome some of the inefficiencies” inherent in multitasking, he says, “but except in rare circumstances, you can train until you’re blue in the face and you’d never be as good as if you just focused on one thing at a time.” ... What we’re doing when we multitask “is learning to be skillful at a superficial level.”


The brain races as it ingests new bytes of information on the net. Hyperlinks, snippets, multimedia inserts, comments, the "like/share/tweet/post/email" buttons: all cause us to, at least temporarily, stop what we are doing, focus on these distractions, evaluate their utility, and then return, if we so choose, to the task we were doing a few moments earlier. All this means that we are constantly under information overload; our working memory is always struggling to juggle this constant inflow of information. Contemplative thought is the first to suffer, and comprehension along with it. Over time, our brains become more adept at such 'pecking' when on the net, and we lose the ability to sit and comprehend a topic without getting distracted. Distraction makes for shallow learning.

Then there is also the disturbing finding that the internet, in some instances at least, encourages and actually drives more homogeneity in thinking, rather than the opposite, which is what we would expect, given the vast diversity of information supposedly within our reach. In other words, the internet can also become a vast echo chamber, amplifying and thereby distorting the importance of what is already within it.
James Evans, a sociologist at the University of Chicago, assembled an enormous database on 34 million scholarly articles published in academic journals from 1945 through 2005. He analyzed the citations included in the articles to see if patterns of citation, and hence of research, have changed as journals have shifted from being printed on paper to being published online.
...
In explaining the counterintuitive findings in a 2008 Science article, Evans noted that automated information-filtering tools, such as search engines, tend to serve as amplifiers of popularity, quickly establishing and then continually reinforcing a consensus about what information is important and what isn’t.
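The mechanism Evans describes is essentially a rich-get-richer feedback loop: what is already cited ranks higher in search results, gets read more, and so gets cited more. A toy simulation of that dynamic, with the article count, citation count, and proportional-to-popularity choice rule all being my own illustrative assumptions rather than anything from Evans's study:

```python
import random

# Rich-get-richer loop: each new citation picks an article with
# probability proportional to its current citation count, much as a
# popularity-ranked search result attracts the next reader.
random.seed(42)
citations = [1] * 20              # twenty articles, one citation each

for _ in range(1000):             # a thousand new citations arrive
    pick = random.choices(range(20), weights=citations)[0]
    citations[pick] += 1

print(sorted(citations, reverse=True))
# A handful of articles absorb most of the citations; the long tail
# stays obscure, even though every article started out equal.
```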


But let's also be reasonable and not make the leap from a simple observation to a blanket assertion of moral decay.
It would be rash to jump to the conclusion that the Internet is undermining our moral sense. It would not be rash to suggest that as the Net reroutes our vital paths and diminishes our capacity for contemplation, it is altering the depth of our emotions as well as our thoughts.
Maybe technology and science will evolve to where they don't place such a burden of distraction on our cognitive senses. Maybe humans themselves will evolve, as they continually do. Maybe not. In which case the winners in this world of technology will be the ones who learn to keep technology at arm's length even as they master it. Maybe.

The Shallows: What the Internet Is Doing to Our Brains
The Big Switch: Rewiring the World, from Edison to Google
Does It Matter?: Information Technology and the Corrosion of Competitive Advantage

© 2011, Abhinav Agarwal. All rights reserved.