
Tuesday 25 September 2012

Does The Internet Ruin Memory?


A new book on memory chronicles how we increasingly rely on technology like flash drives and less on our own memories. Austen Rosenfeld on whether that makes us less human.

A tourist takes a picture of the Empire State Building on his iPhone, deletes it, then takes another one from a different angle. But what happened to that first image? The delete button on our cameras, phones, and computers is a function we use often without thinking, yet it remains a bizarre concept. Most things in the world don’t just disappear. Not our discarded plastic water bottles. Not the keys to the apartment. Not our earliest childhood memories.
“It is possible that every memory you have ever experienced that made its way into your long-term memory is still buried somewhere in your head,” Michael S. Malone writes in his new book The Guardian of All Things: The Epic Story of Human Memory. It is both a blessing and a curse that we cannot voluntarily erase our memories, Men in Black style. Like it or not, we are stuck with our experiences. It’s just one of the many ways that human beings differ from digital cameras.
Yet we rely more and more on digital cameras and less on our own minds. Malone tells the story of how, over time, humans have externalized their internal memories, unhinging themselves from the experiences they own. The book is a chronological history of how we place increasing trust in technology, from the development of parchment and libraries to cameras and microchips.
Is it a good thing for electronic devices and the Internet to store our memories for us? When we allow that to happen, who do we become? Will our brains atrophy if we choose not to exercise them? Malone, a Silicon Valley reporter, shows us the technological progress but backs away from deeper philosophical questions. His love for breaking news, for the very idea of the breakthrough, is apparent, but he fails to address the more distressing implications.
The biology of human memory is largely mysterious. It is one of the few remaining brain functions that neuroscientists cannot pin to a single location. Memory neurons are distributed all over the brain, hidden in its gray wrinkles like money behind couch cushions. "What a lark! What a plunge!" thinks Clarissa early in Virginia Woolf's Mrs. Dalloway, as she tosses open her French windows and is transported into her remembered past. "Live in the moment" is a directive we often hear these days in yoga class, but our ability to weave in and out of the past is what makes human life both interesting and difficult.
The Neanderthal brain was powerful but, lacking a high-capacity memory, was "forever trapped in the now," according to Malone. The stories, images, and phrases that we turn over in our minds while lying awake in bed were different for them. Neanderthals could receive the stimuli of the world (colors, sounds, smells) but had limited ways to organize or access that information. Even the term Homo sapiens sapiens, as opposed to Homo sapiens neanderthalensis, reveals how our brains work differently from those of our ancestors. Translated from the Latin, it means knowing, knowing man. Not only do we know, but we know that we know. Our self-consciousness, that ability not only to make memories but to recall them, is what defines us.
Short-term memories are created by the synthesis of certain proteins in a cell, and long-term memories are created when magnesium is released, clearing the way for calcium. Each memory is then embedded like handprints in concrete. This is what we know about the physical process of memory-making. Why a person might remember the meal they ate before their parents announced a divorce, but not the announcement itself, remains a scientific mystery.
The advent of language is intrinsically linked to memory, and many early languages were simply mnemonic devices. They served as a method for sharing memories, an early form of fact-checking that also extended the lifetime of a memory. The Library of Alexandria is an example of a population's desire to catalog a communal memory and situate it safely outside their own transient bodies. From stone and clay tablets to papyrus, parchment, vellum, and rice paper, each change in the medium for chronicling memory also changed memory-making itself.
The ancient Romans even had a discipline called Ars Memorativa, the art of memory. They honored extraordinary acts of memorization just as they honored extraordinary feats in battle, and Cicero excelled at it. Memorization was an art that could be honed using geometric patterns, imaginary structures, and landscapes.
The invention of computer memory changes everything. We now have "Moore's Law," the notion that memory chips will double in performance roughly every 18 months. Memory receptacles continue to decrease in size while our memories accumulate daily. Because of growing access to the Internet, Malone argues, individualized memory matters less and less. Schoolchildren today take open-book tests or use calculators. "What matters now is not one's ownership of knowledge, but one's skill at accessing it and analyzing it," he writes. However, something is lost when we can so easily consult the oracle for answers. We have unlimited access to a wealth of information, yet little of it belongs to us.
Human beings have a notion of self, a subjective world particular to us, thanks to our highly complicated and individualized brains, which Malone compares to "the roots and branches of a tree." We own our own hardware, and we all remember differently. The Internet offers us access to information, but it is really part of the external world of colors and sounds that even Neanderthals could receive. A world in which all our memories are stored on electronic devices and all our answers can be found by Googling is a world closer to the Neanderthal's than to a high-tech, utopian future. I don't remember when I first learned the phrase déjà vu, but I do remember the shirt I wore on the first day of 9th grade. Memory is a tool, but it can also teach us what we think is important. Human memory is a way for us to learn about ourselves.
