Memory Psychology PDF
Summary
This chapter explores different aspects of memory, from historical examples of memorization techniques like Genghis Khan's use of song to modern strategies used in medical training. The process of acquisition, storage, and retrieval of information is discussed, along with memory errors and how memory can be improved.
Full Transcript
CHAPTER 8 Memory

Genghis Khan got around. At the dawn of the 13th century, the Mongolian warrior conquered the largest empire the world had ever known: an expanse stretching from the Sea of Japan in the east to the Caspian Sea in the west, from Siberia in the north to India in the south. To conquer this territory and then maintain his domination, the emperor had to formulate complex plans. This created a problem: His soldiers were illiterate peasants, scattered over thousands of miles. How could he spread his complicated orders through the ranks quickly, simply, and without error? His solution: Put the orders in a song. All the Khan's soldiers learned a small set of melodies, which they practiced as they traversed the mountains and steppes. Then, when the time for fighting arrived, commanders would set their orders to the tune of one of these melodies. The soldiers' task was simple: memorize a few new verses for an old song, rather than a series of entirely unfamiliar, abstract instructions. And if any one of the soldiers forgot the lyrics, hundreds of others could sing him the next line. Using this scheme, the soldiers crooned their battle instructions, and large segments of Eurasia fell.

Others in the ancient world also relied on deliberate memorization strategies. The Greeks of classical Athens, for example, put a high value on public speaking, much of which was done from memory. The Greeks therefore developed a number of memorization tricks (some of which we'll discuss later in the chapter) to help them in this endeavor.

Similar mnemonic tactics are used in the modern world. Medical students, for example, have developed strategies that help them memorize anatomy, drug names, and disease symptoms. Thus, they learn the 12 pairs of cranial nerves (olfactory, optic, oculomotor, trochlear, trigeminal, and so on) by taking the first letter of each word and forming a sentence built from new words that start with the same letters. The resulting sentence—"On Old Olympus' Towering Tops A Friendly Viking Grew Vines and Hops"—paints a vivid image that's far easier to remember than the original list.

These examples remind us that—with just a bit of work—we can get enormous amounts of information into our memories, and then recall that information, in detail, for a very long time. But there's also a darker side to memory: Sometimes we remember things that never happened at all. Indeed, far more often than we realize, our memories blend together separate incidents, introduce rogue details, and incorporate others' versions of events into our own recall. In this chapter, you'll learn how these memory errors arise and what they tell us about remembering.

How far off track can memory go? In one study, researchers planted in participants a memory of getting lost in the mall as a child, then being brought home safely by a friendly stranger. Nothing of the sort had happened to anyone in the study, but they came to vividly "remember" it anyhow. Other studies have planted false memories of vicious animal attacks, and even—in one remarkable study—a false memory of a hot-air balloon ride.

How should we put these pieces together? How does memory operate, so that we can easily remember countless episodes, thousands of facts, and the lyrics to hundreds of songs? Why does Genghis Khan's lyrical trick, or the medical students' sentence-building strategy, help memory? More broadly, what can we do to learn more rapidly and hold on to the information longer?
And why do our memories sometimes betray us, leading us to endorse large-scale fictions? We'll tackle all of these questions in this chapter.

ACQUISITION, STORAGE, RETRIEVAL

Each of us has a huge number of memories. We can recall what we did yesterday, or last summer. We can remember what the capital of France is, or what the chemical formula is for water. We remember how to ride a bicycle and how to throw a baseball. These examples—remembering episodes, remembering general facts, and remembering skills or procedures—actually draw on different memory systems; but it also turns out that the various types of memory have some things in common, so let's begin with the common elements.

Any act of remembering requires success at three aspects of the memory process. First, in order to remember, you must learn something—that is, you must put some information into your memory. This point seems obvious, but it deserves emphasis because many failures of memory are, in fact, failures in this initial stage of acquisition. For example, imagine meeting someone at a party, being told his name, and moments later realizing that you don't have a clue what his name is—even though you just heard it! This common (but embarrassing) experience is probably not the result of ultrarapid forgetting. Instead, it's likely to stem from a failure in acquisition. You were exposed to the name but barely paid attention to it and, as a result, never learned it in the first place.

The next aspect of remembering is storage. To be remembered, an experience must leave some record in the nervous system. This record—known as the memory trace—is squirreled away and held in some enduring form for later use. One question to be asked here is how permanent this storage is: Once information is in storage, does it stay there forever? Or does information in storage gradually fade away? We'll tackle these questions later in this chapter.

The final aspect of remembering is retrieval, the process through which you draw information from storage and use it. Sometimes, retrieval takes the form of recall—a process in which you retrieve information from memory in response to some cue or question (Figure 8.1A). Trying to answer a question like "What is Sue's boyfriend's name?" or "Can you remember the last time you were in California?" requires recall. A different way to retrieve information is through recognition (Figure 8.1B). In this kind of retrieval, you're presented with a name, fact, or situation and are asked if you have encountered it before. "Is this the man you saw at the bank robbery?" or "Was the movie you saw called Memento?" are questions requiring recognition. Recognition can also be tested with multiple items: "Which of these pictures shows the man you saw earlier?" This latter format obviously resembles a multiple-choice exam, and in fact multiple-choice testing in the classroom probes your ability to recognize previously learned material. In contrast, exams that rely on essays or short answers emphasize recall.

Figure 8.1: Using memory. (A) In this card game, you need to recall which card is in which position; in this case, position is the memory cue, and card identity is what you're trying to recall. (B) Most standardized tests, in multiple-choice format, rely on recognition. The correct answer is in front of you, as one of your options, and you need to recognize it.

recall: A type of retrieval that requires you to produce an item from memory in response to a cue or question.
ACQUISITION

People commonly speak of "memorizing" new facts or, more broadly, of "learning" new material. However, psychologists prefer the term memory acquisition and use it to include cases of deliberate memorization (intentional learning) as well as cases of incidental learning—learning that takes place without any intention to memorize and often without the awareness that learning is actually occurring. (You know that grass is green and the sky is blue, and you probably can easily recall what you had for dinner yesterday, but you didn't set out to memorize these facts; the learning, therefore, was incidental.)

recognition: A type of retrieval that requires you to judge whether you have encountered a stimulus previously.
acquisition: The processes of gaining new information and placing it in memory.
intentional learning: Placing new information into memory in anticipation of being tested on it later.
incidental learning: Learning without trying to learn, and often without awareness that learning is occurring.

Memory acquisition is not just a matter of "copying" an event or a fact into memory, the way a camera copies an image onto film. Instead, acquisition requires some intellectual engagement with the material—thinking about it in some way—and it's then the product of this engagement (i.e., what you thought about during the event) that's stored in memory. As we'll see, this simple point turns out to have crucial implications for what you will remember and for how accurate (i.e., true to the actual history) your memory will be.

Working Memory, Long-Term Memory

How does memory acquisition proceed? The answer has to begin with the fact that we have several types of memory, each with different properties, and each type plays its own role in the acquisition process. Historically, these different types have been described in terms of the stage theory of memory, which proposed (among other points) that memory acquisition could be understood as dependent on three types of memory: When information first arrived, it was stored briefly in sensory memory, which held onto the input in "raw" sensory form—an iconic memory for visual inputs and an echoic memory for auditory inputs. A process of selection and interpretation then moved the information into short-term memory—the place you hold information while you're working on it. Some of the information was then transferred into long-term memory, a much larger and more permanent storage place (Atkinson & Shiffrin, 1968; Broadbent, 1958; Waugh & Norman, 1965).

working memory: A term describing the status of thoughts in memory that are currently activated.
long-term memory: The vast memory depository containing all of an individual's knowledge and beliefs—including all those not in use at any given time.

This early conception of memory captured some important truths—but needs to be updated in several ways. As one concern, the idea of "sensory memory" plays a much smaller role in modern theorizing and so, for example, many discussions of visual information processing (like our discussion in Chapters 4 and 5) make no mention of iconic memory. In addition, modern proposals use the term working memory rather than short-term memory to emphasize the function of this memory: Ideas or thoughts in this memory are currently activated, currently being thought about—and so they're the ideas you are currently working on.
Long-term memory, in contrast, is the vast depository that contains all of your knowledge and all of your beliefs that you happen not to be thinking about at the moment, and this includes your beliefs about relatively recent events. Thus, if just a few minutes ago you were thinking about your weekend plans but now you're thinking about something else, these plans are gone from working memory (because you're no longer working on them); and so, if you can recall your plans, you must be drawing them from long-term memory.

Let's note, though, that what's at stake here is more than a shift in terminology, because the modern view also differs from the stage theory in how it conceptualizes memory. In the older view, working memory was understood broadly as a storage place, and it was often described as the "loading dock" just outside the long-term memory "warehouse." In the modern conception, working memory is not a "place" at all; instead, it's just the name we give to a status. When we say that ideas are "in working memory," this simply means—as we've already noted—that these ideas are currently activated. This focus on status is also the key to understanding the difference between working memory and long-term memory—the modern conception emphasizes whether the mental content is currently active (working memory) or not (long-term memory), in contrast to older theory's emphasis on time frame ("short term" or "long").

PRIMACY AND RECENCY

Why should we take this broad proposal seriously? Why should we make any distinction between working memory and long-term memory, and why should we think about working memory in the way we've just described? As a first step toward answering these questions, consider the results of studies in which participants hear a series of unrelated words—perhaps 15 words in total, or 20, presented one word at a time. At the end of the list, the participants are asked to recall the items in any order they choose (this is why the participants' task is called free recall—they're free to recall the items in any sequence).

In this task, there's a reliable pattern for which words the participants recall and which ones they don't. Words presented at the beginning of the list are very likely to be recalled; this memory advantage for early-presented words is called the primacy effect. Likewise, the last few words presented are also likely to be recalled; this is the recency effect. The likelihood of recall is appreciably poorer for words in the middle of the list (Figure 8.2).

primacy effect: In free recall, the tendency to recall the first items on the list more readily than those in the middle.
recency effect: In free recall, the tendency to recall items at the end of the list more readily than those in the middle.

Figure 8.2: Primacy and recency effects in free recall. Research participants heard a list of 20 common words presented at a rate of 1 word per second. Immediately after hearing the list, participants were asked to write down as many of the words on the list as they could recall. The results show that position in the series strongly affected recall—participants had better recall for words at the beginning of the list (a pattern called the primacy effect) and for words at the end of the list (the recency effect).

What creates this pattern?
As the to-be-remembered words are presented, the participants pay attention to them, and this ensures the activated status that we call "working memory." There's a limit, however, on how many things someone can think about at once, and so there's a limit on how many items can be maintained in working memory. According to many authors, this limit is seven items, give or take one or two; the capacity of working memory is therefore said to be seven plus or minus two items (G. Miller, 1956). As a result, it's just not possible for the participants to maintain all of the list words in their current thoughts. Instead, they'll just do their best to "keep up" with the list as they hear it. Thus, at each moment during the list presentation, their working memories will contain only the half-dozen or so words that arrived most recently.

Notice that, in this situation, new words entering working memory will "bump out" the words that were there a moment ago. The only words that don't get bumped out are the last few words on the list, because obviously no further input arrives to displace them. Hence, when the list presentation ends, these few words are still in working memory—still in the participants' thoughts—so are easy to recall. This is why the participants reliably remember the end of the list; they are producing the result we call the recency effect.

The primacy effect comes from a different source. We know that these early words are not being recalled from working memory, because they were—as we've already noted—bumped from working memory by later-arriving words. It seems, therefore, that the primacy effect must involve long-term memory—and so, to explain why these early words are so well recalled, we need to ask how these words became well established in long-term storage in the first place. The explanation lies in how participants allocate their attention during the list presentation. To put this in concrete terms, let's say that the first word on the list is camera. When research participants hear this word, they can focus their full attention on it, silently rehearsing "camera, camera, camera, ..." When the second word arrives, they'll rehearse that one too; but now they'll have to divide their attention between the first word and the second ("camera, boat, camera, boat, ..."). Attention will be divided still further after participants hear the third word ("camera, boat, zebra, camera, boat, zebra, ..."), and so on through the list.

Notice that earlier words on the list get more attention than later ones. At the list's start, participants can lavish attention on the few words they've heard so far. As they hear more and more of the list, though, they must divide their attention more thinly, simply because they have more words to keep track of. Let's now make one more assumption: that the extra attention given to the list's first few words makes it more likely that these words will be well established in long-term memory. On this basis, participants will be more likely to recall these early words than words in the middle of the list—exactly the pattern of the data.

Support for these interpretations comes from various manipulations that affect the primacy and recency effects. For example, what happens if we require research participants to do some other task immediately after hearing the words but before recalling them?
This other task will briefly divert the participants' attention from rehearsing or thinking about the list words—and so the words will be bumped out of working memory. Working memory, in turn, was the hypothesized source of the recency effect, and so, according to our hypothesis, this other task—even if it lasts just a few seconds—should disrupt the recency effect. And indeed it does. If participants are required to count backward for just 30 seconds between hearing the words and recalling them, the recency effect is eliminated (Figure 8.3).

Figure 8.3: The recency effect and working memory. Research participants heard several 15-word lists. In one condition, free recall was tested immediately after hearing the list. In the other condition, the recall test was given after a 30-second delay during which rehearsal was prevented. The delay left the primacy effect unaffected but abolished the recency effect, confirming that this effect is based on retrieval from working memory.

Other manipulations produce a different pattern—they alter the primacy effect but have no effect on recency. For example, if we present the list items more slowly, participants have time to devote more attention to each word. But we've just proposed that attention helps to establish words in long-term memory. We should therefore expect that a slower presentation will lead to a stronger primacy effect (since primacy depends on retrieval from long-term memory) but no change in the recency effect (because the recency items aren't being retrieved from long-term memory). This is exactly what happens (Figure 8.4).

Figure 8.4: The primacy effect and long-term storage. (A) The graph compares free-recall performance when item presentation is relatively slow (2 seconds per item) and fast (1 second per item). Slow presentation enhances the primacy effect but leaves the recency effect unaltered. (B) We can also confirm the distinction between working memory and long-term memory with fMRI scans. These suggest that memory for early items on a list depends on brain areas (in and around the hippocampus) that are associated with long-term memory; memory for later items on the list does not show this pattern (Talmi, Grady, Goshen-Gottstein, & Moscovitch, 2005). Retrieval from long-term memory specifically activated the hippocampus; retrieval from working memory specifically activated the perirhinal cortex. This provides confirmation that the recency items are coming from a different source than items heard earlier in the list.

RECODING TO EXPAND THE CAPACITY OF WORKING MEMORY

As we've mentioned, working memory has a limited capacity. There is, however, enormous flexibility in how we use that capacity—and so, if we can pack the input more efficiently, we can increase the amount of information maintained in this memory. For example, consider an individual who tries to recall a series of digits that she heard only once:

177620001066

If she treats this as a series of 12 unrelated digits, she'll surely fail in her attempt to remember the series. But if she thinks of the digits as years (i.e., the year the U.S. Declaration of Independence was signed; the year of the new millennium; and the year the Normans invaded England), the task becomes much easier because now she has just three items to remember.
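To make the arithmetic of this recoding concrete, here is a minimal illustrative sketch, not part of the original chapter, that repackages the same digit string under two different grouping rules. The function names and the fixed capacity of 7 are assumptions chosen for the example, echoing the "seven plus or minus two" estimate cited above.

```python
# Illustrative sketch: recoding digits into larger chunks.
# The digit string and the year boundaries follow the textbook example;
# the capacity of 7 "chunks" is the rough limit cited in the chapter.

WORKING_MEMORY_CAPACITY = 7  # roughly "seven plus or minus two" chunks

def chunk_as_digits(digit_string):
    """Treat every digit as its own chunk."""
    return list(digit_string)

def chunk_as_years(digit_string, width=4):
    """Treat each run of `width` digits as one chunk (e.g., a familiar year)."""
    return [digit_string[i:i + width] for i in range(0, len(digit_string), width)]

def fits_in_working_memory(chunks):
    return len(chunks) <= WORKING_MEMORY_CAPACITY

digits = "177620001066"

unchunked = chunk_as_digits(digits)   # 12 chunks: '1', '7', '7', ...
as_years = chunk_as_years(digits)     # 3 chunks: '1776', '2000', '1066'

print(unchunked, fits_in_working_memory(unchunked))  # 12 items -> over capacity
print(as_years, fits_in_working_memory(as_years))    # 3 items  -> easily held
```

The same 12 digits either overflow the assumed capacity or fit comfortably, depending entirely on how they are packaged.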
Cases like this one make it plain that working memory's capacity can't be measured in digits, or words, or kilobytes. Instead, the capacity is measured in chunks. This unscientific-sounding word helps us remember that this is a flexible sort of measurement, because what's in a chunk depends on how the person thinks about, and organizes, the information. Thus, if a person thinks of each digit as a chunk, working memory can hold (roughly) seven digits. If pairs of digits are chunked together, working memory's capacity will be more than a dozen digits.

chunking: A process of reorganizing (or recoding) materials in working memory by combining a number of items into a single, larger unit.

To see how important chunking can be, consider a remarkable individual studied by Chase and Ericsson (Chase & Ericsson, 1978, 1979, 1982; Ericsson, 2003). This fellow happens to be a fan of track events, and when he hears numbers, he thinks of them as finishing times for races. The sequence "3, 4, 9, 2," for example, becomes "3 minutes and 49 point 2 seconds, near world-record mile time." In this way, four digits become one chunk of information. The man can then retain seven finishing times (seven chunks) in memory, and this can involve 20 or 30 digits. Better still, these chunks can be grouped into larger chunks, and these into even larger ones. For example, finishing times for individual racers can be chunked together into heats within a track meet, so that, now, 4 or 5 finishing times (more than 12 digits) become one chunk. With strategies like this and with a lot of practice, this man has increased his apparent memory capacity from the "normal" 7 digits to 79 digits!

Let's be clear, though, that what has changed through practice is merely the man's chunking strategy, not the holding capacity of working memory itself. This is evident in the fact that, when tested with sequences of letters rather than numbers—so he can't use his chunking strategy—his memory capacity drops to a perfectly normal six consonants. Thus, the seven-chunk limit is still in place for this fellow, even though (with numbers) he's able to make extraordinary use of these seven slots.

Establishing Long-Term Memories

So far, we've argued for a separation between working memory and long-term memory, and we're starting to see indications of each memory's attributes. Working memory has a small capacity—although it's flexible in what it can hold, thanks to the process of chunking. Long-term memory, in contrast, is vast. After all, the average college student remembers the meanings of 80,000 words, thousands of autobiographical episodes, millions of facts, hundreds of skills, the taste of vanilla and the smell of lemon. All these things and more are stored in long-term memory.

Working memory and long-term memory also differ in how they're "loaded" and "unloaded." To get information into working memory, all you need to do is pay attention to the material; that's built into the definition of working memory. Getting information into long-term storage, in contrast, seems to take some time and effort; that was essential for our discussion of the primacy effect. We need to fill in some of the details, though, about how this "loading" of long-term memory works. With that, we'll get a clearer picture of why working memory is defined the way it is—as an active process rather than as a mere storage box.
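Before turning to how long-term memories are established, here is a toy simulation, again an illustration rather than anything from the chapter, that puts the preceding pieces together: a small rehearsal buffer whose newer items bump older ones out, plus a chance of transfer into long-term memory that grows with the attention an item receives. All parameter values and function names are invented for the sketch; the point is only that a model of this general shape reproduces primacy, recency, and the loss of recency after a filled delay.

```python
# Toy "buffer" model of free recall (illustrative assumptions throughout).
import random

LIST_LENGTH = 20
BUFFER_SIZE = 6
TRIALS = 5000

def one_trial(immediate_test=True):
    buffer, long_term = [], set()
    for item in range(LIST_LENGTH):
        # Attention is divided among everything currently being rehearsed.
        for rehearsed in buffer:
            if random.random() < 0.35 / len(buffer):
                long_term.add(rehearsed)
        buffer.append(item)
        if len(buffer) > BUFFER_SIZE:
            buffer.pop(0)              # the oldest item gets "bumped out"
    if not immediate_test:
        buffer = []                    # a distractor task empties working memory
    return long_term | set(buffer)     # recall = long-term memory plus the buffer

def recall_curve(immediate_test=True):
    counts = [0] * LIST_LENGTH
    for _ in range(TRIALS):
        for item in one_trial(immediate_test):
            counts[item] += 1
    return [round(c / TRIALS, 2) for c in counts]

print("immediate test:", recall_curve(True))   # high at both ends: primacy and recency
print("after a delay: ", recall_curve(False))  # recency gone, primacy preserved
```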
THE IMPORTANCE OF ACTIVE ENGAGEMENT

In explaining primacy, we made a key assumption—namely, that paying attention to words on a list helps you establish those words in long-term memory. Presumably the same would be true for other contents, so that, no matter what you're memorizing, attention plays a key role in establishing memories. But is this assumption correct?

Consider people's memory for ordinary coins. Adults in the United States have probably seen pennies, for example, tens of thousands of times; adults in other countries have seen their own coins just as often. But, of course, most people have little reason to pay attention to the penny. Pennies are a different color and size from the other coins, so we can identify them at a fast glance and with no need for further scrutiny. And, if attention is what matters for memory—or, more broadly, if we remember what we pay attention to and think about—then memory for the coin should be quite poor. In one study, participants were asked whether Lincoln's profile, shown on the heads side of the penny, is facing to the right or the left. Only half of the participants got this question right—exactly what we'd expect if they were just guessing. Other participants were shown drawings of the penny, and had to choose the "right one" (Figure 8.5). Their performance was quite poor. These results—participants' remarkably poor memory for this coin despite countless opportunities to view it—provide striking confirmation that memory does require attention—it requires mental engagement with a target, not mere exposure (Nickerson & Adams, 1979; Rinck, 1999; for some complications, see Martin & Jones, 2006; in Figure 8.5, the top left drawing shows the correct layout).

Figure 8.5: An ordinary penny. Despite having seen the U.S. penny thousands and thousands of times, people seem to have little recollection of its layout. Test yourself: Which of these drawings is most accurate?

But we need to be more precise about what paying attention means, and what it accomplishes. To make the issue clear, imagine you want to order a pizza. You look up the pizza restaurant's phone number on the Web or in a phone book, and then you walk across the room to pick up your phone and make the call. In this setting, you need to retain the number long enough to complete the dialing—and so, presumably, you're paying attention to the number for that span of time. But you have no need to memorize the number for later use, and so you're likely to think about the number in a limited way. Specifically, you're likely to employ what's called maintenance rehearsal—a mechanical process of repeating the memory items over and over, giving little thought to what the items are or whether they form any pattern. This maintenance is easy and effective: It keeps the digits in your thoughts, and so you remember them long enough to place your call. But what happens if the line is busy when you call, and so you need to try again a moment later? In this setting, it's quite likely that you'll have forgotten the number and will need to look it up again! Apparently, maintenance rehearsal kept the number in working memory long enough for you to dial it the first time but utterly failed to establish it in long-term memory. As a result, you forget the number after just a few seconds.

maintenance rehearsal: Mechanical repetition of material without thinking about its meaning or patterns.
THE LINK BETWEEN LONG-TERM MEMORY AND UNDERSTANDING

Apparently, establishing information in long-term storage is not an automatic process that is triggered merely by having the stimulus in front of your eyes or ears, or by having an idea mechanically maintained in working memory for a few seconds. Instead, some sort of work is involved so that, to put the matter simply, whether you'll remember something or not depends on how—and how fully—you thought about that information when you first met it.

As we've seen, we can confirm these claims by documenting how poor memory is for material that you've encountered but not paid much attention to. Further confirmation comes from studies that examine people's brain activity during learning. In brief, these studies show that during the learning process, some sort of effort is crucial for establishing long-term memories. Specifically, the studies show that greater levels of activity during the initial memory acquisition are reliably associated with greater probabilities of recall later on. This is especially true for brain activity in the hippocampus and regions of the prefrontal cortex (Brewer, Zhao, Desmond, Glover, & Gabrieli, 1998; A. Wagner, Koutstaal, & Schacter, 1999; A. Wagner et al., 1998), but it may also include brain activity in the parietal cortex (A. Wagner, Shannon, Kahn, & Buckner, 2005).

But what exactly is this brain activity accomplishing? Crucial information comes from studies that compare the memory effects of different types of engagement at the time of learning. In one study, participants were shown 48 words. As each word was presented, the participants were asked a question about it. For some words, they were asked about the word's physical appearance ("Is it printed in capital letters?"); this kind of question should produce shallow processing—an approach emphasizing the superficial characteristics of the stimulus. For other words, the participants were asked about the word's sound ("Does it rhyme with train?"); this should encourage an intermediate level of processing. For the remainder, they were asked about the word's meaning ("Would it fit into the sentence: The girl placed the ___ on the table?"); this presumably would lead to deep processing—an approach to the material that emphasizes what the stimulus means.

shallow processing: An approach to memorization that involves focusing on the superficial characteristics of the stimulus, such as the sound of a word or the typeface in which it's printed.
deep processing: An approach to memorization that involves focusing on the meaning of the stimulus.

After the participants had gone through the entire list of words, they were given an unexpected task: They were asked to write down as many of the words as they could remember. The results were clear-cut: Participants recalled very few of the words that called for shallow processing (capitalization). Words that required an intermediate level (sound) were recalled a bit better; and words that demanded the deepest level (meaning) were recalled best of all (Craik & Tulving, 1975). Attention to a word's sound, therefore, is better for establishing memories than thoughtless and mechanical rehearsal; but attention to a word's meaning is better still and, across many studies, attention to meaning is reliably associated with high levels of subsequent recall. And it's not just the search for meaning that helps long-term memory.
Instead, memory is promoted by finding the meaning—that is, by gaining an understanding of the to-be-remembered materials. In some studies, for example, experimenters have given participants material to read that was difficult to understand; then, immediately afterward, they probed the participants to see whether (or how well) they understood the material. Some time later, the experimenters tested the participants' memory for this material. The result was straightforward: the better the understanding at the time the material was presented, the better the memory later on (e.g., Bransford, 1979).

Other studies have manipulated the to-be-remembered material itself. For example, in one experiment, investigators presented this (tape-recorded) passage:

The procedure is actually quite simple. First you arrange things into different groups depending on their makeup. Of course, one pile may be sufficient depending on how much there is to do. If you have to go somewhere else due to lack of facilities that is the next step; otherwise you are pretty well set. It is important not to overdo any particular endeavor. That is, it is better to do too few things at once than too many. In the short run this may not seem important, but complications from doing too many can easily arise. A mistake can be expensive as well. The manipulation of the appropriate mechanisms should be self-explanatory, and we need not dwell on it here. At first, the whole procedure will seem complicated. Soon, however, it will become just another facet of life. It is difficult to foresee any end to the necessity for this task in the immediate future, but then one never can tell. (Bransford & Johnson, 1972, p. 722)

Half of the people heard this passage without any further information as to what it was about, and, when tested later, their memory for the passage was poor. The other participants, though, were given a clue that helped them to understand the passage—they were told, "The paragraph you will hear will be about washing clothes." This clue allowed that group to make sense of the material and dramatically improved their later recall (Bransford & Johnson, 1972; for a related example with a nonverbal stimulus, see Figure 8.6).

Figure 8.6: Nonverbal stimulus. In general, we easily remember things that are meaningful but don't remember things that seem to have no meaning. This picture can be used to demonstrate this point with a nonverbal stimulus. At first the picture looks like a collection of meaningless blotches, and it's very hard to remember. But if viewers discover the pattern, the picture becomes meaningful and is then effortlessly remembered.

There's a powerful message here for anyone hoping to remember some body of material—for example, a student trying to learn material for the next quiz. Study techniques that emphasize efforts toward understanding the material are likely to pay off with good memory later on. Memory strategies that don't emphasize meaning will provide much more limited effects. Mechanical memory strategies—such as repeating the items over and over without much thought—may produce no benefits at all!

mnemonics: Deliberate techniques people use to memorize new materials.

THE KEY ROLE FOR MEMORY CONNECTIONS

Attention to meaning is an effective way to establish long-term memories. Still, it's not the only way to establish memories, and we'll need to accommodate this point in our theorizing. What other memory acquisition procedures are effective? We can draw our answer from the study of mnemonics—deliberate techniques that people use to help them memorize new materials. Mnemonics come in many varieties, but all build on the same base: To remember well, it pays to establish memory connections.
In some cases, the connections link the new material to ideas already in memory. In other cases, the connections link the various elements of the new material to each other, so that the mnemonic helps organize complex information into a small number of memory chunks.

The role of connections is clear, for example, in the various mnemonics that rely on verse in which a fixed rhythm or rhyme scheme links each element being memorized to the other elements within the poem. Thus, young children find it easier to memorize the calendar's layout if they cast the target information as a rhyme: "Thirty days hath September, April, June, and November," and high-school students have an easier time memorizing the fates of Henry VIII's wives by summarizing the history in a little verse: "divorced, beheaded, died; divorced, beheaded, survived." Connections are also the key in other mnemonics, including ones that organize material by linking the first letters of the words in the sequence that's being memorized. Thus, students rely on ROY G. BIV to memorize the sequence of colors in the rainbow (red, orange, yellow...), and learn the lines in music's treble clef via "Every Good Boy Deserves Fudge" (the lines indicate the musical notes E, G, B, D, and F). Various first-letter mnemonics are also available for memorizing the taxonomic categories ("King Philip Crossed the Ocean to Find Gold and Silver," to memorize kingdom, phylum, class, order, family, genus, and species). And so on for other memory tasks (Figure 8.7).

Figure 8.7: Memory school. Some mnemonics are more successful than others. (Cartoon caption: "You simply associate each number with a word, such as 'table' and 3,476,029.")

Still other mnemonics involve the use of mental imagery. One such technique, developed by the ancient Greeks, is the method of loci, which requires the learner to visualize each of the items she wants to remember in a different spatial location ("locus"). In recall, the learner mentally inspects each location and retrieves the item that she placed there in imagination. Does this work? In one study, college students had to learn lists of 40 unrelated nouns. Each list was presented once for about 10 minutes, during which the students tried to visualize each of the 40 objects in a specific location on their college campus. When tested immediately afterward, the students recalled an average of 38 of the 40 items; when tested one day later, they still managed to recall 34 (Bower, 1970; also see Bower, 1972; Higbee, 1977; Roediger, 1980; J. Ross & Lawrence, 1968). In other studies, participants using the method of loci were able to retain seven times more than their counterparts who learned by rote.

It's also worth mentioning that visualization is, on its own, an effective memorization tool. If you're trying to remember a list of words, for example, it's helpful to form a mental picture of each item on the list (a mental picture of a hammer, for example, and then a mental picture of a puppy, and so on). Visualization is far more effective, though, if it serves to link the to-be-remembered words to each other—and so here, once again, we see the importance of memory connections. To make this idea concrete, consider a student trying to memorize a list of word pairs.
He might decide just to visualize the items side by side—and so (for example), after hearing the pair eagle-train, he might visualize an eagle and then, separately, he might visualize a train. Alternatively, he might try to form mental pictures that bring the items into some kind of relationship—so he might, for example, imagine the eagle winging to its nest with a locomotive in its beak. Evidence indicates that images of the second (interacting) sort produce much better recall than nonunifying images do (Wollen, Weber, & Lowry, 1972; also Figure 8.8).

Figure 8.8: Interacting and noninteracting depictions. Research participants shown related elements, such as a doll sitting on a chair and holding a flag (A, interactive depiction), are more likely to recall the trio of words doll, flag, and chair than are participants shown the three objects next to each other but not interacting (B, noninteractive depiction).

Whether mnemonics are based on imagery or some other system, though, there's no question that they are enormously useful in memorizing, say, a list of foreign vocabulary words or the names of various parts of the brain. But before we move on, we should note that there's also a downside to using mnemonics: During learning, someone trying to memorize via a mnemonic is likely to focus all their attention on just a narrow set of connections—the fact that the locomotive is in the eagle's beak, or that September rhymes with November. This strategy guarantees that these connections will be well established; and that's great if, later on, those connections are just the ones you need. But at the same time, if you focus on just these few connections, you're putting little effort into developing other possible connections—so you're not doing much to promote your understanding of the material you're memorizing. On this basis, mnemonics—as effective as they are for memorization—are an unwise strategy if understanding is your goal.

STORAGE

We've been focusing on the first step involved in memory—namely, memory acquisition. Once a memory is acquired, though, it must be held in storage—i.e., held in long-term memory until it's needed. The mental representation of this new information is referred to as the memory trace—and, surprisingly, we know relatively little about exactly how traces are lodged in the brain. At a microscopic level, it seems certain that traces are created through the three forms of neural plasticity described in Chapter 7: Presynaptic neurons can become more effective in sending signals; postsynaptic neurons can become more sensitive to the signals they receive; and new synapses can be created.

memory trace: The physical record in the nervous system that preserves a memory.
memory consolidation: The biological process through which memories are transformed from a transient and fragile status to a more permanent and robust state; according to most researchers, consolidation occurs over the course of several hours.

On a larger scale, evidence suggests that the trace for a particular past experience is not recorded in a single location within the brain. Instead, different aspects of an event are likely to be stored in distinct brain regions—one region containing the visual elements of the episode, another containing a record of our emotional reaction, a third area containing a record of our conceptual understanding of the event, and so on (e.g., A. Damasio & H. Damasio, 1994).
But, within these broad outlines, we know very little about how the information content of a memory is translated into a pattern of neural connections. Thus, to be blunt, we are many decades away from the science-fiction notion of being able to inspect the wiring of someone's brain in order to discover what he remembers, or being able to "inject" a memory into someone by a suitable rearrangement of her neurons. (For a recent hint about exactly how a specific memory might be encoded in the neurons, see Han et al., 2009.)

One fact about memory storage, however, is well established: Memory traces aren't created instantly. Instead, a period of time is needed, after each new experience, for the record of that experience to become established in memory. During that time, memory consolidation is taking place; this is a process, spread over several hours, in which memories are transformed from a transient and fragile status to a more permanent and robust state (Hasselmo, 1999; McGaugh, 2000, 2003; Meeter & Murre, 2004; Wixted, 2004).

What exactly does consolidation accomplish? Evidence suggests that this time period allows adjustments in neural connections, so that a new pattern of communication among neurons can be created to represent the newly acquired memory. This process seems to require the creation of new proteins, so it is disrupted by chemical manipulations that block protein synthesis (H. Davis & Squire, 1984; Santini, Ge, Ren, deOrtiz, & Quirk, 2004; Schafe, Nader, Blair, & LeDoux, 2001).

The importance of consolidation is evident in the memory loss sometimes produced by head injuries. Specifically, people who have experienced blows to the head can develop retrograde amnesia (retrograde means "in a backward direction"), in which they suffer a loss of memory for events that occurred before the brain injury (Figure 8.9). This form of amnesia can also be caused by brain tumors, diseases, or strokes (Cipolotti, 2001; M. Conway & Fthenaki, 1999; Kapur, 1999; Mayes, 1988; Nadel & Moscovitch, 2001).

retrograde amnesia: A memory deficit, often suffered after a head injury, in which the patient loses memory for events that occurred before the injury.

Figure 8.9: Retrograde and anterograde amnesia. Retrograde amnesia disrupts memory for experiences before the injury, accident, or disease that triggered the amnesia; anterograde amnesia disrupts memory for experiences after the injury or disease. (The timeline marks the moment of brain injury, with the period disrupted by retrograde amnesia before it and the period disrupted by anterograde amnesia after it.)

Retrograde amnesia usually involves recent memories. In fact, the older the memory, the less likely it is to be affected by the amnesia—a pattern referred to as Ribot's law, in honor of the 19th-century scholar who first discussed it (Ribot, 1882). What produces this pattern? Older memories have presumably had enough time to consolidate, so they are less vulnerable to disruption. Newer memories are not yet consolidated, so they're more liable to disruption (A. Brown, 2002; Weingartner & Parker, 1984).

There is, however, a complication here: Retrograde amnesia sometimes disrupts a person's memory for events that took place months or even years before the brain injury. In these cases, interrupted consolidation couldn't explain the deficit unless one assumes—as some authors do—that consolidation is an exceedingly long, drawn-out process. (For discussion of when consolidation takes place, and how long it takes, see Hupbach et al., 2008; McGaugh, 2000.)
However, this issue remains a point of debate, making it clear that we haven't heard the last word on how consolidation proceeds.

RETRIEVAL

When we learn, we transfer new information into our long-term store of knowledge, and then we consolidate this newly acquired information. But we still need one more step in this sequence, because memories provide no benefit for us if we can't retrieve them when we need them. Hence retrieval—the step of locating and activating information in memory—is crucial. Moreover, the success of retrieval is far from guaranteed, and many cases of apparent "forgetting" can be understood as retrieval failures—cases in which the information is in your memory, but you fail to locate it.

retrieval: The process of searching for a memory and finding it.
tip-of-the-tongue (TOT) effect: The condition in which one remains on the verge of retrieving a word or name but continues to be unsuccessful.

Partial Retrieval

Retrieval failure can be documented in many ways—including the fact that sometimes we remember part of the information we're seeking, but we can't recall the rest. This pattern can arise in many circumstances, but it's most clearly evident in the phenomenon psychologists call the tip-of-the-tongue (TOT) effect. Try to think of the word that means "to formally renounce the throne." Try to think of the name of the Russian sled drawn by three horses. Try to think of the word that describes someone who, in general, does not like other people. Chances are that, in at least one of these cases, you found yourself in a frustrated state: certain you know the word but unable to come up with it. The word was, as people say, right on the "tip of your tongue."

People who are in the so-called TOT state can often remember roughly what the word sounds like—and so, when they're struggling to recall abdicate, they might remember abrogate or annotate instead. Likewise, they can often recall what letter the word begins with, and how many syllables it has, even though they can't recall the word itself (A. Brown, 1991; R. Brown & McNeill, 1966; Harley & Bown, 1998; L. James & Burke, 2000; B. Schwartz, 1999). Similar results have been obtained when people try to recall specific names—for example, what is the capital of Nicaragua? Who was the main character in the movie The Matrix? In response to these questions, people can often recall the number of syllables in the target name and the name's initial letter, but not the name itself (Brennen, Baguley, Bright, & Bruce, 1990; Yarmey, 1973). They also can often recall related material, even if they can't remember the target information. (Thus, they might remember Morpheus, but not the main character, from The Matrix; the main character, of course, was Neo. And the Russian sled is a troika; it's a misanthrope who doesn't like other people; Nicaragua's capital is Managua.)

People in the TOT state cannot recall the target word, but the word is certainly in their memory. If it weren't, they wouldn't be able to remember the word's sound, or its starting letter and syllable count. What's more, people often recognize the word when it's offered to them ("Yes! That's it!"). This is, therefore, unmistakably a case of retrieval failure—the information is preserved in storage, but for various reasons it is inaccessible.
Effective Retrieval Cues

Retrieval failure is also clearly the problem whenever you seem to have forgotten something, but then recall it once you're given an adequate retrieval cue. A clear illustration of this pattern often arises when someone returns to his hometown after a long absence. This return can unleash a flood of recollection, including the recall of many details the person thought he'd forgotten long ago. Since these memories do surface, triggered by the sights and sounds of the hometown, there's no doubt about whether the memories were established in the first place (obviously, they were) or lost from storage (obviously, they weren't). Only one explanation is possible, therefore, for why the memories had been unavailable for so many years prior to the person's return to his hometown. They were in memory, but not findable—exactly the pattern we call retrieval failure.

retrieval cue: A hint or signal that helps one to recall a memory.
retrieval paths: The mental connections linking one idea to the next that people use to locate a bit of information in memory.

Why do some retrieval cues (but not others) allow us to locate seemingly long-lost memories? One important factor is whether the cue re-creates the context in which the original learning occurred. This is obviously the case in returning to your hometown—you're back in the context in which you had the experiences you're now remembering. But the same broad point can be documented in the lab; and so, for example, if an individual focused on the sounds of words while learning them, then she would be well served by reminders that focus on sound ("Was there a word on the list that rhymes with log?"); if she focused on meaning while learning, then the best reminder would be one that again draws her attention toward meaning ("Was one of the words a type of fruit?"; R. Fisher & Craik, 1977).

The explanation for this pattern lies in our earlier discussion of memory connections. Learning, we suggested, is essentially a process of creating (or strengthening) connections that link the to-be-remembered material to other things you already know. But what function do these connections serve? When the time comes to recall something, the connections serve as retrieval paths—routes that lead you back to the desired information. Thus, if you noticed in a movie that Jane's smile caused Tarzan to howl, this will create a link between your memory of the smile and your memory of the howl. Later on, thinking about the smile will bring Tarzan's howl into your thoughts—and so your retrieval is being guided by the connection you established earlier.

On this basis, let's think through what would happen if a person studied a list of words and focused, say, on the sound of the words. This focus would establish certain connections—perhaps one between dog and log, and one between paper and caper. These connections will be useful if, later, this person is asked questions about rhymes. If she's asked, "Was there a word on the list that rhymes with log?" the connection now in place will guide her thoughts to the target word dog. But the same connection will play little role in other situations. If she's asked, "Did any of the words on the list name animals with sharp teeth?" the path that was established during learning—from log to dog—is much less helpful; what she needs with this cue is a retrieval path leading from sharp teeth to the target.
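As a schematic way of picturing this argument (purely illustrative, not a model from the chapter), you can think of the connections formed during learning as edges in a small graph: a cue retrieves the target only if an edge happens to run from that cue to the studied word. The specific words and the dictionary layout below are invented for the example.

```python
# Schematic sketch: memory connections as edges in a tiny graph.
# A retrieval cue succeeds only if some stored connection links it to a studied word.

# Suppose the learner focused on SOUND while studying "dog" and "paper":
connections = {
    "log": ["dog"],      # formed by noticing the rhyme log/dog
    "caper": ["paper"],  # formed by noticing the rhyme caper/paper
}

def retrieve(cue):
    """Follow stored connections outward from the cue."""
    return connections.get(cue, [])

print(retrieve("log"))          # ['dog']  -- the rhyme cue rides the existing path
print(retrieve("sharp teeth"))  # []       -- no path was ever created from this cue
```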
The impact of these same retrieval cues would be different, though, if the person had thought about meaning during learning. This focus would have created a different set of connections—perhaps one between dog and wolf. In this case, the "rhymes with log?" cue would likely be ineffective, because the person has established no connection with log. A cue that focused on meaning, however, might trigger the target word.

Overall, then, an effective retrieval cue is generally one that takes advantage of an already established connection in memory. We've worked through this issue by pointing to the difference between meaning-based connections and sound-based connections, but the same point can be made in other ways. In one experiment, the researchers asked deep-sea divers to learn various materials. Some of the divers learned the material while sitting on land by the edge of the water. Others learned the material while 20 feet underwater, hearing the material via a special communication set. Within each of these two groups, half of the divers were then tested while above water, and half were tested below (Godden & Baddeley, 1975).

Imagine that you're a diver in the group that learned while underwater. In this setting, the world has a different look and feel than it does above water: The sound of your breathing is quite prominent; so is the temperature. As a result, you might end up thinking about your breathing (say) during learning, and this will likely create memory connections between these breathing thoughts and the materials you're learning. If you are then back underwater at the time of the memory test, the sound of your breathing will again be prominent, and this may lead you back into the same thoughts. Once thinking these thoughts, you will benefit from the memory connection linking the thoughts to the target materials—and so you'll remember the materials. In contrast, if you're on land during the memory test, then the sound of breathing is absent, and so these thoughts won't be triggered and the connections you established earlier will have no influence. We might therefore expect the divers who learned underwater to remember best if tested underwater; this setting increases their chances of benefiting from the memory connections they established during learning. Likewise, the divers who learned on land should do best if tested on land. And that's exactly what the data show (Figure 8.10).

Related examples are easy to find. Participants in one study were asked to read an article similar to those they routinely read in their college classes; half read the article in a quiet setting, and half read it in a noisy environment. When tested later, those who read the article in quiet did best if they were tested in quiet; those who read it in a noisy environment did best if tested in a noisy setting (Grant et al., 1998). In both cases, participants showed the benefit of being able to use, at time of retrieval, the specific connections established during learning.

In case after case, then, it's helpful, at the time of memory retrieval, to return to the context of learning. Doing this will encourage some of the same thoughts that were in place during learning, and so will allow you to take advantage of the connections linking those thoughts to the target material. This broad pattern is referred to as a benefit of context reinstatement—a benefit of re-creating the state of mind you were in during learning.

context reinstatement: A way of improving retrieval by re-creating the state of mind that accompanied the initial learning.
Figure 8.10: Scientific Method. Is memory enhanced when the recall situation is similar to the encoding situation? Method: (1) One group of divers learned a word list on land; another group of divers learned a word list underwater. (2) Each group was tested in both environments for recall of the list items. Results: Divers who learned underwater recalled more words underwater, and those who studied on land tested better on land. Conclusion: Information is best recalled in the environment where it is learned. Source study: Godden & Baddeley, 1975.

Let's also note that, in these experiments, the physical setting (noisy or not; underwater or above) seems to have a powerful influence on memory. However, evidence suggests that the physical setting matters only indirectly: A return to the physical circumstances of learning does improve recollection, but only because this return helps re-create the mental context of learning—and it's the mental context that matters. This was evident, for example, in a study in which participants were presented with a long list of words. One day later, the experimenter brought the participants back for an unexpected recall test that took place in either the same room or a different one (one that differed in size, furnishings, and so on, from the context of learning). Not surprisingly, recall was better for those who were tested in the same physical environment—documenting, once again, the benefit of context reinstatement. Crucially, though, the investigator found a straightforward way of eliminating the difficulty caused by an environmental change: A different group of participants were brought to the new room; but just before the test, they were asked to think about the room in which they had learned the lists—what it looked like, how it made them feel. By doing so, they mentally re-created the old environment for themselves; on the subsequent recall test, these participants performed just as well as those who were tested in their original room (S. Smith, 1979; S. Smith & Vela, 2001; Figure 8.11). Apparently, then, what matters for retrieval is your mental perspective, not the room you're sitting in. If you change the physical context without changing your mental perspective, the physical relocation has no effect.

Figure 8.11: Context reinstatement for students. These students are probably forming connections between the material they're learning and library-related cues. To help themselves recall this material later on, they'll want to think about what the library looked like and how they felt in that environment.

Encoding Specificity

The effectiveness of context reinstatement also tells us something important about how materials are recorded in memory in the first place. When people encounter some stimulus or event, they think about this experience in one way or another; and as we've described, this intellectual engagement serves to connect the new experience to other thoughts and knowledge.
Thus, continuing an earlier example, if people see the word dog and think about what it rhymes with, what ends up being stored in memory is not just the word. What's stored must be the word plus some record of the connections made to rhyming words—otherwise, how could these connections influence retrieval? Likewise, if people see a picture and think about what it means, what's stored in memory is not just the picture, but a memory of the picture together with some record of the connections to other, related ideas.

In short, what's placed in memory is not some neutral transcription of an event. Instead, what's in memory is a record of the event as understood from a particular perspective or perceived within a particular context. Psychologists refer to this broad pattern as encoding specificity—based on the idea that what's recorded in memory is not just a "copy" of the original, but is instead encoded from the original (in other words, it's translated into some other form) and is also quite specific (and so represents the material plus your thoughts and understanding of the material; Tulving & Osler, 1968; Tulving & Thomson, 1973; also Hintzman, 1990).

encoding specificity: The hypothesis that when information is stored in memory, it is not recorded in its original form but translated ("encoded") into a form that includes the thoughts and understanding of the learner.

This specificity, in turn, has powerful effects on retrieval—that is, on how (or whether) the past is remembered. For example, participants in one study read target words (e.g., piano) in either of two contexts: "The man lifted the piano" or "The man tuned the piano." These sentences led the participants to think about the target word in a particular way, and it was then this line of thinking that was encoded into each person's memory. Thus, continuing the example, what was recorded in memory was the idea of "piano as something heavy" or "piano as a musical instrument." This difference in memory content became clear when participants were later asked to recall the target words. If they had earlier seen the "lifted" sentence, then they were quite likely to recall the target word if given the hint "something heavy." The hint "something with a nice sound" was much less effective. But if participants had seen the "tuned" sentence, the result reversed: Now the "nice sound" hint was effective, but the "heavy" hint was not (Barclay, Bransford, Franks, McCarrell, & Nitsch, 1974). In both cases, the memory hint was effective only if it was congruent with what was stored in memory—just as the encoding specificity proposal predicts.

This notion of encoding specificity is crucial in many contexts. For example, imagine two friends who have an argument. Each person is likely to interpret the argument in a way that's guided by his own position—and so he'll probably perceive his own remarks to be clear and persuasive, and his friend's comments to be muddy and evasive. Later on, how will each friend remember the event? Thanks to encoding specificity, what each person places in memory is the argument as he understood it. As a result, we really can't hope for a fully objective, impartial memory, one that might allow either of the friends to think back on the argument and perhaps reevaluate his position. Instead, each will, inevitably, recall the argument in a way that's heavily colored by his initial leaning.
MEMORY GAPS, MEMORY ERRORS

The processes we've been discussing—acquisition, storage, and retrieval—function extremely well in a huge range of circumstances. As a result, each of us can learn an enormous quantity of information, store that information for a long time, and then swiftly retrieve the information when we need it. But of course there are times when remembering is less successful. Sometimes we try to remember an episode but simply draw a blank. Sometimes we recall something, but with no conviction that we're correct: "I think it happened on Tuesday, but I'm not sure." And sometimes our memories fail us in another way: We recall a past episode, but it turns out that our memory is mistaken. Perhaps details of the event were different from the way we recall them; perhaps our memory is altogether wrong, misrepresenting large elements of the original episode. Why, and how often, do these memory failures occur?

Forgetting

There are many reasons why we sometimes cannot recall past events. In many cases, as we've noted, the problem arises because we didn't learn the relevant information in the first place! In other cases, though, we learn something—a friend's name, the lyrics to a song, the content of the Intro Bio course—and can remember the information for a while; but then, sometime later, we're unable to recall the information we once knew. What produces this pattern?

One clue comes from the fact that it's almost always easier to recall recent events (e.g., yesterday's lecture or this morning's breakfast) than it is to recall more distant events (a lecture or a breakfast 6 months ago). In technical terms, recall decreases, and forgetting increases, as the retention interval (the time that elapses between learning and retrieval) grows longer and longer.

retention interval: The time that elapses between learning and retrieval.

forgetting curve: The graphic pattern representing the relationship between measures of learning and the length of the retention interval: As the retention interval gets longer, memory decreases.

This simple fact has been documented in many studies; indeed, the passage of time seems to work against our memory for things as diverse as past hospital stays, our eating or smoking habits in past years, car accidents we experienced, our consumer purchases, and so on (Jobe, Tourangeau, & Smith, 1993). The classic demonstration of this pattern, though, was offered more than a century ago by Hermann Ebbinghaus (1850–1909). Ebbinghaus systematically studied his own memory in a series of careful experiments, examining his ability to retain lists of nonsense syllables, such as zup and rif. (Ebbinghaus relied on these odd stimuli as a way of making sure he came to the memory materials with no prior associations or links; that way, he could study how learning proceeded when there was no chance of influence from prior knowledge.) Ebbinghaus plotted a forgetting curve by testing himself at various intervals after learning (using different lists for each interval). As expected, he found that memory did decline with the passage of time. However, the decline was uneven; it was sharpest soon after the learning and then became more gradual (Ebbinghaus, 1885; Figure 8.12).

8.12 Forgetting curve: The figure shows retention after various intervals since learning, with percent saving plotted against the retention interval (in hours). Retention is here measured in percent saving—that is, the percentage decrease in the number of trials required to relearn the list after an interval of no practice. If the saving is 100%, then retention is perfect; no trials to relearn are necessary. If the saving is 0%, there's no retention at all; it takes just as many trials to relearn the list as it took to learn it initially.
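The saving measure in Figure 8.12 is defined verbally in the caption above; for readers who want the arithmetic spelled out, the same definition can be restated as a formula. The symbols T_learn and T_relearn below are introduced here only for convenience (they are not Ebbinghaus's notation), and the numbers in the worked example are hypothetical, chosen purely for illustration.

% Percent saving: a restatement of the verbal definition given in the Figure 8.12 caption.
\[
\text{saving} \;=\; \frac{T_{\text{learn}} - T_{\text{relearn}}}{T_{\text{learn}}} \times 100\%
\]
% Hypothetical example: a list that took 10 trials to learn and 4 trials to relearn after a delay.
\[
\frac{10 - 4}{10} \times 100\% \;=\; 60\% \ \text{saving}
\]

On this measure, a shorter relearning effort after the delay translates directly into a higher saving score, which is why the curve in Figure 8.12 falls as the retention interval grows.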
There are two broad ways to think about the effect of retention interval. One perspective emphasizes the passage of time itself—based on the idea that memories decay as time passes, perhaps because normal metabolic processes wear down the memory traces until they fade and finally disintegrate. A different perspective suggests that time itself isn't the culprit. What matters instead is new learning—based on the idea that new information getting added to long-term memory somehow disrupts the old information that was already in storage. We'll need to sort through why this disruption might happen; but notice that this perspective, too, predicts that longer retention intervals will lead to more forgetting—because longer intervals provide more opportunity for new learning and thus more disruption from the new learning.

Which perspective is correct? Is forgetting ultimately a product of the passage of time, or a product of new learning? The answer is both. The passage of time, by itself, does seem to erode memories (e.g., E. Altmann & Gray, 2002; C. Bailey & Chen, 1989; Wixted, 2004); but the effect of new learning seems larger. For example, Baddeley and Hitch (1977) asked rugby players to recall the names of the other teams they had played against over the course of a season; the researchers then systematically compared the effect of time with the effects of new learning. To examine the effects of time, Baddeley and Hitch capitalized on the fact that not all players made it to all games (because of illness, injuries, or schedule conflicts). These differences allowed them to compare players for whom "two games back" means 2 weeks ago, to players for whom "two games back" means 4 weeks ago. Thus, they were able to look at the effects of time (2 weeks vs. 4) with the number of more recent games held constant. Likewise, to examine the effects of new learning, these researchers compared (say) players for whom the game a month ago was "three games back" to players for whom a month ago means "one game back." Now we have the retention interval held constant, and we can look at the effects of intervening events. In this setting, Baddeley and Hitch report that the mere passage of time accounts for very little; what really matters is the number of intervening events—just as we'd expect if intervening learning, and not decay, is the major contributor to forgetting (Figure 8.13). (For other—classic—data on this issue, see Jenkins & Dallenbach, 1924; for a more recent review, see Wixted, 2004.)

8.13 Forgetting from interfering events: Members of a rugby team were asked to recall the names of teams they had played against. Their performance was heavily influenced by the number of games that intervened between the game to be recalled and the attempt to remember. (In the original figure, the percentage of team names recalled correctly is plotted against the number of intervening games played.)
An effect of new learning undoing old learning can also be demonstrated in the laboratory. In a typical study, a control group learns the items on a list (A) and then is tested after a specified interval. The experimental group learns the same list (A), but they must also learn the items on a second list (B) during the same retention interval. The result is a marked inferiority in the performance of the experimental group. List B seems to interfere with the recall of list A (Crowder, 1976; McGeoch & Irion, 1952).

Of course, not all new learning produces this disruption. No interference is observed, for example, between dissimilar sorts of material—and so learning to skate doesn't undo your memory for irregular French verbs. In addition, if the new learning is consistent with the old, then it certainly doesn't cause disruption; instead, the new learning actually helps memory. Thus, learning more algebra helps you remember the algebra you mastered last year; learning more psychology helps you remember the psychology you've already covered.

Memory Intrusions

We still need to ask why new learning seems sometimes to disrupt old. Why can't the newly acquired information peacefully coexist with older memories? In fact, there are several reasons. In some cases, the new information simply sits side by side with old memories, creating a danger that you'll get mixed up about which is which—recalling the newer information when you're trying to come up with the older. In other cases, the new information may literally replace the old memory, much as you delete an old version of a paper from your computer's hard drive once you've created a newer, updated version.

In most experiments, it's difficult to distinguish these two possibilities—that is, to tell whether the new information is merely competing with the old information, or whether it has literally replaced the old information. In either case, though, the new material will lead to intrusion errors—mistakes about the past in which other information is mixed into (intrudes into) your recall. These intrusions are often small (so that you recall having a cheese sandwich when you really had a salad) but can sometimes be quite large: People may confidently, vividly recall a past event that never took place at all.

intrusion errors: Memory mistakes in which elements that were not part of the original information get mixed into ("intrude" into) someone's recall.

THE MISINFORMATION EFFECT

Intrusion errors can ar