Chapter 1: The Evolution of Psychological Science
Summary
This chapter provides an overview of the evolution of psychological science, focusing on its philosophical underpinnings. It discusses key concepts like dualism and materialism, realism and idealism, and empiricism and nativism. The chapter highlights significant contributors to the field, including thinkers like William James and René Descartes.
Chapter 1: The Evolution of Psychological Science

- Psychology’s Philosophical Roots
- The Late 1800s: Toward a Science of the Mind
- The Early 1900s: Psychoanalysis and Behaviorism
- The Early 1900s: Resistance to Behaviorism
- The Late 1900s: The Cognitive Revolution
- The Early 2000s: New Frontiers
- Becoming a Psychologist

IN 1860, Abraham Lincoln became the president of the United States, the Pony Express began delivering mail between Missouri and California, and an 18-year-old named William James (1842–1910) started worrying about what to do with the rest of his life. He had hoped to become an artist, but after studying for several months with a famous painter in Rhode Island, he was forced to admit that he wasn’t all that talented. At his father’s urging, he went to college to study chemistry, then switched to physiology and then to medicine, only to find that those subjects didn’t interest him much. So he headed to Germany, where he began learning about a new science called psychology (from a combination of the Greek psyche, which means “soul,” and logos, which means “to study”). After two years in Europe, William returned to America, finished his degree, and took a job as a teacher at Harvard University. There, in the classroom, amidst the blackboards and the chalk, surrounded by bright students who were eager to learn about the new science of psychology, William finally found what he had been searching for all along. “So far,” he wrote to his brother after his first year as a teacher, “I seem to have succeeded in interesting them … and I hear expressions of satisfaction on their part.”1 And then, with characteristic understatement, he added, “I should think it not unpleasant as a permanent thing.” And a permanent thing it became. William remained a teacher at Harvard for the next 35 years, where he taught one of the first psychology courses and created one of the first psychology laboratories in America.
He also wrote the first American psychology textbook, The Principles of Psychology. As the historian E. G. Boring (1929, p. 624) later wrote, “No other psychological treatise in the English language has in the modern period had such a wide and persistent influence.” Today, William James is considered the father of American psychology, and his brilliant book is still widely read.

William James is generally considered to be the father of American psychology. Throughout his illustrious career, James remained a devoted and beloved teacher who was “so vivacious and humorous that one day a student interrupted and asked him to be serious for a moment” (Hunt, 2007, p. 169). When he gave his final lecture on January 22, 1907, his classroom was packed with students, former students, colleagues, and administrators.

“This is no science,” James said of psychology in 1892, “it is only the hope of a science.” And at that time, he was right. But now, more than a century later, that hope has been realized, and the book you hold in your hand is the evidence. How did it happen? How did we get here from there? How did the psychology taught in William James’s classroom become the psychology taught in yours? This chapter tells that story.

ALONG THE WAY, WE’LL GET TO KNOW SOME OF THE MANY PEOPLE, past and present, who have helped to shape the field of psychology. Today, those people are remarkably diverse: 70% of the PhDs in psychology from American universities are earned by women, and 30% are earned by people of color. But it wasn’t always so. One of the things you will notice in this chapter is that the history of psychology is mainly the story of white men from Europe, and later, the United States. That’s because until the last half of the 20th century arrived, everyone else’s opportunities to contribute to psychology’s development were severely limited by gender norms, social conventions, sexism, and racism.
History is the story of the past, and the past can’t be changed. But as you’ll see at the end of this chapter, things have changed dramatically in the last few decades — and decidedly for the better.

Psychology’s Philosophical Roots

Learning Outcomes
- Explain the distinction between dualism and materialism.
- Explain the distinction between realism and idealism.
- Explain the distinction between empiricism and nativism.

Psychology is the scientific study of mind and behavior. The word mind refers to a set of private events that happen inside a person — the thoughts and feelings that we experience at every moment but that no one else can see — and the word behavior refers to a set of public events — the things we say and do that can potentially be observed by others. Both human minds and human behaviors have been around for quite a while, and psychologists were not the first to try to make sense of them. That distinction belongs to philosophers, who have been thinking deeply about these topics for several thousand years.

Dualism and Materialism

Our bodies are physical objects that can be seen, smelled, and touched. Our minds are not. You can’t hear an emotion or taste a belief. The philosopher René Descartes (1596–1650) thought that the body is made of a material substance, the mind is made of an immaterial substance, and every person is therefore a material container of an immaterial thing — or what later philosophers called the “ghost in the machine” (Ryle, 1949). Descartes’s position is known as philosophical dualism, which is the view that mind and body are fundamentally different things.

René Descartes was a dualist who believed that the mind was a nonphysical entity that met the physical body in a structure called the pineal gland (shown in the diagram above from his 1662 book Treatise of Man).
But Thomas Hobbes was a materialist who thought that the idea of a “substance” that was distinct from the body was a contradiction in terms because “substance and body signify the same thing.”

But if the mind and the body are fundamentally different things, then how do they interact? How does the immaterial mind tell the material body to put its best foot forward? And when the material body steps on a rusty nail, why does the immaterial mind feel pain? Philosophers such as Thomas Hobbes (1588–1679) argued that the mind and body aren’t fundamentally different things at all. Rather, the mind is what the brain does. From Hobbes’s perspective, looking for a place in the brain where the mind meets the body is like looking for the place on your phone where the picture meets the screen. The picture is what the screen does, and they don’t “meet” in some third place. The brain is a physical object whose activity is known as “the mind,” and therefore all mental phenomena — every thought and feeling, every sight and sound — are the result of some physical activity in the physical brain. Philosophical materialism is the view that all mental phenomena are reducible to physical phenomena.

So which philosopher was right? This is just one of those debates in which people choose their own sides, and most of the world’s religions — from Christianity and Judaism to Hinduism and Islam — have chosen to side with the dualists and embrace the notion of a nonphysical soul. But most psychologists have gone the other way and have embraced materialism (Ecklund et al., 2007). They believe that all mental phenomena — from attention and memory to belief and emotion — are ultimately explainable in terms of the physical processes that produce them. The mind is what the brain does — nothing less and certainly nothing more. We are remarkably complex machines whose operations somehow give rise to consciousness, and one of psychology’s jobs is to figure out what that “somehow” is.
Realism and Idealism

You probably have the sense that “you” are somewhere inside your skull and that right now you are “looking out” through your eye sockets and reading the words on this page. It feels as though your eye is some sort of camera, and that “you” are “in here” seeing pictures of the things “out there.” The philosopher John Locke (1632–1704) referred to this idea as philosophical realism, which is the view that our perceptions of the physical world are a faithful copy of information from the world that enters our brains through our sensory apparatus. According to this account, light is right now bouncing off the text in front of you and hitting your eye, and that light contains all the information necessary to produce your perception of the text you are seeing. In essence, your eye is a camera that sends a picture of the world to your brain.

John Locke was a British philosopher and physician who championed philosophical realism. Locke was also a political theorist whose writings about the separation of church and state, religious freedom, and liberty strongly influenced America’s founding fathers, such as Thomas Jefferson, who incorporated Locke’s phrase “the pursuit of happiness” into the Declaration of Independence.

The philosopher Immanuel Kant (1724–1804) disagreed. He suggested that our perceptions of the world are less like photographs and more like paintings. Philosophical idealism is the view that our perceptions of the physical world are our brain’s best interpretation of the information that enters through our sensory apparatus. According to the idealist account, light is bouncing off the text and hitting your eye, and your brain is using that information — plus all the other information it has about the world — to produce your perception of the text.
It is not just reflecting the information that entered your eye, but interpreting it, and although your perception of the object feels like a perfect copy of the object itself, it is actually a painting of what your brain believes the object looks like.

So which philosopher was right? Modern psychology has come down strongly on the side of idealism. As you will see in many of the upcoming chapters, your perception of the world is an inference — your brain’s best guess about what’s probably out there. Because your brain is such a good guesser and such a fast guesser, you typically don’t realize it is guessing at all. You feel like your eye is a camera that shoots super-high-definition video, but that’s only because the artist between your ears can paint so realistically and at lightning speed.

Empiricism and Nativism

Here are some things you know about coffee mugs: You know that four coffee mugs are more than two, that a coffee mug can’t pass through a solid brick wall, and that if you push a coffee mug off a table it will fall down rather than up. How do you know all this stuff? Philosophical empiricism is the view that all knowledge is acquired through experience. Philosophers such as Locke believed that a newborn baby is a tabula rasa, or “blank slate,” upon which experience writes its story. In other words, you know about coffee mugs — and about coffee pots and coffee tables and coffee ice cream and a huge number of other objects that don’t involve coffee — because you’ve seen them, or interacted with them, or seen someone else interact with them.

Kant thought Locke was wrong about this, too. Philosophical nativism is the view that some knowledge is innate rather than acquired. Kant argued that human beings must be born with some basic knowledge of the world that allows them to acquire additional knowledge of the world. After all, how could you learn that pushing a coffee mug off a table causes it to fall if you didn’t already know what causation was?
The fact that you can acquire knowledge about what coffee mugs do when pushed suggests that your mind came with at least a few bits of knowledge already programmed into it. For Kant, those few preprogrammed bits of knowledge were concepts such as space, time, causality, and number. You can’t learn these concepts, Kant argued, and yet you have to have them in order to learn anything else, which means that they must come factory installed.

Which philosopher was right? Most modern psychologists embrace some version of nativism. It is obvious that much of what we know is acquired through experience. But research suggests that at least some of what we know is indeed hard-wired into our brains, just as Kant suspected. As you’ll see in the Development chapter, even newborn infants seem to have some basic knowledge of the laws of physics and mathematics. The tabula is not rasa and the slate is not blank, which leads to some interesting questions, such as: What exactly is written on the slate at birth? How and when in our evolutionary history did it get there? Can experience erase the slate as well as write on it? Psychologists refer to these types of questions as “nature-versus-nurture” questions, and as you will see in some of the upcoming chapters, they have a host of techniques for answering them.

Other Voices
How Should We Judge Historical Figures?

This chapter celebrates the ideas of several historical figures who were inarguably brilliant, but who held some opinions that today most of us find abhorrent. Immanuel Kant’s development of idealism was a stroke of genius, but that same genius argued that women were incapable of reasoning from moral principles and then went on to discuss at some length which Africans made the best slaves. How should we think about people like Kant, whose intellectual contributions are as undeniable as his sexist and racist views?
Writer and director Paul Ratner believes this question has no easy answer, and he grapples with it by considering the controversy over the removal of Confederate statues from public places.

Paul Ratner is a writer, educator, and filmmaker whose award-winning films include “Moses on the Mesa” and “The Caveman of Atomic City.”

How much are current American citizens responsible for the sins of their ancestors? Which men (and yes, mostly these are men) are allowed to stay up as bronze reminders of some heroic past, and which ones need to finally go to the far reaches of our collective unconscious? Do Confederate monuments and statues deserve to stay as part of the legacy of the South, or does it make any sense that a period of history that lasted about 5 years and produced attitudes that were actually defeated in a bloody Civil War is allowed to percolate in the minds of the population?

There is a big danger, on the other hand, that as the conversation turns to exorcising ghosts of currently unpopular attitudes, we are doing it through the lens of “presentism.” It’s a bias of judging the behavior of historical people through the standards of today. Oxford helpfully defines it as “uncritical adherence to present-day attitudes, especially the tendency to interpret past events in terms of modern values and concepts.”

We tend to view our present time as the best, most advanced socially and intellectually, and as such judge all others as inferior. While that may be true (certainly debatable), it’s unfair to view how people reacted to situations around them within the constraints and prejudices of the society of their day. It’s probably how people of a couple of hundred years from now will judge us, who still eat meat, as some kind of utter barbarians, a lesser human. Our present knowledge comes at the heels of wisdom gathered by generations before us. It is accumulated over time and by that standard should be richer, informed by greater experience and examination.
Yet, is it fair to say that a person living 150 years ago should not have had the attitude shared by most people of their time, who only knew what they could know by that point in history? The intelligence of societies grows not only intellectually, reshaping their governments, but emotionally. It has taken the world a while to grow in that regard, to become mature in empathy, and it’s obviously nowhere near where it should be in such evolution. As it is biased to judge a person from a different era for not having the moral foresight to stand up to their peers and end the tyranny of injustice, that is also no excuse to celebrate attitudes and statements that go squarely against what we believe in now. […]

Respecting and learning from historical figures is extremely useful and necessary, but putting anyone up on a pedestal is generally a losing proposition. Eastern Europe saw a whole century of statues being torn down every few years in the 1900s — from monarchic rulers to Communist heroes who would fall in and out of favor, then a whole period of pulling down Lenin and Stalin figures in the ’90s. Western Europe had its own idol carousel. Many other countries across the world that have had tumultuous histories and had to undergo historical reckonings did the same. It’s a process that happens in societies that experience change. Of course, the big question is — how far should this go?

Reproduced with permission of Big Think.

The Late 1800s: Toward a Science of the Mind

Learning Outcomes
- Define introspection and explain how it was used in structuralism.
- Define natural selection and explain how it influenced functionalism.

Psychology’s philosophical roots go back thousands of years, but its history as an independent science began a mere 150 or so years ago, when a few German scientists began to wonder whether the methods of the physical and natural sciences might be used to study the human mind.

Structuralism: What Is the Mind Like?
During his visit to Berlin in 1867, William James sent a letter to a friend: “It seems to me that perhaps the time has come for psychology to begin to be a science…. I am going on to study what is already known and perhaps may be able to do some work at it. Helmholtz and a man called Wundt at Heidelberg are working at it, and I hope I live through this winter to go to them in the summer.”

Hermann von Helmholtz (1821–1894) was a physician and physicist who mainly studied the mathematics of vision, but who had taken to asking people to close their eyes and respond as quickly as possible when he touched different parts of their legs. That’s not as creepy as it sounds. Helmholtz recorded each person’s reaction time, or the amount of time between the onset of a stimulus and a person’s response to that stimulus, and discovered that people generally took longer to respond when he touched their toes than when he touched their thighs. Why? When something touches your body, your nerves transmit a signal from the point of contact to your brain, and when that signal arrives at your brain, you “feel” the touch. Because your thighs are closer to your brain than your toes are, the signal from your thigh has a shorter distance to travel. By carefully measuring how long it took people to feel a thigh touch and a toe touch and then comparing the two measurements, Helmholtz was able to do something remarkable: He calculated the speed at which nerves transmit information!

Helmholtz’s research assistant, Wilhelm Wundt (1832–1920), went on to teach the world’s first course in scientific or “experimental” psychology at the University of Heidelberg in Germany in 1867, published the world’s first psychology textbook in 1874, and opened the world’s first psychology laboratory at the University of Leipzig in 1879.
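Helmholtz’s subtraction logic is simple enough to sketch in a few lines. The distances and reaction times below are illustrative assumptions, not Helmholtz’s actual measurements; the point is only that dividing the extra distance by the extra time isolates the nerve’s transmission speed.

```python
# Helmholtz's logic: the extra reaction time for a more distant touch
# reflects the extra distance the nerve signal must travel.
# All numbers are illustrative, not Helmholtz's actual data.

toe_distance_m = 1.60      # assumed nerve path length, toe to brain
thigh_distance_m = 0.80    # assumed nerve path length, thigh to brain

toe_reaction_s = 0.195     # assumed reaction time for a toe touch
thigh_reaction_s = 0.180   # assumed reaction time for a thigh touch

# Subtracting the two trials cancels everything they share (brain
# processing, muscle response), leaving only transmission speed.
speed_m_per_s = (toe_distance_m - thigh_distance_m) / (toe_reaction_s - thigh_reaction_s)
print(f"Estimated nerve conduction speed: {speed_m_per_s:.1f} m/s")
```

The elegance of the method is that neither reaction time on its own is informative, because each one bundles together sensing, deciding, and responding; only the difference between them is pure nerve travel time.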
Wundt believed that the primary goal of psychology should be to understand “the facts of consciousness, its combinations and relations, so that it may ultimately discover the laws which govern these relations and combinations” (Wundt, 1912/1973, p. 1). Natural scientists had had great success in understanding the physical world by breaking it down into its basic elements, such as cells and molecules and atoms, and Wundt decided to take the same approach to understanding the mind. His approach later came to be known as structuralism, which was an approach to psychology that attempted to isolate and analyze the mind’s basic elements.

Wilhelm Wundt taught the world’s first psychology course and published the world’s first psychology textbook. He also opened the world’s first psychology laboratory at the University of Leipzig (where you can still visit his laboratory and sit at his famous desk, shown above). Wundt was the advisor to a remarkable 184 PhD students, many of whom went on to become well-known psychologists, which is why a large percentage of modern psychologists can trace their intellectual lineage back to him. It is fair to say that modern psychology just Wundt be the same without him. (Sorry.)

Structuralists often used a technique called introspection, which is the analysis of subjective experience by trained observers. Volunteers were presented with a wide variety of stimuli, from patches of color to musical tones, and were trained to describe their moment-by-moment “raw experience,” such as the hue and luminance of the color, the feelings they had when they heard the tone, and so on. Structuralists believed that by carefully analyzing the reports from many trained observers who had been exposed to many stimuli, they would eventually discover the basic building blocks of subjective experience. But structuralism didn’t last, and you can probably guess why.
Natural scientists had indeed been successful in understanding the natural world by breaking it into small parts, but that approach was successful only because everyone could agree on what those parts were. When two biologists looked at blood under a microscope, they saw the same blood cells. This wasn’t true of everyone who looked at the color green or heard C# played on a piano. There was simply no way to tell if a person’s description of their experience was accurate, and no way to tell if their experience was the same as or different from someone else’s. So, while the German structuralists were busy introspecting, a young American upstart was taking a very different approach to the study of the mind — an approach that would forever consign structuralism to the history chapter of psychology textbooks.

Functionalism: What Is the Mind For?

William James felt that subjective experience was less like a molecule made of atoms and more like a river — a “stream of consciousness,” as he called it — and that trying to isolate its basic elements was a losing proposition. James thought psychologists should worry less about what mental life was like, and more about what it was for. James and other psychologists developed a new approach to psychology called functionalism, which was an approach to psychology that emphasized the adaptive significance of mental processes.

What does “adaptive significance” mean? The answer came from Charles Darwin (1809–1882), a naturalist who had recently published a book entitled On the Origin of Species by Means of Natural Selection (1859). In it, Darwin had proposed the principle of natural selection, which refers to the process by which the specific attributes that promote an organism’s survival and reproduction become more prevalent in the population over time. How does natural selection work?
Animals pass their physical attributes to their offspring, and those attributes that are most “adaptive” — that is, those that promote the offspring’s survival and reproduction — are more likely to be passed along from one generation to the next. Over time, these adaptive attributes become increasingly prevalent in the population simply because “the population” refers to those animals that have managed to survive and reproduce.

Charles Darwin and Alfred Russel Wallace independently developed the theory of evolution and then announced it in a joint publication of the Linnean Society in 1858. But the next year, Darwin published the theory in a book called On the Origin of Species and became one of the most famous scientists of all time, while the world pretty much forgot about good old Wallace.

For example, humans have fingers instead of flippers because at some point in the distant past, those of our ancestors who developed fingers were better able to survive and reproduce than those who did not, and they passed their flipperless fingeredness on to us. That’s the principle of natural selection at work, shaping the human body. James reasoned that if our physical characteristics had evolved because they were adaptive, then natural selection should also have shaped the mind. “Consciousness,” James wrote in 1892, “has in all probability been evolved, like all other functions, for a use.” According to James, the task for psychologists was to figure out what that use was.

The Early 1900s: Psychoanalysis and Behaviorism

Learning Outcomes
- Outline the basic ideas behind Freud’s psychoanalytic theory.
- Define the basic idea behind behaviorism.
- Explain the principle of reinforcement.

Structuralism and functionalism were important ideas — to the hundred or so people who knew anything about them. While 19th-century academics debated the best way to study the mind, the rest of the world paid approximately no attention.
But all that would change in the next century, when a restless neurologist from Vienna and a failed writer from Pennsylvania would pull psychology in opposite directions and, in the process, become two of the most influential thinkers of all time.

Psychoanalysis: The Mind Does Not Know Itself

While experimental psychologists were trying to understand the mind, physicians were trying to heal it. The French physicians Jean-Martin Charcot (1825–1893) and Pierre Janet (1859–1947) became interested in patients who had an odd collection of symptoms — some were blind, some were paralyzed, and some were unable to remember their identities — but who had no obvious physical illness or injury. They found that when these patients were hypnotized, their symptoms disappeared, and when the patients emerged from their hypnotic trances, their symptoms returned. Charcot and Janet referred to their patients’ condition as hysteria, which is a now obsolete term that refers to a loss of function that has no obvious physical origin. What could possibly explain it?

Enter Sigmund Freud (1856–1939), a young Viennese physician in his late 20s who began his career studying the effects of cocaine and the sexual anatomy of eels (though not at the same time). Freud suspected that many patients with hysteria and other “nervous disorders” had had a childhood experience so painful that they couldn’t allow themselves to remember it. These memories, he reasoned, had been hidden from consciousness and relegated to a place Freud called the unconscious, which is the part of the mind that contains information of which people are not aware. Freud felt confident that these exiled or “repressed” memories were the source of his patients’ hysterical symptoms, and he spent the next several years developing an elaborate theory of the mind known as psychoanalytic theory, which is a general theory that emphasizes the influence of the unconscious on feelings, thoughts, and behaviors.
Freud’s theory was complex, and you’ll learn much more about it in later chapters. But in brief, Freud saw the mind as a set of processes that were largely hidden from our view, and he regarded the conscious thoughts and feelings that the structuralists had worked so hard to identify as little more than flotsam and jetsam, bobbing on the surface of a vast and mysterious ocean. To understand the ocean, Freud suggested, you can’t just skim the surface. You have to dive — and when you do, you should expect to encounter some frightening things. For Freud, those frightening things were the person’s anxieties and impulses — the fear of death, the desire to kill, forbidden sexual urges, and so on — all of which were lurking beneath the waves.

Sigmund Freud’s first major book, The Interpretation of Dreams, sold only 600 copies in the first 8 years. In a despairing letter to a friend, Freud wrote, “Do you suppose that someday a marble tablet will be placed on the house, inscribed with these words: ‘In this house on July 24, 1895, the secret of dreams was revealed to Dr. Sigm. Freud’?” Today the site of that house bears a memorial plaque with precisely that inscription.

Freud believed that the only way to confront these denizens of the deep was through psychoanalysis, which is a therapy that aims to give people insight into the contents of their unconscious minds. A therapeutic session with Sigmund Freud began with the patient lying on a couch and Freud sitting just behind them (probably smoking a cigar). He might ask the patient to describe their dreams or to “free associate” by talking about anything they wanted or by responding quickly to a word (“What pops into your head when I say ‘mother’?”). Freud believed that his patients’ dreams and free associations offered a glimpse into the contents of their unconscious minds, and that if he could see what was there, he could heal them.

William James thought most of Freud’s theorizing was nonsense.
“I strongly suspect Freud, with his dream theory, of being a regular hallucine,” he wrote in a letter to a friend in 1909. “Hallucine” is an old-fashioned word for “lunatic,” so this was not meant as a compliment. Most experimental psychologists shared James’s assessment and paid scant attention to Freud’s ideas. On the other hand, clinicians paid a lot of attention, and within a decade, Freud’s psychoanalytic movement had attracted a virtual army of disciples. Indeed, Freud’s thinking influenced just about everything in the 20th century — from history and philosophy to literature and art — which is why Freud is ranked as the 44th most influential person in human history (Skiena & Ward, 2013).

Behaviorism: The Mind Does Not Matter

James had a somewhat dim view of Freud, but as the 20th century got rolling, another, much younger psychologist took an even dimmer view of Freud — and of James, Wundt, and everyone else who had ever talked about the “science of the mind.” That young psychologist had been born in the tiny town of Traveler’s Rest, South Carolina, and went on to the University of Chicago to study the behavior of rats. When his interest changed to the behavior of people, his changing interest changed the world.

Watson and Pavlov

To John Broadus Watson (1878–1958), everything worth knowing about a rat — how it feeds and mates, how it builds its nest and rears its young — could be known just by watching it, and he wondered why human beings couldn’t be known the same way. Why should the study of human behavior require a bunch of idle speculation about the human mind? Mental life was idiosyncratic, undefinable, and unmeasurable, and Watson felt that if psychology wanted to become a real science, it should limit itself to studying the things people do rather than the things they claim to think and feel. Watson called this idea behaviorism, which is an approach to psychology that restricts scientific inquiry to observable behavior.

John B. Watson was the founder of behaviorism, which revolutionized American psychology in the early 20th century. Watson was married, and his academic career was cut short when a scandalous love affair led Johns Hopkins University to dismiss him in 1920. He took a job in advertising, where he spent the remainder of his life working on campaigns for products such as Maxwell House coffee, Johnson & Johnson baby powder, and Pebeco toothpaste.

Watson was impressed by the work of the Russian physiologist Ivan Pavlov (1849–1936), who studied digestion in dogs. Pavlov knew that dogs naturally start salivating when they are presented with food. But one day he noticed that the dogs in his laboratory started salivating when they heard the footsteps of the research assistant who was coming down the hall to feed them! Pavlov suspected that his dogs had come to associate the feeder’s footsteps with the arrival of food and that the dogs were responding to the footsteps as though they were the food itself. He devised an experiment to test this hypothesis. First, he sounded a tone every time he fed his dogs. Then, after a few days, he sounded the tone without feeding the dogs. What happened? The dogs salivated when they heard the tone. Pavlov called the tone a stimulus and the salivation a response.

This photo shows Baika, one of Ivan Pavlov’s famous dogs. Pavlov won the Nobel Prize in 1904 for his research on digestion.

When Watson read about this research, he quickly realized that these two concepts — stimulus and response — could be the building blocks of a new behaviorist approach. Psychology, Watson argued, should be the scientific study of the relationship between stimuli and responses — nothing less, and certainly nothing more.
In his 1919 book, Psychology from the Standpoint of a Behaviorist, he wrote: “The goal of psychological study is the ascertaining of such data and laws that, given the stimulus, psychology can predict what the response will be.” He proudly noted that in his book “the reader will find no discussion of consciousness and no reference to such terms as sensation, perception, attention, will, image and the like” because “I frankly do not know what they mean, nor do I believe that anyone else can use them consistently.” Watson’s arguments were wildly persuasive. Before Watson, some psychologists were structuralists, some were functionalists, and many were agnostics. Then, as one historian wrote, “Watson touched a match to the mass, there was an explosion, and only behaviorism was left” (Boring, 1929). But if Watson had convinced psychologists that behaviorism was the one and only proper way to study human behavior, it would take a skinny kid from Pennsylvania to convince the rest of the world of the same thing. Skinner Burrhus Frederic Skinner (1904–1990) grew up in Pennsylvania and graduated from Hamilton College in 1926 with the intention of becoming a writer, and like many young people with that particular aspiration, he took a job at a bookstore in New York City. One day while browsing the shelves he came across some books by Pavlov and by Watson. He was captivated. He abandoned his nascent writing career and enrolled as a graduate student in the psychology department at Harvard University — the same department from which William James had retired 20 years earlier. Among B. F. Skinner’s many inventions was the Skinner Box, shown here. “We need to change our behavior and we can do so only by changing our physical and social environments,” he wrote. 
“We choose the wrong path at the very start when we suppose that our goal is to change the ‘minds and hearts of men and women’ rather than the world in which they live.” Skinner greatly admired the work of Pavlov and Watson, but as he studied them, he started to suspect that their simple stimulus– response psychology was missing something important. Pavlov’s dogs lived in a laboratory where they sat around and waited to be fed, but in the real world, animals had to act on their environments to find food. How did they learn to do that? To find out, Skinner built a cage for laboratory animals that the world would soon come to call a “Skinner Box.” The cage had a lever that, when pressed by a hungry rat, delivered food through a tube. Skinner then devised a “cumulative recorder,” which recorded the frequency of the rat’s lever-presses in real time. These inventions don’t sound like much today, but in 1930 they were serious technology. Moreover, they allowed Skinner to discover something remarkable. When Skinner put a rat in one of his special cages, it would typically wander around for a while, sniffing and exploring, until it accidentally bumped the lever, causing a food pellet to appear as if by magic. After this happy accident had happened a few times, the rat would suddenly start pressing the lever — tentatively at first, then more quickly and more often, until it basically looked like the conga player in a hot Latin jazz band. Unlike Pavlov’s dogs, which had learned to monitor their environments and anticipate food, Skinner’s rats had learned to operate on their environments to produce food. When the rat’s behavior produced food (which Skinner called a “reinforcement”), the rat would repeat the behavior; and when it didn’t, the rat wouldn’t. 
Animals do what they are rewarded for doing, Skinner concluded, and he called this the principle of reinforcement, which is a principle stating that any behavior that is rewarded will be repeated and any behavior that isn’t rewarded won’t be repeated. Skinner argued that this simple principle could explain how rats learn to find food, but that it could also explain the most complex human behaviors. In a relatively short time, Skinner’s theories about the effects of reward came to dominate psychology. Structuralism and functionalism quietly disappeared, and “behaviorism was viewed as the one right way to do psychological science” (Baars, 1986, p. 32). Like Freud’s, Skinner’s influence went far beyond the academy. His theories spread across the globe and became the foundation of classroom education, government programs, psychological therapies, and even child-rearing practices. In two controversial best-sellers — Walden Two (1948) and Beyond Freedom and Dignity (1971) — Skinner laid out his vision for a utopian society in which all human behavior was controlled by the judicious application of the principle of reinforcement. In these books, Skinner claimed that free will was an illusion, and that the world could solve its most pressing social problems by acknowledging that people do what they are rewarded for doing, and that their sense of “choosing” and “deciding” is a dangerous fiction. As you might expect, Skinner’s critics were many and fierce. Time magazine featured him on its cover beneath the words “B. F. Skinner Says: We Can’t Afford Freedom.” One reviewer called his book “fascism without tears,” and another called it “a corpse patched with nuts, bolts and screws from the junkyard of philosophy.” One magazine warned that Skinner was advocating “the taming of mankind through a system of dog obedience schools for all.” These attacks were predictable but mistaken. Skinner did not want to turn classrooms into obedience schools or strip citizens of their civil rights. 
Rather, he simply believed that a scientific understanding of the principles that govern behavior could be used to improve social welfare and that behaviorists knew what those principles were. The Early 1900s: Resistance to Behaviorism Learning Outcomes Explain why several European psychologists resisted behaviorism. Explain why American social psychologists resisted behaviorism. In the early 1900s, behaviorism was king; but not all the subjects were loyal. In fact, several pockets of resistance could be found throughout the kingdom — groups of psychologists who refused to swear allegiance to the crown and whose work would soon foment a counterrevolution. Gestalt Psychology and Developmental Psychology Many of the dissidents lived in Europe. The German psychologist Max Wertheimer (1880–1943) studied how people perceive motion, and in one of his experiments, participants were shown two lights that flashed quickly on a screen, one after the other. When the time between the flashes was relatively long, the participant would correctly report that the two lights were flashing in sequence; but when the time between flashes was reduced to about 1/5th of a second, participants reported that a single light was moving back and forth (Fancher, 1979; Sarris, 1989). Wertheimer argued that this “illusory motion” occurs because the mind has theories about how the world works (e.g., “when an object is in one location and then instantly appears in a contiguous location, it probably moved”) and it uses these theories to make sense of incoming sensory data. (You’ve already encountered this idea under the name “philosophical idealism.”) In both conditions of Wertheimer’s experiment, participants had been shown exactly the same physical stimuli, but they had seen different things. Physical stimuli, Wertheimer concluded, are part of the perceptual experience, but the whole is more than the sum of its parts. 
The German word for “whole” is gestalt, and Wertheimer and his colleagues called their approach Gestalt psychology, which is an approach to psychology that emphasized the way in which the mind creates perceptual experience. If you look up at a “news ticker,” the words in the headlines seem to be scrolling by, moving from right to left. But they aren’t really moving. Rather, contiguous lights are going on and off in rapid succession. This is the phenomenon that the Gestalt psychologist Max Wertheimer was studying in Germany in 1912. While German psychologists were studying why people sometimes see things as they aren’t, the British psychologist Sir Frederic Bartlett (1886–1969) was studying why people sometimes remember things as they weren’t. Bartlett asked participants to read stories and then try to remember the material — 15 minutes to several years later. Bartlett found that participants often remembered what they had expected to read rather than what they had actually read, and that this tendency became more pronounced with the passage of time. Bartlett argued that memory is not a simple recording device, but rather, that our minds use their theories of how the world usually works to construct our memories of past experience (see The Real World: Beneath the Ocean of Memory). While German and British psychologists were trying to understand the minds of adults, Swiss psychologist Jean Piaget (1896–1980) was trying to understand the minds of children, which he often did by examining the mistakes they made. For example, in one study, Piaget showed 3-year-olds two equally large mounds of clay and then broke one mound into little pieces. When the children were asked which mound now had “more clay,” they typically said that the unbroken one did. By the age of 6 or 7, children no longer made this mistake. 
Piaget concluded that the mind has theories about how the world works (“Breaking a material object into pieces doesn’t change the amount of material in it”) and that because small children have not yet learned these theories, they see the world in a fundamentally different way than adults do. Piaget and his contemporaries helped create an area of experimental psychology called developmental psychology, which is the study of the ways in which psychological phenomena change over the life span. Jean Piaget’s studies focused on the mistakes children make in thinking about objects in the world. For example, young children mistakenly believe that when an object changes shape, it also changes mass — that a ball of clay becomes “more clay” when it is flattened into a log. In short, while many early 20th-century psychologists were flying the behaviorist flag, a small number were quietly doing the very thing that behaviorism forbade: studying people’s perceptions, memories, and judgments in order to understand the nature of an unobservable entity called the mind. Social Psychology Like many German Jews, Kurt Lewin (1890–1947) had fled to America in the early 1930s when Hitler came to power. After taking a job as a professor at MIT, Lewin began studying topics such as leadership, communication, attitude change, and racial prejudice. At the heart of his many different research projects was a single, simple idea: Behavior is not a function of the environment, but of the person’s subjective construal of the environment. Responses do not depend on stimuli, as the behaviorists claimed; rather, they depend on how people think about those stimuli. Kurt Lewin’s studies focused on the psychological differences between autocracy (in which one person has power over all others) and democracy (in which all people share power). In a series of studies, he assigned 10-year-old children to work together in autocratic groups on some days and in democratic groups on other days. 
He observed that “the change from autocracy to democracy seemed to take somewhat more time than from democracy to autocracy” and concluded that this was because “Autocracy is imposed upon the individual. Democracy he has to learn” (Lewin, 1948). Lewin’s research and theorizing gave birth to a new area of experimental psychology called social psychology, which is the study of the causes and consequences of sociality. Social psychologists sought to understand how people see the social world. For example, Solomon Asch (1907–1996) told a group of participants about a man who was envious, stubborn, critical, impulsive, industrious, and intelligent — a string of adjectives that went from bad to good. He told another group about a man who was intelligent, industrious, impulsive, critical, stubborn, and envious — exactly the same list of adjectives but in the opposite order. Asch discovered that the participants who heard the man’s good traits first liked him more. Asch argued that this “primacy effect” occurred because the early words in each list created a theory (“Intelligent and industrious — wow, this is a really great guy”) that the mind then used to interpret the later words in that list (“Stubborn probably means that he sticks to his principles”). Asch’s studies led to an avalanche of research on how people draw inferences about others. Solomon Asch was influenced by Gestalt psychology and those who resisted the edicts of behaviorism by studying how people think about each other. His early studies of the “primacy effect” showed that early information about a person changes the interpretation of later information. If you saw the photo on the left before the photo on the right, you’d form one impression of the man (“He’s a fairly straight-ahead guy who likes to party on weekends”); but if you saw the photos in the opposite order, you’d probably form a very different impression (“He’s a hipster who covers his ink for his day job”). 
Other social psychologists studied how people persuade each other to change their beliefs, how people form stereotypes, and how people create identities based on their social groups. Beliefs, stereotypes, prejudices, identities, intentions — concepts like these had been banished from behaviorism but were the heart and soul of social psychology. “The power, the honors, the authority, the textbooks, the money, everything in psychology was owned by the behavioristic school,” psychologist George Miller later remembered (Baars, 1986, p. 203). “Those who didn’t give a damn, in clinical or social psychology, went off and did their own thing.” What the social psychologists didn’t know at the time was that their thing would soon be everybody’s thing because the behaviorist kingdom was about to be attacked, invaded, and conquered. The Real World Beneath the Ocean of Memory Sir Frederic Bartlett was interested in how memory worked in “the real world,” and despite the ascendance of behaviorism, he spent his life studying it. During World War II, he established the Applied Psychology Unit at the Cambridge Laboratory of Industrial Research to help the British military in their efforts to defeat Hitler. So it was more than fitting that nearly a half century after his death, Bartlett’s pioneering studies of human memory helped solve a naval mystery. During World War II, the Australian warship Sydney (shown in the photo) battled the German warship Kormoran, and both ships sank in the Indian Ocean. There were just a few survivors, and when they were interrogated months later, each had a different memory of the precise spot where the two ships went down. Despite numerous attempts to locate the ships, the wreckage remained lost at the bottom of the sea. Then, in 1998, psychologists John Dunn and Kim Kirsner decided to see if they could use Bartlett’s research to estimate how the survivors’ memories might have become distorted over time (Dunn & Kirsner, 2011). 
“What we found was that there was a correspondence — that our data looked like the kind of data that Bartlett had generated in his study,” said Dunn (Spiegel, 2011). By combining the survivors’ testimony with Bartlett’s ideas about memory distortion, the psychologists were able to make a prediction about where the ships might actually be. “I never really thought that I would ever find out whether it would be right or wrong,” said Dunn. But he did find out, because in 2008, a team of shipwreck-hunters found the ships on the ocean floor — just about where the two psychologists had predicted they would be. Despite what his behaviorist colleagues had claimed, Sir Frederic Bartlett’s mentalistic research was “real science” after all. The Late 1900s: The Cognitive Revolution Learning Outcomes Summarize Chomsky’s critique of Skinner. Explain what cognitive psychology is and how it emerged. Explain what evolutionary psychology is and why it emerged. In 1957, Skinner published a book called Verbal Behavior, in which he offered a behaviorist account of how children learn language. The linguist Noam Chomsky decided that there had been enough passive resistance to behaviorism over the last 40 years and that it was time to mount a full-scale attack. In 1959, he published a devastating critique of Skinner’s book, arguing that behaviorist principles could never explain some of the most obvious features of language-learning. For example, children create novel sentences that they have never heard before. How do they produce them? The obvious answer is that they use grammar — a complex set of rules that tells them which of an infinitely large number of possible sentences are permissible (“The girl ran after the ball”) and which are not (“The girl after the ball ran”). So how do they learn these rules? Using formal mathematical logic, Chomsky showed that a purely behaviorist account of learning could never explain how children learn grammar. 
Chomsky (1959) suggested that it was time to toss behaviorism into the dustbin of history: “If the study of language is limited in these ways,” he wrote, “it seems inevitable that major aspects of verbal behavior will remain a mystery.” The world was listening. But “Out with the old!” is a successful rallying cry only when followed by “In with the new!” and there was indeed something new happening in the 1960s that would spark a flame and push behaviorism to psychology’s back burner. It wasn’t a new philosophy, scientific discovery, or social movement. It was a mindless, soulless machine. Cognitive Psychology ENIAC, the first general-purpose electronic digital computer, was built in 1945. It weighed 30 tons, was the size of a small house, cost the equivalent of $7,000,000, and was of interest mainly to a handful of engineers, the U.S. Army, and a few random nerds. But ENIAC and all the computers that followed did something important: They gave psychologists permission once again to talk about the mind. How? In this 1946 photo, Marlyn Wescoff (left) and Ruth Lichterman (right) are programming ENIAC, the world’s first digital computer. This revolutionary device did many things, but one of the most important is that it gave psychologists a scientifically respectable way to talk about mental processes. A computer’s observable behavior is as simple as a rat’s. Present the computer with a stimulus (“2 + 2 = ?”) and it will produce a response (“4”). But unlike a rat, the way the computer produces its response is neither hidden nor mysterious. It produces its response by processing information — that is, by encoding information, storing it in memory, retrieving it on demand, and combining it in lawful ways. Computers do things that from the outside look a whole lot like learning, reasoning, remembering — and maybe even thinking. 
If those words could legitimately be used to describe the physical information-processing operations that happen inside a machine, why couldn’t they also be used to describe the physical information-processing operations that happen inside a brain? If the brain is hardware, then the mind is software — and there is nothing spooky or unmeasurable about the way a software program works. With the digital computer as their guiding metaphor, psychologists of the 1950s and 1960s suddenly felt emboldened to study topics that everyone but the dissidents had ignored for decades: how people allocate their attention, how people expand their capacity to process information by combining it into chunks, how a person’s desires can shape their perceptions of physical objects, and so on. This dramatic shift in psychology’s orientation came to be known as “the cognitive revolution.” Cognitive psychology is the study of human information processing, and for more than 50 years it has been producing deep insights into the nature of the human mind — many of which you will learn about in the chapters to come. Evolutionary Psychology Behaviorism set the mind aside and cognitive psychology brought it back. But behaviorism also set something else aside: the past. Behaviorists viewed individuals as blank slates who came into the world with nothing but a readiness to be shaped by their environments. As John Watson (1930, p. 89) wrote: Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I’ll guarantee to take any one at random and train him to become any type of specialist I might select — doctor, lawyer, artist, merchant-chief and, yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors. Watson’s view was egalitarian and optimistic. But it was also wrong. In the 1960s, the psychologist John Garcia (1917–2012) was studying how rats react to radiation sickness. 
He noticed that his rats instantly learned to associate their nausea with the taste of the food they ate just before getting sick, and they instantly developed an aversion to that food. On the other hand, no matter how much training they received, his rats could not learn to associate their nausea with a flashing light or the sound of a buzzer. That just didn’t make sense. Pavlov had shown that when two stimuli are paired (e.g., a researcher’s footsteps and the appearance of food), animals will learn to associate one with the other — and it wasn’t supposed to matter whether those stimuli were foods, footsteps, lights, or buzzers. It wasn’t supposed to matter, but it did. What could that mean? John Garcia’s experiments showed that the ease with which associations are learned can be influenced by an organism’s evolutionary history. Because nausea and vomiting are usually caused by the ingestion of food, nature has programmed mammals to instantly develop an aversion to any food they happened to eat just before getting sick to their stomachs. Garcia thought it meant that every organism has evolved to respond to particular stimuli in particular ways — that animals come into the world “biologically prepared” to learn some associations more easily than others. In the real world of forests and sewers, a rat’s nausea is usually caused by eating spoiled food, and although Garcia’s rats had been born in a laboratory and had never eaten spoiled food, their ancestors had. Millions of years of evolution had designed the rat brain so that it would quickly learn to associate an episode of nausea with the taste of food, and that’s why rats learned this association so much more quickly and easily than they learned others. Rats, it turned out, were not blank slates — so why should psychologists think of people that way? 
Findings such as these led to a new area of experimental psychology called evolutionary psychology, which is the study of the ways in which the human mind has been shaped by natural selection. Evolutionary psychologists began studying gender differences in sexual promiscuity, how people detect cheaters in a social exchange, and how people select their ideal mate. Evolutionary psychology “is not a specific subfield of psychology, such as the study of vision, reasoning, or social behavior. It is a way of thinking about psychology that can be applied to any topic within it” (Cosmides & Tooby, 2000, p. 115), and, as you will see in the upcoming chapters, modern psychologists now apply evolutionary thinking to a wide array of topics. After the Revolution Behaviorism was a valuable approach that led to many important discoveries about human behavior, but it ignored the mind and it ignored the past — both of which are just too important to stay ignored for long. Although many modern psychologists still study how rats learn and how reinforcement shapes behavior, few are behaviorists who regard humans as blank slates or who believe that mental processes are unmeasurable fictions. Ironically, the emergence of cognitive and evolutionary psychology has in some ways brought psychology full circle. Like the structuralists, cognitive psychologists now ask what the mind is like; and like the functionalists, evolutionary psychologists now ask what the mind is for. Apparently, some questions are just too interesting to go gentle into that good night. The Early 2000s: New Frontiers Learning Outcomes Define neuroscience and explain how modern psychologists study the brain. Define cultural psychology and explain why it matters. The cognitive revolution fundamentally changed psychology at the close of the 20th century. But science doesn’t stand still. In the present century, psychology has continued to evolve, and several new and exciting areas have emerged. 
We’ll discuss two of them — one that has psychologists looking “down a level” to biology as they search for the neural substrates of mental life, and another that has psychologists looking “up a level” to sociology and anthropology as they seek to understand its cultural origins. Neuroscience The mind is what the brain does. But until recently, knowledge of the brain was based primarily on studies of brains that had been damaged. For instance, in 1861, the French physician Paul Broca (1824–1880) performed an autopsy on a man who had been able to understand words but not produce them, and he found damage in a small region on the left side of that man’s brain. Broca concluded that the ability to speak somehow depended on this particular region — and the fact that this region is today called Broca’s area tells you that he was right. Psychologists learned from brains that were damaged by nature, but they also learned from brains that they damaged themselves. In the 1930s, for instance, the psychologist Karl Lashley (1890–1958) taught rats to run a maze and then surgically damaged different parts of the rats’ cerebral cortices and measured changes in their performance. To his surprise, he found that while brain damage impaired performance, it didn’t really matter where on the cortex the damage was inflicted, which led Lashley to conclude that learning is not “localized” or tied to a specific brain area in the same way that language seemed to be. Of course, damaged brains can only teach us so much. Imagine how hard it would be to figure out how a car works if all you could do was smash it in different places with a hammer and then see how well it drove. Fortunately, newer technologies allow psychologists to observe the undamaged brain in action. For example, functional magnetic resonance imaging (fMRI) is a technology that produces the “brain scans” you often read about in the news. 
These scans are maps showing the amount of blood that was flowing in different parts of a person’s brain at a particular moment in time. Because neural activity requires oxygen, and because blood supplies it, these scans can tell us which areas of a brain were working hardest to process information at any particular time, and this has taught us things we could never have learned by merely examining damaged brains. Technologies such as fMRI allow cognitive neuroscientists to determine which areas of the brain are most and least active when people perform various mental tasks, such as reading, writing, thinking, or remembering. The machine (left) produces what are commonly called “brain scans” (right) but which are actually maps of blood flow in different parts of a person’s brain. For instance, Broca would not have been surprised to learn that people using their hands to speak American Sign Language (ASL) show increased neural activity in the same region of the left hemisphere that he identified in 1861. But he might have been surprised by research that has used fMRI to show that this left-hemisphere activity occurs only in the brains of speakers who became deaf in adulthood. Speakers who were born deaf show increased neural activity in both the left and right hemispheres, suggesting that they are speaking ASL in a very different way (Newman et al., 2002). The advent of the fMRI and other technologies that you’ll learn about in the Neuroscience and Behavior chapter has given birth to two new areas of psychology: cognitive neuroscience, which is the study of the relationship between the brain and the mind (especially in humans), and behavioral neuroscience, which is the study of the relationship between the brain and behavior (especially in nonhuman animals). 
Cultural Psychology The human beings who inhabit the mountains of India, the plains of China, the cities of Africa, the jungles of Brazil, and the classrooms of America are more alike than they are different, but the differences are important to understanding how they think, feel, and act. Culture refers to the values, traditions, and beliefs that are shared by a particular group of people, and although we usually think of culture in terms of nationality and ethnicity, it can also be defined by age (e.g., youth culture), sexual orientation (e.g., gay culture), religion (e.g., Jewish culture), occupation (e.g., academic culture), and many of the other dimensions on which people differ (see A World of Difference: To Have or Have Not). Scholars have been interested in cultural differences since at least the days of the ancient Greeks, but most 19th-century psychologists were content to ignore them and to assume that what they were studying was universal, or that exceptions to the rule didn’t really matter. In the 20th century, culture became a topic of great interest to the social psychologists, but the rest of psychology ignored it. After all, how important could culture be if rats didn’t have any? Culture can influence how and what we see. Participants in a study were shown this scene and then another version of this scene in which something was changed. American participants were more likely to spot changes to the red car, but Japanese participants were more likely to spot changes to the buildings. All of that has now changed. America has become more diverse and its diversity has become more apparent, which means that the importance of culture is looming larger than ever before. Cultural psychology is the study of how culture influences mental life, and those influences can be quite profound. For example, in one study, American and Japanese participants were shown two drawings that differed in a few small details and were then asked to spot the differences. 
The Americans detected more differences in the foreground objects, whereas the Japanese detected more differences in the background objects (Masuda & Nisbett, 2001). Why? The researchers suggested that while Americans live in an independent and individualistic society, the Japanese live in an interdependent society that requires them to attend to relationships and context; this cultural difference may influence the kinds of visual information to which people from each culture naturally attend. Whereas American participants in the study tended to process visual information “analytically” by attending to objects in the foreground, Japanese participants tended to process visual information “holistically” by attending to the background. Because culture can influence just about everything psychologists study, you’ll learn about work in cultural psychology in every one of the upcoming chapters. A World of Difference To Have or Have Not When we think about “other cultures,” most of us imagine faraway lands filled with people eating exotic foods, wearing unfamiliar clothes, and speaking languages we can’t understand. But you don’t have to board an airplane to visit a culture different from your own, because in just about every place on earth, there are two distinctly different cultures living side by side: those who have more — more money, more education, more prestige — and those who have less (Kraus et al., 2011). In even the most egalitarian societies, people can be divided into higher and lower social classes, and as it turns out, social class is a powerful determinant of human behavior. Consider an example. Because upper-class people have ample material resources, they don’t need to depend much on others. When problems arise, upper-class people rely on their bank accounts, whereas lower-class people rely on family, friends, and neighbors with whom they must maintain good relationships. 
In a way, one of the luxuries that upper-class people enjoy is the luxury of not worrying too much about what others feel or think. Does having that luxury influence their behavior? Indeed it does. In laboratory studies, upper-class people often prove to be less generous, less charitable, less trusting, and less helpful toward others (Piff et al., 2010), as well as more likely to lie and cheat for personal gain (Gino & Pierce, 2009; Piff et al., 2012). This "me first" orientation is easy to see outside of the laboratory, too. For example, in one study, researchers stood near the intersection of two busy streets and recorded the make, model, and year of the cars that approached. They then watched to see whether the drivers cut off other cars and pedestrians in the intersection. As the two graphs show, the drivers of new, expensive cars were considerably more likely to zip through intersections without regard for others.

Is this because being upper-class makes people selfish? Or is it because being selfish makes people upper-class? Some studies suggest that the first explanation is the right one. For instance, when participants in experiments are randomly assigned to think of themselves as upper-class — for instance, when they are asked to compare their incomes to those who have less — they also behave more selfishly (Piff et al., 2012).

Social class matters. So do gender, race, religion, age, and most of the other dimensions on which human beings differ. Psychological science often produces conclusions about "people on average," but it is important to keep in mind that while averages are useful for understanding and predicting behavior, people actually come in nearly infinite varieties, and the things that distinguish them are often as interesting as the things that make them one.

Becoming a Psychologist

Learning Outcomes
Describe the diversity of psychology.
Outline the different kinds of training psychologists may receive. 
Identify the careers available to psychologists.

Although most ordinary people don't know exactly what psychology is, they have a sneaking suspicion that psychologists can look directly into their minds and read their thoughts, especially the sexual ones. In fact, psychologists can't do this, but they can do other things that are much more useful, such as helping people, doing research, and making lists with three examples. Now that you know where psychology came from, and how it went from there to here, we'll close this chapter by looking at modern psychology as a profession.

Who Becomes a Psychologist?

In July of 1892, William James and six other psychologists decided to form an organization that represented psychology as a profession, and so the American Psychological Association (APA) was born. Today their little club boasts more than 75,000 members — and a second professional organization, the Association for Psychological Science, formed in 1988, now has 30,000 members. James and his friends would never have guessed how massive their profession would soon become, or that by 2017, about 70% of the people receiving PhDs in psychology would be women (National Science Foundation, 2018). Or maybe they would have. After all, just a few years after the APA was founded, Mary Whiton Calkins (1863–1930) became its president. Calkins studied at Harvard with James, and over the course of her career wrote four books and published more than 100 scientific papers. In 1894, Margaret Floy Washburn (1871–1939) became the first woman to receive her PhD in psychology. She too went on to serve as APA's president. Today, women play leading roles in all areas of psychology. In fact, women make up a majority of the membership of APA and hold a majority of its governance positions (National Science Foundation, 2018). 
You probably noticed that a chapter like this one, on the history of psychology, doesn't feature many women, and that's because for most of its existence, the profession was largely closed to them. But history chapters will start to look different in the coming years, because today there are plenty of women in psychology and they are most definitely making history.

[Photo caption: In 1890, Harvard was an all-male school, but Mary Whiton Calkins (left) was given special permission to study there with William James. Despite completing all the requirements for a PhD, however, the president of Harvard refused to award her a degree because she was a woman. James was outraged, describing her performance as "the most brilliant examination for the PhD that we have had at Harvard." Calkins went on to become a professor at Wellesley College, as well as the first female president of APA. In 1894, Margaret Floy Washburn (right) became the first woman to receive a PhD in psychology (from Cornell University). She became a professor at Vassar College, and she too went on to serve as APA's president. Today, women earn the majority of PhDs in psychology from American universities.]

Just as there were no women at APA's founding meeting in 1892, neither were there any people of color. But that too has changed. In 1920, Francis Cecil Sumner (1895–1954) became the first Black person to receive a PhD in psychology, and in 1970, his student Kenneth Clark (1914–2005) became the first Black person to serve as APA's president. He and his wife Mamie Phipps Clark studied the ways in which Black children were psychologically harmed by segregation, and in 1954 their groundbreaking research was cited in the U.S. Supreme Court's ruling in Brown v. Board of Education, which held that racial segregation of public schools was unconstitutional (Guthrie, 2000). Today, non-white students earn about 30% of the PhDs in psychology awarded by American universities. 
America has changed and so has psychology — in ways that would surely have made the seven founders proud. The next century's history will not look very much like the last one's.

[Photo caption: Mamie Phipps Clark and Kenneth Clark were among the first Black Americans to earn PhDs in psychology. Together, they studied the psychological effects of prejudice, discrimination, and segregation on children. Based in part on their research, the U.S. Supreme Court declared segregation unconstitutional. Chief Justice Earl Warren concluded that "to separate them [Black children] from others of similar age and qualifications solely because of their race generates a feeling of inferiority as to their status in the community that may affect their hearts and minds in a way unlikely to ever be undone." Without the work of these two pioneering psychologists, the integration of American schools might not have happened when it did.]

How Do People Become Psychologists?

College students who major in psychology usually come away with a bachelor's degree. They can call themselves educated, they can call themselves graduated, and if they really want to, they can call themselves bachelors. But they can't call themselves psychologists. To be called a psychologist requires earning an additional advanced degree. Two of the most common of these are the master's degree and the PhD in psychology. The abbreviation PhD stands for "doctor of philosophy" (which actually has nothing to do with philosophy and a lot to do with the history of 19th-century German universities — don't ask). To earn a master's or a PhD, students must attend graduate school, where they take classes and learn to do original research by collaborating with professors. Although William James was able to master the entire field of psychology in just a few years because the body of knowledge was so small, graduate students today typically concentrate their training in a specific area of psychology (e.g., social, cognitive, developmental, or clinical). 
PhD students spend an average of 6 years in graduate school before attaining their degrees (National Science Foundation, 2018), and afterward many go on for even more training in a laboratory or a hospital. Some who attain an advanced degree in psychology become professors at a college or university, where they usually do some combination of teaching and scientific research. You are probably taking a class from one of those people right now, and may we take just a moment to say that they have excellent taste in textbooks?

But many people who receive an advanced degree in psychology instead take jobs that involve assessing and treating people with psychological problems. These psychologists are informally referred to as therapists, and they typically have private practices, just like your doctor and dentist might. A practice often includes a variety of mental health professionals, such as psychiatrists (who have earned an MD, or medical doctor degree) and counselors (who have usually earned a master's degree or a PhD). There are many other advanced degrees that allow people to call themselves psychologists and provide therapy, such as the PsyD (doctor of psychology) and the MEd (master of education).

Why does a practice need so many different kinds of people? First, different degrees come with different privileges: People with an MD can prescribe medications, but in most states, people with a master's degree or a PhD cannot. Second, most therapists specialize in treating specific problems such as depression, anxiety, eating disorders, and so on, and they may even specialize in treating specific populations, such as children, older adults, or particular ethnic groups (see FIGURE 1.1). It takes a village to look after people's mental health.

[Figure 1.1 The Subfields of Psychology: Percentage of PhDs awarded by American universities in various subfields of psychology in 2019.]

Psychologists are also employed in a wide variety of other settings. 
For example, school psychologists offer guidance to students, parents, and teachers; industrial/organizational psychologists help businesses and organizations hire employees and maximize employee performance; sports psychologists help athletes train and compete; forensic psychologists assist attorneys and courts in dealing with crime; consumer psychologists help companies develop and market new products; and the list goes on. Indeed, it is difficult to think of any major U.S. institution or enterprise that doesn't employ psychologists in some capacity — and that includes the U.S. Post Office and the N.F.L. The fact is that modern psychologists are a diverse set of people who teach, do research, help those in distress, and aid the missions of public and private institutions. If you like what you learn in the upcoming chapters, you might even decide to become one of them.

[Photo caption: A person earning an advanced degree in psychology can go on to a wide range of fields, as these three individuals did (from left to right): Lynne Madden is a clinical psychologist who works with both individuals and groups. Gloria Balague applies her training as a clinical psychologist to her work with athletes. Lynne Owens Mock directs a community mental health center in Chicago.]

Chapter 1 Review

Psychology's Philosophical Roots

Psychology is the scientific study of mind and behavior, and it has deep philosophical roots. Philosophical dualism is the view that mind and body are fundamentally different things; philosophical materialism is the view that all mental phenomena are reducible to physical phenomena. Most modern psychologists are philosophical materialists. Philosophical realism is the view that perceptions of the physical world are produced entirely by information from the sensory organs; philosophical idealism is the view that perceptions of the physical world are the brain's interpretation of information from the sensory organs. 
Most modern psychologists are philosophical idealists. Philosophical empiricism is the view that all knowledge is acquired through experience; philosophical nativism is the view that some knowledge is innate rather than acquired. Most modern psychologists are philosophical nativists.

The Late 1800s: Toward a Science of the Mind

Structuralism was an approach to psychology that attempted to isolate and analyze the mind's basic elements. Functionalism was an approach to psychology that was influenced by Darwin's theory of natural selection, emphasizing the adaptive significance of mental processes.

The Early 1900s: Psychoanalysis and Behaviorism

Sigmund Freud developed psychoanalytic theory, which emphasized the influence of the unconscious on feelings, thoughts, and behaviors. He devised a therapy called psychoanalysis to help people gain insight into the contents of their unconscious minds. Freud had little impact on experimental psychology, but a tremendous impact on the treatment of psychological disorders and on the intellectual climate of the Western world. John Watson developed behaviorism, which was an approach to psychology that restricted scientific inquiry to observable behavior. Behaviorism soon came to dominate experimental psychology in America. B. F. Skinner took a behaviorist approach to understanding how organisms learn to operate on their environments, and he developed the principle of reinforcement. He believed this principle could explain complex human behavior, including how people learn language.

The Early 1900s: Resistance to Behaviorism

In the first half of the 20th century, behaviorism reigned in America, but some psychologists resisted it and continued to do research on "mentalistic" phenomena such as perception, memory, and judgment. At the same time, social psychologists continued to do research on mentalistic phenomena such as beliefs, attitudes, stereotypes, identity, and intention. 
The Late 1900s: The Cognitive Revolution

In the 1950s and 1960s, Noam Chomsky's critique of Skinner's theory of language, as well as the advent of the digital computer, helped spark the "cognitive revolution." The emergence of cognitive psychology allowed psychologists to use the language of information processing to study mentalistic phenomena. In the 1970s and 1980s, psychologists began to incorporate theories from evolutionary biology into their work, which led to the emergence of evolutionary psychology.

The Early 2000s: New Frontiers

Cognitive neuroscientists study the relationship between psychological processes and neural activity. Behavioral neuroscientists study the relationship between behavior and neural activity. Cultural psychologists study the ways in which culture influences mental life.

Becoming a Psychologist

Psychology is a diverse science. In the United States, women earn more than half of all PhDs in psychology, and about a third of all PhDs in psychology are earned by people of color. To become a psychologist, one must attain an advanced degree. Many psychologists become therapists or clinicians, but some become professors; and many are employed in government, industry, and a variety of other enterprises and institutions.

Changing Minds

1. While scrolling through your newsfeed, you come across a story describing some research that shows that when people feel love, particular regions of their brains "light up." You scroll down and see that someone has commented: "This is crazy. Love is more than brain chemistry. It's about the heart!" Given what you know about materialism, dualism, and fMRI, how would you respond?

2. Sigmund Freud is often described in the media as "the father of modern psychology." How accurate is that title? Who else might deserve it more?

3. One of your friends hears that you're taking a psychology class. "Psychology is just so stupid," your friend says. 
"I mean, how can it claim to be a science when it studies things no one can see, like thoughts and memories and emotions? Why don't they just stick to studying what they can actually measure?" Given what you know about the history of psychology, what would you tell your friend?

4. High school students are touring your campus today — and you get to meet them. Yippie! One of them tells you they are considering a major in psychology. "I figure I can get my degree in four years and then start doing therapy." Given what you've learned about careers in psychology, what should you tell them?

5. Another student touring the campus says, "I'd like to study psychology, but I understand it is a field in decline." Use the DATA to convince the student that they are wrong.

Key Terms

psychology
philosophical dualism
philosophical materialism
philosophical realism
philosophical idealism
philosophical empiricism
philosophical nativism
reaction time
structuralism
introspection
functionalism
natural selection
unconscious
psychoanalytic theory
psychoanalysis
behaviorism
principle of reinforcement
Gestalt psychology
developmental psychology
social psychology
cognitive psychology
evolutionary psychology
cognitive neuroscience
behavioral neuroscience
cultural psychology