## Behaviorism as a Theory of Personality

As one of the oldest theories of personality, behaviorism dates back to Descartes, who introduced the idea of a stimulus and described the person as a machine dependent on external events, with the soul as the ghost in the machine. Behaviorism takes this idea to another level. Although most theories operate to some degree on the assumption that humans have some sort of free will and are moral, thinking entities, behaviorism refuses to acknowledge the internal workings of persons. In the mind of the behaviorist, persons are nothing more than simple mediators between behavior and the environment (Skinner, 1931, p. 428). This dismissal of the internal workings of human beings is one problem opponents have with behavioral theory. It, along with the theory's inability to explain the human phenomena of language and memory, builds a convincing case against behaviorism as a comprehensive theory. Yet although these criticisms point to its failure as a comprehensive theory, they do not deny that behaviorism and its ideas have much to teach the world about the particular behaviors expressed by humankind.

## The Theory of Behaviorism

### Classical Conditioning

**The Pavlovian experiment.** While studying digestive reflexes in dogs, the Russian scientist Ivan Pavlov made the discovery that led to the real beginnings of behavioral theory. He could reliably predict that dogs would salivate when food was placed in the mouth, through a digestive reflex called the salivary reflex. Yet he soon realized that, over time, the salivary reflex occurred even before the food was offered. Because the sound of the bell and the sight of the attendant carrying the food had repeatedly and reliably preceded the delivery of food to the dog, the dog had transferred the reflex to these events (Schwartz & Lacey, 1982, p. 21). Thus, the dog began salivating simply at the bell's sound and the attendant's presence. Pavlov continued experimenting with the dogs, using a ringing tone to signal food. He found the same result: the dogs began to salivate at the tone alone, without any food present (Schwartz & Lacey, 1982, pp. 20-24).

What Pavlov discovered was first-order conditioning. In this process, a neutral stimulus that causes no natural response in an organism is associated with an unconditioned stimulus, an event that automatically or naturally causes a response. This usually temporal association causes the response to the unconditioned stimulus, the unconditioned response, to transfer to the neutral stimulus. The unconditioned stimulus no longer needs to be present for the response to occur in the presence of the formerly neutral stimulus. Because this response is not natural and has to be learned, it is now a conditioned response, and the neutral stimulus is now a conditioned stimulus. In Pavlov's experiment the sound was the neutral stimulus that was associated with the unconditioned stimulus of food. The unconditioned response of salivation became a conditioned response to the newly conditioned stimulus of the tone (Beecroft, 1966, pp. 8-10).
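To make the sequence concrete, here is a minimal sketch of this pairing process in Python. The class, the 0-to-1 associative-strength scale, the learning rate, and the response threshold are all illustrative assumptions rather than anything in Pavlov's procedure; the only point the sketch encodes is that a stimulus which initially elicits no response comes to elicit it after repeated pairing with the unconditioned stimulus.

```python
# A toy model of first-order conditioning (illustrative assumptions only).

class Organism:
    def __init__(self, unconditioned_stimulus, learning_rate=0.3):
        self.us = unconditioned_stimulus   # e.g. food naturally elicits salivation
        self.learning_rate = learning_rate
        self.associations = {}             # stimulus -> associative strength in [0, 1]

    def elicits_response(self, stimulus, threshold=0.5):
        # The US always elicits the reflex (the UR); any other stimulus elicits
        # it only once its learned association is strong enough (a CS eliciting a CR).
        if stimulus == self.us:
            return True
        return self.associations.get(stimulus, 0.0) >= threshold

    def pair(self, neutral_stimulus):
        # Present the neutral stimulus together with the US; each pairing
        # nudges the association a fraction of the way toward its maximum.
        strength = self.associations.get(neutral_stimulus, 0.0)
        self.associations[neutral_stimulus] = strength + self.learning_rate * (1.0 - strength)


dog = Organism(unconditioned_stimulus="food")
print(dog.elicits_response("tone"))    # False: the tone is still a neutral stimulus
for _ in range(5):
    dog.pair("tone")                   # the tone repeatedly precedes the food
print(dog.elicits_response("tone"))    # True: the tone is now a conditioned stimulus
```

In the terms used above, `"food"` plays the role of the unconditioned stimulus, salivation the unconditioned (and later conditioned) response, and `"tone"` moves from neutral stimulus to conditioned stimulus as the pairings accumulate.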
### Classical conditioning

Classical conditioning was the first type of learning to be discovered and studied within the behaviorist tradition (hence the name classical). The major theorist in the development of classical conditioning is Ivan Pavlov, a Russian scientist trained in biology and medicine (as was his contemporary, Sigmund Freud). Pavlov was studying the digestive system of dogs and became intrigued with his observation that dogs deprived of food began to salivate when one of his assistants walked into the room. He began to investigate this phenomenon and established the laws of classical conditioning. Skinner renamed this type of learning "respondent conditioning," since in this type of learning one is responding to an environmental antecedent.

#### Major concepts

Classical conditioning is Stimulus (S) elicits > Response (R) conditioning, since the antecedent stimulus (singular) causes (elicits) the reflexive or involuntary response to occur. Classical conditioning starts with a reflex: an innate, involuntary behavior elicited or caused by an antecedent environmental event. For example, if air is blown into your eye, you blink. You have no voluntary or conscious control over whether the blink occurs or not. The specific model for classical conditioning is:

1. Unconditioned Stimulus (US) elicits > Unconditioned Response (UR): a stimulus will naturally (without learning) elicit or bring about a reflexive response.
2. Neutral Stimulus (NS) ---> does not elicit the response of interest: this stimulus (sometimes called an orienting stimulus, as it elicits an orienting response) is a neutral stimulus since it does not elicit the unconditioned (or reflexive) response.
3. The Neutral/Orienting Stimulus (NS) is repeatedly paired with the Unconditioned/Natural Stimulus (US).
4. The NS is transformed into a Conditioned Stimulus (CS); that is, when the CS is presented by itself, it elicits or causes the CR (which is the same involuntary response as the UR; the name changes because it is elicited by a different stimulus). This is written CS elicits > CR.

In classical conditioning no new behaviors are learned. Instead, an association is developed (through pairing) between the NS and the US so that the animal or person responds to both events/stimuli (plural) in the same way; restated, after conditioning, both the US and the CS will elicit the same involuntary response (the person or animal learns to respond reflexively to a new stimulus). After conditioning, the previously neutral or orienting stimulus will elicit the response previously elicited only by the unconditioned stimulus. The stimulus is now called a conditioned stimulus because it will now elicit a different response as a result of conditioning or learning. The response is now called a conditioned response because it is elicited by a stimulus as a result of learning. The two responses, unconditioned and conditioned, look the same, but they are elicited by different stimuli and are therefore given different labels.

### Operant conditioning

Burrhus Frederic Skinner was born March 20, 1904, in Pennsylvania, and his entire system is based on operant conditioning. The organism is in the process of "operating" on the environment, which in ordinary terms means it is bouncing around its world, doing what it does. During this "operating," the organism encounters a special kind of stimulus, called a reinforcing stimulus, or simply a reinforcer. This special stimulus has the effect of increasing the operant, that is, the behavior occurring just before the reinforcer. This is operant conditioning: "the behavior is followed by a consequence, and the nature of the consequence modifies the organism's tendency to repeat the behavior in the future." There are four types of operant conditioning: Positive Reinforcement, Negative Reinforcement, Punishment, and Extinction, each of which is described in turn below.
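As a concrete, deliberately toy illustration of this consequence-driven loop, the sketch below encodes each of the four consequence types as a small change to the probability that the behavior is repeated. The probability scale, the step sizes, and the function names are illustrative assumptions, not Skinner's own formalism.

```python
# A toy model of the operant-conditioning loop: a consequence that follows a
# behavior modifies the organism's tendency to repeat that behavior.

def clamp(p):
    return max(0.0, min(1.0, p))

def positive_reinforcement(p, step=0.1):
    # a positive condition is ADDED after the behavior -> behavior strengthened
    return clamp(p + step)

def negative_reinforcement(p, step=0.1):
    # a negative condition is REMOVED or avoided after the behavior -> behavior strengthened
    return clamp(p + step)

def punishment(p, step=0.1):
    # a negative condition is ADDED after the behavior -> behavior weakened
    return clamp(p - step)

def extinction(p, step=0.05):
    # the behavior is no longer followed by any reinforcing consequence -> behavior weakens
    return clamp(p - step)

# e.g. a rat's probability of pressing the bar, starting from 0.5
p = 0.5
p = positive_reinforcement(p)   # bar press followed by food
p = punishment(p)               # bar press followed by shock
print(round(p, 2))              # back to 0.5 in this toy model
```

In this sketch, positive and negative reinforcement have the same numerical effect; they differ only in whether a condition is added or removed, which is exactly the distinction drawn in the descriptions that follow.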
Both Positive and Negative Reinforcement strengthen behavior, while both Punishment and Extinction weaken behavior.

In **Positive Reinforcement**, a particular behavior is strengthened by the consequence of experiencing a positive condition. For example: a hungry rat presses a bar in its cage and receives food. The food is a positive condition for the hungry rat. The rat presses the bar again, and again receives food. The rat's behavior of pressing the bar is strengthened by the consequence of receiving food.

In **Negative Reinforcement**, a particular behavior is strengthened by the consequence of stopping or avoiding a negative condition. For example: a rat is placed in a cage and immediately receives a mild electrical shock on its feet. The shock is a negative condition for the rat. The rat presses a bar and the shock stops. The rat receives another shock, presses the bar again, and again the shock stops. The rat's behavior of pressing the bar is strengthened by the consequence of stopping the shock.

In **Punishment**, a particular behavior is weakened by the consequence of experiencing a negative condition. For example: a rat presses a bar in its cage and receives a mild electrical shock on its feet. The shock is a negative condition for the rat. The rat presses the bar again and again receives a shock. The rat's behavior of pressing the bar is weakened by the consequence of receiving a shock.

In **Extinction**, a particular behavior is weakened by the consequence of not experiencing a positive condition or stopping a negative condition. For example: a rat presses a bar in its cage and nothing happens. Neither a positive nor a negative condition exists for the rat. The rat presses the bar again, and again nothing happens. The rat's behavior of pressing the bar is weakened by the consequence of not experiencing anything positive or stopping anything negative.

- A behavior followed by a reinforcing stimulus results in an increased probability of that behavior occurring in the future.
- A behavior no longer followed by the reinforcing stimulus results in a decreased probability of that behavior occurring in the future.

**Discriminative stimuli:** The effect stimuli have on an operant response is different from that in Pavlovian conditioning, because the stimuli do not cause the response. They simply guide the response towards a positive or negative consequence. These operant response stimuli are called discriminative stimuli because they discriminate between the good and the bad consequences and indicate which response will be the most fruitful. For instance, a red stoplight indicates that one should step on the brakes. Although there is nothing that naturally forces humans to stop at a red light, they do stop. This is because the red light indicates that if they do not, negative consequences will follow (Schwartz & Lacey, 1982, pp. 30-31).

**Avoidance theory:** Although it is not always the case with discriminative stimuli, the red stoplight stimulus and the appropriate stopping response are also an example of the behavior known as avoidance-escape behavior. Put simply, the stimulus indicates that a negative consequence will follow if an action is not carried out, so the action is carried out. One might expect such behavior to extinguish quickly, given that extinction occurs in the sudden absence of any positive reinforcement. However, as shown in the experiments done by Rescorla and Solomon (1967), this is not the case. An animal was placed on one side of a partitioned box and trained to jump over the partition to avoid a shock.
When the shock was removed, the animal retained its conditioned jumping behavior. Apparently, in avoidant behavior the escape from, or absence of, the negative condition occurs because of the response. The animals in the box learned to expect shock if they did not respond and no shock if they did. Thus, extinction did not occur, because they continued to respond in order to (supposedly) eliminate the shock (Schwartz & Lacey, 1982, pp. 87-90).

**Schedules of reinforcement:** Another exception to the extinction rule is an operant conditioned response that has been conditioned by intermittent schedules of reinforcement. There are four types of intermittent schedules:

- fixed interval schedules, which reinforce a response after a certain fixed amount of time;
- variable interval schedules, which reinforce a response after an amount of time that varies from reinforcement to reinforcement;
- fixed ratio schedules, which reinforce a response after a certain fixed number of responses have been made; and
- variable ratio schedules, which reinforce a response after varied numbers of responses have been made.

As strange as it may seem, maintenance of behavior is actually increased on these intermittent schedules compared with continuously reinforced behavior. This is because, with these occasional reinforcement patterns, the absence of reinforcement takes a long time to recognize; and as soon as it might be recognized, another reinforcement occurs, so that the withdrawal of reinforcement takes even longer to detect. Thus, intermittent schedules keep the organism "guessing" as to when the reinforcement will occur, and they maintain the behavior even while no actual reinforcement is taking place (Schwartz & Lacey, 1982, pp. 91-101).
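A minimal sketch of these four schedule types follows. The class names, the per-response `respond()` check, the `tick()` clock for the interval schedules, and the uniform randomness used for the "variable" schedules are illustrative assumptions, chosen only to encode the definitions in the list above.

```python
import random

# Toy implementations of the four intermittent schedules of reinforcement.

class FixedRatio:
    """Reinforce after every n-th response."""
    def __init__(self, n):
        self.n, self.count = n, 0
    def respond(self):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True      # deliver reinforcer
        return False

class VariableRatio:
    """Reinforce after a number of responses that varies around a mean."""
    def __init__(self, mean):
        self.mean = mean
        self.target = random.randint(1, 2 * mean)
        self.count = 0
    def respond(self):
        self.count += 1
        if self.count >= self.target:
            self.count = 0
            self.target = random.randint(1, 2 * self.mean)
            return True
        return False

class FixedInterval:
    """Reinforce the first response after a fixed amount of time has elapsed."""
    def __init__(self, seconds):
        self.seconds, self.elapsed = seconds, 0.0
    def tick(self, dt):
        self.elapsed += dt
    def respond(self):
        if self.elapsed >= self.seconds:
            self.elapsed = 0.0
            return True
        return False

class VariableInterval:
    """Reinforce the first response after an amount of time that varies around a mean."""
    def __init__(self, mean_seconds):
        self.mean = mean_seconds
        self.wait = random.uniform(0, 2 * mean_seconds)
        self.elapsed = 0.0
    def tick(self, dt):
        self.elapsed += dt
    def respond(self):
        if self.elapsed >= self.wait:
            self.elapsed = 0.0
            self.wait = random.uniform(0, 2 * self.mean)
            return True
        return False

# e.g. a fixed-ratio-3 schedule reinforces every third bar press
fr3 = FixedRatio(3)
print([fr3.respond() for _ in range(6)])   # [False, False, True, False, False, True]

# and a fixed-interval schedule reinforces the first response after the interval
fi = FixedInterval(seconds=10)
fi.tick(12)                                # 12 "seconds" pass
print(fi.respond())                        # True
```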
## The Validity of Behaviorism

**Failure to show adequate generalizability in human behavior:** Although many experiments have been done showing evidence of both Pavlovian conditioning and operant conditioning, all of these experiments have been based on animals and their behavior. K. Boulding (1984) questions Skinner's application of principles of animal behavior to the much more complex human behavior. In using animals as substitutes for humans in the exploration of human behavior, Skinner is making the large assumption that general laws relating to the behavior of animals can be applied to describe the complex relations of the human world. If this assumption proves false, then the entire foundation upon which behaviorism rests will come crashing down. More experiments with human participants must be done to establish the validity of the theory (Boulding, 1984, pp. 483-484).

**Inability to explain the development of human language:** Although Skinner's ideas on operant conditioning are able to explain phobias and neuroses, they are sadly lacking in applicability to the more complex human behaviors of language and memory. The theory's inability to explain the language phenomenon has in fact led a large number of critics to dismiss the theory. Although Skinner has responded to the criticism, his arguments remain weak and relatively unproven. Whereas public, objective stimuli act as operational stimuli for verbal responses, private stimuli or concepts such as "I'm hungry" are harder to explain. According to Skinner, the acquisition of verbal responses for private stimuli can be explained in four ways. First, he claims that private stimuli and the community do not need a connection; as long as there is some sort of public stimulus that can be associated with the private stimulus, a child can learn. Second, the public can deduce the private stimuli through nonverbal signs, such as groaning and facial expressions; however, this association of public and private events can often be misinterpreted. His third theory, that certain public and private stimuli are identical, yields only a very short list of identical stimuli, and his final theory, that private stimuli can be generalized to public stimuli with coinciding characteristics, gives very inaccurate results (Skinner, 1984a, pp. 511-517).

M. E. P. Seligman offers an interesting alternative to Skinner's weak explanation of language. He explains that although operant and classical conditioning are important, there is a third principle involved in determining the behavior of an organism: the genetic preparedness of an organism to associate certain stimuli or reinforcers with responses. An organism brings with it to an experiment certain equipment and tendencies decided by genetics, which cause certain conditioned stimuli and unconditioned stimuli to be more or less associable. The organism is therefore more or less prepared by evolution to relate the two stimuli. Seligman classifies these tendencies towards association into three categories: prepared (easily able to associate two stimuli), unprepared (somewhat difficult to associate two stimuli), and contraprepared (unable to associate two stimuli). The problem with behaviorists, he argues, is that they have mainly concentrated their experiments on unprepared sets of stimuli, such as lights and shock. They provide the small amount of input needed for the unprepared association to take place and then create laws that generalize unprepared behavior to all types of behavior. Thus, although the behaviorist laws may hold true for the unprepared sets of stimuli tested in labs, they have trouble explaining behaviors that are prepared (Seligman, 1970, pp. 406-408).

To support his theory, Seligman gives the example of an experiment conducted by Rozin and Garcia (1971) in which rats were fed saccharine-tasting water while bright light flashed and noise sounded. At the same time, the rats were treated with X-ray radiation to cause nausea and illness. When the rats became ill a few hours later, they acquired an aversion to saccharine-tasting water but not to light or noise. According to Seligman (1970), evolution had prepared the rats to associate taste with illness, but had contraprepared the association between noise or light and illness (pp. 411-412).

When Seligman's theory of preparedness is applied to the language problem, it gives a plausible solution. Language is simply composed of well-prepared stimuli that easily create relationships between verbal words and ideas or objects. In fact, these associations form so easily that often extremely little input is needed for them to be made. But if this theory is taken as true, which it cannot be without further research, then it implies that there is a genetic factor that, along with the environment, creates personality. This contradicts the comprehensive behaviorist theory espoused by Skinner and his collaborators (Seligman, 1970, pp. 416-417).

## References

* Beecroft, R. S. (1966). Method in classical conditioning. In R. S. Beecroft, _Classical conditioning_ (pp. 8-26). Goleta, CA: Psychonomic Press.
* Boulding, K. E. (1984). B. F. Skinner: A dissident view. _Behavioral and Brain Sciences_, 7, 483-484.
* Dahlbom, B. (1984). Skinner, selection, and self-control. _Behavioral and Brain Sciences_, 7, 484-486.
* Kamin, L. J. (1969). Predictability, surprise, attention, and conditioning. In B. A. Campbell & R. M. Church (Eds.), _Punishment and aversive behavior_. New York: Appleton-Century-Crofts.
* Mischel, W. (1993). Behavioral conceptions. In W. Mischel, _Introduction to personality_ (pp. 295-316). New York: Harcourt Brace.
* Rescorla, R. A. (1988). Pavlovian conditioning: It's not what you think it is. _American Psychologist_, 43, 151-160.
* Rescorla, R. A., & Solomon, R. L. (1967). Two-process learning theory: Relations between Pavlovian conditioning and instrumental learning. _Psychological Review_, 74, 151-182.
* Rozin, P., & Garcia, J. (1971). Specific hungers and poison avoidance as adaptive specializations of learning. _Psychological Review_, 78, 459-486.
* Schwartz, B., & Lacey, H. (1982). _Behaviorism, science, and human nature_. New York: Norton.
* Seligman, M. E. P. (1970). On the generality of the laws of learning. _Psychological Review_, 77, 406-418.
* Skinner, B. F. (1931). The concept of the reflex in the description of behavior. _Journal of General Psychology_, 5, 427-458.
* Skinner, B. F. (1984a). Operational analysis of psychological terms. _Behavioral and Brain Sciences_, 7, 511-517.
* Skinner, B. F. (1984b). Selection by consequences. _Behavioral and Brain Sciences_, 7, 477-481.
* Wyrwicka, W. (1984). Natural selection and operant behavior. _Behavioral and Brain Sciences_, 7, 501-502.