Summary

This document summarizes course notes on schedules and theories of reinforcement in psychology. It covers fixed ratio, variable ratio, fixed interval, and variable interval schedules, along with duration, response-rate, noncontingent, and complex schedules, and relates these concepts to everyday situations. The notes also explain how different reinforcement schedules affect patterns of behavior.

Full Transcript

Chapter 7: Theories and Schedules of Reinforcement

Schedule of Reinforcement: the response requirement that must be met to obtain reinforcement. A schedule indicates exactly what has to be done for the reinforcer to be delivered.
   ○ Does each lever press by the rat result in a food pellet, or are several lever presses required?

Continuous Reinforcement Schedule: one in which each specified response is reinforced.
   ○ Each time a rat presses a lever, it gets a food pellet.
   ○ When I turn the ignition in my car, the motor starts.

Intermittent (Partial) Reinforcement Schedule: one in which only some responses are reinforced.
   ○ Only some of the rat's lever presses result in a food pellet.
   ○ Not all concerts we attend are enjoyable.
   ○ Characterizes much of everyday life.

Four Types of Intermittent Schedules (a short simulation sketch follows this list)

1. Fixed Ratio (FR) Schedule: Reinforcement is contingent upon a fixed, predictable number of responses.
   ○ FR 5 (Fixed Ratio 5): a rat has to press the lever five times to obtain a food pellet. An FR 1 schedule is equivalent to continuous reinforcement, since each response is reinforced.
   ○ Produces a high rate of response along with a short pause following the attainment of each reinforcer. I might take a short break after reading each chapter or completing an assignment; each break is followed by a quick return to a high rate of response. The Pomodoro technique follows a very similar pattern.
   ○ Dense (rich): low ratio requirement.
   ○ Lean: high ratio requirement.
   ○ Ratio strain ("burnout"): disruption in responding due to an overly demanding response requirement.

2. Variable Ratio (VR) Schedule: Reinforcement is contingent upon a varying, unpredictable number of responses.
   ○ Produces a high and steady rate of response, often with little or no post-reinforcement pause.
   ○ Gambling: the unpredictable nature of these activities results in a very high rate of behavior.
   ○ VR 5 schedule: a rat has to emit an average of 5 lever presses for each food pellet, with the number of presses required on any particular trial varying between 1 and 10. For example, 3 presses for the first pellet, 6 for the second, 4 for the third, and 7 for the fourth, for an average of 5 presses per reinforcer.
   ○ Helps account for the persistence with which some people display certain maladaptive behaviors.

3. Fixed Interval (FI) Schedule: Reinforcement is contingent upon the first response after a fixed, predictable period of time.
   ○ Produces a "scalloped" (upwardly curved) pattern of responding, consisting of a post-reinforcement pause followed by a gradually increasing rate of response as the interval draws to a close.
   ○ Trying to phone a business that opens in 30 minutes: calling will be effective only after the 30 minutes have elapsed.
   ○ The distribution of study sessions throughout a semester, such as studying for an exam.

4. Variable Interval (VI) Schedule: Reinforcement is contingent upon the first response after a varying, unpredictable period of time.
   ○ Produces a moderate, steady rate of response, often with little or no post-reinforcement pause.
   ○ Checking your phone for notifications.
   ○ A rat on a VI 30-sec schedule: the first lever press after a 30-sec interval results in a food pellet, the next after a 15-sec interval, the next after a 45-sec interval, and so on, with the intervals averaging 30 seconds.
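To make these four schedules concrete, here is a minimal Python sketch. It is not part of the original notes: the function names, schedule parameters, and response times are illustrative assumptions. It simply reports which responses would earn a reinforcer under ratio rules and under interval rules.

```python
"""Illustrative simulation of FR, VR, FI, and VI reinforcement schedules.

Assumption-based sketch: the parameter values and press times below were
chosen only to mirror the definitions in the notes above.
"""
import random


def ratio_schedule(total_presses, requirements):
    """Return the press numbers (1-based) on which a reinforcer is delivered.

    `requirements` lists how many presses are needed for each successive
    reinforcer: all equal for a fixed ratio (FR), varying around an average
    for a variable ratio (VR).
    """
    reinforced = []
    presses_since_last, idx = 0, 0
    for press in range(1, total_presses + 1):
        presses_since_last += 1
        if idx < len(requirements) and presses_since_last >= requirements[idx]:
            reinforced.append(press)       # requirement met: deliver the pellet
            presses_since_last, idx = 0, idx + 1
    return reinforced


def interval_schedule(press_times, intervals):
    """Return the press times (seconds) at which a reinforcer is delivered.

    `intervals` lists the waiting times before the next reinforcer becomes
    available; the first press *after* an interval elapses is reinforced
    (FI if all intervals are equal, VI if they vary around an average).
    """
    reinforced, idx = [], 0
    available_at = intervals[0]
    for t in sorted(press_times):
        if idx < len(intervals) and t >= available_at:
            reinforced.append(t)           # first press after the interval
            idx += 1
            if idx < len(intervals):
                available_at = t + intervals[idx]
    return reinforced


if __name__ == "__main__":
    random.seed(0)
    print("FR 5 :", ratio_schedule(30, [5] * 6))
    vr_requirements = [random.randint(1, 9) for _ in range(6)]   # averages about 5
    print("VR 5 :", ratio_schedule(30, vr_requirements))

    presses = list(range(0, 300, 7))       # the rat presses every 7 seconds
    print("FI 30:", interval_schedule(presses, [30] * 5))
    print("VI 30:", interval_schedule(presses, [30, 15, 45, 20, 40]))
```

Running the sketch shows the defining difference: the ratio schedules deliver a pellet after a count of presses, while the interval schedules deliver one for the first press after a waiting period, regardless of how many presses occurred in between.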
Simple Schedules of Reinforcement

1. Duration Schedules: Reinforcement is contingent on performing a behavior continuously throughout a period of time.
   a. Fixed Duration Schedule: The behavior must be performed continuously for a fixed, predictable period of time.
      i. I can watch TV each evening after I complete 2 hours of studying.
   b. Variable Duration Schedule: The behavior must be performed continuously for a varying, unpredictable period of time.
      i. I can reinforce my studying with cookies or other treats at varying, unpredictable points in time, averaging about one cookie for every 30 minutes of studying.

2. Response-Rate Schedules: Reinforcement is directly contingent upon the organism's rate of response.
   a. Differential Reinforcement of High Rates (DRH): Reinforcement is contingent upon emitting at least a certain number of responses in a certain period of time (reinforcement is provided for responding at a fast rate). "Differential reinforcement" means that one type of response is reinforced while another is not.
      i. A worker on an assembly line may be told they can keep their job only if they assemble at least 20 carburetors per hour.
   b. Differential Reinforcement of Low Rates (DRL): A minimum amount of time must pass between responses before the reinforcer will be delivered (reinforcement is provided for responding at a slow rate). Responses that occur during the interval do have an effect: they reset the interval. (A small simulation sketch appears at the end of this section.)
      i. A rat might receive a food pellet only if it waits at least 10 seconds between lever presses.
      ii. Athletic events.
   c. Differential Reinforcement of Paced Responding (DRP): Reinforcement is contingent upon emitting a series of responses at a set rate (reinforcement is provided for responding neither too fast nor too slow).
      i. Playing an instrument or dancing to music: the actions are performed at a specific pace (a good sense of timing or rhythm).

Noncontingent Schedules

Noncontingent Schedule: The reinforcer is delivered independently of any response (a response is not required for the reinforcer to be obtained); also called response-independent schedules.
1. Fixed Time (FT) Schedule: The reinforcer is delivered following a fixed, predictable period of time, regardless of the organism's behavior.
   a. A pigeon receives access to food every 30 seconds regardless of its behavior.
   b. Many people receive Christmas gifts every year regardless of whether they have been "naughty or nice".
      i. The delivery of a "free" reinforcer following a predictable period of time.
2. Variable Time (VT) Schedule: The reinforcer is delivered following a varying, unpredictable period of time, regardless of the organism's behavior.
   a. You may coincidentally run into an old friend about every 3 months.
      i. Might account for some forms of superstitious behavior.

Complex Schedules of Reinforcement

Complex Schedules: Consist of a combination of two or more simple schedules.
1. Conjunctive Schedules: A type of complex schedule in which the requirements of two or more simple schedules must be met before a reinforcer is delivered.
   a. The wage you earn on a job is contingent upon working a certain number of hours each week and doing a sufficient amount of work so that you will not be fired.
2. Adjusting Schedules: The response requirement changes as a function of the organism's performance while responding for the previous reinforcer.
   a. Tara displayed excellent ability in mastering her violin lessons, so she and her parents decided to increase the amount she had to learn each week.
3. Chained Schedules: Consist of a sequence of two or more simple schedules, each of which has its own discriminative stimulus (S^D) and the last of which results in a terminal reinforcer.
   a. People will compile a "to-do" list of assignments or chores and cross off each item as it is completed. (Crossing off a task acts as a secondary reinforcer that helps keep us motivated and gives us a sense of accomplishment.)
      i. Goal Gradient Effect: An increase in the strength and/or efficiency of responding as one draws nearer to the goal.
         1. A student writing an essay is likely to take shorter breaks and work more intensely as they near the end.
         2. Rats running through a maze to obtain food tend to run faster and make fewer wrong turns as they near the goal box.
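The DRL rule mentioned above can be illustrated with a brief Python sketch. This is an assumption-based example, not part of the notes: a response earns a reinforcer only when at least a minimum gap has elapsed since the previous response, and responding too soon restarts the wait.

```python
"""Illustrative DRL sketch (hypothetical example): reinforce a response only
when at least `min_gap` seconds have passed since the previous response."""


def drl_reinforced(response_times, min_gap=10.0):
    """Return the response times that earn a reinforcer under a DRL schedule."""
    reinforced = []
    last_response = None
    for t in sorted(response_times):
        if last_response is not None and (t - last_response) >= min_gap:
            reinforced.append(t)          # waited long enough: reinforce
        # every response, reinforced or not, restarts the waiting period
        last_response = t
    return reinforced


if __name__ == "__main__":
    times = [0, 4, 16, 19, 31, 45]        # the rat's lever presses (seconds)
    print(drl_reinforced(times))          # -> [16, 31, 45]
```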
Theories of Reinforcement

1. Drive Reduction Theory: An event is reinforcing to the extent that it is associated with a reduction in some type of physiological drive.
   a. Food deprivation produces a "hunger drive", which propels the animal to seek out food.
      i. Incentive Motivation: Motivation that is derived from some property of the reinforcer, as opposed to an internal drive state.
         1. Playing a video game for the fun of it.
         2. Going to a concert to enjoy the music.
2. The Premack Principle: States that a high-probability (high-frequency) behavior can be used to reinforce a low-probability (low-frequency) behavior.
   a. An hour of reading comic books (high-probability behavior, or "goal") can be used to reinforce doing chores (low-probability behavior).
   b. "First you work, then you play."
   c. Watching an episode of The Big Bang Theory (high-probability behavior) after each study session (low-probability behavior).
3. Response Deprivation Hypothesis: A behavior can serve as a reinforcer when (1) access to that behavior is restricted and (2) its frequency thereby falls, or is in danger of falling, below its preferred level of occurrence.
   a. Kailey enjoys reading comic books each day. If her parents demand that she complete her chores each day before being allowed to read comic books, her baseline level of free comic book reading drops to zero. Result: deprived of comic book reading, she will now be willing to do chores to restore her comic book reading to its preferred level.
4. Behavioral Bliss Point Approach: An organism with free access to alternative activities will distribute its behavior in such a way as to maximize overall reinforcement.
   a. A rat can freely choose between (1) running in a wheel and (2) exploring a maze. Result: the rat might spend 2 hours running in the wheel and 2 hours exploring the maze, its optimal (bliss point) distribution of activities.
