Understanding the Premack Principle

Questions and Answers

What is a continuous reinforcement schedule (CRF)?

  • A schedule in which only some responses are reinforced
  • A schedule in which each specified response is reinforced (correct)
  • A schedule in which reinforcement is contingent upon the time that has elapsed
  • A schedule in which reinforcement is contingent upon a fixed, predictable number of responses

On a Fixed Ratio (FR 5) schedule, what is reinforcement contingent upon?

  • Reinforcement is contingent upon the time that has elapsed
  • Reinforcement is contingent upon the average number of responses
  • Reinforcement is contingent upon a fixed, predictable number of responses (correct)
  • Reinforcement is contingent upon the time intervals between responses

In which type of intermittent (partial) schedule does the reinforcement occur after an average number of responses?

  • Variable Interval (VI)
  • Fixed Ratio (FR)
  • Variable Ratio (VR) (correct)
  • Fixed Interval (FI)

What type of schedule characterizes the behavior of starting your car in very cold weather?

Variable Interval (VI)

    According to the Premack principle, reinforcers can often be viewed as _____ rather than stimuli.

    behaviors

    The Premack principle states that a _____ ____ behavior can be used as a reinforcer for a _____ ____ behavior.

    high probability; low probability

    If "chew bubble gum → play video games" is a diagram of a reinforcement procedure based on the Premack principle, then chewing bubble gum must be a (lower/higher) _______ probability behavior than playing video games.

    lower

    A behavior can be used as a reinforcer if access to the behavior is restricted so that its frequency falls below its baseline rate (preferred level) of occurrence. One does not need to know the relative probabilities of the two behaviors beforehand; the frequency of one behavior relative to its baseline is the important aspect.

    True

    If a child normally watches 4 hours of television per night, we can make television watching a reinforcer if we restrict free access to the television to (more/less)_______ than 4 hours per night.

    less

    Gina often goes for a walk through the woods, and even more often she does yardwork. According to the ______, walking through the woods could still be used as a reinforcer for yardwork given that one restricts the frequency of walking to _______ its ________ level.

    response deprivation approach; below; preferred

    Kaily typically watches television for 4 hours per day and reads comic books for 1 hour per day. You then set up a contingency whereby Kaily must watch 4.5 hours of television each day in order to have access to her comic books. According to the Premack principle, this will likely be an (effective/ineffective) __________ contingency.

    ineffective
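
The items above all turn on comparing a behavior's scheduled access with its baseline (free-choice) level. The short Python sketch below is not part of the original quiz; the helper names (`premack_effective`, `restriction_makes_reinforcer`) are invented for illustration, and the numbers reuse the Kaily and television examples above.

```python
def premack_effective(reinforcer_baseline: float, target_baseline: float) -> bool:
    """Premack principle: the behavior used as the reinforcer should be the
    higher-probability (more frequent) behavior of the two."""
    return reinforcer_baseline > target_baseline


def restriction_makes_reinforcer(allowed: float, baseline: float) -> bool:
    """Baseline-restriction idea: a behavior can serve as a reinforcer when
    access to it is held below its free-choice (baseline) level."""
    return allowed < baseline


# Kaily: comic-book reading (1 h/day baseline) is offered as the reinforcer
# for watching 4.5 h of television (TV baseline is 4 h/day).
print(premack_effective(reinforcer_baseline=1.0, target_baseline=4.0))   # False -> likely ineffective

# Child: television is normally watched 4 h/night; restricting access to
# less than 4 h makes television watching usable as a reinforcer.
print(restriction_makes_reinforcer(allowed=3.0, baseline=4.0))           # True
```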

    Drinking a soda because you love its sweetness is an example of _______ motivation.

    intrinsic

    According to Hull's Drive Reduction Theory, reinforcement occurs when a stimulus is associated with...

    A reduction in physiological drive

    Which scenario exemplifies Sheffield's Drive Induction Theory?

    Customers repeatedly visiting a barber shop due to the induction of sexual arousal through reading materials

    What does the Premack Principle suggest about reinforcing behaviors?

    High-probability behavior can reinforce a low-probability behavior

    How does Sheffield suggest creating strong reinforcement?

    Combining drive induction with drive reduction

    What did research supporting Hull's theory demonstrate about hungry rats in a T-maze?

    Performing better when given several small pellets as reinforcement

    What does Sheffield's Drive Induction Theory argue regarding reinforcement?

    Reinforcement can occur through the induction of a drive, rather than just the reduction of it.

    According to Hull's Drive Reduction Theory, what strengthens preceding behavior?

    Reinforcement associated with a reduction in physiological drive

    In applied settings, how can the Premack Principle be used to strengthen less probable behaviors?

    Using high-probability behaviors to reinforce low-probability behaviors

    What does Hull's theory fail to explain, leading to the concept of incentive motivation?

    Reinforcers not associated with drive reduction

    What does David Premack's principle view as reinforcers?

    Behaviors rather than stimuli

    What is the general strategy of commercials according to Sheffield's Drive Induction Theory?

    Induce a need state and then reduce it

    What type of schedule provides reinforcement after a varying number of responses, leading to a steady rate of response with no postreinforcement pause?

    Variable Ratio (VR)

    Which schedule reinforces the first response after a fixed period of time, producing a scalloped pattern of responding with a postreinforcement pause?

    Fixed Interval (FI)

    What does the schedule notation FR 5 mean?

    Reinforcement is delivered after every fifth response.

    Which schedule results in a high rate of response with a short postreinforcement pause, which increases with higher ratio requirements?

    Fixed Ratio (FR)

    What type of schedule reinforces the first response after a varying period of time, resulting in a moderate, steady rate of response with little or no postreinforcement pause?

    Variable Interval (VI)

    Which statement about dense and lean reinforcement schedules is true?

    Lean schedules are more difficult to obtain reinforcement from.

    How do different schedules of reinforcement affect behavior over time?

    They can affect the development of certain behaviors.

    What does the schedule notation FI 30-sec mean?

    The first response after 30 seconds have elapsed is reinforced.

    How do time-contingent schedules differ from fixed schedules in terms of rapidity of responses?

    Responding rapidly has little effect on time-contingent schedules, whereas fixed schedules produce postreinforcement pauses because the next reinforcer is relatively distant.

    Study Notes

    Schedules of Reinforcement in Operant Conditioning

    • Fixed Ratio (FR) schedules require a specific number of responses for reinforcement, such as 50 lever presses.
    • An FR schedule results in a high rate of response with a short postreinforcement pause, which increases with higher ratio requirements.
    • Variable Ratio (VR) schedules provide reinforcement after a varying number of responses, leading to a steady rate of response with no postreinforcement pause.
    • Fixed Interval (FI) schedules reinforce the first response after a fixed period of time, producing a scalloped pattern of responding with a postreinforcement pause.
    • Variable Interval (VI) schedules reinforce the first response after a varying period of time, resulting in a moderate, steady rate of response with little or no postreinforcement pause.
    • Reinforcement schedules can be dense or lean: on dense schedules reinforcers are easy to obtain, while on lean schedules they are more difficult to obtain.
    • Different schedules of reinforcement can shape behavior over time; for example, an abusive relationship may develop when reinforcement that was initially dense becomes increasingly intermittent.
    • Responding rapidly has little effect on time-contingent (interval) schedules, while fixed schedules produce postreinforcement pauses because the next reinforcer is relatively distant.
    • Examples of schedules include FR 200, VR 5, FI 30-sec, and VI 30-sec, each with specific characteristics and patterns of response.
    • Examples of real-life scenarios involving reinforcement schedules include a mother's repeated requests to her child and waiting for a bus at specific intervals.
    • The behavior of individuals in response to reinforcement schedules can vary, with different patterns of response over time.
    • Understanding the different schedules of reinforcement is essential in operant conditioning and can help in analyzing and modifying behavior in various contexts.
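
As a rough illustration of how the four basic schedules listed above differ, here is a minimal Python simulation sketch. It is not from the source material; the class names (`RatioSchedule`, `IntervalSchedule`) and parameters are invented for this example. It reinforces every fifth response on FR 5 and the first response after each 30-second interval on FI 30-sec, with the variable variants drawing their requirements around the same mean.

```python
import random


class RatioSchedule:
    """FR/VR: reinforcement after a fixed or variable number of responses."""

    def __init__(self, requirement: int, variable: bool = False):
        self.requirement = requirement
        self.variable = variable
        self._target = self._next_target()
        self._count = 0

    def _next_target(self) -> int:
        # VR: requirement varies around the mean; FR: always the same number
        return random.randint(1, 2 * self.requirement - 1) if self.variable else self.requirement

    def respond(self) -> bool:
        """Record one response; return True if it earns reinforcement."""
        self._count += 1
        if self._count >= self._target:
            self._count = 0
            self._target = self._next_target()
            return True
        return False


class IntervalSchedule:
    """FI/VI: the first response after a fixed or variable time is reinforced."""

    def __init__(self, interval_sec: float, variable: bool = False):
        self.interval_sec = interval_sec
        self.variable = variable
        self._deadline = self._next_deadline(0.0)

    def _next_deadline(self, now: float) -> float:
        wait = random.uniform(0, 2 * self.interval_sec) if self.variable else self.interval_sec
        return now + wait

    def respond(self, now: float) -> bool:
        """Record a response at time `now`; return True if it earns reinforcement."""
        if now >= self._deadline:
            self._deadline = self._next_deadline(now)
            return True
        return False


# FR 5: every fifth response is reinforced.
fr5 = RatioSchedule(requirement=5)
print([fr5.respond() for _ in range(10)])                 # True on the 5th and 10th responses

# FI 30-sec: the first response after each 30-second interval is reinforced.
fi30 = IntervalSchedule(interval_sec=30)
print([fi30.respond(t) for t in (10, 20, 35, 40, 65)])    # True at t=35 and t=65
```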

    Related Documents

    Schedules of Reinforcement PDF

    Description

    Test your knowledge about the Premack principle and its application in behavior modification. Explore the challenges in measuring behavior probabilities and learn about the potential errors in determining free-choice preference rates.
