CHAPTER 6 – DEVELOPING BEHAVIORAL PERSISTENCE THROUGH THE
USE OF INTERMITTENT REINFORCEMENT
• Intermittent reinforcement is when behavior is reinforced occasionally rather than
every time it occurs.
Schedules of Reinforcement
• Schedule of reinforcement: A rule specifying which occurrences of a given
behavior, if any, will be reinforced
• Different schedules of reinforcement have different impacts on the rate of behaviors
and the speed of extinction (produce different behavior patterns)
• Different schedules may be optimal for different kinds of situations
• Continuous reinforcement: The simplest schedule of reinforcement, in which every
occurrence of the behavior is reinforced. Fast learning and fast extinction
• Intermittent reinforcement: Slow learning and slow extinction
• Extinction: The complete opposite of continuous reinforcement; no occurrence of the
behavior is ever reinforced
• Continuous reinforcement and extinction are the two extremes, while intermittent
reinforcement lies in between
• When a behavior is being conditioned or learned, it is in the acquisition phase. The
best form of reinforcement in this phase is continuous reinforcement.
• After the behavior is well learned, it is in the maintenance phase. The best form of
reinforcement in this phase is intermittent reinforcement.
• Advantages of intermittent reinforcement (vs. continuous) for the maintenance of
behavior:
a) The reinforcer remains effective longer because satiation takes place more
slowly
b) Behavior that has been reinforced intermittently tends to take longer to
extinguish
c) Individuals work more consistently on certain intermittent schedules
d) Behavior that has been reinforced intermittently is more likely to persist
after being transferred to reinforcement in the natural environment
• Free-operant procedure: No constraints on the individual’s responses,
“free” to respond repeatedly.
• Discrete-trials procedure: A distinct stimulus is presented prior to an
opportunity to respond; the individual’s responses are limited by the rate at
which the stimuli are presented
o Ex: presenting math problems one at a time
Simple Ratio Schedules
• Ratio schedules are based on the number of responses (target behavior) emitted
• Fixed-ratio schedule: Reinforcement occurs each time a fixed number of
responses are emitted
• Behavioral result:
o High steady rate of responding until reinforcement, followed by a post-
reinforcement pause
o High resistance to extinction
• Ratio strain: Deterioration of responding from increasing a fixed-ratio schedule too
abruptly
• It is best to increase the number of required responses gradually
• Examples of fixed-ratio schedules:
o Do 20 sit-ups before taking a water break
o Do 10 math problems before having a snack
o Working on a production line where you have to complete your task a
certain number of times before getting paid or taking a break
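Because a fixed-ratio rule is just a count, it can be sketched in a few lines of code. This is a minimal Python illustration, not anything from the text; the function name and the FR 10 value (10 math problems per snack, as in the example above) are chosen for this sketch:

```python
def fixed_ratio(ratio, response_count):
    """FR rule: deliver a reinforcer on every `ratio`-th response.

    `response_count` is the running total of responses emitted so far.
    """
    return response_count % ratio == 0

# FR 10: every 10th completed math problem earns a snack.
reinforced = [n for n in range(1, 31) if fixed_ratio(10, n)]
# Only responses 10, 20, and 30 produce reinforcement.
```

Note that nothing except the response count matters here: how fast or slow the responses come has no effect, which is what distinguishes ratio schedules from the interval schedules described later.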
• Variable-ratio schedule: The number of responses required to produce
reinforcement changes unpredictably from one reinforcement to the next.
• Behavioral result:
o High steady rate of responding until reinforcement.
o No or very small post-reinforcement pause.
o High resistance to extinction
• Examples of variable-ratio schedules:
o Slot machines
o Door-to-door sales
• VR vs. FR schedules
o VR schedules can be increased more abruptly than FR schedules without
producing ratio strain
o Values of VR (the mean number of required responses) that can maintain
a behavior are somewhat higher than the values for FR
o VR produces higher resistance to extinction than comparable FR
Simple Interval Schedules
• Fixed-interval schedule: The first response after a fixed amount of time following
the previous reinforcement is reinforced, and a new interval begins
o The size of an FI schedule is the amount of time that must elapse before
reinforcement becomes available
o The response can occur at any time once the time interval has passed
and it will be reinforced
• Behavioral result:
o Rate of responding increases gradually near the end of the time interval
until reinforcement is available
o Post-reinforcement pause: the longer the time interval, the longer the pause
• How to determine whether an FI schedule is in place:
o Does reinforcement require only one response after a fixed period of time?
o Does responding during the time interval affect anything? (NO)
• Example of a fixed-interval schedule:
o Looking at the clock while waiting for your work shift to end at 5:00 pm.
Time needs to pass before reinforcement takes place.
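The two FI diagnostic questions above (one response is enough; responding during the interval changes nothing) translate directly into a rule that depends only on elapsed time. A minimal Python sketch, with illustrative names and an FI 60-second value not taken from the text:

```python
def fixed_interval(interval, last_reinforcement_time, response_time):
    """FI rule: a response is reinforced only if at least `interval` time
    units have elapsed since the previous reinforcement.

    Responses made earlier in the interval have no effect at all.
    """
    return response_time - last_reinforcement_time >= interval

# FI 60 s: responses at 30 s and 45 s go unreinforced;
# the first response at or after 60 s is reinforced, starting a new interval.
```

Note that the function ignores how many responses occurred during the interval, which is exactly the "does responding during the interval affect anything? (NO)" test.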
• Variable-interval schedule: The first response after an unpredictable amount of
time following the previous reinforcement is reinforced