Textbook Notes PSYB45 Lec 4
Chapter #8 & Chapter #9
Chapter #8: Developing Behavioral Persistence with Schedules of Reinforcement
Some Definitions
- intermittent reinforcement is an arrangement in which a behavior is positively
reinforced only occasionally (i.e., intermittently) rather than every time it occurs
- Response rate refers to the number of instances of a behavior that occur in a given
period of time. Response rate is more commonly used when talking about schedules of
reinforcement, so that is the term we use in this chapter.
- A schedule of reinforcement is a rule specifying which occurrences of a given behavior, if
any, will be reinforced.
- The simplest schedule of reinforcement is continuous reinforcement (CRF), which is an
arrangement in which each instance of a particular response is reinforced.
o Example: each time you turn on the tap, your behavior is reinforced by water
- The opposite of CRF is called operant extinction - on an extinction schedule no instance
of a given behavior is reinforced. The effect is that the behavior eventually decreases to
a very low level or ceases altogether.
- Between these two extremes (CRF and operant extinction) lies intermittent reinforcement; a simple sketch contrasting these three arrangements appears at the end of this section
- While a behavior is being conditioned or learned, it is said to be in the acquisition phase.
- After it has become well learned, it is said to be in the maintenance phase
- It is best to provide CRF during acquisition and then switch to intermittent
reinforcement during maintenance.
- Intermittent schedules of reinforcement have several advantages over CRF for
maintaining behavior:
o (a) The reinforcer remains effective longer because satiation takes place more
slowly;
o (b) behavior that has been reinforced intermittently tends to take longer to
extinguish
o (c) individuals work more consistently on certain intermittent schedules;
o (d) behavior that has been reinforced intermittently is more likely to persist after
being transferred to reinforcers in the natural environment.
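
A minimal Python sketch (not from the textbook) of how these three arrangements differ, assuming a schedule is modeled as a rule that decides, for each occurrence of the behavior, whether a reinforcer is delivered; the function names and the "every 3rd response" rule are made up for illustration:

def continuous_reinforcement(response_count):
    # CRF: every instance of the response is reinforced
    return True

def operant_extinction(response_count):
    # Extinction: no instance of the response is reinforced
    return False

def intermittent_reinforcement(response_count, every_nth=3):
    # One simple intermittent arrangement: only some occurrences are
    # reinforced (here, every 3rd response; the ratio and interval
    # schedules below define "occasionally" more precisely)
    return response_count % every_nth == 0

# Simulate six responses under each rule
for rule in (continuous_reinforcement, operant_extinction, intermittent_reinforcement):
    print(rule.__name__, [rule(n) for n in range(1, 7)])
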
Ratio Schedules
- In a fixed-ratio (FR) schedule, a reinforcer occurs each time a fixed number of responses of a particular type are emitted (a sketch of the FR, VR, and PR rules appears at the end of this section)
- deterioration of responding from increasing an FR schedule too rapidly is sometimes
referred to as ratio strain
- The optimal response requirement differs for different individuals and for different tasks
- the higher the ratio at which an individual is expected to perform, the more important it
is to approach it gradually through exposure to lower ratios
- The optimal ratio value or response requirement that will maintain a high rate of
response without producing ratio strain must be found by trial and error
- When considering the effects of schedules of reinforcement on response rate, we need
to distinguish between free-operant procedures and discrete-trials procedures.
o A free-operant procedure is one in which the individual is free to respond at various rates in the sense that there are no constraints on successive responses
For example: given 12 math problems, a student could work at a rate of 1 problem per minute, 3 problems per minute, or any other rate
o In a discrete-trials procedure, the individual is not free to respond at whatever rate he or she chooses because the environment places limits on the availability of response opportunities
For example: a parent tells a teenager that they can take the family car after they have helped do the dishes following three evening meals
- In a variable-ratio (VR) schedule, a reinforcer occurs after a certain number of instances of a particular response, and the number of responses required for each reinforcer changes unpredictably from one reinforcer to the next (example: slot machines)
o The number of responses required for each reinforcement in a VR schedule
varies around some mean value, and this value is specified in the designation of
that particular VR schedule
- Three differences between the effects of VR and FR schedules are that:
o the VR schedule can be increased somewhat more abruptly than an FR schedule
without producing ratio strain
o the values of VR that can maintain responding are somewhat higher than those of FR
o VR produces a higher resistance to extinction than FR schedules of the same
value
- A type of reinforcement schedule that is becoming increasingly popular in applied
settings is progressive ratio (PR) - A PR schedule is like an FR schedule, but the ratio
requirement increases by a specified amount after each reinforcement
o At the beginning of each session, the ratio requirement starts back at its original
value.
o After a number of sessions, the ratio requirement reaches a level, called the break point or breaking point, at which the individual stops responding
completely.
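
The FR, VR, and PR rules above differ only in how the response requirement for the next reinforcer is chosen. A minimal Python sketch under that assumption (the class names, the counter variable, and the uniform VR distribution are illustrative choices, not the textbook's):

import random

class RatioSchedule:
    # Delivers a reinforcer once `requirement` responses have occurred
    # since the last reinforcer; subclasses choose the next requirement.
    def __init__(self, requirement):
        self.requirement = requirement
        self.responses_since_reinforcer = 0

    def record_response(self):
        self.responses_since_reinforcer += 1
        if self.responses_since_reinforcer >= self.requirement:
            self.responses_since_reinforcer = 0
            self.requirement = self.next_requirement()
            return True   # reinforcer delivered
        return False

class FixedRatio(RatioSchedule):
    def next_requirement(self):
        return self.requirement          # FR: the requirement never changes (FR 5 is always 5)

class VariableRatio(RatioSchedule):
    def __init__(self, mean):
        self.mean = mean
        super().__init__(random.randint(1, 2 * mean - 1))
    def next_requirement(self):
        # VR: the requirement varies unpredictably around `mean` (e.g., VR 5)
        return random.randint(1, 2 * self.mean - 1)

class ProgressiveRatio(RatioSchedule):
    def __init__(self, start, step):
        super().__init__(start)
        self.step = step
    def next_requirement(self):
        return self.requirement + self.step   # PR: requirement grows after each reinforcer

fr5 = FixedRatio(5)
print([fr5.record_response() for _ in range(12)])   # True on the 5th and 10th responses

Note that the sketch only models the delivery rule itself; effects such as ratio strain or the PR break point are about the behavior of the responder, not the rule.
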
Simple Interval Schedules
- In a fixed-interval (FI) schedule, a reinforcer is presented following the first instance of a specific response after a fixed period of time (a sketch of the FI and VI rules appears at the end of this section)
- Most of us rely on clocks to tell us when to do things that are reinforced on an FI
schedule
- In such cases FI schedules produce:
o (a) a rate of responding that increases gradually near the end of the interval until
reinforcement occurs
o (b) a postreinforcement pause
- An FI schedule requires only one response at the end of the interval
- When judging whether a behavior is reinforced on an FI schedule, you should ask
yourself two questions:
o (a) Does reinforcement require only one response after a fixed interval of time?
o (b) Does responding during the interval affect anything?
- Example: A job that pays by the hour is often erroneously cited as an example of an FI
schedule
o A little thought shows that it is incorrect because hourly pay assumes that the
individual works throughout each hour.
o But an FI schedule only requires one response at the end of the interval
- In a variable-interval (VI) schedule, a reinforcer is presented following the first instance of a specific response after an interval of time, and the length of the interval changes unpredictably from one reinforcer to the next. In other words, on a VI schedule, a response is reinforced after unpredictable intervals of time.
- The lengths of the intervals in a VI schedule vary around some mean value, which is
specified in the designation of that particular VI schedule
o For example, if a mean of 25 minutes is required before reinforcement (e.g.,
receiving an e-mail) becomes available, the schedule is abbreviated VI 25
minutes
- VI produces a high resistance to extinction relative to continuous reinforcement
- VI produces no postreinforcement pause
- Simple interval schedules are not often used in behavior modification programs for
several reasons:
o (a) FI produces long postreinforcement pauses
o (b) although VI does not produce long postreinforcement pauses, it does
generate lower response rates than ratio schedules do
o (c) simple interval schedules require continuous monitoring of behavior after the
end of each interval until a response occurs.
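
A minimal Python sketch of the FI and VI rules, assuming the caller supplies the current time with each response and that only the first response after the interval has elapsed is reinforced (the names and the uniform VI distribution are illustrative assumptions):

import random

class IntervalSchedule:
    # A reinforcer becomes available once `interval` time units have passed
    # since the last reinforcer; the first response after that is reinforced.
    def __init__(self, interval):
        self.interval = interval
        self.interval_start = 0.0

    def record_response(self, now):
        if now - self.interval_start >= self.interval:
            self.interval_start = now        # timing restarts at reinforcement
            self.interval = self.next_interval()
            return True                      # reinforcer delivered
        return False                         # responses during the interval change nothing

class FixedInterval(IntervalSchedule):
    def next_interval(self):
        return self.interval                 # FI: the interval never changes

class VariableInterval(IntervalSchedule):
    def __init__(self, mean_interval):
        self.mean_interval = mean_interval
        super().__init__(random.uniform(0, 2 * mean_interval))
    def next_interval(self):
        # VI: interval lengths vary unpredictably around `mean_interval` (e.g., VI 25 minutes)
        return random.uniform(0, 2 * self.mean_interval)

fi = FixedInterval(60.0)                     # FI 60 seconds
print([fi.record_response(t) for t in (10, 30, 59, 61, 70, 125)])
# -> [False, False, False, True, False, True]: one response after each interval is enough

The early, unreinforced responses in the example make question (b) above concrete: responding during the interval has no effect on when the reinforcer becomes available.
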
Schedules with a Limited Hold
- A limited hold is a deadline for meeting the response requirement of a schedule of
reinforcement.
- A limited hold can be added to any of the ratio or interval schedules.
Fixed-Ratio Schedules with a Limited Hold
- Suppose that a fitness instructor says to a person who is exercising, "If you do 30 sit-ups, then you can get a drink of water." That would be an FR 30 schedule. Now suppose that the fitness instructor says to the person, "If you do 30 sit-ups in 2 minutes, then you can get a drink of water." That would be an example of an FR 30 schedule with a limited hold of 2 minutes, as sketched below.