
Learning and Behaviour-Chapter 6


Department: Psychology
Course: PS261
Professor: Anneke Olthof
Semester: Fall

Schedules of Reinforcement and Choice Behaviour

-schedule of reinforcement: a program or rule that determines which occurrence of a response is followed by a reinforcer
-a perfect contingency between response and reinforcement is rare in the real world; studying hard for every test does not guarantee that you will do well on it
-reinforcement schedules that involve similar relations between the response and reinforcers usually produce similar patterns of behaviour, so behaviour becomes highly predictable
-schedules of reinforcement influence both how an instrumental response is learned and how it is then maintained by reinforcement
-the focus is on schedule factors that control the timing and repetitive performance of instrumental behaviour

Simple Schedules of Intermittent Reinforcement

Ratio Schedules
-ratio schedule: reinforcement depends only on the number of responses the organism has performed—ex. drug addicts visiting a clinic several times a week and being reinforced with money if they are clean
-continuous reinforcement (CRF): a schedule of reinforcement in which every occurrence of the instrumental response produces the reinforcer; not common outside the laboratory, as the world is not perfect
-partial/intermittent reinforcement: only some occurrences of the instrumental response are reinforced; the response is reinforced occasionally—ex. biting into a strawberry expecting it to be good, but getting a rotten one

Fixed-Ratio Schedule (FR)
-fixed-ratio schedule: reinforcement schedule in which a fixed number of responses must occur in order for the next response to be reinforced—ex.
reinforcement after every 10th lever press, or making a phone call, where you have to press a fixed number of digits before connecting
-steady and high rate of responding once the behaviour gets under way, but there may be a pause before the start of the required responses (such as before dialing the numbers to make a call)
-cumulative record: a special way of representing how a response is repeated over time; the total number of responses that have occurred up to a particular point in time
-post-reinforcement pause: the pause in responding that typically occurs after delivery of the reinforcer on fixed-ratio and fixed-interval schedules of reinforcement
-ratio run: the high and invariant rate of responding observed after the post-reinforcement pause on fixed-ratio schedules; the run ends when the necessary number of responses has been performed and the participant is reinforced
-higher ratio requirements = longer post-reinforcement pauses
-ratio strain: disruption of responding that occurs when a fixed-ratio response requirement is increased too rapidly; raising it too quickly can stop responding altogether

Variable-Ratio Schedule (VR)
-variable-ratio schedule: reinforcement schedule in which the number of responses necessary to produce reinforcement varies from trial to trial; the value of the schedule refers to the average number of responses needed for reinforcement—ex.
gamblers playing a slot machine, never knowing when they will be paid out
-these schedules are found in daily life whenever an unpredictable amount of effort is required to obtain a reinforcer
-because the number of responses required for reinforcement is not predictable, predictable pauses in the rate of responding are less likely with variable-ratio schedules
-subjects respond at a fairly steady rate
-post-reinforcement pauses can occur, but the pauses are longer and more prominent with fixed-ratio schedules

Interval Schedules
-interval schedule: reinforcement schedule in which a response is reinforced only if it occurs after a set amount of time following the last reinforcer or the start of the trial

Fixed-Interval Schedule (FI)
-fixed-interval schedule: reinforcement schedule in which the reinforcer is delivered for the first response that occurs after a fixed amount of time following the last reinforcer or the beginning of the trial—ex. a washing machine is on this schedule, as a fixed amount of time is required to complete the wash cycle (if you open the machine before that time has passed, you won't be reinforced with clean clothes)
-fixed-interval scallop: the gradually increasing rate of responding that occurs between successive reinforcements on a fixed-interval schedule; as the time of reinforcement draws nearer, there is more and more responding
-performance on these schedules reflects the subject's accuracy in telling time; if subjects were not good at telling time, they would be equally likely to respond throughout the schedule
-having a watch or a clock makes it much easier to judge time intervals, and there are even clock stimuli in experiments with pigeons
-introduction of the clock stimuli increased the duration of the post-reinforcement pause (a longer pause) and caused responding to shift closer to the end of the FI cycle (becoming more accurate)
-an FI schedule does not guarantee that the reinforcer will be provided every time; in order to receive the
reinforcer, the subject still has to make the instrumental response
-ex. midterms and students' study habits (the rate of studying increases before a midterm and decreases right after it)

Variable-Interval Schedule (VI)
-variable-interval schedule: reinforcement schedule in which reinforcement is provided for the first response that occurs after a variable amount of time from the last reinforcer, or the start of the trial—ex. a mechanic not being able to tell how long it will take to fix your car
-found in situations where an unpredictable amount of time is required to prepare or set up the reinforcer
-the subject has to perform the instrumental response in order to obtain the reinforcer; it is given if the individual responds after the variable interval has timed out
-VI schedules maintain steady and stable rates of responding without regular pauses
**limited hold: a restriction on how long a reinforcer remains available; for a response to be reinforced, it must occur before the end of the limited-hold period; occurs with interval schedules only in the real world—ex.
if you want breakfast at McDonald's, you can go (the behavioural response), but you have to be there at a certain time in order to get the reinforcer

Comparison of Ratio and Interval Schedules
-with both fixed-ratio and fixed-interval schedules, we see a post-reinforcement pause after each delivery of the reinforcer, and each produces a high rate of responding just before delivery of the next reinforcer
-variable-ratio and variable-interval schedules both maintain steady rates of responding without predictable pauses
**these two similarities do not mean that interval and ratio schedules motivate behaviour in the same way
-experiment on how they motivate behaviour differently: recorded two pigeons, one reinforced on a variable-ratio schedule and the other on a variable-interval schedule; both were trained to peck the response key for food reinforcement, and the frequency of reinforcement was virtually identical for the two animals
-even though they both received the same frequency and distribution of reinforcers, they behaved differently: the pigeon reinforced on the VR schedule responded at a much higher rate than the pigeon reinforced on the VI schedule
-the finding is that the VR schedule motivates much more vigorous instrumental behaviour
-the higher response rates on variable-ratio compared to variable-interval schedules powerfully illustrate how schedules can alter the motivation for instrumental behaviour; this suggests we could get employees to work harder if wages were on a ratio rather than an interval schedule

Why Might Ratio Schedules Produce Higher Rates of Responding than Interval Schedules?

Reinforcement of IRTs
-inter-response time (IRT): the interval between one response and the next; IRTs can be differentially reinforced in the same fashion as other aspects of behaviour, such as response force or variability
-focuses on the spacing or interval between one response and the next
-a subject that has mostly short inter-response times is responding at a HIGHER rate; long
inter-response times are associated with a LOWER rate of responding
-with a ratio schedule there are no time constraints, and the faster the ratio is completed, the faster the subject receives reinforcement; ratio schedules therefore usually reinforce short inter-response times
-interval schedules favour waiting longer between responses, reinforcing long inter-response times

Feedback Functions
-feedback function: the relationship between response rates and reinforcement rates, calculated over an entire experimental session or an extended period of time; observed on ratio schedules
-response rate is directly related to reinforcement rate: the higher the response rate, the more reinforcers the subject will earn per hour and the higher its reinforcement rate will be
-interval schedules place an upper limit on the number of reinforcers a subject can earn; for example, with a 2-minute interval the subject can earn at most 30 reinforcers in an hour
-doctors, lawyers, and hairdressers work on a ratio schedule: the more clients they see, the more money they make; cashiers or mail carriers, however, cannot increase their wages by increasing their efforts, meaning they are on an interval schedule

Choice Behaviour: Concurrent Schedules
-experiments in which only one response is measured ignore some of the richness and complexity of behaviour; organisms engage in a variety of activities and are continually choosing among possible alternatives
-understanding the mechanisms of choice is fundamental to understanding behaviour, since much of what we do is the result of choosing one activity over another
-numerous studies of choice have been conducted in Skinner boxes equipped with two pecking keys a pigeon could peck; in a typical experiment, responding on each key is reinforced on some schedule of reinforcement (two schedules are in effect at the same time, and the subject is free to switch from one response key to the other); this is known as a concurrent
schedule: allows continuous measurement of choice, because the organism is free to change back and forth between the response alternatives at any time—ex. playing slot machines at a casino, where you can change machines (each on a different schedule) at any time
-study: a pigeon has the opportunity to peck the left key (VI 60-second schedule) or the right key (FR 10 schedule) at any time; the point of the experiment is to see how the pigeon distributes its pecks on the two keys and how the schedule of reinforcement on each key influences its choices

Measures of Choice Behaviour
-choice in concurrent schedule
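The four simple schedules above are, as the opening definition says, rules that decide which response is followed by a reinforcer. They can be sketched as small decision procedures; this is a minimal illustrative sketch (class names and parameters are my own, not from the text):

```python
import random

class FixedRatio:
    """FR n: reinforce every n-th response (e.g., FR 10)."""
    def __init__(self, n):
        self.n = n
        self.count = 0

    def respond(self):
        self.count += 1
        if self.count >= self.n:          # required number of responses reached
            self.count = 0                # ratio run ends; count restarts
            return True                   # reinforcer delivered
        return False

class VariableRatio:
    """VR n: reinforce after a variable number of responses averaging n."""
    def __init__(self, n):
        self.n = n
        self.required = random.randint(1, 2 * n - 1)   # mean is roughly n
        self.count = 0

    def respond(self):
        self.count += 1
        if self.count >= self.required:
            self.count = 0
            self.required = random.randint(1, 2 * self.n - 1)  # unpredictable next requirement
            return True
        return False

class FixedInterval:
    """FI t: reinforce the first response at least t seconds after the last reinforcer."""
    def __init__(self, t):
        self.t = t
        self.last = 0.0                   # time of the last reinforcer

    def respond(self, now):
        if now - self.last >= self.t:
            self.last = now
            return True
        return False                      # responses before the interval elapses go unreinforced

class VariableInterval:
    """VI t: reinforce the first response after a variable wait averaging t seconds."""
    def __init__(self, t):
        self.t = t
        self.wait = random.uniform(0, 2 * t)   # mean wait is roughly t
        self.last = 0.0

    def respond(self, now):
        if now - self.last >= self.wait:
            self.last = now
            self.wait = random.uniform(0, 2 * self.t)
            return True
        return False

# FR 10: nine unreinforced presses, then the 10th is reinforced
fr = FixedRatio(10)
print([fr.respond() for _ in range(10)])   # nine False values, then True
```

Note how the sketch mirrors the feedback-function point: the ratio classes reinforce purely on response counts, so responding faster earns reinforcers faster, while the interval classes gate reinforcement on elapsed time, capping reinforcers per hour no matter how fast the subject responds.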