
Learning and Behaviour - Chapter 5


Department
Psychology
Course
PS261
Professor
Anneke Olthof
Semester
Fall

Description
Instrumental Conditioning Foundations
-instrumental conditioning is the type of conditioning involved in training a quarterback to throw a touchdown or a child to skip rope; in this type of conditioning, obtaining a goal or reinforcer depends on the prior occurrence of a designated response
-procedures used in experiments on habituation, sensitization, and classical conditioning do not require the participant to make a particular response in order to obtain food or other USs or CSs
-classical conditioning reflects how organisms adjust to events in their environment that they cannot control directly
-in instrumental behaviour, responding is necessary to produce a desired environmental outcome; the behaviour occurs because similar actions produced the same type of outcome in the past
-ex. a student can earn better grades by studying harder, or start the car by turning the key in the ignition
-instrumental behaviour: an activity that occurs because it is effective in producing a particular consequence or reinforcer

Early Investigations of Instrumental Conditioning
-began with the early work of Thorndike, who was originally studying animal intelligence by placing a hungry cat, dog, or chicken in a puzzle box and measuring how long it took the animal to get out; he found that with continued practice, it took less and less time to escape
-although he titled his treatise Animal Intelligence, he found that many aspects of animal behaviour were rather unintelligent; he did not believe the animals got out of the box because they gained insight into the task, and instead interpreted the learning as reflecting an S-R association
-as the association between the box cues and the successful response became stronger, the animal came to make that response more quickly; the consequence of the successful response strengthened the association between the box stimuli and that response
-law of effect: states that if a response in the presence of a stimulus is followed by a satisfying event, the association between the stimulus (S) and the response (R) is strengthened; if the response is followed by an annoying event, the S-R association is weakened
-according to the law of effect, what is learned is an association between the response and the stimuli present at the time of the response; the satisfying or annoying consequence simply serves to strengthen or weaken the association between the preceding stimulus and response

Modern Approaches to the Study of Instrumental Conditioning
-as more scientists have become involved in instrumental learning, the range of tasks they use has become smaller; a few tasks have become standard and are used repeatedly to facilitate comparison of results obtained in different laboratories

Discrete-Trial Procedures
-discrete-trial procedure: method of instrumental conditioning in which the participant can perform the instrumental response only during specified periods, usually determined either by placement of the participant in an experimental chamber or by the presentation of a stimulus
-each training trial ends with the removal of the animal from the apparatus, and the instrumental response is performed only once during each trial
-often conducted in some type of maze; the T-maze, used in many experiments, consists of a start box and a goal box arranged at the end of each arm of the T; it can be used to study more complex questions because it offers two choices
-behaviour in a maze can be quantified by measuring running speed (time from start to goal) and latency (the time it takes the animal to leave the start box)

Free-Operant Procedures
-free-operant procedure: a method of instrumental conditioning that permits repeated performance of the instrumental response without intervention by the experimenter; the method was invented by B.F. Skinner to study behaviour in a more continuous manner than is possible with mazes, as he was interested in a form of behaviour that could represent naturally
occurring activity
-casual observation suggests that ongoing behaviour is continuous; one activity leads to another, and behaviour does not fall into neat units, so Skinner proposed the concept of the operant as a way of dividing behaviour into meaningful, measurable units
-operant response: a response that is defined by the effect it produces in the environment; ex. pressing a lever or opening a door: any sequence of movements that depresses the lever or opens the door constitutes an instance of that particular operant
-the critical thing is the way the behaviour operates on the environment; various ways of pressing the lever are assumed to be functionally equivalent because they all have the same effect on the environment, namely activation of the recording sensor (the rat could press with its paw or tail, or nudge the lever, and it would all count as the same response)
-we perform many operants in the course of our daily lives, such as opening a door, where it does not matter whether we use our right or left hand
-any response that is required to produce a desired outcome is an instrumental response, since it is "instrumental" in producing that outcome

Magazine Training and Shaping
-successful training of an operant or instrumental response often requires lots of practice and a carefully designed series of training steps that move the student from the status of a novice to that of an expert
-there are likewise preliminary steps for establishing lever-press responding in a lab rat: first, the rat has to learn when food is available in the cup; this involves classical conditioning, as the sound of the food-delivery device (known as the food magazine) is paired with food, and after enough pairings the sound comes to elicit a sign-tracking response in which the animal goes to the food cup; this is known as magazine training
-magazine training: preliminary stage of instrumental conditioning in which a stimulus is repeatedly paired with the reinforcer to enable the participant to learn to get the reinforcer when it is presented
-after magazine training, the rat is ready to learn the required operant response; food is given if the rat does anything remotely related to pressing the lever, such as merely coming close to it; this sequence is known as shaping
-shaping: reinforcement of successive approximations to a desired instrumental response
-at first, only crude approximations of the final performance are required for reinforcement, but as the shaping process progresses, more and more is required until the reinforcer is given only if the final target response is made
-successful shaping of behaviour involves: 1) clearly defining the final response you wish the subject to perform; 2) clearly assessing the starting level of performance, no matter how far it is from the response you are interested in; 3) dividing the progression from the starting point to the final target response into appropriate training steps, or successive approximations
-shaping combines reinforcement of successive approximations with non-reinforcement of earlier response forms

Shaping and New Behaviour
-shaping processes are often used to generate new behaviour
-in teaching the rat to press the bar, we are not teaching new response components, as the rat has performed these actions in its natural habitat at some point; rather, we are teaching it how to combine familiar responses into a new activity
-instrumental conditioning often involves the construction, or synthesis, of a new behavioural unit from pre-existing response components that already occur in the subject's repertoire; however, it can also be used to produce responses unlike anything the subject has ever done before, such as throwing a football 60 yards down the field (extremely hard for an untrained person) or playing a musical instrument; novel responses such as these are also created by shaping
-the creation of new responses by shaping depends on the inherent variability of behaviour
-ex.
a coach asks his player to throw 30 yards; the player might throw 25, 32, 29, or 34 yards; this variability permits the coach to set the next successive approximation at 34 yards, and with this new target the trainee will start to make even longer throws; with gradual iterations of this process, the trainee makes longer and longer throws, achieving distances that would not be reached otherwise
-shaping takes advantage of the variability of behaviour and generates responses that are entirely new in the trainee's repertoire

Response Rate as a Measure of Operant Behaviour
-free-operant methods permit continuous observation, in contrast with discrete-trial techniques
-with a continuous opportunity to respond, the organism determines the frequency of its instrumental response
-this provides a special opportunity to observe changes in the likelihood of behaviour over time
-Skinner proposed that the rate of occurrence of operant behaviour (frequency of the response per minute) be used as a measure of response probability; highly likely responses occur frequently and have a high rate, whereas unlikely responses occur seldom and have a low rate
-response rates have become the primary measure in studies that employ free-operant procedures

Instrumental Conditioning Procedures
-appetitive stimulus: a pleasant or satisfying stimulus that can be used to positively reinforce an instrumental response; ex. mowing the lawn, where the result is getting paid
-an instrumental response may instead turn off or eliminate a stimulus, such as shutting the window to stop incoming rain
-aversive stimulus: an unpleasant or annoying stimulus that can be used to punish an instrumental response

Positive Reinforcement
-positive reinforcement: instrumental conditioning procedure in which there is a positive contingency between the instrumental response and a reinforcing stimulus; if the participant performs the response, it gets the reinforcing stimulus, and if it does not perform the response, it does not get the reinforcing stimulus; ex. a father gives his daughter a cookie when she puts her toys away
-positive reinforcement should produce an increase in the rate of responding
 Positive: response produces an appetitive stimulus
 Reinforcement: an increase in the response rate

Punishment
-punishment: instrumental conditioning procedure in which there is a positive contingency between the instrumental response and an aversive stimulus; if the participant performs the instrumental response, it receives the aversive stimulus, and if it does not perform the instrumental response, it does not receive the aversive stimulus; ex. your teacher gives you a failing grade for answering too many questions wrong on a test
 Positive: response produces an aversive stimulus
 Punishment: a decrease in the response rate

Negative Reinforcement
-negative reinforcement: instrumental conditioning procedure in which there is a negative contingency between the instrumental response and an aversive stimulus; if the instrumental response is performed, the aversive stimulus is terminated or cancelled, but if it is not performed, the aversive stimulus is presented
-ex.
opening an umbrella to stop the rain from getting you wet, or rolling up the car window to reduce the wind blowing in
-people tend to confuse negative reinforcement and punishment because an aversive stimulus is used in both procedures; in punishment procedures the instrumental response produces the aversive stimulus, whereas in negative reinforcement the response terminates the aversive stimulus
 Negative: response eliminates or prevents the occurrence of an aversive stimulus
 Reinforcement: an increase in the response rate

Omission Training
-omission training: an instrumental conditioning procedure in which a positive reinforcer is periodically delivered only if the participant does something other than the target response
-ex. when a child is told to go to his or her room after doing something bad, the child does not receive an aversive stimulus; rather, by sending the child to the room, the parent is withdrawing sources of positive reinforcement such as playing with friends or watching TV
-omission training omits good things; another example would be the suspension of a driver's licence
-omission training is often a preferred method of discouraging human behaviour because, unlike punishment, it does not involve delivering an aversive stimulus
-also known as differential reinforcement of other behaviour (DRO)
 Negative: response eliminates or prevents the occurrence of an appetitive stimulus
 Punishment: a decrease in the response rate
-case study of a woman, Bridget, who banged her head against the wall in order to get attention; to discourage the self-injurious behaviour, an omission training procedure was put into place: during the omission stage, Bridget was ignored when she banged her head hard against the wall but received attention when she was not head banging; self-injurious behaviour decreased significantly during the DRO sessions
-attention is a very powerful reinforcer of human behaviour
-progressive animal trainers reward the behaviour they want and, equally important, ignore the behaviour they don't want

Fundamental Elements of Instrumental Conditioning
-analysis of instrumental conditioning involves numerous factors and variables
-instrumental conditioning fundamentally involves three elements: the instrumental response, the outcome of the response (the reinforcer), and the relation or contingency between the response and the outcome

The Instrumental Response
-some responses are more easily modified than others; this section describes how the nature of the response determines the results of positive reinforcement procedures

Behavioural Variability vs. Stereotypy
-Thorndike described instrumental conditioning as involving the stamping in of an S-R association, and Skinner wrote about responses being reinforced, or strengthened; both emphasized that reinforcement increases the likelihood that the instrumental response will be repeated in the future
-we are accustomed to thinking of the requirement for reinforcement as an observable action, such as the movement of a leg, torso, or hand; however, the criterion for reinforcement can be defined in terms of more abstract properties of behaviour, such as its novelty (the requirement might be doing something new, unlike what was done on previous trials, making response variability itself the basis for instrumental reinforcement)
-ex. a classic study of instrumental conditioning of response variability: pigeons in the experimental group had to peck either the right or left response key 8 times in order to get reinforced, but on each trial they could not use a pattern they had produced on any of the previous 50 trials; these pigeons therefore had to keep generating novel patterns of right-left pecks; in the control group, food was provided at the same frequency after 8 pecks, but the pigeons did not have to generate novel response sequences
-Results: when the instrumental conditioning procedure required response variability, variability in responding increased to about 75% by the last 5 days of training; in contrast, in the control condition variability in performance sequences dropped to less than 20%; this shows that variability in responding can be increased by reinforcement
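The variability contingency in the pigeon study above can be sketched in a few lines of code. This is an illustrative sketch, not the actual experimental procedure; the function name, the tuple encoding of peck sequences, and the use of a 50-trial comparison window are assumptions based on the description above.

```python
import random

def reinforce_if_novel(sequence, history, window=50):
    # Deliver the reinforcer only if this 8-peck left/right sequence
    # differs from every sequence produced in the recent window of
    # trials (the contingency imposed on the experimental group).
    return sequence not in history[-window:]

# Hypothetical simulation: a "pigeon" emitting random 8-peck sequences.
rng = random.Random(0)
history, reinforced = [], 0
for trial in range(100):
    seq = tuple(rng.choice("LR") for _ in range(8))
    if reinforce_if_novel(seq, history):
        reinforced += 1
    history.append(seq)
```

With 256 possible right-left sequences, even random pecking satisfies the contingency on most trials; the point is that the procedure punishes repetition rather than demanding any particular pattern.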
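The shaping procedure described earlier (successive approximations toward a long football throw) can also be illustrated with a toy simulation. All the numbers and the learning rule below are invented for illustration and are not from the chapter; the sketch only shows the ratchet logic of reinforcing throws that meet the current criterion and then tightening the criterion.

```python
import random

def shape_throws(sessions=200, start_mean=28.0, seed=42):
    # Toy model of shaping: reinforce any throw that meets the current
    # criterion, then raise the criterion to the best reinforced throw
    # (the next "successive approximation").
    rng = random.Random(seed)
    mean, criterion = start_mean, start_mean
    for _ in range(sessions):
        throw = rng.gauss(mean, 4.0)  # inherent variability of behaviour
        if throw >= criterion:        # reinforced trial
            criterion = throw         # tighten the criterion
            mean += 0.5 * (throw - mean) + 0.5  # assumed learning effect
    return criterion

final_criterion = shape_throws()
```

Because the criterion only ever moves up to a throw the trainee actually produced, the procedure exploits the variability of behaviour to reach distances that would never be reinforced if the final target were required from the start.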
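The four instrumental conditioning procedures covered above differ along just two binary dimensions: whether the response produces or eliminates the outcome, and whether the outcome is appetitive or aversive. A small lookup table (a sketch using the chapter's terminology; the function and argument names are my own) makes the 2x2 structure explicit.

```python
def classify_procedure(response_effect, stimulus_type):
    # Map a response-outcome contingency onto the four instrumental
    # conditioning procedures and their effect on response rate.
    # response_effect: "produces" or "eliminates" the stimulus.
    # stimulus_type: "appetitive" or "aversive".
    table = {
        ("produces", "appetitive"): ("positive reinforcement", "response rate increases"),
        ("produces", "aversive"): ("punishment", "response rate decreases"),
        ("eliminates", "aversive"): ("negative reinforcement", "response rate increases"),
        ("eliminates", "appetitive"): ("omission training (DRO)", "response rate decreases"),
    }
    return table[(response_effect, stimulus_type)]
```

For instance, `classify_procedure("eliminates", "aversive")` returns `("negative reinforcement", "response rate increases")`, matching the umbrella example, and the table shows at a glance why negative reinforcement is not punishment even though both involve an aversive stimulus.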