6.1 What ideas guide study of learning? Skinner believed application of basic learning principles could create a
better, more humane world. Walden Two: utopia of praise/incentives instead of punishment.
Learning: enduring change in behavior resulting from experience, adapting to environment.
Understanding how events are related through conditioning (environmental stimuli and behavioral
responses become connected). Classical/Pavlovian conditioning: learning that two types of events go
together (music, scary movies). Operant/instrumental conditioning: behavior leads to a particular
outcome (study → grades). Watson rejected Freudian methods of introspection (dream analysis and free
association) because they could not be observed directly or studied using scientific methods. Watson
argued observable behavior was the only valid indicator of psychological activity. Locke's tabula rasa idea
meant infants are born knowing nothing and acquire all knowledge through sensory experiences. Watson
believed the environment and its effects were the only determinants of learning.
Behavioral responses are conditioned: Pavlov studied the salivary reflex in dogs
o Pavlov’s experiments: neutral stimulus unrelated to salivary reflex presented along with stimulus
that produces reflex. This pairing is conditioning trial, repeated many times. Then critical trials
have neutral stimulus alone, reflex is measured. Classical/Pavlovian conditioning: type of
learned response; neutral objects come to elicit response when associated with stimulus that
already produces response. Unconditioned response UR: does not have to be learned, such as a
reflex. Unconditioned stimulus US: elicits response, such as a reflex, without prior learning.
Conditioned stimulus CS: elicits response only after learning has taken place. Conditioned
response CR: response to a conditioned stimulus, has been learned. CR weaker than UR.
o Acquisition, extinction, spontaneous recovery: conditioning is basis for how animals learn to
adapt to environments. Acquisition: gradual formation of association between conditioned and
unconditioned stimuli. Critical element: stimuli occur together in time – contiguity (stronger
acquisition when there is brief delay between CS / US). Extinction: CR extinguished when CS
no longer predicts US. Spontaneous recovery: previously extinguished response reemerges after
presentation of the CS.
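The acquisition and extinction curves described above can be sketched with a toy associative-strength model. The update rule, learning rate, and trial counts below are illustrative assumptions, not values from the text:

```python
# Toy simulation of acquisition and extinction of a conditioned response.
# The learning rate and trial counts are illustrative assumptions.

def trial(strength, us_present, rate=0.3, maximum=1.0):
    """Update associative strength after one CS presentation.

    When the US follows the CS (acquisition), strength moves toward the
    maximum; when the CS appears alone (extinction), it decays toward zero.
    """
    target = maximum if us_present else 0.0
    return strength + rate * (target - strength)

strength = 0.0

# Acquisition: CS and US paired for 10 trials -> strength grows rapidly
# at first, then levels off (the gradual formation of the association).
for _ in range(10):
    strength = trial(strength, us_present=True)
acquired = strength

# Extinction: CS alone for 10 trials -> the CS no longer predicts the US,
# so strength decays back toward zero.
for _ in range(10):
    strength = trial(strength, us_present=False)
extinguished = strength
```

Spontaneous recovery is not captured by this sketch: after extinction the strength here stays near zero, whereas a real animal shows a partial return of the CR when the CS is presented again later.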
o Generalization, discrimination, and second-order conditioning: stimulus generalization: stimuli
similar, not identical to CS produce CR. Adaptive, because in nature CS is not experienced in
identical way. Stimulus discrimination: learning to differentiate between similar stimuli if one
is consistently associated with US, other is not. Second-order conditioning: CS does not become
directly associated w/ US, instead, CS becomes associated w/ other stimuli associated with US.
Phobias and addictions have learned components
o Phobias and treatment: acquired fear out of proportion to real threat, develop through
generalization of fear experience. Fear conditioning: classically conditioned to fear neutral
objects. Albert conditioned to be afraid of rats, other objects. Counterconditioning: exposing
patient to small doses of feared stimulus while engaging in enjoyable task. Systematic
desensitization: relaxing muscles while imagining feared object. The CS→CR1 (fear) connection can
be broken by developing a CS→CR2 (relaxation) connection.
o Drug addiction: conditioned cues (smell of coffee, sight of a needle) can trigger cravings. Treatment
for drug addiction should include drug cues to help extinguish responses to those cues. Tolerance is
higher in locations where the drug was previously used.
Classical conditioning involves more than events occurring at same time: some conditioned stimuli are
more likely to produce learning; contiguity not sufficient to create CS-US association
o Evolutionary significance: certain stimulus pairs are more likely to become associated than others,
e.g., a conditioned food aversion develops even when illness is clearly caused by something else;
the aversion is still blamed on the food. Rats rely more on taste and birds rely more on vision in
selecting food, so birds learn to associate visual cues with illness better. Rats freeze to an auditory
CS but rear to a visual CS. Biological preparedness: animals tend to fear potentially dangerous
things over objects that pose little threat.
Learning involves cognition: classical conditioning is how animals predict events. Cognitive perspective
on learning: increased consideration of mental processes such as prediction/expectancy. Rescorla-
Wagner model: animal learns expectation that some predictors (potential CSs) are better than others.
Strength of CS-US association determined by extent to which US is unexpected/surprising. Greater surprise of US, more effort into trying to understand its occurrence so it can predict future occurrences.
Leads to greater classical conditioning of CS that predicted US. Orienting response: animal encounters
novel stimulus, pays attention. Animal more easily associates US with novel stimulus than with familiar
stimulus. Blocking effect: once conditioned stimulus is learned, it can prevent acquisition of new
conditioned stimulus. Stimulus associated with CS can act as occasion setter (trigger) for CS.
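The Rescorla-Wagner idea that learning is driven by surprise can be sketched in Python, and the same rule reproduces the blocking effect. The learning rate, trial counts, and stimulus names ("light", "tone") are my own illustrative assumptions:

```python
# Sketch of the Rescorla-Wagner learning rule and the blocking effect.
# Parameters and the two-phase design are illustrative assumptions.

def rw_update(V, present, lam, alpha=0.2):
    """One Rescorla-Wagner trial.

    Each present CS changes by alpha * (lam - total prediction):
    learning happens only to the extent the US is surprising.
    """
    total = sum(V[cs] for cs in present)  # combined prediction of the US
    error = lam - total                   # surprise: actual minus expected
    for cs in present:
        V[cs] += alpha * error
    return V

V = {"light": 0.0, "tone": 0.0}

# Phase 1: the light alone is paired with the US, so the light
# absorbs nearly all of the available associative strength.
for _ in range(30):
    rw_update(V, ["light"], lam=1.0)

# Phase 2: light + tone compound paired with the same US. The light
# already predicts the US, the prediction error is near zero, so the
# tone learns almost nothing -- the blocking effect.
for _ in range(30):
    rw_update(V, ["light", "tone"], lam=1.0)
```

The tone ends with far less associative strength than the light even though it was paired with the US on every phase-2 trial, which is the sense in which contiguity alone is not sufficient to create a CS-US association.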
6.2 How does operant conditioning differ from classical conditioning? Operant/instrumental conditioning:
consequences of action determine likelihood it will be performed in future. Law of effect: behavior that leads to
satisfying state of affairs likely to occur again; if leads to annoying state of affairs, less likely to occur again
Reinforcement increases behavior: reinforcer: stimulus that occurs after a response and increases
likelihood that response will be repeated.
o Skinner box: special levers for food or water
o Shaping: a reinforcer cannot be provided until the animal displays the appropriate response, which is
difficult to elicit outside the box. Shaping: operant-conditioning technique that consists of reinforcing
behaviors that are increasingly similar to the desired behavior (successive approximations), e.g.,
teaching a dog to roll over.
o Reinforcers can be conditioned: primary reinforcers: food and water, evolutionarily beneficial.
Secondary reinforcers: events or objects that are reinforcers but do not satisfy biological needs.
o Reinforcer potency: Premack theorized about how a reinforcer's value could be determined: key is
amount of time we engage in specific behavior associated with reinforcer, eg ice cream vs.
spinach. Premack principle: valued activity used to reinforce performance of less valued activity
Both reinforcement and punishment can be positive or negative
o Positive and negative reinforcement: positive: adding a pleasant stimulus (a reward) increases the
probability that the behavior will be repeated. Negative: increases behavior through removal of an
unpleasant stimulus; not merely the absence of an electric shock when you press a lever, but
pressing the lever to STOP an ongoing shock.
o Positive and negative punishment