PSYC215 Chapter 3 Notes
Priming: Activating particular associations in memory.
Belief Perseverance: Persistence of one’s initial conceptions, as when the basis for one’s belief is
discredited but an explanation of why the belief might be true survives.
Misinformation Effect: Incorporating “misinformation” into one’s memory of the event, after witnessing
an event and receiving misleading information about it.
Overconfidence Phenomenon: The tendency to be more confident than correct-to overestimate the
accuracy of one’s beliefs.
Confirmation Bias: A tendency to search for information that confirms one’s preconceptions.
Heuristics: A thinking strategy that enables quick, efficient judgments.
Representativeness Heuristic: The tendency to presume, sometimes despite contrary odds, that
someone or something belongs to a particular group if resembling (representing) a typical member.
Availability Heuristic: A cognitive rule that judges the likelihood of things in terms of their availability in
memory. If instances of something come readily to mind, we presume it to be commonplace.
Illusory Correlation: Perception of a relationship where none exists, or perception of a stronger
relationship than actually exists.
Illusion of Control: Perception of uncontrollable events as subject to one’s control or as more
controllable than they are.
Regression Toward The Average: The statistical tendency for extreme scores or extreme behaviour to
return toward one’s average.
Misattribution: Mistakenly attributing a behaviour to the wrong cause.
Attribution Theory: The theory of how people explain others’ behaviour-for example, by attributing it
either to internal dispositions (enduring traits, motives, and attitudes) or to external situations.
Dispositional Attribution: Attributing behaviour to the person’s disposition and traits.
Situational Attribution: Attributing behaviour to the environment.
Fundamental Attribution Error: The tendency for observers to underestimate situational influences and
overestimate dispositional influences on others’ behaviour. (Also called correspondence bias, because
we so often see behaviour as corresponding to a disposition.)
Self-awareness: A self-conscious state in which attention focuses on oneself. It makes people more
sensitive to their own attitudes and dispositions.
Self-fulfilling Prophecy: A belief that leads to its own fulfillment.
Research reveals the extent to which our assumptions and prejudgments can bias our
perceptions, interpretations, and recall.
The first group of experiments examines how predispositions and prejudgments affect how we
perceive and interpret information. The second group plants a judgment in people’s minds after
they have been given information to study how after-the-fact ideas bias recall. The overarching
point: We respond not to reality as it is but to reality as we construe it.
Our memory system is a web of associations, and priming is the awakening or activating of
certain associations. Priming experiments reveal how one thought, even without awareness, can
influence another thought, or even an action. Priming effects surface even when the stimuli are
presented subliminally-too briefly to be perceived consciously.
Our first impressions of one another are more often right than wrong, and the better we know
people, the more accurately we can read their minds and feelings.
When social information is subject to multiple interpretations, preconceptions matter.
Ways of thinking, or schemas, guide not only our interpretations of ourselves, but also our
understanding of others.
People everywhere perceive media and mediators as biased against their position. “There is no
subject about which people are less objective than objectivity.”
People’s perceptions of bias can be used to assess their attitudes. Tell me where you see bias,
and I will see your attitudes.
When both proponents and opponents of an issue examined an identical body of mixed
evidence, the evidence did not lessen their disagreement but increased it.
The “Kuleshov effect” skilfully guided viewers’ inferences by manipulating their assumptions.
Spontaneous trait transference: If we go around talking about others being gossipy, people may
then unconsciously associate “gossip” with us. Call someone a jerk and folks may later construe
you as one.
There is an objective reality out there, but we view it through the spectacles of our beliefs,
attitudes, and values. This is one reason our beliefs and schemas are so important; they shape
our interpretation of everything else.
Belief perseverance shows that beliefs can take on a life of their own and survive the
discrediting of the evidence that inspired them.
The more we examine our theories and explain how they might be true, the more closed we
become to information that challenges our belief.
The remedy for belief perseverance is simply to explain the opposite. Indeed, explaining any
alternative outcome, not just the opposite, drives people to ponder various possibilities.
In constructing memories, we reconstruct our distant past by using our current feelings and
expectations to combine fragments of information. In its search for truth, the mind sometimes
constructs a falsehood.
After witnessing an event, people who then receive misleading information about it will often
incorporate that misinformation into their memories of the event.
People whose attitudes have changed often insist that they have always felt much as they now
feel. George Vaillant notes: “It is all too common for caterpillars to become butterflies and then
to maintain that in their youth they had been little butterflies. Maturation makes liars of us all.”
The construction of positive memories brightens our recollections. We tend to recall
experiences more fondly later, minimizing the unpleasant or boring aspects and remembering
the high points.
With any positive experience, some of the pleasure resides in the anticipation, some in the
actual experience, and some in the rosy retrospection.
We also revise our recollections of other people as our relationships with them change.
University students in steady relationships rated their partners. Two months later, those more in
love recalled it as love at first sight, while those who had broken up recalled their partners as
having been selfish and bad-tempered all along.
In reconstructing past behaviour, we tend to remember events as having happened the way we
wished them to. We tend to underreport bad behaviours and overreport good behaviours.
While our cognitive systems process a vast amount of information efficiently and automatically,
our adaptive efficiency has a trade-off; as we interpret our experiences and construct memories,
our automatic intuitions often err.
We are unaware of our flaws and the “intellectual conceit” evident in judgments of past
knowledge (“I knew it all along”) extends to estimates of current knowledge and predictions of
future behaviour, and these estimates tend to be overly positive.
The most confident people were most likely to be overconfident. Studies reveal a similar
correlation between self-confidence and accuracy in discerning whether someone is telling the
truth.
Ironically, incompetence feeds overconfidence. Students who score at the bottom on tests of
grammar, humour, and logic are most prone to overestimating their abilities.
Our ignorance of our ignorance sustains our self-confidence. In follow-up studies, the “ignorance
of one’s incompetence” occurs mostly on relatively easy-seeming tasks, but on hard tasks, poor
performers more often appreciate their lack of skill.
Dunning concludes: “what others see in us…tends to be more highly correlated with objective
outcomes than what we see in ourselves.”
People may often give too much weight to their current intentions when predicting their future
behaviour.
People often tend to recall their mistaken judgments as times when they were almost right.
In confirmation bias, people tend not to seek information that might disprove what they believe.
We are eager to verify our beliefs but less inclined to seek evidence that might disprove them.
Our preference for confirming information helps explain why our self-images are so remarkably
stable. William Swann and Stephen Read discovered that students see, elicit, and recall feedback
that confirms their beliefs about themselves.
People seek as friends and spouses those who bolster their own self-views-even if they think
poorly of themselves. This self-verification predicts, for example, how someone with a
domineering self-image might behave at a party.
You need to be careful about other peoples’ dogmatic statements. Even when people seem sure
they are right, they may be wrong. Confidence and competence need not coincide.
Three techniques to reduce the overconfidence bias include:
o Prompt feedback from experts.
o To reduce “planning fallacy” overconfidence, people can be asked to “unpack” a task-to
break it into its subcomponents-and estimate the time required for each. When people
think about why an idea might be true, it begins to seem true.
o Get people to think of one good reason why their judgments might be wrong: force
them to consider disconfirming information.
We should be careful not to undermine people’s self-confidence to a point where they spend
too much time in self-analysis or where self-doubts begin to cripple decisiveness. In times when
their wisdom is needed, those lacking self-confidence may shrink from speaking up or making
decisions. Overconfidence can cost us, but realistic self-confidence is adaptive.
Our cognitive system specializes in mental shortcuts, which we use to form impressions, make
judgments, and invent explanations with lightning speed. Utilization of heuristics promotes our
survival. The biological purpose of thinking is less to make us right than to keep us alive.
However, in some situations, haste makes errors.
Consider the question: Do more people live in Iraq or in Tanzania? Because instances of Iraqis
come readily to mind, we presume the group is commonplace, and most people guess Iraq-even
though Tanzania has the larger population. This cognitive rule is called the availability heuristic.
Sometimes the availability heuristic deludes us. The more absorbed and “transported” the reader (“I
could easily picture the events”), the more the story affects the reader’s later beliefs.
Our use of the availability heuristic highlights a basic principle of social thinking: People are slow to
deduce particular instances from a general truth, but they are remarkably quick to infer general
truths from a vivid instance.
The availability heuristic explains why powerful anecdotes are often more compelling than
statistical information and why perceived risks are therefore often badly out of joint with real risks.
Easily imagined (cognitively available) events also influence our experiences of guilt, regret,
frustration, and relief. Imagining worse alternatives helps us feel better. Imagining better
alternatives, and pondering what we might do differently next time, helps us prepare to do
better in the future.
The more significant the event, the more intense the counterfactual thinking.
It’s easy to see a correlation where none exists. When we expect significant relationships, we
easily associate random events, perceiving an illusory correlation.
If we believe a correlation exists, we are more likely to notice and recall confirming instances. If
we believe premonitions correlate with events, we notice and remember the joint occurrence
of the premonition and the event’s later occurrence. We seldom notice or remember all the
times unusual events do not coincide.
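The premonition example can be sketched as a small simulation (the base rates and counts below are illustrative assumptions, not data from the text). Premonitions and events are generated independently, so no real correlation exists; yet the vivid “hits” an observer notices and remembers still pile up, while the far more numerous non-coincidences go uncounted.

```python
import random

# Premonitions and events occur independently: there is NO real correlation.
# The 10% base rates are illustrative assumptions.
random.seed(0)
N = 100_000
premonition = [random.random() < 0.1 for _ in range(N)]
event = [random.random() < 0.1 for _ in range(N)]

hits = sum(p and e for p, e in zip(premonition, event))

# The fair comparison: the event is no likelier after a premonition.
p_event_given_premonition = hits / sum(premonition)
p_event_overall = sum(event) / N
print(f"P(event | premonition) = {p_event_given_premonition:.3f}")
print(f"P(event)               = {p_event_overall:.3f}")

# But we notice and remember only the vivid joint occurrences ...
print(f"memorable 'hits': {hits}")
# ... and seldom count the non-coincidences that vastly outnumber them.
print(f"uncounted non-hits: {N - hits}")
```

The two probabilities come out essentially equal, but a mind that tallies only the roughly one thousand “hits” perceives a relationship where none exists.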
Experiments on gambling illustrate the illusion of control, where more than 50 experiments
have consistently found people acting as if they can predict or control chance events.
Gamblers tend to attribute wins to their skill and foresight. Losses become “near misses” or
“flukes”-perhaps (for the sports gambler) a bad call by the referee or a freakish bounce of the
ball.
We fail to recognize the statistical phenomenon of regression toward the average. Because
exam scores fluctuate partly by chance, most students who get extremely high scores on an
exam will get lower scores on the next exam.
Experience has taught us that when everything is going great, something will go wrong, and that
when life is dealing us terrible blows, we can usually look forward to things getting better.
Often, though, we fail to recognize this regression effect.
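The exam-score example can be simulated in a few lines (all distributions and numbers here are illustrative assumptions): each score is a stable ability plus chance fluctuation, so the students who top the first exam were, on average, lucky as well as able, and their second scores fall back toward their own average.

```python
import random

random.seed(42)

def exam(ability):
    # score = stable ability + chance fluctuation (spreads are assumptions)
    return ability + random.gauss(0, 10)

abilities = [random.gauss(70, 8) for _ in range(10_000)]

# Rank everyone by their first exam score and keep the top 5%.
first = sorted(((exam(a), a) for a in abilities), reverse=True)
top = first[:500]

avg_first = sum(score for score, _ in top) / len(top)
avg_second = sum(exam(a) for _, a in top) / len(top)

print(f"top group, exam 1: {avg_first:.1f}")    # inflated by luck
print(f"same group, exam 2: {avg_second:.1f}")  # regresses toward their average
```

The second average is lower than the first even though no one’s ability changed; nothing “punished” the top scorers except the luck component refusing to repeat.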
Nature operates in such a way that we often feel punished for rewarding others and rewarded
for punishing them. In actuality, as every student of psychology knows, positive reinforcement
for doing things right is usually more effective and has fewer negative side effects.
Social judgment involves efficient, though fallible, information processing. It also involves our
feelings: our moods infuse our judgments.