Management Skills 3S03 March 4th, 2014
Chapter 3: Problem Solving and Ethics
The Challenge of Problem Solving
Good problem solving and decision making are in large part a function of knowing which
traps to avoid and what not to do.
Problem solving myths:
• Taking action is better than standing by
• Trust your gut
• I know when I’m making a poor decision
• Dividing an elephant in half produces two small elephants
• Ethics is not my problem
• Ethical abuses are due to unethical people
Why Smart People Make Bad Decisions
Bad decisions happen about as frequently as good ones. Decision making is another area
where true expertise involves knowing the traps that so frequently hinder sound decisions.
Intuition represents a collection of what we’ve learned about the world, without knowing
we actually learned it. Intuition is important in automatic processes such as social
interactions or driving a car. It can be useful if we track what we have learned and under
what circumstances that learning led to success so we can replicate it in the future.
The Ladder of Inference
Ladder of inference – shows how our intuition operates and can lead to mistakes.
Inference is drawing a conclusion about something we don’t know based on things we do
know. Order of ladder from bottom to top:
• I observe “data” and experiences
• I select “data” from what I observe
• I add meanings (cultural and personal)
• I make assumptions based on the meanings I added
• I draw conclusions
• I adopt beliefs about the world
• I take actions based on my beliefs
Fundamental attribution error: This error deals with the process of attributing causes to
events – that is, explaining why things occurred. The essence of the error is that people tend
to overattribute behaviour to internal rather than external causes. Thus, when
determining the cause of another person’s behaviour, you are more likely to consider
factors related to the person’s disposition than to their particular situation.
Self-serving bias – where we attribute personal successes to internal causes and personal
failures to external causes.
Six Ways People Exercise Poor Judgment Without Knowing It
Judgment Error 1: Availability
Availability bias – this bias clouds our judgment because things more readily available
to us (that is, they can be more easily brought to mind) are likely to be interpreted as
more frequent or important.
Judgment Error 2: Representativeness
Representativeness bias – people pay more attention to descriptors they believe to be
representative of a person’s career choice than to the key base rate information that leads
to the better choice, e.g., guessing whether an MBA student has an arts or a business background.
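The pull of representativeness against base rates can be made concrete with Bayes’ rule. The numbers below – the share of arts versus business backgrounds among MBA students, and how well a “creative, artistic” description fits each group – are hypothetical, chosen only to illustrate the point:

```python
# Hypothetical base rates: most MBA students have business backgrounds.
p_arts, p_business = 0.20, 0.80

# Hypothetical likelihoods: how well a "creative, artistic" description
# fits a randomly chosen student from each group.
p_desc_given_arts = 0.70
p_desc_given_business = 0.25

# Bayes' rule: P(arts | description)
evidence = p_desc_given_arts * p_arts + p_desc_given_business * p_business
p_arts_given_desc = p_desc_given_arts * p_arts / evidence

print(round(p_arts_given_desc, 3))  # 0.412 -- business is still the better bet
```

Even with a description that fits an arts background far better, the low base rate keeps the posterior probability of “arts” below one half.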
Gambler’s fallacy – people truly believe that each coin flip or pull of a slot machine is
somehow connected to previous actions.
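A quick simulation shows why this is a fallacy: the frequency of heads immediately after a streak of three tails is no different from the overall frequency, because the flips are independent.

```python
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(200_000)]  # True = heads

# Look at every flip that immediately follows a streak of three tails.
after_streak = [
    flips[i]
    for i in range(3, len(flips))
    if not any(flips[i - 3:i])  # previous three flips were all tails
]

freq = sum(after_streak) / len(after_streak)
print(round(freq, 3))  # stays close to 0.5: the coin has no memory
```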
Hasty generalization fallacy – people often draw inappropriate general conclusions
from specific cases because they do not realize their specific example is not necessarily
so in all, or even most, cases, e.g., a motorcyclist arguing against helmet regulations because he
has ridden for 25 years and never been hurt. The hasty generalization fallacy occurs
because we tend to operate by what has been called the law of small numbers – we are
willing to leap to general conclusions after seeing only one or two examples.
Judgment Error 3: Anchoring and Adjustment
Research shows that our estimates tend to stay close to whatever initial value we start
from. Even when people are told the initial estimate is random, their adjusted estimates remain close
to that initial estimate, or anchor. This pattern of anchoring and adjustment is quite
prevalent – different starting points lead to different end results.
Judgment Error 4: Confirmation
Confirmation bias represents people’s tendency to collect evidence that supports rather
than negates their intuition before deciding. In solving problems, one of the most insidious
traps is gathering data that confirm our ideas and excluding data that might
disconfirm them, e.g., the 2-4-6 number sequence problem, whose answer is simply any three ascending numbers.
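The number sequence task can be sketched in a few lines. The hidden rule is just “any three ascending numbers”; guesses chosen to confirm a richer hypothesis (such as “doubling numbers”) all pass and teach us nothing, so only a deliberately disconfirming test reveals how broad the rule really is:

```python
def fits_rule(a, b, c):
    """The hidden rule: any three ascending numbers."""
    return a < b < c

# Confirming tests: triples matching the guesser's "doubling" hypothesis
# all pass, so they cannot distinguish "doubling" from the true rule.
print(fits_rule(2, 4, 6), fits_rule(4, 8, 16), fits_rule(10, 20, 40))  # True True True

# Disconfirming tests: triples that SHOULD fail if "doubling" were the rule.
print(fits_rule(1, 2, 3))   # True -- so the rule is broader than "doubling"
print(fits_rule(6, 4, 2))   # False -- descending numbers break it
```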
Judgment Error 5: Overconfidence
Overconfidence bias leads us to believe we possess some unique trait or ability that
allows us to defy the odds, whereas others simply don’t have such a trait. Often termed the
Lake Wobegon Effect – after the radio show in which the imaginary town boasts all of its
children are above average. Research shows no connection between one’s confidence
level about being right and actually being right.
Judgment Error 6: Escalation of Commitment
Escalation of commitment – people are likely to continue to invest additional resources
in failing courses of action even though no foreseeable payoff is evident. “Throwing good
money after bad” is the essence of escalation of commitment, e.g., continuing to pour money into Volvo repairs.
Escalation is prevalent for several reasons: we don’t want to admit that our solution may
not have been the right one, so we stay the course; we don’t want to appear inconsistent or
irrational, so we continue to hope for the best even though the data simply don’t justify such a
response; and in organizations, not continuing can be seen as giving up rather than
fighting onward – and no one likes a quitter.
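The “good money after bad” trap has a simple arithmetic antidote: money already spent is sunk and irrelevant, so only future costs and payoffs should enter the decision. The repair figures below are made up for illustration:

```python
# Hypothetical figures for an aging car (amounts already spent are sunk).
sunk_repairs = 4_000        # spent so far -- irrelevant to the decision
next_repair = 1_500         # future cost of continuing
car_value_if_fixed = 1_200  # future payoff of continuing

# A rational comparison uses ONLY the future numbers.
continue_net = car_value_if_fixed - next_repair   # -300
stop_net = 0

print("continue" if continue_net > stop_net else "stop")  # stop
```

Escalation of commitment is what happens when the $4,000 already spent is allowed to drag the decision toward “continue” anyway.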
Overcoming Judgment Biases
There are no simple or surefire ways to always avoid decision biases. The biases are
insidious and hardest to detect in our own decision making. However, useful tactics, more
aptly called defenses, exist: 1) confidence estimates, 2) trial and error calibration, and 3) healthy skepticism.
Confidence Estimates
Attach an estimate of confidence to beliefs held by ourselves and others to curb the
overconfidence bias. Using confidence estimates to build “confidence ranges” can
move you away from single-point estimates.
Trial and Error Calibration
Weather forecasters are very accurate, and physicians are not. The answer as to why lies
in a key aspect of trial and error, namely regular feedback and knowledge of results.
Weather forecasters predict rain and know within a few hours whether their predictions were
correct; if they weren’t, they go back and see where they went wrong. This process repeats itself
every day as forecasters calibrate their predictions against the results.
Training yourself to use trial and error calibration involves a few steps: with every
prediction, record the reasons why you’ve made it; track the results; and study the
successes and failures. Remember that chance is not self-correcting: a string of
failures does not mean you’re “due” for a success.
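The calibration routine above – record each prediction with a stated confidence, track the outcome, then study the results – can be sketched as a small tracker. The logged predictions and outcomes here are invented examples:

```python
from collections import defaultdict

# (stated confidence, whether the prediction came true) -- invented examples
log = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, True),
    (0.6, True), (0.6, False), (0.6, False),
]

# Group outcomes by the confidence level that was claimed.
buckets = defaultdict(list)
for confidence, correct in log:
    buckets[confidence].append(correct)

# Well-calibrated predictions are right about as often as claimed.
for confidence, outcomes in sorted(buckets.items()):
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"claimed {confidence:.0%}, actually right {hit_rate:.0%}")
```

A gap between the claimed confidence and the actual hit rate (here, 90% claimed versus 75% achieved) is exactly the feedback that weather forecasters get daily and most of us never collect.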
Approach all decisions and presented evidence with healthy skepticism. The best
defenses for decision biases are:
• Don’t jump to conclusions
• Don’t assume a relationship implies a cause; record and test your decision outcomes
• Don’t base your conclusion only on your own experience
• Don’t just look to support your case, look for the nonsupporting evidence too
• Don’t fall prey to overconfidence; get confidence estimates and ranges
Solving Problems Ethically and Effectively
There is a difference between good decisions and good outcomes.
There is no such thing as a perfect decision or a perfect decision process.
As humans we will always be subject to bounded rationality – our brains’ limitations
constrain our thinking and reasoning ability, making it impossible to consider
simultaneously all information relevant to any decision or problem. Bounded rationality
leads managers to engage in satisficing – determining the most acceptable solution to a
problem rather than an optimal one. Using problem solving models does improve
decision quality. A popular model with five major steps is PADIL (Problem,
Alternatives, Decide, Implement, Learn).
Define and Structure the Problem
Make sure you are working on the correct problem. One mistake is to begin with a
solution, not a problem. The temptation to jump to a solution is very powerful and leads
to what Ian Mitroff calls “solving the wrong problem precisely.” There are several ways in which
people solve the wrong problem precisely:
• Picking the wrong stakeholders
• Framing the problem too narrowly
• Failure to think systemically
• Failure to find the facts
Assess Key Stakeholders
A stakeholder is anyone who has a stake in the problem or solution.
Determining Whom to Involve
One useful tool for helping gauge the appropriate level of involvement in problem
solving is the one developed by Victor Vroom and Philip Yetton. Those authors note that a
decision maker can involve others on a broad continuum ranging from no involvement
to full employee delegation. This continuum represents five key participation approaches:
decide, consult individually, consult the group, facilitate the group, and delegate to the group. The
framework identifies seven factors that must be addressed before you decide which
approach is best. These factors can be framed as questions to be answered:
• Decision significance
• Importance of commitment
• Leader’s expertise
• Likelihood of commitment
• Group support
• Group expertise
• Group competence
Evaluate each of these seven factors as high or low, then work through a flow chart to arrive at
the most effective participation approach.
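As an illustration only – the branching below is a simplified sketch of the idea, not Vroom and Yetton’s actual published decision tree – the seven high/low factors can be encoded and walked through in code:

```python
def participation_approach(significance, commitment_importance,
                           leader_expertise, likely_commitment,
                           group_support, group_expertise, group_competence):
    """Each argument is True (high) or False (low).
    The branching is a made-up approximation for illustration,
    not the published Vroom-Yetton model."""
    if not significance and likely_commitment:
        return "decide"                 # minor decision, commitment assured
    if leader_expertise and not commitment_importance:
        return "decide"                 # leader knows best, buy-in not needed
    if not group_expertise:
        return "consult individually"   # gather input, keep the decision
    if group_support and group_competence:
        return "delegate to group"      # capable, aligned group: hand it over
    return "facilitate group"           # work the problem with the group

# A significant decision where a capable, supportive group shares the goals:
print(participation_approach(True, True, False, False, True, True, True))
# -> delegate to group
```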
Framing the Problem Correctly
Strong evidence suggests the way in which a problem is stated determines the quantity
and quality of solutions generated.
The black-or-white fallacy assumes our choices are clear and limited to two, when in reality
there may be many other choices. People generally frame problems in “either-or” terms.
No discussion of solving the right problem is complete without a basic understanding of
systems and systems thinking. A system is a perceived whole whose elements “hang
together” because they continually affect each other over time and operate toward a
common purpose. All systems express what is known as systemic structure – a pattern of
interrelationships among the system components.
A systems approach – asking “how will this change affect other things?” – is critical to being