Week 5 – Oct 15, 2012
Surveys & Public Opinion Polling
• Ask about experiences, attitudes or knowledge
• Types of surveys
‐ Status surveys (current characteristics)
‐ Survey research (relationships among variables)
o Different types of survey research: cross-sectional, longitudinal, and sequential (not asking the
questions to the same people, but asking them again over time to compare across time periods).
• Types of survey instruments
‐ Self-administered questionnaire
‐ Interview schedule (for tel. and in-person interviews)
o Costs go up from telephone interviews to in-person interviews.
• Types of survey questions
‐ Demographic (factual) & content (experiences/opinions/etc.)
• Beware weaknesses in self-reporting – how reliable?
o How honest are people in answering the questions? Online, respondents may not tell the truth,
whereas in person they are having a conversation, which can change how they answer.
Steps in Survey Research (Graziano & Raulin, p. 319)
• Determine content area, define population and decide how survey will be administered
• Develop survey instrument & pretest it
‐ What questions to ask and in what order?
‐ Open-ended, multiple-choice (closed-ended) and Likert scale Qs
o What questions are you going to ask, and in what order? Some questions might influence the
answers to later questions.
o Open-ended: ask a question without giving answer options.
o Closed-ended: ask questions with answer options already provided (multiple choice).
o Open-ended is more costly because you have to analyse all the answers, which also takes more time.
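A minimal sketch of how closed-ended Likert responses are typically coded for analysis (the labels, numeric codes, and responses here are illustrative, not from the course notes):

```python
# A Likert item maps ordered response labels to numeric codes so
# closed-ended answers can be analysed quantitatively.
LIKERT = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

# Hypothetical responses from five survey participants.
responses = ["agree", "agree", "neutral", "strongly agree", "disagree"]
codes = [LIKERT[r] for r in responses]
mean_score = sum(codes) / len(codes)
print(mean_score)  # (4+4+3+5+2)/5 = 3.6
```

This is one reason closed-ended items are cheaper to analyse than open-ended ones: the coding step is mechanical rather than interpretive.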
• Draw representative sample
‐ Nonprobability sampling (convenience sampling) - weak
‐ Probability sampling (random and stratified random samples)
‐ Sample size (larger if heterogeneous)
‐ Margin of error and confidence intervals: usually +/- 3% margin at the 95% confidence
level (‘+/-3%, 19 times out of 20’)
o Probability sampling: randomly selecting the people who get the survey.
o More likely to see stratified random samples (what we are going to do): trying to get respondents
who are representative for the type of questions they need to answer. With phones it's harder
because people will hang up...
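The stratified random sampling idea above can be sketched as: draw randomly within each stratum in proportion to its share of the population (the strata and population sizes here are invented for illustration):

```python
import random

random.seed(42)  # reproducible draw

# Hypothetical sampling frame: each person is labelled with a stratum.
population = (
    [("18-34", f"p{i}") for i in range(300)]
    + [("35-54", f"p{i}") for i in range(300, 700)]
    + [("55+", f"p{i}") for i in range(700, 1000)]
)

def stratified_sample(population, sample_size):
    """Randomly sample within each stratum, proportionally to its size."""
    strata = {}
    for stratum, person in population:
        strata.setdefault(stratum, []).append(person)
    sample = []
    for members in strata.values():
        k = round(sample_size * len(members) / len(population))
        sample.extend(random.sample(members, k))
    return sample

sample = stratified_sample(population, 100)
print(len(sample))  # 100: 30 from 18-34, 40 from 35-54, 30 from 55+
```

Compared with simple random sampling, this guarantees each stratum appears in the sample in the right proportion rather than leaving it to chance.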
o 19 times out of 20 is the confidence level.
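A quick check of the '+/-3%, 19 times out of 20' figure, as a minimal sketch assuming a simple random sample and the standard normal approximation for a proportion (the sample size of 1,067 is illustrative):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a proportion at the 95% confidence level (z = 1.96).

    p = 0.5 is the conservative (worst-case) assumption pollsters use.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A national poll of roughly 1,000 respondents yields the familiar figure:
moe = margin_of_error(1067)
print(f"+/-{moe * 100:.1f}%, 19 times out of 20")  # -> +/-3.0%, ...
```

Note the square root: to halve the margin of error you need four times the sample size, which is why larger (and more heterogeneous) populations drive up survey costs.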
• Administer survey
o Before this they will actually do a pretest of the survey, so they can get a sense of where the
survey is working, where it is not, and to whom to give it; this way problems surface before the
full survey is administered.
• Analyze (weighting), interpret and communicate findings
o Weight the data from all the people they got a hold of. What are the results of the survey, and what
do they mean for the clients?
Questions to critically appraise a poll
• Political context: Who did the poll and who paid for it? When was it undertaken? What other
polls have been done on the topic?
• Surveys and survey administration: what questions were asked and in what order? how were
the interviews conducted?
• Respondents and sampling: How many people participated, how were they selected and what
area/group are they from? Who else should have been interviewed?
• Confidence and error: What is the sampling error? Who’s on first? What else might skew the results?
• Reporting of results: Are results based on all answers or just some?
o Who did the poll, and who paid for it...
o At the end of the day, which questions are asked is up to the clients to decide.
o Need to look at when the poll was conducted, because timing can influence the answers they get.
• Internet polling is growing in importance given telephone refusal rates (up to 80%) and
decline/shifts in telephone use (1/6 of adults not accessible) – internet polls have proven more
accurate for elections
• Respondents seem to prefer Internet polling and invest more time/energy/candour into them so
more reliable (also less expensive) – higher response rates; no interviewer bias
• Need to ensure a quality sample – use of access panels from which to draw the sample (need to
ensure representativeness) – discount self-selected ‘quick polls’
• Need to ensure confidentiality/privacy/security in conducting surveys
Week 6 – Oct 15, 2012 – Case Study Research
What is a Case Study? (Yin, 2003)
* An in-depth investigation of a contemporary phenomenon within its real-life context when
behaviours cannot be manipulated
• uses direct observation and interviews
• relies on multiple sources of evidence (qual and/or quant)
• explores links between phenomenon and context
• data collection guided by theoretical propositions
o relies on both qualitative and quantitative evidence
o looks at the links between the phenomenon and its context
o you can generalize to theoretical propositions… getting other data…
Case Study as a Research Strategy
Case studies have often been critiqued
Extent of validity of their findings
• can they do more than answer descriptive questions?
• can they be used for more than exploratory research?
• how much can you generalize from a case study?
Researchers have not always been careful
• They can be an essential research strategy – particularly for
understanding complex social phenomena
• can be used for exploratory, descriptive and explanatory research
When should a case study be used?
* Particularly good for ‘how’ and ‘why’ questions
• tracing causal links (not frequencies)
* Sometimes good for ‘what’ questions
• if exploratory and not ‘how much’ or ‘how many’ questions
* Not generally good for ‘who’ or ‘where’ questions (and ‘how much’ or ‘how many’)
Oct 18, 2012 – guest speaker on Ford, cars, GM
Oct 29, 2012 – guest speaker on briefing notes (BN)
Case Studies (continued)
Case Studies & Experimental Design
* Categorizing case study designs using the concepts of experimental design helps to assess
internal validity of the study
* Virtually all case study research is quasi-experimental (no random assignment)
* Studies can be categorized according to the extent that they vary across time and/or space
Yields Four Designs
* Which of these is closest to ‘classic experimental design’?
* In which kind of study would you have the greatest confidence?
                         Spatial variation: Yes    Spatial variation: No
Temporal variation  Yes  Dynamic comparison        Longitudinal comparison
                    No   Spatial comparison        Counterfactual
Nov 5, 2012 -- Action Research, Futures Research and Scenarios Research
* ‘Action research is a way of initiating change in social systems – societies, communities,
organizations or groups – by involving members of the group in the research process.’
(McNabb, 2008: 335)
• Inductive, empirical, applied, pragmatic research based on citizen participation
• Begins from the proposition that groups can do things better if they decide to do them
o Not just citizen participation; it can also be members of a business group…
* Futures research aims ‘to discover or invent, examine and evaluate, and propose possible,
probable, and preferable futures.’ (Wendell Bell, 1997)
• an empirical and scientifically based approach to understanding the future
• often motivated by normative ends (how can we make a better future?)
• increasingly incorporates public participation
• has developed a range of methodologies, including scenarios research…
o A future-oriented activity: making a better future.
* Scenario planning is, ‘a process of positing several informed, plausible and imagined alternative
future environments in which decisions about the future can be played out, for the purpose of
changing current thinking, improving decision making, and enhancing human and organization
learning and improving performance.’ (Chermack and Lynham, 2002: 16, as cited in Chermack)
o An aid to decision making. Scenario planning is about changing current thinking, improving
learning, and improving decision making – how to make things better.
A Closer Look at Action Research
Models of action research
Traditional Action Research
• Originated in the 1930s/1940s; key characteristics:
– data of any type gathered via different methods
– conducted in location and often with full universe
– usually focused on single case or organization
o A collective form of data; data are collected on location. You are not just dealing with a sample
of people – you are dealing with all the people in the situation (the full universe).
Participatory Action Research
• Concerned with research, education and action – goal of fundamental
change with participants taking primary responsibility for the project
• Emerged along with critical theory – focused on emancipation –
researchers often explicitly political
o Not only about the research question itself; it emerged along with critical theory (a specific form
of theory focussing on emancipation). Researchers go out there, dealing with people in those
communities.
Key Themes in Action Research
Collaboration in problem-solving
Problem identification, planning and acting
Educating and reeducating
Democratic participation and action
Theory building and practical application
o Critical theory is involved with education to get a better understanding.
o Key aim is around emancipation… working with highly marginalized communities
Seven Phases of Planned Change (McNabb, p.342)
1. Identify need for change
2. Collect data about current situation and ideas about ‘ideal state’
3. Collaborative diagnosis based on data
4. Develop a plan for change based on data
5. Convert plan into ‘actions’
6. Evaluate actions to determine if change working
7. Institutionalize actions that are working – reassess and repeat the process as needed
When to Use Action Research
* To identify citizen needs and potential solutions to problems
* To design effective programs
* For organizational development
* For community development/redevelopment
o Governments will use this too, but they are not likely to be hard-core participatory research users.
o Communities themselves know what the solutions are.
A Closer Look at Futures Studies
* Emerged in mid-1900s – looks far into the future (much further than action research)
* Includes both expert (e.g., Delphi) and participatory methods
• Expert methods tend to be quantitative (but not always accurate!!!)
• Participatory methods can strengthen results and their implementation
Similarities with Action Research
* Focus on participation, social change, engagement in creation of knowledge, systems thinking,
holistic complexity, visions of the future, commitment to democracy, social innovation and
ongoing probing of assumptions and reinterpretation
• But action research tends to be more inwardly focused (e.g., single
organization/program) and tends to focus on iteration
A Closer look at Scenario Planning
* Scenario planning versus scenario building
* A key outcome of scenario planning is improving decision-making capabilities
• Decision failure can result from errors/mistakes or because something unusual happens
• the latter category (‘unexpected decision error’) is due to bounded rationality,
the tendency to only consider exogenous variables, the stickiness and friction of
information and knowledge, and mental models that underpin decisions
• Scenario planning can help to reduce these challenges of dynamic decision-making
o Form of research…
o Focusing on scenario planning: trying to improve decision making – choosing whether or not to
act in policy.
o Decision failure: errors or mistakes, or failing to take something into consideration; the unexpected.
o Bounded rationality (not perfect rationality): managers don’t have the time or resources to make
perfect decisions, so they are always operating with bounded rationality.
o Considering exogenous variables (looking outside – e.g., the power goes out because of a storm,
something that isn’t within the system; decision makers don’t think of what is under their control).
o Stickiness: it is difficult for us to transmit our ideas accurately.
o Friction: slower decisions are taken, but they might be better decisions; it takes a little longer to
reach a decision because multiple sources of information and more actors are involved.
o Mental models: our beliefs, attitudes, and understanding of how the world works – the mental
models that underpin our decisions.
Strengths and Limitations of Action, Futures and Scenarios Research
* How expensive are these methods?
* To what degree do you think they are used in public administration research?
* How popular are they likely to be with citizens? experts? politicians? others?
* How much time are they likely to consume?
* What evidence do we have to support the claim that they improve policies and programs?
o 1 – Really expensive: involves a lot of people and time; an ongoing process.
Nov 8, 2012 –
-PAP3310: presentations NOVEMBER 26!
-written part due Dec 12, in the office between 2pm-4pm
-write a critical evaluation of all groups individually, also due Dec 12; 4-5 pages, half a page for each group
-presentation 10-12 mins, and then 5-10 mins Q&A w/ the class
-Briefing note summarizing the research findings and recommendations (2 pages)... follow the
instructions from the BN slide class notes, Sections I-VI on page 7 of the slides.
-A literature review of scholarly and applied research in the area (5-7 pages)... take all the sources and
categorize all the material
-A policy recommendation based on the literature review (2-3 pages)... what our group recommends
-Proposal for a research study based on knowledge gaps identified in your literature review (3-4 pages)...
which research questions remain to be answered? What study design would you recommend and why?
-Bibliography of sources (2-3 pages) ... 20 to 25 sources.
Nov 12, 2012 - Comparative Research, Lesson-Drawing and International Benchmarking/Peer Reviews
Comparative research tends to be peripheral in public administration research
• Emerged in 1960s/70s but we don’t tend to do much of it
• Studies tend to be either parochial (one-country) or polycentric (multi-
country but limited efforts to generalize)
• Distinguishing factors that are universal from those that are context
specific is very challenging
Five key challenges of comparative research
• Defining culture, recognizing phenomena that are culturally specific,
avoiding cultural bias, designing methodology that’s equivalent across
cultures, avoiding misinterpreting results
Challenges of Comparative Public Administration Research
Accounting for cultural factors
• Discerning general political philosophy and the cultural elements
• Cultural differences can be more pervasive than expected
Finding conceptually equivalent terms
• Linguistic differences
o Different terms can have different meanings in different countries. Ex: decentralization has a
more specific meaning in France. Translating the terms is really hard too.
• Language, cultural and disciplinary skills
Recommendations to Address the Challenges
Research design considerations
• Type of cross-cultural research to be undertaken (parochial, polycentric)
• Country selection – needs a solid rationale
• Cultures and languages involved
• Accounting for culture
• Composition of the research team, conceptual equivalence, translators
• Culturally specific versus universal findings, cross-national analysis
Policy Transfer and Lesson Drawing
A subfield of comparative public policy – a unique contribution to public policy via
evidence-based practice, rational lesson-drawing & good policy
• processes through which knowledge about institutions, policies or
programs is used by foreign govts to inform their policies and institutions
• Transfer processes: voluntary, negotiated or coercive
Lesson-drawing is a form of voluntary policy transfer
• introduction of parallel programs in more than one country – countries
look to others’ experiences for guidance
• Involves research (not just experiences or informal input)
Guidelines for Successful Lesson-Drawing (Rose, 2004)
Learn key concepts
Catch the attention of policy-makers
Scan alternatives and decide where to look for lessons
Learn by going abroad
Abstract from your observations a generalized model of how a foreign program works
Turn the model into a lesson for your own national context
Decide whether the lesson should be adopted
Decide whether the lesson can be applied
Simplify the means/ends of a lesson to increase its chances of success
Evaluate a lesson’s outcome prospectively, and if it is adopted, over time
International Benchmarking and Peer Reviews
• encourages domestic regulatory development according to ‘best practice’
standards of external organizations
OECD regulatory reviews are an excellent illustration of this
• emerged in mid-1990s following privatization/deregulation
• peer reviews benchmark national regulatory approaches against ‘good
practices’ advocated by the OECD
• reviews are voluntary – states apply for review and co-finance parts of the process, but
are not obligated to follow the report’s recommendations
The Regulatory Review Process
Voluntary but states don’t have veto over report
Initial meeting to lay out timetable and procedures, followed by questionnaire
OECD officials visit member state to undertake relatively independent analysis
OECD directorates draft reports & discuss with member; reports peer reviewed by other states
OECD drafts policy recommendations
Recommendations finalized with member state cooperation
Report formally released but no monitoring of follow-through
Impact of Regulatory Reviews
Even though the OECD process is voluntary, it…
• represents a ‘focusing event’ for ‘policy entrepreneurs’
• brings together previously unconnected officials within a member state