SOCS3100 Lecture Notes - Lecture 10
SOCS3100
Policy Development, Program Management, and Evaluation
May 16, 2018
WEEK 10
Program Information Systems, Performance Indicators, Technical and Political
Challenges
PART 1: Kettner on Management Information
It is important to build evaluability into a program’s design from the outset.
Issue to be addressed + Defined client group + Hypothesis that intervention X will
produce outcome Y → Design the management information system
1. What questions do I need the system to answer?
a. What are the contracted reporting requirements?
2. What data elements must be included in the system in order to answer the
questions?
3. What types of routine reports do I want the system to generate?
a. What are the contracted reporting requirements?
Begin at the End
– agencies should start thinking about evaluation very early in the program planning
process, with the end result in mind
1. Consider the evaluation context of data collection / aggregation
2. Identify the program questions to be answered
3. Identify the data elements needed
4. Identify a strategy for analysis
5. Prepare a format for regular reports
Take the purpose of the data into account when considering these points.
If it was your community agency, what sort of questions would you like to have answers
for?
EXAMPLE: What types of services (policy instruments) prove most effective in
providing the required assistance to clients? For what kind of clients (demographics)?
What sort of information would we need to collect about a program if we wanted to know
the answer to that question?
What data must we capture to gain that information?
INPUTS → THROUGHPUTS → OUTPUTS → OUTCOMES → IMPACT
Inputs
• Client characteristics at entry to program (i) – to establish a baseline
• Demographic data
o Age
o Gender
o Ethnicity
o Education
o Disability
o Health issues
• Problems/strengths rated 1-5 (ex. through survey)
o Quality and stability of housing
o Stability of employment
o Income
o Budgeting skills
o Access to transport
o Level of safety
o Self-confidence
Overall: ability to live independently and safely
Patterns? – Do problems appear to be associated with any demographic characteristic
(ex. younger age, employment)?
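One way to look for such patterns is a contingency table: cross-tabulate a demographic characteristic against the problem ratings. A minimal sketch in Python, using entirely made-up client records and a hypothetical "housing" rating (all names and values here are illustrative, not from a real program):

```python
from collections import Counter

# Hypothetical client records captured at entry: one demographic
# attribute plus a 1-5 problem/strength rating (illustrative only).
clients = [
    {"age_group": "18-24", "housing_rating": 2},
    {"age_group": "18-24", "housing_rating": 1},
    {"age_group": "25-44", "housing_rating": 4},
    {"age_group": "25-44", "housing_rating": 5},
    {"age_group": "45+",   "housing_rating": 4},
]

def severity(rating):
    """Collapse the 1-5 rating: 1-2 indicates a problem, 3-5 does not."""
    return "problem" if rating <= 2 else "ok"

# Contingency table: (age_group, severity) -> count of clients
table = Counter((c["age_group"], severity(c["housing_rating"]))
                for c in clients)

for (group, sev), n in sorted(table.items()):
    print(f"{group:6} {sev:8} {n}")
```

With real data, a pattern such as younger clients being over-represented in the "problem" cells would suggest the association the question asks about; a formal test (ex. chi-square) would then be needed to rule out chance.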
• Staff characteristics (age, qualifications)
o Education
o Years of experience
o Types of experience
o Ethnicity
o Gender
Patterns? – How do staff education and/or experience correlate with client completions
and client outcomes?
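That correlation question can be answered once the system captures both staff characteristics and per-staff client outcomes. A minimal sketch, with invented numbers for illustration (the data and the 0-1 completion rates are assumptions, not program results):

```python
from statistics import mean

# Hypothetical per-staff-member data: years of experience vs. the
# proportion of that worker's clients who completed the program.
experience = [1, 3, 5, 8, 10]
completion_rate = [0.50, 0.55, 0.70, 0.72, 0.80]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

r = pearson(experience, completion_rate)
print(f"experience vs. completion rate: r = {r:.2f}")
```

A value of r near +1 would indicate that more experienced staff tend to have higher client completion rates; with only a handful of staff, though, any such correlation should be treated cautiously.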
Throughputs (Activities, processes, ex. workshops)
Which intervention leads to best results for clients?
• Service components
o Counselling
▪ Assessment, setting goals, implementing plan
o Groupwork / Case management / Information sessions
o Other forms of support
▪ EX: Meals for seniors – at home, community centres
▪ EX: Transport to job interviews, clothing for job interviews
▪ EX: Assistance with rent
• Units of service provided