EDB172 Lecture Notes - Lecture 4: Joel Klein, Triage, Semantic Integration
Datafication: Understanding Large and Big Data in Education
Greg Thompson
• The process through which educational processes are transformed into numbers that allow
measurement, comparison and centralised intervention
• Judgements of how teachers, principals and schools are going and whether they need help
• Accountability: what does it mean, what does it look like, who is it that we can hold to account
within these systems?
• Large vs. Big Data:
• Big data: your devices generate data that can be analysed continuously, e.g. a phone
continually updating movement (velocity); massive amounts can be processed at once (scale)
• Large data:
o Joel Klein (chancellor of NY schools) used test results to bring schools to account
o Hold teachers and schools to account with test results
o NAPLAN: two competing aims
• Tied to the curriculum; arguably if teaching to the test, are teaching aspects of the
curriculum
• Does it constitute high stakes or low stakes assessment?
• Fabrication of results; participation rates may be pushed down so that results go up
• The problem with data is always Campbell's Law: "the more any quantitative social indicator is
used in social decision-making, the more subject it will be to corruption pressures and the more
apt it will be to distort and corrupt the social processes it was intended to measure"
i.e. the higher the stakes placed on any one data source, the more likely that measure
becomes corrupted, because people try to make the measure work in their favour
• Policy and unintended consequences
o While policies often come from a good place (wanting students to learn and be more
literate and numerate, etc.), the problem is that they use fairly blunt measures and
produce unintended problems as a result
o Narrowed curriculum/pedagogic focus
o 'Targeted' teaching or education triage? = effort focuses on students sitting just
below the benchmark, while students right at the bottom are judged unlikely to
improve, so help for them is held off