Thanks for this! We say all the time that we’re “following the data” and “evidence-based” without really thinking about whether the data is any good or if it makes any damn sense.
Thanks Peter, this is a really important issue - interesting to compare countries. In England, for example, there is no such focus on Quality Tiers but rather a focus on Effect Sizes (ES) (ignoring most quality criteria) as indicating an impact on learning. The Education Endowment Foundation (EEF) is the dominant body, equivalent to the WWC. Originally the EEF used meta-analyses, copying Hattie's method, but after significant peer-review critique it has moved to focus on individual studies. A look at their report on "Feedback" shows the disparity in the results of studies; that huge disparity is ignored by simply averaging all the effect sizes. There are currently around 150 studies, with about 50 reporting an ES from 0 to -1.5, i.e., feedback has no effect or decreases student achievement! In Australia, Hattie's old Visible Learning (2008) is the dominant evidence base. Any detailed look at the claims from these organizations shows huge differences and contradictions. Throw in the financial conflicts of interest of those promoting products/services (which these days must reference evidence) and we are in a total mess regarding evidence. Not sure what the answer is.
In my family (a science teacher, a retired science teacher, a physics professor, and a retired engineer) we usually sum up "educational data" with the phrase
*Garbage in, garbage out* because most of the evidence is based on garbage data. The problem is that these studies are trying to quantify learning, an inherently qualitative practice. 🤷
The idea that learning is measurable at every stage is so ingrained that I don't know how to get back to the process of personal development.
Demands of schools and funders notwithstanding, few if any of the USED policies of the last several decades of "reform" have had a smidgen of evidence to support them, unless "Will this move privatization forward?" was the research question.
Some have characterized educational evidence, particularly from the gurus Marzano and Hattie, as "Garbage In, Gospel Out".