Guest post: When K12 Leaders Make Mistakes, Who Takes Note?
Steve Rees on why you ain't gonna learn what you don't want to know.
Steve Rees is an education polymath. I have vintage pictures he took of Jerry Garcia and Allen Ginsberg in 1960s California. His photography appeared in the recent Bill Graham retrospective and in documentaries about the Grateful Dead. Steve once gave me a Warlocks poster. He’s also a fine fly fisherman. And he’s been a leader on education data – how to use it, how to think about it, how to communicate about it – long before data took on its current prominence in education.
Today, a guest post by Steve about the new book on data he wrote with Jill Wynns, Mismeasuring Schools’ Vital Signs. Here’s Steve, enjoy:
I’ll confess: mistakes, errors, and miscalculations fascinate me. I find they hold rich opportunities to learn. The errors made by district and site leaders – their origins and the lessons they offer – became the heart of a book I wrote during the pandemic with my friend Jill Wynns: Mismeasuring Schools’ Vital Signs. Her 24 years as a school board leader in San Francisco and my 23 years working with more than 240 California districts gave us a wealth of stories to draw from.
Some of these stories come from my reading many dozens of California school districts’ plans and school site annual plans. They are artifacts of human judgment, and they are filled with weak evidence and broken logic: test results misinterpreted; minuscule changes in grad rates mistaken as consequential; teaching staff experience and attrition ignored; the progress of emerging bilingual students mismeasured. This mess supports John Hattie’s question in Visible Learning:
“The key question is whether teaching can shift from an immature to a mature profession, from opinions to evidence, from subjective judgments and personal contact to critique of judgments.”
Disregarding signals of misteaching
I’ve worked with districts where 30 percent of third-grade students lag in reading by a year or more by spring. Yet no one dares ask whether this is the result of how they teach reading to everyone. Could teachers and the district’s curriculum and instruction leaders have selected the wrong instructional materials?
I’ve worked with districts, as an analytic partner, where seven graduating cohorts in a row start out strong in math in third grade, but when they hit sixth grade, progress stops. Middle school math results go flat as a pancake (negligible scale-score growth from grade 6 to 7 to 8). Principals see the pattern (good) but continue teaching the same way with the same instructional materials the next year (bad).
California Dept. of Education warps the evidence
I’ve seen California’s official measuring tool for school and district accountability, the Dashboard, taken as the gospel truth. Yet it is riddled with six types of errors, which combine to make it a festival of false positives and false negatives. It reaches wrong conclusions about academic achievement. It mismeasures gaps. It often classifies schools and districts into color zones that are fundamentally incorrect. It ignores imprecision inherent in test results. It still omits growth measures. Although the Dashboard is the butt of jokes at conferences, the California Department of Education (CDE) requires that it be the primary source of evidence for all schools’ and districts’ annual plans. The result: money and time are being wasted. Planning becomes a charade. At charter renewal hearings, it is sometimes the justification for denying reauthorization to schools that shine. (See this scary example from Los Angeles.)
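To make the imprecision problem concrete, consider a toy simulation. The cut points, noise level, and school size below are invented for illustration – this is not the Dashboard’s actual methodology – but the mechanism is general: classify schools into color zones by a point estimate, ignore the sampling error in that estimate, and identical schools will scatter across zones by chance alone.

```python
import random
import statistics

# Toy illustration only: the cut points, noise level, and school size
# below are invented, not the CDE Dashboard's actual methodology.
# Every simulated school has the SAME true performance; the only
# variation is ordinary sampling noise in its students' test scores.

random.seed(42)

TRUE_MEAN = 0.0     # true average distance from standard, identical for all
STUDENT_SD = 25.0   # assumed spread of individual student scale scores
N_STUDENTS = 60     # students tested per school (a small school)

def observed_school_mean():
    """One school's measured average: true mean plus sampling noise."""
    scores = [random.gauss(TRUE_MEAN, STUDENT_SD) for _ in range(N_STUDENTS)]
    return statistics.fmean(scores)

def color_zone(mean_score):
    """Classify a school by its observed mean, ignoring imprecision."""
    if mean_score < -5.0:
        return "Red"
    if mean_score < 5.0:
        return "Yellow"
    return "Green"

counts = {"Red": 0, "Yellow": 0, "Green": 0}
for _ in range(10_000):
    counts[color_zone(observed_school_mean())] += 1

# All 10,000 schools are truly "Yellow," yet roughly 1 in 8 lands in
# Red or Green purely by chance (standard error ~ 25/sqrt(60) ~ 3.2).
print(counts)
```

The exact numbers don’t matter. Any color scheme keyed to a point estimate with no confidence band will misclassify a predictable share of schools – and the smaller the school, the noisier the estimate and the worse the problem.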
Why aren’t these errors learning opportunities?
When practitioners in other professions make mistakes, the mistakes are often noted. In hospitals, when patients are harmed by a surgical team’s error, it is documented. It may be studied by quality control teams. If the error was the result of misfeasance, it may become the subject of an internal review or a case study. If it was negligence, it may lead to litigation.
When errors aren’t visible, however, the human tendency to disguise them may prevail. In schools, I believe this happens all too often. We have data galore documenting the misclassification of English learners, the failure to screen for dyslexia, and the misteaching of reading. Yet the data isn’t really understood. Why? Perhaps one factor is that the data hasn’t been built into solid evidence and put into the hands of people ready to bring it to light and, if necessary, raise a ruckus.
Why does it take public-interest lawyers like Mark Rosenbaum suing California school districts like Berkeley USD to get K12 leaders and school board trustees to consider that they have adopted the wrong reading curriculum? Will the absence of dyslexia screening in 11 states, including California, soon be seen as a mistake, or will it take litigation to force the question?
If you share my hope that K12 leaders can eventually learn from their mistakes without the help of lawyers, pick up our book, Mismeasuring Schools’ Vital Signs. It will help sharpen your b.s. detector and encourage you to bring errors into the light of day.
Faculty can request a review copy.
Steve Rees is the founder of School Wise Press, and currently leads their K12 Measures team. He welcomes your comments by email.