Curmudgucation: Why Do State Report Cards Stink?
Morgan Polikoff (USC Rossier School of Education, FutureEd) and some folks at the reformy Center on Reinventing Public Education, along with the Data Quality Campaign, wanted to put together a report on what state report card sites had to say about pandemic learning loss trajectories.
What they found is what lots of us could have told them-- the state school report card sites kind of stink.
The report itself is pretty brief. Do state sites provide longitudinal data? Only a few provide it, and not in any easily usable form. Most commonly they provide Big Standardized Test data, graduation rates, and a few other odds and ends--again, not always easy to find.
And "Overall, state report cards were remarkably difficult to use." Sometimes technical issues. Sometimes too much data in unwieldy format, and some just damned near impossible to navigate. I'd add to the list sites with a whole lot of edu-jargon that parents will need to translate. Add on top of that that most of the sites are hard to locate in the first place. In the process of writing about education for over a decade, I have often gone looking for information about particular schools, and not once has a search engine directed me to a state's report card site.
In his frustrated take on this adventure, Polikoff asks the right questions. For instance, "who is the intended user?" Is there an audience for these sites? One theory, favored by some reformy types, is that parents trying to pick a school will head to these sites to shop. But most of the information that a parent would want is just not there at all, and maybe some folks should finally give up on the dream that parents will choose schools based on Big Standardized Test scores (and not sports programs or location or who else has kids going there).
It's that same childlike faith that transparency and data will drive the education marketplace toward excellence--a faith that is doomed because A) excellence in education defies transparent data collection (BS Test results are not it) and B) that's not how the marketplace works, anyway.
I'm not sure there is any audience for these sites at all. It's the kind of thing I think of as a library publication--something that records information that needs to be stored somewhere, because it's important and the odd researcher or historian may want it at some point. Like the big 19th century history of your town, or your family genealogy, or a book of instructions for household plumbing repair. It doesn't have an audience in the usual sense of the word, but it's information you put somewhere just in case someone needs it.
If there is an audience for these sites, it would probably be some federal regulatory office that gave states the impression that they would be held accountable for some assortment of these data. So like a lesson plan-- somebody told you you have to do it, but that doesn't mean they (or you) are going to look at it. Perhaps a state could use this information to actually direct assistance to schools, and certainly some states have used public school performance data to target those schools for privatization. But do either of those processes require an actual state report card website?
Is there an audience at all? I checked Pennsylvania's Future Ready site on a traffic-checking site and found that it has averaged 737 visits a month since April. Florida, a more volatile education state, shows around 950 per month during that time (and 4% of the traffic is from India). Now maybe if we drilled down into pages within the site, we would find better results. But should state functionaries be putting in much effort for that kind of traffic? Or should they be trying to drive traffic there to justify its existence?
So Polikoff's last question, based on an observation made by some members of his committee:
Are state report cards doomed to be compliance exercises?
Well, yes. Yes, they are. Compliance exercises are the special hallmark of state governments, especially in areas like education where politics demands answers but actual meaningful answers are hard to come by. And as Rick Hess has observed, while it may be easy to make someone do something, it's hard to make them do it well. Particularly when it's unclear why you're doing it.
This blog post has been shared by permission from the author.
The views expressed by the blogger are not necessarily those of NEPC.