The Relatively Unexplored Frontier of Charter School Finance
Do charter schools do more – get better results – with less? Ask this question and you’ll probably get strong answers, ranging from an emphatic yes to an emphatic no, often depending on the person’s overall view of charter schools. The reality, however, is that we really don’t know.
In fact, despite coverage that often runs well ahead of the evidence, researchers don’t even have a good handle on how much charter schools spend, to say nothing of whether the way they spend their money, and how much of it, leads to better outcomes. Reporting of charter financial data is incomplete, imprecise and inconsistent. It is also difficult to disentangle the financial relationships between charter management organizations (CMOs) and the schools they run, as well as those between charter schools and their “host” districts.
A new report published by the National Education Policy Center, with support from the Shanker Institute and the Great Lakes Center for Education Research and Practice, examines spending between 2008 and 2010 among charter schools run by major CMOs in three states – New York, Texas and Ohio. The results suggest that relative charter spending in these states, like test-based charter performance overall, varies widely. Perhaps more importantly, the findings also make clear that significant barriers remain to accurate spending comparisons between charter and regular public schools, barriers that severely hinder rigorous efforts to examine these schools’ cost-effectiveness.
The analysis, by Bruce Baker (Rutgers University), Ken Libby and Kathryn Wiley (University of Colorado Boulder), uses data from government (and/or charter authorizer) sources as well as the tax filings (IRS Form 990) of the CMOs that operate charter schools. Because these data must be gathered manually, the authors limit their analysis to a set of CMOs, many of them well-known organizations that operate schools in multiple states.
As is the case when comparing charter and regular public school achievement results, spending contrasts must account for differences in context and enrollment. This analysis therefore controls for observable factors, such as student characteristics (e.g., special education), grade range and school size, that influence spending.
Unsurprisingly, charter spending relative to that of comparable regular public schools varies widely in all three states, as does, unfortunately, the confidence one can have in the accuracy of the estimates (as gauged by comparing the figures in the government/authorizer reports with those in the CMOs’ tax filings).
In Ohio, for example, charters appear to consistently spend less – in some cases, substantially less – than comparable district schools. But the extent of this discrepancy remains an open question, due to inconsistencies between the different data sources.
In Texas, the charter/regular public school spending differences are very much mixed, with some charter chains spending more than comparable regular public schools, and others spending less.
Finally, the authors express the highest level of confidence in their estimates for New York, where the IRS and government/authorizer sources largely match up. The findings indicate that New York charter spending is higher than that of comparable district schools for virtually every CMO included in the analysis. The differences range from a few hundred to several thousand dollars per pupil, with the largest gaps (equivalent to around 20-30 percent) found among schools run by KIPP, Uncommon Schools and Achievement First.
Obviously, these results are not easily generalized to the nation – the analysis includes only a selection of (mostly well-established) CMOs, operating in just three states. And, again, significant questions remain about the accuracy of the data, especially in Ohio and Texas.
That said, it is safe to say that charter spending relative to that of comparable district schools is rather mixed. Some charters spend more, others spend less, and this varies within and between CMOs, states and districts.
There is, however, one consistent finding from this analysis: KIPP schools appear to spend more than neighboring schools, though the differences are not always large and vary by location and school characteristics (e.g., grade range served). Most notably, KIPP schools in New York spend around 30 percent more than district schools, while in Texas the differences are all positive but range from small in some cities to larger in others. These results for KIPP are consistent with previous research.
But, again, in many respects the major conclusion of this analysis is that the scarcity and apparent inconsistency of data in most places largely preclude any rigorous comparison of charter and regular public schools’ spending. The situation is much better in New York State than in Ohio and Texas, but all three are far ahead of the rest of the nation. In most other states, even the preliminary analysis presented in this report is simply not possible.
This is a critical problem. Governance structure – whether a school is or is not a charter school – does not by itself influence results. Schools’ effectiveness varies not by what they are, but rather by what they do. Schools like KIPP seem to get strong results, but they also tend to spend a lot more. They provide costly services, such as substantially extended school time and tutoring, and so their results may come at significantly greater cost.
To even begin to figure this out, one must know how much these schools spend and, preferably, where that money goes (how schools/districts spend money matters as much as how much they spend).
This report is a step in that direction, but progress will be limited until states and districts make deliberate efforts to improve the collection and reporting of charter school financial data. In the meantime, the common claim that charter schools “do more with less” remains little more than speculation.