The Test-Based Evidence On New Orleans Charter Schools
Charter schools in New Orleans (NOLA) now serve over four out of five students in the city – the largest market share of any big city in the nation. As of the 2011-12 school year, most of the city’s schools (around 80 percent), charter and regular public, are overseen by the Recovery School District (RSD), a statewide agency created in 2003 to take over low-performing schools; the RSD assumed control of most NOLA schools in Katrina’s aftermath.
Around three-quarters of these RSD schools (50 out of 66) are charters. The remainder of NOLA’s schools are overseen either by the Orleans Parish School Board (which is responsible for 11 charters and six regular public schools, and which holds taxing authority for all parish schools) or by the Louisiana Board of Elementary and Secondary Education (which is directly responsible for three charters, and also supervises the RSD).
New Orleans is often held up as a model for the rapid expansion of charter schools in other urban districts, based on the argument that charter proliferation since 2005-06 has generated rapid improvements in student outcomes. There are two separate claims potentially embedded in this argument. The first is that the city’s schools perform better than they did pre-Katrina. The second is that NOLA’s charters have outperformed the city’s dwindling supply of traditional public schools since the hurricane.
Although I tend strongly toward the viewpoint that whether charter schools “work” is far less important than why – e.g., specific policies and practices – it might nevertheless be useful to quickly address both of the claims above, given all the attention paid to charters in New Orleans.
The claim that NOLA’s schools as a whole are better post-Katrina is exceedingly difficult to substantiate empirically (despite absurd raw comparisons of results before and after the storm). The hurricane resulted in the largest displacement of Americans in well over a century. This pattern of leave-taking and return was not random.
Flooding (i.e., home damage) was worst in areas with the highest concentrations of poverty. The city’s population is now considerably smaller than it was prior to the storm (as is school enrollment), and, while the differences are not all large and are subject to fluctuation, the residents, on average, have changed in terms of characteristics such as education and age. In short, NOLA public school students and their families are almost certainly rather different pre- and post-Katrina, and many of these differences cannot be measured directly.
This severely complicates (but does not necessarily preclude) any pre- versus post-hurricane comparisons of testing (or other) outcomes.
There is, however, one other change that bears mentioning: Schools in New Orleans, charter and regular public, have had a lot more money than they used to. In 2004-05, prior to the hurricane, New Orleans schools spent just under $8,000 per pupil, roughly equivalent to the state average. In 2009-10 (there is a lag in the release of finance data), school site per-pupil spending was reported at over $13,000, approximately 20 percent higher than the state overall. In 2007-08 and 2008-09, it was even higher. (Note that spending varies considerably by school, and that part of this increase, particularly in the earlier years, is due to the cost of charter start-ups and conversions, as well as storm-related government grants to all schools.)*
Again, it’s very tough to say how NOLA schools, charter or regular public, would be performing if they had to serve the same student population that lived in the city pre-Katrina, or how they would have fared with more funding but no storm. Moreover, hypotheticals aside, I cannot say what effect this funding has had on performance, especially given the circumstances. But it is worth noting that there has been an increase in resources, and it’s something to keep an eye on going forward.
Moving on, the second claim – that the city’s charters have outperformed traditional public schools since Katrina – is testable, but the evidence is (understandably) scarce, and not quite as clear-cut as is sometimes implied.
To my knowledge, the only rigorous charter analyses using post-storm New Orleans data have been conducted by CREDO (of “CREDO study” fame). Although the full sets of results were never released to the public (several news stories reported a truncated version of last year’s findings), the folks at New Schools for New Orleans (NSNO), a non-profit organization that assists, opens and advocates for charters in the city, and oversees the CREDO analysis as part of administering an i3 grant in NOLA, were gracious enough to let me take a look at some of the latest results.
Six of the higher-performing charters in the city (as estimated by CREDO) are selective schools (e.g., converted magnets). These cannot really be compared with non-magnet traditional public schools, which take all comers (as NSNO, to its credit, acknowledges).
In keeping with CREDO’s accessible style of presenting results, among the 47 non-selective charters in the latest analysis (which includes data from 2008-2011), 20 are estimated to have generated significantly better testing gains in math; 16 performed better in reading. The rest did discernibly worse (6 schools in math, 9 in reading), or were not statistically different (21 in math, 22 in reading). This breakdown is roughly consistent with the previous round, as reported in the newspapers.
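To put those counts in perspective, here is a quick, back-of-the-envelope tabulation. It is illustrative only, using the figures reported just above (20/6/21 in math, 16/9/22 in reading, out of 47 non-selective charters); it simply converts the counts into shares.

```python
# Illustrative tabulation of the CREDO breakdown summarized above
# (47 non-selective charters, 2008-2011 data). The counts come from
# this post; the script just computes the corresponding shares.

breakdown = {
    "math":    {"better": 20, "worse": 6, "no difference": 21},
    "reading": {"better": 16, "worse": 9, "no difference": 22},
}

for subject, counts in breakdown.items():
    total = sum(counts.values())  # equals 47 in both subjects
    shares = ", ".join(
        f"{label}: {n} ({n / total:.0%})" for label, n in counts.items()
    )
    print(f"{subject} (n={total}) -> {shares}")
```

Run as written, this prints roughly 43 percent of non-selective charters with significantly better math gains (34 percent in reading), with the remainder either indistinguishable from or worse than their comparison schools.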
There was also, as usual, wide variation in the actual effect sizes in both subjects.
So, are NOLA charters outperforming the city’s regular public schools? About half of them appear to be, at least according to CREDO’s test-based estimates, but there are (as always) a few important caveats, most of which are common, but seem especially salient in the New Orleans context.
One caution is that these charters aren’t really being compared with the city’s regular public schools, at least not exclusively (remember that the estimated charter impacts are relative, not absolute). The comparison group in the CREDO study is composed of schools from which any student transferred into NOLA’s charters. It’s not unusual to use district “feeder schools” for these comparisons, but, in NOLA’s case, this includes a bunch of schools outside the city.
In addition, the comparison feeder schools that are within the city represent two different governance structures, with about a third run by Orleans Parish (in 2008-11) and the rest operated by the state-level RSD (the proportions vary over the three-year period).
This means that, overall, the regular public school reference group consists of schools that are both located in different cities and represent different financing, policies, etc. This matching approach, necessitated mostly by the scarcity of NOLA regular public schools, makes it somewhat difficult to interpret the findings in the typical “charter versus district school” manner. (Note that this is probably much less of an issue when comparing individual charters with their counterparts intra-sector – e.g., for scaling decisions.)
Another thing to keep in mind is that these results, like those of most charter analyses that don’t use random assignment (which is rare), cannot fully account for selection bias – e.g., unobserved differences in applicants’ characteristics that are not picked up by the standard set of education variables, including peer effects (for example, charters serve about half the proportion of special education students enrolled in regular public schools).**
Nor can the models address the issue, common in the charter debate nationally but particularly oft-discussed in NOLA, that voluntary and involuntary attrition among charter students might influence the estimates. (Note: There is little evidence regarding the extent [if any] to which selection and attrition affect results from these types of analyses.)
Finally, and most generally, this is, of course, just one analysis, using data from a time period (2008-2011) during which NOLA’s schools, not to mention its residents, were in a state of remarkable flux. Families were still returning to the city and/or getting resituated, the student population was highly mobile, and schools under the jurisdiction of three distinct governance structures were being built or converted to charters (charter coverage grew from 57 percent in 2008 to 78 percent in 2011). Put crudely, there was a lot going on, and it’s difficult to say how this might have affected school and student performance.
At the very least, it would be advisable to wait for at least one or two other independent analyses, using data from a few years into the future, before drawing anything beyond tentative test-based conclusions about the city’s charters en masse, especially vis-a-vis regular public schools (and, of course, there should be additional examinations of non-test outcomes, as in this report). This is always good advice (not always heeded by charter opponents who cite CREDO’s national report), but it’s especially applicable in New Orleans.***
In the meantime, it’s fair to say that many NOLA charters appear to be strong schools that are serving their students well. This should not be dismissed or diminished.
Whether this carries implications for charter schools in general, however, is a complicated issue. To me, once again, that requires one to ask not how many NOLA charters do better, but why – i.e., whether there are concrete policies or other observable characteristics associated with results within the charter sector (NOLA may be particularly well-suited for this type of examination, given the size of its sector).****
One cannot determine with absolute certainty what makes schools, charters or regular public, in NOLA or elsewhere, successful. Policies aren’t randomly assigned to schools, and many important factors, such as culture and leadership, elude easy measurement. But the implications of even tentative findings are potentially important – to all schools.
- Matt Di Carlo
*****
* Temporary funds, such as storm-related grants and charter conversion costs, are likely the primary reasons why spending went down between 2007-08 and 2009-10, after peaking at around $16,000 per student in 2007-08. The level is still considerably higher than the state average, although it may decline a little further (e.g., as grants run out and fewer schools are in the start-up phase). It’s tough to say what spending would have looked like without Katrina, but it’s a decent bet it would have been lower than it is. It’s also worth mentioning that these spending figures are almost certainly inaccurate or incomplete in many cases, due to inconsistency in reporting between the different entities overseeing and operating schools, as well as the difficulty in tracking private donations.
** Some critics (and scholars) contend that the charter movement in NOLA purposefully promoted segregation and other exclusionary practices after Katrina, with the overriding goal of privatizing the city’s public school system. I cannot address these claims (especially those ascribing motives), but it is worth noting that the city’s charters are disproportionately located in areas with lower minority populations and, although most are open to all city residents, proximity is often the dominant issue when parents choose schools, even at the expense of performance. On the other hand, the proportion of black students in non-selective charters is roughly equivalent to that in regular public schools, and there are, unsurprisingly (given CREDO’s methods), no discernible associations, bi- or multivariate, between school-level student characteristics and effect estimates (though these are all very crude assessments of the issues).
*** For example, seven of the 20 non-selective charters with a discernible positive math and/or reading effect in the latest round of results had no such impact in the last round (as reported in newspapers). Such discrepancies are to be expected in these studies, due to both error and “real” performance changes, but they do suggest the need for caution in drawing conclusions about charters, whether as a whole or specific schools (especially when effect sizes are smaller). In addition, for several of the more recently opened schools, estimates are based on only one or two years of data.
**** Depending on data availability, I may try to take a quick look at this in a future post.