
Are Bad Schools Immortal?

David A. Stuit – December 6, 2010

Are bad schools immortal? Based on the pioneering analysis in these pages, it would seem so, at least for most such schools most of the time. About three-quarters of them stay open—and stay bad, certainly when judged by the meager (bottom quartile) proficiency levels that their pupils attain.

Even more troubling, this glum track record is nearly as weak in the charter-school sector as in the district sector, despite the acclaimed charter-movement doctrine that “bad schools don’t last—either they improve or they close.”

Would that it were so. Yet 72 percent of the original low-performing charter schools examined in this study were still operating, and still low-performing, five years later, compared with 80 percent of district schools. In other words, very few schools picked themselves up, rolled up their sleeves, and “turned around,” lifting their achievement above the state average. Bona fide turnarounds were rare: just 1.4 percent of district schools and less than 1 percent of the charters earned that accolade.

We must, however, register three disclaimers. First and most obvious, analyst David Stuit did not—could not, talented though he is—actually examine eternity, and thus we cannot truly speak of immortality. He tracked 2,000-plus low-performing public schools (1,768 of them district-operated, 257 of them charters) in ten states from 2003-04 through 2008-09. It’s possible, even likely, that by spring 2010 at least a few more of them had improved or closed, and that this process is continuing. (It’s just as possible, of course, that some schools in Stuit’s larger sample that were not low-performing in the base years of his analysis could later have slipped down into that category.)

Second, we’re tough graders. To be deemed a turnaround, a school in its state’s lowest decile (i.e., proficiency at or below the 10th percentile) at the beginning of the period had to surpass the 50th percentile within five years. That means a school might have made substantial progress (e.g., 2nd to 50th percentile) yet not qualify as turned-around.

Third, this analysis relies on absolute proficiency scores on state tests (variable as these tests and proficiency definitions are) to judge school performance. Stuit did not—again, for the most part could not—undertake “value added” analysis. We may fairly surmise that some of these schools are adding considerable academic value to significant numbers of children even as they remain well below average, compared with other schools in their states, in getting kids to “proficiency.”

Still and all, the picture is not pretty. We find in these results two large takeaways that policy makers and educators should ponder: