How Promising Is Accountability?

Published September 1, 2001

As congressional conference committee negotiations continue over President George W. Bush’s “No Child Left Behind” education initiative, a critical element of the plan is the expansion of existing state accountability mechanisms to measure students’ academic progress.

However, as California’s experience with its Academic Performance Index has shown, when state-level systems themselves are flawed, the result is dubious accountability that rewards failure as well as success.

“We need to set clear goals for performance and demand that our schools get the job done,” Secretary of Education Roderick Paige declared in testimony before the U.S. Senate earlier this year. But if existing state accountability mechanisms do not set clear goals, the prospects for real improvements in student performance become increasingly doubtful.

California Rewards Bad Schools

Under the Governor’s Performance Awards program, California offers bonus funding as an incentive for schools to improve. But in practice, some of the largest “performance awards” have gone to badly performing schools–even to schools that fell to the state’s worst rating after placing in the second-to-worst category the year before.

The index, in place since 1999, is based on the performance of students in grades 2-11 on the Stanford 9/STAR test. Scores fall on a scale from a low of 200 to a high of 1,000. The state also ranks schools against one another on a 10-point decile scale–one being the worst rank and 10 the best. Using a formula developed by the California Department of Education, schools are then assigned indexed “growth targets” to meet the following year.

The program made $96 million available for schools that met or exceeded their growth targets in the first year. Another $96 million from the state’s Immediate Intervention/Underperforming Schools program was made available for schools that did not meet their growth targets. Funding under this latter program comes with the stipulation that, should schools continue to fail to meet their targets or fail to demonstrate significant growth, they “may eventually be subject to state sanctions.”

In the program’s first year, 4,502 schools–over two-thirds of schools in the state–received Governor’s Performance Awards, which typically ranged from $20,000 to $50,000. Some schools received upwards of $170,000.

“We set the bar higher for every school by holding each one accountable for the only thing that really matters: improved student achievement,” said Governor Gray Davis in his January 2000 State of the State address. “And we focused like a laser on the gateway skill: reading.”

But a number of Governor’s Performance Awards were given to schools that in fact had dropped to a lower decile–and in some cases to the lowest decile–from one year to the next.

Fun with Numbers

What allows these schools to be cast in a more positive light by state officials is that the Performance Awards also reward growth relative to what the state deems to be “similar schools,” as well as growth among certain “numerically significant” ethnic or demographic subgroups within a school.

For example, in Santa Clara County’s Alum Rock Union Elementary School District, Harry Slonaker Elementary slipped from the second-lowest decile among all schools statewide in 1999 to the lowest in 2000. Ranked against “similar” schools, it scored a 2 in both years, next to the bottom. As a reward for such dubious “achievement,” Slonaker was granted a $43,247 Governor’s Performance Award for 2000-2001.

The two schools that received that district’s highest Governor’s Performance Awards, Cesar Chavez Elementary and Lee Mathson Middle School, remained in the lowest-performing category statewide for both years. But Mathson received $48,693 and Chavez was awarded $47,236 in the program’s first apportionment.

Several other important questions have been raised about California’s Academic Performance Index. A 2000 study by California Parents for Educational Choice (CAPE), for example, pointed out that some school districts inappropriately distributed advance copies of test questions or excluded large numbers of students from different test sections. The San Francisco Chronicle reported that much of the test-score increase reported by the San Francisco Unified School District could be attributed to the exclusion of growing numbers of students from testing, whose scores therefore never counted in school totals.

Don Soifer is executive vice president of the Lexington Institute. His email address is [email protected].