What Do State Tests Measure?

Published February 1, 2002

By linking federal resources to school performance, the recently revised Elementary and Secondary Education Act (ESEA) has generated considerable optimism that additional resources will finally produce improved student achievement against state academic standards.

However, pessimists point out that many states still have not met the core accountability requirements of a similar ESEA revision approved by Congress in 1994, yet they have not been sanctioned for that failure.

Worse still, recent evidence from Illinois indicates that student performance on the state-designed test bears no statistically detectable relationship to implementation of the state’s educational standards.

The education bill approved by Congress last year and signed by President George W. Bush in January requires each state to test students in grades 3-8 annually against the state’s own standards. The bill also calls for the reporting of both aggregate and disaggregated test results to parents and the general public, giving parents and taxpayers information on how individual schools are performing in comparison to schools in similar circumstances.

Standards, Tests, Report Cards Questioned

Illinois has state educational standards (Illinois Learning Standards, ILS), state tests (Illinois Standards Achievement Test, ISAT), and school report cards to provide a summary of school performance to the public.

Average ISAT test scores by school are not reported to Illinois parents or the public. The state report cards thus do not allow parents to determine how well their child’s school is performing relative to other schools in the area or across the state.

It’s not only the report cards that are weak. Researchers have been unable to find a statistically significant relationship between implementation of the state standards and student performance on the state test, leading some to wonder whether state test results are meaningful. And the standards themselves have been questioned by at least two organizations, which found them inadequate in several respects.

The ILS, adopted in 1997 as the state’s response to the national standards movement, are intended to define the knowledge and skills all Illinois public school students are expected to possess. The standards were not enthusiastically embraced by all.

A panel of seven experts commissioned by the Illinois Family Institute found the ILS lacking in subject-area content and over-emphasizing the “process” of teaching. In addition, the ILS draft came under criticism from national experts for such lapses as omitting mention of Illinois’ favorite son, Abraham Lincoln, in the social science standards.

In a wide-ranging appraisal of subject standards from 47 states plus the District of Columbia, the Fordham Foundation in 1998 gave only a C- to Illinois standards overall. It rated the June 1997 Draft Illinois Social Science standards as “useless.”

Test Results Difficult to Interpret

The state test that preceded the ISAT, the Illinois Goals Assessment Program (IGAP), was used from 1985 until 1998. The costly switch to the ISAT was justified as a way to align the state test with the state learning standards. As then-State Superintendent Max McGee declared in 1999, when the ISAT debuted, “unless we deliver assessment results in a manner directly related to the standards, the standards will not be effectively implemented.”

IGAP test results were reported annually as actual scores. Newspapers and organizations such as the Illinois Tax Foundation routinely ranked schools by average test scores and per-student spending, giving the public information on how their local school was performing relative to others with similar demographics.

But McGee, contending such rankings were unfair and misleading, eliminated the public reporting of actual scores when ISAT was implemented. He claimed the change would enhance standards implementation throughout Illinois.

While individual student ISAT scores are still provided to parents and local schools, the schools’ average scores are not shared with the public. The published test “results” are simply the percentages of students who exceed, meet, or fail to meet state standards in various subjects, as well as the percentage who score so low as to trigger an academic warning.

For example, the Illinois State Board of Education reported that 74 percent of third-grade students met or exceeded state standards in mathematics in 2001. But how is the public to interpret this result? How many correct answers must a student get to meet state standards, or to exceed state standards? According to ISBE documents, this varies by grade level and by subject.

Measuring Against the Standards

The ISBE Web site provides a separate scoring table for each of the ISAT subtests: reading, writing, mathematics, science, and social science. Each table defines the four performance ratings (exceeds standards, meets standards, below standards, and academic warning) in terms of a range of scores at each grade level. These ranges vary from subtest to subtest and from grade level to grade level. In other words, “meets standards” has a different definition for each grade level and subtest.
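
To make the table mechanics concrete, here is a minimal sketch (in Python, purely for illustration) of how such a lookup works. The “meets standards” cut scores for mathematics in grades 3 and 5 (153 and 158) are the actual figures quoted in the next two paragraphs; the academic-warning and exceeds cut points shown are hypothetical placeholders, since ISBE’s full tables are not reproduced here.

```python
# Minimal sketch of an ISAT scoring-table lookup. Each (subtest, grade)
# pair has its own three cut points on the 120-200 point scale, so the
# same score can earn different ratings in different rows.
# NOTE: only the "meets" cuts (153, 158) come from the ISBE figures
# quoted in this article; the warning and exceeds cuts are hypothetical.
CUT_POINTS = {
    # (warning/below cut, below/meets cut, meets/exceeds cut)
    ("mathematics", 3): (135, 153, 180),
    ("mathematics", 5): (138, 158, 190),
}

def rating(subtest: str, grade: int, score: int) -> str:
    """Return the performance rating for one scale score."""
    warn_cut, meets_cut, exceeds_cut = CUT_POINTS[(subtest, grade)]
    if score >= exceeds_cut:
        return "exceeds standards"
    if score >= meets_cut:
        return "meets standards"
    if score >= warn_cut:
        return "below standards"
    return "academic warning"

# The same score of 155 "meets standards" in grade 3 but is "below
# standards" in grade 5, because each grade carries its own cut points.
print(rating("mathematics", 3, 155))  # meets standards
print(rating("mathematics", 5, 155))  # below standards
```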

For example, a mathematics subtest score of 185 (81.3 percent of the way up the 120-200 point scale) merits an “exceeds standards” rating for a third-grader; a “meets standards” rating for a fifth-grader; an “exceeds standards” rating for an eighth-grader; and a “meets standards” rating for a tenth-grader.

The minimum mathematics score to achieve a “meets standards” rating is 153 (41.3 percent) for third-graders; 158 (47.5 percent) for fifth-graders; 162 (52.5 percent) for eighth-graders; and 158 (47.5 percent) for tenth-graders.
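
The percentages in parentheses follow directly from the 120-200 point scale: a score sits at 100 × (score − 120) / (200 − 120) percent of the way up the range. The snippet below (again Python, purely illustrative) reproduces the quoted figures; note that 153 works out to 41.25 percent, which the text rounds to 41.3.

```python
# Reproduce the percent-of-scale figures quoted above. On the 120-200
# point scale, percent = 100 * (score - 120) / (200 - 120).
SCALE_MIN, SCALE_MAX = 120, 200

def percent_of_scale(score: int) -> float:
    """Express a scale score as a percentage of the 120-200 point range."""
    return 100 * (score - SCALE_MIN) / (SCALE_MAX - SCALE_MIN)

# Minimum mathematics scores for a "meets standards" rating, by grade.
MEETS_CUT_MATH = {3: 153, 5: 158, 8: 162, 10: 158}

for grade, cut in MEETS_CUT_MATH.items():
    print(f"grade {grade:2d}: {cut} -> {percent_of_scale(cut):.2f}% of scale")
# grade  3: 153 -> 41.25% of scale  (rounded to 41.3 in the text)
# grade  5: 158 -> 47.50% of scale
# grade  8: 162 -> 52.50% of scale
# grade 10: 158 -> 47.50% of scale

print(f"185 -> {percent_of_scale(185):.2f}% of scale")  # 81.25, i.e. 81.3
```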

According to ISBE documents, the ranges for each of the performance ratings were established not by statistical analysis but by a “standard-setting procedure,” whereby educators wrote performance definitions for each subtest at each performance level. Those definitions then were “used as the foundation for determining the three cut points (Exceeds/Meets, Meets/Below, and Below/Academic Warning).”

Test Scores and State Standards

The ISBE has been funding a four-year statistical analysis project to examine the relationship between implementation of the Illinois Learning Standards and student performance on the ISAT.

The study’s third-year report, released last August, described how “teachers and principals across the state are using state learning standards to focus and give meaning to their school improvement efforts,” and to shape professional development, curriculum development, and classroom assessment practices. Nevertheless, the report’s conclusions seriously question the value of the ISAT.

“At this time, no significant, statistical relationship can be detected between the changes in ISAT performance and changes in ILS,” the report concluded. In fact, “disentangling the unique contribution of ILS to improving student learning will likely be a near impossibility in a study of this scope and duration,” noted the report’s authors. They proposed another ISBE study to identify the “intervenable” factors–including ILS implementation–associated with the “complex phenomena of student achievement.”

In December 2001, the ISBE issued a six-month, $125,000 consulting contract to research ways to close the testing “achievement gap.” The contract went to former State Superintendent McGee, who oversaw the transition to the ISAT and was directly responsible for the change in reporting ISAT results.


Dawn Earl is director of education policy for the Illinois Family Institute. She has worked as a classroom teacher, curriculum supervisor and administrator, and is a former board member of Community Unit School District 200 in Wheaton, Illinois.