A January 2006 report from the National Center for the Study of Privatization in Education has stoked a debate about the veracity of the claim that private school students outperform their public school counterparts.
Using figures from the math component of the 2003 National Assessment of Educational Progress (NAEP), commonly referred to as the Nation’s Report Card, University of Illinois at Urbana-Champaign researchers Christopher Lubienski and Sarah Theule Lubienski compared the performance of fourth- and eighth-grade private school students with that of their public school peers.
“Researchers and policymakers have known for some time that private schools have higher raw scores on these types of tests,” Chris Lubienski said about the results’ significance, “but we also know that they serve, on average, more advantaged populations with characteristics that are already associated with higher test scores. We had rather comprehensive data and methods available to account for those differences in populations served in public, private, and charter schools. Once those differences are considered, they more than explain the differences in test scores.”
Questioning Study’s Purpose
Sister Dale McDonald, director of public policy and educational research at the National Catholic Educational Association (NCEA), was skeptical of the study’s value.
“What’s the purpose?” McDonald asked. “They are looking at one subject and one test, and trying to draw generalizations. Most likely, they are trying to figure out how well a school is doing based on isolated factors, and we don’t believe those isolated factors help us better understand our schools.”
The Lubienskis’ study, “Charter, Private, Public Schools and Academic Achievement: New Evidence from NAEP Mathematics Data,” was funded through the National Center for Education Statistics (NCES), which contracted the analysis out to the Lubienskis to maintain its scientific credibility.
Neutralizing ‘Private School Effect’
The Lubienskis–who controlled for socioeconomic status, ethnicity, gender, disability, English proficiency, and school location, and then grouped students with similar characteristics–compared performance by students in Catholic, Lutheran, conservative Christian, other private schools, and charter schools to average public school achievement.
According to their findings, public schools significantly outscored Catholic schools in both fourth and eighth grade, and Lutheran schools outperformed all other private school types. Charter schools performed slightly lower than public schools at the fourth-grade level, but slightly higher at the eighth-grade level.
Chris Lubienski pointed out the findings have limitations but indicate the issue needs further research and attention.
“We think the data and analysis are significant enough to point researchers and policymakers to the need to check their assumptions on this issue,” Chris Lubienski said. “Also, we think this points to the need for further study using a variety of datasets and methodological approaches, also looking at different subjects and grade levels.”
The Lubienskis’ methodology has drawn criticism from the private school community. The Lubienskis say they used Hierarchical Linear Modeling (HLM) to analyze “nested,” or multilevel, data. HLM allows researchers to examine individuals within an organizational unit that itself requires analysis–in this case, the individuals are students and the organizations are schools. The Lubienskis then made comparisons across demographic groups. Critics contend the entire process has inherent limitations.
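The basic statistical move at issue–adjusting a raw score gap for differences in the populations each sector serves–can be shown with a toy simulation. This is not the Lubienskis’ actual HLM specification; it is a simplified covariate-adjustment sketch with synthetic data, and all numbers and variable names are illustrative only.

```python
# Toy illustration (not the Lubienskis' actual HLM model) of covariate
# adjustment: a raw score gap between school sectors can disappear once
# a student-level covariate such as SES is controlled for. All data are
# synthetic; the parameters are chosen only to make the mechanics visible.
import numpy as np

rng = np.random.default_rng(42)
n = 5000
private = rng.random(n) < 0.3                      # ~30% "private sector" students
# Private-sector students are drawn from a higher-SES distribution:
ses = np.where(private, rng.normal(0.6, 1.0, n), rng.normal(-0.1, 1.0, n))
# True data-generating model: score depends on SES only -- no genuine
# sector effect is built in.
score = 235 + 10 * ses + rng.normal(0, 15, n)

# Raw gap: the simple difference in sector means (reflects SES, not sector).
raw_gap = score[private].mean() - score[~private].mean()

# Adjusted gap: OLS with an intercept, SES, and a sector indicator; the
# sector coefficient is the gap *after* accounting for SES.
X = np.column_stack([np.ones(n), ses, private.astype(float)])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
adjusted_gap = beta[2]

print(f"raw gap: {raw_gap:.1f} points, SES-adjusted gap: {adjusted_gap:.1f} points")
```

In this simulation the raw gap is several points in the private sector’s favor, while the adjusted gap is statistically indistinguishable from zero–the pattern the Lubienskis report, and also the pattern McTighe’s “neutralize the advantage” criticism targets, since the adjustment works the same way on any two unequal populations.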
“The same procedure could be carried out with any two sets of data: public suburban schools and public urban schools, crop yields in Iowa and Kansas,” said Joe McTighe, executive director of the Council for American Private Education. “The approach would be the same: Neutralize the observed advantage of Set A and then go on to demonstrate that, absent the advantage, Set A is no better off than Set B.”
McDonald agreed, noting the theoretical nature of the study.
“We are not greatly concerned with the Lubienski study,” McDonald said. “It’s an academic exercise in which certain factors were considered and certain factors were removed–you could just as easily arrive at different conclusions with the same data.”
NAEP data, which in 2003 included more than 340,000 students in 13,000 schools, is presented as a snapshot by the National Center for Education Statistics, a nonpartisan, nonpolitical federal research group. Researchers, policymakers, and educators all use NAEP results to track performance trends.
The Lubienskis acknowledged their study–which, like the NAEP data it draws from, is cross-sectional rather than longitudinal–cannot support broad conclusions. They believe they have identified gaps in achievement at a point in time, not gaps in growth: a statistic experts such as McDonald consider more important.
“Although we cannot and do not make causal claims from cross-sectional studies such as NAEP, with the raw data we can account for the primary possible confounding variables that could explain differences in achievement between schools, making it less likely that longitudinal data would tell a different story,” Sarah Theule Lubienski said.
McDonald said that for parents and school leaders, the Lubienski study does not hold much relevance.
“The report from the Lubienskis does not address factors like school climate or location, for instance, that we think are important,” McDonald explained. “Parents do not choose schools based on NAEP performance–there is a ‘so what’ factor about isolating NAEP scores at all.”
Consider Many Factors
Parents consider a variety of factors such as safety, teacher quality, academic strength, and religious values when choosing schools for their children, McDonald said. Performance on tests like NAEP might not play a role at all for some parents, she noted.
McTighe added that the Lubienski study is too hypothetical, given the variables of actual school settings.
“Children do not attend statistically modeled classrooms in computers,” McTighe said. “They attend real classrooms in real neighborhoods with real classmates and real teachers. You can’t go into a real class and reconstruct it by excluding certain factors. It is what it is.”
While the Lubienskis admit causal relationships between their findings and the quality of private, charter, and religiously affiliated schools cannot be made, they contend policymakers eager to bring market-based reforms to education should reexamine their assumptions.
“We clearly show that policymakers should not assume that private schools are more effective simply because private school average achievement tends to be higher than that of public schools,” Sarah Theule Lubienski said. “What our study shows is that the achievement gap that typically favors private schools washes away when one accounts for demographic differences.”
While any long-term implications of the Lubienski study for school choice cannot yet be known, the NCEA and many other Catholic organizations continue to support school choice efforts nationwide.
“The NCEA holds that parents are the principal educators of their children,” McDonald said. “They have the right to determine the kind of education that they want for their children. As a Catholic organization, we believe that being poor should not limit that right–be it to choose public, charter, or private schools, through voucher, scholarship, or tax credit programs.”
Kate McGreevy ([email protected]) is a freelance education writer living in New Mexico. She formerly worked with the Cesar Chavez Public Charter Schools for Public Policy in Washington, D.C.
For more information …
“Charter, Private, Public Schools and Academic Achievement: New Evidence from NAEP Mathematics Data” is available online at http://www.ncspe.org/publications_files/OP111.pdf.
For more information on Catholic schools, visit the National Catholic Educational Association Web site at http://www.ncea.org.
For more on private schools, see the Council for American Private Education Web site at http://www.capenet.org.