Lexington, IPI put spotlight on rampant regulatory abuse

Published January 1, 2000

Hiding data, ignoring sound science, “solving” problems that don’t exist–these are just some of the things federal agencies do to help perpetuate their grip on the levers of regulatory power, according to a report just released by the Lexington Institute and the Institute for Policy Innovation (IPI).

Unfortunately, the final year of this millennium saw no let-up in federal regulators' drive to impose their will–no matter how misguided–on ordinary people. Ill-advised schemes emanating from Washington can end up harming the very society they are supposed to protect.

The report, “Out of Control: Ten Case Studies in Regulatory Abuse,” explores the pock-marked landscape of bureaucratic excess. As Oklahoma Senator James Inhofe notes in his introduction to the report, “[M]ost bureaucrats in Washington have very little to no experience in the private sector, so they fail to understand simple business concepts and practices.” They also don’t make good use of the science that is readily at their disposal. The report shows where this potent brew can lead.

Regulating plants as pesticides

Hoover Institution Senior Fellow Henry Miller, one of the world’s foremost authorities on biotechnology, addresses the folly of EPA’s plans to regulate genetically modified plants as if they were pesticides.

Instead of hailing the work of scientists who have made plants more resistant to insects, viruses, bacteria, and fungi, EPA wants to impose a draconian regulatory straitjacket on one of the most promising technologies to come along in years.

“EPA’s decision to regulate an entire class of negligible-risk crop and garden plants genetically improved with new biotechnology has outraged the scientific community and virtually eliminated commercial R&D in this sector,” Miller notes. In October 1998, the prestigious Council on Agricultural Science and Technology, an international consortium of scientific and professional societies, characterized EPA’s approach as “scientifically indefensible,” concluding that treating gene-spliced plants as pesticides would “undermine public confidence in the food supply.”

According to Miller, EPA’s plans violate one of the cardinal principles of regulation: “The degree of oversight of a product or activity should be commensurate with the risk.”

Hiding data and regulating SUVs

In May, EPA proposed tightening tailpipe emission standards for sport-utility vehicles (SUVs) and requiring oil companies to reduce the sulfur content in gasoline. This, the agency assured the public, would save 2,400 lives a year as a result of lower particulate matter emissions from SUVs.

But, as Steve Milloy points out, the 2,400-lives-saved figure is highly suspect, to say the least. The agency's claim, it turns out, is based on a single, EPA-funded study, known as the Pope study. That study, actually a statistical analysis, is unique in that its underlying data have never been made public. EPA used the same study in 1996 to justify its proposed new standards for particulate matter, claiming then that 15,000 lives would be saved annually by its action. In both cases, EPA has refused to give the public access to the data on which the new regulations are based.

Milloy, publisher of the Junk Science Home Page, notes that science is based on the scientific method: “a system for developing and testing scientific theories. The scientific method requires that individual studies be capable of replication–i.e., independent scientists under the same conditions should be able to repeat a study’s findings.” No such independent review of the Pope study has taken place, because EPA and the scientists it funded have refused to release the underlying data to public scrutiny.

Thanks to the efforts of Senator Richard Shelby of Alabama, federal agencies may no longer be able to impose regulations based on “secret science.” Last year, Shelby attached an amendment to a spending bill requiring that taxpayer-funded scientific data be made available to the public under the Freedom of Information Act (FOIA). The new “data access” law is under vicious attack from those who would use secret science to promote their own regulatory agendas.

Ignoring science on chloroform

Science at EPA suffered another setback late last year when the agency’s brass overruled the recommendations of its own scientists and torpedoed a science-based standard for chloroform in drinking water.

As I report in the Lexington/IPI study, the decision will force water system operators to waste precious resources battling fictitious threats resulting from the purification of drinking water.

Chloroform is created when water is chlorinated to remove microbial pathogens. Since trace amounts of chloroform and other so-called disinfection byproducts are an inevitable result of the water purification process, water suppliers in the U.S. have come to see them as posing a far lower risk to public health than the pathogens that would otherwise remain in the drinking water. But by insisting on a chloroform standard of zero (unobtainable in any event), EPA, contrary to its scientists' recommendations, was essentially denying the existence of a threshold below which exposure to a substance poses little or no risk.

Environmental groups had urged EPA to ignore its scientists’ findings. In bowing to that pressure, EPA turned its back on a key provision of the 1996 Safe Drinking Water Act, which directs the agency to use the “best peer-reviewed science.”

The living dead zone and factory farming

Responding to a Congressional mandate, the White House and an inter-agency working group are spearheading efforts to deal with hypoxia where the Mississippi River empties into the Gulf of Mexico. Hypoxia is the name for a low-oxygen area where fish cannot live. It is said to characterize a “dead zone” in the Gulf of Mexico, one created by overzealous Midwestern farmers using too much fertilizer. Or so we are told. But as Dennis Avery, director of global food issues at the Hudson Institute, points out, the so-called “dead zone” is alive and well.

The “dead zone,” he explains, has nothing to do with fertilizer run-off making its way downstream where it suffocates marine life. Rather, it is a natural phenomenon, connected to rainfall patterns in the Mississippi River Valley. When there is a severe drought, as there was in 1988, the hypoxic zone contracts, indeed, almost disappears. By contrast, the great flood of 1993 triggered a dramatic expansion of the hypoxic zone. The return of relatively normal rainfall patterns over the past six years has led to a shrinking of the hypoxic zone.

What some people call a “dead zone” is known to local fishermen as “the Fertile Crescent,” a place where nutrient-rich water from the Mississippi flows into the Gulf. Efforts aimed at reducing the nutrient content of those waters could have a devastating effect on the Gulf fishery. Unfortunately, that’s just what Washington has in mind. Convinced there is a problem to solve, the federal government wants to cut back the use of fertilizer on farms in the Midwest by 20 percent and convert 24 million acres of Midwest farmland into wetlands and forests to absorb more of the nitrogen from the farms.

What will this do? Avery says fish catches in the Gulf will decline and Midwestern farmers will obtain lower corn yields. But as worldwide demand for farm products rises, densely populated countries such as India and Indonesia will have to try to make up the difference. “Fighting a hypoxia problem that does not exist in the Mississippi Valley will thus result in the loss of 60 to 100 million acres of tropical wildlands; a trivial price to pay to protect the egos (and salaries) of our hardworking bureaucrats in Washington,” Avery says.

Environmental degradation will also result from another scheme federal bureaucrats are promoting. EPA has decided that attacks on fish in the Chesapeake Bay by a tiny, nasty marine critter called Pfiesteria piscicida have been brought on by run-off of chicken and horse manure from “confined animal feeding operations,” or CAFOs.

CAFOs, often derisively referred to as “factory farms,” make an appealing target because of their size. But are they the environmental culprits EPA makes them out to be? The Black River watershed in North Carolina, Avery points out, drains the most intensive hog farming area in the United States. But the Black River is still rated “outstanding” in water quality.

Clamping down on CAFOs in favor of free-range chickens and hogs is certain to increase the amount of manure that finds its way into the nation’s rivers and streams, Avery argues. Free-range livestock will require more land, thus reducing wildlands acreage considerably. The losers will be the farmers, the people who eat the food the farmers produce, and the environment.

Shooting, shoveling, and shutting up

This summer, President Bill Clinton and Interior Secretary Bruce Babbitt proudly announced the forthcoming removal of the bald eagle from the Endangered Species List. The bird’s recovery, they said, was a sure sign of how successful the Endangered Species Act (ESA) has been.

But R.J. Smith, senior environmental scholar at the Competitive Enterprise Institute, demolishes this self-serving claim. He points out that the 1972 ban on DDT eliminated the major cause of the eagle’s reproductive problems, paving the way for the bird’s recovery.

In the 26 years since the ESA was enacted, grandiose claims of its success have been as frequent as they have been dishonest. According to Smith, not a single one of the 27 removals from the Endangered Species List can be credited to the ESA. Some species were removed because they were extinct, some because they never should have been on the list in the first place, and some because they recovered thanks to factors unrelated to the ESA.

Instead of protecting species, the ESA has been cynically used “as a cover for cost-free national land-use control,” Smith says. For landowners, having an endangered or threatened species on their land can be ruinous, because they quickly lose the economic use of their property. “By threatening landowners who make room for nature with the uncompensated loss of their land and crops,” Smith observes, “the ESA encourages landowners to get rid of wildlife habitat and sterilize their property. Known as the ‘shoot, shovel, and shut-up syndrome,’ the inevitable result of the law’s punitive nature forces landowners to treat wildlife as a liability.”

EPA occupies the Hudson Valley

In a bid to rid the Upper Hudson River Valley of the “scourge” of PCBs, EPA has decided to convert the area into a giant Superfund site. There are two major problems with this idea: First, PCBs (polychlorinated biphenyls) turn out to be not the problem EPA still insists they are and, second, Superfund creates rather than solves problems.

In March, Renate Kimbrough, the scientist whose work 25 years ago prompted Congress to ban the use of PCBs, revised her findings, saying she can now find no link between exposure to PCBs and cancer in humans. The conclusions she and fellow scientist Martha Doemland reached were published in the peer-reviewed Journal of Occupational and Environmental Medicine; they come just as EPA is planning to remove PCBs from the Hudson River.

To the horror of local residents, the agency wants General Electric to dredge the river of all PCBs. Like any other river, the Hudson flows, and whatever substances are set free during dredging, including PCBs, will be pushed downstream by the current. This opens the door to an ever-expanding Superfund site, in which EPA assumes the role of a quasi-army of occupation in the Hudson Valley.

Cleanup at an average Superfund site takes 10 to 15 years. Thanks to the dredging EPA wants to have done, the Hudson River site could be tied up even longer. Sadly, if EPA would just leave well enough alone, the PCBs would dissipate over time, something that New York State biologists have determined is already happening. But then when did EPA ever leave well enough alone?

The missing link at EPA

Microbiologist David Lewis, who has spent nearly three decades as a career scientist at EPA, has uncovered a startling deficiency in the way the agency assesses the risks various chemicals pose to public health and the environment. That deficiency, he says, is EPA’s failure to pay adequate attention to biology.

Almost all the chemicals regulated by EPA, once introduced into the environment, are transformed by microorganisms living in soil and water into other chemicals with totally different properties, Lewis explains. “In all its thirty years of existence,” he points out, “the agency has never developed a reliable means for predicting how long industrial pollutants will persist in the environment, and what chemicals they will be transformed into by living organisms.”

As Lewis explains it, microbes quickly detoxify some industrial wastes once they enter the environment, but in other cases, they change innocuous wastes into potentially hazardous agents. “Because EPA has expended most of its resources devising and defending regulations aimed at pleasing environmental activists, instead of developing and applying sound science,” Lewis goes on, “regulators do not know what happens when pollutants and microorganisms meet in the real world.”

The agency’s neglect of microbiology seriously undermines whatever scientific pretensions EPA might have in formulating environmental regulations. Thanks to Lewis, we now know that EPA has been flying blind for thirty years.

How ordinary chemicals became pollutants

Whenever environmental issues are discussed, the words “toxic” or “toxic pollutant” invariably surface. But as Hugh Wise, an environmental scientist at EPA, points out, “toxic” and “toxic pollutant” are merely regulatory terms with no real basis in science.

Thanks to the popularity of Rachel Carson’s 1962 best-seller, Silent Spring, the public became accustomed to hearing about “cancer-causing” industrial chemicals. This, Wise explains, “led people to believe that these chemicals are inherently toxic, an implication without scientific merit. At the same time, a basic principle of toxicology went unnoticed: the dose makes the poison.”

Unfortunately, this unscientific use of terms made its way into the nation’s environmental regulatory statutes, beginning with the Clean Water Act (CWA) of 1972. The CWA defined a “pollutant” so broadly as to include almost anything (including sand and rocks) EPA might decide to regulate. What’s more, Wise says, the CWA required EPA to publish a list of chemicals that were to be designated as “toxic pollutants.” When EPA failed to come up with the list on time, the agency was sued in 1975 by the Natural Resources Defense Council (NRDC), the Environmental Defense Fund (EDF), and others.

The suit was settled in a consent decree in 1976, resulting in a list of 126 chemicals that henceforth came to be known at EPA under a variety of regulatory labels, including “priority pollutants,” “toxics,” and “toxic pollutants of concern.”

What’s important, Wise says, is that these designations had nothing to do with the chemicals’ inherent toxicity (the dose makes the poison) but were made to satisfy the requirements of a consent decree. Litigation trumped science, and the nation has been paying for it ever since.

Bonner Cohen is a senior fellow at the Lexington Institute in Arlington, Virginia.