As part of his climate change initiative announced in June, President Obama declared, “Today I’m calling for an end of public financing for new coal plants overseas unless they deploy carbon capture technologies, or there’s no other viable way for the poorest countries to generate electricity.” Restrictions on financing will reduce the supply and increase the cost of electrical power in developing nations, prolonging global poverty.
The World Bank has followed President Obama’s lead, announcing a policy shift in July and stating that it will provide “financial support for greenfield coal power generation projects only in rare circumstances.” For decades, the bank has provided hundreds of millions of dollars in funding to coal-fired projects throughout the developing world.
Also in July, the Export-Import Bank denied financing for the proposed Thai Binh Two coal-fired power plant in Vietnam after “careful environmental review.” While 98 percent of the population of Vietnam has access to electricity, the Vietnamese consume only about 1,100 kilowatt-hours per person per year, about one-twelfth of United States usage. Electricity consumption grew 34 percent in Vietnam from 2008 to 2011. The nation needs more power and international funds for coal-fired power projects. But Western ideologues are trying to prevent Vietnam from using coal.
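The per-capita figures above can be sanity-checked with quick arithmetic. This is a sketch using only the article’s own numbers; the implied U.S. figure is derived from the stated one-twelfth ratio, not an official statistic, and the 2008 estimate treats the 34 percent growth as if it applied per capita.

```python
# Per-capita electricity use implied by the article's figures.
vietnam_kwh_per_person = 1_100   # kWh per person per year (article)
us_multiple = 12                 # "about one-twelfth of United States usage"

implied_us_kwh = vietnam_kwh_per_person * us_multiple
print(implied_us_kwh)            # 13200 kWh per person per year (implied)

# Rough 2008 level implied by 34% growth from 2008 to 2011,
# treating the per-capita figure as a proxy for total consumption:
usage_2008 = vietnam_kwh_per_person / 1.34
print(round(usage_2008))         # roughly 821 kWh per person per year
```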
The President, the World Bank, and the Export-Import Bank have accepted the ideology of Climatism, the belief that mankind is causing dangerous climate change. By restricting loans to poor nations, they hope to stop the planet from warming. But what is certain is that their new policies will raise the cost of electricity in poor nations and prolong global poverty.
In most markets, coal is the lowest-cost fuel for producing electricity. According to the International Energy Agency (IEA), world coal and peat usage increased from 24.6 percent of the world’s primary energy supply in 1973 to 28.8 percent of supply in 2011. By comparison, electricity generated from wind and solar sources supplied less than 1 percent of global needs in 2011.
The cost of electricity from natural gas rivals that of coal in the United States, thanks to the hydrofracturing revolution. But natural gas remains a regional fuel. Natural gas prices in Europe are double those in the US and prices in Japan are triple US prices. Until the fracking revolution spreads across the world, the lowest cost fuel for electricity remains coal.
Despite our President’s endorsement, carbon capture technologies are far from a proven solution for electrical power. According to the US Department of Energy, carbon capture adds 70 percent to the cost of electricity. In addition, huge quantities of captured carbon dioxide must be transported and stored underground, adding further cost. No utilities currently use carbon capture on a commercial scale.
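To illustrate what the Department of Energy’s 70 percent figure means for a household bill, here is a minimal sketch. The baseline cost of 10 cents per kilowatt-hour and the annual usage of 11,000 kWh are illustrative assumptions, not figures from the article:

```python
# Illustrative effect of a 70% carbon-capture cost adder on electricity costs.
baseline_cents_per_kwh = 10.0    # assumed baseline coal-fired cost (illustrative)
ccs_adder = 0.70                 # DOE: carbon capture adds 70% to electricity cost

with_ccs = baseline_cents_per_kwh * (1 + ccs_adder)
print(with_ccs)                  # 17.0 cents per kWh, before pipeline and storage costs

# Annual impact for a household using 11,000 kWh (illustrative usage level):
annual_kwh = 11_000
extra_cost_dollars = annual_kwh * baseline_cents_per_kwh * ccs_adder / 100
print(extra_cost_dollars)        # 770.0 dollars more per year under these assumptions
```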
Coal usage continues to grow. Global coal consumption grew 2.5 percent from 2011 to 2012, making coal the fastest-growing hydrocarbon fuel. In 2011, coal was the primary fuel for electricity production in Poland (95%), South Africa (93%), India (86%), China (84%), Australia (72%), Germany (47%), the United States (45%), and Korea (44%). Should we now forbid coal usage in developing nations?
President Obama has stated, “…countries like China and Germany are going all in in the race for clean energy.” But China and Germany are huge coal users and usage is increasing in both nations. More than 50 percent of German electricity now comes from coal as coal fills the gap from closing nuclear plants. Today, China consumes more than 45 percent of the world’s total coal production.
Today, more than 1.2 billion people do not have access to electricity. Hundreds of millions of others struggle with unreliable power. Power outages interrupt factory production, students walk to airports to read under the lights, and schools and hospitals lack vital electrical power.
Electricity is the foundation of a modern industrialized nation. Lack of electricity means poverty, disease, and shortened lifespans. Foolish climate policies lock chains on developing nations.
[First Published by Washington Times]
Entrepreneurs in industries tied to the energy efficiency gambit, justified by the climate change House of Cards, all have the same false bravado: they are “game changers” and “market leaders” (for products nobody wants); all their squandered revenues are “investments”; their technological breakthroughs are always “just around the corner”; and it just takes one more round of mandates, grants, loans, and tax breaks to achieve viability in the free market.
It’s true of renewable energy and electric vehicles, and as Cree Inc. CEO Chuck Swoboda (in photo with President Obama) revealed last week, it’s true of the alternative light bulb industry too. In a shareholder meeting at the company’s Durham, N.C. headquarters, he boasted about his marketing acumen that he says will persuade the public to embrace Cree’s light-emitting diode (LED) technology and abandon the traditional light bulb – which consumers will soon have no choice about. The meeting featured some new Cree television commercials, with one titled “Eulogy,” in which a spokesman buries an incandescent in a field.
“We are changing generations of belief about lighting,” Swoboda said.
Read the rest here: http://bit.ly/1bSGl9k
The war on climate change has produced many dubious “innovations.” Intermittent wind and solar energy sources, carbon markets that buy and sell “hot air,” and biofuels that burn food as we drive are just a few examples. But carbon capture and storage is the Edsel of energy policies.

Carbon capture and storage (CCS), also called carbon capture and sequestration, is promoted by President Obama, the Department of Energy (DOE), and the Environmental Protection Agency (EPA) for coal-fired power plants. In September, the EPA proposed a limit of 1,100 pounds of CO2 emissions per megawatt-hour of electricity produced, a regulation that would effectively ban construction of new coal plants without CCS.

Coal is the world’s fastest-growing hydrocarbon fuel. Increased use of coal by developing nations boosted coal from 24.6 percent of the world’s primary energy supply in 1973 to 28.8 percent in 2011. Wind and solar remain less than one percent of the global energy supply. Proponents of the theory of man-made warming realize that world use of coal will remain strong for decades, so they insist that coal plants use CCS to limit CO2 emissions.

CCS requires capturing carbon dioxide, a normal waste product from the combustion of fuel, transporting the CO2 by pipeline, and then storing it underground. EPA Administrator Gina McCarthy says, “CCS technology is feasible and it’s available.” Carbon capture is feasible, but it’s very expensive. The DOE estimates that CCS increases the cost of coal-fired electricity by 70 percent. This does not include the additional cost of building pipelines to transport the carbon dioxide and establishing reservoirs to store it underground.

An example is Southern Company’s planned coal-fired plant with CCS in Kemper County, Mississippi, which is scheduled to begin operations in 2014. With recent cost overruns, Southern Company now estimates a $4.7 billion price tag for the 582-megawatt plant.
This exceeds the price of a comparable nuclear plant and is almost five times the price of a gas-fired plant. The DOE pledged $270 million in funding for the Kemper County plant along with a federal tax credit of $133 million. Mississippi customers will be socked with a $2.88 billion electricity rate increase to support the plant.

Nine U.S. plants currently capture CO2 as part of normal industrial processes, such as natural gas or chemical refining and fertilizer production. All nine facilities sell CO2 to the petroleum industry for Enhanced Oil Recovery (EOR), a process which pumps CO2 into the ground. The Kemper County plant will also provide CO2 for EOR. Another ten U.S. projects are underway to capture CO2, and most of these projects are subsidized with federal money.

Ford spent $350 million on the Edsel, the most famous car failure in history. But CCS is a much bigger financial boondoggle. From 2008 through 2012, governments committed to spend more than $22 billion on CCS projects. The United States leads the way with a commitment of more than $5 billion.

Despite support by U.S. and world governments, carbon capture is not headed for success. A report released by the Global CCS Institute this month shows that international investment in CCS is now in decline. During the last year, the number of large-scale CCS projects declined from 75 to 65. Five projects were cancelled and seven were put on hold, with only three new projects added. The institute reports that private organizations are not investing in CCS. The number of CCS projects in Europe has declined from 21 to 15, and no new European project has entered commercial operation since 2008.

The Global CCS Institute states that an “urgent policy response is required” for success. In other words, governments must impose carbon taxes and provide big subsidies for CCS. Would carbon capture really have a measurable effect on global warming?
CO2 emissions from power plants total less than one percent of the carbon dioxide that naturally enters the atmosphere each year from the oceans, the biosphere, and other natural sources. If the world fully implements CCS, it’s unlikely that we could detect a change in global temperatures. But, worse than this, if the theory of dangerous man-made global warming is false, CCS becomes an expensive solution to a non-problem.

When the dust of history settles and the ideology of Climatism fades away, failed CCS projects will be remembered as the Edsel of energy policies.

Originally published in The Washington Times. Steve Goreham is Executive Director of the Climate Science Coalition of America and author of the book The Mad, Mad, Mad World of Climatism: Mankind and Climate Change Mania.
The National Interagency Fire Center keeps daily records of the number of wildfires and acres burned in the United States. The following table, listing fires for the past ten years from January 1 to September 24, was released September 25. As shown, 2013 wildfires are at the lowest level of the past ten years and less than two-thirds the ten-year average. This is contrary to the impression given by national newspapers and by ABC, CBS, NBC, and CNN in their reporting.
Due to considerable September rainfall and snowfall out West, it is unlikely the number of 2013 wildfires will increase by any significant amount during the rest of the year.
The great attention given wildfires this year is due to efforts by President Obama’s Administration to promote the claim that global warming from burning fossil fuels is causing an increase in catastrophic events like wildfires. Paraphrasing Mark Twain on his reported death, “The news of greater national wildfires is greatly exaggerated.”
Other United States global warming threats subject to Mark Twain’s comment are recent claims of increases in rain, drought, hurricanes, tornados, and sea-level rise, of 2012 being the hottest year on record, and of a lack of snowfall. On a global basis, claims subject to the same comment are accelerating sea-level rise, accelerating global temperature increases, and a record decline in sea ice extent. There is a host of data collected by NOAA and other organizations refuting exaggerations by climate alarmists about recent weather data. Climate events happening today have been happening for millennia.
YEAR-TO-DATE NATIONAL WILDFIRE STATISTICS
2013 (1/1/13 – 9/24/13) Fires: 38,619 Acres: 4,006,968
2012 (1/1/12 – 9/24/12) Fires: 47,437 Acres: 8,678,952
2011 (1/1/11 – 9/24/11) Fires: 59,149 Acres: 7,684,416
2010 (1/1/10 – 9/24/10) Fires: 48,951 Acres: 2,764,416
2009 (1/1/09 – 9/24/09) Fires: 70,609 Acres: 5,576,832
2008 (1/1/08 – 9/24/08) Fires: 67,399 Acres: 4,668,075
2007 (1/1/07 – 9/24/07) Fires: 71,244 Acres: 8,155,743
2006 (1/1/06 – 9/24/06) Fires: 82,922 Acres: 8,961,859
2005 (1/1/05 – 9/24/05) Fires: 52,311 Acres: 8,150,320
2004 (1/1/04 – 9/24/04) Fires: 60,470 Acres: 7,738,246
10-year average 2004-2013 Fires: 59,911 Acres: 6,638,583
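The table’s summary claims can be verified directly from its own figures. This quick check uses only the fire counts listed above:

```python
# Year-to-date fire counts from the table, 2004 through 2013.
fires = {
    2013: 38_619, 2012: 47_437, 2011: 59_149, 2010: 48_951, 2009: 70_609,
    2008: 67_399, 2007: 71_244, 2006: 82_922, 2005: 52_311, 2004: 60_470,
}

average = sum(fires.values()) / len(fires)
print(round(average, 1))                 # 59911.1, matching the table's 10-year average

# 2013 is the lowest year in the series...
assert min(fires, key=fires.get) == 2013
# ...and below two-thirds of the 10-year average, as the article states.
print(fires[2013] < (2 / 3) * average)   # True
```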
Heartland Institute friend Tom Harris of the International Climate Science Coalition (ICSC) was a guest recently on Sun News to discuss the case of a “climate refugee.” Tom talked about a man from the tiny archipelago nation of Kiribati whose work permit in New Zealand has expired and who wants to stay there. So what does he claim? That because of climate change he cannot return home due to sea-level rise.
Harris points out that the problem with this claim is that even when sea-level rise was at its peak during World War II, it was only 0.6 millimeters per year. It would take thousands of years for this “climate refugee” to be affected. Tom then goes on to argue that bowing to claims like this would have far-reaching effects.
For instance, the government of the Czech Republic is currently being sued by Micronesia because of its coal stations and the belief that the Maldives are sinking. Claims like these spring from belief in a false problem pushed by both the media and universities.
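Harris’s timescale point above is simple arithmetic. This sketch assumes the article’s 0.6 mm/year peak rate held constant indefinitely; the one-meter threshold is an illustrative figure for a rise large enough to matter, not a number from the article:

```python
# Years of sea-level rise needed to accumulate one meter at the cited peak rate.
peak_rate_mm_per_year = 0.6   # article: peak rate during World War II
threshold_mm = 1_000          # illustrative threshold: one meter of rise

years_needed = threshold_mm / peak_rate_mm_per_year
print(round(years_needed))    # 1667 years for a single meter of rise at that rate
```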
Government action distorts and then warps the private sector. It provides incentives other than – less than – those that best serve the participants.
Any government advantage given to one thing is a disadvantage to others. Since government is terrible at just about everything, it most often chooses to advantage dumb things at the expense of smart ones. Or it plays Crony Socialism favorites – helping (campaign contributing) friends and punishing enemies. Solyndra, anyone? The IRS?
As a result of all of this inanity, private people make all sorts of decisions and deals not to best meet their needs – but to serve the interests chosen for them by government. Or they use the government as an anti-free market weapon against competitors – rather than working to improve what they’re doing.
The bigger the government, the worse the private sector is warped. Our federal government is spending nearly $4 trillion a year, and has millions of pages of laws and regulations – so you see the size of the problem.
And the heinousness is not just felt domestically. As the world has gotten smaller, our markets for nearly all things have gotten global. And our terrible government policy has messed that up too.
“Trade Wars” actually aren’t about trade – they are about government trade policy. If peoples are trading freely, there isn’t a “War” – there’s commerce. The “Wars” only happen when governments get involved – placing tariffs, regulations and subsidies in the way of the flow.
It becomes a regulatory arms race. A government imposes another subsidy or tax. So several others in response impose new subsidies and taxes of their own. Lather, rinse, repeat.
Government farm policy is an excellent example of this terrible mess. All over the world we have decades’ worth of Trade War impediments. Thus, sadly, partial deregulation is now no longer an option.
Why not? Let’s look at sugar policy:
(T)he European Union (EU), which supplied as much as 20 percent of global (sugar) exports in the 1990s, shifted from a net exporter to a net importer following sugar policy reforms in 2005.
Their reforms? Unilateral tear-down of their trade barriers – which sounds good. Except it allowed Big Sugar Subsidy Brazil to flood their market – and wipe out nearly all domestic production. And now the EU is paying about 25% more for sugar.
Something similar is happening here. Into the midst of this global omni-protectionist nightmare, we in 1994 dropped the North American Free Trade Agreement (NAFTA). Which greatly opened up our trade with Canada – and Mexico.
Mexico is also a large sugar producer. And the Mexican government is a huge part of the process. The Federales currently own outright 20% of all Mexican production. And subsidize the living daylights out of the rest.
To further manipulate the market, Mexico uses NAFTA to serve as an open sugar channel to America for the rest of the world. They export nearly 8 million tons to us per year – yet import more than 1.2 million tons for domestic use.
In a truly free market, they would send us 1.2 million tons less – and keep it for themselves so as to not need imports. But in the patchwork quilt that is the global Trade War subsidy scheme, Mexico has manipulated their sugar price to be about 10% higher than ours. So they are playing the margins and reaping the windfall.
Yet for some reason we can’t export to Mexico at anywhere near similar levels – to avail ourselves of their higher price. Hmmm….
This is one of those times where partial deregulation is actually a problem. The EU tried it, and lost huge. We passed NAFTA, and in the greater sugar global regulatory mess it has mainly served to help those who stuck more squarely to their Trade War guns.
We have a worldwide governments-induced trade impediment problem. We need much more than mere piecemeal, a la carte attempts at solutions.
[First Published by Human Events]
Expanded services, an increasing number of plan and device options, more usage, and now being held up as an industry noted for its affordability – the wireless industry is on a roll!
A recent report by the Consumer Electronics Association (CEA), “Smartphones: Consumer Behavioral Trends,” indicates that the smartphone continues its development as a communications and data hub. The numbers are not so surprising to even a casual observer of technology use, but now they are demonstrated with empirical evidence.
While a typical smartphone user still uses the device as a phone about 20% of the time, they are more than twice as likely (43%) to be using it for communications of other sorts, such as email, text, or social networking. Fourteen percent of the time they are surfing the Web.
The variety of apps being used is also impressive, even if somewhat predictable. From weather to social networking to navigation and of course video, smartphone users are heavy app users.
As CEA chief economist Shawn DuBravac comments, “The degree to which consumers use their smartphones primarily as data information hubs, mostly forgoing devices’ traditional purpose, is significant. Smartphones have become the viewfinder of our digital life. How smartphone utilization evolves has incredible implications moving forward.” One implication? All of this activity by users requires more data connectivity.
Data connectivity for millions of users who are on the move using games, video, and voice is no small feat. And even though two-thirds of online U.S. consumers already own smartphones, the appeal, and the corresponding challenge for providers, only continues, as 45% of all consumers expressed an intention to purchase one in the next year. Those who don’t own a smartphone? Sixty-one percent intend to snap one up soon. More challenges to be met by industry as it serves more connectivity to more people more often.
The President is correct to use the mobile industry as a benchmark of service and product success at affordable prices, and consumers clearly agree, given the continuing strong demand. But he is wrong to draw an analogy to the “Affordable Care Act,” when a recent CNN/Opinion Research poll shows that 57% of consumers do not want it, and various pieces of it are being rolled back because they actually fail to work.
Government could learn many valuable lessons from the wireless industry about delivering complicated products in a way that consumers enjoy, and at price levels that cause the President to gush. Perhaps the President himself has made the best case for keeping government out of the way of wireless and allowing this highly competitive industry to continue its success.

Article published by Madery Bridge Associates, LLC
“[T]he GM debate is over. It is finished. We no longer need to discuss whether or not it is safe. … You are more likely to get hit by an asteroid than to get hurt by GM food.” So said Mark Lynas, the British environmentalist, who helped launch the anti-GMO movement in the 1990s.
Lynas went on to say that “people who want to stick with organic are entitled to—but they should not stand in the way of others who would use science to find more efficient ways to feed billions.”
We could not have put it more succinctly ourselves.
Organic activists are on the attack at the local level in a bid to influence global acceptance of genetic engineering. For years we’ve been asking why those leading the organic industry are so dead-set opposed to genetically modified organisms.
GMOs are already cutting down drastically on pesticide use, fuel consumption and the amount of land devoted to agriculture. Aren’t these the stated goals of the organic movement? This 20-year-old technology will also soon lead to drastic reductions in agricultural water-use, and genetically engineered crops capable of pulling their own nitrogen from the earth’s atmosphere are already on the drawing board. Innovations like these will further reduce the amount of energy farmers use, along with the overall amount of energy humankind requires as it continues to produce more food on less land for more people.
And yet, a fierce either-or (and, we must stress, one-sided) debate ensues between a minority of activists who want the entire world to “go organic” and scientists and humanitarians who are using genetics and biotechnology to improve our food and medicine. If science makes the human race more efficient in the areas of transportation, communication, and housing, then surely it can, and should, also help us in the vital arena of food production. Shouldn’t it?
The world’s premier national and international academies of science have reached an unqualified consensus that GMO crops are good for the poor and hungry. Even the president of the Pontifical Academy of Sciences stated recently, “Genetically-modified food represents a step forward in evolution.”
Crop biotechnology 2.0
While most people think only of commercial crops like Monsanto’s Roundup Ready canola or Bt corn when they hear mention of GM food, the three of us (two academics and a former organic inspector) are left to wonder why an entire discipline is being rejected by “organic” anti-GMO activists when this discipline holds such promise beyond the commercial realm. Commercial crops, which farmers can freely choose to grow, are only the tip of the iceberg when it comes to debating the two competing philosophies of food production before us.
GMO crops that fix their own nitrogen would drastically reduce energy consumption on conventional farms by eliminating the natural gas used in synthesizing ammonium nitrate and the fuel burned in trucks that deliver that fertilizer to farms. Such technology could eliminate the current organic practice of planting legume cover crops, which are subsequently plowed down to trap nitrogen in the soil. This could cut an organic farmer’s fuel bill by as much as 50 percent!
If only the organic industry would consider accepting GMO crops on a case-by-case basis, there could be the possibility of a more rational approach to the new technology of genetic engineering.
And what, we hasten to ask anti-GMO activists, about a life-saving GMO crop like Golden Rice? According to the World Health Organization, 250,000 to 500,000 children in the developing world go blind each year due to vitamin A deficiency, half of whom die within a year. Some 250 million preschool children, mainly in urban slums, suffer from this deficiency. In all, 2-3 million people die from vitamin A deficiency-related diseases every year.
Genetically modified Golden Rice was developed in response to this unfolding humanitarian disaster by Swiss scientist Dr. Ingo Potrykus and his colleagues in 1998. It contains beta-carotene, and not only prevents blindness but also boosts the immune system and contributes to general good health. However, Greenpeace and its allies in the organic movement have successfully managed to block the introduction of this non-commercial GMO product based on the flimsy claim that it may pose “environmental and health risks.” As if 250 million children with vitamin A deficiency were not themselves a “health risk.”
In response to this we quote Lord Walter Northbourne, one of the preeminent forefathers of the organic movement. In 1931 he wrote:
“If we waited for scientific proof of every impression before deciding to take any consequential action we might avoid a few mistakes, but we should also hardly ever decide to act at all. In practice, decisions about most things that really matter have to be taken on impressions, or on intuition, otherwise they would be far too late…. We have to live our lives in practice, and can very rarely wait for scientific verification of our hypotheses. If we did we should all soon be dead, for complete scientific verification is hardly ever possible. It is a regrettable fact that a demand for scientific proof is a weapon often used to delay the development of an idea.”
If such reasoning is good enough for the organic movement, then surely it’s good enough for the science of genetic engineering. But many organic activists remain adamantly opposed to this new and promising technology. Rather than even consider the possible benefits, commercial or humanitarian, of GM technology, they seek instead the following goals, by any means necessary:
To prevent organic farmers from ever using genetically-modified seed – on pain of facing certain de-certification not only of a crop, but of the field where such seed might have been used, and potentially of an entire farm where the indiscretion occurred, for as long as a decade or more.
To prevent all possibility, no matter how remote, of cross-pollination—they call it “contamination”— between an organic crop and a neighboring GMO crop through pollen drifting over a fence line in spite of the fact that minimal cross-pollination is regarded as a fact of life in agriculture. (Only pedigree seed growers are required to literally eliminate the possibility of it from ever occurring.)
And finally, the organic activists’ most ambitious undertaking: To ban the use of GMOs altogether by all farmers everywhere, regardless of the choices individual farmers might want to make on their own land.
Impossible you say? Here’s how the activists are already imposing these anti-scientific, and we believe anti-human, ends.
Welcome to the new normal
The three of us have been involved in public education on genetically engineered/modified crops and food for decades. Although the science has advanced a great deal over the years, the critics have not changed their position that GM crops and food represent a threat to people and the environment. But, having failed to convince federal, provincial and state authorities, the critics have turned their attention to local governments where they hope politicians might more easily be swayed by public persuasion.
In the arena of public opinion, first-hand experience has taught us that fear can be very effective in winning the public over. There is a great deal of pseudo-science available on the Internet designed to generate fear of GMOs.
GM crops are produced, in part, with recombinant DNA technology. Few in the public, particularly politicians, are trained in this field of science, and so the failure to recognize the difference between the real science and the pseudo-science is to be expected.
Indeed, just imagine if Einstein’s theory of relativity were for some strange reason at issue at the local level. Experts would be called upon to help explain things. But this was not the case, for example, when the Richmond City Council and the District of Saanich, both in the Canadian province of British Columbia, voted to ban or express their opposition to GM crops.
Robert Wager, an academic with almost three decades of experience in the field of recombinant DNA, first met with Richmond City’s Sustainability Manager in early 2012 as the city first began to research the issue. After a short presentation the question session began. Two hours later, after he had debunked a large number of widely held myths that were presented to council by anti-GM activists, it became clear that city officials had already absorbed a great deal of pseudo-science on GM crops and food.
Indeed, documents that surfaced later prove that a group called GE-Free-BC had been petitioning the City of Richmond to ban GM crops since June of 2010, a year and a half prior to Wager being allowed to present.
At the subsequent Richmond public council meeting, the usual fear stories were relayed as fact by many genuinely frightened attendees. Some expressed fears of the alleged health dangers posed by GM crop technology. They were sure of their “facts,” having gleaned them from the Internet. Sadly, Wager was the only one at the meeting who conveyed the actual science of GMOs, as understood by world health and food safety experts. Despite the endorsement of GMOs by every food-safety authority in the world, it became evident that nothing could alleviate the fear in the room, and Wager soon realized a ban was imminent.
The Richmond council cited two reasons to justify its ban: first, that the transfer of GM pollen or seed to a neighboring organic field would threaten organic certification; and second, alleged human/animal health issues associated with GM food. Neither reason is supported by history or science.
In 19 years of monumental growth of both GM and organic agriculture there has not been one case of decertification of an organic crop caused by trace amounts of GM pollen or seed. A spokesperson for an organic food company admitted as much to council. History clearly demonstrates that GM crops do not represent any risk to organic farmers, except for what might be understood as an activist/bureaucratic risk whereby an organic farmer could face decertification of his crop, his field or even his entire farm as punishment from those who lead the organic industry.
You heard right… the anti-GM activists who lead the organic industry are willing to go as far as to inflict hardship on organic farmers just to prove their point and ensure a tight lid is kept on the advancement of GM farming.
All of the alleged dangers of GM crops and food have been assessed by global experts and dismissed. Everyone from the European Union (EU) and the World Health Organization (WHO) to the National Academies of Science (NAS), Health Canada, and the local Vancouver Coastal Health Authority (VCHA) agrees there is no evidence of harm from consuming food made with GM ingredients. And yet, this local council decided it knew better and proceeded full-steam ahead toward an outright ban.
Unfortunately, the scientific facts Wager presented had little effect, and fear and a lack of scientific understanding left the door open for the manipulation of the council.
One example came from a councilor who claimed: “They put the genetic characteristics of the chemical into the food and then it goes into us!” But there is no such thing as “genetic characteristics” of chemicals. And yet, the two hundred people holding up anti-GMO signs during the meeting cheered the comment. The Richmond council subsequently decided to move forward with the ban at the next public meeting, a definite case of public policy based on fear from anti-GMO pseudo-science.
Suppressing scientific assessment
The debate played out in a different, but equally discouraging, way in Saanich. Public documents show that one particular council member, the chair of the Healthy Saanich Advisory Committee (HSAC), was intent on getting a “non-support” resolution passed regardless of the science. The HSAC minutes of May 2011 call for advice on GMOs from the Peninsula Agricultural Commission (PAC).
As the issue was coming to a head, Wager learned that the PAC had been asked to develop an opinion on a proposed GM crop ban for Saanich (subsequently downgraded to a vote of non-support for GM crops). Naturally he contacted them immediately.
Having dozens of GM-specific publications and extensive speaking experience on GM crops and food under his belt, and being a resident of the region, Wager offered to come before PAC at no charge. But he was turned down because the members of this commission claimed to already have enough experts lined up. One of these “experts” has zero publications in the field of GM crop technology; not surprisingly this “expert” recommended a ban of GM crops. The other was a local organic farmer with a long history of anti-GMO activity. Saanich council clearly did not seek balanced expert opinion on GM crop technology.
The minutes for the April 2012 PAC meeting show that there was no discussion or debate about whether to impose a ban; the issue had already been decided before the council meeting. “The Healthy Saanich Committee’s [HSAC] consensus was to support the concept of a ban on GE-GMO food crops in Saanich,” the minutes read. It was noted that this type of ban would be difficult to enforce [actually impossible, as it is under federal jurisdiction]. It was therefore decided to obtain information from other municipalities to see how a local ban could be achieved. The subsequent “debate” by the HSAC was clearly a sham, as the committee members had already decided their position on GMOs.
HSAC did go through the motions of holding a special meeting for public input on the GM crop issue in September. Wager again attended at his own expense. After sitting for over an hour listening to one speaker after another present fear stories, he was given the opportunity to present the real science to HSAC. But minutes into his presentation, the chair cut him off. Wager would later learn that the HSAC consensus had already been reached six months prior to this meeting—and the public meeting was an empty exercise.
Between that September HSAC meeting and the Saanich District Council meeting in November, when a final decision was scheduled to be made, Wager was assured he would get another opportunity to come before council. But one day before the District meeting, Wager was informed he would not be permitted to make any presentation.
After discussing this turn of events with Saanich Legislative Services (a non-political body that’s supposed to help citizens who live in Saanich), Wager discovered a possible avenue to provide further input. He respectfully requested that council refer the agenda item for the non-support declaration for GMOs back to Committee for further consideration at their next meeting.
But the council rejected that request. Instead, the mayor himself weighed in, saying the council had to trust the HSAC in coming to its recommendation. To no one’s surprise, Saanich Council then voted to move ahead with the non-support declaration for GM crops, precisely as recommended by HSAC.
In Wager’s last correspondence with the HSAC in Saanich, the Chair admitted, “The committee felt strongly that the information you and others shared clearly demonstrated the inconsistent and contradictory opinions and findings with respect to GMOs.” And yet, the fact remains that this committee embraced pseudo-science-driven fear. The process cannot, by any stretch of the imagination, be considered science-based, much less democratic. And remember, Wager lives, votes, and pays taxes in this region! And yet he was purposely ignored.
Organic farming contradictions
The contrast between the over-regulation of genetically modified foods and the lax regulation of organic foods is striking. At the same time as a concerted attack against GMOs is being waged at the local level, the organic industry in North America remains largely unregulated, running almost entirely on record-keeping and record-checking.
Indeed, by the United States Department of Agriculture’s own admission, “The number of results reported to the NOP [National Organic Program] in 2011 represents a sampling rate of less than 1 percent of certified operations.” Things go rapidly downhill from there because it turns out, “The majority of results reported to the NOP in 2011 were received from certifying agents which are headquartered outside of the United States, where periodic residue testing is a requirement under international organic standards (e.g., the EU).” In Canada meanwhile – one of America’s largest trading partners in organic products – there is no testing whatsoever to ensure organic products are genuine.
And while there has not been one death or even an illness linked to the consumption of foods made with genetically modified ingredients, thousands of people get sick and die every year because of contamination problems linked to slipshod organic farming practices at some farms.
Consider the news just over the past week. The Canadian Food Inspection Agency and Costco Wholesale Canada announced that Costco recalled its Kirkland Signature brand Organic Lean Ground Beef likely contaminated by E. coli. And the largest processor of organic peanut butter shuttered its facilities over the weekend, the victim of a Salmonella outbreak that sickened 41 people in 20 states in 2012.
These are not isolated stories. Organic food is more dangerous than conventionally grown produce because organic farmers use animal manure as the major source of fertilizer for their food crops. Animal manure is the biggest reservoir of the nasty bacteria that are afflicting and killing so many people. Because of a lack of oversight, the organic industry has been plagued by contamination problems worldwide.
When dealing with the potential dangers of un-composted feces, mere record-keeping and record-checking cannot possibly be expected to keep people safe. In one notorious recent case, involving a novel strain of E. coli O104:H4 linked to an organic farm in Lower Saxony, Germany, in 2011, 3,950 people were affected and 53 died.
Said simply: while manure used on organic farms can be deadly, the cumulative conclusion after more than 2,000 studies of genetically modified foods is that GMOs pose no serious health or safety concerns. There is still no such thing as organic testing, neither in the field nor after harvest nor in any certified-organic processing facilities—and, most disturbingly, not on incoming shipments of certified-organic product from countries like China, Mexico or Argentina. These foreign shipments account for the majority of the certified-organic food being sold in North American grocery stores. Organic certification on this continent is all based on paperwork with no recourse to science.
As such, long before one considers the remote possibility that an organic crop might become “contaminated” (to the level of 0.01 percent or less by pollen drifting from a neighboring GMO field), there is a far more pressing consideration: Are prohibited synthetic fertilizers or pesticides being used fraudulently on organic farms? Aren’t these the things that the organic industry once claimed to eliminate, or at least drastically reduce our exposure to? Sadly, such a commonsensical consideration, alongside the much more troubling possibility that lethal pathogens might be entering the organic food chain through the improper composting of animal and plant waste, does not warrant concern from those who lead the organic industry.
Shouldn’t a luxury food item be safer, or at least as safe, as its competition? Shouldn’t science be used to prove its worth? Instead, organic food turns out to barely exceed conventional food in purity and not at all in the nutritional department—no wonder, given the laxity of the organic certification system.
When it comes right down to it, there is nothing in GM technology that should offend organic growers. It is, in fact, an entirely “organic” procedure, and a very precise one at that. Organic farmers seem content to use seeds that are produced with nuclear and chemical mutagenesis, which are very imprecise and hardly organic. They also use many inorganic substances such as copper, phosphorus and potassium with no apparent contradiction. And which is better—the broadcast spraying by organic farmers of a Bacillus thuringiensis (or Bt) microbial pesticide over entire fields with attendant drift into non-target areas, or the selective targeting of only those pests that actually attack the crop through the use of Bt corn and Bt cotton?
In the final analysis organic farming and GM technology would make a powerful team to improve our food production and nutrition on a large number of fronts. There is no reason why GM seeds cannot be grown organically. The benefits to organic farmers would soon become apparent and the real farmers in both camps could slough off the misinformation and fear mongering of the urban-based anti-GM activists. That’s the real promise of sustainability.
We conclude where we began, with the candid admission by one of the world’s most highly respected opponents to the science of genetic engineering that he was wrong. Mark Lynas stands in contrast to devout anti-GMO activists like Arpad Pusztai who remain steadfast in their baseless opposition to this new and promising field of science.
Pusztai was the lead scientist on the only remotely scientific attempt to prove that genetically modified food might be dangerous, and is still held up as a hero of sorts for the anti-GMO movement. The popular myth surrounding Pusztai is that he “was effectively silenced over his research and a campaign was set in motion to destroy his reputation.” But the fact of the matter is that Pusztai failed to use a control group in his study on rats, one of the most basic rules of the scientific process. He also fed his rats a strict diet consisting only of potatoes (GM potatoes of course), which any lab technician can tell you is a very poor diet for rats, low in protein, which is guaranteed to produce health problems. After all, as Paracelsus (the medieval founder of modern toxicology) so aptly put it, “All things are poison, and nothing is without poison; only the dose permits something not to be poisonous.”
Most damning is the fact that even with all the billions of dollars floating around in the organic industry, Pusztai’s simple and inexpensive experiment has never been repeated.
Is this the best the anti-GMO organic movement can come up with as a reason to stand idly by and allow 2 million people or more to continue dying from vitamin A deficiency every year? Apparently the answer is yes. And we find that deplorable on all levels.
[Originally published on the Genetic Literacy Project]
He and co-author James C. Bennett wrote a book titled America 3.0: Rebooting American Prosperity in the 21st Century. The book explores the possibility of a new era in America. The new era — they call it America 3.0 — is one that is less centralized (less “top down”), one where we operate on more of an individual scale and become ever more productive.
In this podcast, Lotus discusses the previous two American eras — 1.0 and 2.0. America 1.0 is the time from the founding of the nation until just before the Industrial Revolution, which then takes us into America 2.0. Lotus says we are now at the tail-end of 2.0, in a kind of stagnant “transitional period.”
Why are Lotus and Bennett so hopeful? Well, they think that technology develops autonomously, regardless of the government obstacles in place (which cause it to slow down, but never cease improving). Furthermore, Lotus says that we have underlying cultural foundations that are unique — such as the nuclear family — that make us more resistant to the institution of socialism.
Even the host, Jim Lakely — Director of Communications at The Heartland Institute — began to transform his pessimistic attitude by the end of his conversation with Lotus!
Be sure to listen to the podcast in the player above and stay tuned for Part II tomorrow.
[Subscribe to the Heartland Daily Podcast for free at this link]
Hydraulic fracturing, more commonly known as “fracking,” is a new technology that is improving our lives and changing the way we produce oil and natural gas. Over the past few years, the United States has become the world’s largest producer of natural gas, and it is slated to overtake Saudi Arabia as the top oil-producing nation by 2017.
The increase in energy production is creating thousands of high-paying jobs, lowering the cost of natural gas (a big benefit for those living on a fixed income), and making the United States less reliant on energy from potentially hostile nations.
As with everything, there’s no such thing as a free lunch. Although multiple sources (including Lisa Jackson, the former head of the EPA) have confirmed that hydraulic fracturing can be done in an environmentally responsible manner, there are still some concerns about how much water is used during the process.
In the Marcellus Shale of Pennsylvania and West Virginia, approximately four to five million gallons of water are used to hydraulically fracture a natural gas well over the course of its lifetime. While this sounds like a lot of water, in water-rich areas like West Virginia and Pennsylvania, the amount of water used is relatively small compared to other activities such as public consumption, watering livestock, and other industrial uses.
Most of the water used to fracture the well stays deep underground after the fracking process, away from drinking-water aquifers. The rest of the fluid, anywhere from six to 20 percent of the initial four million gallons, comes back to the surface. Most of this fluid is recycled and used at another well, saving energy companies money and reducing the amount of freshwater needed for the next well.
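The flowback arithmetic above is easy to sanity-check. Here is a minimal sketch using only the figures cited in this article (the four-million-gallon injection volume and the 6 to 20 percent flowback range); the variable names are ours:

```python
# Sanity check on the flowback figures for a single Marcellus well,
# using the numbers cited above: ~4,000,000 gallons injected,
# with 6 to 20 percent returning to the surface as flowback.

WATER_INJECTED_GAL = 4_000_000
FLOWBACK_LOW, FLOWBACK_HIGH = 0.06, 0.20

low = WATER_INJECTED_GAL * FLOWBACK_LOW    # 240,000 gallons returned
high = WATER_INJECTED_GAL * FLOWBACK_HIGH  # 800,000 gallons returned

print(f"Flowback returning to surface: {low:,.0f} to {high:,.0f} gallons")
print(f"Remaining underground: {WATER_INJECTED_GAL - high:,.0f} to "
      f"{WATER_INJECTED_GAL - low:,.0f} gallons")
```

In other words, even at the high end, roughly 3.2 million of the 4 million gallons stay underground, and most of the returned fraction is recycled into the next well.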
According to Downstream Strategies, over 80 percent of the water used in fracking in West Virginia and over 70 percent used in Pennsylvania comes from rivers and streams. While some environmentalists will try to make this sound scary, the amount of water used for fracking in Pennsylvania is about 1 percent of total surface water, and much less than the available groundwater in the state. Also, using surface water is consistent with common hydraulic fracturing practices around the nation. In fact, farmers in North Dakota prefer the water be taken from the Missouri River and not local groundwater aquifers.
While some environmentalists will make it seem like fracking operations are drinking the rivers dry, this is simply not the case. There is plenty of water to go around.
Hydraulic fracturing has provided the United States with an amazing opportunity. For the first time in decades, the US is a leader in energy production, creating thousands of good jobs for hardworking families and making the US more energy independent every day.
The answer is something “rare.” Something that is currently used in almost everything modern, but that is abundant and recoverable in very few places on the planet—hence the “rare” moniker. Something that China has in abundance and that they are using as an economic weapon against the rest of the world—much like OPEC uses oil. And, this something is also found in the U.S., which could give us a competitive advantage in the global economy.
Have you guessed it? “Rare” was a big clue.
I am talking about Rare Earth Elements (REEs), many of which are classified by the Department of Defense as Critical Minerals.
REEs are found in almost all massive rock formations—though their concentrations range from 10 to a few hundred parts per million by weight, which makes them difficult to extract. There are 17 different REEs with names ending in “ium” such as: dysprosium, yttrium, neodymium, terbium, cerium, and europium—just to name a few.
While most people don’t give REEs a thought, we all use them in our modern lives as they are an essential part of what makes cell phones, flat screens, and computer chips work. But REEs are not just about convenience and luxury. They are in every modern vehicle from a Prius to a Ford F-150. They enable miniaturization—making things fast and light.
According to the Wall Street Journal (WSJ), “A Department of Energy report in 2010 noted that several minerals vital to clean-energy applications, including neodymium and dysprosium, face ‘critical’ supply questions over the next 15 years.” A 2011 PricewaterhouseCoopers report revealed that 73 percent of CEOs in the automotive industry have businesses that face minerals and metals scarcity. The same problem applies to 78 percent of high-tech industry CEOs and 50 percent of aviation CEOs.
But, perhaps most importantly, REEs are playing an ever-increasing role in vital defense technologies. REEs are used in stealth radar-evading technology, in targeting mechanisms for missiles and temperature-resistant magnets, and in materials used in jet engines and aerofoil components in manned aircraft and, increasingly, in unmanned drones. The U.S. Department of Defense recently released findings from the “Strategic and Critical Materials 2013 Report on Stockpile Requirements,” which identified 23 critical minerals for which shortages are likely. They include several vital to defense technologies, such as search and navigation equipment, missiles, and space vehicles. Such shortages will limit our ability to produce the defense systems of the future. It is frightening to realize that the Chinese do not have this worry.
The U.S. is one of only a few countries with known recoverable REE deposits (with approximately 13 percent of the world’s total known reserves), and we have more commodity minerals and metals than any other country. Yet, today, less than half the minerals U.S. manufacturers use come from domestically mined resources. More specifically, China currently has a near-monopoly on the production of REEs—generally supplying approximately 85 percent of the world’s current REE supply and 100 percent of several REEs. Additionally, in recent years, China has imposed quotas on exports to protect its need for REEs and to compel high-tech companies to establish production in China by giving them the benefit of lower prices and guaranteed supply. In US News, Eric Hannis, senior fellow in defense studies at the American Foreign Policy Council in Washington, DC, addressed companies’ increasing hesitancy to move production to mainland China: “the need to gain a cost-effective, guaranteed supply of rare earths means that many have been forced to make a ‘deal with the devil.’”
Instead of easing the mining regulatory framework to help promote domestic REE production, the Obama administration engaged in the governmental form of a temper tantrum. It joined countries without the quality resources found in the U.S. and lodged a complaint with the World Trade Organization (WTO), claiming that Beijing is unfairly choking off exports of the commodities to benefit its domestic industries. (Imagine that: a government making policy designed to benefit its own country. Perhaps the White House should try that.) In fact, if the WTO initiative is successful and China is forced to reverse its current policy of keeping the majority of its REEs in-country for its own use, we might end up encouraging them to flood global markets with REEs again, driving the price down—as they did in the 80s and 90s. By joining in the WTO case, the Administration could make it very difficult for U.S. projects to get up and running and stay competitive—and this could be their plan. Thankfully, the very low cost of production expected at Molycorp’s Mountain Pass mine provides some economic cushion.
The WSJ reports that the U.S. was once self-sufficient in REE production but ceded the market to China over the past two decades, “partly because of environmental concerns over energy-intensive mining and partly because of falling global demand and prices,” as a result of China dumping huge quantities of REEs onto the world market.
REEs in China were first discovered in 1927, with the Bayan Obo mine opening in 1950. The U.S. dominated global production from the 60s and into the 80s with light and heavy REE production at Mountain Pass, but China then launched a dedicated campaign to dominate the REE supply chain. In 1986, Deng Xiaoping, a Chinese politician and reformist leader of the Communist Party who, after Mao’s death, led his country towards a market economy, established the National High Technology Research and Development Program. His goal was to help China “achieve breakthroughs in key technical fields that concern the national economic lifeline and national security, and to achieve leapfrog development in key high-tech fields in which China enjoys relative advantages.” In 1992, Deng boldly proclaimed: the “Middle East has oil, China has rare earths.”
This brings us to today.
REEs have gone “from being practically unheard of a few years ago to being one of the most-talked-about commodities,” according to the WSJ. There has been growing concern in the U.S. regarding our reliance on China for our REE needs, which has resulted in unusual bipartisan support for increased domestic REE production.
Regarding the U.S. position in the international REE arena, Hannis states: “In much the same way that we should strive for independence from Middle Eastern oil, the United States now needs to make ‘rare earth independence’ from China a key priority of government. After all, the nation that supplies our rare earths shares one key similarity to the region that supplies much of our oil: neither are getting friendlier to America.”
Apparently Congress has gotten the message. Last year, Rep. Doc Hastings, (R-WA), Chairman of the House Natural Resources Committee, said: “Just like the United States’ dependence on foreign oil causes pain at the pump, Americans will soon feel the impact of China’s monopoly on the rare-earth element market.”
In September, the House of Representatives passed, with bipartisan support, HR 761: the National Strategic and Critical Minerals Production Act of 2013, with the goal of allowing for the more efficient development of the U.S.’s $6.2 trillion worth of minerals and metals without minimizing or hindering the environmental review process.
On October 29, 17 Senators (nine Republicans and eight Democrats) introduced similar legislation: The Critical Minerals Policy Act of 2013 to “help reduce the nation’s dependence on foreign suppliers.” According to the Background and Section-by-Section Summary: “The legislation directs the Secretary of Interior to establish a list of minerals critical to the U.S. economy and, pursuant to those designations, outlines a comprehensive set of policies that will bolster critical mineral production, expand manufacturing, and promote recycling and alternatives—all while maintaining strong environmental protections.”
The Critical Minerals Policy Act of 2013 has widespread industry support—even though it calls for yet another stall-tactic study, when 23 studies have already determined that we have a crisis and an emergency.
Hal Quinn, National Mining Association CEO, heralded the bill as “a welcome recognition of the urgent need to facilitate the development of American minerals.” He observed that the measure would analyze the “impediments to domestic minerals mining that hamper the prospects of a sustainable U.S. manufacturing renaissance. It is widely understood that the slow and inefficient permitting system in the U.S. poses the largest impediment to unlocking the full value of American minerals.”
Likewise, the Women’s Mining Coalition statement reads: “this bill will provide high-paying, long-term employment for many Americans, while also providing world-respected environmental management practices and implementation.”
The House bill passed with the support of 100 percent of Republicans and many Democrats. The same can be expected for the Senate version. Senate Democrats need to hear from their constituents; they need to know that you support The Critical Minerals Policy Act of 2013. Please call your Senator and ask for his or her support.
Right now, there are only two non-Chinese suppliers for REEs (Molycorp in California and Lynas Corp. in Australia)—but there are several projects, such as Rare Elements Resources’ “Bear Lodge,” in development in the U.S. that could be providing national and economic security within a few years with the passage of The Critical Minerals Policy Act of 2013. America’s REE resources and potential production and refining give us a strong global advantage—but we must accelerate and streamline the permitting process. It is time for our government to make policy that is designed to benefit America.
On October 30th, former State Senator Mark Q. Rhoads, the author of Land of Lincoln, Thy Wondrous Story, was featured at The Heartland Institute’s ongoing author event series.
Rhoads was born in Hinsdale, Illinois and raised in Western Springs. He was elected to represent parts of Cook County and DuPage County in the Illinois State Senate in 1976 and reelected in 1980. In 1982, Rhoads taught as a Fellow of the Institute of Politics at the John F. Kennedy School at Harvard University. As a visiting member of the Chicago Sun-Times editorial board from 1984 to 1987, Rhoads wrote more than three hundred editorials. His articles have appeared in The Wall Street Journal, The Washington Post, The Washington Times, Illinois Issues, Illinois Review and the St. Louis Post-Dispatch, along with 250 appearances on radio and TV interview programs.
Rhoads’ new book is definitely a must-read, not only for history buffs but for all interested in the story of Illinois told in a unique way. Rhoads’ research includes information about Illinois history not covered in any other book.
Land of Lincoln ties together the historical connections among a trio of cities — Springfield, Chicago, and Washington, D.C. — through narratives gleaned from the 159-year-old Illinois State Society, founded in Washington, D.C. in 1854. These narratives cover important events in Illinois through the interactions of its political leaders, with a generous sprinkling of humorous and anecdotal accounts of Illinois-bred notable personalities in the fields of arts, science, literature, business and sports.
The book also contains some never-before-published photographs of past Illinois leaders. Among the eight full pages of photographs is one that caught my attention: the late Chicago Sun-Times columnist Robert Novak of Joliet accepting an award from Senator Carol Moseley Braun at the 1997 Illinois State Society Inaugural Gala.
The Illinois State Society of Washington, D.C. is the oldest state society. It has fondly been called Illinois’s “103rd county,” as it has served as a home away from home for Illinoisans of both political parties since 1854 and likewise for the nonpolitical living and working in Washington, D.C.
Rhoads’ unique perspective and access to pertinent historical anecdotes is partly due to his longtime involvement with the Illinois State Society in Washington DC. He was president of the Society from 1989 to 1990 and served on the Society’s board for twenty-seven years, from 1985 to 2012.
Rhoads begins his book’s narrative in 1854, when the Illinois State Society was founded as the Illinois Democratic Club of Washington City. The group’s party affiliation was eventually dropped and the organization became known simply as The Illinois State Society.
Although Abraham Lincoln became president in 1861, Rhoads could not find proof that Lincoln was ever a member of the society. Most likely, Lincoln didn’t have the time to be involved, since he was president during the tumultuous time of the Civil War.
Rhoads told the Heartland audience his concern about the apparent deficit of knowledge that exists among students about their own state’s history. Texas alone requires the regular study of state history for high school graduation. Illinois has state history listed as part of its required curriculum, but no system of enforcement is in place.
Just as Rhoads enjoyed studying American history through an understanding of how Illinois and its elected and non-elected leaders and celebrities shaped America’s political history, he believes students will be more motivated to learn if they are taught in ways to which they can relate, rather than through the typical uninspiring and boring timeline approach.
Illinois became the 21st state on December 3, 1818. Rhoads would like to coordinate Illinois teachers to celebrate Illinois’ 200th anniversary by having students recognize people of importance who came from their own home areas. In so doing, Rhoads hopes to foster an interest in state history, so that the lives of those who built and guided the state over the years are celebrated by students throughout Illinois. Students must also be made aware that some of the problems Illinois is facing today are not new problems.
Illinois historical revelations noted by Rhoads in his first-rate collection include:
- “Lonesome” George Gobel, who started out as a child star playing the guitar on WLS, which even in 1937 was an established major contributor to Chicago’s radio history.
- The Dawes family is virtually forgotten, yet it played an important part in the historical legacy of Illinois. Charles G. Dawes, 30th Vice President under President Calvin Coolidge, was from Evanston. His great-great-grandfather had ridden with Paul Revere.
- Accusations of bribery like those that sent Governor Rod Blagojevich to jail happened in a big way 100 years ago, but Senator “Billy” Lorimer escaped punishment. This blatant bribery, however, was the impetus for a change to the U.S. Constitution: senators were no longer elected by state legislators, but through direct state elections.
- The World’s Fairs of 1893 and 1933 were major events in Chicago. In planning for the 1933 World’s Fair, Chicago business leaders formed The Secret Six, knowing that Al Capone would be bad for tourism in a city still reeling from the St. Valentine’s Day Massacre of 1929. The end result was a case brought against Al Capone for tax evasion. The city did feel safer. Thirty-nine million people visited the 1933 World’s Fair, which provided lasting benefits for Chicago.
- It was again question time when Rhoads asked who was in the stands as an eight-year-old boy on Oct. 1, 1932, when Babe Ruth is said to have called his shot during game three of the 1932 World Series at Wrigley Field in Chicago. It was Chief Justice John Paul Stevens. This story was told to Mark Rhoads by a former president of the Illinois State Society, Virginia Blake, now 100 years old.
- Richard Sears and Alvah Roebuck met in Chicago and in 1893 founded a mail-order catalog business which grew into Sears, Roebuck & Company. Noticing that the catalog of Montgomery Ward had no illustrations, Sears and Roebuck took over the mail-order catalog business from Ward by adding them. Chicago was ideally suited to the mail-order business because of its transportation hub.
- James Kraft found someone to sell him a horse which enabled him to peddle wheels of cheese. The rest is history.
- Actor Harrison Ford is from Chicago; his wife is from Freeport, IL.
- Four presidents lived in Illinois: Abraham Lincoln, Ulysses S. Grant, Ronald Reagan, and Barack Obama.
- Jim and Marian Jordan, both born near Peoria, IL, became known as Fibber McGee and Molly. While in high school, Jim Jordan played basketball with the late Bishop Fulton Sheen.
Rhoads ended his book with two presentations of “What ifs?” and then spoke of a thrifty Democratic governor.
If Governor Horner, first elected governor of Illinois in 1932, could be brought back, what would Gov. Horner have to say to Gov. Quinn if seated next to him at a table?
You might have it hard now, but look at what I faced during a time of depression.
If Mayor Medill, mayor during the Great Chicago fire of Oct. 8, 1871, could be brought back, what would Mayor Medill have to say to Mayor Rahm Emanuel if he could engage Rahm in conversation?
You think you have it bad? I lost the entire city to fire, but it was rebuilt in 22 years.
Rhoads’ final thought was light-hearted: Which Democratic governor behaved more like a Republican when he took office? It was Governor Henry Horner (Democrat), who in 1933 replaced Governor Louis Emmerson. Instead of discarding the stationery with Emmerson’s name on it, Governor Horner crossed out Emmerson’s name and wrote in his own.
Despite the many unpleasant, sleazy, and downright corrupt happenings that have peppered the history of Illinois, Rhoads still has hope for Illinois’ future. Why? Because each generation renews itself.
A free copy of Mark Rhoads’ book was placed on the desk of each legislator in the Illinois General Assembly in Springfield by the Democratic House and Senate leaders, Madigan and Cullerton, respectively. Land of Lincoln, Thy Wondrous Story was published in 2013 by Jameson Books, Ottawa, Illinois.
The next “Save the Date” Heartland Author event is scheduled for Thursday, November 14, when Dan Pilla will discuss 10 Principles of Federal Tax Policy.
Watch the video of Mark Rhoads at the Heartland Institute’s Author event below:
[Article originally published in the Illinois Review]
But those of us who have followed this legislation closely weren’t surprised at all. In fact, a year ago I wrote on John Goodman’s blog – “The Republicans may have lost this election, but a year from now millions of people may be begging to be freed from the scourge of ObamaCare’s exchanges.”
I’m not psychic. My conclusion was based on the dismal experience of everything the federal government had already tried and failed to do to implement the law, and on the excellent reporting of Robert Pear at the New York Times. He had written months earlier that, while the state exchanges were doing their work in public, the federal effort was entirely behind closed doors. The Feds even refused to divulge what was in a “request for proposals” they had released to advertising agencies to help hype interest in the exchanges.
Doing your work in secret may shield you from the criticism of political opponents, but it also prevents you from getting constructive suggestions from neutral outsiders. And that has been the primary problem with this whole exercise from the beginning. The idea was poorly conceived, the law was poorly written, and it is being poorly implemented, all because they didn’t want to hear from critics. The effort was so fragile that the slightest hint of trouble would have brought it crashing down. But it is far better to crash a poor concept on paper before it becomes law than to pass it and watch it wreck the lives of real people.
Now, not everything about this law has failed. Some things, like covering adult children on their parents’ policies to age 26, paying 100% for preventive services, and eliminating annual and lifetime caps on benefits, have been implemented smoothly. But these are all things that are within the scope of the normal regulatory process – the regulators tell the insurance companies what to do, and the insurance companies do it. Both regulators and insurers have been doing this for decades.
These may or may not be good ideas, but they are easy to implement. In fact, all these provisions would have required only about five pages of legislation.
The problem comes in the other 2,695 pages of legislation. Why so many pages? Because they were creating entirely new functions for the federal government and all the agencies needed to operate them. And this is where the failures have come. The list is extraordinary –
- The CLASS Act. This feeble attempt to create federal long-term care insurance was thrown overboard by the administration itself after it became apparent it would be impossible to do.
- The 1099 provision. This requirement that businesses issue a 1099 to any vendor from whom they purchased $600 of goods and services in a year was repealed after business owners explained what an impossible burden it would impose.
- Federal high-risk pools. The program seemed to be well-funded, but the pools enrolled very few people at much higher cost than projected. They quickly ran out of money.
- Retiree health subsidies. Large corporations and unions were more than happy to accept free money to do what they were doing anyway (provide health benefits to retirees), but all the money ran out in about a third of the time expected.
- CO-OPs. As an alternative to the “public option,” Congress appropriated billions of dollars to create consumer-run, nonprofit insurance companies in each state and even created a whole new section of the Internal Revenue Code for them. It was never explained why mutual insurance companies were not adequate to the job. A few have been created but Congress put so many restrictions on them that even those few are unlikely to make it past the first year.
- Small employer tax credits. The complexity and confusion of these credits deterred all but a handful of companies from applying.
- Medical Loss Ratios. The MLR requirements have had the very predictable effect of discouraging innovation and cost containment. The requirements actually encourage higher premiums — the higher the premiums, the more money insurance companies have for administration and profits. Plus, they discourage fraud prevention efforts.
- Medicaid expansions. The Supreme Court made these expansions voluntary for the states and only about half have done it. But it was an odd idea from the start. One-third of all the uninsured were already eligible for Medicaid but hadn’t bothered to enroll, and a large portion of the people now enrolling seem to be people who were already covered in the private market.
- Limits on FSA funding. It is cruelly ironic, but the families most disadvantaged by the new $2,500 limit on FSA funding are those with special needs children.
- Limits on the Medical Expense Deduction. Beginning in 2013, a taxpayer is able to deduct only those medical expenses that exceed 10% of income, up from the current 7.5%. Once again it is the sickest families that will be hurt.
- Accountable Care Organizations (ACOs). These were intended to introduce an entirely new form of health care delivery to reduce costs and improve quality in the Medicare program, and eventually throughout the health care system. But they have been plagued with problems, including that nearly a third of the original participants have already dropped out. CMS is refusing to release information on how the rest are doing, suggesting it isn’t going well.
- Health IT. The HITECH bill was enacted separately from ObamaCare, and many billions have been spent on it, but reports from the field indicate the top-down efforts result in lower quality and less efficiency as physicians spend more time wrestling with computers than taking care of patients.
Essentially, everything the federal government itself has been responsible for doing has failed. None of this should have been a surprise. Any honest reading of the available research would have shown the futility of these efforts. For example, the United Kingdom was way ahead of us in trying to upgrade its health information technology. It spent $12 billion on the project before concluding it was an abysmal failure and shutting it down completely. An effort very similar to ACOs was tried from 2005 to 2010 in the form of the Physician Group Practice demonstration project. It, too, failed. Medical loss ratio limits have been tried without success in several states.
But, like the old Gene Autry song, the true believers in the Obama administration didn’t want to hear a discouraging word, so they locked themselves away and even refused to let the public know what they were doing.
But the biggest failure of all is still a few months away. This is the essence of the law – that it will reduce the number of uninsured. It will not. It is far more likely to raise the number of people without insurance coverage. Why? Three reasons –
- The Medicaid expansion will have at best a modest effect on covering new people. As we said above, one-third of the uninsured are already eligible for Medicaid but not enrolled. In many cases, they have been on the program before and haven’t bothered re-enrolling. They simply don’t find value in it. They are more likely to see a doctor by going to the hospital ER, and they know that if anything big happens they will be instantly signed up. Having a Medicaid card in their wallets just to please some bureaucrat is no enticement.
Now, it is likely that the Medicaid rolls will grow, but this growth will come from people who were already covered by their employers. Medicaid is free, while employers expect a premium contribution. It makes sense to switch, but it doesn’t add any uninsured people.
The individual market is gone. This means some 15 million newly uninsured people. Some of them will get coverage on the exchanges, but not many. The exchanges offer benefits that are too rich – things like pediatric dental care that few private insurance plans cover – and all those “enhanced” benefits cost money. Yes, some people will be offered subsidies, but in most cases they will still pay more than they have been used to, and the enrollment process is a nightmare.
Employers will drop coverage in droves. That has been happening slowly for years, but ObamaCare will accelerate it dramatically. In 2011 McKinsey & Company did a large survey of employers and found that 30% said they would definitely or probably drop their coverage. This was blithely dismissed by administration supporters, but McKinsey is a rock-solid company with no interest in political spin. This means perhaps 50 million people will no longer have coverage on the job. They will find –
- No more automatic enrollment going along with the job. People will have to take the initiative to find out about the Exchange.
- No more pure community rating of the employee share of premiums. The Exchange will vary premiums every year based on a person’s age.
- No more paycheck deductions of the employee share of premiums. People will have to make some kind of payment arrangement for their share of the premium.
- No more convenient and friendly HR Department people to answer questions. People will have to seek out an “Exchange navigator” to get their questions answered.
- More importantly, they will find no more employer contribution to the cost of their coverage.
Very few people have significant medical expenses in the course of a year. For most, the cost of insurance coverage far outweighs the cost of their medical expenses – especially if they are paying the entire premium themselves.
[Chart: Annual Spending as a Percent of the Total, by Decile. Source: “Medicare for All,” a presentation by Paul Y. Song, MD, PNHP 2011, slide #41; data attributed to Thorpe and Reinhardt.]
So perhaps 20% to 30% of the total, maybe even 50%, will find that insurance is a good deal through the Exchanges, but that still leaves some 25 million people who are newly uninsured.
Taken together, the numbers of uninsured Americans may double after Obamacare is fully implemented. How’s that for a kick in the head?
Now, you may be wondering about all those pitiful, helpless people you have heard about who have been denied coverage for all these years. Won’t they be lining up around the corner to finally get insurance coverage? Yes, very likely they will be. But keep in mind that the only people subject to such a denial are new applicants for individual coverage. This is a very small segment of the population. People getting employer based coverage are free of any such concern, as are people on Medicare or Medicaid. AHIP (the insurance company trade association) reports that in 2008, there were only 1,763,000 people who applied for individual coverage, and only 223,000 were denied. Many of these people were allowed to enroll in a state high-risk pool, so were not uninsured when this law was passed. That’s a pretty tiny number of people compared to the many millions who are about to become uninsured.
Finally, there is the issue of the mandate itself. Won’t those people who refuse to enroll be severely punished? No. There is no effective enforcement mechanism. Many commentators have remarked on how low the tax penalty is, but even that penalty is easily avoided. The only tool the IRS has for collecting the penalty is seizing a taxpayer’s tax refund. It may not garnish wages, it may not place a lien on property, it may not bring the taxpayer to court. It is easy enough to avoid a refund by adjusting withholding at the beginning of the year. That has the added effect of giving the taxpayer more money in his paycheck during the course of the year and depriving the federal government of an interest-free loan from the taxpayer.
So the biggest failure of all will take place in the next six to nine months: We will discover that the number of uninsured has doubled just in time for the 2014 elections.
[Originally published on The Federalist]
Today, the first-ever program-wide reduction in the Supplemental Nutrition Assistance Program (“SNAP”), also known as food stamps, will take place. The reduction amounts to $5 billion out of a program that spent nearly $80 billion in the most recent fiscal year, more than double what food stamps cost taxpayers when Barack Obama took office and more than five times the cost in FY2002.
Food stamp benefits are distributed to a shocking 47 million people, about 14 percent of all households in the United States, up more than two-thirds from five years ago despite the unemployment rate having dropped far below the worst months of 2009 and 2010.
Think about that: Nearly one out of every seven Americans gets the government cheese. That can only be good for those who distribute the Velveeta (or, for today’s pampered and unashamed EBT recipients, perhaps a fine organic chèvre), lovingly keeping millions of Americans as their little pet mice. We’re not even a nation of sheep anymore; we’ve been downgraded to rodents, hoping to be fed by our benevolent government keepers. Maybe we’ll even get over-sized Habitrails one day.
This increase in dependency is a specific goal of the Obama administration, with the USDA giving bonuses to states for “efficiency” in adding more people to the food stamp welfare rolls. In 2011, Oregon bragged of a $5 million “performance bonus” for increasing SNAP recipients by 60 percent over three years.
Contrary to willfully ignorant reporting describing today’s reductions as “a move by congress,” these automatic “cuts” represent the end of a temporary boost of 13.6 percent in SNAP benefits enacted through President Obama’s 2009 stimulus, which failed to stimulate anything other than the growth of government and dependency — again, precisely the goal of this administration.
Due to the reduction, according to CBS News, “the maximum payment for a family of four will shrink from $668 a month to $632, or $432 over the course of a year.” In other words, program participants will see a modest cut which was always intended to take place at the end of a temporary boost in benefits (despite Reagan’s warning that “nothing lasts longer than a temporary government program”).
Yet panic-stricken liberals are warning of — and perhaps hoping for — riots.
Fox News has reported that the Department of Homeland Security is wasting $80 million in taxpayer dollars to protect government buildings from the violence of those living off the forced generosity of others when that generosity is slightly reduced.
This has to stop.
We are not a nation of impoverished peasants living in straw huts, desperate for just enough kindness from the czar to allow us to survive, albeit cold and hungry, through another gray winter, while we listen hopefully to Bolsheviks promising to free us from bitter serfdom.
We are not a nation of mice, helpless in our cages but for the magical hand that makes cheese appear in our bowl.
And we are not a nation accustomed to seeing our fellow citizens’ self-worth destroyed by being led into the opiate of dependency — at least not on this scale.
Separate from discussions of the importance of freedom, the immorality of redistribution, and the exceptionalism and self-reliance that once defined the American character, an economic debate also surrounds the food stamp program.
It’s one thing for Nancy Pelosi, economoron that she is, to suggest that food stamps provide economic stimulus. Back in 2010, the lunatic leftist gave us this piece of economic wisdom: “It is the biggest bang for the buck when you do food stamps and unemployment insurance. The biggest bang for the buck.”
But it’s something else when a financial reporter buys unquestioningly into that obviously ridiculous argument.
In a piece called “Food stamp cuts will hurt more than you think,” Yahoo! Finance reporter Jeff Cox gives us this gem of economic idiocy:
Programs such as SNAP have what economists call a “multiplier effect”—in other words, a dollar given to an entitlement recipient has amplified economic benefits. In this case, those consist primarily of the grocers who benefit when food stamp users shop in their stores. The estimated multiplier effect for food stamps is as high as 2 to 1.
Cox, who republished this story on CNBC’s website, is regurgitating a myth perpetuated by the left and seemingly accepted by an economically ignorant public (including, obviously, financial reporters who should know better).
Particularly annoying about the myth, beyond the fact that it is used to support harmful public policy, is that it is so obviously impossible.
First, food stamp money is simply redistributed from other Americans, either by taxing their earnings today or by deficit spending and taxing our children’s future earnings. Which means that any positive economic effect is offset by an equal negative economic effect, some immediately and some as a wet blanket on the next generation’s economic opportunity.
Second, if it were the case that food stamps have a miraculous positive multiplier, not only should we then put the whole country on food stamps, we should also use rent stamps and gasoline stamps and grande Frappuccino stamps. Economic growth would skyrocket! Government revenue would surge so much from GDP growth that we could slash tax rates and still leave government with more revenue than it’s taking in today, while we all get rich at the same time.
In 1850, the great French economist Frédéric Bastiat warned against the Pied Pipers of Redistribution:
There is only one difference between a bad economist and a good one: the bad economist confines himself to the visible effect; the good economist takes into account both the effect that can be seen and those effects that must be foreseen.
Yet this difference is tremendous; for it almost always happens that when the immediate consequence is favorable, the later consequences are disastrous, and vice versa. Whence it follows that the bad economist pursues a small present good that will be followed by a great evil to come…
It is simply and obviously impossible for redistribution to create important positive economic effects, which is also why Obama’s ginormous stimulus package failed, why cash-for-clunkers failed, and why every Keynesian economic program ever attempted has failed.
Economic benefits arising from theft (also called taxation), even if the confiscated money is then redistributed toward ostensibly charitable ends, are no more likely to exist than is the Tooth Fairy.
In fact, there is no economic difference between the money I redistribute to my daughter when she loses a tooth and the money Democrats redistribute to people through the food stamp program (as long as my daughter then spends the money, such as on a candy bar in a grocery store, right behind a food stamp recipient). In both cases, one person’s gain is another person’s (my) loss. But at least when I act as the Tooth Fairy, we know that our money is going to someone we want it to go to.
I do not want Americans to starve in the streets, much less riot at the thought of getting less of my money. But even during the Great Depression, prior to the existence of the welfare state, we did not have either. The American people have been, and must return to being, proud, self-sufficient whenever possible, and voluntarily generous, rather than living in a statist world where Big Brother takes care of everything — meaning citizens need care little for others, or even for themselves.
Today’s first-in-history nationwide reduction in food stamp spending is a welcome pause in the metastasizing culture of dependency intentionally brought on Americans since FDR trampled the Constitution to create a welfare state.
But it represents the beginning rather than the end of the conversation: In September, the House of Representatives passed the Nutrition Reform and Work Opportunity Act of 2013, which would cut almost $40 billion from food stamp spending over the course of a decade (less than a 5 percent reduction) through more rigorous means testing, eliminating certain duplicate benefits, ending bonus payments and SNAP advertising budgets, and cutting down on fraud and abuse within the program.
The Democratic-controlled Senate passed a measure that would reduce SNAP costs by about one tenth of that amount and, given the Democrats’ history when it comes to redistribution and their true interest in reducing the number of dollars flowing through the sticky fingers of bureaucracy, even that minuscule reduction is likely a mirage.
President Obama is pressing for a Farm Bill, which includes food stamp funding, to be agreed upon and passed as quickly as possible. The chasm between budget- and dependency-cutting House Republicans and Harry Reid’s Senate poverty pimps will be extraordinarily difficult to bridge. At least the worst case seems to be a halt to the rapid growth of American mice nervously awaiting their handout of government cheese.
[First Published by American Spectator]
Apparently that is not stopping former FCC Chairman Reed Hundt and Greg Rosston from trying in their new white paper, “Articulating a Modern Approach to FCC Competition Policy.”
Their paper contrives “three different competition policy approaches: the classic role of regulating terms and conditions of sale, the modern role of using various tools to create largely deregulated, multi-firm, competitive markets, and the laissez-faire approach of believing that unregulated markets, even if monopolized, will produce the best outcome.”
The purpose of the paper’s elaborate contrivances is clear: to advance Mr. Hundt’s FCC-first approach to communications policy by associating it with the popular “modern” brand and by linguistically positioning it as the middle ground between two extremes – all in hopes of influencing the core policy trajectory of prospective FCC Chairman Tom Wheeler.
The fatal flaw of the paper’s framework is that it is clearly contrived, so it withers under scrutiny.
Here are the problems with the paper in a nutshell. First, the paper’s rewrite of FCC history is demonstrably selective, revisionist and incomplete. Second, it effectively proposes to redefine the term “modern” to allow for retention of their favorite obsolescing policies. Third, the paper ignores the transformative effect and reality that competition policy and the Internet have on the FCC’s original regulatory authority – in order to imagine that the FCC remains all powerful over an ever-expanding regulatory domain.
If an idea or policy is worthy of retention, one should argue it on the merits and facts, not on the contrived notion that something is “modern” when it clearly is not.
Revisionist History: First, the authors’ self-described “short history” of “classic,” “modern,” and “laissez-faire” FCC competition policies is demonstrably selective, revisionist, and incomplete. The authors undermine the credibility of their analysis and conclusions by ignoring almost all of the most important FCC decisions of former Chairman Hundt’s tenure and the demonstrably destructive impacts those decisions had on investment, competition, and the marketplace.
Why no mention of Mr. Hundt’s first big decision to slash cable rates another 17% on top of the 10% rate reduction ordered by then Acting FCC Chairman Quello in implementing the 1992 Cable Act? In hindsight, are there no competition-policy lessons learned from how that decision ravaged cable investment for several years — delaying cable-led broadband competition until the turn of the century? And didn’t the decision to drastically lower cable rates an additional 17% by regulatory fiat actually make it harder for nascent DBS competitors to succeed by competitively undercutting cable on price?
Why no mention of the several-year delay in the formation of real facilities-based competition as a result of pursuing a “classic” common-carrier-regulation-vision via FCC TELRIC/UNE-P contrivances, which were ultimately found illegal in court?
In hindsight, are there no lessons learned about FCC competition policy from the marketplace result of these “classic” regulatory decisions? Are there no “competition” lessons learned from the Hundt-FCC’s policy of picking the CLECs as market winners — via unsustainably massive subsidies and regulatory favoritism — that ultimately led to the bankruptcy of the entire CLEC industry?
Or what did the FCC learn about competition from its one-sided reciprocal compensation policies that helped fuel a trillion dollar market bubble and created over twelve unsustainable Internet backbone companies that either went bankrupt or were consolidated to survive?
Why no discussion in the paper of how the so-called “modern” competition approach actually delayed for several years the world-leading, facilities-based broadband competition that we now enjoy under “laissez-faire” policy? Why no discussion in the paper of the opposite competition results in the marketplace between the policies of the “modern” nineties and the “laissez-faire” aughts? Concerning merger competition policy, why no mention in the paper of the lessons learned from the Hundt-FCC preempting a potential SBC-AT&T merger as “unthinkable” by spontaneously creating a new “competition” policy of “precluded” competition out of whole cloth, when the FCC several years later approved an SBC-AT&T merger with conditions and without anti-competitive consequences?
Concerning Hundt-FCC auction and spectrum cap policies, why no mention in the paper of lessons learned from the FCC making spectrum-conditioning mistakes that kept 20 MHz of prime Nextwave PCS spectrum from getting to market for well over a decade? Or the FCC’s spectrum-conditioning mistakes that have kept public safety from getting the spectrum it needs for over a decade as well?
The point here is that if we are to discuss history in order to learn from it for the purpose of new policy making, it is essential to have an accurate and complete understanding of the results of various competition policy approaches. Selective and revisionist history is a poor foundation for creating workable, legal, and successful competition policy going forward.
Redefines “Modern:” To conclude that Chairman Hundt’s competition policy notions are “modern” requires some definitional gymnastics and eye-closing. The paper self-servingly defines the “modern” competition era as between 1970 and 2000, when “competition” was permitted by the FCC and centrally managed by the FCC – essentially the FCC-centric competition era.
When sustainable real market competition emerged after 2000, when consumers not regulators picked winners and losers, and when market forces, investment and innovation, not government subsidies, determined competitive outcomes, the paper amazingly excludes that recent broadband competition policy success from the “modern” competitive era. The paper again self-servingly defines that successful broadband competition, not as modern, not even as competition, but as “laissez-faire.”
The paper also conveniently ignores that broadband Internet competition, 4G wireless, and the smart-phone/tablet revolution occurred during the non-modern, laissez-faire era. Wouldn’t most people consider broadband, 4G, smart-phones and tablets “modern” competition in every sense of the word?
The paper goes on to further define “modern” as “multi-firm” competition. Given that one of the definitions of “multi” is “more than one,” isn’t all competition by definition “multi-firm?” If “multi-firm” competition is redundant, why define one’s entire framework around it?
This suggests that what is really going on is more of a Trojan horse framework, in which re-regulation policy is disguised as competition policy, and in which the FCC unilaterally could define “multi” not as more than one firm, but as 4, 5, 6, or more firms in the “relevant” market. The FCC could then potentially regulate any information services company for falling short of the FCC’s contrived multi-firm competition framework.
Specifically, the paper advises: “The FCC needs to put in place a framework for all of its decisions so that companies will understand how such arrangements will be evaluated; without clear guidance, like that provided by the DOJ/FTC merger guidelines, firms will not know how the FCC will judge their actions.”
The very big problem with this approach is that such a “framework” would be a de facto law, and that is Congress’ role under the Constitution. Nowhere in the 1996 Telecom Act creating communications competition or any other law does Congress give the FCC such sweeping regulatory authority over information services firms.
The other big problem here is that the paper effectively recommends transmogrifying after-the-fact antitrust law enforcement, overseen by the DOJ/FTC, into preemptive FCC policy with no authority to do so from Congress.
We have seen this overreach before: e.g. the Hundt-FCC’s overturned UNE-P competition policy; the Martin-FCC’s overturned net neutrality enforcement action; and the Genachowski-FCC’s likely-to-be-overturned common carrier-like regulation of broadband.
The current FCC should be mindful that each of these attempts to unilaterally re-define competition “policy” was struck down because the FCC did not have direct authority to set national competition policy in that way.
FCC-Firsters: The third major contrivance of the Hundt-Rosston white paper is that it imagines a near-all-powerful FCC – a de facto “FCC-first” construct.
Consider the authors’ own words here: “Somewhere between a tenth and a sixth of the American economy is in [the FCC’s] purview.” (p. 7) “Other than the all-important consideration of judicial review, not too many checks on its [the FCC’s] authority exist.” (p. 4) “… the FCC is its own boss. Congress would find it quite difficult to impeach a commissioner, or to pass a law overturning an agency decision. The agency, as a Fourth Branch of government, is, as to the markets in its purview, the most important of all the branches.”
The big problem here is that the paper’s assumption of largely unchecked FCC power ignores the transformative effect that new laws, Congressional policies, FCC precedents, and technological innovations have had in hollowing out the FCC’s sweeping, and now obsolescing, original 1934 regulatory authority.
That law’s policy and reality predicates are obsolescing or obsolete. Telephone service is not a “natural” monopoly; competition is not only possible, it is vibrant and widespread. Technology then was analog, not digital, and continuous, not discontinuous, in nature. We now enjoy universal voice service, and broadband is nearly as universally available.
Moreover, the FCC’s underlying statutory authority did not envision many transformative technological innovations: TV, transistors, digital computers, cellular service, PCs, the Internet, the World Wide Web, wireline broadband, wireless broadband, web applications, smart-phones/tablets, etc. Is it “modern” to predate all that?
The authors’ uber-expansive view of the FCC’s power going forward largely ignores that consumers do not need the FCC anywhere near as much as they did eighty years ago. Competition and innovation have naturally obsolesced much of the FCC’s 1934 purpose and role.
To be sure, there are transitional or vestigial roles and consumer-protection functions for the Federal Government to play in the truly modern communications era, but they are not the roles envisioned eighty years ago, when the economic and technological predicates were almost the opposite of today’s.
The authors’ encouragement of an FCC-first approach to competition policy may advance a classic interventionist role for the FCC like Chairman Hundt pursued during his tenure, but it is badly outdated for the technological, economic, competitive, and legal predicate that prospective FCC Chairman Tom Wheeler inherits and must operate within.
The authors’ characterization of their approach to FCC competition policy as modern is contrived and not accurate.
First, the paper tries to rewrite FCC history to exclude many of the most important competitive facts and results from the policies that the authors advance. Second, the authors attempt to redefine the term “modern” to the point of being unrecognizable. Third, the authors appear to goad the FCC to effectively ignore the law and the courts and implement whatever FCC competition policy three commissioners support.
Let me finally address the authors’ substantive recommendations. The authors recommend the FCC “have a consolidation policy.” That’s wholly unnecessary and redundant. America already has a consolidation policy; it’s called antitrust law.
What the authors are asking is for FCC competition policy to effectively flip the legal burden of proof: rather than the government having to prove a merger or acquisition is anti-competitive, a company would have to prove to the FCC that a merger is pro-competitive. In other words, companies would be guilty of anti-competitive intentions until proven innocent.
The authors also recommend an FCC competition policy built on the premise that “standing up for competition usually turns out to be the same as standing up for entrepreneurship, innovation, the little guy who wants to get big…”
Defining competition as picking “entrepreneurship, innovation and the little guy” as the winners may sound appealing to some, but it exposes the authors’ real concept of competition policy: tilting the rules toward, and subsidizing, certain government-favored players, especially new entrants, with no concept of competition as competing for customers, ensuring supply meets demand, providing the most value and benefits to customers, or investing to have a superior offering.
This recommendation epitomizes my strong opposition to the authors’ proposed FCC competition policy, which would ensconce the FCC as central manager of the communications marketplace, with a pre-determined view of which companies should gain or lose share over time via government decisions, without regard to what consumers want or do, and without regard to economics, return on investment, or solvency and profit.
When then-FCC Chairman Hundt implemented this kind of new-entrant-favored FCC competition policy in the mid-1990s, it proved to be the single most financially destructive regulatory experiment in managed competition in U.S. history.
The Hundt FCC’s competition policy made it clear to the marketplace that the FCC would be “standing up for entrepreneurship, innovation, the little guy.” That FCC ensured that new-entrant CLECs could expect virtually every price, term, and condition advantage and subsidy that CLECs and new fiber backbone companies could dream up.
This predictably led to economically unsustainable competition, where dozens of CLECs and over a dozen new fiber backbone companies were funded when market economics could sustain only a few. American investors and pensioners lost big — over a trillion dollars when the tech bubble burst. The bankruptcy of the entire CLEC industry cost hundreds of billions of dollars in losses. And the burst of the tech bubble meant FCC-encouraged fiber backbone companies and equipment providers lost over a trillion dollars in market value in weeks.
The cause of this huge and destructive carnage was the uneconomics and hubris of FCC competition policy: the FCC imagined that its visible hand, making most of the relevant economic decisions for the sector, could outperform the market’s invisible hand.
In stark contrast, the so-called laissez-faire, or light-touch, regulatory approach has attracted over a trillion dollars in private capital investment, produced world-leading facilities-based broadband competition, and generated stellar competitive outcomes: falling real prices, increasing customer value, innovation, investment, differentiated choices, and more.
In sum, a modern FCC will adapt to the real progress of America’s competitive markets and world-leading technological innovation. A modern FCC will not look backward nostalgically or try to relive the past by dragging obsolescing, FCC-empowering economic regulations into the present day so the FCC can quixotically try to control America’s Internet tomorrows.
To paraphrase our cinematic American philosopher Forrest Gump: modern is as modern does.
[Originally published on Precursor Blog]
One of my favorite off-Broadway shows is “I Love You, You’re Perfect, Now Change.” The show closed in 2008 after running for 12 years and 5,003 performances. In a series of unrelated vignettes, the musical comedy delves into the world of dating, love and marriage.
Enough said about Broadway, on or off.
But I’m put in mind of “I Love You, You’re Perfect, Now Change,” in thinking about the newly reconstituted Federal Communications Commission. With Tom Wheeler coming on board this week as the new FCC chairman and Michael O’Rielly as a new commissioner, the Commission will be back up to its full five-member complement.
No, I don’t mean to say I “love” Messrs. Wheeler and O’Rielly. But based on what I know about them, I do respect and admire them for what they’ve already accomplished in their public and private sector lives.
And, no, I don’t mean to say Messrs. Wheeler and O’Rielly are “perfect.” But, as widely acknowledged, they do bring a wealth of experience and expertise concerning communications policymaking to their jobs, and this certainly is a net positive.
But I do mean to say: “Now Change.”
What I mean to suggest by this is that it is an opportune time for a reset of the FCC’s generally pro-regulatory mindset. This pro-regulatory mindset is perhaps not unnatural, given that, from its inception and for many decades thereafter, the markets under the Commission’s regulatory jurisdiction were either monopolistic or tended in that direction.
But that is no longer the case, of course, and this fact is widely, if not universally, acknowledged by public policymakers, scholars, and, importantly, consumers. The fact of the matter is that the transition from narrowband to broadband, from the Analog Age to the Digital Age, has enabled competition to develop in formerly monopolistic markets.
In my oral testimony before the House Subcommittee on Communications and Technology at its October 24 hearing on the “Evolution of Wired Communications Networks,” I stated:
There is no doubt the IP revolution has enabled increasing competition among broadband providers for the provision of voice, high-speed data, and video services, whether these providers offer their services over wireline, cable, wireless, satellite, fiber, or whatever. The relevant point is not that all of the services offered by all of the competitors are perfectly substitutable, or that they all meet every consumer’s desires at all times. The relevant point for policymakers is that, for an increasingly large number of consumers, these various competitors provide a choice of service providers offering a choice of attractive service options.
I suspect that not one of the FCC Commissioners would disagree with the above statement. But the important, more fundamental, point is to grasp the implications for regulatory policy. As I said in my House testimony, in today’s increasingly competitive environment, going forward the FCC needs to rely much more on a free market-oriented paradigm in which “future regulatory activity should be tied closely to findings of demonstrable market failure and actual consumer harm.”
This does not mean there is no place for regulation in the event of market failure and consumer harm. But it does mean that the agency needs to reset its mindset so that it does not look to regulation in the absence of proven market failure and consumer harm.
At this time I want to eschew addressing any specific proceedings. You can find hundreds and hundreds of pieces doing so on our Free State Foundation website and our blog – and rest assured they will keep on coming.
Instead, in closing, I want to say that I have no doubts about the good faith of the new FCC Chairman, or of any of his colleagues. Nevertheless, I do know that, from their high positions, which confer considerable unbridled administrative discretion, it is often easy just to succumb to the bureaucratic imperative to “regulate first,” perhaps with the best of intentions.
As a cautionary warning, I think that what Dean Roscoe Pound had to say around the middle of the last century is very useful for the commissioners to appreciate: “[T]he zeal of administrative agencies to achieve the immediate ends they see before them leads them to see their functions out of focus and to assume that constitutional limitations and guaranteed individual rights must give way before their zealous efforts to achieve what they see as a paramount purpose of government.”
And, in the same time frame, F.A. Hayek had this to say in The Constitution of Liberty about what he called “general regulations of economic activity.” Hayek declared “the economist will remain suspicious and hold that there is a strong presumption against such measures because their over-all cost is almost always underestimated and because one disadvantage in particular – namely the prevention of new developments – can never be fully taken into account.”
With these injunctions in mind, I have no hesitancy, with a nod to off-Broadway, to say to Chairman Wheeler and his colleagues: “I respect you, you have experience and expertise, now change.”
[First published at the Free State Foundation.]
“Laws are like sausages — it is best not to see them being made,” the German statesman Otto von Bismarck famously noted. Now Mayor Bloomberg stands accused of not letting the public see how his team made various nanny-state laws and regulations, from his (failed) soda ban to his rules against donating food to city-run homeless shelters.
A consumer group, Keep Food Legal, last week sued the Bloomberg administration for allegedly failing to comply with a series of Freedom of Information Law requests going back more than a year. These sought documents from the Mayor’s Office and the Health Department touching on which groups, individuals and outside agencies helped develop the city’s most restrictive food laws and regulations — from the mandatory calorie counts on menu boards to the trans-fat ban, as well as reportedly pending proposals to restrict salt in restaurants and limit happy hours.
The Health Department belatedly offered to provide some information, but the Mayor’s Office completely failed to comply with requests. Yet the state Freedom of Information Law allows only very narrow exemptions to requests like Keep Food Legal’s.
Maybe City Hall feels burned by its 2010 experience, when the Health Department complied with a New York Times FOIL request about a controversial ad campaign meant to support the mayor’s soda tax.
That campaign used public money for posters directly linking sugar consumption to weight gain — a claim the FOIL showed that the department’s own top nutritionist had debunked.
When it comes to the science, “the idea of a sugary drink becoming fat is absurd,” Health Department scientists warned in the memo obtained by the Times.
Cathy Nonas, the chief nutritionist, warned if the ads contained such a claim, scientists “will make mincemeat of us.”
Are these the sausages the mayor doesn’t want you to see being made?
This is precisely the type of scrutiny Freedom of Information laws are meant to foster.
The issue here isn’t so much why the nanny-state policies are bad, but rather how the city, particularly when it comes to food policy, operates secretly — and possibly under the shadowy influence of activist groups like the Robert Wood Johnson Foundation and liberal academics such as NYU’s Marion Nestle.
There’s nothing wrong, per se, about developing policies recommended by outside groups, but the public has a right to know about it. Imagine the furor if the Health Department were collaborating with McDonald’s or Coca-Cola and using public dollars to promote their claims — all in secret. The outrage would be palpable, and justified.
Transparent policy-making is important so that the public can see how our laws are made. That scrutiny is valid, be it from regular citizens, left-wing food police or pro-freedom consumer groups such as Keep Food Legal.
When Bloomberg finally leaves City Hall, a new administration could be emboldened by the lack of transparency this one got away with. Keep Food Legal’s action, if successful, will send a message to the next administration: You’re free to listen to the most radical outside advisers — but you can’t do it in secret.
[Originally published on the New York Post]
Arguably the most prescient comment in American political history is the perhaps apocryphal response from Benjamin Franklin, one of our Founding Fathers, to the passer-by who asked him in Philadelphia, in 1787, what kind of government the Founders had designed for the United States of America: “A Republic, madam, if you can keep it.”
Through two centuries of ratification debate, amendment and judicial interpretation, the Constitution itself has stood up remarkably well. Yet nearly every day in every way in the 226 years since its creation, the strength and resilience of the Constitution have been tested.
Lord Acton famously observed that power tends to corrupt and absolute power corrupts absolutely. Over the past two-and-a-quarter centuries, the powers vested in each branch of government — and the power that officeholders seek and assume for themselves — have wreaked havoc on the rights of the people in whose name the Constitution was adopted.
Nowhere is this more evident than in the troubling growth of the executive branch’s interest in operating in secrecy.
Compare and contrast just two recent cases, both coincidentally involving sex and secrets. In one case, the executive branch has gone to unprecedented lengths to discover who “leaked” certain information of public interest; in the other case, the government refuses to disclose who within the government “leaked” personal and private information to the media.
The first case is that of 55-year-old bomb expert Donald Sachtleben of Carmel, Ind., who — according to the Sept. 24 New York Times and other sources — recently agreed to accept a nearly 12-year federal prison sentence after being accused as the source for a May 7, 2012, Associated Press report on a foiled overseas terrorist bomb plot.
Americans now know that in April 2012 a successful U.S. intelligence operation had disrupted an al-Qaida plot in the Arabian Peninsula to destroy an airliner using an underwear bomb purportedly designed by Ibrahim al-Asiri. Acting on a leak from former FBI agent Sachtleben, the AP broke the news in May 2012 before the White House had a chance to do so, thereby potentially depriving an incumbent president of the chance to maximize its political value in a re-election year.
Some 40 years ago, then-FBI Associate Director Mark Felt (later nicknamed “Deep Throat”) spoon-fed details of an FBI investigation of the White House to two young Washington Post reporters, Bob Woodward and Carl Bernstein, who were able to keep Felt’s identity a secret until after his death some 30 years later.
But in Sachtleben’s case, the FBI interviewed more than 500 officials while the Justice Department secretly subpoenaed calling records associated with 20 telephone lines of the AP or its reporters in an effort to uncover the man (Sachtleben) whom the AP still refuses to confirm as its source.
In what The Times characterizes as a “bizarre coincidence,” the government then “discovered” that other “law enforcement officials” had already conveniently seized Sachtleben’s computer, cellphone and electronic media in connection with a purportedly unrelated child pornography investigation, making their task that much easier.
Caught between the hammer of the national security charges and the anvil of a child pornography prosecution, Sachtleben reportedly agreed to a 140-month plea agreement, the heaviest yet imposed on a civilian “leaker.”
Meanwhile, on the other side of the personal privacy/government secrecy coin, the Justice Department currently seeks to dismiss a lawsuit by Jill Kelley, the woman whose report to the FBI of “harassing e-mails” from Paula Broadwell, the biographer of former Army Gen. David Petraeus, ultimately led to the ouster of Petraeus as CIA director and of fellow Gen. John Allen as top U.S. commander in Afghanistan.
Kelley and her husband, a co-plaintiff, seek to find out who in the government leaked Kelley’s name and e-mails to the media in violation of the Privacy Act, a post-Watergate “reform” law, thereby placing Kelley in the role of villainess in the downfall of Gens. Petraeus and Allen.
The government seeks to dismiss the lawsuit on the grounds that the Privacy Act does not apply to the FBI and the Defense Department whenever they opt out and that the Kelleys’ complaint fails to plead that the leaked information was kept in a records system subject to the act’s requirements.
The Framers rightly recognized that rulers, even in a constitutional republic, would quickly abuse their power if not checked. Two of the most important checks imposed were noninterference with freedom of the press and requiring that those who made the laws should also live under them.
It’s a worrisome sign that 40 years after Watergate, the executive branch still argues that the laws that apply to the people do not apply to it and that freedom of the press is no longer so highly prized.
It’s a sobering thought that, according to The Times, the executive branch has brought eight leak-related prosecutions in the past five years, compared to only three during the previous 221 years under our Constitution.
[First published in the Chicago Daily Law Bulletin and reprinted by permission of the author.]
October 17 was the fortieth anniversary of the oil embargo slapped on America by the Organization of Petroleum Exporting Countries (OPEC). That action changed the entire geopolitical map by taking the power from the United States and giving it to the Middle East. As a result of the embargo, America slid into a serious recession.
It was a different world prior to the embargo. America was the dominant player in the energy market and had surplus supply to fill the demand gap OPEC created.
It wasn’t the embargo itself that changed the dynamic, but the timing of it.
U.S. oil production peaked in 1970 and declined sharply in the subsequent years. When OPEC chose to use oil as a diplomatic weapon in 1973, America had become increasingly dependent on suppliers from the oil-rich Middle East. Scarcity was our reality.
To punish the U.S. for supporting Israel in the Yom Kippur War, OPEC banned oil exports to the U.S. OPEC then reduced production by 5 percent per month until the embargo ended in March of 1974.
For the past forty years, OPEC has controlled energy’s geopolitical equation. Every president since Richard Nixon has urged the country to strive for energy independence so that we don’t face another energy crisis like 1973.
While the impacts of the embargo have been harsh, there’s also a silver lining: North American producers were forced to find new ways to explore for and produce hydrocarbons—and those technologies and techniques developed by individuals and industry have, once again, changed the geopolitical equation.
According to the Reuters story on the embargo’s anniversary:
“The United States is less reliant each month on Middle East energy, thanks to increasing production of both oil and natural gas from technologies such as hydraulic fracturing, or fracking, which allows extraction of oil and gas from shale deposits.”
While the U.S. is less reliant on the Middle East, with our crude oil production up by 50 percent since 2008, the increase isn’t actually due to hydraulic fracturing, as Reuters states. According to Harold Hamm, credited as one of the first wildcatters to take a chance on developing North Dakota’s Bakken field, the claim that “fracking is the root of America’s new supplies of oil and gas” is a misconception that has “been erroneously driving public discourse and policy.”
Hamm comments on the embargo’s anniversary in Forbes: “It’s also time for America to hear the truth about the real source of our modern-day oil and natural gas renaissance—horizontal drilling.” (The distinction is important as fracking has been used by the environmental lobby to create fear when in fact fracking has been consistently in use for more than 60 years.) Extolling how far America has come since the 1973 embargo, Hamm states: “Never again are we going to be held hostage and extorted.”
Hamm is correct. As the Wall Street Journal says, “greater U.S. oil production gives foreign-policy flexibility.” Likewise, Time Magazine affirms: “OPEC’s influence has been diminished, and oil can no longer be used as a weapon the way it was 40 years ago.”
How does energy security give the U.S. “foreign-policy flexibility”? One example is Iran. Reuters reports: “Last year, Washington and its European allies orchestrated a partial boycott of Iranian oil, to compel Tehran to return to talks about its nuclear program. The sanctions against Iran took roughly 1 million barrels per day off world markets—without the price spikes many predicted.” Additionally, U.S. production has helped dampen price spikes from supply problems in Nigeria, Libya, and Sudan—and made us less vulnerable to Middle East oil shocks. Without the domestic supply, current gasoline prices would be even higher.
While U.S. dependence on Middle Eastern oil has reversed course (increasing for thirty years after 1973 and declining since 2008), we are still importing the same percentage of oil that we did 40 years ago: 35 percent.
We have come a long way, but there is still much that can be done to reduce use of Middle Eastern oil and improve our energy and national security. Solutions tend to fall into two categories: supply side and demand side.
The supply side is being secured by increasing U.S. oil and gas production—but we can do more. President Obama needs to finally approve the Keystone pipeline. Allowing more access to federal lands and accelerating approval for drilling permits are two other supply-side solutions.
On the demand side, Robbie Diamond, founder, president and CEO of Secure America’s Energy Future (SAFE), concludes:
“The domestic oil boom has already reaped tremendous benefits, but integrating natural gas and electricity into America’s transportation system is a necessary way to diminish both our dangerous reliance on a single commodity and our economic exposure to the global oil market.”
At the Oil Embargo +40 conference, organized by SAFE and held in Washington, DC, on October 16, Fred Smith, Chairman and CEO of FedEx, espoused the benefits of electric vehicles for short-haul, light-duty applications and natural gas for longer-haul trucks, and Dan Akerson, Chairman and CEO of General Motors, announced a new bi-fuel Chevrolet Impala that will use both conventional gasoline and compressed natural gas.
Could America still feel the shockwaves of supply disruptions caused by Middle Eastern instability? Yes, but we are far less vulnerable today than we were in 1973, as the geopolitical equation continues to evolve. A recent report from Citigroup points out that by the end of the decade, the U.S. “could be freed from the shackles involved in sacrificing a values-driven policy focusing on human rights and democratic institutions in order to secure cooperation from resource-rich despotic regimes.”
We may never see $1 a gallon gasoline again. But, we can be optimistic about America’s potential energy future (if the Obama Administration policies don’t impede its success). Hamm exclaims:
“Perhaps most significantly on the 40th anniversary of the OPEC Oil Embargo, U.S. gasoline prices are down despite an escalating crisis in the Middle East, and we are no longer beholden to go to war and sacrifice American lives to protect our oil interests.”