Somewhat Reasonable

The Policy and Commentary Blog of The Heartland Institute

Tax Inversion Freedom vs. Fiscal Berlin Wall

6 hours 36 min ago

The Obama Administration has proposed its latest form of collectivist control over the American people. In a letter to Congress, U.S. Treasury Secretary Jack Lew has called for punishment and prohibition of any company that tries to move its headquarters overseas to avoid higher taxes in the United States. Plus, Mr. Lew has the audacity to call his proposed territorial imprisonment of American business “economic patriotism.”

National Imprisonment Equals “Economic Patriotism”

He said, “What we need is a nation with a new sense of economic patriotism, where we all rise or fall together.” In other words, if the government ruins or weakens your business or financial well being through its misguided and disastrous economic policies, no one is to be allowed to escape the consequences.

The implication, it should be understood, is that you are the ward and property of the state. The government determines and fully controls your present and your future. Any attempt to move your enterprise to a fiscally freer and less burdensome country or political jurisdiction will be prevented through the full police power of the American government.

What has caused a panic among many in Washington is that some high profile corporations have gained media attention with their plans to shift their corporate headquarters to countries with lower business taxes than those in the United States.

How dare portions of the American public who are the shareholders in these companies try to deny the political authorities some of the tax revenue they lust after!

You see, the United States has one of the highest corporate tax rates in the world. Also, the U.S. government imposes double taxes on American companies that do business in other parts of the world. If a U.S.-headquartered enterprise earns profits in another country, it pays taxes in the country in which those profits have been earned.

But unlike most other nations, if that American enterprise repatriates back to the U.S. any or all of the net profits after paying those foreign taxes, Uncle Sam taxes them again. In other words, Washington doesn’t care where that American company has earned those profits or how much a foreign government has already taxed it; Uncle Sam demands his own “cut.”

Tax Differentials and Personal Choice

Let me give an example from my own experience as an income earner deciding where to work.  A good number of years ago, I was teaching at a university in Texas. I was offered a highly attractive academic position at a free market-oriented college in Michigan.

I explained to the academic dean at this college in Michigan that his salary offer was one I had to refuse.  Texas, I pointed out, had (and has) no state income tax, while Michigan does. If I were to accept the offer, I would be financially worse off in comparison to the take-home pay I was earning in Texas. I made a counter-offer specifying the salary I would have to receive for accepting the position to be financially rewarding enough.
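As a rough sketch of the arithmetic behind such a counter-offer (the salary figure and the flat state tax rate below are purely hypothetical illustrations, not the actual numbers involved), a Michigan offer has to be grossed up to match the take-home pay of a no-income-tax state:

```python
# Hypothetical illustration only: what Michigan salary delivers the same
# take-home pay as a Texas salary once a flat state income tax is applied?
# Federal taxes, cost of living, and other factors are ignored for simplicity.

texas_salary = 60_000      # hypothetical offer in a state with no income tax
michigan_rate = 0.0425     # Michigan's flat state income tax rate (approximate)

# In Texas the whole salary is take-home pay (no state income tax).
target_take_home = texas_salary

# A Michigan salary S leaves S * (1 - michigan_rate) after state tax,
# so the break-even offer grosses the Texas salary up by that factor.
required_michigan_salary = target_take_home / (1 - michigan_rate)

print(f"Break-even Michigan offer: ${required_michigan_salary:,.0f}")
# -> Break-even Michigan offer: $62,663; any offer below that leaves the
#    mover financially worse off.
```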

My employment experience is far from unique. Indeed, it goes on all the time for tens of thousands of people in the United States every year. Pleasant working conditions, an interesting and challenging job description, better opportunities for promotion and advancement, and desirable geographical location in which to live are all among the factors that most people weigh when deciding whether or not to leave a job and accept alternative employment.

But salary matters, too, and for most of us it is given significant weight in this decision-making process. All things considered, each of us would rather earn more, on net, than less. Our take-home income is the means by which we have access to all the purchasable goods and services we like to buy in the market. And the greater our take-home pay, the more of the things that money can buy we can acquire.

It is not surprising, therefore, that besides the nonmonetary factors that influence employment decisions, people usually investigate, to one degree or another, how local and state taxes will affect their net-income position when deciding whether to move from one place to another.

Unless the monetary reward after taxes is sufficiently attractive (or unless the nonmonetary factors are so overwhelming), many of us are unlikely to make a move from one town to another or one state to another when we are comparing job opportunities.

Business Decisions and the Tax Structure

The same incentives and relative earning opportunities guide business enterprises in selecting locations for their headquarters and their production, service, and distribution facilities among the various states.  Because of tax advantages, hundreds of thousands of companies are incorporated in the tiny state of Delaware, including a majority of the companies listed among the Fortune 500.

Tax differentials can also influence decisions about the country in which a business has its headquarters and facilities. Globally, the average corporate tax rate in 2013 was about 24 percent. In the European Union it was an average of 23 percent. In Latin America it was around 27 percent. In Africa, it was about 28 percent. But in the United States, it was between 35 and 40 percent.

In addition, the U.S. government taxes overseas business earnings in a way that most of these other countries do not. For instance, suppose you own and operate your business in one of the leading member countries of the European Union, but also have a subsidiary in the United States. You will owe taxes to your home government only on the profits you’ve earned from sales in your own country. The U.S. government will have already taxed any profits your company earned from American operations; you can then, in general, repatriate those after-tax profits back to your own country without any further tax liability.

However, if you are an American company with a subsidiary in any other country, the United States will double-tax you on your foreign earnings. Not only will you have paid corporate taxes in the foreign country in which you’ve earned those profits, but in most cases the U.S. government will also tax those profits if you repatriate them back to the U.S. The fiscal arm of the U.S. Treasury extends around the globe, claiming a right to a portion of anything you’ve earned anywhere in the world.
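A back-of-the-envelope sketch of the difference may help. The figures below are purely illustrative, and it is worth noting that the U.S. system generally allows a credit for foreign taxes already paid, so the residual U.S. tax on repatriation is roughly the gap between the two rates rather than a second full levy:

```python
# Illustrative comparison of territorial vs. worldwide corporate taxation.
# All figures are hypothetical round numbers, not any particular company's.

foreign_profit = 1_000_000   # profit earned by a foreign subsidiary
foreign_rate = 0.23          # e.g., roughly the EU average rate cited above
us_rate = 0.35               # top U.S. federal corporate rate at the time

# Territorial system (most other countries): only the country where the
# profit is earned taxes it; repatriation triggers no further tax at home.
territorial_kept = foreign_profit * (1 - foreign_rate)

# U.S. worldwide system: the foreign tax is paid first, and repatriated
# profits owe U.S. tax as well; a credit for the foreign tax paid means the
# residual U.S. bill is roughly the difference between the two rates.
residual_us_tax = foreign_profit * max(us_rate - foreign_rate, 0)
worldwide_kept = territorial_kept - residual_us_tax

print(f"Territorial system: ${territorial_kept:,.0f} kept")   # $770,000
print(f"Worldwide system:   ${worldwide_kept:,.0f} kept")     # $650,000
```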

This is why American multinational companies have parked nearly $2 trillion of accumulated profits in foreign countries, rather than bringing it home to the United States to be taxed at a corporate rate of at least 35 percent. This represents a huge amount of money that could have been distributed as dividends among tens of millions of Americans who own shares in these American enterprises, and who are therefore denied the financial benefits those dividends could have provided.

Tax Inversion as Escape from Fiscal Plunder

To minimize their tax liability within the United States and to try to reduce tax disadvantages, a growing number of American corporations have been moving their headquarters “offshore,” a procedure known as “corporate inversion.”

A company opens a subsidiary in another country, or merges with another company in another country. It then transfers its international headquarters to the country in which the subsidiary or the merged company is located.

Often virtually nothing else changes as a result of this inversion. Manufacturing, jobs, sales, and marketing may remain exactly as they were before. It is basically a paperwork process to shift the company’s ownership and headquarters outside the United States to avoid such fiscal disadvantages as double taxing of earnings.

Recently the Pfizer Corporation attracted a great deal of publicity when it attempted to buy a British pharmaceutical company in order to move its headquarters to the United Kingdom.  And it has been reported that Walgreens Pharmacy has been weighing the possibility of buying a British drugstore chain that has its headquarters in Switzerland.

Critics of the type of legal restrictions that Secretary Lew wishes to see implemented against corporate inversions have pointed out that such decisions and actions are perfectly legal and no different from a company’s decision to move, say, from Iowa to Kansas, where corporate taxes are noticeably lower.

Rather than attempt to legally imprison American companies within the territorial jurisdiction of the United States government, a far better solution would be to lower (or even abolish!) corporate taxes and eliminate the double tax on earnings by American companies from their foreign business activities.

The Tax State vs. Economic Freedom

The real heart of the problem, however, is a lot deeper and more fundamental than these insightful observations and policy suggestions about tax disadvantages. The Austrian economist, Joseph A. Schumpeter, suggested in an essay entitled “The Crisis of the Tax State” (1919) that a nation’s fiscal system can serve as a useful basis for a history of that country’s rise and fall, since the tax system and its structure reflect the political and ideological ideas of that society through time.

Many of the classical liberals of the nineteenth century believed that what a man had earned through his own peaceful production and market exchange was his “natural right” to keep. Many of the classical economists of that time, from a more utilitarian perspective, reinforced this view by emphasizing that taxing income and capital weakened incentives for work, saving and investment, thus reducing the growth of capital and therefore retarding increases in the standard of living for all in society.

Many of them argued that both justice and compassion, especially a concern for the condition of the poor, required that those who had produced and earned should be allowed to keep the fruits of their own labor, with the expenses of a strictly limited government funded through a variety of low indirect taxes.

The ideal of a low tax system to fund minimal government with the fewest infringements on market incentives was the fiscal regime of an earlier historical era devoted to the principles of individual rights and freedom, and a respect for private property.

The Rise of the Social Engineering Fiscal State

The rise of socialist, interventionist, and welfare-statist ideas in the late nineteenth and twentieth centuries created a totally different fiscal regime. Government was to use its taxing power as an avenging sword to abolish the exploitation of the poor by the rich. It was to rectify and reduce inequalities in income that were considered to be socially unjust.

Government was to use its taxing authority to influence and indirectly determine the amounts, types, and locations of investment and job creation in various parts of the country. It was to use its fiscal tools to make agricultural countries more industrialized and make industrial countries more agrarian. It was to use customs duties to control the flow of imports and investments into the country, and subsidize or restrict the export of certain goods, resources, or services out of the country.

Taxes were made into an instrument of social engineering and a device for special-interest power politics.

In such an era and regime of fiscal collectivism, nothing is more detested by the political authorities than any attempt by the tax-paying public to escape from the clutches of the tax collector and the government’s thirst for the private wealth created and earned by the members of society through their peaceful and voluntary market transactions.

The Tax Vampires and Fiscal Berlin Wall

Even a legal avoidance of taxes such as the corporate inversion process threatens to reduce the fiscal blood supply of the political vampires who live off the productive efforts of others.

How can those in political authority redistribute income; how can they foster various schemes for environmental development; how can they manipulate and control the direction of investment, manufacturing, and employment within their country – what in the 1950s and 1960s the French called “indicative planning” through the tax structure; how can they have the financial resources to bestow privileges on some and impose financial hardships on others in pursuit of re-election through pandering to special interest groups? How can they do any of these things, as well as many others, if the tax base is reduced by corporations moving out of the government’s immediate jurisdiction and therefore out of the direct control of the taxing power of the state?

This new political campaign against corporate inversion, therefore, is really an assault on a remaining freedom through which private citizens attempt to retain more of the wealth and income they have produced and earned in the market. It is a campaign to keep the American people captive behind a fiscal Berlin Wall over which there is to be no escape.

[Originally published at EpicTimes]

Categories: On the Blog

Energy and Earthquakes in Ohio

7 hours 6 min ago

Ohio sits above the Utica and Marcellus shales, two geologic formations that have rich energy potential waiting to be unlocked by the process of hydraulic fracturing, commonly referred to as “fracking.” Increased energy production has the potential to be a powerful economic engine for unemployed Ohioans, but the debate over hydraulic fracturing has served to highlight the natural and political fault lines running through the state.

These fault lines have become most apparent near Youngstown, Ohio, an area that has felt two different sets of minor, fracking-related earthquakes, one in 2011 and one in the spring of 2014. The 2011 earthquakes are thought to have been caused by the disposal of wastewater from the hydraulic fracturing process in deep underground injection wells (the same kind used for carbon capture and sequestration), and the 2014 quakes are thought to have been caused by the hydraulic fracturing process itself.

Despite these quakes, the residents of Youngstown have rejected bans on hydraulic fracturing three times within the last year, by double-digit or near-double-digit margins on each vote, dealing radical environmental groups a hat-trick of defeat in their quest to ban the practice. Youngstown residents have embraced hydraulic fracturing largely because the quakes were mild, Ohio regulators acted quickly and efficiently to take steps limiting future risks, and hydraulic fracturing has sparked an economic recovery in Rust Belt communities throughout the state.

When most people think of earthquakes, they think of Hollywood-style earthquakes twisting bridges and splitting roads in two, but these are not the kind of tremors Youngstown experienced. The largest earthquakes felt in Youngstown in 2011 measured magnitude 2.7 and 4.0, and they resulted in zero injuries and no cases of verified damage. The largest of the 2014 quakes was a magnitude 3.0, which the U.S. Geological Survey categorizes as a minor quake producing vibrations similar to the passing of a truck.

Not only were the quakes minor, but state regulators wasted no time in identifying the problems, shutting down activity directly contributing to the risk, and crafting new rules that have been applauded by the industry and environmental groups, such as the Environmental Defense Fund. Among the new rules are requirements that companies install sensitive seismic monitors before drilling horizontally into rock formations within three miles of a known fault area or an area where seismic activity greater than 2.0 has occurred, and drilling must be suspended pending investigation when monitors detect seismic activity above magnitude 1.

These safeguards and the sterling track record of Ohio regulators in dealing with oil- and gas-related earthquakes have given residents confidence that proper precautions will be taken to make sure hydraulic fracturing is done in an environmentally responsible way, while providing a huge economic stimulus in an area that has been trending downward for decades.

In some ways, the Youngstown area is an unfortunate poster child for the term Rust Belt. The region lost more than 16,000 manufacturing jobs during the Great Recession, and the unemployment rate peaked at 12.9 percent in March of 2009. Now, hydraulic fracturing is starting to shake off some of that rust, as evidenced by the construction of a $1 billion steel plant by Vallourec, a French manufacturer of steel pipes for the oil and gas industry, which employs 350 people.

Additionally, a local pipefitters union, which reported 40 percent unemployment at the height of the recession only 4½ years ago, reached full employment last year, and as the business manager of Local 396 states, “None of this would have been possible without the oil and gas industry.” This dramatic drop in unemployment led Local 396 to rally against the fracking ban in Youngstown, creating a rift between blue-collar workers and greens, two demographics that have traditionally supported Democratic candidates.

As more states seek to increase their economic opportunities by expanding their oil and natural gas industries, Ohio’s restrictions on wastewater injection wells and required mapping of geologic fault lines will likely serve as a blueprint for regulators in other states. But even if other states have fewer worries about natural geologic faults, the political ones will be sure to shake things up.

Isaac Orr (iorr@heartland.org) is a research fellow for energy and environmental policy at The Heartland Institute.

Categories: On the Blog

Who’s Really Waging the ‘War on Science’?

11 hours 48 min ago

Left-leaning environmentalists, media and academics have long railed against the alleged conservative “war on science.” They augment this vitriol with substantial money, books, documentaries and conference sessions devoted to “protecting” global warming alarmists from supposed “harassment” by climate chaos skeptics, whom they accuse of wanting to conduct “fishing expeditions” of alarmist emails and “rifle” their file cabinets in search of juicy material (which might expose collusion or manipulated science).

A primary target of this “unjustified harassment” has been Penn State University professor Dr. Michael Mann, creator of the infamous “hockey stick” temperature graph that purported to show a sudden spike in average planetary temperatures in recent decades, following centuries of supposedly stable climate. But at a recent AGU meeting a number of other “persecuted” scientists were trotted out to tell their story of how they have been “attacked” or had their research, policy demands or integrity questioned.

To fight back against this “harassment,” the American Geophysical Union actually created a “Climate Science Legal Defense Fund,” to pay mounting legal bills that these scientists have incurred. The AGU does not want any “prying eyes” to gain access to their emails or other information.  These scientists and the AGU see themselves as “Freedom Fighters” in this “war on science.” It’s a bizarre war.

While proclaiming victimhood, they detest and vilify any experts who express doubts that we face an imminent climate Armageddon. They refuse to debate any such skeptics, or permit “nonbelievers” to participate in conferences where endless panels insist that every imaginable and imagined ecological problem is due to fossil fuels. They use hysteria and hyperbole to advance claims that slashing fossil fuel use and carbon dioxide emissions will enable us to control Earth’s climate – and that references to computer model predictions and “extreme weather events” justify skyrocketing energy costs, millions of lost jobs, and severe damage to people’s livelihoods, living standards, health and welfare.

Reality is vastly different from what these alarmist, environmentalist, academic, media and political elites attempt to convey.

In 2009, before Mann’s problems began, Greenpeace started attacking scientists it calls “climate deniers,” focusing its venom on seven scientists at four institutions, including the University of Virginia and University of Delaware. This anti-humanity group claimed its effort would “bring greater transparency to the climate science discussion” through “educational and other charitable public interest activities.” (If you believe that, send your bank account number to those Nigerians with millions in unclaimed cash.)

UVA administrators quickly agreed to turn over all archived records belonging to Dr. Patrick Michaels, a prominent climate chaos skeptic who had recently retired from the university. They did not seem to mind. No press coverage ensued, and certainly none that was critical of these Spanish Inquisition tactics.

However, when the American Tradition Institute later filed a similar FOIA request for Dr. Mann’s records, UVA marshaled the troops and launched a media circus, saying conservatives were harassing a leading climate scientist. The AGU, American Meteorological Society and American Association of University Professors (the nation’s college faculty union) rushed forward to lend their support. All the while, in a remarkable display of hypocrisy and double standards, UVA and these organizations continued to insist it was proper and ethical to turn all of Dr. Michaels’ material over to Greenpeace.

Meanwhile, although it had started out similarly, the scenario played out quite differently at the University of Delaware. Greenpeace targeted Dr. David Legates, demanding access to records related to his role as the Delaware State Climatologist. The University not only agreed to this. It went further, and demanded that Legates produce all his records – regardless of whether they pertained to his role as State Climatologist, his position on the university faculty, or his outside speaking and writing activities, even though he had received no state money for any of this work. Everything was fair game.

But when the Competitive Enterprise Institute filed a FOIA request for documents belonging to several U of Delaware faculty members who had contributed to the IPCC, the university told CEI the state’s FOIA Law did not apply. (The hypocrisy and double standards disease is contagious.) Although one faculty contributor clearly had received state money for his climate change work, University Vice-President and General Counsel Lawrence White claimed none of the individuals had received state funds.

When Legates approached White to inquire about the disparate treatment, White said Legates did not understand the law. State law did not require that White produce anything, White insisted, but also did not preclude him from doing so. Under threat of termination for failure to respond to the demands of a senior university official, Legates was required to allow White to inspect his emails and hardcopy files.

Legates subsequently sought outside legal advice. At this, his academic dean told him he had now gone too far. “This puts you at odds with the University,” she told him, “and the College will no longer support anything you do.” This remarkable threat was promptly implemented. Legates was terminated as the State Climatologist, removed from a state weather network he had been instrumental in organizing and operating, and banished from serving on any faculty committees.

Legates appealed to the AAUP – the same union that had staunchly supported Mann at UVA.  Although the local AAUP president had written extensively on the need to protect academic freedom, she told Legates that FOIA issues and actions taken by the University of Delaware’s vice-president and dean “would not fall within the scope of the AAUP.”

What about the precedent of the AAUP and other professional organizations supporting Dr. Mann so quickly and vigorously? Where was the legal defense fund to pay Legates’ legal bills? Fuggedaboutit.

In the end, it was shown that nothing White examined in Legates’ files originated from state funds. The State Climate Office had received no money while Legates was there, and the university funded none of Legates’ climate change research through state funds. This is important because, unlike in Virginia, Delaware’s FOIA law says that regarding university faculty, only state-funded work is subject to FOIA.

That means White used his position to bully and attack Legates for his scientific views – pure and simple.  Moreover, a 1991 federal arbitration case had ruled that the University of Delaware had violated another faculty member’s academic freedom when it examined the content of her research. But now, more than twenty years later, U Del was at it again.

Obviously, academic freedom means nothing when one’s views differ from those of the liberal faculty majority – or when they contrast with views and “science” that garner the university millions of dollars a year from government, foundation, corporate and other sources to advance the alarmist climate change agenda. All these institutions are intolerant of research by scientists like Legates, because they fear losing grant money if they permit contrarian views, discussions, debates or anything that questions the climate chaos “consensus.”  At this point, academic freedom and free speech obviously apply only to advancing selected political agendas, and campus “diversity” exists in everything but opinions.

Climate alarmists have been implicated in the ClimateGate scandal, for conspiring to prevent their adversaries from receiving grants, publishing scientific papers, and advancing their careers. Yet they are staunchly supported by their universities, professional organizations, union – and groups like Greenpeace.

Meanwhile, climate disaster skeptics are vilified and harassed by these same groups, who pretend they are fighting to “let scientists conduct research without the threat of politically motivated attacks.” Far worse, we taxpayers are paying the tab for the junk science – and then getting stuck with regulations, soaring energy bills, lost jobs and reduced living standards … based on that bogus science.

Right now, the climate alarmists appear to be winning their war on honest science. But storm clouds are gathering, and a powerful counteroffensive is heading their way.

Paul Driessen is senior policy analyst for the Committee For A Constructive Tomorrow (www.CFACT.org) and author of Eco-Imperialism: Green power – Black death.

 

Categories: On the Blog

Harry Reid Remains Obstinately, Perversely Insistent on Taxing the Internet

13 hours 3 min ago

As we asked last week – who says bipartisanship is dead?

The allegedly “do-nothing” Republican-led House of Representatives has actually done a lot of good stuff – a lot of it bipartisan in nature.

“We’ve put together a helpful list of bipartisan bills (Senate Majority Leader Harry Reid) can start with that have passed the House with over 250 votes and are stuck sitting idly in the Senate. The do-nothing Senate really must do-better,” (Congressman Eric) Cantor spokeswoman Megan Whittemore said.

Therein lies the problem.  Senator Reid remains an obstinate obstruction to advancing good legislation.  He only moves bills that lessen our freedom – and lighten our wallets.

On November 1 – and remember, Election Day is November 4 – the federal moratorium on Internet-only taxes, in place since 1998, ends.  Meaning you and I will pay the government even more coin to surf the World Wide Web.

When you next fume at your Internet or cell phone bill – check the litany of taxes tacked on.  (And forget not the built-in government costs you pay but never see.)  That way you’ll know at whom to actually be angry.

Governments already bite huge chunks out of our hide.  But they view our money like Jello – there’s always room in their stomachs and wallets for more.

But the House just passed – by mega-bipartisan voice vote acclamation – the Permanent Internet Tax Freedom Act (PITFA).

(T)he law has attracted large bipartisan majorities every time it’s been up for a vote in either house. That’s because the law has allowed the Internet to grow into an engine of interstate and international commerce.

Except Senator Reid won’t allow the bill up for a vote in his Senate.  Unless he can tether it to a whole new Internet tax scheme – the woefully misnamed, incredibly destructive Marketplace Fairness Act (MFA).

A new Senate bill may force lawmakers this week to make a tough choice on internet taxes: they must agree to expand the reach of sales taxes on out-of-state retailers, or else see the end of a law that forbids states and cities from imposing a tax on internet access.

…(T)he dilemma stems from Senate Majority Leader Harry Reid’s handling of a bill called the Internet Tax Freedom Act. That bill, which passed the House by a large margin, would permanently entrench a temporary moratorium on ISP taxes. The measure is politically popular because it means consumers won’t see “service fees,” akin to those that appear on cell phone statements, on their broadband bills.

Instead of putting the same bill to the Senate, however, Reid has decided to attach it to a proposed law called the Marketplace Fairness Act. That bill, which first passed the Senate last year, would require online retailers to collect tax on sales they make to out-of-state consumers.

Get that?  Under the MFA, uber-tax-happy states like California would no longer be confined to taxing into oblivion just Californians.  They’d have access to the wallets of every business – every person – in all fifty states.

Turning Huge Government states into additional Huge Government federals.  And tempting Less-Huge-Government states to grow – with the siren song of new coin taken from people in forty-nine states that can’t vote against them.  That’ll help the economy and spur growth.

These people need to be told: Stop looking for new ways to take our money – instead, stop spending quite so much.

The underlying assumption when government looks for new money – is that every current penny is being spent wisely and well.  And there’s your Joke of the Day.

Senator Reid is constantly decrying the lack of bipartisanship in Washington.  He could get a whole lot more of it if he stopped serially, unilaterally blocking things that garner it – like PITFA.

Or tacking to it things like the MFA – which both sides oppose.

Hold on a sec.  Senator Reid’s actually getting more of the bipartisanship he says he wants – against him and his Huge Tax, Huge Government policies.

That’s one means to an end, I guess.

[Originally published at Human Events]

 

Categories: On the Blog

Heartland Challenges, Dismantles UN’s Climate Change Finding

July 28, 2014, 3:42 PM

Climate change hysteria has become the mantra of the U.S. government since Al Gore’s 2006 Oscar-winning documentary on global warming, An Inconvenient Truth. The latest is that the U.S. Defense Department has embraced Al Gore’s message. According to a Defense Department official, Daniel Chiu, “All Pentagon operations in the U.S. and abroad are threatened by climate change.” Chiu, as Deputy Assistant Secretary of Defense for Strategy and Force Development, gave this additional warning to senators at a hearing on Tuesday, July 22:

The effects of the changing climate affect the full range of Department activities, including plans, operations, training, infrastructure, acquisition, and longer-term investments. By taking a proactive, flexible approach to assessment, analysis, and adaptation, the Department can keep pace with the impacts of changing climate patterns, minimize effects on the Department, and continue to protect our national security interests.

The Defense Department’s proclamation is in keeping with an alarming statement made by President Obama at a recent fundraiser outside Seattle, when he called for a Collectivist “New World Order.”  Obama bemoaned that the old order isn’t working around the world, but “we’re not quite yet to where we need to be in terms of a new order that’s based on a different set of principles, that’s based on a sense of common humanity, that’s based on economics that work for all people.”

As discussed in Thorner’s Illinois Review article of Thursday, July 24, “UN and its Auspices Bear Responsibility for Climate Change,” Agenda 21, a product of the UN’s Rio Earth Summit, is all about the new World Collectivist Order espoused by President Obama, in which environmental goals are integral to its success.

The 97% Consensus Figure

Repeated time and again is the claim that almost all scientists agree climate change is real, man-made, and dangerous, even though there are skeptics worldwide among scientists studying weather and climate who question pronouncements of certitude that runaway global warming will occur due to man’s emissions of CO2.

This past May, Secretary of State John Kerry warned graduating students at Boston College of the “crippling consequences” of climate change. “Ninety-seven percent of the world’s scientists,” he added, “tell us this is urgent.”  John Kerry’s comments mirror those publicized by UN IPCC reports, which have been unequivocally accepted by the Obama administration.  It goes without saying that all IPCC scientists are self-professed members of the fictional 97% consensus of world scientists who believe in earth-changing, man-made global warming.

Ignored by the media is a petition circulated for signatures by a group of physicists and physical chemists based in La Jolla, Calif.  Known as the Petition Project, it attracted more than 31,000 signatures (more than 9,000 from scientists with a Ph.D.). The petition states that “there is no convincing scientific evidence that human release of . . . carbon dioxide, methane, or other greenhouse gases is causing or will, in the foreseeable future, cause catastrophic heating of the Earth’s atmosphere and disruption of the Earth’s climate.”

Heartland Institute fills a need and reaches out to present “hard” science

The Heartland Institute initially started looking at the issue of global warming in 2006, realizing that the debate, as presented by the UN and its auspices, was all one-sided.  According to Heartland’s president, Joe Bast, it was “time to bring together global warming skeptics to develop personal relationships and a social movement” to counter the published reports released by the UN Intergovernmental Panel on Climate Change (IPCC), with its “self-interest to exaggerate the threat, to ignore any doubts, and to pursue one avenue, which is reducing emissions.”

The 1st International Conference on Climate Change was held in New York City in 2008.  Between 2008 and 2014, nine international conferences were held.  The just-completed 9th International Conference on Climate Change, held in Las Vegas July 7-9 and attended by Thorner, featured 64 speakers from a multitude of disciplines and rightly qualifies as the most star-studded climate conference yet.  Visit this site to watch all of the #ICCC9 conference videos.

NIPCC as a Counter to IPCC

In addition to Heartland’s sponsorship of nine international climate change conferences to inform the public that there is another side to the global warming debate and that the science is not settled, The Heartland Institute’s association with the Nongovernmental International Panel on Climate Change (NIPCC) is noteworthy and is recognized worldwide, sometimes with scorn, for findings that reducing CO2 emissions, especially in the United States alone, will not affect the global temperature. In 2009 the Center for the Study of Carbon Dioxide and Global Change, the Science and Environmental Policy Project (SEPP), and The Heartland Institute combined forces to produce Climate Change Reconsidered, which was the first comprehensive alternative to the alarmist reports of the United Nations Intergovernmental Panel on Climate Change (IPCC).

The newest volume in the Climate Change Reconsidered series was released on April 9: Climate Change Reconsidered II: Biological Impacts. The second volume was released in June. These two volumes are the fifth and sixth in a series of scholarly reports produced by NIPCC.  Previous volumes in the Climate Change Reconsidered series were published in 2008, 2009, 2011, and 2013.  Reports are available for free online on this site.

Whereas the reports of the United Nations’ Intergovernmental Panel on Climate Change (IPCC) warn of a dangerous human effect on climate, NIPCC concludes the human effect is likely to be small relative to natural variability.  Whatever small warming is likely to occur will produce benefits as well as costs.

Lord Christopher Monckton, a favorite of Conference attendees, issues a warning

It was during the 12th panel discussion at the recent Las Vegas 9th International Conference on Climate Change on Tuesday, July 8, titled “International Perspectives on Climate Change,” that panelist Lord Christopher Monckton, a former policy advisor to Margaret Thatcher and one of the most visible and outspoken climate change skeptics, gave pertinent details about past UN conferences, along with a grave warning about the Paris Conference (COP21) that will take place in late 2015.  If the Paris Conference succeeds in its goal, as Lord Monckton feels it might, then for the first time in over 20 years of UN negotiations all the nations of the world, including the biggest emitters of greenhouse gases, will be bound by a universal agreement on climate change.  Monckton is calling for the insertion of a “get out” clause, or freedom clause, allowing countries to withdraw from the treaty if they change their minds about the dogma of Global Warming as a threat to mankind.  In reality, there hasn’t been any global warming for 17 years and 10 months!

Normal or hard science, contrasted with Post-normal science

Just what is the basis of real or hard science, and how does it differ from post-normal science?  The concept of post-normal science was introduced by Funtowicz and Ravetz during the 1990s. Climate change science as promoted by UN IPCC reports falls into the post-normal category, in that the process of science is linked to who gets funded, who evaluates quality, and who has the ear of policy makers.  Predictions are made on the basis of a theory or hypothesis. This contrasts with the use of real-world observations to test predictions, as in the NIPCC reports published by The Heartland Institute.

In Climate Change and the Death of Science, Christian British blogger Kevin McGrane nails practitioners of post-normal science.  In their own words Kevin McGrane demonstrates that even they know and admit they’re no longer doing science but politics.  McGrane’s article strikes at the very root of many environmentalists’ routine practices.

A good explanation of normal vs. post-normal science was contained in a brochure published and distributed by The Cornwall Alliance at the Las Vegas Conference.  The Cornwall Alliance is a coalition of clergy, theologians, religious leaders, scientists, academics, and policy experts committed to bringing a balanced Biblical view of stewardship to the critical issues of environment and development. Not only was the Cornwall Alliance a cosponsor of the Las Vegas Conference, but its president, founder and national spokesman, E. Calvin Beisner, Ph.D., participated in Panel 21 (Global Warming as a Social Movement).  Dr. Beisner was likewise honored as the winner of Heartland’s Outstanding Spokesperson in Faith, Science, and Stewardship Award.

Normal or hard science (the NIPCC reports) consists of a rigorous process in which scientists formulate hypotheses to explain how some natural process works; test those hypotheses by careful observation of the real world and of laboratory experiments, diligently disciplining themselves to look as carefully for results that might falsify their hypotheses as for results that are consistent with them; and freely share the raw data and the computer codes by which they interpret those data with other scientists so that they can try to replicate their observations and experiments, or find flaws in them.  Post-normal science (the UN IPCC reports) is a process in which someone formulates a hypothesis, pays attention only to experimental and real-world observations that seem to confirm it but ignores and even suppresses contrary observations, refuses to share his data and methods with other scientists, and seeks to intimidate scientists who disagree with him or to prevent publication of their research.

What happens if Post-normal science wins?

This is what we are up against.  To put it bluntly, practitioners of post-normal science have stabbed real science in the back in their hurry to declare that an “overwhelming scientific consensus” exists on manmade global warming.

Nigel Lawson asks in “A Wicked Orthodoxy,” published May 5, 2014:

How is it that much of the Western world has succumbed to the self-harming collective madness that is climate orthodoxy?  It is difficult to escape the conclusion that climate-change orthodoxy has in effect become a substitute religion. . .

Throughout the Western world, the two creeds that used to vie for popular support, Christianity and the atheistic belief system of Communism, are each clearly in decline.  Yet people still feel the need both for the comfort and for the transcendent values that religion can provide.  It is the quasi-religion of global salvationism that has filled the vacuum, of which the climate-change dogma is the prime example.

Although the Western world will suffer economically, with countless hardships befalling its inhabitants, it is among the masses in the developing world that the greatest immorality is even now taking place. How can you ask the millions of people in third world countries who live in dire poverty to abandon the cheapest available sources of energy while suffering malnutrition, disease, and premature death? Not only is the global-warming orthodoxy irrational. It is also wicked.

In Conclusion

A new Pew “Global Attitudes Project” poll of July 22 offers details on the way citizens of the world think about climate change.  Unfortunately, the poll has been interpreted in a way that brands the American people as ignorant of the risks of global warming.  Why?  Because only one in four Americans indicated that climate change was a “major threat,” making the U.S. the least concerned nation.  Maybe Americans as a whole are smarter than we sometimes give them credit for being.

If real scientists don’t rise up and point out that the emperor of “post-normal science” has no clothes, the whole scientific enterprise will die, and the world and its people will suffer.  The Heartland Institute must be applauded and supported for its fearless stance as a global warming skeptic, noted by The Economist; for taking action through its presentation of international conferences; and for its fact-based NIPCC reports, which challenge the UN IPCC reports that advance unproven hypotheses.

Articles by Nancy Thorner based on Heartland’s 9th International Conference on Global Warming:

Article 1:  http://illinoisreview.typepad.com/illinoisreview/2014/07/ready-thorner-a-climate-change-holocaust.html#more

Article 2:  http://blog.heartland.org/2014/07/heartlands-science-director-breaks-ground-to-rein-in-us-epa/

Article 3:  http://illinoisreview.typepad.com/illinoisreview/2014/07/scientist-honored-at-heartland-conference-seeks-us-congressional-seat.html

Article 4:  http://illinoisreview.typepad.com/illinoisreview/2014/07/ready-thorner-un-and-its-auspices-bear-responsibility-for-global-warming-hysteria.html#more

[Originally published at Illinois Review]

 

Categories: On the Blog

UN and its Auspices Bear Responsibility for Global Warming Hysteria

July 28, 2014, 3:32 PM

President Obama has made 2014 his “year of action” and plans to use his executive authority to implement various actions of his agenda that are too divisive for Congress to consider. John Podesta, as White House adviser, was brought on board late last year to help Obama find ways to use executive orders to unilaterally push climate policies.

The EPA has already released emissions limits for existing coal-fired plants.  Early last month the EPA rolled out new proposed rules that would require power plants to slash carbon emissions by 30 percent over the next 15 years as part of the Obama administration’s efforts to curb air pollution and fight climate change.

Recently (July 23) a coalition of top business groups expressed rising concerns over the Environmental Protection Agency’s plans to cut carbon emissions from existing power plants and demanded more time to respond.  The same business group coalition is also eyeing a legal battle against the Obama administration if called for.  According to the EIA (Energy Information Administration), if power companies are further mandated to comply with the EPA’s Mercury and Air Toxics Standards (MATS), which limit mercury emissions and other pollutants, it is estimated that by 2040 this nation will have lost 15% of its coal-fired capacity.

Before drastic action is taken to curb CO2 emissions, which would result in higher energy prices, the loss of jobs, certain electricity blackouts, and an overall drag on this nation’s economy and productivity, shouldn’t both sides of the global warming argument be heard?  Given a fair and balanced approach, those Americans who accept Global Warming as settled science might not be so willing to go along with alarmists who are prepared to ruin the economy, sacrifice jobs and our standard of living, all for the sake of a crusade being promoted and conducted by politicians and world leaders seeking to tell everyone else how to live.

Undoubtedly, through his 2006 Academy Award-winning documentary film, An Inconvenient Truth, Al Gore has done much to promote alarm and concern that catastrophic Global Warming is taking place.

UN as a Promoter of One World Government through social engineering

Understanding how the issue of Climate Change originated, and why green energy vs. carbon-produced energy sources is now being pushed by nations all over the world (including the U.S.), requires some historical knowledge.  Social engineering has been the orchestrated role of the progressive-oriented United Nations since its founding in 1945, when 50 nations and several non-governmental organizations signed the U.N. Charter.  Today almost every fully recognized independent state is a member state of the U.N.  If accepted for U.N. membership, member states must accept all obligations outlined in the Charter and be willing to carry out any action to satisfy those obligations.

An attempt at U.N. social engineering took place this week on Tuesday, July 22nd, when the U.S. Senate’s Foreign Relations Committee began discussion of the United Nations Convention on the Rights of Persons with Disabilities (CRPD).  Should the Senate approve the UN CRPD treaty, it could threaten U.S. sovereignty and parental rights, putting this nation under international law when it comes to parenting our special needs children by giving the U.N. discretion over healthcare and education decisions for special needs kids.  Our nation already has laws to protect Americans with disabilities!

UN’s Rio Earth Summit:  a blueprint for sustainable development worldwide, with emphasis on the environment

Operating within the U.N. is the United Nations Environment Programme (UNEP) established in 1972, with its mandate “to promote the wise use and sustainable development of the global environment.”  This agency has become the leading global environmental authority that sets the global environmental agenda, that promotes the coherent implementation of the environmental dimensions of sustainable development within the United Nations system, and that serves as the authoritative advocate for the global environment.

Twenty years after the establishment of the UNEP, the UN Climate Change crusade began in earnest.  Initiated at the UN Conference on Environment and Development (also known as the Rio “Earth Summit”), held from June 3-14, 1992, the Conference themes were those of a green economy in the context of an institutional framework for “sustainable development” to eradicate poverty.  The two-week 1992 UN Earth Summit produced Agenda 21, adopted as the climax to a process that had begun in 1989 through negotiations among all U.N. Member States.  Its intent was to serve as a wide-ranging blueprint for action to achieve sustainable development worldwide.  Alongside Agenda 21, the Summit produced the Rio Declaration on Environment and Development, the Statement of Forest Principles, the United Nations Framework Convention on Climate Change, and the United Nations Convention on Biological Diversity.

172 governments participated in the 1992 Rio Earth Summit, 108 at the level of heads of State or Government.  George H. W. Bush represented the U.S.  The Rio “Earth Summit” set the agenda for further UN conferences, at which the emphasis continued on the need for “environmentally sustainable development” — that which meets the needs of the present without compromising the ability of future generations to meet their own needs. Subsequent U.N. Conferences included those held in Copenhagen (2009), Cancun (2010), and Durban (2011).

Sustainable government in the here and now

An example of sustainable development presently being enacted throughout the world under the guise of saving the planet from global warming was brought home in a recent article titled “Agenda 21: Home Sweet Home in Freight Shipping Containers,” written by Ileana Johnson, senior columnist for Canada Free Press and best-selling author of UN Agenda 21: Environmental Piracy.  Ileana Johnson relates how damaged shipping containers are now being turned into housing units in this nation and throughout the world.

Writes Ileana Johnson:  These tiny spaces are expensive but they give the occupants a false sense of saving money and the planet by not using a car, walking or biking everywhere, just like the zoning environmentalists have been pushing for a while now, high density, and high rise living, five minutes from work, school, shopping, and play while the metro is nearby. Absolute heaven if you want to live like a rat in an 8-by-40-foot box! Who would not enjoy living in “lovingly repurposed steel husks” that have been previously sloshing across oceans.

So it is that the progressive UN-inspired social engineering projects of Sustainable Urbanism, Sustainable Development, and Equitable Communities are now being implemented around the world.  Having been adopted at the UN’s Rio Earth Summit, the UN’s social engineering projects are not just aimed at destroying national sovereignty, language, and cultural identity.  Social engineering, as imposed on entire neighborhoods, is resulting in a massive replacement of rural areas and suburban sprawl with high-density, high-rise urban dwellings, all in the name of green environmentalism as a way of saving the planet from the destruction of manufactured man-made global warming/climate change.

UN Intergovernmental Panel on Climate Change (IPCC)

In tandem with the UN Conferences, which have colored the thinking of world leaders since 1992 and have led them to become advocates of Global Warming, is the UN’s Intergovernmental Panel on Climate Change, a scientific intergovernmental body under the auspices of the United Nations, set up at the request of member governments.  So far there have been five reports.  All of the IPCC reports assess scientific information relevant to:

1.  Human-induced climate change.

2.  The impacts of human-induced climate change.

3.  Options for adaptation and mitigation.

The IPCC’s Fifth Assessment Report (WGII AR5) was the product of this year’s March 25-29 meeting in Yokohama, Japan. As with the other four assessment reports, the consequences of Global Warming were many and required the issuance of a thirty-two-page report for policymakers!  The AR5 report reads like a bad novel, with consequence after consequence stated unless human-induced climate change is addressed without delay.

Evaluating IPCC scientists

John Christy, Professor of Atmospheric Science at the University of Alabama, describes the IPCC as a framework around which hundreds of scientists and other participants are organized to mine the panoply of climate change literature to produce a synthesis of the most important and relevant findings.  These findings are published every few years to help policymakers keep tabs on where the participants chosen for the IPCC believe the Earth’s climate has been, where it is going, and what might be done to adapt to, or even adjust, the predicted outcome.

Although Christy refers to most IPCC participants as scientists who bring an aura of objectivity to the task, he does note two drawbacks which limit the objectivity of IPCC scientists: 

1. IPCC is a political process to the extent that governments are involved.  Lead Authors are nominated by their own governments.

2. Scientists are mere mortals looking at a system so complex that it’s impossible to predict its future state even five days ahead.  It doesn’t help that it’s tempting among scientists as a group to succumb to group-think and the herd instinct (now formally called the “informational cascade”).  Scientists like to be the “ones who know” and not be thought of as the “ones who do not know.”

As far as process is concerned, IPCC scientists trust computer simulations more than actual facts and actual measurements.  Many times there are no exact values for the coefficients in computer models; there are only ranges of potential values.  By moving a bunch of these parameters to one side or the other, a scientist or computer modeler can usually get very different results — ones that are favorable to the individual or institution doing the study, which, in turn, ensures a continuance of government funding.
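As a toy illustration of that claim (this is not an actual climate model; the “model” below is a made-up one-line formula, and the parameter range is only meant to stand in for the kind of uncertainty being described), picking different values from within a coefficient’s plausible range can swing the headline result dramatically:

```python
# Toy example only: a made-up one-parameter "projection" showing how the
# choice of a coefficient within its plausible range drives the result.

def toy_projection(coefficient, forcing=1.0):
    """Hypothetical stand-in for a model run: output scales with the coefficient."""
    return coefficient * forcing

# Suppose the coefficient is only known to lie somewhere within a range.
plausible_range = [1.5, 3.0, 4.5]   # low, central, and high values

for c in plausible_range:
    print(f"coefficient = {c:.1f} -> projected outcome = {toy_projection(c):.1f}")

# The same "model" yields outcomes spanning a factor of three; which value
# the modeler settles on within the range largely determines the headline.
```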

Patrick Moore, Ph.D., once a Greenpeace Insider, lashes out at UN’s IPCC. 


Patrick Moore, Ph.D., at the 9th International Conference on Global Warming

Moore co-founded the environmental activist group Greenpeace as a Ph.D. student in ecology in 1971, but left Greenpeace in 1986 after the group became more interested in “politics” than science.  Patrick Moore has angered environmentalist groups by saying climate change is “not caused by humans” and there is “no scientific proof” to back global warming alarmism.

On February 28, 2014, Moore told a US Senate Committee: “There is no scientific proof that human emissions of carbon dioxide are the dominant cause of the minor warming of the Earth’s atmosphere over the past 100 years,” adding, “If there were such a proof, it would be written down for all to see.  No actual proof, as it is understood in science, exists.”

Patrick Moore is critical of the UN’s Intergovernmental Panel on Climate Change (IPCC) for claiming “it is extremely likely” that human activity is the “dominant cause” of global warming, noting that “extremely likely” is not a scientific term.

Confessions of a Greenpeace Dropout: The Making of a Sensible Environmentalist is Moore’s firsthand account of his many years as an ultimate Greenpeace insider.

Dr. Patrick Moore was the winner of The Speaks Truth to Power Award in Las Vegas at the 9th International Conference on Climate Change.


[Originally published at Illinois Review]

Categories: On the Blog

2007–a great year for growing bad legislation like the ethanol mandate

July 28, 2014, 3:16 PM

President Obama and his administration have enacted so many foolish and cost-increasing energy policies, it is easy to think that they are his purview alone. But in 2007, Republicans were just as guilty. Seeds were planted and a garden of bad legislation took root in a totally different energy environment. At the time, the growth seemed like something worthy of cultivation. However, what sprouted up more closely resembles a weed that needs to be yanked out.

Last week, I wrote about Australia’s carbon tax, which was pulled on July 17. Its seeds were also planted in 2007, though it did not germinate until 2011. Prime Minister Abbott promised to eradicate the unpopular plant—and after nearly a year of struggle, he did.

2007 was also the year of the Renewable Portfolio Standard (RPS). Around that time, more than half the states put in place mandates requiring that increasing amounts of wind and solar power be incorporated into the energy mix the local utilities provided for their customers. It was expected that the RPS would become a much-admired garden, with wind turbines blowing in the breeze and solar panels turning toward the sun like sunflowers.

Instead, the RPS has been an expensive folly. Electricity prices have gone up, angering ratepayers. Groups such as the American Bird Conservancy have filed suit against the U.S. Fish and Wildlife Service because it allows bald and golden eagles to be chopped up by wind turbines without punishment to the operators. Industrial solar installations are in trouble because of their massive land use and because they literally fry birds that fly through the reflected sunlight. The mandates have created false markets and bred crony corruption that has the beneficiaries squawking when legislatures threaten to pull plans that have grown like kudzu. Yet many states have now introduced legislation to trim, or uproot, the plans that sounded so good back in 2007. Though none has actually been yanked out, Ohio just put a pause on its RPS.

The RPS was state legislation; the RFS, federal.

Enacted in 2005 and strengthened in 2007, the Renewable Fuel Standard (RFS)—also known as the ethanol mandate—had true bipartisan support (something that is difficult to imagine in today’s political climate). Both Republicans and Democrats lauded the RFS as America’s solution to U.S. dependence on foreign oil. In signing the Energy Independence and Security Act that contained the RFS, President George W. Bush promised it would end our addiction to oil by growing our gas. Although it was passed by Congress with the best of intentions, it, too, has become a costly, wasteful, and politically charged fiasco. It has created an artificial market for corn-based ethanol and driven up both fuel and food prices, while threatening to damage millions of families’ most prized and essential possessions: their cars and trucks.

Times have changed. People are no longer lining up to view the garden of renewables as they do to stroll through the spectacular floral displays at Las Vegas’ Bellagio—where teams of specialized staff maintain the stylized gardens. At the Bellagio, you can gaze gratis. America’s renewable garden is costly at a time when our citizens are forced to cut back on everything else.

Compared to 2007, several things are different today. The big one is the economy. We, as a country, were still living large in 2007. We were also still dependent on oil from overseas, and our purchases were funding terrorism. Plus, it was then generally believed by many that our globe was warming—and that it was our fault because of burning fossil fuels. When presented with the idea of growing our gasoline, even though it might cost more, it seemed worth it—after all, what was a few cents a gallon to thumb our nose at the Middle East and save the planet?

But this is a different day. A few cents a gallon matters now. Thanks to the combined technologies of horizontal drilling and hydraulic fracturing, America is rich with oil-and-gas resources—and we could be truly energy secure if there were greater access to federal lands. Since 2007, the U.S. has trimmed its CO2 emissions—while they’ve grown globally. The predicted warming (and accompanying catastrophes) hasn’t happened. Instead, it appears that the increased CO2 has generated record harvests—despite predictions to the contrary.

But the seeds planted in 2007 have grown false markets that need the mandates—both for electricity generation and transportation fuels—to stake them up, as they can’t survive on their own. Talk of yanking the mandates is likened to cutting down the once-a-year blossom of the Queen of the Night. “How could you?”  “You’ll kill jobs!”  Elected officials such as Congressman Steve King (R-IA), who are normally fiscally conservative, vote to continue the boondoggles that benefit their states.

When the Energy Independence and Security Act was passed in 2007, it was assumed that gasoline demand would continue to rise indefinitely, so ever-larger volumes of ethanol could be blended into gasoline every year to create E10, a motor fuel consisting of 90 percent gasoline and 10 percent ethanol. Rather than requiring a percentage of ethanol, the law mandated a growing number of gallons of ethanol be used.

Instead, due to increased vehicle efficiencies and a bad economy, gasoline demand peaked in 2007 and began to decline, reducing the amount of gasoline consumed in the U.S. Still, the law requires refiners to blend ever-increasing volumes of ethanol into gasoline every year until 36 billion gallons of ethanol are blended into the nation’s fuel supplies by 2022.
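A back-of-the-envelope Python sketch makes the arithmetic plain. All of the gallon figures below are hypothetical, chosen only to illustrate the mechanism rather than to reproduce actual RFS or consumption numbers: when the mandate is fixed in gallons and gasoline demand falls, the implied ethanol share of the blend creeps above the 10 percent that E10 assumes.

    # Hypothetical figures for illustration only (not actual RFS or EIA data).
    mandated_ethanol_gal = [13.0e9, 14.4e9, 15.0e9]   # assumed rising mandate, gallons per year
    gasoline_demand_gal = [140e9, 135e9, 130e9]       # assumed falling gasoline demand, gallons per year

    for ethanol, gasoline in zip(mandated_ethanol_gal, gasoline_demand_gal):
        share = ethanol / (ethanol + gasoline) * 100
        print(f"implied ethanol share of the blend: {share:.1f}%")

Once the share the law forces exceeds roughly 10 percent, refiners hit the so-called blend wall, because most existing engines and pumps are designed around E10.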

It is the mandate that allowed the ethanol tax credit (a.k.a. subsidy) to expire at the beginning of 2012. The growing mandates gave corn farmers plenty of incentive.

In the modern era, with ethanol no longer needed due to America’s increasing oil production and the mandates’ unreasonable requirements, an unusual collection of opponents has risen up against ethanol: environmentalists and big oil, auto manufacturers and anti-hunger groups.

Much to everyone’s surprise, last November the EPA came out with a proposal to use its authority to make a practical decision and keep the mandate from increasing, which resulted in a cut in the amount of biofuels that refiners would need to mix into their fuels—a decision that was required to be made by the end of November 2013. To date, in the seventh month of 2014, the EPA still has not released the 2014 mandates. Refiners are still waiting.

On Thursday, July 24, White House Advisor John Podesta met with select Democrat Senators including Heidi Heitkamp (D-ND) and Al Franken (D-MN) to discuss the EPA’s November 2013 proposal to lower ethanol targets—which, according to reports, Franken called: “unacceptable.” The Hill quotes Franken as saying: “White House adviser John Podesta has indicated the administration plans to raise the amount of ethanol and other biofuels that must be blended into the nation’s fuel supply.” And, in another report, The Hill says: “That may mean Podesta’s signal—that the levels of ethanol, biodiesel and other biofuels will be increased in the EPA’s final rule—is as good as gold.” A decision from the EPA is expected to “be imminent.”

All of this comes amid new reports that ethanol has little if any effect on reducing the greenhouse gas emissions blamed for climate change. A Congressional Budget Office report, released on June 26, states: “available evidence suggests that replacing gasoline with corn ethanol has only limited potential for reducing emissions (and some studies indicate that it could increase emissions).”

It may have been Bush who planted the ethanol mandate, but it is the Obama administration that is fertilizing it and keeping it alive, when it should be yanked out by its roots.

The author of Energy Freedom, Marita Noon serves as the executive director for Energy Makes America Great Inc. and the companion educational organization, the Citizens’ Alliance for Responsible Energy (CARE). Together they work to educate the public and influence policy makers regarding energy, its role in freedom, and the American way of life. Combining energy, news, politics, and the environment through public events, speaking engagements, and media, the organizations’ combined efforts serve as America’s voice for energy.

[Originally published at RedState]

 

Categories: On the Blog

Area Lobbyist: Why Do Republicans Want To Kill My Job?

July 28, 2014, 9:33 AM

John Feehery’s piece here on the dangers of rising Republican skepticism for big business is an amusing read, not just because I’m pretty sure nearly every sentence of it can be debunked in whole or in part. The tone is one of desperate confusion: when did the Republican Party stop being knee-jerk pro-business in the subsidies and carveouts and bailouts sense? Why do they want to kill the jobs of hardworking K Street influence peddlers?

Feehery writes that “Amid the fight for the soul of the Republican Party, some elements of the GOP coalition have become overtly hostile to Big Business.” I think that’s just false: no one is “overtly hostile” to big business, nor are they hostile because it’s big. Rather, they’re hostile to big business that uses government policy to warp the marketplace. No Republican is saying that Boeing needs to be broken up, just that it shouldn’t have special privileges.

“Defeating crony capitalism has become the battle cry of libertarian conservatives.” Wait, does he mean that “real” Republicans should actually support crony capitalism? If Feehery equates “hostility to big business” with “defeating crony capitalism,” that suggests some big business profits do in fact come from cronyism, which you would think all Americans would oppose. (Except lobbyists.)

“Big Business wants immigration reform and higher academic standards for elementary and secondary schools–policy priorities that drive the hard right into conniptions.” Well, except everybody is for immigration reform – the system is a catastrophe – and higher academic standards – they are too lax. The debate is over what kind. But to suggest that the “hard right” opposes those things because they oppose the Gang of Eight and Common Core is intellectually dishonest, almost as insulting as claiming being in favor of home schooling means you want to break down community.

“What would happen if Big Business decided to change sides? What would happen if the Chamber of Commerce suddenly stopped being a huge fundraising machine for the Republican Party and started financing pro-business Democrats? That is the dream of Sen. Chuck Schumer (D., N.Y.)–and the nightmare scenario for House Speaker John Boehner (R., Ohio).” So because big business gives Republicans a lot of money, they should do special favors for them? Of course Chuck Schumer very much wants to be the party of cronyism – but again, if the price of corporate campaign dollars is corrupt manipulation of the economy to benefit large politically connected corporations at the expense of working families, that’s a pretty high price.

“Could Democrats make a coalition of big business and labor work?” Well, to a certain extent, they already do! That’s the whole reason this argument has juice. The highway bill Feehery mentioned is essentially a money laundering scheme from taxpayers to labor bosses. And business already gives to Democrats to the tune of hundreds of millions of dollars.

Whatever grassroots liberals think about inequality, Washington Democrats know that their rhetorical attacks on the 1 percent are simply cover for doing the bidding of big business on all sorts of issues. Chris Dodd understands the danger of this hypocrisy, and so do a lot of smart Democrats.

“How could the Republicans survive as a purely populist/libertarian political party?” Wait, so only populists and libertarians oppose cronyism? I mean, I understand that’s why the U.S. Chamber is opposing Justin Amash. But is Paul Ryan “purely populist/libertarian”? He just gave a speech on cronyism and called for getting rid of the Ex-Im Bank, which is of course currently under investigation. Are Marco Rubio, Jeb Hensarling, Darrell Issa, and Jim Jordan all crazy populist libertarians?

The truth is that anti-cronyists on the right aren’t anti-business, nor will their policy approaches lead to business suddenly shifting to a monopartisan Democratic bent. They simply want businesses to earn their profits in a competitive marketplace, and they want Washington to stop sending taxpayer dollars to insulate business from risk. Suggesting that being anti-cronyism means you are opposed to business is absurd. It just means you’re opposed to Feehery’s business, which consists of profiting handsomely off an alliance between Republicans, big business, and big government policy to dole out pork.

And that’s Feehery’s real concern: that his clients won’t be around to hire him.

Subscribe to Ben’s daily newsletter, The Transom.

[Originally published at The Federalist]

 

Categories: On the Blog

Review of Glenn Beck’s “We Will Not Conform”

July 28, 2014, 9:22 AM

Common Core is a unique issue in American politics because it has the ability to unite a variety of people from different ends of the political spectrum who agree on little else other than that Common Core must be stopped. So many people have joined forces on this issue because they have found common bonds as parents and Americans concerned for their children and the future of America.

On July 22, 2014, Glenn Beck hosted a live, interactive event at 700 theaters across the country in order to formulate a plan of action for Americans to fight Common Core. During the show, the audience had the ability to email or tweet at Beck as well as participate in surveys to better gauge the most successful types of efforts.

“This is something we all can unite on,” Beck said. “And we don’t really have a choice.” Parents across the country, regardless of political background, have witnessed the effects that the age-inappropriate and needlessly complex Common Core standards have on their children. A group of parents and students gathered in the Blaze’s New York studio explained how Common Core causes needless frustration among students and takes away their desire to learn.

The standards espouse a “one-size-fits-all” style that disregards the fact that individual students learn differently. Brandon Gibson, a New York student, explained, “Common Core makes you do it their way.” This approach has negative effects on many students who do not learn the Common Core way. Alphonsine Eglberth, a mother of a third grade student, told the audience that her son had to go to therapy when he was seven because of the frustration and anger he experienced as a result of Common Core.

The event featured an array of people, from activists to parents to politicians, willing to equip ordinary Americans with the necessary resources to stop Common Core in their own states and localities. The group was divided into five tables (research and resources, grassroots, alternatives, politics, and messaging), each geared toward providing different strategies to fight Common Core.

Matt Kibbe, president of FreedomWorks, asserted that the biggest issue with repealing Common Core is that the federal government gave the states money to implement it. If states want to continue to receive these federal funds for education, they have to conform to the federal standards. [FreedomWorks provides an overview of the issues with the Common Core standards here]

Jenni White, founder of Restore Oklahoma Public Education, agreed, citing her own experience in trying to repeal Common Core in her state. In attempting to obtain answers, White found the biggest obstacle to be the state chamber of commerce. The reason for this difficulty is that the Gates Foundation has given large sums of money to the U.S. Chamber of Commerce to promote Common Core, and this influence has extended to the states as well.

Kathleen Jasper, founder of ConversationEd and a former high school administrator, partnered with Glenn Beck in the effort to halt Common Core even though she holds very different political views. She echoed Kibbe’s and White’s point that money is one of the main issues. Jasper argued that parents must work against the machine. She explained that certain corporations are profiting off of the tests and textbooks for Common Core. The tests are designed for children to fail so that the companies make more money off of students repeatedly taking the tests and purchasing the materials for preparation. Jasper claimed that the best way to stop the machine is to boycott high-stakes testing, which stops the “fuel.” [ConversationEd will host a webinar on August 24th about these tests]

Under Common Core, students legally have to take only a third-grade and a high school assessment, but schools offer many more tests, more often. The research and resources table emphasized the importance of parents knowing their rights and holding the school boards and administrations accountable for the decisions they make about testing the students. Parents can opt out of these tests for their children, even if schools make it difficult.

In gathering information, it is important to go back to the original source to verify facts. Shane Vander Hart, president of Truth in American Education, expounded on this concept, saying the impetus was Race to the Top: states wanted federal funding, so they agreed to implement Race to the Top, which was $4.3 billion in earmarks. To understand Common Core, citizens should look at Race to the Top contracts as well as the National Governors’ Association, which asked the federal government to fund Common Core. Many organizations and think tanks have public records available to view. Additionally, citizens can contact their local officials to request public records.

“Nothing is more disruptive than an informed citizen,” asserted author and syndicated columnist Michelle Malkin. Parents must know their rights and exercise them to stop this unconstitutional takeover of education. As Emmett McGroarty, director of education at the American Principles Project, stated “Common Core ushers in a highly defective curriculum…[and] undermines the Constitution.” In this way, Common Core provides a “blueprint” for foisting other policies on the people without their approval.

In addition to being informed, action is critical. A live poll concluded that Twitter was the most effective means to spread the word about Common Core. The grassroots table explained that face-to-face (or mom-to-mom) contact is also essential. With this effort, parents can find common ground to discuss the issues at hand. The table also suggested using pictures and examples (such as Common Core worksheets) whenever possible in order to make Common Core something real and personal, so parents don’t view it as something abstract but rather as something that is in their own homes.

Along with spreading the message to other parents, legislators must be made aware of the dissatisfaction with Common Core to actually bring about policy change. Kibbe stated that it is key to “get parents to understand what a difference they can make…Politicians respond to incentives. Parents represent a voting bloc that is unstoppable.”

The politics table suggested a three-step plan to influence legislators and change policy. First, citizens must know what they are talking about; parents must know their constitutional rights and their rights as parents. Second, people must get organized; this is something visible to legislators. Finally, it is crucial to show up. Even after a repeal, it is still necessary to show up, because proponents of Common Core will try to usher it back in under a new name.

It is also important to note that there are other options besides public schools using Common Core. The program discussed homeschooling, online education or “distance learning,” and charter schools, specifically those in the classical model. Dr. Terrence Moore, founder of Atlanta Classical Academy, asserted that Common Core is attempting to knock out school choice even as the evidence reveals the success of classical charter schools. He also argued that it is essential to take back the public schools because they are taxpayer funded and they control the future of America.

In discussing the most successful method of communicating about Common Core to bring about change, the messaging table highlighted the importance of finding common ground with others. This common ground could be the well-being of children or local control.

The complete plan of action, with viewer input, is available at commoncorefails.com. Now is the time to act before Common Core is firmly implanted in schools and produces catastrophic effects. As Heidi Huber, founder of Ohioans Against Common Core, stated, “You can restore your country if you take back your classroom.”

 Image originally published at http://www.theblaze.com/wp-content/uploads/2014/05/Beck-We-Will-Not-Conform.png

Categories: On the Blog

To Reward or Not to Reward: Motivating Students to Learn

July 28, 2014, 7:54 AM

[NOTE: The following is excerpted from Chapter 1 of the next Heartland Institute book, titled Rewards: How to use rewards to help children learn — and why teachers don’t use them well. The title of the chapter is “The Psychology of Motivation.” This piece was first published at The American Thinker.]

The late Jere Brophy, a longtime Michigan State University professor of educational psychology, started the second edition of his 428-page tome titled Motivating Students to Learn with the following summaries of two opposing views about how best to motivate students:

Learning is fun and exciting, at least when the curriculum is well matched to students’ interests and abilities and the teacher emphasizes hands-on activities. When you teach the right things the right way, motivation takes care of itself. If students aren’t enjoying learning, something is wrong with your curriculum and instruction — you have somehow turned an inherently enjoyable activity into drudgery.

School is inherently boring and frustrating. We require students to come, then try to teach them stuff that they don’t see a need for and don’t find meaningful. There is little support for academic achievement in the peer culture, and frequently in the home as well. A few students may be enthusiastic about learning, but most of them require the grading system and the carrots and sticks that we connect to it to pressure them to do at least enough to get by. [1]

Brophy observed that “neither [view] is valid, but each contains elements of truth.” They illustrate the two extreme ends of a continuum of views among psychologists of student motivation. At one extreme is a teaching philosophy based on what Brophy called “overly romantic views of human nature,” while at the other is a philosophy based on “overly cynical or hedonistic views of human nature.” Between these extremes lies a realistic and research-supported theory of student motivation.

The core message we deliver in Rewards: How to use rewards to help children learn – and why teachers don’t use them well is that too many teachers adhere to the first view and reject the use of rewards that have been proven to be effective in classrooms in carefully controlled studies covering many years and many thousands of students.

The well-designed reward systems we describe do not include the unearned praise and uncritical recognition associated with the self-esteem fad that swept the U.S. in recent years. Some writers observe that Millennials (persons born from the early 1980s to the 2000s, also called Generation Y) grew up believing that simply participating in a sport or “trying hard” at some other activity entitled them to rewards, regardless of their level of performance. As a result, they enter the workforce with unrealistic expectations of recognition, promotions, and pay increases [2]. Greater use of well-designed reward systems would have better prepared this generation for the challenges and responsibilities of adult life.

Rewards need not be crude “carrots and sticks”; rather, they can take the form of feedback and encouragement that make learning a rewarding experience long before the acquisition of a particular piece of knowledge or skill might earn material rewards. Learning without rewards is usually more difficult than learning with rewards. For this reason, the tendency among educators to discourage the use of rewards hurts rather than helps students.

Rewards and Learning

According to Aristotle, we become what we do [3]. Education contributes to that process by building skills and habits of mind that are learned in a variety of ways. Psychologists have identified incremental methods for helping individuals learn. Rewards constitute part of this learning enterprise when they help individuals attend to the short- and long-term goals that drive their learning [4].

When students learn something well, they reduce their costs of doing it; that is, they can use their well-absorbed knowledge or well-practiced skills nearly automatically, with little effort. The more automatic a requisite skill is, the faster a person reaches his or her goals. Skills such as recognizing letters exemplify the learning needed to reach the goal of reading. Students who struggle to distinguish a “b” from a “d” are unlikely to readily comprehend what they read. Once they achieve “automaticity” with such recognition skills, however, they can move on to word recognition and sentence comprehension. Mastering the prerequisite stages makes the later stages less costly in time and effort – even enjoyable. Just as practice in sports makes a physical skill more automatic, practice in reading makes a mental skill more automatic.

Students typically must exert effort over some period of time to acquire sufficient levels of automaticity to achieve rigorous goals. Ideally, schooling offers efficient means of allowing learners to improve their knowledge and skills and acquire increasingly advanced forms of both. Educators who use rewards to help learners persist in the face of challenging tasks to gain automaticity also help them reduce the amount of effort needed later to attain their ever more challenging goals. Appropriate rewards improve learners’ ability to perceive cues by guiding their attention to constructive action, reinforcing specific forms of learning, and rewarding high levels of achievement [5].

During learning, repetition can help individuals experience the pleasure of increasingly easy accomplishment. Repeated cycles of presentation, action, and reinforcement can foster high levels of mastery. Complex forms of personal achievement are possible only when individuals set progressively challenging personal goals requiring sustained drive or grit to attain. When the personal goals of these individuals align with those valued in the communities in which they live, they acquire social and material rewards [6].

Some credibility should be given to theories and evidence that employees may be more effective when they are involved in setting goals to which they commit themselves [7]. Students may similarly benefit.

Conclusion

Knowledge of the positive effects of rewards on motivation is well established in behavioral psychology despite the controversy in recent years over whether experimental evidence confirms or rejects the effectiveness of specific reward and punishment systems. Critics of the use of all or most rewards in learning are on the extreme end of a continuum of opinion on the subject. The results of rigorous research studies do not support their point of view, and they overlook or misrepresent research that contradicts their views.

Most experts recognize that reward systems are especially valuable at the earliest ages to help students attain the habit of deferring gratification. Failure to develop this habit can handicap learners for the rest of their lives. Students need rewards to engage in the difficult or tedious work of achieving automaticity, another key step in learning progress. Without rewards, fewer students develop the drive or grit needed to achieve high levels of skill.

Herbert J. Walberg and Joseph L. Bast are chairman and president, respectively, of The Heartland Institute and authors of Rewards: How to use rewards to help children learn — and why teachers don’t use them well (October 1, 2014; ISBN 978-1-934791-38-7). This article is excerpted from Chapter 1, “The Psychology of Motivation.”

Notes

[1] Jere Brophy, Motivating Students to Learn (Mahwah, NJ: Lawrence Erlbaum Associates, Publishers, 2004), p. 1.

[2] Ron Alsop, The Trophy Kids Grow Up: How the Millennial Generation Is Shaking Up the Workplace (San Francisco, CA: Jossey-Bass, 2008).

[3] Aristotle, Metaphysics, trans. Joseph Sachs (Santa Fe, NM: Green Lyon Press, 1999).

[4] See Theresa A. Thorkildsen, Courtney J. Golant, and Elizabeth Cambray-Engstrom, “Essential Solidarities for Understanding Latino Adolescents’ Moral and Academic Engagement,” in Cynthia Hudley and Adele E. Gottfried, eds., Academic Motivation and the Culture of Schooling in Childhood and Adolescence (Oxford: Oxford University Press, 2008), pp. 73–89.

[5] Jere Brophy, supra note 1; Dennis G. Wiseman and Gilbert H. Hunt, Best Practice in Motivation and Management in the Classroom (Springfield, IL: Charles C. Thomas, Publisher, Ltd., second ed., 2008).

[6] Julian L. Simon, Effort, Opportunity, and Wealth (New York, NY: Basil Blackwell, 1987).

[7] Edwin A. Locke and Gary P. Latham, A Theory of Goal Setting and Task Performance (Englewood Cliffs, NJ: Prentice Hall, 1990).

Categories: On the Blog

Science for the Picking

July 27, 2014, 9:45 AM

[Editor's note: The other day, Lawrence Kogan, an attendee of Heartland's Ninth International Conference on Climate Change, July 7-9 — which featured more than 100 excellent presentations in Las Vegas — wrote the article below. He asked me to republish it here, which I do gladly, with a recommendation that you read it in full.]

In a May commencement speech delivered at the University of California, Irvine, President Obama mocked members of the U.S. Congress who “duck the question,” as he put it, “of whether climate change is real by saying that they are not scientists.” Since then, articles appearing in a number of “neutral” media outlets, including the Washington Post, New York Times, Los Angeles Times, Miami Herald, San Francisco Chronicle, Cape Cod Times, Huffington Post, etc., have endorsed this learned approach to addressing the issue of climate change.

Clearly, they display climate change believers’ chosen tactic of ridiculing or dismissing as climate change deniers anyone — including scientists, analysts and politicians — who dares to raise questions about the views of many within the contemporary climate science community.

Perhaps a more thoughtful way to address this matter would be to question the president and his chief science adviser, John Holdren, concerning which paradigm of science they subscribe to. Is it the evolved modern notion of quantitative science built on the firm empirical principles of Newtonian physics, or is it a new postmodern brand of qualitative science incorporating precautionary and other subjective concepts, all infused with a certain sense of intellectual ascendancy?

If they subscribe to Enlightenment-era science, they should legitimately ask whether the myriad scientific uncertainties discussed in the Intergovernmental Panel on Climate Change (IPCC) assessment reports, and by extension, the administration’s national climate assessments, provide reason to question whether science has clearly identified the necessary causal links definitively establishing that humankind’s activities are primarily, if not exclusively, responsible for all or most global warming — both the warming observed to date and the future warming projected by computer models.

If however, they subscribe to what ‘futurist’ Jeremy Rifkin describes as “a radical new approach to science and technology based on the principle of sustainable development and global stewardship of the Earth’s environment” premised on “[t]he precautionary principle, [which] is designed to allow government authorities to respond pre-emptively, as well as after damage is inflicted, with a lower threshold of scientific certainty than has been the rule of thumb in the past,” they are likely to interpret the uncertainties reflected in the IPCC and administration-developed climate science assessments differently. It would certainly explain why the president has argued that immediate regulatory actions are necessary, notwithstanding the current costs, because the possible future endangerment to human health, the environment and the economy posed by inaction is unacceptable.

President’s pledge of “unparalleled transparency”

Consistent with the president’s pledge of “unparalleled transparency,” Mr. Holdren’s response would be illuminating. He would likely confirm the White House has, by design or happenstance, helped facilitate a subtle but nonetheless substantive change in the way Americans define, understand and apply principles of science to address perceived environmental and health risks.

If by design, it would reveal a non-transparent, activist-inspired agenda of shifting America away from the Enlightenment era paradigm of science premised on probability, risk assessment, prevention, causation and “hard” quantitative evidence of foreseeable harm toward the postmodern anti-Enlightenment era paradigm of science based on possibility, hazard assessment, precaution, correlation, coincidence and ‘soft’ qualitative evidence of possible and unforeseeable harm.

As the European experience demonstrates, the legal and economic consequences flowing from such a paradigm shift are real and present their own significant dangers. This new postmodern regulatory science paradigm relies more on politics and notions of policy-based science informed by environmental and social justice concerns rather than on science-based policy.

As a second question, one would ask how this shift is coming about. Mr. Holdren’s response would likely be more complex and nuanced. There are multiple mechanisms at play, ranging from moral suasion to fear mongering to nudged behavior modification to weakened scientific peer review processes.

EPA’s 2009 greenhouse gas endangerment findings: Less than stellar job of conducting proper and robust peer review

In an ideal world, Mr. Holdren and other Administration officials would acknowledge the Environmental Protection Agency (EPA) and the National Oceanic and Atmospheric Administration (NOAA) have done a less than stellar job of conducting proper and robust peer review of the climate science underlying EPA’s 2009 greenhouse gas endangerment findings. But we do not live in such a world. Indeed, they seem to have forgotten or conveniently explained away their obligation under the federal Information Quality Act to ensure each of the 28 highly influential scientific assessments summarizing and synthesizing IPCC ‘climate science’ supporting those findings had been adequately validated. This would not have been an insignificant undertaking, yet it was required in order to detect serious systemic violations of the letter and spirit of the Act, and failing to perform it is a serious omission.

Overwhelming evidence produced by the nonprofit Institute for Trade, Standards and Sustainable Development (ITSSD) demonstrates how EPA and NOAA, under Mr. Holdren’s stewardship of the White House Office of Science and Technology Policy-led interagency US Global Change Research Program, systematically circumvented the Information Quality Act’s most rigorous and least discretionary peer review, objectivity/bias, transparency and conflict-of-interest standards. Such evidence is contained in a recast 145-page annotated FOIA request ITSSD filed with EPA on June 30, 2014, and in a clarified FOIA request ITSSD filed with NOAA in May 2014.

As InsideEPA reported, the new EPA FOIA request seeks disclosure of “documents reflecting four different levels at which EPA should have followed IQA requirements.” By comparison, the EPA Office of Inspector General’s 2011 investigation of agency GHG endangerment finding data quality processes had been limited to only one of these levels. And as we previously reported in the Washington Times, the clarified NOAA FOIA request described how the National Research Council of the National Academies of Science had hand-selected scientists affiliated with universities receiving NOAA climate research grant monies to peer review six NOAA-developed climate assessments, presenting a significant appearance of, if not an actual, conflict of interest in violation of the IQA.

EPA’s decision to circumvent the requirements of the Information Quality Act

ITSSD’s findings and analysis demonstrate that EPA’s decision to circumvent the requirements of the Information Quality Act likely originated from within and beyond the agency itself. While the full breadth of EPA’s actions date back to before the beginning of the Obama administration, a 1989 law review article written by Harvard Law School Professor Laurence Tribe with the assistance of the then editor-in-chief of the Harvard Law Review, B. Obama (“Curvature of Constitutional Space”) provides a window into the president’s views on the future of public policy and science.

It reveals the president’s long-held skepticism about the intersection of science and the law especially where matters of environmental and social justice are concerned. It also sheds light on the president’s long-held view that science and the law can be finessed, where necessary, to suit policy ends.

We arrive, then, at a point where ends seem to justify means, fair and transparent processes fall by the wayside, and critics become villains. Partisans lob amusing but ultimately unsatisfying barbs at each other while the rules of science shift behind the curtain. Given the global economic and environmental challenges faced by our nation, we should expect and demand better.

—————–

Lawrence A. Kogan is chief executive of the Institute for Trade, Standards and Sustainable Development and managing principal of The Kogan Law Group, P.C. Richard D. Otis Jr. is an environmental-policy expert and has held senior positions at the EPA.

Categories: On the Blog

Republicans Will Run On An Obamacare Replacement in 2016 – Will Democrats?

July 26, 2014, 9:00 AM

The decision in the Halbig v. Burwell case this week was an unexpected legal boon to opponents of Obamacare. Spearheaded by the Cato Institute’s Michael Cannon and law professor Jonathan Adler, the case will almost certainly lead this debate about the text of the Affordable Care Act back to the Supreme Court. My colleague Sean Davis has written a comprehensive piece on the case, particularly on the nature of the supposed “drafting error” at its core.

But whatever the ultimate outcome for Halbig, the case serves as a reminder of the uneven ground on which Obama’s health care law is likely to be standing over the next two years. Whether facing challenges in the courts, or in implementation, as we saw in the GAO’s security report this week, or simply as a matter of political approval, Obamacare is going to be a subject of uncertainty in 2016, and its survival will depend on who wins the election, as I wrote here last month.

This raises an interesting question about how the presidential candidates will interact with the law. The law’s continued instability and problems will have to be answered – but the odd circumstance likely to result from the political frame of the issue is that Republicans will put forward a plan to replace Obamacare, but Democrats won’t.

One of the lazier memes of Democratic politicians and a few too many members of the media over the past several years has been the myth that Republicans have no alternative to Obamacare. This is the sort of thing that doesn’t pass even the most basic assessment of accuracy in reporting – here is a list of the health care reforms introduced by Republican House members in 2012, and here’s one for 2013. While their plans vary in scope, there are eight things Republicans generally agree about when it comes to health care reform:

  • They want to end the tax bias in favor of employer-sponsored health insurance to create full portability, either through a tax credit, deductibility, or another method;

  • They want to incentivize the reform of medical malpractice laws, likely through carrot incentives to the states;
  • They want to allow for insurance purchases across state lines;
  • They want to support state-level pre-existing condition pools;
  • They want to fully block grant Medicaid;
  • They want to shift Medicare to premium support;
  • They want to speed up the FDA device and drug approval process; and
  • They want to maximize the consumer driven health insurance model, making high deductible + health savings account plans larger and more attractive.

Now, some feel that none of these count as Obamacare replacements, because they aren’t aimed at doing the same things Obamacare does (namely, dramatically expanding the number of people on taxpayer-subsidized insurance or entitlement programs). But that’s how Republicans will present them. There have been a host of such plans introduced in the Congress and put forward by would-be Republican presidential nominees. And it stands to reason that in 2016, every serious candidate for the presidency in the Republican Party will put forward an alternate plan or endorse one that has already been introduced, if they haven’t already. The choice of nominee will also determine the choice of the replacement plan Republicans run on.

For the Democratic nominee, however, the challenge of running in defense of Obamacare could prove more difficult than might be anticipated. Obama’s law is sacrosanct for some factions of the Democratic Party. Many observers have cited a variety of poll data showing that Americans want to fix Obamacare as opposed to repealing it or keeping it as-is. But those “fixes” are largely vague at this juncture, thanks to the administration’s decision to enforce aspects of the law as it sees fit.

Assuming for the sake of argument that Hillary Clinton is the 2016 nominee, will she really be able to navigate the entire election without putting forward a plan on how she intends to fix Obama’s law? The likeliest scenario is that she will avoid saying anything that could cause her political difficulty – urging voters not to risk going back to a pre-Obamacare era, while acknowledging that certain aspects of the law need fixing in a general sense. But fixing Obamacare in the long term (particularly fixing the kind of problems that spawned the Halbig case) will almost certainly ultimately require legislative reform, not just administrative fixes.

If this is how the 2016 election plays out as it relates to health care policy, the amusing aspect is that – for all the talk of Republicans not having a plan to replace Obamacare – we will likely end up knowing a lot more about the plan Republicans intend to put forward after the election than the one Democrats have in mind.

Originally published at The Morning Consult. 

Categories: On the Blog

Ending the Costly Integration Ban

July 26, 2014, 9:00 AM

This past Tuesday the House of Representatives, in a bipartisan voice vote, passed a bill reauthorizing the Satellite Television Extension and Localism Act. STELA allows satellite providers, such as Dish Network and DirecTV, to import TV signals from other markets when their subscribers cannot pick up over-the-air local stations. The current STELA authorization expires at the end of this year, and the Senate is expected to act on an extension before then.

A provision in the STELA reauthorization bill would end the FCC’s set-top box “integration ban.” This outdated, costly FCC regulation bans cable operators from integrating the security and programming navigation functions in set-top boxes. The supposed rationale for the integration ban, which was implemented in 2007, was to promote the availability of an independent retail market in set-top boxes.

In short, from the very beginning, in light of the competition among multichannel video providers that already then existed, it was clear that the costs imposed by the mandated separation of security and program navigation functions outweighed the consumer benefits. Consumers never took to purchasing set-top boxes enabled with the “CableCard” technology. And all the while, robust competition among video service providers has been driving ongoing enhancements and new features in the video providers’ own set-top boxes, not to mention the various new innovative navigation devices used in conjunction with Internet video services.

Congressman Bob Latta, Vice Chair of the House Commerce Committee’s Subcommittee on Communications and Technology, deserves much credit for leading the effort in the House to eliminate the integration ban. It is his bill, H.R. 3196, the “Consumer Choice in Video Devices Act,” co-sponsored by Texas Congressman Gene Green, that is now incorporated into the STELA reauthorization. Indeed, shortly after Congressman Latta introduced H.R. 3196, he delivered a keynote address at a Free State Foundation event at which he explained that the integration ban already has resulted in over $1 billion in increased costs to consumers since it went into effect in 2007. The separation mandate imposed over $50 in additional costs on each leased set-top box. Moreover, as Congressman Latta said in introducing his bill: “In today’s ultra competitive video marketplace, cable operators have no incentive to make it more difficult for their customers to use their preferred devices to access their video programming services.”

I argued against adoption of the integration ban back in 2006 before the FCC implemented the mandated separation. Since then, I, along with other Free State Foundation scholars, have argued for its elimination on a regular, some might even say incessant, basis. Here is (it’s true!) just a small sampling of such Free State Foundation publications:

Free State Foundation Blogs:

The Integration Ban and Integrating DTV Transition Policy (2006); Integration Bans Then and Now (2007); National Broadband Plan: A Setback On Set-Top Box Regulation (2010); FCC Should Let the Sun Set on Its Set-Top Box Regulations (2011); It’s Time to Remove the Costly Integration Ban (2013); Switching Off an Outdated Cable Rule: End the Costly Integration Ban (2014); STELA Offers an Opportunity to Clean Out Old Cable Regulations (2014).

Perspectives from FSF Scholars:

Don’t Inflict Analog Era Equipment Rules On The Digital Age (2006); The FCC’s Continuing, Costly Video Navigation Device Regulation (2010); AllVid Regulation Risks Harm to Next Generation Video Innovation (2012); Consumers Would Benefit from Deregulating the Video Device Market (2013); It’s Time to Remove the Costly Integration Ban (2013); A Costly Affair: Retaining Outdated Set-top Box Mandates? (2014).

Normally, I wouldn’t string-cite so much of our previous work in one piece. In this instance, perhaps you can chalk it up to celebrating the adoption of the House bill that offers the prospect of ending the integration ban and pride in our own efforts to advance this reform cause.

Or perhaps you can chalk it up to wanting to provide plenty of readily available substantive educational material for the Senators who will now be considering a STELA authorization bill. There is no reason that the Senate should not follow the House’s lead in ending the costly integration ban. Indeed, for the benefit of consumers, there is every reason for it to do so.

PS – By focusing in this piece on ending the integration ban, regular readers know that, by no means, do I wish to imply that there is not a need to eliminate other outdated video regulations, many of which date back to the 1992 Cable Act or even before. Of course, twenty-two years ago the video marketplace was not nearly as competitive as it is today. Indeed, many of the Perspectives and blogs listed above, in addition to advocating an end to the integration ban, address the need for comprehensive reform of video regulations, such as, for example, eliminating or modifying retransmission/must carry, program carriage, channel placement, and basic tier “must buy” mandates.

Categories: On the Blog

Silicon Valley’s 6 Biggest Net Neutrality Fantasies Special Report

July 25, 2014, 2:14 PM

If Silicon Valley folks are indeed the smartest of the smart, how could they be so easily fooled on net neutrality?

Normally smarts distinguish between what’s testable and real versus what is the pixie-dust of dreams.

So where’s the real data and sound scientific thinking behind Silicon Valley’s grandiose net neutrality presumptions?

Why isn’t Silicon Valley adhering to its own data-driven, scientific decision-making principles?

Summary of Silicon Valley’s 6 Biggest Net Neutrality Fantasies:
1.    The law and courts don’t matter.
2.    Congress doesn’t matter.
3.    Calling for maximal regulation of ISPs poses no regulatory risk to Silicon Valley.
4.    “Separations” of the transmission component of telecommunications from the computer processing component is easy.
5.    Having to make the case for Title II forbearance isn’t acknowledging that Title II would apply broadly.
6.    The FCC’s forbearance tool is precise, controllable, and predictable.

Silicon Valley’s 6 Biggest Net Neutrality Fantasies:

Fantasy #1: The law and courts don’t matter.

Silicon Valley interests are effectively urging the FCC to ignore two obvious and core data points everyone knows, i.e. the FCC’s claimed authority on net neutrality has been overturned twice in Comcast v. FCC and Verizon v. FCC.

On top of that they want the FCC to effectively ignore the obvious main thrust of the D.C. Court of Appeals decision in Verizon v. FCC that it is illegal for the FCC to compel an information service provider to furnish service to others at no cost (p. 53).

That’s exactly what Silicon Valley interests are calling for in asking the FCC to regulate Internet peering for the first time to ban “fast lanes” and to compel ISPs to carry all downstream traffic at no cost, i.e. “zero pricing” (p. 60).

In effect, Silicon Valley is saying we don’t care what the law or court says, or what authority the FCC may have or not have, we want what we want and three FCC commissioners should give us what we demand.

Why is no one in Silicon Valley objectively looking at this available court “data” like a scientist would?

Fantasy #2: Congress doesn’t matter.

Another data point that’s central to this equation is that the FCC is a creature of Congress. Translation: the FCC works for and is funded by the Congress under the Constitution.

Consider how Silicon Valley interests are ignoring this obvious Congressional data relevant to their net neutrality regulatory algorithm.

The last study with data relevant to this test was Congress’s reaction to the FCC’s 2010 Open Internet Order. Then in letters to the FCC, Members of Congress opposed Title II reclassification by a 6-1 margin; 56% (299 of 535 members) wrote in opposition; see: House Democrat letter, House Republican letter, & Senate letter. A small minority of Congress 9% (49 of 535 members) wrote in support.

In the 2010 mid-term elections, more than 950 candidates ran for Senate or House seats, and only 95, or 10%, took the PCCC/FreePress net neutrality pledge. All 95 lost. Simply put, the electoral data on FreePress’ form of “real” net neutrality was 95-0 against.

In 2011, the House also passed a rare Resolution of Disapproval of the FCC’s 2010 Open Internet Order 240-179.

The House also passed a bill 244-181 to defund any FCC implementation of the FCC Open Internet Order.

In July of 2014, consider new data on how Congress reacts when the FCC seeks to ignore the law and Constitution and overreach its statutory authority.

The House voted this month 223-200 to prevent the FCC Chairman’s stated goal of over-riding state legislatures on the issue of municipal broadband networks.

In addition, Congress is seriously preparing to update the Communications Act for the 21st century, which would put all of the FCC’s obsolescing legal authority up in the air. Overstepping its authority and ignoring court rulings for a third time on net neutrality could put the FCC’s long-term authority to enforce net neutrality seriously at risk.

How is it smart scientific decision-making for Silicon Valley interests to wholesale exclude all of this “real” and relevant congressional data from their net neutrality advocacy algorithm?

Fantasy #3: Calling for maximal regulation of ISPs poses no regulatory risk to Silicon Valley.

Influential progressive and U.S. Senator Elizabeth Warren just spoke last week at NetRoots Nation, a convention for liberal bloggers and activists and laid out her vision for the 11 tenets of progressivism.

Her #3 was on net neutrality: “We believe that the Internet shouldn’t be rigged to benefit big corporations, and that means real net neutrality.” Note that she said “big corporations,” not big ISPs or big cable or telecom companies.

Consider the data that “big corporation” members of CCIA are calling for the “nuclear option” of Title II reclassification of broadband; and they are Google, with a valuation of $400 billion that makes it the third biggest corporation in America; Facebook, with a very big valuation of $180 billion; eBay, with a big valuation of $64 billion; and Yahoo, with a big valuation of $34 billion.

Consider the data that Silicon Valley’s Big Corporations are the least taxed, least regulated, most protected, most immunized from liability, and least diverse of America’s big corporations, and that at the same time they are officially calling for maximal permanent price regulation of another industry in a way that conveniently would enrich them with the corporate welfare of crony capitalism, i.e. multi-billion dollar government price subsidies paid entirely by average consumer internet users.

In what tested Silicon Valley algorithm does the simple equation of tinder + spark not = combustion?

What engineer logically would think of building on top of such an unstable foundation?

Fantasy #4: “Separations” of the transmission component of telecommunications from the computer processing component is easy.

Reclassifying the transmission component of broadband access as a telecommunications service separate from the computer processing component may appear to be a binary sort to computer engineers, but to regulators whose operative definitions are not scientific definitions, but legal-regulatory definitions, it would be like trying to unscramble eggs all over the country all the time.

No one knows better than computer engineers and scientists that continuous, analog, circuit-switched, networks represent radically different engineering and science from discontinuous, digital, packet-switched networks.

They understand the inherent “transmission” predictability of traditional “telecommunications” is by design the antithesis of the inherent “transmission” un-predictability of today’s Internet packet “information services.”

If they think about this in the regulatory context, computer engineers and scientists will grasp the folly and fantasy of a non-engineering, non-scientific FCC trying to look at today’s packet-switched Internet transmissions as if they were still the traditional circuit-switched “telecommunications” transmissions of yesteryear.

For example, is a best-efforts Internet transmission that can re-route or resend dropped packets via multiple routing points around the country and world, as easy to hive off, isolate, and track as continuous, circuit transmission “telecommunications” between two well-known points?

Classifying the “transmission” of data packets as analog “telecommunications” also would destroy the purposeful distinction between regulated and unregulated services first established in the 1956 AT&T consent decree between telecommunications and data services. Treating today’s data transmissions like the voice “telecommunications” transmissions of yesteryear also would eviscerate the very important practical separation of legacy common-carrier-regulated voice service and today’s unregulated Internet information services.

Speaking of “separations” policy, most in Silicon Valley are joyfully ignorant about the FCC’s “jurisdictional separations” policy to apportion the inter-state versus intra-state costs of telecommunications to determine appropriate telecommunication subsidies.

FCC “separations” activities are arguably the most arcane, complex and difficult activities the FCC has been responsible for administering, and that is when the transmission paths at issue are well-known, highly-predictable and measurable.

Imagine the mind-boggling new FCC “separations” effort required for the FCC to separate the pure transmission of packets from the computer processing of packets when all sorts of reasonable network management computer processing is embedded throughout the Internet network of networks to prevent denial of service attacks, viruses, spam etc.

If geographic separations of telecommunications involved mind-numbing micro-monitoring of transmission flows, imagine how beyond-mind-numbing the nano-monitoring needed to separate “pure” transmissions from “processed” transmissions would be.

Devising rules to do this, putting them out for comment, and reply comments, then passing a rule to implement them, and then designing and testing the technology and human oversight processes to implement and monitor them would take many years, and likely encounter complexity management problems like the Obamacare website suffered through last year.

What computer engineer and scientist would want to wish that amount of abject inefficiency on anyone?

Fantasy #5: Having to make the case for Title II forbearance isn’t acknowledging that Title II would apply broadly.

Once again, CCIA, of which Big Internet corporations Google, Facebook, eBay, and Yahoo are members, formally acknowledges that reclassification of broadband as a Title II service demands forbearance to ensure that it is not more broadly applied than Silicon Valley wants.

If Silicon Valley acknowledges that Title II would apply more broadly than just to ISP services, on what objective data or basis does Silicon Valley believe it would be perfectly exempt or immune?

If the FCC truly cared about preventing those with market power from discriminating, why wouldn’t the FCC consider applying potential Title II regulation to companies beyond ISPs that exhibit market power at the edge and also have a commercial incentive to discriminate?

What data is there that only ISPs discriminate and no edge company discriminates?

What data is there that market power only exists around broadband access and not in other parts of the Internet ecosystem involving Internet transmissions?

What data is there that shows that the type of private information that the FCC protects under Title II privacy regulations is not like the private information that edge providers use for commercial purposes without a consumer’s reasonable consent?

Simply put, what data gives Silicon Valley confidence that no Title II regulation could affect it?

Fantasy #6: The FCC’s forbearance tool is precise, controllable, and predictable.

Washington is not the predictable mathematical world of engineering or computer programming.

Washington is the much more uncertain and shifting world of public policy, arcane legal interpretation, politics, and public relations.

Those who assume the FCC could easily and surgically apply Title II regulation to specific companies, in specific services, in specific ways, through a specific and certain process, and on a specific timeframe are living in a fantasy world, not a data-driven, scientific one.

All the data available at the FCC — which chronicles every time the FCC has tried to officially apply forbearance — shows the process to be inherently slow, easily contested, and unpredictable.

What data or scientific method is Silicon Valley relying on to presume that forbearance for net neutrality would be easy?

What engineer would imagine that they and their partner contractor (the FCC) would be the only ones to determine whether the forbearance structure was sound and legal, with no building inspector (the courts) to check their work?

Conclusion:

The data above indicate that the smartest of the smart people of Silicon Valley have been fooled by their net neutrality movement allies. Much of what they have been assured is true is, in fact, not, when one takes the time and effort to examine the actual available data.

Somehow Silicon Valley leaders have let themselves collectively go out on a franchise-endangering limb without any of the data-driven or scientific rigor that they normally require to make important strategic business decisions.

Maybe the Silicon Valley mono-culture and group-think assumed that someone in Silicon Valley must have checked the data or tested the many fundamental assumptions undergirding Silicon Valley’s net neutrality position, so they did not have to worry about it.

Silicon Valley has been conned into thinking that pressuring the FCC to consider maximal regulation of a key part of the Internet ecosystem is safe to do, and also is in their self-interest, when ultimately it is not.

As the old poker adage says, if you look around the table and can’t identify the sucker, it’s you.


Capitalists Shouldn’t Be Afraid of Pope Francis

July 25, 2014, 9:00 AM

All over the world, advocates of the free market are looking askance at Pope Francis. Since succeeding Benedict XVI in 2013, Pope Francis has mounted a vocal challenge to what he sees as the now dominant global ideology of capitalism:

“While the earnings of a minority are growing exponentially, so too is the gap separating the majority from the prosperity enjoyed by the happy few. This imbalance is the result of ideologies that defend the absolute autonomy of the marketplace and financial speculation.”

The pope’s stance on capitalism and excessive acquisitiveness has earned substantial criticism from the conservative and pro-market media. Rush Limbaugh got a lot of publicity when he commented, “This is just pure Marxism coming out of the mouth of the pope.” Paul Ryan was slightly more charitable than Rush, arguing that Pope Francis was merely ignorant of capitalism. “The guy is from Argentina, they haven’t had real capitalism in Argentina,” Ryan said.

Ryan and Limbaugh represent a fairly accurate sample of the reactions of many leaders and commentators on the right. They conclude either that the pope is a left-wing radical or that he is simply unaware of the world outside of Argentinian crony capitalism. Neither of these views is a fair or accurate representation of what the pope is talking about.

The Catholic Church and Capitalism

In reality, Pope Francis is not suggesting a radical alteration of traditional Catholic teaching. Indeed, even his immediate predecessors were critical of the “excesses” of the capitalist economic system. Pope Benedict XVI, in the wake of the Great Recession, was extremely critical of the financial institutions at the center of the crisis:

“In recent years, a new cosmopolitan class of managers has emerged, who are often answerable only to the shareholders generally consisting of anonymous funds which de facto determine their remuneration…Profit is useful if it serves as a means towards an end. Once profit becomes the exclusive goal, if it is produced by improper means and without the common good as its ultimate end, it risks destroying wealth and creating poverty.”

Even the pope most favorably disposed toward capitalism in recent decades, John Paul II, was not without reservations about unfettered capitalism. In his 1991 encyclical Centesimus Annus, John Paul wrote that, “The free market is the most efficient instrument for utilizing resources and effectively responding to needs. But there are many human needs which find no place on the market.” John Paul argued that the countries then coming out of the shadow of Soviet Communism should adopt a free enterprise system, but that it would have to be constrained not just by legal boundaries, but also by “ethical and religious” ones.

There is no doubt that throughout the Cold War era, and in the two decades of American ascendancy after its conclusion, the Catholic Church hierarchy was, by and large, rather well disposed toward the capitalist world order led by the United States. From the perspective of self-preservation, this made perfect sense. Taken in that historical context, it is understandable that the Catholic Church now feels comfortable shedding its close attachment to the system of global capitalism. The threat of Communism has been vanquished, and the Church’s long-term survival is largely secure thanks to large and growing congregations of adherents in Latin America and Africa. Ironically, the very success of capitalism and free trade in defeating its ideological rivals has made the world safe for a Catholic Church that can pursue an independent economic worldview.

Ethical Capitalism

Understanding that Pope Francis belongs to a long tradition of Church skepticism toward unregulated free markets (and is not a Latin American Marxist aberration) is an important step toward understanding where Catholic teaching is heading and what that means for the rest of the world. What exactly is the pope proposing?

Basically, Pope Francis is arguing for a change in the culture of capitalism. According to Archbishop Timothy Dolan, the pope is advocating a kind of “virtuous capitalism.” Dolan says Pope Francis desires to “remind us that free economic activity should indeed be pursued, but the human dignity of our needy brothers and sisters must always be at the center of our attention.”

Taken in this light, Pope Francis is not a radical, but simply an advocate for the welfare of the needy. Charity is one of the central tenets of Catholicism, so this should come as little surprise. The pope seems skeptical of the ability of unregulated capitalism to provide for those who are vulnerable. That may rub champions of the free market the wrong way, but what do they expect from the leader of an organization that is nominally committed to the welfare of all people and to providing for the needy and suffering? The world is unquestionably full of a great deal of suffering, so it is unsurprising that a man who has dedicated his life to service would be upset at the economic system he sees prevailing around the world.

Squaring the Circle

Rather than dismissing Pope Francis as an anti-free market socialist, supporters of free enterprise ought to pay more attention to what he is saying. Corporations, if the recent Supreme Court case concerning Hobby Lobby is to be accepted, are capable of holding ethical stances. Is it anti-capitalist to challenge firms, and their owners and managers, to think in terms of ethics and morality? It seems like corporations ought to be able to act in responsible ways.

The best way to defeat the proponents of statism and socialism is not to reject what Pope Francis is saying, but to consider carefully what he is saying. If the free market is to survive, the actors within it must behave in such a way as to not exploit the weakness of others. That can, and often does, happen in well-functioning free markets. We should pray that such moral capitalism will prevail all over the world.


Mismanagement, Not Global Warming, Caused Chicago Sewage Overflows

July 24, 2014, 10:55 AM

Global warming is not the reason why Chicago’s 1800s-era sewer system occasionally floods people’s basements, despite Washington Post propaganda to the contrary. Instead, the culprits are the age of Chicago’s sewer system and the city’s tremendous population growth since the 1800s.

Utilizing global warming alarmists’ same tired playbook of mischaracterized anecdotes, the Washington Post published an article this morning highlighting the story of a Chicago woman whose basement flooded when the city’s aging sewage system could not adequately discharge water during a strong rainstorm. According to the Post, because sewage overflow occurred and because global warming is also occurring, global warming must be to blame for such unwelcome sewage overflow.

The facts tell a completely different story. Chicago has the oldest sewage system among large American cities. As the Post acknowledged, Chicago’s sewage system is ancient and obsolete, “designed to absorb rain nearly 120 years ago.” Of course, 120 years ago, Chicago’s population barely topped 1 million people. Today, Chicago’s population is nearly 3 million people.

Chicago sits on a flat plain that makes effective water and sewage removal particularly problematic. The Chicago River used to be “little more than a creek” that swelled dramatically during rainfall or snowmelt events. Despite its small size, the river accomplished its natural purpose well, quickly discharging excess water into nearby Lake Michigan.

As the city grew, however, civil engineers in the 1800s devised a scheme that reversed the flow of the Chicago River and required the city’s sewage and water overflow to traverse a much longer route into the Mississippi River basin. Sewage overflows during rain events occurred almost from the start. According to the Encyclopedia of Chicago, “With Chicago’s continued growth, this system could not maintain the reversal under adverse weather conditions.” These sewage system failures, of course, began long before humans drove SUVs and derived electricity from coal-fired power plants.

Objective data and peer-reviewed studies show no increase in high-flow (flooding) events for streams and rivers still in a predominantly natural state. The only increase in flooding occurs in rivers and streams altered by human population growth and civil engineering, as is the case with the Chicago River. Unless one chooses to argue that global warming causes human population growth, especially in urban areas, there is absolutely no link between increased global warming and flooding events like those that occur in Chicago.

Indeed, Chicago’s sewage failures stand in stark contrast to those of other cities that more effectively upgrade their sewage systems and sewage capacities to keep pace with urban water demands. Even woefully managed Detroit has reduced its sewage overflow events by 80 percent since 1995. One cannot claim global warming is to blame for Chicago sewage overflow events unless one similarly claims global warming deserves credit for the dramatic decline in Detroit sewage overflow events.

Even in Chicago, sewage overflow events are poised to largely become a thing of the past. As the Post acknowledged, the city is in the process of expanding its sewage capacity, which should triple by the end of next year. By 2029, the city’s sewage capacity will be more than 600% of current capacity.

When Chicago’s sewage overflow events soon become a thing of the past, will the Washington Post credit global warming? Don’t bet on it.

[Originally published at Forbes]


Appeals Court Slightly Wounds Obamacare

July 24, 2014, 10:48 AM

It’s too soon for champagne, but perhaps a beer is in order.

In a 2-1 decision in the case of Halbig v. Burwell, a three-judge panel of the U.S. Court of Appeals for the District of Columbia Circuit has ruled that the Internal Revenue Service cannot interpret the Affordable Care Act, also known as Obamacare, as allowing subsidies for those Americans who purchase health insurance from the federal health insurance exchange known as Healthcare.gov. This is because the text of the law specifies that subsidies or tax credits are available for insurance purchased on state-created exchanges.

Later on Tuesday, the Fourth Circuit Court of Appeals ruled oppositely: that the subsidies are permissible for the federal exchange. More on this in a moment.

Should the D.C. Circuit’s ruling ever actually take effect, those who purchased Obamacare insurance in a state that did not create its own exchange, but instead relied on the federal exchange, would have to cover the full cost of their insurance rather than have others pay for some share of it. (What a novel concept in Barack Obama’s America!)

A lower court had ruled that the intent of the law was to permit subsidies for insurance purchased on either a state or federal exchange, but the panel ruled otherwise: “Because we conclude that the ACA unambiguously restricts the section 36B subsidy to insurance purchased on Exchanges ‘established by the State,’ we reverse the district court and vacate the IRS’s regulation.”

The ruling comes down to whether the IRS was permitted to interpret the law as it did, under a relatively lenient standard: “we will uphold an agency action unless we find it to be ‘arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law.’”

Despite language that clearly states that tax credits apply to insurance purchased on a state exchange, the IRS’s relevant regulation allowed them “regardless of whether the Exchange is established and operated by a State (including a regional Exchange or subsidiary Exchange) or by HHS.”

Standing for the case arose not primarily from the subsidy issue, but rather from the not entirely obvious fact that permitting subsidies on the federal exchange also “significantly increases the number of people who must purchase health insurance or face a penalty” through the individual mandate provisions of Obamacare.

According to the D.C. Circuit’s opinion, the impact on employers through the employer mandate is even more significant: “If credits were unavailable in states with federal Exchanges, employers there would face no penalties for failing to offer coverage. The IRS Rule has the opposite effect: by allowing credits in such states, it exposes employers there to penalties and thereby gives the employer mandate broader reach.”

The potential impact of the ruling cannot be overstated. About two-thirds of the 8 million Americans who (we are told) have purchased health insurance through an exchange did so through Healthcare.gov because only 14 states have functioning state-based exchanges.

The D.C. Circuit majority closes their opinion with a paragraph that is worth reading in its entirety:

We reach this conclusion, frankly, with reluctance. At least until states that wish to can set up exchanges, our ruling will likely have significant consequences both for the millions of individuals receiving tax credits through federal Exchanges and for health insurance markets more broadly. But, high as those stakes are, the principle of legislative supremacy that guides us is higher still. Within constitutional limits, Congress is supreme in matters of policy, and the consequence of that supremacy is that our duty when interpreting a statute is to ascertain the meaning of the words of the statute duly enacted through the formal legislative process. This limited role serves democratic interests by ensuring that policy is made by elected, politically accountable representatives, not by appointed, life-tenured judges.

This is a fascinating bit of legal writing: On one hand, the judges suggest some sympathy for those Americans who are being put through uncertainty and perhaps even harmed financially by the panel’s obviously correct ruling. On the other hand, they issue a warning that doing anything other than what they did — standing up for the plain meaning of the law’s language — would be anti-democratic.

The warning has a very specific audience: the other judges of the D.C. Circuit Court of Appeals.

A party to a case may request an en banc hearing where the entire court rules, rather than just the 3-judge panel. In a press conference on Tuesday morning, White House Press Secretary Josh Earnest (one of the most ironically named public officials in memory) said that he anticipates that the Department of Justice “will ask for a ruling from the full D.C. Circuit.”

Late last year, anticipating this very moment, Senate Majority Leader Harry Reid changed Senate filibuster rules in order to permit the packing of this court with liberals. The court had been evenly divided between Republican and Democratic appointees and it was well known that the court did not have enough work to support even its existing judges. But the Obama administration knew that this day would come and that they would need an unbalanced court to defend the unconstitutional, poorly conceived, and poorly written law that is considered this president’s “signature achievement” — a dubious compliment at best.

An en banc hearing of a court with seven Democrat appointees and four Republican appointees is likely to overturn the panel because liberals, as constituted in 2014, simply do not believe in the rule of law. It is therefore unsurprising that Josh Earnest sounded unconcerned, even dismissive of Tuesday’s ruling: “For those who are keeping score, we’re still ahead two to one here.”

Earnest was referencing the fact that lower courts have ruled for the government in this case. There are other cases making their way through the appellate process now. If the D.C. Circuit en banc reverses its own panel and if the other circuits rule in favor of the government — which seems more likely than not — that would leave the possibility for the Supreme Court to refuse to hear the case.

Typically, the Supreme Court will “grant cert” when the circuits are divided, though it also takes many cases where an appeals court is simply wrong (in the opinion of a majority of the Justices). So while the Obama administration will work hard to get an en banc reversal of the panel to prevent a split among the circuits, the resolution of the question is most likely to hinge on whether Chief Justice John Roberts is looking for an opportunity to reverse his error in NFIB v. Sebelius in which he upheld the constitutionality of Obamacare — or whether he does not want to go through that morass again.

Should the Supreme Court take the case, it would suggest that Roberts has reconsidered and that a 5-4 opinion along the same lines as Tuesday’s D.C. Circuit ruling would eventually issue.

In that case, the entire structure of Obamacare will fall under its own cost and the enormous and impermissible differences between the operations of insurance exchanges in some states versus in other states. Unsubsidized millions will cancel their insurance. The resulting massive decline in revenue as well as actuarial changes in the composition of the insured (since the sickest will likely keep their insurance regardless of subsidy) will create a financially unsustainable system — or rather will hasten the recognition of socialized medicine as unsustainable. The political turmoil will also be enormous — but necessary and perhaps ultimately beneficial as long as the left is prevented from using another crisis to increase the size and scope of government.

The D.C. Circuit’s opinion, written by Judge Thomas Griffith, is obviously correct to anyone who believes that laws mean what they say and that, as concurring Judge A. Raymond Randolph notes, quoting Justice Brandeis, when a law omits something, it’s not a judge’s job to fix it: “What the government asks is not a construction of a statute, but, in effect, an enlargement of it by the court.… To supply omissions transcends the judicial function.”

Just try to tell that to the judges just appointed to the D.C. Circuit Court of Appeals by President Obama and seated thanks to Harry Reid’s upending of generations of Senate protocol in order to protect Obamacare from this very occurrence.

And just try to tell that to the judges of the Fourth Circuit Court of Appeals, based in Richmond, which, just a few hours after the D.C. Circuit’s opinion was released, ruled unanimously that tax credits for the federal exchange are in fact legal. Their ruling was based on a claim that “the applicable statutory language is ambiguous and subject to multiple interpretations.” Clearly these people went to the Bill Clinton school of English and are unsure whether “state” means “state.”

While the dispute among the circuits would normally point to the Supreme Court taking the case, the near certain en banc hearing in the D.C. Circuit and the likely overturning of the panel means that the Supreme Court may not face divided circuit opinions pushing them to hear the cases. In that case, the chances of the case being heard hinge almost entirely on whether John Roberts wants to revisit Obamacare — something I very much doubt.

While today’s ruling is welcome, it is far from dispositive.

It’s too early for champagne — and there may well never be occasion for a bigger celebration if John Roberts is not looking for vindication. But for the welcome reminder that Obamacare is fundamentally flawed and that it’s not a judge’s job to fix a bad law, you can at least pop open your favorite beer or pour a nice glass of Pinot Noir (or, if you prefer, Amarone) and toast the fact that all is not yet lost.

 

[Originally published at the American Spectator]


Is Tax Inversion Really Unpatriotic?

July 24, 2014, 10:27 AM

The subject of tax inversion, in which American firms avail themselves of lower tax rates by merging with companies in foreign countries, has become very topical in the last couple of weeks thanks to a decision by Abbvie, a drug company, to merge with Shire, an Ireland-based firm, and move its headquarters overseas. One of at least 47 tax inversions in the last decade, the Abbvie-Shire deal is the largest such action yet, worth $54 billion. Perhaps unsurprisingly, President Obama and Democrats in Congress have become apoplectic with rage at the audacity of a business making a prudent decision to escape bloodsucking taxes.

The president has spewed a load of bile at all tax inverters, saying that they “are essentially renouncing their American citizenship so that they can ship their profits overseas to avoid paying taxes—even as they benefit from all the advantages of being here in America.” Treasury Secretary Jack Lew has echoed this sentiment, calling for businesses not to move abroad and to show “economic patriotism” (a term that carries some unsettling notes of mercantilism). Obama and co. have decided that this subject is a major vote-winner in the run-up to the midterms, so be ready for more business-bashing in the months ahead.

What is so strange about the attitude of Obama and his cronies is that they seem dead set on blaming the businesses for “not playing fair” and running off to more business-friendly lands. But that is exactly what America has traditionally done to other countries. As a president set on making America a more “responsible” player on the world stage, he turns to threats of economic violence awfully quickly. He seems perfectly at home using America’s economic clout to aid American business abroad through entities like the Export-Import Bank. Apparently, businesses are only “patriotic” when they are friendly to his administration.

Also, whatever happened to Obama’s whole “corporations aren’t people” shtick? Are we to believe that a corporation has a duty of patriotic loyalty (economic or otherwise) but is not entitled to speech? Or is he referring to the managers and owners who took the dastardly action of defending their private interests against the anti-business, anti-market attitudes of the present administration? Whatever Obama means, he is totally off the mark.

The simple fact is that business is international. Corporations are frequently not solely of one country, and they certainly are not the exclusive property of one country. Moving to a friendlier climate is no reason to further restrict free trade, enact ruinous retroactive regulations, or attack the businesses that are simply responding to economic pressures. Yet those are the exact responses the Obama administration is considering.

The problem with the way Obama and his friends in Congress are responding to tax inversion is that they are blaming the businesses for responding to pressures created by the government. America has prospered thanks to its open economy and the easy business environment it nurtured. Those advantages have been eroded by the twin forces of increasing taxes and regulations at home and their relative decline abroad.

Businesses leaving is a sign that something has to change. Obama should take the hint and start restoring free enterprise to America. If he continues down the path he’s treading, he will only succeed in driving more firms overseas.


Five Fatal Flaws of Solar Energy

July 24, 2014, 9:21 AM

The sun is the most important energy source on Earth. It provides our daily warmth and light and the rotation and orbit of the earth turn its steady output into fluctuating day and night, summer and winter. Solar energy powers the growth of all trees, grasses, herbs, crops and algae; it creates the clouds and powers the storms; it is the source of all hydro, photo-voltaic (PV), solar-thermal, bio-mass and wind energy; and, over geological time, it also creates coal.

PV solar panels are useful in remote locations, for some portable applications and, with enough panels and batteries, stand-alone solar can even power homes.

But solar energy has five fatal flaws for supplying 24/7 grid power.

Firstly, sunshine at any spot is always intermittent and often unreliable. Solar panels can only deliver significant energy from 9am to 3pm – a maximum of 25% of each day. Solar can often help supply the hot afternoon demand for air conditioning, but demand for electricity generally peaks at about 6.30pm, when production from solar is usually zero.

Secondly, to be a stand-alone energy supplier, PV solar needs batteries to cover those times when solar is not producing – about 75% of the time under ideal cloudless skies. To charge the batteries for continuous power, while also supplying usable power, a solar plant can only deliver a theoretical maximum of 25% of its day-time capacity. The chance of cloudy days will greatly increase the battery storage needed, and the generating capacity absorbed in charging the batteries. Currently, only pumped hydro storage could possibly supply the storage capacity needed and then only at massive cost, in a few suitable locations.
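
A rough arithmetic sketch of that 25% ceiling, using the idealized assumptions in the paragraph above (roughly six usable sun-hours per day, cloudless skies, lossless batteries); the 1,000 kW plant rating is a hypothetical round number, not a figure from the article:

    # Idealized sketch of the 25% ceiling for a stand-alone solar plant.
    # Assumptions (from the text): ~6 usable sun-hours per day, no clouds,
    # lossless battery storage. The 1,000 kW rating is a hypothetical example.

    peak_capacity_kw = 1000      # output while the sun is at usable intensity
    sun_hours = 6                # roughly 9am to 3pm
    hours_per_day = 24

    daily_energy_kwh = peak_capacity_kw * sun_hours         # all generation happens in 6 hours
    continuous_load_kw = daily_energy_kwh / hours_per_day   # what can be supplied around the clock

    print(continuous_load_kw)                       # 250.0
    print(continuous_load_kw / peak_capacity_kw)    # 0.25, i.e. 25% of day-time capacity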

Thirdly, solar energy is very dilute, so huge areas of land are needed to collect industrial quantities of energy.

If it were possible to anchor a solar collector one meter square at the top of the atmosphere, aligned continuously to face the sun, and never shadowed by the earth or the moon, it would receive solar energy at the rate of 1,366 Watts per square metre (W/m2) – that would power 13 light bulbs each using 100 watts.

If that panel were located on the surface, at the equator, under clear skies, aligned continuously to face the sun, and never shaded by the earth or the moon, solar energy dissipated by the atmosphere would reduce energy received to 1,000 watts.

In the real rotating world, where sunshine only reaches usable intensity for about 25% of the time, the best-located panel would have a capacity factor of about 17% – it would receive an average of about 170 watts – not quite two light bulbs.

PV solar panels convert solar energy to electrical energy at an efficiency of about 15%. Thus our panel, at the equator, averaged over the year, should deliver about 25.5 watts of electrical power – one very dim light bulb.
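
That chain of figures can be reproduced in a few lines; the inputs below are the article’s own round numbers, not measurements:

    # Reproducing the article's chain of round numbers for one square metre of panel.

    top_of_atmosphere_w = 1366      # W/m^2, solar constant
    surface_clear_sky_w = 1000      # W/m^2 at the equator, clear sky, sun overhead
    capacity_factor = 0.17          # best-case allowance for night and sun angle
    pv_efficiency = 0.15            # typical panel conversion efficiency

    average_sunlight_w = surface_clear_sky_w * capacity_factor    # about 170 W
    average_electric_w = average_sunlight_w * pv_efficiency       # about 25.5 W

    print(average_sunlight_w, average_electric_w)   # 170.0 25.5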

Away from the equator, solar energy hits the Earth’s surface at an angle, thus delivering less energy per panel. This useful site shows how solar intensity varies with latitude in Australia:
http://www.bom.gov.au/jsp/awap/solar/index.jsp?colour=colour&time=latest&step=0&map=solarave&period=3month&area=nat

Shift that panel to Melbourne, add clouds, shading, urban air pollution and dirt on the panels, and fix it to a sloping roof often aligned poorly to collect sunshine, and it is time to start the diesel generator in the car port.

It is sensible to use otherwise unused space like roofs for solar collectors, but such fragmented facilities will never match a compact, well-designed solar plant in construction, maintenance and cleaning costs, or come close to achieving the correct panel orientation.

People underestimate the land needed for significant solar collectors. In a learned paper published in 2013, Graham Palmer has produced a credible calculation that it would need a square with 31 km sides, completely filled with PV panels, to collect energy equivalent to Australia’s annual electricity requirements.
Source: http://www.mdpi.com/2071-1050/5/4/1406

To also charge batteries to maintain steady supply from a stand-alone solar facility would require at least four times this area – imagine 3,844 square kilometres of collectors, even if suitable battery technology were available.
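
The land-area arithmetic checks out under those assumptions:

    # Palmer's 31 km x 31 km square of panels, then the article's factor of four
    # once the same panels must also charge batteries for round-the-clock supply.

    side_km = 31
    panel_area_km2 = side_km ** 2            # 961 km^2
    with_storage_km2 = 4 * panel_area_km2    # 3,844 km^2

    print(panel_area_km2, with_storage_km2)  # 961 3844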

In addition, PV panels start to degrade in rain, hail and sunshine from the day they are installed, some panels losing significant capacity in as little as three years. And unless they are washed regularly, dust and bird poop degrade their performance even more quickly. All those sparkies checking panel performance and all those cleaning ladies with mops need access roads – this greatly increases the area needed for industrial solar installations.

The fourth fatal flaw of solar energy is the pernicious effect of the dramatic fluctuations in supply on the reliable and essential parts of the grid. When solar electricity floods the network around mid-day, the back-up stations have to throttle back, all the stations needed for stability and backup have their profits reduced, and some may be forced to close, making the network even more fragile and prone to blackouts. Then if a cloud floats across the sky, the backups have to re-start swiftly.

Fifthly, large-scale solar power will create environmental damage over large areas of land. Solar collectors may only manage to convert about 10% of the sun’s energy into electricity, the rest being reflected or turned into heat. But the whole solar spectrum is blocked, thus robbing 100% of the life-giving sunshine from the ground underneath, creating a man-made solar desert. For solar thermal, where mirrors focus intense solar heat to generate steam, birds that fly through the heat beams get fried. Why would true environmentalists support industrial-scale solar energy collection?

All consumers should be free to use solar energy in their own way at their own cost. But these five fatal flaws mean that collecting solar energy will never play more than a minor and very expensive role in supplying grid power.

Desertec, the utopian US$560 billion project designed to cover 16,800 square km of the Sahara Desert with solar panels, and then export electricity 1,600 km to Europe, has collapsed.

A similar fate awaits other attempts to extract 24/7 grid power from intermittent, unpredictable and dilute solar power.

The latest “Desertec Idea” is “solar roads” where highways are paved with solar panels. Imagine the construction and maintenance costs, the length of transmission lines, and the problems of shading and abrasion by traffic, the hazards of cleaning and the random non-ideal orientation of the panels.

Not with my money thanks.


Showing the Flag: The Transit Policy Failure

July 24, 2014, 9:17 AM

David King has a point. In an article entitled “Why Public Transit Is Not Living Up to Its Social Contract: Too many agencies favor suburban commuters over inner-city riders,” King, an assistant professor of urban planning in the Graduate School of Architecture, Planning and Preservation at Columbia University, notes that transit spends an inordinate share of its resources on suburban riders, shortchanging the core city riders who cost transit agencies far less to serve and are also far more numerous. He rightly attributes this to reliance on regional (metropolitan area) funding initiatives. Many in transit think it is necessary to run near-empty buses in the suburbs to justify the use of transit taxes to suburban voters (what I would refer to as “showing the transit flag”).

King asks: “So does public transit serve its social obligations?” He answers: “Increasingly the answer is no.” King is rightly concerned about the disproportionate growth in spending on commuter rail lines that carry transit’s most affluent riders from deep in the suburbs to downtown. Transit policy has long been skewed in favor of the more affluent suburban dwellers in the United States.

My Experience in Los Angeles

I saw this first-hand as a member of the Los Angeles County Transportation Commission (LACTC). When we placed what was to become the first regional transit tax on the ballot (Proposition A in 1980), the shortage of transit service was critical in the highest-demand, largely low-income areas, such as central Los Angeles and East Los Angeles. I described the situation in a presentation to the annual conference of the American Public Transportation Association: “Often waiting passengers are passed at bus stops by full buses.” Approximately 40 percent of the local bus services between the Santa Monica Mountains, Inglewood, Compton, Montebello and Santa Monica reached peak loads of 70 passengers, well above seating capacity.

At the same time, suburban area buses were usually less than half full. In connection with this concern, I produced a policy paper, Distribution of Public Transit Subsidies in Los Angeles County, which was published by the Transportation Research Board. The abstract follows:

“Public transit today is faced with the challenge of serving its clientele while subsidies are failing to keep pace with increasing operating costs. In Los Angeles County, there are service distribution inequalities–overcrowding and unmet demand in some areas and, at the same time, surplus capacity in other areas. To use subsidy resources efficiently requires that the effects of present subsidy allocation practices be understood–that is, how subsidies are translated into consumed service, both by type of service and by geographic sector within the urban area. An attempt is made to provide a preliminary understanding of that distribution in Los Angeles County. It is postulated that significantly more passengers are carried per dollar of subsidy in the central Los Angeles area than in other areas and local services require a lower subsidy per passenger than do express services. A number of policy issues are raised, the most important being the very purpose of public transit subsidies.”

Generally, transit operating subsidies per passenger were far higher in the suburbs than in the central area (where incomes are the lowest, and poverty rates the highest), and subsidies were much higher for commuter express services than for local bus services.

I attempted to address this problem by proposing a “Mobility Policy” that would have reallocated service based on customer needs, giving precedence to areas where mobility was restricted due to limited automobile availability and lower incomes. Some colleagues whose constituents would have been disadvantaged by this reallocation objected, feeling compelled, it appeared, to rally around the “transit flag.”

On a Siding: Transit Policy in Recent Decades

Since that time, Los Angeles and other major metropolitan areas have built expensive rail and busway systems. Despite the promises of attracting people out of their cars (routinely invoked during election campaigns for higher taxes), the reality is that single-occupant commuting has risen from 64 percent in 1980 to 76 percent in 2012. Over the same period, transit’s share of urban travel has fallen, though it has stabilized in recent years at very low levels in most metropolitan areas. Indeed, when the metropolitan areas with “transit legacy cities” (New York, Chicago, Philadelphia, Washington, Boston, and San Francisco) are excluded, the remaining 46 major metropolitan areas have a transit commute share of just three percent. Overall, more people work at home than commute by transit in 38 of these metropolitan areas, and more people walk or cycle to work in 27, according to American Community Survey 2012 data.

Yet the politically driven inequality in transit spending continues. Transit subsidies continue to be far higher for services patronized by more affluent riders. For example, subsidies (operating and capital expenditures minus fares) are three times as high for commuter rail services, with their higher-income riders, as for buses, with their lower-income riders (Figure).

The difference can be stark, as an example from the New York area indicates. A Fairfield County, Connecticut commuter rail rider with the median family income of $102,000 would be subsidized to the extent of $4,500 per year (assuming the national subsidy figure). By comparison, a worker from the Bronx or Hudson County, New Jersey, with a poverty-level family income of $18,500 per year (or less), would be subsidized only $1,500 per year. In fact, the bus subsidy would likely be even lower, because transit in lower-income areas is much better patronized and thus less costly for the public. My Los Angeles research found inner-city services to be subsidized at roughly half the average of all bus services (Note).
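
On a per-trip basis, using the assumption stated in the note below of 225 round trips per year (the dollar figures are the article’s own):

    # Per-round-trip view of the New York-area example, assuming 225 round trips
    # per year as stated in the note at the end of this article.

    round_trips_per_year = 225

    rail_subsidy_per_year = 4500   # Fairfield County commuter rail rider
    bus_subsidy_per_year = 1500    # Bronx / Hudson County bus rider

    rail_per_round_trip = rail_subsidy_per_year / round_trips_per_year   # $20.00
    bus_per_round_trip = bus_subsidy_per_year / round_trips_per_year     # about $6.67

    print(round(rail_per_round_trip, 2), round(bus_per_round_trip, 2))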

Where Transit Works

The functional urban cores contain the nation’s largest downtowns (central business districts). Their population densities are nearly five times that of the older suburbs and nine times that of the newer suburbs. The functional urban cores have transit market shares six times that of the older suburbs and 15 times that of the newer suburbs. Yet it is in these poorer, denser areas where overcrowding is most acute and the need for more service is greatest. In Los Angeles, for example, the greatest potential for increasing transit ridership is where ridership is already highest.

The vast majority of suburban drivers are not plausible candidates for transit, simply because it cannot compete well with automobiles, except, for example, for some trips to the downtowns of the six transit legacy cities (which account for only one in seven jobs in their respective metropolitan areas).

Where transit makes sense, people ride. Where it doesn’t, they don’t. Allocating resources inconsistent with this reality impairs the mobility of lower income residents, wastes resources and relegates transit to an inferior role in the city. Charging the affluent fares well below the cost of service compromises opportunities to serve more people in the community.

Better allocation of transit resources would likely improve core area unemployment rates by increasing the number of jobs that can be accessed by lower income workers. Further, because the better used services would require lower subsidies, there would be funding available for additional service expansions.

The principal fault is not that of transit management. It’s the politics.

—–

Note: These data (expenditures per boarding) are estimated from Federal Transit Administration and American Public Transportation Association data for 2012. Commercial revenues other than fares are excluded (the most important such source is advertising). Debt service is also excluded because it is not reported in the annual reports of either organization. The subsidy ratios between lower-income and more affluent riders would be changed by including transfers (though the subsidies would still be considerably higher for the more affluent). Some low-income riders use more than one bus or rail vehicle for their trip, while some commuter rail riders transfer to bus or rail services at one or both ends of their trips. No data are readily available to make such an adjustment. The New York area example assumes 225 round trips per year.

—–

Wendell Cox is principal of Demographia, an international public policy and demographics firm. He is co-author of the “Demographia International Housing Affordability Survey” and author of “Demographia World Urban Areas” and “War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.” He was appointed to three terms on the Los Angeles County Transportation Commission, where he served with the leading city and county leadership as the only non-elected member. He was appointed to the Amtrak Reform Council to fill the unexpired term of Governor Christine Todd Whitman and has served as a visiting professor at the Conservatoire National des Arts et Metiers, a national university in Paris.

Originally published at www.newgeography.com