I am honored to join Drs. Riccardo Polosa and Pasquale Caponnetto and their colleagues at the University of Catania in Italy as a coauthor of a new scientific article on e-cigarettes published in Harm Reduction Journal (available here). We scientifically disprove the stated premise of a recent NPR broadcast (here), “… little is known about the potential health effects [of e-cigarettes].” NPR “expert” Stanton Glantz stated that “e-cigarettes today are the triumph of wishful thinking over data.” Our publication shows that e-cigarettes are the triumph of scientific evidence over feigned ignorance.
For the benefit of members of the Flat Earth Society, I reproduce our summary here:
“The dream of a tobacco-free, nicotine-free world is just that—a dream. Nicotine’s beneficial effects include correcting problems with concentration, attention and memory, as well as improving symptoms of mood impairments. Keeping such disabilities at bay right now can be much stronger motivation to continue using nicotine than any threats of diseases that may strike years and years in the future.
“Nicotine’s beneficial effects can be controlled, and the detrimental effects of the smoky delivery system can be attenuated, by providing the drug via less hazardous delivery systems. Although more research is needed, e-cigs appear to be effective cigarette substitutes for inveterate smokers, and the health improvements enjoyed by switchers do not differ from those enjoyed by tobacco/nicotine abstainers.
“It is of paramount importance that government and trusted health authorities provide accurate and truthful information about the relative risks of smoking and alternatives to smoking. If the public continues to be misled about the risks of THR products, millions of smokers will be dissuaded from switching to these much less hazardous alternatives. One of us recently wrote that, “It’s time to be honest with the 50 million Americans, and hundreds of millions around the world, who use tobacco. The benefits they get from tobacco are very real. It’s time to abandon the myth that tobacco is devoid of benefits, and to focus on how we can help smokers continue to derive those benefits with a safer delivery system” [reference here].
“In the absence of regulatory standards, it is important that currently marketed products are of high quality. For example, the hardware should be reliable and should produce vapour consistently. The liquids should be manufactured under sanitary conditions and use pharmaceutical grade ingredients, and labels should contain a list of all ingredients and an accurate and standardized description of the nicotine content.
“According to a recent article by CDC researchers, the proportion of U.S. adults who have ever used electronic cigarettes more than quadrupled from 0.6% in 2009 to 2.7% in 2010 with an estimated number of current electronic cigarette users of about 2.5 million [reference here]. Although rigorous studies are required to establish THR potential and long term safety of electronic cigarettes, these figures clearly suggest that smokers are finding these products helpful. If they were ineffective one would not expect the market to take off as it is. Most importantly, even if this THR product proves to be effective for only 25% of the smoking population, it could save millions of lives world-wide over the next ten years.”
[Originally Published at Tobacco Truth]
On Friday, November 8, Obama arrived in New Orleans to talk about improving the U.S. economy. There, he drew attention to the already forgotten government shutdown: “there’s no question that the shutdown harmed our jobs market. The unemployment rate still ticked up. And we don’t yet know all the data for this final quarter of the year, but it could be down because of what happened in Washington.”
Yes, 850,000 furloughed federal employees may have tipped the scales, but really they all got a free 16-day vacation. They were, ultimately, all paid.
Obama wants to blame the 16-day federal employee paid vacation on the Republicans—and there may well be fault there. But what about the thousands of jobs that have been lost due to the policies of his administration—people in the private sector who have been out of work for more than 16 months?
Because of what happens in Washington, thousands—if not millions—of private-sector workers have effectively been permanently furloughed, and new jobs have gone uncreated, as a result of the Obama Administration’s war on energy.
A quick search using the phrase “jobs lost due to coal-fired power plant closures” produces a plethora of stories with hundreds of job losses. Nearly 300 coal-fired power plants are being shut down across the U.S. due to EPA regulations—a figure that, according to the American Coalition for Clean Coal Electricity (ACCCE), is “five times greater than the EPA predicted would be forced to shut down due to its regulations.” Mike Duncan, President and CEO of ACCCE, says: “The EPA continues to downplay the damage its regulations are causing to the U.S. economy and to the many states that depend on coal for jobs and affordable electricity.”
In Massachusetts, the closure of Brayton Point Power Station, which is the second shutdown in the region, will cost 240 jobs.
A Fox News story about the closure of coal-fired plants in Georgia, which will result in the loss of 480 jobs, states: “several utilities have made similar announcements, saying they opted to close aging coal plants rather than pay hundreds of millions of dollars to install pollution-control equipment to comply with federal clean-air rules.”
In Pennsylvania, the closure of two plants has resulted in the loss of 380 jobs. Melody Longstreth, executive director of the Waynesburg Area Chamber of Commerce, blames the EPA for the job losses: “Unfortunately, regulation affects everybody, whether you’re a mom-and-pop business all the way up to that.” Her colleague, Muriel Nuttall, executive director of the Fayette County Chamber of Commerce, adds: “In addition to job losses at the plant, the closing will have a ‘domino effect’ on related jobs, such as truck drivers and barge operators.”
Congressman Tim Murphy (R-PA), who represents the district where the Pennsylvania plants are located, laments: “The front line in the administration’s war on coal is right here in Southwestern Pennsylvania, and the casualties are the factories and homeowners, who will pay higher electric bills, as well as the hundreds of utility workers, boilermakers, and miners who will be out of work. With the shutdown of the Mitchell plant and Hatfield’s Ferry, which just four short years ago underwent $1 billion in upgrades, the president is making good on his promise to ‘bankrupt’ anyone who operates a coal plant.”
These closures result in jobs lost at the power stations and the surrounding communities, but they—along with additional EPA regulations—also cause the loss of jobs at the coal mines.
Addressing the Mitchell and Hatfield’s Ferry plants, Phil Smith, a spokesperson for the United Mine Workers of America, said: “It’s clearly going to have bad implications for coal miners.”
According to a report in the Herald-Standard, about a quarter of the coal-fired plants that have been closed or are scheduled for shutdown buy, or have bought, coal from Alpha Natural Resources. Company spokesperson Samantha Davison states: “By all indications, regulators in Washington are intent on limiting the production and use of coal, the most abundant domestic energy source we have in America. With that said, this is another example of how the EPA regulations are impacting the electricity-generation business and the coal industry.”
Much of the coal comes from Kentucky. Bill Bissett, President of the Kentucky Coal Association, told me about the loss of thousands of jobs as a result of EPA regulations: “Eastern Kentucky has lost more than 6,000 direct coal-mining jobs. For every one direct mining job lost, three other Kentuckians lose their livelihoods with the loss of indirect jobs. Beyond this loss of direct and indirect employment, you also have the loss of what is called induced jobs, which are unrelated directly to the mining of coal, but negatively impact automobile dealerships, restaurants and anywhere these former direct and indirect employees spent their paychecks.”
The regulations coming from the Obama Administration are not just hurting mining jobs through the expensive and excessive rules on power plants, but through retroactively blocking new mining projects such as the Mingo Logan mine in West Virginia. The decision on the part of the EPA to withdraw specific parts of a permit after it was issued is unprecedented in the history of the Clean Water Act. The Mingo Logan permit was issued in January 2007, by the Army Corps of Engineers and authorized discharge from the Spruce No. 1 coal mine into nearby streams. Nearly three years later, the EPA published a final determination on the permit that withdrew those disposal sites.
The EPA’s decision is an example of bureaucracy at its worst and has sparked bipartisan criticism from West Virginia’s Congressional representatives. Senator Joe Manchin, a Democrat, states: “One agency grants a permit, another agency takes it away and business suffers in the end. The federal government should be an ally, not an adversary, in helping to strike a balance between protecting the environment and creating good American jobs.” Likewise, Republican Representative Shelley Moore Capito said: “The Environmental Protection Agency has continued to overstep its bounds in its efforts to implement the president’s anti-energy policies. Not only will this ruling cost West Virginians hundreds of jobs, but it begs the question: Who is safe?”
The Mingo Logan case will continue to be played out in the courts. Meanwhile, Senators Manchin and David Vitter (R-LA), the top Republican on the Environment and Public Works Committee, have issued a letter to the EPA regarding its “extreme approach.” They are demanding information from the EPA about the revocation of Mingo Logan’s previously approved federal permit. In the letter, the Senators point to the company’s “faithful compliance with the permit’s terms”—yet the EPA’s unprecedented action unilaterally revoked the permit.
The Obama Administration’s control of permits is also impacting the oil-and-gas industry, which even in its fettered state is largely responsible for private-sector job growth. A recent EIA report states that from the start of 2007 through the end of 2012, total U.S. private sector employment increased by about 1%. “Over the same period, the oil and natural gas industry increased by more than 162,000 jobs, a 40% increase.” But there could be much more.
Before the Macondo Well accident, analysts expected 45 rigs to be operating in the Gulf by 2013. In a recent investors’ report, the International Strategy & Investment Group said: “After essentially being left for dead following the devastating Macondo blowout, we believe the deepwater Gulf of Mexico is in the early stages of an extended growth cycle.” Only 38% of deepwater reserves are currently producing oil, while an additional 60% are in appraisal, development or discovery status. Today, 40 rigs are under contract in the Gulf.
While drilling permits are being issued in the Gulf, it is, according to Senator Vitter, at “an anemic pace.” The National Ocean Industries Association (NOIA) confirmed that it now takes on average 108 days to get an offshore drilling permit, twice as long as the average 54 days it took pre-Macondo.
Earlier this year, the House of Representatives passed the Offshore Energy and Jobs Act (H.R. 2231), which directs the Secretary of the Interior to “implement a leasing program that includes at least 50% of the available unleased acreage within each outer Continental Shelf (OCS) planning area…”
Regarding H.R. 2231, Randall Luthi, President of NOIA, issued a statement that includes: “Opening up more of our OCS to energy exploration makes good sense and even better cents, adding thousands of domestic jobs and potentially billions of dollars in state and Federal revenues.”
A report released last month from the Consumer Energy Alliance found that more than 10,000 new jobs could be created if exploration and production were to take place in just the mid-Atlantic region of the OCS. A thorough economic analysis by the American Energy Alliance found that the potential resources in the OCS alone could sustain 1.2 million new, full-time jobs per year over the next 30 years if the OCS is authorized and permits are granted.
[Originally published on TownHall.com]
Which on a daily basis is shown to be more and more untrue.
The reason for this coverage hemorrhage is – more government. More government mandates and regulations are constricting the private health insurance sector into blue-faced oblivion – and squeezing more and more people out of their coverage.
These government regs are canceling insurance – and making it much more expensive. Which was another President Obama and Democrat lie.
Again, not so much.
Just released were the slightly-less-pathetic-than-Obama-Normal October jobs numbers. Which undid another Leftist lie – that the half-month government ever-so-partial-shutdown would be detrimental to the economy.
The government being slightly closed for the first half of the month resulted in one of the best jobs months in five Obama years. But let’s not get too excited.
Hiring paper pushers in the nation’s Government Compliance Departments is just about the biggest waste of time, money and effort there is. Wonder why things get more and more expensive, and less and less effective? Because our nation’s innovators are forced with ever greater regularity and intensity to deal with government.
That was way back in 2010 – before ObamaCare (and the rest of the Obama Fiat agenda) really began kicking in.
Time, money and effort are finite resources – the more government consumes, the less there is for everyone and everything else.
Really? Let’s look at some of the ways this is supposedly so.
1. It’ll be more expensive
Yes, because as ObamaCare has yet again demonstrated, Big Government regulations dramatically lower costs.
3. Service will be worse, unless you’re willing to pay
Yes, because as ObamaCare – and its website – have yet again demonstrated, greater government involvement is a tremendous boost to service. Like the Post Office. Or the DMV. More government means worse service – and higher prices anyway.
4. There will be less innovation
Net Neutrality is basically the government regulating the Internet the way it has regulated the landline telephone for seventy-plus years. You know – that bastion of innovation. (At least we’re no longer rotary dialing or having to ring Sarah Andy Griffith-style.)
6. …BitTorrent sites will suffer
You mean sites that steal music and movies and then sell them? Shouldn’t the government – law enforcement specifically – be causing them to suffer?
8. It’ll be censored
Ok, this is just whiplash-inducing. Government is the only named censor in the Constitution:
Congress shall make no law…abridging the freedom of speech, or of the press.…
So how will undoing the sole censor’s Internet power grab – lead to the Internet being censored? Isn’t it in fact preempting censorship?
The hysteria of the pro-government-control Left is palpable. Because the Internet pre-Obama Administration was a virtual government-free zone – and Leftists have been looking for decades for a regulatory hook to begin reeling it in.
They are caterwauling so loudly about Net Neutrality – because it is said hook.
Keep in mind that the Internet as we know it has existed since the early-to-mid 1990s – and the government didn’t execute its Net Neutrality power grab until December 2010.
None of the things the Left fears post-Net Neutrality ever happened in the two-plus decades pre-Net Neutrality – it instead became the free speech-free market Xanadu we all know and love.
Of course we have always had “net neutrality” – because it makes zero business sense for an Internet service provider to block its customers’ access to what they want on the Web. And we will always have it – because it will never make sense for a company to not give its customers what they want.
What we didn’t have was an over-arching, over-reaching government regulatory superstructure Network Neutrality – which we do not need, but the Big Government Leftists have long wanted.
We passed ObamaCare – another longtime Leftist wish list item – to find out what is in it. How’s that working? Of course it needs to go.
Net Neutrality should also be undone – before it does to the Internet and the entire economy what ObamaCare is doing to health care and beyond.
[Originally published on Human Events]
Four scientists at the forefront of global warming activism published an open letter this week encouraging their fellow warmists to embrace safe nuclear power as a means of reducing carbon dioxide emissions. If environmental activist groups honestly believe humans are causing a global warming crisis, they will eagerly join in support.
The four scientists are pretty close to embodying a Mt. Rushmore of global warming activists. They are James Hansen at the Columbia University Earth Institute, Tom Wigley at the National Center for Atmospheric Research, Kerry Emanuel at the Massachusetts Institute of Technology and Ken Caldeira at the Carnegie Institution.
“Continued opposition to nuclear power threatens humanity’s ability to avoid dangerous climate change,” the scientists wrote.
“We call on your organization to support the development and deployment of safer nuclear power systems as a practical means of addressing the climate change problem. Global demand for energy is growing rapidly and must continue to grow to provide the needs of developing economies. At the same time, the need to sharply reduce greenhouse gas emissions is becoming ever clearer.”
To be sure, widespread nuclear power would come at a substantial price. Nuclear power is about 50 percent more expensive to produce than conventional power. Granted, much of that cost disadvantage is due to excessive government regulation that is unique to nuclear power, but there is no reason to expect government will ease such regulation anytime soon.
In return for that price, people get the peace of mind of knowing they are utilizing carbon-free power. The science is pretty strong that humans are not creating a global warming crisis, but the extra insurance policy may still hold some value. Also, nuclear power eliminates all types of air pollution, which brings additional benefits.
Nuclear power also carries many advantages over wind power and solar power – which to this point have been global warming activists’ preferred power sources.
First, let’s take a look at nuclear power’s environmental advantages. Unlike wind power, nuclear power will not kill millions of birds and bats each year, including endangered and protected species. Unlike solar thermal power, nuclear power will not use up much of the very limited water supplies in deserts and arid regions that are best suited for solar power production. Unlike both wind and solar power, nuclear power does not require the development of vast swaths of some of our nation’s most pristine lands. And also unlike wind and solar power, nuclear power plants can be built almost anywhere, eliminating the need to build long transmission lines through previously undisturbed lands.
Next, let’s take a look at nuclear power’s economic advantages. Nuclear power is substantially less expensive than wind and solar, and will remain that way for the foreseeable future. Also, unlike variable wind and solar power, nuclear power is available on demand, which increases efficiency and eliminates wind and solar power’s need for redundant backup baseload power.
When you compare the economic and environmental costs of wind and solar power to nuclear power, it is hard to understand why global warming activists have historically opposed it. It makes one wonder whether they really believe in their asserted global warming crisis. If global warming is really the gravest threat humanity has ever faced, why do the same people who sound a worrisome alarm categorically reject the most affordable, effective and readily available means of eliminating carbon dioxide emissions?
People like me, who are skeptical of the asserted global warming crisis but nevertheless very concerned about government energy restrictions punishing our living standards, will still have concerns about nuclear power. Widespread use will function as a substantial tax on the American economy in comparison to affordable coal and natural gas power. But that tax will not impose nearly as much economic punishment and energy supply disruption as prohibitively expensive and unreliable wind and solar power.
The four scientists’ call for warmists to embrace nuclear power is a conversation starter, and an opportunity to perhaps find some common ground in what has otherwise become an increasingly divisive and polarizing global warming discussion. At the very least, nuclear power is an economically and environmentally preferable option to widespread wind and solar power. It will be very interesting to see how other warmists, and particularly the large environmental activist groups, respond to the embrace of nuclear power by some of the most prominent scientists leading their cause.
[Originally published on Forbes]
The release of a supplemental poverty measure by the Census Bureau is being touted in the media as indicating the government is not doing enough to ameliorate poverty in the country.
The number of poor people, as officially measured, is at an historical high, 47 million. The number of poor people, per the supplemental measure, is even higher, 50 million. But, the problem isn’t that the government isn’t doing enough. Rather, the problem is that the government is already doing too much.
The official poverty line is equal to three times the cost of food in 1963. Over time, the poverty level, by family size and composition, is simply adjusted for inflation. The official measure has the great advantage of being simple and at least approximately in line with what most people considered to be poor back in 1963.
For example, the official poverty line roughly corresponded to the average amount of money people said a family needed “to get by” when queried by organizations such as the Gallup Poll.
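The inflation adjustment described above is simple to illustrate. A minimal sketch of the arithmetic, using stand-in numbers (the threshold and CPI-U values below are illustrative assumptions, not the official Census Bureau or BLS figures):

```python
# Sketch: carrying a 1963 poverty threshold forward by inflation.
# All three input values are illustrative assumptions for demonstration,
# not the official Census/BLS series.
threshold_1963 = 3100.0   # assumed 1963 threshold for a family of four
cpi_1963 = 30.6           # assumed CPI-U level, 1963
cpi_2012 = 229.6          # assumed CPI-U level, 2012

# The official measure simply scales the 1963 dollar figure by the
# change in the price level; it never re-asks the "get by" question.
threshold_2012 = threshold_1963 * (cpi_2012 / cpi_1963)
print(round(threshold_2012))  # about 23,260 with these assumed values
```

The point is that the official measure carries the 1963 standard of living forward mechanically, which is exactly why it stays approximately in line with what people in 1963 considered poor.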
Using this definition of poverty, and historical statistics, it could be seen that the percentage of people who were poor by the 1963 standard was falling dramatically going into the 1960s. But, from the 1960s until the past few years, the percentage of people who are poor has merely fluctuated. And, during the past few years, it has gone up.
Whether this recent movement is merely cyclical or the start of a new trend is a very important question.
A big problem with the official poverty line is that it does not reflect the impact of non-cash aid to low-income people, such as food stamps and housing subsidies; and, it does not reflect the impact of taxes paid by low-income people.
Back in 1963, the Social Security payroll tax was 3.625 percent. Today, the Social Security and Medicare payroll tax is 7.65 percent.
Non-consideration of non-cash assistance and of taxes has been a bone of contention for years. These are routinely discussed in college textbooks, were included in a National Academy of Sciences review, and were incorporated into some exploratory analysis conducted by the Urban Institute. A couple of other issues with the official measure seem unexceptional.
For example, court-ordered spousal and child support payments count as income when received, but do not count as deductions to income when paid.
Why the asymmetry?
Unfortunately, the supplemental poverty measure also includes a number of other adjustments that seem presumptive: out-of-pocket health care expenses, for example. Is health care a right, such that it is not counted as income for those receiving it through Medicaid, but is a deduction from income for those outside that part of the welfare system?
It would be best to separate the measurement of poverty from current political controversies. But, putting aside any qualms with the supplemental poverty measure, let us see what we are doing in the name of fighting poverty.
• As between men and women, poverty among men is higher by 2 million with the supplemental measure, and among women lower by 2 million.
• As between persons under and over 18, poverty is lower among those under 18 by 8 million, and higher among those over 18 by 8 million.
• As between married couples and singles, poverty is higher by 8 million among married couples, and lower among singles by 6 million.
• As between blacks and whites and Asians (and why some people are described as a color and others not, I do not know), poverty is less by 3 million among blacks, and higher by 2 million among whites and by 1 million among Asians.
• As between native and foreign born, poverty is lower by 4 million among native-born Americans, and higher by 4 million among foreign born.
• As between people with private health insurance and people with government health insurance, poverty is higher by 10 million among people with private health insurance, and lower by 11 million among those with government health insurance.
• As between people who work and those who do not work, poverty is lower by 2 million for those who do not work, and higher by 6 million among those who work.
The re-distribution of poverty effected by the government seems to be guided by a combination of randomness and perversity. What’s wrong with being married, or with working? Why are those behaviors punished, and being single and not working rewarded? Is this the result of an intentional anti-poverty system? Or is it the result of a hodge-podge Rube Goldberg contraption, of reacting to perceived needs without concern for long-run consequences?
And, is it just a coincidence that the re-distribution of poverty seems to roughly align with the tendencies of groups of voters to vote for the candidates of a certain political party as indicated by exit polls?
A friend and self-described “libertarian physicist” recently sent me a short PowerPoint presentation, dated 2007, and I gave him some feedback. His presentation warned that libertarians shouldn’t be too eager to argue the science of global warming, since it is at least plausible, and should instead focus on whether CO2 emissions must be limited by government.
I share my reply to my physicist friend below. I probably shouldn’t have gone on at such length, but as they say, it’s easier to write long than short:
Thanks for passing this along. Feel free to pass along my reactions to Dr. X. I mean no disrespect, and apologize for the length of what follows, but this reply flows easily because I’ve written and read and said it so many times before.
A physicist knows that carbon dioxide is a greenhouse gas, that its concentration in the atmosphere is rising due to human emissions, and that all else being equal this will lead to a warming of 1-2 deg. C. if the concentration rises to 2X pre-industrial levels. These are important facts that nearly all skeptics concede. But these are not very important facts in the policy debate.
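The 1-2 deg. C figure reflects the standard logarithmic relationship between CO2 concentration and temperature response, delta_T = S * log2(C / C0). A minimal sketch of that arithmetic, with the per-doubling sensitivity S set to an assumed 1.2 deg. C, a value inside the range cited above (S is precisely the contested parameter, so this is an illustration, not a claim):

```python
import math

# Sketch: no-feedback warming under the standard logarithmic relation
# delta_T = S * log2(C / C0). The sensitivity S = 1.2 deg C per doubling
# is an assumption within the 1-2 deg C range the text cites; the
# pre-industrial baseline of 280 ppm is the commonly used figure.
def warming(c_now, c_pre=280.0, sensitivity=1.2):
    """Temperature response in deg C for a CO2 concentration c_now (ppm)."""
    return sensitivity * math.log2(c_now / c_pre)

print(round(warming(560.0), 2))  # a doubling yields exactly S: 1.2
```

Because the relation is logarithmic, each successive doubling adds the same increment, which is why the dispute centers on the value of S (and on feedbacks), not on the shape of the curve.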
A physicist is no more likely than a sociologist to know what human emissions will be 50 years from now; if a slight warming would be beneficial or harmful to humans or the natural world; if forcings and feedbacks will partly or completely offset the theoretical warming; if natural variability will exceed any discernible human effect; if secondary effects on weather will lead to more extreme or more mild weather events; if efforts to reduce emissions will be successful; who should reduce emissions, by what amounts, or when; and whether the costs of attempting to reduce emissions will exceed the benefits by an amount so large as to render the effort counterproductive.
Uncertainty about these matters is pervasive in the science community. If the alarmists are wrong about even one or two of them, human greenhouse gas emissions move out of the realm of a nuisance requiring a response — whether by governments or via a (presently nonexistent) global property rights regime — and into the realm of speculation. For example, if modest warming and the fertilization effect of CO2 have boosted agricultural output around the world — something biologists have no doubts about — then the Third World owes the First World an enormous debt of gratitude for that positive externality of the Industrial Revolution, though that is not a debt the First World is entitled to collect.
I’m not a scientist, but I edited and coauthored parts of three volumes in the Climate Change Reconsidered series, nearly 2,000 pages consisting almost entirely of reviews of peer-reviewed research written by astronomers, biologists, climatologists, economists, geologists, oceanographers, and physicists. What I learned is that they have made some progress in answering the questions posed above. In particular, they are pretty sure that most of the warming of the late 20th century was natural, not man-made; that it was beneficial to humans and wildlife and not harmful; that it is virtually unstoppable by man; and that the costs of attempting to change the weather would erase two centuries of human progress.
Of course, it’s that final thing — “erase two centuries of human progress” — that is the objective of those who launched the global warming scare, keep it going despite the collapse of scientific support for its hypothesis, and reap enormous windfall profits from its subsidies and regulations. It’s been a political and social movement from start to finish, not a science-driven effort. Go back and read Julian Simon’s Hoodwinking the Nation, or the more recent Rupert Darwall’s The Age of Global Warming, a detailed history of the movement. Neither writer mentions a scientist more than in passing. They don’t need to.
Conceding the science to the alarmists would have been a terrible strategic mistake. We would have gotten cap and trade in 2009 or a carbon tax sometime since then. Trillions of dollars would have been wasted.
Here at The Heartland Institute, we have frequently debated the wisdom (or lack of wisdom) of taking the science seriously. Most free-market think tanks don’t. But failing to do so rendered them irrelevant in what is arguably the biggest public policy debate of our lifetimes. We decided not to debate “what should be done about global warming.” We looked under the hood at the science touted by the other side and found it to be utterly unpersuasive.
So we set out to persuade the public and their elected representatives that global warming is not a crisis and needs neither a government solution nor a private property solution. Opinion polls and political outcomes in recent years show we were remarkably successful.
Maybe a libertarian think tank shouldn’t have taken up this task, but as I looked around seven years ago, and as I looked around every year since, I saw nobody else willing to do it.
The world’s largest economies seem engaged in something like the children’s game of “musical chairs.” For years, the United States has been the world’s largest national economy, though in recent decades the integrated economy of the European Union has challenged that claim, given that the region includes four of the top ten national economies: Germany, the United Kingdom, France and Italy. The most recent data, reflecting the deep European recession, indicates that the top position has been retaken by the United States.
The International Monetary Fund (IMF) has released its semi-annual World Economic Outlook Database for October 2013. Information is provided for 189 country-level geographies, from 1980 to the present, with projections to 2018. Despite the economic malaise, the IMF data shows the US gross domestic product, adjusted for purchasing power parity (GDP-PPP), to be greater than that of the combined 28 member European Union (EU). This development, however, is at least partially due to accounting revisions, which are described below.
2012 Gross Domestic Product (Purchasing Power Parity)
The new data shows the United States to have a 2012 GDP-PPP of $16.245 trillion (current international dollars), two percent above the EU’s $15.933 trillion. This difference is relatively minor – the equivalent of Maryland’s GDP. In 2011, before the accounting methodology change, the EU led the US by a small margin. The IMF expects the US lead to lengthen to approximately 10 percent by 2018. For comparison, in 1980 the same 28 economies that now make up the EU had a combined GDP nearly one-quarter larger than that of the United States (Figures 1 and 2). However, it must be noted that the European Union of 1980 had only nine members, whose combined economy was 8 percent smaller than that of the US.
China’s reduced but still strong economic growth has propelled it to a GDP-PPP of $12.3 trillion, 75 percent of the US figure. By 2018, the IMF expects China to reach 96 percent of the US GDP. If the IMF’s projected GDP growth rates for China and the US were to continue, China would be a larger economy than the United States by 2020. While this may seem to be occurring sooner than expected, it is consistent with the expectation of former IMF economist Arvind Subramanian in his book Eclipse: Living in the Shadow of China’s Economic Dominance. The scale of the Chinese economic miracle that began under Deng Xiaoping can be seen in the fact that in 1980 China’s GDP was barely 10 percent of the US economy (see Ronald Coase and Ning Wang, How China Became Capitalist).
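The 2020 crossover claim is simple compound-growth arithmetic: project each economy forward until China’s GDP-PPP exceeds the US figure. Here is a minimal sketch of that calculation; the 2012 levels are the IMF figures cited above, while the growth rates are round illustrative assumptions, not the IMF’s actual projections:

```python
def crossover_year(start_year, gdp_a, gdp_b, growth_a, growth_b):
    """Return the first year in which economy B (growing faster) overtakes economy A."""
    year = start_year
    while gdp_b <= gdp_a:
        gdp_a *= 1 + growth_a
        gdp_b *= 1 + growth_b
        year += 1
    return year

# 2012 GDP-PPP levels in trillions of international dollars (IMF figures above);
# the growth rates below are illustrative assumptions, not IMF projections.
us, china = 16.245, 12.3
print(crossover_year(2012, us, china, growth_a=0.04, growth_b=0.08))  # → 2020
```

Under these assumed rates the crossover lands in 2020, matching the extrapolation above; faster or slower Chinese growth shifts the year accordingly.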
India’s economy also continues to progress. Now the world’s fourth largest economy, India’s GDP-PPP is estimated at $4.7 trillion. By 2012, India’s economy had reached 29 percent of that of the United States, nearly triple the 1980 figure. The IMF expects India to close the gap by another five percentage points by 2018.
Japan has fallen to the fifth largest economy, at approximately $4.58 trillion. Japan grew strongly after World War II, reaching 35 percent of the US economy by 1980. A number of experts, such as Harvard’s Ezra Vogel, expected Japan to continue closing the gap with the United States. But Japan’s ascendancy stopped by 1991, when it reached 41 percent of the size of the US economy. In the subsequent economic slide, Japan’s economy fell to 28 percent of the US figure by 2012. The IMF expects another two point drop by 2018.
Gross Domestic Product per Capita (Purchasing Power Parity)
The United States remains dominant in personal affluence among the world’s largest economies. In 2012, the US GDP-PPP per capita was $51,700, while the European Union’s was $31,600; the EU figure is declining relative to the United States. In 2012, the EU GDP per capita was 61 percent of the US figure, down from a peak of 66 percent in 1982. The IMF projects a further three percentage point loss by 2018 (Figures 3 and 4). The 2012 GDP-PPP per capita of the nine nations that made up the European Union in 1980 was higher, at $36,100 (Figure 5).
Despite China’s potential for becoming the world’s leading economy by the beginning of the next decade, its huge population makes the GDP per capita much lower. In 2012 China’s GDP per capita was $9,100, about 18 percent of the US figure. This is, however, far higher than the 1980 figure of 2 percent. IMF expects China’s GDP per capita to rise to $14,900 by 2018, 23 percent of the US figure.
India’s GDP per capita was $3,800 in 2012, or seven percent of the US GDP per capita. India’s progress has been rapid, though strongly overshadowed by China. India’s GDP per capita was 70 percent higher than China’s in 1980, but now China’s is 60 percent higher. However, India has gained five percentage points on the US since 1980.
Japan’s GDP per capita stood at 69 percent of the US figure in 2012 ($35,900), down significantly from 1991, when Japan’s GDP per capita reached 84 percent of the US level. IMF projects about a 1.5 percentage point further decline by 2018.
As noted above, the accounting changes implemented by the United States have changed the world rankings and their prospects.
Data in the IMF’s last release (March 2013) placed the European Union slightly ahead of the United States in GDP-PPP. The United States is the first country to fully implement internationally agreed upon changes to national accounts (United Nations’ System of National Accounts 2008). The IMF summarizes the revisions and its impact on the US economy as follows:
“…expenditures on research and development activities and for the creation of entertainment, literary, and artistic originals are now treated as capital expenditures. Furthermore, the treatment of defined-benefit pension plans is switched from a cash basis to an accrual basis. The revisions increase the level of GDP by 3.4 percent and boost the personal savings rate.”
The US Department of Commerce’s Bureau of Economic Analysis indicates that Europe will convert to the new methodology in 2013, and other nations are expected to follow quickly.
Before the accounting revision, IMF data predicted that the US would not pass the EU until 2015. Further, the previously lower GDP figures implied that China would pass the United States just two years later (2017). China may have to wait to assume the top chair, but perhaps not; it all depends on how quickly China converts to the new accounting and on the revision’s impact on its GDP figures.
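The role of the accounting revision can be checked with back-of-the-envelope arithmetic: stripping the stated 3.4 percent boost out of the 2012 US figure shows the EU would still lead without it. A sketch, assuming the figures quoted above:

```python
us_revised = 16.245     # 2012 US GDP-PPP, trillions (post-SNA 2008 revision, per IMF)
eu = 15.933             # 2012 EU GDP-PPP, trillions
revision_boost = 0.034  # the revisions "increase the level of GDP by 3.4 percent"

# Remove the revision's boost from the US figure
us_pre_revision = us_revised / (1 + revision_boost)
print(round(us_pre_revision, 3))   # → 15.711, below the EU figure
print(us_pre_revision < eu)        # → True: without the revision, the EU still leads
```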
An Uncertain World
Of course, economic projections cannot be “taken to the bank.” The world economy is volatile and uncertain, more so now than in more stable times.
The US economy continues to sputter along with lagging growth. The European economy is doing even more poorly. Mixed signals continue to be heard from China, where astronomic growth rates are being replaced, at least for the moment, by more modest ones. President Xi Jinping says that China can create sufficient employment for its growing urban workforce with a 7.2 percent growth rate (See: “China Needs 7.2% Growth to Ensure Employment” in The Wall Street Journal) – a rate that would be the envy of each of the world’s strongest economies.
The world’s big high-income nations also have reason to envy India. According to the Organization for Economic Cooperation and Development (OECD), the economy of India “clocked a low growth rate of 4.4 percent” in the April to June quarter. The OECD characterized India’s immediate economic prospects as “weak,” yet India’s growth rate is far above those of the US, EU and Japan.
The Bank of Japan (BOJ), the nation’s reserve bank, is optimistic about the nation’s new growth-seeking policies under “Abenomics” (named after Prime Minister Shinzo Abe). But the BOJ predictions of economic growth at 1.5 percent in 2014 and 2015 are favorable only in the light of Japan’s anemic recent growth.
All of these predictions, combined with accounting changes, paint a blurred picture. This is the nature of a world economy that the IMF refers to as being stuck in “low gear.”
[Originally published on New Geography]
The call for net neutrality is really a call for a return to monopoly common carrier regulation for broadband networks.
At Wired, Marvin Ammori, a leading Save-The-Internet voice, has again called for the net neutrality movement to mobilize.
In his latest piece, “We’re about to Lose Net Neutrality and the Internet as We Know It,” Ammori warns of the end of the Internet world if the D.C. Court of Appeals finds the FCC’s net neutrality order illegal.
Specifically, he concludes: “if the court throws out the non-discrimination rule, permission-less innovation on the Internet as we know it is done.”
To grasp how desperate these calls for net neutrality have become, consider how many times net neutrality has been re-packaged and re-branded to try to gain mainstream traction, credibility and support.
In 1999, Harvard Law Professor Larry Lessig popularized the notion of the Internet as a “commons” with “neutral” access. In 2002, his student, now Columbia Law Professor Tim Wu, coined the new term “net neutrality.”
In 2006, Free Press organized SaveTheInternet.com and redefined net neutrality as “the First Amendment of the Internet” guaranteeing freedom of speech, while simultaneously accusing ISPs of discrimination and being gatekeepers, censors, and toll-booth collectors.
It is noteworthy that after both the House and Senate rejected calls to legislate net neutrality into law, every single one of the 95 House and Senate candidates that Free Press got to publicly support net neutrality lost their election in the 2010 midterms.
After those defeats, net neutrality was repackaged yet again in late 2010 as the FCC “open Internet” order.
In it, three unelected commissioners effectively self-legislated new purposes for the FCC: to “preserve the Internet as an open platform enabling consumer choice, freedom of expression, end-user control, competition, and the freedom to innovate without permission.”
This brings us back to the end of “the Internet as we know it” hyperbole from Mr. Ammori – that if the court overturns the FCC’s common carrier regulation of broadband, “permission-less innovation on the Internet as we know it is done.”
Now let’s fact-check this Chicken-Little analysis of the D.C. Court of Appeals review of Verizon’s challenge of the FCC’s Open Internet order.
Mr. Ammori is wrong on the facts and the sky isn’t falling. The broadband Internet has never been the neutral common carrier utopia Mr. Ammori imagines, where all bits are treated equally. Internet bits often need to be treated differently for a wide variety of legitimate and necessary network management and other reasons.
Over the last decade, net neutrality proponents incorrectly have warned of a parade of alleged ISP horribles that simply has not happened: censoring free speech, discrimination, blocking of websites, toll-booths, “dirt-road” Internet access, etc.
In fact the exact opposite has occurred in America. Over the last decade, over a thousand broadband providers have handled literally quadrillions of Internet actions with virtually no incidents of the alleged litany of abuses listed above.
Moreover, U.S. broadband providers have invested over a trillion dollars in Internet infrastructure, enabling more facilities-based broadband competition than anywhere in the world.
Furthermore, exceptional U.S. broadband competition enabled the amazing innovation and mass adoption of smart-phones and tablets, and all the mobile apps and streaming that come with them.
If the D.C. Court of Appeals rules that the law prohibits the FCC from imposing monopoly common carrier regulations on unregulated information services providers, the Internet will be just fine. The Internet was open and vibrant before and during the FCC’s Open Internet order, just like it will remain if it is partially or completely vacated.
If net neutrality violations are a real modern problem, and net neutrality is truly a modern concept, then those who believe in it should ask Congress to modernize communications law and FCC authority to address it for the 21st century.
How productive is it for the FCC to continue to push an illegal policy in opposition to Congress?
A big problem that net neutrality proponents have is that they are trying to dress up obsolete common carrier regulation as a modern policy.
Why do people who claim to support “innovation without permission” not want the government to innovate like it did in deciding to auction spectrum, replace monopoly with competition, and make broadband an unregulated information service?
Those modern government innovations have laid bare how obsolete net neutrality proponents’ ideas are.
In sum, have net neutrality proponents thought about how silly it is to try to promote innovation by imposing a policy proven to kill government policy innovation?
We need to remember that in the so-called good old dial-up days of common carrier regulation, the government thought nothing was wrong with screeching modems and eternal online waits.
[Originally published at the Daily Caller]
Thirty years ago, while driving home to Chicago following a speaking engagement in Milwaukee, I got stuck in a two-mile long traffic jam — the result of a multi-car pile up. As we crawled along Interstate 94 at a snail’s pace, my business associate spotted an exit ramp just ahead of us. I managed to switch lanes and join another line of cars looking for a faster route. We drove west for a couple of miles and turned south onto a lonely country road.
After a short time, it became apparent just how lonely that road was; and given all its twists and turns, we weren’t sure where we were headed. So I stopped the car, took a quick look at the night sky, got back behind the wheel and told my colleague, “We’re heading west. We need to find another road.” My friend asked how a 30-second stop could tell me that we were headed in the wrong direction. “Elementary,” I replied. “I found the constellation Ursa Major which includes the Big Dipper and its two guide stars. They pointed the way to Polaris (the North Star). And its location told me that we’re heading west.”
Had it been a cloudy night … we might have wound up in Iowa!
Today’s drivers don’t have to worry about cloudy skies or locating the North Star to guide their path. A whole new constellation of man-made ‘stars’ now orbits our Earth — providing travelers with precise locations (and directions) through the revolutionary Global Positioning System (GPS). GPS Declassified describes in fascinating detail some of the navigation techniques that were employed prior to GPS, the Space Age events that paved the way to GPS, the development of GPS and the assorted applications of GPS in today’s society. It also tells the inside stories of the people who created GPS — including Roger Easton, the father of co-author Richard Easton.
Despite the highly technical nature of the subject matter, GPS Declassified is totally accessible to any reader with a modest interest in science, technology, history and culture. The authors use plain English and employ a largely chronological approach that is easy to follow. Space enthusiasts will enjoy reading about some little-known chapters in space history — episodes that were once shrouded in Pentagon secrecy, but which eventually led to things that most of us now take for granted (like the GPS units in today’s cars).
Ten chapters guide the reader from the ancient Greeks and 18th-century mariners to early Space Age pioneers and (as the subtitle suggests) modern smartphones. The authors repeat themselves in a few places. This is not a criticism. Easton and Frazier simply remind us of some things mentioned earlier in the book — reminders which I found very useful. There is a lot of material packed between the covers of this tome, and there could have been even more. Again, this is not a criticism. So many interesting tidbits are included that I frequently found myself going to my bookshelf or searching the Internet for additional information. Here are three examples …
1) Captain James Cook was able to determine his longitude while exploring Pacific islands by observing and timing the motions of Jupiter’s Galilean satellites. (And to think that I always thought those tables were published in nautical almanacs simply for the benefit of amateur astronomers like myself!)
2) The downing of Korean Air Lines Flight 007 in 1983 precipitated the civil use of GPS. Had that Boeing 747 been equipped with GPS instead of INS, it is unlikely the crew would have strayed over restricted Soviet airspace. That is why President Reagan approved the use of the Pentagon’s Global Positioning System for civilian airliners and other commercial applications.
3) Actress Hedy Lamarr and pianist-composer George Antheil invented the spread-spectrum signal structure during World War II. As the authors note, “They deserve at least some of the credit for the signal used by GPS.”
The invention of GPS takes up a good portion of the book, and co-author Richard Easton is in an excellent position to tell his father’s story. He was a youngster when Roger Easton, a scientist at the Naval Research Laboratory, helped develop the Vanguard satellites and invented the Minitrack tracking system, which determined their orbits. That work led to Easton’s invention of the Naval Space Surveillance System (NAVSPASUR) to detect Soviet spy satellites. That system, in turn, led to Roger Easton’s invention of GPS.
As a space historian myself, I found the authors really held my attention through the first half of the book as they recounted those early years of the Space Age. Project Vanguard is often depicted as a failure because of the launch pad explosion that took place on December 6, 1957 — two months after the Soviet Union put the world’s first artificial Earth satellite into space. But Vanguard was really a great success, and it represented an innovative approach to designing and tracking scientific satellites. Vanguard I, as the authors note, is still in orbit today. A mere fifteen years separated the launch of the “grapefruit”-sized Vanguard I and Apollo 17 — the last human voyage to the Moon. What a remarkable fifteen years that was.
The path from Vanguard I to Timation I — an experimental satellite designed by Easton and launched in 1967 to test his ideas for a Global Positioning System — was a long one. Inter-service rivalries between the Navy and Air Force made the effort even more difficult, as the authors describe in detail. The second half of the book recalls the evolution from military to commercial use of GPS. It also describes the constellation of satellites required to make GPS work. At least four satellites must cover every spot on Earth — 24 hours a day. Currently, a network of 30 satellites makes that possible. Each of the satellites is equipped with an atomic clock (another Roger Easton innovation) because the system depends on extreme accuracy in measuring time and distance. The first civilian application of GPS was for land surveying – something Easton envisioned during the Vanguard era when he talked about using satellites for geodetic purposes.
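The four-satellite minimum follows from counting unknowns: a receiver must solve for three position coordinates plus its own clock bias, so it needs four pseudorange measurements. The sketch below illustrates the idea in a 2D toy world (two coordinates plus a clock bias, hence three “satellites”), using Newton’s method; it is an illustration of the geometry, not the actual algorithm in any GPS receiver, and all positions and ranges are invented for the example:

```python
import math

def solve_linear(A, b):
    """Solve the square system A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def pseudorange_solve(sats, rhos, guess, iters=20):
    """Newton's method for 2D receiver position (x, y) and clock bias b,
    given satellite positions and measured pseudoranges rho = distance + b."""
    x, y, b = guess
    for _ in range(iters):
        residuals, jacobian = [], []
        for (sx, sy), rho in zip(sats, rhos):
            d = math.hypot(x - sx, y - sy)
            residuals.append(d + b - rho)                   # modeled minus measured
            jacobian.append([(x - sx) / d, (y - sy) / d, 1.0])
        dx, dy, db = solve_linear(jacobian, [-v for v in residuals])
        x, y, b = x + dx, y + dy, b + db
    return x, y, b

# Invented toy scene: receiver at (3, 4) with a clock bias worth 2.5 distance units.
sats = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
rhos = [math.hypot(3 - sx, 4 - sy) + 2.5 for sx, sy in sats]
x, y, b = pseudorange_solve(sats, rhos, guess=(1.0, 1.0, 0.0))
print(round(x, 3), round(y, 3), round(b, 3))  # → 3.0 4.0 2.5
```

In three dimensions the same iteration has four unknowns (x, y, z, b), which is exactly why four satellites must be visible at once; the onboard atomic clocks keep the satellite-side timestamps accurate enough for the ranges to mean anything.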
The 1991 Gulf War provided the first real test of GPS for guiding smart bombs. The same technology that was used to accurately find targets could also be used to locate troops caught behind enemy lines. The authors recall the dramatic story of downed F-16 pilot Scott O’Grady’s daring rescue from Bosnia in 1995. GPS was also being used for commercial aviation by that time, as well as in agriculture and highway navigation. Just as GPS was employed to locate Captain O’Grady, it is now used to find lost hikers. Of course, smartphones have become commonplace, too — thanks to GPS.
GPS Declassified is a carefully researched, well written, fast-paced and thoroughly enjoyable book. It offers an excellent mix of science, technology, history and culture. Kudos to co-authors Richard Easton and Eric Frazier for their outstanding contribution to space history. Highly recommended.
After seeing Michael Lotus — author of America 3.0: Rebooting American Prosperity in the 21st Century — speak at an America’s Future Foundation event, and then hearing him again on the Heartland Daily Podcast series, I am now even more hopeful than I already was.
Why, you ask? Because Lotus said something that I’ve been saying for the past couple of years that has earned me doubtful glances. In response to the question of why he’s so hopeful for America’s future, one of his answers is this:
There are “underlying cultural foundations of the United States that are unique” that make us more resilient against temptations of socialism. Socialism does not fit our culture.
I agree! And, thank you, Michael Lotus.
During the Greek government debt crisis, my father was all huffy-puffy that America is headed straight into riotous economic collapse, like Greece. But, no, Dad, we are not Greece. Americans grow up (more or less) with values of freedom, liberty, and constitutional law. We grow up with a capitalistic mind-set, whether the anti-capitalists accept it or not.
My hopeful attitude expanded also because it’s incredibly refreshing to hear an intelligent human being speaking somewhat optimistically about our future. All I hear, day-to-day, is how we are headed for a totalitarian, fiery hole of tyranny and slave labor. I just don’t think it’s true. And Lotus has now provided me with some hard evidence to back it up.
America will be “rebooted” and we will recover — perhaps with an invigorated sense of capitalism and grounded values of human liberty. This is ideal, but there’s a catch. Like Lotus says, it’s up to us, to some degree. It’s not society that will fall apart; it’s the government sector that will collapse and force us to reform. And we must be the leading actors in the reforming, especially my generation.
In the podcast, Lotus says, “we can’t just move to a ‘Galt’s Gulch‘” as much as we would like to. We need to show our alternatives and “have them ready” for when the failures of our government become too large to overlook.
If I can live in an economically and personally liberating state, 30 to 40 years from now, you can keep your flying cars!
Although I kind of hope we do have flying cars by then.
On the surface, it might seem to make sense for the President to want to “do something” about climate events such as hurricanes. But there have always been hurricanes, and blaming them, along with everything from droughts to wildfires, on “climate change” is not just absurd; it is a deliberate lie that blames a rise in the amount of carbon dioxide, a so-called but incorrectly named “greenhouse gas,” for these natural events.
The President has issued an Executive Order to ramp up efforts to address “climate change.”
At the heart of the global warming hoax has been this carbon dioxide lie, but there has been no warming for over 17 years and the many computer models that predicted it were wrong; many were deliberately false.
When one looks at the actual facts about climate-related events, one finds that in recent years there have been fewer tornadoes, with a decline in severe tornadoes over the past forty years. There have been more than eight years without a major hurricane strike in the U.S., and the nation has had the fewest forest fires in three decades.
President Obama has made it known that one of his goals—other than the destruction of the U.S. economy—has been to reduce “greenhouse gas” emissions by 17% by 2020. His Environmental Protection Agency has been feverishly producing regulations that reflect his “war on coal”, imposing rules on coal-fired plants that have already forced many to close. All this is based on a lie. Even the Supreme Court has taken notice and will hear a case that challenges the EPA. (It previously ruled carbon dioxide was a “pollutant”, a baseless error.)
In general, Congress has refused to take up most of Obama’s “climate change” agenda, especially his wish for a carbon tax. So it comes as no surprise that he issued an executive order on November 1, allegedly to get states and local communities to prepare for “the impact of global warming.”
Perhaps the most curious aspect of this is that states have long had emergency response protocols and other laws in effect that are intended to respond to various climate-related problems. Many communities have plans in place. About the only thing the executive order would achieve would be to layer on more rules that would doubtless come with a higher cost.
In a recent edition of Human Events, three prominent climate scientists and experts on forecasting, Dr. Kesten C. Green of the University of South Australia, Prof. J. Scott Armstrong of the University of Pennsylvania, and Dr. Willie Soon, got together to write “The Science Fiction of IPCC Climate Change”, reviewing the latest announcement by the UN’s Intergovernmental Panel on Climate Change.
“Global warming alarmists nevertheless claim that the ‘nearly all’ climate scientists believe dangerous global warming will occur. This is a strange claim in view of the fact more than 30,000 American scientists signed the Oregon Petition, stating that there is no basis for dangerous manmade global warming forecasts, and ‘no convincing evidence’ that carbon dioxide is dangerously warming the planet or disrupting the climate.”
Forgive me if I ask whom you believe: three experts on climate forecasting, or Barack Obama?
The forecast experts debunked the IPCC assertion (and by extension Obama’s) noting that carbon dioxide is “a colorless, odorless, non-toxic gas that is a byproduct of growing prosperity. It is also a product of all animal respiration and is also essential for most life on Earth, yet in total it makes up only 0.0004 of the atmosphere.”
In May 2012, The Economist took note of The Heartland Institute, calling it “the most prominent think tank promoting skepticism about man-made climate change.” The Economist has long been an advocate of global warming so this represents significant recognition of the Chicago-based, free market think tank that has sponsored several international conferences on the subject over the years.
Anyone who wants to get the facts about the IPCC’s continued and relentless promotion of the greatest hoax of the modern era can visit http://climatechangereconsidered.org/ and read “Climate Change Reconsidered II”, a Heartland Institute project whose latest edition is a thorough review that debunks the IPCC’s misleading “science.”
As the Human Events article notes, “Other scientists contest the IPCC assumptions on the grounds that the climatological effect of increases in atmospheric carbon dioxide is trivial—and that the climate is so complex and insufficiently understood that the net effect of human emissions on global temperatures cannot be forecasted.”
Obama’s latest executive order, however, is not trivial. It continues the UN’s effort to deny the world’s population access to the benefits of energy use: increased development, employment, and enhanced lives.
Obama is gearing up “climate change” as an issue to divert our attention from the many scandals and failures of his administration.
[First Published by Warning Signs]
Heartland’s Senior Fellow James M. Taylor, J.D., interviewed Professor Scott Armstrong of Pennsylvania’s Wharton School regarding his work in forecasting. Dr. Armstrong criticizes the Intergovernmental Panel on Climate Change’s work: the IPCC claims to use scientific forecasting when in reality it produces scenarios.
Those scenarios are not scientifically verifiable; in other words, they are fabricated storytelling. Dr. Armstrong’s work receives no outside support, and he provides more accurate climate information as a public service. His no-change model is a consistent predictor in areas of high uncertainty.
Dr. Armstrong conducted an audit of the IPCC’s climate change findings, treating them as the “forecasts” they are claimed to be, and found that they violated 72 of the 89 scientific principles of forecasting. In related work, he has applied his forecasting method to backcast from 1850, the start of the Industrial Revolution, to the present day, and to forecast 100 years into the future.
He found that the error margins of the IPCC’s predictions and calculations were seven times larger than those of his own model. His forecast shows no future temperature increase, and the IPCC’s errors in projecting future temperature increases were 12 times the size of Dr. Armstrong’s.
Listen to this insightful discussion — which leads to an elaboration on the scientific process and validity of his work — in the player above.
[Subscribe to the Heartland Daily Podcast for free at this link.]
Heartland Institute Policy Advisor Mischa Poppoff joins Canada AM to discuss the organic industry in Canada. Until 2009, the USDA National Organic Program was the effective standard for all of North America. Now that Canada is no longer held to this standard, things have actually gotten worse, because organic crops no longer have to be tested to make sure nobody is cheating in using the term “organic.”
He then points out that when people buy from a grocery store, most of the organic food is likely imported from places like China, Mexico, Brazil, and Argentina. Confidence that it is certified rests on an arbitrary audit trail; food does not have to be tested to be certified under the Canadian Food Inspection Agency. Essentially, the agency is renting out its name so consumers can believe the food they are choosing is organic, when there is only about a 50/50 chance it is even domestic. Ultimately, the only way to really ensure food is organic is to grow it yourself or visit a farm, because buying “organic” from a grocery store in Canada does not mean it’s organic.
Watch the link below:
Every day the headlines and the TV-radio news scream murder at us. My local daily out of Newark, NJ leads with a murder there and elsewhere in the state every day. Only the occasional mass murder gets national attention.
I got to thinking about this when the shooting at the Los Angeles airport recently closed the facility and created havoc for all the flights going out or heading in. Reportedly, it was a mentally ill young man (Paul Ciancia) and, in the case of these random and mass murders, it is almost always a mentally ill person. It has nothing to do with how many people have guns.
Grant Duwe, a criminologist, told National Public Radio that “Mass murder rates and mass public shootings have been on the decline, but what we did see was an especially bad year for mass public shootings in 2012. The number of victims who were killed and wounded was greater than in any previous year in U.S. history.”
The statistic Duwe cited is astonishingly low: only 0.2 percent of all homicides that occur in the U.S. are mass murders, and of those mass murders, ten percent are mass public killings, such as those in Aurora, Newtown, and most recently the Washington Navy Yard. “Within a given year, there are about 30 mass murders—those involving four or more—that occur in this country.”
That should be regarded as good news given the size of our population. The U.S. has more guns per person than most nations. When a Russian official tweeted about the Navy Yard shootings, NPR noted that there are “fewer than 13 million firearms in circulation in Russia, compared to 300 million in the United States. That works out to about nine guns per 100 people in Russia and closer to 100 guns per 100 people in America.” Russia has some of the toughest gun laws of any nation, but there were an estimated 21,603 killings in Russia in 2009. By comparison, there were 13,636 homicides in the U.S. in 2009.
As we have come to learn, the facts never stop President Obama from using every murder to advance his agenda to strip Americans of their guns.
After the Los Angeles shooting in which a Transportation Security agent was killed, he gave a speech in which he said:
As Americans bound in grief and love, we must insist here today there is nothing normal about innocent men and women being gunned down where they work. There is nothing normal about our children being gunned down in their classrooms. There is nothing normal about children dying in our streets from stray bullets. No other advanced nation endures this kind of violence, none.
This is yet another lie. In nations around the world, people die from terrorism and comparable acts of insanity in far greater numbers.
As Duwe said on NPR, mass murders in the U.S. have been on the decline, although 2012 was unusual for the number of incidents. There were seven, more than in any year since 1999, with the highest number of victims. Asked why, Duwe attributed the decline in crime rates to demographic changes, greater numbers of police, increased use of incarceration, and decreased social tolerance for crime and violence.
There is a particular irony in Obama’s knee-jerk attribution of murders to gun ownership. In September, Fox News reported: “Statistics released by the FBI earlier this week show that Chicago passed New York as America’s murder capital in 2012 despite the Windy City only having a third of the Big Apple’s population.”
Chicago is Obama’s hometown.
According to the FBI, there were 500 murders in Chicago in 2012, up from 431 in 2011. New York reported 419 murders in 2012, down from 515 the previous year. Fifteen cities across the U.S. reported more than 100 murders in 2012. Of the homicides in 2012, 69 percent involved firearms. Many of the murders involved gang activity.
There is more irony in the Democratic mayoral candidate for New York’s promise to end the stop-and-frisk practices that have been credited with reducing killings in the Big Apple. The NYPD has targeted gang violence by closely monitoring the social and family circles of criminals. Why is it that liberals seem incapable of ever learning anything from the facts?
[First Published by Warning Signs]
The Environmental Protection Agency today is hosting a “Public Listening Session” at the Metcalfe Federal Building in Chicago, allowing the public to weigh in on the Obama administration’s “Climate Action Plan.” The plan would grant broad powers to the EPA to curb the production of fossil fuels, strictly regulate coal-fired power plants, and funnel taxpayer money toward “green energy” companies.
Heartland Institute Policy Advisors Steve Goreham and Paul Driessen are at the hearing today. Below is their testimony.
“December 7, 2009 is a date that will live in infamy. Not in memory of Pearl Harbor, but because on that date, the Environmental Protection Agency declared carbon dioxide to be a pollutant under the Clean Air Act.
“Ladies and gentlemen, that is bizarre. Carbon dioxide is not a pollutant. It’s an odorless, harmless, invisible gas. It does not cause smoke or smog. The white vapor above a power plant’s cooling tower is condensing water vapor. We can’t see carbon dioxide. Each of us breathes in just a trace of CO2, but our bodies burn sugars and produce CO2, so with every breath we exhale air with 100 times the carbon dioxide that is in the atmosphere.
“In fact, CO2 is green! Hundreds of peer-reviewed studies show that higher levels of atmospheric CO2 increase the growth rate and size of plants. Plants produce larger vegetables, larger fruit, and thicker tree trunks, and they are more resistant to drought at higher levels of CO2. If humans could put one compound into the atmosphere that is great for the biosphere, carbon dioxide is that compound.
“The greenhouse effect is a natural effect and man-made emissions play only an insignificant part. Earth’s dominant greenhouse gas is not carbon dioxide or methane. Earth’s dominant greenhouse gas is water vapor. Between 75 and 90 percent of Earth’s greenhouse effect is caused by water vapor and clouds. Of the remaining portion of the greenhouse effect that is due to carbon dioxide and methane, 96 percent of that is due to natural emissions from oceans and the biosphere. This means that the man-made portion of the greenhouse effect is only about one part in 100.
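Taken at face value, the figures in the testimony can be combined with simple arithmetic to reproduce its “one part in 100” claim. The input shares below are the testimony’s own assertions, not independently verified here:

```python
# Reproduce the testimony's "man-made portion is about one part in 100" figure.
# Both input shares are taken from the testimony itself (assumptions, not verified).
natural_fraction = 0.96  # claimed natural share of the CO2/methane portion

for water_vapor_share in (0.75, 0.90):
    remainder = 1 - water_vapor_share              # CO2/methane portion
    man_made = remainder * (1 - natural_fraction)  # man-made slice of that portion
    print(round(man_made, 4))  # 0.01 at the 75% bound, 0.004 at the 90% bound
```

So under the testimony’s own numbers, the man-made share works out to between roughly 0.4 and 1 part in 100, matching the claim in the text.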
“Because the greenhouse effect is dominated by natural, not man-made factors, no EPA policy will have a measurable effect on Earth’s climate. No EPA policy will have an effect on icecap size, sea level rise, the frequency or intensity of hurricanes or tornadoes, or on droughts or floods. No EPA policy, however severe, will have a measurable effect on global temperatures.
“But, EPA regulations can have a severe impact on Americans. Today, 38 percent of US electricity and 47 percent of Illinois electricity comes from coal. Destruction of our coal industry will raise the price of electricity for American citizens and businesses, disproportionately affecting the poor.
“As a citizen of Illinois and the United States, I urge you to abandon this costly and futile fight to control Earth’s climate and return to solving the real pollution problems that we face.”
“The EPA says carbon dioxide from America’s coal-fired power plants is causing dangerous climate change. It says computer models support these claims.
“But the models are useless. Their predictions have been totally wrong – and none of EPA’s claims about hurricanes, tornadoes, rising seas and other alleged dangers have been accurate. Climate change has been ‘real’ since Earth began. The Dust Bowl, hurricanes, the Little Ice Age and droughts that destroyed the Anasazi and Mayan cultures were all terrible. People adapted and coped and survived – and today’s technologies allow us to deal much better with future climate changes.
“What we cannot cope with so easily are government regulations that deliberately shut down reliable, affordable coal-based electricity – and, after that, natural gas power generation. These rules will drive up energy prices and make it very hard for companies to stay in business or avoid layoffs.
“The rules will kill jobs, shut down factories, companies and industries – and devastate families and communities that depend on coal mining, factory jobs and affordable energy. And yet the EPA isn’t even holding any hearings in the states and areas that are most dependent on coal mining and coal-generated power.
“The EPA’s proposed rules will also force greater dependency on wind turbines, which kill millions of threatened and endangered birds and bats every year. That is unacceptable and unsustainable.
“But the worst impacts from EPA’s rules will be on the health and welfare of Americans. When people are unemployed, or holding two lower paying part-time jobs, the extra time, stress and financial worries have huge impacts on their health and well-being. Their nutrition suffers.
“They battle with sleep deprivation; longer commuting times; higher incidences of depression; more prevalent alcohol, drug, spouse and child abuse; higher suicide rates; and lower life expectancies. This means every life that EPA claims its rules will improve – by supposedly preventing climate change – will be made worse by the EPA’s own rules. Every life that the EPA says will be saved by its costly, job-killing CO2 regulations will be offset by lives shortened or lost by those rules.
“The EPA doesn’t even mention any of this – much less conduct any cost-benefit studies, or calculate how many lives will be shortened or lost because of its proposed rules. The EPA needs to do that work, before it takes one more step toward implementing these rules.”
Rumor has it that newly elected New York City Mayor Bill de Blasio is considering hiring the president of the nation’s second-largest teacher union as his schools chancellor. Expect some complaints about this obvious conflict of interest for Randi Weingarten, but not much will be different if she doesn’t get the post.
Unions are supposed to advocate for their members against management that has other priorities: making a profit, pleasing stockholders, etc. But when government employees can unionize, they effectively sit on both sides of the table during negotiations. This is why President Franklin Delano Roosevelt opposed government unions: “All Government employees should realize that the process of collective bargaining, as usually understood, cannot be transplanted into the public service.”
It’s now traditional for teacher unions to elect the school leaders they will bargain with. Accordingly, school boards and chancellors typically feel beholden to union demands, not taxpayers. This is certainly the case with de Blasio’s election. After supporting another candidate in the primary, the United Federation of Teachers (which Weingarten used to lead, and which is one of the dominant locals within her national union, the American Federation of Teachers) quickly pivoted and chipped in $250,000 to a de Blasio PAC.
Not just financially, but policy-wise, de Blasio’s education platform is in line with union fantasies. He suggested charging charter schools rent, which is nonsensical: charters are public schools, so charging them rent would literally entail sending them tax money and then taking some of it back. (In a recent poll, New Yorkers said they support charter schools and want more.) He also has proposed taxing the wealthiest New Yorkers more to pay for full-day preschool for all, a proposal of dubious value to children but likely to increase the number of unionized education workers. De Blasio is against merit pay for teachers, and so are unions. He has spoken about reducing standardized testing and publicly grading schools, centralized accountability mechanisms that unions also despise.
This all seems to be a dramatic contrast to de Blasio’s charter-friendly predecessor, Michael Bloomberg, who styled himself “the education mayor.” But Bloomberg’s actual accomplishments are far smaller than his press, as Fred Siegel and Sol Stern pointed out in 2011:
Bloomberg also began dipping deeper into the city treasury for more and more tax dollars for the schools. From fiscal 2003 to 2011, the education budget grew from $12.7 billion to $23 billion annually—almost a 70 percent increase in inflation-adjusted dollars. Most of the money was paid out in 43 percent across-the-board teacher-salary increases in just the first six years of Bloomberg’s tenure. He also added more than 4,000 teachers to the payroll, reaching 80,000—one teacher for every 13 students in the system. But the mayor who prided himself on his business acumen in managing the city’s workforce obtained almost nothing in return from the United Federation of Teachers (UFT) for this unprecedented bonanza… And then, in early 2010, the Bloomberg education bubble burst. State Board of Regents chancellor Merryl Tisch and education commissioner David Steiner acknowledged that over the past several years, the test scores had been grossly inflated. Under previous education commissioner Richard Mills, the two officials said, the questions on the tests had become more and more predictable, so that teachers were able to help their students “game” the tests. For good measure, the previous Albany education administration had also set the “cut scores” for determining the different levels of student proficiency too low. When the results of the readjusted 2010 tests were announced, practically all the gains students had made since 2007 were erased.
In other words, despite the pretense that UFT was damaged by Bloomberg’s policies, the union maintained its financial and political influence over the city. With de Blasio, none of that will change. It will just be admitted more openly.
What would that mean for the kids? Well, one could take a look at the charter school UFT opened in NYC. Seven years after it opened under Weingarten’s direction, the school is one of the lowest-performing in the city (which is saying a lot), which puts it at risk of being closed by city officials. “Fewer than a third of students are reading on grade level, and the math proficiency rate among eighth-graders is less than half the city average,” Gotham Schools reported last year.
“I go to that school and I’m very, very happy with what we see,” current UFT President Michael Mulgrew told the nonprofit news site.
Financial realities in New York City will make it difficult for de Blasio to follow Bloomberg’s union-favored tax-and-spend-without-results policies without steering the city closer to Detroit.
“Mr. de Blasio will also face an important battle with the city’s unions,” reports the Wall Street Journal. “All 152 employee bargaining units, representing almost 300,000 workers, have been operating under expired contracts for as long as six years. The unions—which include teachers, police and firefighters—have said they would seek retroactive raises. That could cost billions of dollars a year when adjusted for inflation and coupled with future raises, according to city calculations. Mr. de Blasio said relationships he has built with union leaders over his career in government will help him broker a solution.”
So far, “relationships” with unions have meant New York City spends approximately $20,000 per K-12 pupil (if not the most in the country, that’s close) while two in five of their fourth graders essentially cannot read, and less than one third of fourth and eighth graders are “proficient” readers.
So even if Randi Weingarten isn’t NYC’s next education chancellor, she might as well be. Either way, the academic beatings will continue, and morale will likely not improve. [Originally posted on theFederalist.com, photo credit: The Federalist: Monojussi]
After an Inspector General’s audit earlier this year of now-bankrupt electric vehicle charging company Ecotality, which determined that millions of taxpayer dollars were wasted in a nearly unworkable program, the IG has returned with findings that the Department of Energy withheld information about the project’s problems during his first investigation.
The audit, released by DOE IG Gregory Friedman in July, determined (among other things) that the persistent weak demand for electric vehicles harmed the deployment and timeliness of a $135 million-plus taxpayer funded charging network, which led to excessive grants and project expansion that became virtually unusable under the grants’ guidelines. Investigators discovered that conditions for reimbursement to Ecotality for the EV charging demonstration project were “very generous” and that cost-sharing requirements were extremely lenient.
Shortly after that report was released, on August 7, Ecotality informed DOE that it was in financial distress and that there was substantial doubt about its ability to continue as a going concern. The following day DOE suspended reimbursements to the San Francisco-based company and instructed it not to incur any further costs. In an August 12 filing, Ecotality informed the Securities and Exchange Commission that it had suffered “adverse financial-related developments” that would harm its ability to meet its obligations. Ecotality filed for Chapter 11 bankruptcy on September 16.
The developments that came so quickly after the IG’s audit aroused suspicions that DOE wasn’t forthcoming about everything they knew about Ecotality’s troubles. As a result, Friedman initiated a follow-up investigation to determine whether or not relevant information was withheld. In a report released yesterday, the IG said DOE’s division of Energy Efficiency and Renewable Energy failed to disclose important details that were relevant to the purpose of the audit.
“We found that the Department had not fully disclosed known concerns regarding Ecotality’s ability to meet its EV project obligations to the Office of Inspector General prior to completion of our previous audit,” the IG reported yesterday. “Information that raised questions about Ecotality’s ability to meet its project goals, including completing planned EV charger installations and the collection of EV usage data, was not provided even though the data had a readily apparent connection to our in-process audit.
“The Department became aware of the EV project concerns at about the same time that the Program was preparing a response to a draft of our July 2013 audit report.”
That DOE did not grasp the severity of Ecotality’s financial duress reflects either negligence or ineptitude. NLPC – with far less detailed information at its disposal than DOE – saw the signs long ago that the former eco-friendly cleaning products company that suddenly became a vehicle-charging “expert,” run by a self-proclaimed “political beast,” was going to run into problems. Nevertheless, DOE bureaucrats told the IG’s investigators that their failure to disclose information for the audit was “unintended.”
Specifically of concern to the IG was the fact that Ecotality had informed DOE in May that it would not reach the requirement to complete charging station installations by September 2013. In mid-June DOE informed Ecotality it would be required to submit a “corrective action plan” to explain how it would fix the problem. But almost a month later, when DOE provided comments in response to the IG’s draft report, the agency reported that Ecotality’s installation goals were achievable. Incredibly, the IG reported, “[DOE] officials stated that they did not think that the information was relevant to our audit, which was still in process at the time.”
“The disclosure of issues that could have impacted project completion would have led us to perform additional audit procedures to evaluate Ecotality’s ability to fulfill its obligations under the Recovery Act award,” the IG’s report said. “These issues also could have impacted our overall conclusions regarding Ecotality’s performance under the award.”
The most significant information withheld from the IG, as it moved toward the conclusion of its audit, was that Ecotality (and DOE) knew the company would fall far short of the goals set within the grant agreement. The purpose of the grant was to create in several metro areas a system of electric chargers to serve EV owners, in order to glean information about how they might be used in a widespread build-out and adoption of the technology. Originally, 16 months of data were to be collected from all installed chargers, but Ecotality’s revised plan sought to reduce the test period to three months. The company also wanted to extend the deadline for full deployment of the chargers and sought more flexibility as to whether the chargers would be located in residential or commercial locations.
“These changes…would have further limited, if not virtually eliminated, data collection for certain units,” the IG reported. “As noted in our previous audit report, these types of changes would impact the quantity and perhaps the overall quality of usage data gathering and analyses.”
Information the IG received indicated that about 1,000 commercial chargers still needed to be installed and that Ecotality’s rate of deployment was one-half to one-seventh of what it needed to be.
Despite all the warning signs, and DOE’s discontinuance of payments for the $100 million Recovery Act grant after Ecotality’s notification that it was in financial straits, the IG said DOE still continued funding another $26 million vehicle testing grant that had been awarded in 2011. Among the excuses DOE provided the IG for not discontinuing all Ecotality’s funding was that the company “provided assurances that the project was not affected by the company’s financial difficulties” and that DOE instructed Ecotality not to purchase additional vehicles or equipment.
At this point IG staff had to be tearing their collective hair out at either the ineptitude, or hubris, of DOE staff. It’s difficult not to use strong language in characterizing the horrid performance of the Obama administration in holding stimulus beneficiaries accountable.
In context, it looks like the overseers on the DOE staff were running scared. During the conduct of the IG’s audit they also witnessed the scathing House hearing in April about the failed Fisker Automotive loan, in which the start-up EV company’s poor stewardship of DOE’s stimulus backing was laid bare. Also fresh in their minds was the failure of loan recipient Vehicle Production Group, whose collapsing condition was known by the Acting Director of DOE’s Advanced Technology Vehicles Manufacturing Loan Program, yet concealed during that April hearing. And the agency also had to be nervously anticipating an early September House Oversight Committee hearing in which Jonathan Silver, former director of the Department of Energy’s Loan Program Office, would be grilled about the practice of conducting government business on private email accounts.
The picture left for taxpayers is one in which companies that ingratiated themselves with the Obama administration – companies that otherwise could not access the financing they needed in the private investment market – were left to spend their money without accountability. The pattern, as revealed in cases such as Solyndra, Abound Solar, Fisker, VPG, and Ecotality, was that DOE was anything but transparent, because there was too much information that would prove embarrassing.
Ecotality’s carcass has now been auctioned off for $3.3 million, and the government’s “investment” in another green-tech company has been squandered. If this administration were truly “the most transparent in history,” could taxpayers bear the sight of it? [Article Originally Posted on nlpc.org]
On October 15, 2013, the US Supreme Court (SCOTUS) granted ‘Certiorari’ to Petitioners who have been suing the EPA over regulations to control CO2. In 2007, SCOTUS had ruled that CO2 may be considered a pollutant under the Clean Air Act (CAA), provided EPA could demonstrate that continued emission of CO2 would harm ‘human health and welfare.’ In 2009, EPA published the required Endangerment Finding, which was subsequently attacked on scientific grounds by a collection of plaintiffs. [Full disclosure: SEPP is one of the many plaintiffs involved in this lawsuit.]
However, in June 2012, the Court of Appeals for the DC Circuit ruled against plaintiffs, giving deference on the science to EPA. EPA had proceeded to institute emission limits for motor vehicles, essentially by setting mileage standards. EPA is now arguing that, having successfully set CO2 limits for motor vehicles in May 2010, the CAA requires that emission limits be set on all other emitters of CO2. Using their statutory authority to set New Source Performance Standards (NSPS), EPA has proposed stringent limits on new power plants that will make new coal plants virtually impossible to construct. The EPA also wants to limit emissions from existing coal plants, arguing that EPA can set guidelines which the states would have to follow in regulating emissions from existing plants.
In the coming case, plaintiffs are essentially appealing the decision of the DC Circuit Appeals Court and hope to prevail, even though SCOTUS is not likely to entertain scientific arguments, although publication of the authoritative NIPCC report “Climate Change Reconsidered-II” (Heartland, 2013) cannot be ignored. In fact, the Supreme Court has restricted its Cert to a single question: Is EPA permitted to extend its authority to regulate emissions from motor vehicles to stationary sources?
The EPA is likely to use a section of the Clean Air Act called Prevention of Significant Deterioration (PSD). They have a strong case; it will require considerable ingenuity for the plaintiffs’ lawyers to prevail over the EPA. One sees at least two possibilities.
A. NAAQS (National ambient air quality standards)
The CAA requires the setting of NAAQS. However, it is a fact that
1. EPA has not set a NAAQS for CO2
2. EPA does not know how to set a NAAQS for CO2. There is no scientific basis for doing so.
3. Even if EPA were to set a NAAQS, there is no way in which EPA can demonstrate that its regulations can achieve it; further, there is no way whereby EPA can enforce it — since CO2 is global and EPA cannot control emissions from other nations, like China.
4. But without a NAAQS as a goal, any effort to set emission limits must be judged to be ‘arbitrary and capricious.’ In other words, without a specific target, there is no rational way for setting emission limits for power plants or other emitters.
B. Tailoring Rule
In the regulation of ‘criteria pollutants’ (of which there are currently six) the arcane provisions of the CAA require EPA to regulate emissions from sources that emit more than either 100 or 250 tons per year.
These limits are ok for CO2 from individual small sources, motor vehicles and even big trucks. But when applied to stationary sources, there would be millions of them, including apartment and office buildings, hospitals, schools, prisons, etc.
Clearly, EPA is unable to muster an effort to issue controls for all such sources. It has therefore arbitrarily raised the lower limit to 75,000 to 100,000 tons per year by issuing a so-called ‘Tailoring Rule.’ The rule would permit regulation of major sources, such as power plants, refineries, and other large industrial installations.
However, EPA cannot simply change the law to suit its convenience. This cannot be done by administrative action; it must be done by the author of the law — which is Congress. Again, EPA’s revised lower limit may be considered “arbitrary and capricious.”
What to do?
The sensible course for EPA is to go back to Congress and suggest an amendment to the Clean Air Act to permit continuing without setting a NAAQS for CO2 and for allowing a ‘Tailoring Rule’ that makes CO2 regulation more manageable. But it is unlikely that EPA will choose to do this. Because once the matter goes back to Congress, it is no longer under the control of the executive branch of government.
Congress, in its wisdom, could decide to do away with the PSD (Prevention of Significant Deterioration) requirement for CO2 and thus not permit EPA to extend emission controls to power stations. Or, going even further, Congress may decide that CO2 is not to be considered as a criteria pollutant and cannot be regulated at all under the Clean Air Act. In other words, Congress may decide that CO2 is not a ‘pollutant’ — and thus overturn and make irrelevant the Supreme Court decision of 2007.
There is little doubt that the House, as currently constituted, would choose one of these routes. It is entirely possible that the US Senate will go along — even though it has a Democratic majority. But 16 Democratic senators are up for re-election in 2014 — with some from coal states in the Midwest. So there is a strong possibility that the Congress will consider CO2 to be a non-pollutant. Even if the White House were to apply a veto, there is a good chance that it will be overturned — which would constitute a big defeat for the Obama Administration.
But political futures are hard to predict. Much may ride on the outcome of Obamacare and other snafus that might affect public opinion about the White House and thereby the mood of Congress. One thing for sure: public policy should not be set by unelected bureaucrats.
[Originally published on the American Thinker]
Today we are witnessing the consequence of a country that has drifted to the left ideologically. The vast majority of people are ignorant on the foundations of freedom and the foundations of liberty.
So, how do we remedy this illness?
Yaron Brook, in his talk at The Heartland Institute’s Author Speaker Series, says the foundation of any liberty campaign should be that of education. He says that as long as the American people remain ignorant, political change is impossible.
Along with education about economics, the foundations of liberty and freedom, and the principles that founded the United States, Brook believes, as he has said many times, that there must also be a serious moral and ethical revolution in order to change the country. People must stop accepting principles of sacrifice and altruism.
Brook tells his audience to “get energized” because it’s going to be a long and drawn-out campaign for the coming decades.
You can watch the video in the player above.
[Subscribe to The Heartland Institute Youtube Channel at this link]
Think back to the fall of 2008. Congress was asked to pass a $700 billion taxpayer bailout for Wall Street. We were told it had to be passed, or else the economy would collapse, perhaps into another Great Depression.
House conservatives voted it down. The stock market fell hundreds of points in response. In the ensuing panic, Congress went along and passed the bailout.
That bailout, and the insane, nearly $1 trillion “stimulus” bill passed just a few months later as Obama’s first act, gave birth to the Tea Party revolution that gave Republicans a 63-seat landslide in the House in the 2010 elections. Voters supported that to stop the run on taxpayer funds.
Next on the horizon is an Obamacare “death spiral” for the private health insurance industry. Taxpayers will now be told a new bailout of hundreds of billions for the private health insurers must be passed, or else private health insurance will go out of business under Obamacare. That would leave the government in complete control of American health care, especially as he who pays the piper calls the tune.
It is called “single payer” by advocates of government-run health care. In plain English, “single payer” means “government monopoly” over health care, more commonly known as “socialized medicine.” That means the government decides who gets what health care, or ultimately, who lives and who dies. For those who think the government is God, such health care socialism is long overdue.
Meaning: a spiraling decline in the quality of American health care, the end of investment in new, breakthrough medicine, and Sarah Palin’s death panels.
The Obamacare Death Spiral Begins
President Obama sold the idea of Obamacare to its most credulous supporters on the promise that it would mean universal health insurance, with no uninsured. That is what appealed about it to the Left, and to the simple-minded true believers in Obama. But even the Washington establishment Congressional Budget Office scored Obamacare as still leaving 30 million uninsured 10 years after implementation!
But the reality is even worse than that. Because the effect of Obamacare so far has been to increase the number of uninsured, rather than reduce it. Consider recent news reports from around the country:
• Florida Blue terminates 300,000 policies, about 80 percent of its individual policies in the state.
• Kaiser Permanente in California terminates 160,000 policies, about half of its individual policies in the state.
• Independence Blue Cross in Philadelphia cancels 45% of its individual policies.
• CareFirst Blue Cross Blue Shield drops 76,000 individual policies in Virginia, Maryland, and Washington, D.C., over 40 percent of its individual policies in those states.
• Insurer Highmark in Pittsburgh sends out termination notices for about 20% of its individual policies.
The Weekly Standard summarized the impact last month as likely to total 16 million terminated individual policies, out of a total individual market of 19 million policies. So more than 80% of the health insurance policies individuals buy directly by themselves would be terminated as a result of Obamacare.
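The Weekly Standard’s percentage is a straight ratio of the two figures just quoted, which a quick check confirms:

```python
# Share of the individual insurance market projected to be terminated,
# from the figures cited above (16 million of 19 million policies).
terminated = 16_000_000
total_market = 19_000_000
share = terminated / total_market
print(f"{share:.0%}")  # prints 84%, i.e. "more than 80%"
```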
But the great majority of health insurance pre-Obamacare came through employer-provided health coverage. The Obamacare wrecking ball, however, is terminating health insurance there too. CBO scored Obamacare as causing as many as 20 million workers to lose their employer-provided health insurance. Former CBO Director Douglas Holtz-Eakin estimated double that, or 40 million, in a study for the American Action Forum.
But even that does not match what Obama knew would happen back in 2010 when Obamacare was passed. Section 1251 of the Obamacare legislation includes the “grandfather” provision that is supposed to allow people to keep their current health plan if they like it. But on June 17, 2010, Obama’s own Department of Health and Human Services (HHS) published a notice in the Federal Register saying, “The Department’s mid-range estimate is that 66 percent of small employer plans and 45 percent of large employer plans will relinquish their grandfather status by the end of 2013.” That adds up to 51 percent of employer-provided health insurance plans that would become illegal and terminated under Obamacare. Counting both the employer provided and individual health insurance markets, Avik Roy of Forbes calculates that means that altogether 93 million Americans will lose their health insurance under Obamacare.
President Obama promised over and over that if you like your health insurance you can keep it, in selling his Obamacare snake oil. The ringing declaration was, “If you like your health plan, you will be able to keep your health plan. Period. Nobody will be able to take that away from you.” But now Obama derides the millions and millions of terminated policies as substandard in not meeting the Obamacare regulatory requirements. In other words, his much repeated promise to the American people is now revised to proclaim: If Obama likes your health plan you can keep it. What a sad egomaniac for suckers not only to elect but reelect as President.
Moreover, now we know that Obama knew that this pledge to the American people, which he has continued to repeat over and over since 2010, was false from the beginning.
The Obamacare Death Spiral Crash and Burn

President Obama also promised repeatedly that Obamacare would reduce the cost of health insurance for families on average by $2,500 per year. But as with everything that government promises, Obamacare is having just the opposite effect, as another Obama pledge to the American people is proven false.
Avik Roy, in a report for the Manhattan Institute, estimated that the new Obamacare policies being offered on the Exchanges involved increased insurance premiums of 99 percent for men and 62 percent for women. But that finding was more recently superseded by a new study by economists for the American Action Forum concluding that the Obamacare premium increases for individual policies on average amount to 260 percent for men and 193 percent for women.
Obamacare advocates are quick to point out that Obamacare includes tax credits to offset some of these premium increases. But for most people, those tax credits will offset only a fraction of the increases. Many people will not be eligible for these credits at all: singles making over $46,000 and families making only somewhat more, close to half the country at least. Even those just above poverty will not see all of the increases offset.
This is what people are facing who lose their coverage and turn to the Obamacare Exchanges to find a new plan. And this too was known and predicted by opponents of Obamacare from the beginning. It’s a no-brainer that all the “free” benefits mandated by Obamacare were going to increase the cost of health insurance premiums.
Then there are the Obamacare regulatory requirements of "guaranteed issue" and "community rating." Those requirements mean that no matter how sick and costly a new applicant for health insurance is when he first shows up, the health insurance company must issue him a new policy covering everything at standard rates.
That is like requiring fire insurers to issue new homeowner policies to those whose first call comes when their houses are already on fire. The insurer must cover them and can charge no more than the standard rates that apply to everyone else. Of course, those standard rates must soar to ensure that the insurance company has enough money to pay for an insurance pool covering all burnt-down houses, because as the standard rates explode, no one is going to buy fire insurance until his house catches fire.
Those very guaranteed issue and community rating requirements will further contribute to the final crash and burn of the death spiral. When people see the costs of the new replacement Obamacare insurance, healthy and low-cost customers are going to react the same way that George Schwab of North Carolina did when interviewed by NBC News. He liked his insurance plan from Blue Cross Blue Shield, which insured him and his wife for $228 a month. But his plan was recently cancelled (Obama did not like it). The insurance company offered him a "comparable" plan costing $1,208 a month, with an annual deductible soaring to $5,500 a year. The best alternative he could find on the Exchange charged a premium of $948 a month, more than four times what his previous plan cost. He told NBC, "I'm sitting here looking at this, thinking we ought to just pay the fine and just get insurance when we're sick."
You can bet, though, that everyone who is sick with costly illnesses like cancer, heart disease, or diabetes will pay to get the insurance. Just like everyone whose house catches fire would sign up immediately for fire insurance.
This problem is made even worse by the difficulties of working with the online Exchanges. Those who are healthy and low cost and just checking out prices will be quickly discouraged by the dysfunctional Obamacare website. But those who are sick with costly illnesses will persevere (as if their lives depended on it, which they may) until they succeed in getting coverage.
Moreover, as John Goodman, president of the National Center for Policy Analysis, points out in his highly insightful health policy blog, waves of the sickest and most costly are now swamping the insurance offered on the Exchanges. That is because the state and federal High Risk Pools formerly serving these sickest and most costly patients are officially closing now, expecting Obamacare to pick them up. Moreover, government and private employers are in the process of dumping the retirees they formerly covered onto the Obamacare Exchanges as well. Sick and costly employees who had been sticking with their current employers only for the health coverage, because their pre-existing conditions would be costly to any new insurer, are also now leaving for new opportunities, assured of new coverage under Obamacare.
Even worse, the soaring premiums on the Obamacare Exchanges were calculated by health insurers on the assumption that the individual and employer mandates would succeed in forcing everyone to buy health insurance, the healthy and low cost as well as the sick and high cost. But with 93 million Americans, more than half of everyone with health insurance pre-Obamacare, losing their health coverage under Obamacare, and facing the incentives described above, the covered health insurance pool will not come remotely close to covering everyone. The healthy, younger, and low cost will be exactly the ones to drop out and evade the mandates.
The pool the insurers end up covering, then, will be a lot more like the pool of all burnt-down houses for fire insurers discussed above. The premiums the insurers receive from this adversely shrunken pool will not remotely cover the costs of that pool. Hence they will be facing bankruptcy next year, absent another taxpayer bailout of hundreds of billions. So the choice will be that, or socialized medicine, including the government death panels we see in every other country weighed down by this "enlightened" last-century albatross. [Article originally posted on Americanspectator.com]