Somewhat Reasonable

The Policy and Commentary Blog of The Heartland Institute

Repeal Jones Act Before Exporting Oil

April 29, 2014, 10:17 PM

For the past 40 years, in response to the OPEC embargo of 1973, crude petroleum exports from the U.S. have been severely restricted. Back then, we referred to oil as “liquid gold” and felt we needed to hoard our limited supplies. But because of the “shale revolution,” U.S. oil output is at its highest level in more than 25 years. In 2013 alone, production jumped by more than 1 million barrels a day, and output is projected to jump another 1 million in 2014.

This newfound abundance has come primarily from the application of horizontal drilling and hydraulic fracturing in the many shale plays currently under development, most notably the Eagle Ford in South Texas and the Bakken in North Dakota. In response, recent months have witnessed a virtual explosion of debate and commentary about the United States getting into the business of exporting crude oil.

Some politicians and pundits claim that exporting oil will divert us from the path toward “energy independence.” Others argue that exporting oil will weaken our energy security since we’re still a net importer. Still others claim that keeping domestic oil at home will help lower gasoline and diesel prices. These arguments are baseless.

What’s more, it’s hard to envision a political scenario that would result in our inability to import oil. Over the past year, we’ve seen political unrest in Iraq, Libya, Bahrain, Syria and other petroleum exporting countries, but there has been little change in oil prices.

But removing the ban on oil exports will be a Pyrrhic victory for consumers unless the Jones Act is repealed as well. A section of the Merchant Marine Act of 1920, the Jones Act requires that any ship carrying goods or commodities in U.S. waters between U.S. ports be built, registered, owned and crewed by American citizens or permanent residents. Though originally intended to improve the nation’s maritime security, today the Jones Act is simply a form of protectionism for America’s shipping industry and seafaring unions.

More important, the Jones Act distorts the allocation of America’s crude oil resources, increases our dependence on imports, and drives up energy prices for businesses and households in the Northeast. For example, today we have a glut of “light sweet crude” being produced in the Eagle Ford play of South Texas because Gulf Coast refineries are geared principally to processing heavy grades of crude oil. Unfortunately, because there are no crude oil pipelines connecting the Gulf Coast to the Northeast, where most refineries are designed to process sweet crude, those facilities must rely on imported oil that is typically more expensive. East Coast refineries are also receiving light crude from the Bakken by rail tank car, a costly and risky way to move oil over long distances.

In theory, crude oil could be shipped by Jones Act tankers from the Gulf to the Northeast. But the cost would be prohibitive, about $4 per barrel. By contrast, shipping oil by foreign-flagged carriers would cost only about $1.20 per barrel.
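To put the per-barrel gap in perspective, here is a quick back-of-the-envelope sketch using the figures above; the 300,000-barrel cargo size is an illustrative assumption, not a number from the article:

```python
# Per-barrel shipping costs cited above (Gulf Coast to Northeast).
JONES_ACT_COST = 4.00     # $/barrel on a Jones Act tanker
FOREIGN_FLAG_COST = 1.20  # $/barrel on a foreign-flagged carrier

# Hypothetical mid-size tanker cargo, for illustration only (assumption).
CARGO_BARRELS = 300_000

premium_per_barrel = JONES_ACT_COST - FOREIGN_FLAG_COST
premium_per_cargo = premium_per_barrel * CARGO_BARRELS

print(f"Jones Act premium: ${premium_per_barrel:.2f} per barrel")
print(f"Extra cost on a {CARGO_BARRELS:,}-barrel cargo: ${premium_per_cargo:,.0f}")
```

On those assumptions, a single cargo would carry roughly $840,000 in extra shipping cost.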

Repealing the Jones Act would generate a broad range of economic benefits, not only for residents of the Northeast but for all Americans. Refineries on the East Coast would have access to cheaper domestically produced crude oil, which would lower the cost of gasoline, diesel and fuel oil for households and businesses. The resulting drop in imported oil would enhance U.S. energy security while improving our balance of payments.

Boosting the demand for domestic oil will also help sustain the energy boom that has created hundreds of thousands of jobs in recent years against the backdrop of a less-than-robust economic recovery from the Great Recession. Should repeal of the Jones Act be accompanied by, or followed by, the removal of restrictions on U.S. crude oil exports, the positive economic impacts would be magnified.

The U.S. has become a global energy powerhouse. Let’s start acting like one by removing anti-competitive and anti-growth relics like the Jones Act and the ban on crude oil exports.

[First published at the San Antonio Express-News.]

Categories: On the Blog

Heartland Institute Experts React to Supreme Court Ruling on EPA Air Pollution Rule

April 29, 2014, 9:43 PM

The U.S. Supreme Court on Tuesday ruled 6–2 to uphold the Environmental Protection Agency’s authority, under the so-called “Cross-State Air Pollution Rule,” to regulate power-plant emissions when those emissions have the potential to harm downwind air quality. The ruling in EPA v. EME Homer City Generation will affect about 1,000 power plants in 28 states.

The following statements from environmental policy and legal experts at The Heartland Institute – a free-market think tank – may be used for attribution. For more comments, refer to the contact information below. To book a Heartland guest on your program, please contact Director of Communications Jim Lakely at jlakely@heartland.org.

—–

“Through geographical luck, emissions from East Coast states like Connecticut and Massachusetts drift over the Atlantic Ocean where there are no states demanding abatement or compensation. Then, hypocritically, these same East Coast states complain about emissions crossing into their state borders from upwind states. EPA is all too happy to take advantage of these hypocritical complaints as an excuse to expand the agency’s power through new rules and restrictions.

“It is a shame that the U.S. Supreme Court continues to empower EPA to issue nonsensical interpretations of statutes with the primary goal of amassing more money and power.”

James M. Taylor
Senior Fellow for Environmental Policy
The Heartland Institute
jtaylor@heartland.org

—–

“The Supreme Court’s ruling is very unfortunate. The justices are not scientists at any level and cannot imagine how totally unscientific, and in fact erroneous, are the EPA standards required for air quality. If they held a modicum of reasonableness from a scientific standpoint, the court’s decision would not be terrible. The dispersion of chemicals in the air is such that a concentration that could be harmful to human health a few hundred yards away is clearly innocuous a few miles away.

“The justices optimistically believe that EPA knows what it is doing. Actually, the EPA does know what it is doing — which is to do the bidding of environmental extremists who wish at every level to stifle economic progress in the name of public health.

“It is a sad state of affairs that the extreme alarmists, without a scientific leg to stand on, are winning for now. Hopefully, the day will come when an administration will fill EPA with scientists instead of anti-progress greens.”

Jay Lehr
Science Director
The Heartland Institute
jlehr@heartland.org

—–

“Clean air and clean water are, of course, good things, but so are constitutional government and the rule of law. Agree or disagree with today’s decision, it is probably best encapsulated by some thoughts from the first and last paragraphs of Justice Scalia’s dissent:

“ ‘Too many important decisions of the Federal Government are made nowadays by unelected agency officials exercising broad lawmaking authority, rather than by the people’s representatives in Congress. With the statute involved in the present cases, however, Congress did it right. … EPA’s utterly fanciful ‘from each according to its ability’ construction sacrifices democratically adopted text to bureaucratically favored policy. Addressing the problem of interstate pollution in the manner Congress has prescribed … is a complex and difficult enterprise. But ‘[r]egardless of how serious the problem an administrative agency seeks to address, . . . it may not exercise its authority’ in a manner that is inconsistent with the administrative structure that Congress enacted into law.’ Brown & Williamson, 529 U. S., at 125 (quoting ETSI Pipeline Project v. Missouri, 484 U. S. 495, 517 (1988)).’ ”

David L. Applegate
Policy Advisor, Legal Affairs
The Heartland Institute
media@heartland.org

—–

“April seems to be the month in which the Supreme Court devotes itself to decisions that have no basis in real science and can do maximum damage to the economy. Invariably, the cases are brought by the Environmental Protection Agency and are decided in its favor.

“In April 2007, the court decided that carbon dioxide, the second most essential gas for all life on the planet, was a ‘pollutant,’ the definition the EPA had applied to it in order to regulate it. Now comes word that the court has concluded that the EPA may regulate power-plant emissions that blow across state lines as per a 2011 regulation, the Cross-State Air Pollution Rule.

“Not content to put some 150 or more coal-fired power plants out of commission, the court’s ruling now gives the EPA authority to do the same thing to about a thousand power plants in the eastern half of the U.S. that will have to adopt new pollution controls or reduce operations.

“In effect, the court has just agreed to a regulation that represents a major increase in the cost of electricity in 28 states. The EPA’s claim that this will save lives it attributes to the alleged pollution is as bogus as all the rest of its justifications, the purpose of which is to undermine the nation’s economy in every way it can.”

Alan Caruba
Founder, The National Anxiety Center
Policy Advisor, The Heartland Institute
acaruba@aol.com

—–

The Heartland Institute is a 30-year-old national nonprofit organization headquartered in Chicago, Illinois. Its mission is to discover, develop, and promote free-market solutions to social and economic problems. For more information, visit our Web site or call 312/377-4000.


Two Cheers for Illinois Taxpayers!

April 29, 2014, 1:55 PM

Monday, April 28, 2014, purportedly marked Tax Freedom Day for Illinois taxpayers.

Coming thirteen days after state and federal income tax returns were initially due, Tax Freedom Day, according to the Illinois Policy Institute’s Senior Budget and Tax Policy Analyst Benjamin VanMetre, marks the point in the year when Illinoisans have worked long and hard enough in the aggregate to cover their share of state, federal and local taxes “and can start keeping their hard-earned money.” About a third of Illinois residents’ efforts this year – 118 days out of the calendar year’s 365 – went just to paying taxes.

Like all statistics, though, even if technically accurate, this one is misleading.

Due to differences in income, home ownership, and spending habits, no two Illinois taxpayers likely pay the same percentage of their income in taxes.  (Put otherwise, no “average taxpayer” actually exists.)  Low-income renters pay relatively (and nominally) less in income and property taxes and possibly relatively more (but nominally less) in sales taxes.  High-income homeowners likely pay both relatively and absolutely more in income and property taxes and possibly relatively less (but nominally more) in sales taxes.  They aren’t likely simply to average out.  Many people pay a whole lot more, and thus effectively longer.

As in the federal system, a relatively smaller number of higher-income working individuals pays a disproportionate share of total taxes.  These taxes fund not only reasonably necessary government services but also wealth-transfer programs including pensions to former state officials, teachers, and other public employees who once arguably provided government services but no longer do so.  A relatively larger number of lower-income taxpayers pays a smaller percentage of total taxes and, in some cases, receives net payments from the government, therefore being effectively taxed at a negative tax rate.

Putting aside for a moment how their money is spent, things are about to get worse for Illinois taxpayers. The – ahem! cough! cough! – “temporary” 67% increase in the state’s flat rate income tax of a couple of years ago, from 3% to 5% of adjusted gross income, is about to take one of two directions: a drop to a flat 3.75% effective January 1, 2015 – still 25% higher than its 3% predecessor – or, more likely, a change to a “progressive” income tax system for persons who earn income in Illinois.

The “progressive” tax increase being sold as a solution to Illinois’ increasingly desperate financial straits is likely only to exacerbate them. If enacted, it would decrease the Illinois income tax rate from the scheduled reduced 3.75% to 2.9% of adjusted gross income only for those making the pathetically small amount of $12,500 per year or less; increase the rate to 4.9% for those making up to $180,000 per year; and hike the rate on those above $180,000 to 6.9%.

Why 2.9% and 4.9% instead of 3% and 5%, respectively?  Because that way the politicians in Springfield can cynically claim they’ve “reduced” tax rates for a majority of Illinois taxpayers even though everyone making over $12,500 per year in Illinois will be paying more than they were three years ago.  (Those in the 4.9% bracket will pay 63.33% more than they did before the “temporary” tax increase, and the fortunate few earning over $180,000 will pay nearly two-and-a-third times as much.)
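Those multipliers can be checked directly against the 3% pre-increase flat rate; a minimal sketch, using only the rates quoted in the preceding paragraphs:

```python
# Compare the proposed "progressive" rates to the pre-2011 flat 3% rate.
OLD_FLAT_RATE = 3.0  # percent

proposed_rates = {
    "$12,500 or less": 2.9,
    "up to $180,000": 4.9,
    "over $180,000": 6.9,
}

for bracket, rate in proposed_rates.items():
    change = (rate / OLD_FLAT_RATE - 1) * 100  # percent change vs. old rate
    print(f"{bracket}: {rate}% is {change:+.2f}% vs. the old 3% rate")

# 4.9% works out to +63.33% vs. 3%, and 6.9% is 2.3x the old rate,
# i.e. nearly two-and-a-third times as much.
```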

Fortunately, this proposal would require an amendment to the Illinois constitution, and it is drawing fire even from some perhaps unlikely quarters, like the teachers’ unions. Still, don’t be surprised to see some sort of personal income tax increase in Illinois come 2015, even as the state continues to refuse to rein in wasteful spending.

On second thought, make that only one cheer for Illinois taxpayers.


A Key Ingredient in the Left’s Wins: Persistence

April 29, 2014, 11:32 AM

In the late, great Harold Ramis’ cinematic classic “Animal House,” perpetual Faber College student John Blutarsky succinctly summed up the Left’s approach to policy.

“Over? Did you say ‘over?’ Nothing is over until we decide it is!  Was it over when the Germans bombed Pearl Harbor? Hell no!…

“It ain’t over now, ’cause when the going gets tough, the tough get going. Who’s with me? Let’s go! Come on!….”

Note the Left’s characteristic (historical) accuracy.

Bluto fictitiously rode this philosophy to the Senate.  Several Leftists have in real-life taken it all the way to the White House.  Too many to count have infested Congress and elected offices all over the country.

The Left has made an art form out of the maxim “If at first you don’t succeed – try, try again.”  They perpetually push terrible, government-expanding policies – and no number of failures deters them from pushing until they win.

It is largely why they fight so hard to protect power grabs already won – no matter how huge the failures.

Trustees Say Long-Run Medicare, Social Security Deficit is $66 Trillion

Reform Or Go Broke: Medicaid, Medicare Must Change

In so many instances, they fought so hard and waited so long to impose them.

Their incessant, infernal screeching is largely for effect – the Left is actually very patient in waiting to attain new government power grabs.  A quintessential example – the abomination that is ObamaCare.

The Affordable Care Act – ObamaCare – was signed into law in 2010.  Were President Barack Obama and this modern cadre of Leftists the first ever to want a top-down, government health care takeover?  Of course not.

The Bill Clinton Administration floated HillaryCare in 1993.  President Harry Truman proposed full-on federal government medicine in 1945.  And piecemeal government-medicine expansion began all the way back in the mid-19th Century.

The going got tough – and the Left kept going.  Do Establishment Republicans show this sort of stamina and intestinal fortitude?  Not so much.

McMorris Rodgers Says ACA Likely to Stay

“McMorris Rodgers…part of the Republican leadership in the House…said the framework established by the law likely will persist and reforms should take place within its structure.”

The ObamaCare framework will certainly persist if the leadership of the Party that is supposed to be for repeal – publicly gives up on repealing it.

Why offer the exact same “Mend it – don’t end it” message as the desperate-to-get-out-from-under-ObamaCare Democrats?  How does that delineate you – and help you defeat them in, say, November 2014?

Especially when the law has never, ever enjoyed public support.

After Four Years, ObamaCare Is Still Unpopular

ObamaCare’s Unpopularity Rises Among Uninsured

Oh, Look, ObamaCare’s Unpopular Again

Republicans have tried without success for repeal of a terrible, majority-opposed law for…four years.  So sure, just throw in the towel.

The Left pushed for full-on government medicine for (at least) seven decades.  And they are consistently persistent.

It’s Groundhog Day – Again: Government Taking Third Stab at Net Neutrality Power Grab

This is the (Left’s) third attempt at this particular power grab.  It’s becoming fetishistic.

There was 2007.

Court Backs Comcast Over FCC on ‘Net Neutrality’

And there was 2010.

Verizon Wins Net Neutrality Court Ruling Against FCC

“Yet here we remain – stuck in government overreach Groundhog Day.”

Already twice bitten – they are not shy.  Onward they push.

“The price of freedom is eternal vigilance.”

The cost of a lack of vigilance is soaring and searing – and we’ve been paying it in skyrocketing increments for decades.

1913 federal budget: $715 million.

2013 federal budget: $3.45 trillion.
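A rough sketch of what those two nominal figures imply, treating the span as 100 years and ignoring inflation (an illustrative calculation, not from the original):

```python
# Nominal federal budget figures cited above.
budget_1913 = 715e6    # $715 million
budget_2013 = 3.45e12  # $3.45 trillion
years = 100

multiple = budget_2013 / budget_1913          # overall growth multiple
annual_growth = multiple ** (1 / years) - 1   # implied compound annual rate

print(f"Growth multiple: {multiple:,.0f}x")             # ~4,825x
print(f"Implied nominal annual growth: {annual_growth:.1%}")  # ~8.9% per year
```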

We here in Reality must be at least as vigilant and persistent in reducing government as the Left is at expanding it.

We have a lot of undoing to do.


The Multi-Speed Internet is Getting More Faster Speeds

April 29, 2014, 10:49 AM

The Internet has long had multiple speeds. And it constantly gets faster speeds via technological and commercial innovation, competition, and investment.

The Internet also has long met people’s diverse needs, wants and means for speed, with different technologies, pricing, and content delivery methods, and it will continue to do so.

Net neutrality activists’ latest rhetoric opposing the FCC’s court-required update of its Open Internet rules implies that there haven’t been “slow and fast lanes” on the Internet before. That is factually wrong and misleading, both for consumers receiving content and for entities sending content.

Many in the media have fallen for this mass “fast lane” deception without thinking or questioning it.

First, isn’t it odd that those who routinely complain that the Internet is not fast enough oppose genuine FCC efforts to make the Internet faster?

Moreover, isn’t it ironic that the net neutrality activists — who have long criticized the FCC for the U.S. falling behind the rest of the world in broadband speeds, and long advocated for municipalities to create gigabit fast lanes for some communities — vehemently oppose FCC efforts to create a “faster lane” Internet for those entities that need it and are willing to pay for it?

Do net neutrality activists really want a faster Internet, or do they want a utopian Internet speed limit designed to enforce Internet speed equality by preventing anyone from going faster on the Internet than anyone else?

Second, Internet consumers have long had their choice of multiple Internet speeds.

Different technologies inherently offer different speeds. Fiber is faster than cable, cable faster than DSL and DSL faster than dial-up. Wireless speeds are very different depending on the protocol, LTE, 4G, 3G, 2G, etc., and depending on the amount of spectrum available and the number of people using a given cell at any given time.

Since people naturally have diverse wants, needs and means, the Internet marketplace has long offered consumers different market prices for different Internet access speeds, different amounts of usage, and different discounts for buying more than one service in a bundle.

The Internet also offers consumers different speeds for free-of-charge Internet access, depending on what public or private institution’s free WiFi one wants to use and how many others avail themselves of that free-of-charge Internet access at any given time.

Consumers know there is not one Internet speed. They know there are many speeds depending on whether one wants to pay nothing, something, or varying options of paying more.

Third, Internet content providers also have long had their choice of multiple speeds. Since the late 1990s, there has been a vibrant and diverse marketplace of content delivery networks (CDNs) that content providers of almost all sizes can pay to ensure faster Internet delivery of their content.

This is generally accomplished by locating server farms geographically closer to consumers so that their content can arrive faster and avoid congestion. But CDN competition continues to encourage a wide variety of innovations to ensure faster Internet delivery.

Importantly, the largest corporate users of downstream bandwidth, Netflix and Google-YouTube (which together consume half of North America’s downstream bandwidth, per Sandvine), have long paid others to ensure faster Internet speed and performance for their customers than those customers would otherwise get.

Finally, net neutrality activists’ obsession with imposing an Internet speed limit to guarantee equality of transmission completely ignores that content providers need other technical attributes besides speed to compete and best serve their customer bases over the Internet.

Cloud companies’ customers tolerate almost no downtime, so they may need and want to pay for specialized higher quality-of-service than Internet “best efforts” can deliver. Video streamers and video conference call providers may need guaranteed faster speeds and specialized quality-of-service to prevent jitter or buffering problems.

Gaming companies and Voice-over-Internet (VoIP) providers may need prioritized specialized services to ensure real-time quality-of-service without latency. Health care providers may need specialized services for a variety of very-high bandwidth and usage needs to immediately send and receive massive files like MRIs with no downtime.

In sum, the Internet has long had multiple speeds, faster and slower Internet lanes, to naturally meet the diverse needs, wants and means of those who use the Internet.

Imagine how silly it would sound if some activists said it was unfair and anti-competitive to have faster delivery of mail and packages – that next-day service or priority expedited delivery by the U.S. Postal Service, FedEx, UPS, or DHL should be banned by government because the next innovative company in a garage could not afford it, and that it would thus infringe on that company’s freedom of speech.

The dispute here is not over freedom of speech; it’s over opposing definitions of “free.” Net neutrality activists define “free” here as no-cost use of the Internet, where others define “free” as the freedom to access the legal Internet content of their choice.

Internet speed limits, designed to force a zero-price on all downstream traffic via FCC price regulation, are unnecessary and antithetical to a faster Internet.

[Originally published at Precursor Blog]


Largest World Cities: 2014

April 29, 2014, 9:01 AM

The recently released 10th edition of Demographia World Urban Areas provides estimated population, land area and population density for the 922 identified urban areas with more than 500,000 population. With a total population of 1.92 billion residents, these cities comprise approximately 51 percent of the world’s urban population. The world’s largest cities are increasingly concentrated in Asia, where 56 percent are located. North America ranks second to Asia, with only 14 percent of the largest cities (Figure 1). Only three high-income world cities rank in the top ten (Tokyo, Seoul and New York), and at present growth rates, Tokyo will be the lone high-income representative by the mid-2020s.

Demographia World Urban Areas is the only regularly published compendium of urban population, land area and density data for cities of more than 500,000 population. Moreover, the populations are matched to the urban land areas where sufficient data is available from national census authorities.

The City

The term “city” has two principal meanings. One is the “built-up urban area,” which is the city in its physical form, encompassing virtually all of the land area encircled by rural land or bodies of water. Demographia World Urban Areas reports on cities as built-up urban areas, using the following definition (Note 1).

An urban area is a continuously built up land mass of urban development that is within a labor market (metropolitan area or metropolitan region). As a part of a labor market, an urban area cannot cross customs controlled boundaries unless the virtually free movement of labor is permitted. An urban area contains no rural land (all land in the world is either urban or rural).

The other principal definition is the labor market, or metropolitan area, which is the city as the functional (economic) entity. The metropolitan area includes economically connected rural land outside the built-up urban area (and may include smaller urban areas). The third use, to denote a municipal corporation (such as the city of New York or the city of Toronto), does not correspond to the city as a built-up urban area or metropolitan area. This can – and all too often does – cause confusion among analysts and reporters who sometimes compare municipalities to metropolitan areas or to built-up urban areas.

A Not Particularly Dense Urban World

Much has been made of the fact that more than one-half of humanity lives in urban areas, for the first time in history. Yet much of that urbanization is not of the high densities associated with cities like Dhaka, New York, or even Atlanta.

The half of the world’s urban population not included in Demographia World Urban Areas lives in cities ranging in population from the hundreds to the hundreds of thousands (see: What is a Half-Urban World). In the high income world, residents of large urban areas principally live at relatively low densities, with automobile-oriented suburbanization accounting for much of the urbanization in Western Europe, North America, Japan and Australasia. This point was well illustrated in research by David L. A. Gordon et al. at Queen’s University (Kingston, Ontario), released last year, which concluded that the metropolitan areas of Canada are approximately 80 percent suburban.

Population

There are now 29 megacities, with the addition in the last year of London. London might be thought of as having been a megacity for decades; however, the imposition of its greenbelt forced virtually all growth since 1939 into exurban areas that are not a part of the urban area, keeping its population below the 10 million threshold until this year (Demographia World Urban Areas Table 1).

The largest 10 contain the same cities as last year, though there have been ranking changes. Tokyo, with 37.6 million residents, continues its half-century domination, though its margin over growing developing-world cities, especially Jakarta, is narrowing. Manila became the fifth largest urban area in the world, displacing Shanghai, while Mexico City moved up to 9th, displacing Sao Paulo (Figure 2).

Land Area

Often seen as the epitome of urban density, the urban area of New York continues to cover, by far, the most land area of any city in the world. Its land area of nearly 4,500 square miles (11,600 square kilometers) is one-third larger than Tokyo’s 3,300 square miles (8,500 square kilometers). Los Angeles, which is often thought of as defining low-density territorial expansion, ranks only fifth, following Chicago and Atlanta, with their substantially smaller populations (Figure 3). Perhaps more surprising is the fact that Boston has the sixth largest land area of any city in the world. Boston’s strong downtown (central business district) and relatively dense core can create a misleading perception of high urban density. In fact, Boston’s post-World War II suburbanization is at urban densities little different from those of Atlanta, which is the world’s least dense built-up urban area with more than 3 million population. Now, 29 cities cover land areas of more than 1,000 square miles or 2,500 square kilometers (Demographia World Urban Areas Table 3).

Urban Density

All but two of the 10 densest cities are on the Indian subcontinent. Dhaka continues to lead in density, with 114,000 residents per square mile (44,000 per square kilometer). Hyderabad (Pakistan, not India) ranks a close second. Mumbai and nearby Kalyan (Maharashtra) are the third and fourth densest cities. Hong Kong and Macau are the only cities ranking in the densest ten outside the subcontinent (Figure 4). Despite China’s reputation for high urban densities, its highest ranking city (Hengyang, Hunan) is only 39th (Demographia World Urban Areas Table 4).
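The paired per-mile and per-kilometer densities in the report differ only by the area conversion factor; a minimal conversion sketch, using the Dhaka figure above:

```python
SQ_KM_PER_SQ_MI = 2.58999  # square kilometers per square mile

def to_per_sq_km(density_per_sq_mi: float) -> float:
    """Convert a population density from persons/sq mi to persons/sq km."""
    return density_per_sq_mi / SQ_KM_PER_SQ_MI

# Dhaka: ~114,000 residents per square mile -> ~44,000 per square kilometer
print(f"{to_per_sq_km(114_000):,.0f} per sq km")
```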

Smaller Urban Areas

Demographia World Urban Areas Table 2 includes more than 700 additional cities with fewer than 500,000 residents, mainly in the high income world. Unlike the main listing of urban areas over 500,000 population, the smaller cities are not a representative sample and are shown only for information.

Density by Geography

Demographia World Urban Areas also provides an average built-up urban area density for a number of the geographical areas. Africa and Asia have the highest average city densities, at 18,000 per square mile (7,000 per square kilometer), followed by South America. Europe is in the middle, while North America and Oceania have the lowest average city densities (Figure 5).

Some geographies, however, have much higher average urban densities. Bangladesh is highest, at 86,800 per square mile (33,000 per square kilometer), nearly five times the Asian average. Other geographies above 30,000 per square mile (11,500 per square kilometer) include Pakistan, the Democratic Republic of the Congo, the Philippines, India and Colombia, the only representative from the Western Hemisphere (Demographia World Urban Areas Table 5).

———————————————–

Note 1: Urban areas are also called “population centres” (Canada), “built-up urban areas” (United Kingdom), “urbanized areas” (United States), “unités urbaines” (France) and “urban centres” (Australia). The “urban areas” of New Zealand include rural areas, as do many of the areas designated “urban” in the People’s Republic of China, and, as a result, do not meet the definition of urban areas above.

Note 2: Demographia World Urban Areas is a continuing project. Revisions are made as more accurate satellite photographs and population estimates become available. As a result, the data in Demographia World Urban Areas are not intended for comparison to prior years, but rather represent the latest data based upon the best sources available at publication.

[Originally published at New Geography]


Something Rotten in the State of Kentucky

April 28, 2014, 11:28 AM

Last week the presidential hopes of Senator Rand Paul took a serious blow. The Kentucky House of Representatives allowed a bill to die without a vote that would have permitted candidates to run for more than one elected office at a time. The bill could be revisited in the next legislative session, which begins in January 2015, but the House does not appear eager to pass the bill at all. And even if it did, Senator Paul would already be months behind other Republican contenders for the presidency in starting on the campaign trail.

The ability to run for multiple offices seems like hardly a serious issue. After all, nowhere in the bill does it suggest that, if elected to both offices, an individual would actually serve in both. All it means is that there might need to be a special election, or perhaps an appointment by the governor.

The decision to block Senator Paul’s dual run is pure tribal politics. The Democrat-controlled House is happy to cause problems for a senator who is popular among his constituents and is one of the chief contenders for his party’s presidential nomination. Forcing a choice allows them to score a victory whatever the choice may be. If he chooses to eschew the presidency and run for the Senate, the House will have scuppered a potentially powerful presidential contender. If he chooses to run for the White House, the Democrats can run a candidate for the Senate unburdened by the challenge of incumbency.

There is certainly no lack of precedent for permitting candidates to run for multiple offices, from both sides of the political aisle. In 2000, Joseph Lieberman won his Senate seat in Connecticut while losing his run for vice president alongside Al Gore. Similarly, in 2012, Paul Ryan won his House seat while being defeated beside Mitt Romney.

Forcing the choice makes for bad outcomes. It deprives the public of valuable choices and can seriously cost political movements. Barry Goldwater ran for president in 1964, but could not run for his Senate seat. When he lost the race, he was out of the Senate and the country was deprived of one of its most competent legislators.

The choice to run for president is a monumental one. It demands a huge amount of sacrifice in terms of time, resources, and privacy. Making that choice even harder means the public suffers from a smaller pool of good candidates to choose from.

Categories: On the Blog

Should California Dictate US Energy Policies?

April 28, 2014, 10:42 AM

California loves to be seen as the trendsetter on energy and environmental policies. But can we really afford to adopt their laws and regulations in the rest of America? Heck, can the once Golden State afford them itself? The path to hell is paved with good intentions, counter-productive policies – and hypocrisy.

The official national unemployment rate is stuck at 6.7% – with much higher rates for blacks and Hispanics – and the labor force participation rate remains the lowest in 35 years. Measured by gross national product, our economy is growing at an abysmal 1.5% or even 1.0% annual rate.

Meanwhile, California’s jobless rate is higher than in all but three other states: 8.1% – and with far worse rates as high as 15% for blacks, Hispanics and inland communities. First the good news, then the insanity.

Citigroup’s Energy 2020: North America report estimates that the United States, Canada and Mexico could make North America almost energy independent in six years, simply by tapping their vast recoverable oil and gas reserves. Doing so would help lower energy and consumer prices, insulate the three nations from volatile or blackmailing foreign suppliers, and spur job creation based on reliable, affordable energy, says the U.S. Energy Information Administration.

Driving this revolution are horizontal drilling and hydraulic fracturing. According to Citigroup, IHS Global Insights, the EIA and other analysts, “fracking” technology contributed 2.1 million jobs and $285 billion to the US economy in 2013, while adding $62 billion to local, state and federal treasuries! Compare that to the mandates and subsidies required for expensive, unreliable, job-killing wind, solar and biofuel energy.

Fracking also slashed America’s oil imports from 60% of its total petroleum needs in 2005 to just 28% in 2013. It slashed our import bill by some $100 billion annually.

By 2020 the government share of this boom is expected to rise to $111 billion. By 2035, U.S. oil and natural gas operations could inject over $5 trillion in cumulative capital expenditures into the economy, while contributing $300 billion a year to GDP and generating over $2.5 trillion in cumulative additional government revenues.

A Yale University study calculates that the drop in natural gas prices (from $8 per thousand cubic feet or million Btu in 2008, and much more on the spot market, to $4.00 or so now) is saving businesses and families over $125 billion a year in the cost of heating, electricity and raw material feed stocks.

The only thing standing in the way of a US employment boom and economic and industrial renaissance, says Citigroup, is politics: continued or even more oppressive anti-hydrocarbon policies and regulations.

Here’s the insanity. Fully 96% of this nation’s oil and gas production increase took place on state and private lands. Production fell significantly on federal lands under President Obama’s watch, with the Interior Department leasing only 2% of federal offshore lands and 6% of its onshore domain for petroleum, then slow-walking drilling permits, according to the Institute for Energy Research.

The President continues to stall on the Keystone pipeline, while threatening layers of expensive carbon dioxide and other regulations, to prevent what he insists is “dangerous manmade climate change.” His EPA just adopted California’s expensive all-pain-no-gain rules for sulfur in gasoline, and the Administration and environmentalists constantly look to the West Coast for policy guidance.

Governor Jerry Brown says 30 million vehicles in California translate into “a lot of oil” and “the time for no more oil drilling” will be when its residents “can get around without using any gasoline.” However, that rational message has not reached the state’s legislators, environmental activists or urban elites.

California’s oil production represents just 38% of its needs – and is falling steadily, even though the state has enormous onshore and offshore deposits, accessible via conventional and hydraulic fracturing technologies. The state imports 12% of its oil from Alaska and another 50% from foreign nations, much of it from Canada, notes Sacramento-area energy consultant Tom Tanton.

Of course, California’s ruling elites are also opposed to drilling and fracking – and leading Democrats are campaigning hard to impose a temporary or permanent ban, on the ludicrous grounds that fracking causes birth defects, groundwater contamination and even earthquakes.

Its record is far worse when it comes to electricity. The Do-As-I-Say state imports about 29% of its total electricity from out of state: via the Palo Verde nuclear power plant in Phoenix, coal-fired generators in the Four Corners area, and hydroelectric dams in the Southwest and Pacific Northwest, Tanton explains.

Another 50% of its electricity is generated using natural gas that is likewise produced outside California. The Greener-Than-Thou State relies heavily on gas imported via pipelines from Canada, the Rockies and the American Southwest to power its gas-fired turbines. Those turbines and out-of-state sources also back up its forests of unreliable bird-killing wind turbines.

That’s certainly one way to preen and strut about your environmental consciousness. Leach off your neighbors, and let them do the hard work and emit the CO2.

These foreign fuels power the state’s profitable and liberal Silicon Valley and entertainment industries – as well as the heavily subsidized electric and hybrid vehicles that wealthy elites so love for their pseudo-ecological benefits, $7,500 tax credits, and automatic entry into fast-moving HOV lanes.

Meanwhile, California’s poor white, black, Hispanic and other families get to pay $4.23 per gallon for regular gasoline, the second highest price in America – and 16.2 cents per kWh for residential electricity, double that in most states, and behind only New York, New England, Alaska and Hawaii.

However, the state’s eco-centric ruling classes are not yet satisfied. Having already hammered large industrial facilities with costly carbon dioxide cap-and-trade regulations, thereby driving more jobs out of the state, on January 1, 2015 they will impose cap-and-trade rules on gasoline and diesel fuels. That will instantly add more than 12 cents per gallon, with the price escalating over the coming years.

Regulators are also ginning up tough new “low-carbon fuel standards,” requiring that California’s transportation fuels reduce their “carbon intensity” or “life-cycle” CO2 emissions by 10% below 2010 levels. This will be accomplished by forcing refiners and retailers to provide more corn-based ethanol, biodiesel and still-nonexistent cellulosic biofuel.

These fuels are much more expensive than even cap-tax-and-traded gasoline – which means the poor families that liberals care so deeply about will be forced to pay still more to drive their cars and trucks.

In fact, Charles River Associates estimates that the LCFS will raise the cost of gasoline and diesel by up to 170% (!) over the next ten years, on top of all the other price hikes.

In the meantime, China, India, Brazil, Indonesia, Germany and a hundred other countries are burning more coal, driving more cars and emitting vastly more carbon dioxide. So the alleged benefits to atmospheric CO2 levels are illusory, fabricated and fraudulent.

Of course, commuters who cannot afford these soaring prices can always park their cars and add a few hours to their daily treks, by taking multiple buses to work, school and other activities.

There’s more, naturally. A lot more. But I’m out of space and floundering amid all the lunacy.

Can we really afford to inflict California’s insane policies on the rest of America? In fact, how long can the Left Coast afford to let its ruling classes inflict those policies on its own citizens?

 

[Originally published at Townhall.com]

Categories: On the Blog

Time Magazine Gets it Wrong on the Suburbs

April 28, 2014, 10:34 AM

Time Magazine’s Sam Frizell imagines that the American Dream has changed, in an article entitled “The New American Dream is Living in a City, Not Owning a House in the Suburbs.” Frizell further imagines that “Americans are abandoning their white-picket fences, two-car garages, and neighborhood cookouts in favor of a penthouse view downtown and shorter walk to work.” The available population data shows no such trend.

Frizell’s evidence is the weak showing in single family house building permits last month and a stronger showing in multi-family construction.

This is just the latest in the “flocking to the city” mantra that is routinely mouthed without any actual evidence (see: Flocking Elsewhere: The Downtown Growth Story). The latest Census Bureau estimates show that net domestic migration continues to be negative in the core counties (which include the core cities) of the major metropolitan areas (those with more than 1,000,000 residents). The county level is the lowest geographical level for which data is available.

At the same time, there is net domestic inward migration to the suburban counties. Moreover, much of the net domestic migration to metropolitan areas has been to the South and Mountain West, where core cities typically include considerable development that is suburban in nature (such as in Austin, Houston and Phoenix). As the tepid “recovery” has proceeded, net domestic migration to suburban counties has been strengthened (see: Special Report: 2013 Metropolitan Area Population Estimates), as is indicated in the Figure.

There is no question but that core cities are doing better than before. It helps that core city crime is down and that the South Bronx doesn’t look like Berlin in 1945 anymore. For decades, many inclined toward a more urban core lifestyle were deterred by environments that were unsafe, to say the least. A principal driving force of this has been millennials in urban core areas. Yet, even this phenomenon is subject to over-hype. Two-thirds of people between the ages of 20 and 30 live in the suburbs, not the core cities, according to American Community Survey data.

To his credit, Frizell notes that the spurt in multi-family construction is “not aspirational,” citing the role of the Great Recession in making it more difficult for people to buy houses. As I pointed out in No Fundamental Shift to Transit: Not Even a Shift, 2013 was the sixth year in a row that total employment, as reported by the Bureau of Labor Statistics, was below the peak year of 2007. This is an ignominious development seen only once before in the last 100 years (during the Great Depression).

In short, urban cores are in recovery. But that does not mean (or require) that suburbs are in decline.

[Originally published at New Geography]

Categories: On the Blog

The Other Side of the Global Warming Story

April 28, 2014, 10:19 AM

[This article was originally published in the Alumni magazine of the Hotchkiss School] Many readers will surely agree with me that Hotchkiss launched us into successful and fulfilling careers; mine, as a scientist, physicist and engineer, continues actively 60 years past graduation. Among my specialties has been the study of climate – present, past and future – ever since the 1970s, when the popular press was warning of an advancing ice age, and on into the 90s, when global warming took over on the strength of unsubstantiated anecdotal evidence and mathematical models. Those models are unable to calculate past temperatures even when all the variables are known, or the future, whether a year or 10 down the road. Yet they continue to predict doom a century from now if mankind does not alter its proclivity to lighten man’s burden with inexpensive fossil fuel.

The pages of this amazing magazine have promoted this unwarranted fear and unmitigated arrogance as to man’s impact on his climate. Happily they have offered me the use of 750 words to try and set the record straight or at least level the playing field with some irrefutable scientific facts.

1 – While temperatures have fluctuated over the past 5000 years, today’s earth temperature is below the average for these past 5000 years.

2 – Temperature fluctuations during the current 300 year recovery from the Little Ice Age, which ended around the time Washington’s soldiers were freezing at Valley Forge, correlate almost perfectly with our sun’s changes in activity level.

3- The National Aeronautics and Space Administration (NASA) has determined that while the Earth was warming in the past century, so were Mars, Pluto, Jupiter and the largest moon of Neptune.

4- We know that 200 million years ago when the dinosaurs walked the Earth, average Carbon Dioxide concentration in the atmosphere was 1800 parts per million, more than four times higher than today.

5- If greenhouse gases were responsible for increases in global temperature, then atmospheric physics shows that higher levels of our atmosphere would warm more than lower levels. This was not found to be true during the 1978 to 1998 period of 0.3 degrees Celsius warming.

6- 900,000 years of ice core temperature and carbon dioxide records show that CO2 increases follow rather than lead increases in Earth’s temperature. This is logical because the oceans are the primary source of CO2, and they hold more CO2 when cool than when warm – so warming causes the oceans to release more CO2.

7- The effect of additional CO2 in the atmosphere is limited because it only absorbs certain wave lengths of radiant energy.  As the radiation in that particular wave length band is used up, the amount left for absorption by more of the gas is reduced.
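The diminishing-returns claim in item 7 is commonly quantified with the standard logarithmic forcing approximation, dF = 5.35 ln(C/C0) W/m² (the Myhre et al. fit). This formula is not from the original essay; a minimal sketch under that assumption:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W/m^2) from raising CO2
    concentration from c0_ppm to c_ppm, using the standard
    logarithmic fit dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Because the response is logarithmic, each successive doubling
# adds the same ~3.7 W/m^2, so the marginal effect per added ppm
# shrinks as the concentration rises.
first_doubling = co2_forcing(560) - co2_forcing(280)
second_doubling = co2_forcing(1120) - co2_forcing(560)
```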

8- While we hear much about one or another melting glacier, a recent study of 246 glaciers around the world indicated a balance among those that are losing ice, gaining ice, and remaining in equilibrium.

9- It is amusing that the polar bear has become the symbol of global warming while its North American population has increased from 5,000 in 1960 to more than 25,000 today.

Although the court of public opinion already weighs climate change as a very low economic priority, the media continue to uncritically accept and vigorously promote shrill global warming alarmism. The United States government budgets $6 billion a year for climate research, supporting a growing industry of scientists and university labs that specialize in the subject. It all adds up to a significant institutionalization of the impulse to treat carbon as a problem.

Climate change is not a scientific problem that found political support. It is about eco-activists and politicians who found a scientific issue they feel can leverage them into power and control. The environment is a great way to advance a political agenda that favors central planning and an intrusive government. What better way to control someone’s property than to subordinate private property rights to environmental concerns?

While the most extreme environmental zealots may be relatively few in number, they have managed to gain undue influence by exploiting the gullibility of many ordinary and scientifically untrained people, willing to believe that the planet needs saving from man’s excesses.  Perhaps it is a psychological throwback to those earlier civilizations that offered human sacrifices to the gods, to assuage their sins and spare them from punishment in the form of drought, flood, famine or disease. There are certainly many parallels between modern environmentalism and religion.

Categories: On the Blog

Response to a Critic of Climate Realists and the NIPCC Reports

April 28, 2014, 12:02 AM

Keith Peterman is a professor at York College and the proprietor of the Global Hot Topic blog. Judging from the photo on his college website, he seems like a fun and inspiring teacher.

The other day, Peterman decided to weigh in against letter-writer Missy Updyke at the York Daily News. Updyke committed a global warming party foul by citing the work of the Nongovernmental International Panel on Climate Change (NIPCC) to counter the idea that the United Nations IPCC deserves to be treated as Moses holding stone tablets when it comes to what is happening to the climate.

It doesn’t, and kudos to Updyke for citing the definitive scientific organization out there countering the politicized science of the IPCC.

As is typical with the MSM, climate-alarmist Peterman was given a lot more words to counter climate-realist Updyke’s letter (801 words for him; 330 for her). You can read Peterman’s letter here. And below is the rebuttal we posted beneath his very long counter “argument”:

This essay contains a nice catalogue of myths, making it a useful vehicle for setting the record straight.

* Unlike the bogus “97 percent agree” meme, the latest serious and scientific examination of the views of climate scientists shows no “consensus” that human activity is causing catastrophic, runaway global warming.

* It is unfortunate that Peterman did not take better advantage of his attendance at ICCC-3. We have 18 presentations on record which included examinations of the actual scientific data. In fact, the eight conferences (with a ninth slated for July) feature hundreds of such presentations. Many side sessions don’t include standing ovations, but probably deserve them.

* None of the NIPCC reports – ZERO – have been funded with corporate money. They are funded by family foundations that have no interest in the energy sector. The NIPCC reports are rigorous, drawing from the peer-reviewed literature. They then go through another peer-review process before being published. There are now some 4,000 pages in the Climate Change Reconsidered series, with tens of thousands of citations to the peer-reviewed literature. Judge it for yourself.

* Ahh … the “Koch Brothers” meme — so precious to so many on the left and those who buy into the debating tactic of demonization. The Heartland Institute is not funded by the Koch Brothers. The Charles G. Koch Foundation donated $25,000 in 2012 to support what Heartland was already doing on health policy. That was the first Koch-connected donation to Heartland in a decade, and is completely irrelevant to this discussion. It’s a shame Peterman spent so many precious words on that false point.

Besides, Heartland has policies in place that prevent funders from interfering with our research — just as the science journal “Nature,” the National Academy of Sciences, and other entities do. Those entities accept great sums of money from the fossil fuel industry. Does Peterman question their reports? The scientists who participate in the IPCC reports also receive BILLIONS of dollars from governments around the world. Heartland getting $25,000 from the “Koch Brothers” is irrelevant in this debate, but those billions might not be.

* Also: The UN, the sponsor of both the WMO and UNEP, is quintessentially a political body best known for its corruption, not its scientific rigor. We thank Missy Updyke for hitting some of those points.

* Heartland is PROUD to be able to pay scientists to help write and review its reports — the same way “Nature,” the NAS, and IPCC pay professionals to compile its reports. If you don’t have scientists on your staff, how can you review and publish scientific research?

* If you want some balance from the politicized “science” of the IPCC reports — information that adheres to the scientific method of skepticism, and doesn’t put its conclusions before the observations — visit Climate Change Reconsidered.

* We also invite Peterman and anyone else interested in learning what the data – not computer models, but actual data – says about our climate to attend the 9th International Conference on Climate Change. The scientists who attend and present are open to all challenges, from Peterman and anyone else. If Peterman contacts me, I’ll waive the very modest attendance fee.

Y’all should come to Vegas for Heartland’s climate conference, too. I hear there’s lots to do there.

Categories: On the Blog

The Least Efficient Part of Government

April 27, 2014, 12:54 PM

Spectrum management is the least efficient part of the federal government.

That’s a big national problem because radio spectrum is the essential fuel of the mobile revolution of smart-phones, tablets, video streaming and the Internet of things.

The worst resource management in the federal government unnecessarily burdens one of the most modern, dynamic and innovative parts of the American economy – mobile.

This is not a partisan issue; it’s a good-government issue that badly needs fixing – fast. The status quo is indefensible.

Fortunately, Congress is listening.

The House Energy and Commerce Committee has asked all interested parties for input into how to best modernize U.S. spectrum policy as part of the committee’s broader legislative effort to update the Communications Act next Congress. Comments are due this week.

The inability of the federal government’s spectrum system to make available sufficient federal spectrum for auction, just to keep pace with skyrocketing mobile usage, threatens to become an unnecessary growth cap on the increasingly mobile-driven 21st century economy.

How did the government’s spectrum management system become the anachronism it is today?

It was designed for a bygone era – the analog 1900s, not the mobile broadband 21st century.

The obsolete 1992 law in question simply codified a 1978 executive order that established the National Telecommunications and Information Administration (NTIA), which only partially covered spectrum.

To put that in perspective, that means the federal government’s basic spectrum management approach predates the first commercialization of cell phones in the U.S. in 1982 – by four years.

At that time almost all radio spectrum was used or controlled by government agencies, with very little spectrum used directly by consumers or the private sector. That means spectrum management was, and remains largely today, organized for the convenience of government agencies – not for the benefit of American consumers or the private sector.

The year after that 1992 law fossilized spectrum policy in the analog 1900s, Congress passed a provision in the 1993 Reconciliation Act that authorized spectrum auctions for about 120 MHz of spectrum for personal communications services. For perspective, that auction commercialized 4 percent of the most commercially valuable wireless spectrum between 400 MHz and 3 GHz.

That PCS auction catapulted wireless competition, consumer use, and growth forward. How much? The number of American wireless connections exploded nearly 3,000 percent – from 11 million then to 335 million today.
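The subscriber-growth figure is straightforward percentage arithmetic; a quick sketch (function name is mine) shows that 11 million to 335 million connections works out to roughly a 30-fold rise, i.e. in the neighborhood of 2,900–3,000 percent growth:

```python
def percent_growth(start, end):
    """Percentage increase from a starting value to an ending value."""
    return (end - start) / start * 100.0

# 11 million -> 335 million wireless connections
growth = percent_growth(11, 335)   # ~2,945 percent, about a 30x rise
```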

Currently only about 15 percent of the nation’s choice spectrum suitable for commercial broadband is available to the private sector.

Unfortunately in the past few years, the government agencies that control the lion’s share of choice spectrum have largely refused to clear more spectrum for private auction. Their offer is to let the private sector “share” some of their spectrum scraps as second class spectrum citizens.

So why is spectrum management the least efficient part of government?

First, the system now can take 9-12 years to make spectrum available for auction. Contrast that with the speed of change in the private sector, where in less than seven years after the invention of the smartphone, about 200 million Americans use one of these new spectrum-hungry devices.

Second, there is no national policy that recognizes spectrum is a scarce valuable resource that must be put to its highest and best use for the nation.

Third, in the current ad hoc spectrum system, literally no one has the authority to ensure spectrum is efficiently and effectively utilized by the government.

Fourth, America’s spectrum system scandalously has none of the normal government accountability processes – i.e., decision-making responsibility, budget, accounting, audit, etc. – that are required of every other valuable government resource, such as land, buildings, vehicles, personnel, money, telecom services and oil.

Finally, it is obvious that no one is minding the government’s spectrum store because no one in government is requiring government agencies to share spectrum better among themselves.

The current claim of a “transformative” spectrum policy that promotes government-private sector sharing of spectrum is less a policy improvement and more akin to a policy surrender to unaccountable government agencies that are unwilling to use their spectrum more efficiently so Americans may benefit.

America currently has a scandalously dysfunctional spectrum system that remains organized for the convenience of government agencies’ 1900s needs, not the 21st century needs of consumers, taxpayers and businesses, who need the spectrum as much or more than the government agencies do.

The bipartisan solution is simple – good government.

Like any other trillion dollar resource, someone must be responsible and accountable so that it is put to its highest and best use in both government and in the private sector, and also so that it is not vulnerable to waste, fraud and abuse like it is now.

Congress needs to update U.S. spectrum law to protect our nation’s mobile communications future. Without more spectrum available to the private sector, the mobile revolution of smartphones, tablets, video streaming and the Internet of things is at increasing risk of hitting an unnecessary wall in long-term mobile growth.

Congress must legislate to ensure that someone minds the nation’s trillion-dollar spectrum store.

 

[Originally published at Daily Caller]

Categories: On the Blog

It’s Groundhog Day: Government Taking Third Stab at Net Neutrality Power Grab

April 26, 2014, 12:40 PM

This is at once obnoxious and pathetic.

Wheeler Tees Up Net Neutrality for May Meeting

FCC Chairman Tom Wheeler said Wednesday that he would be circulating a draft of the FCC’s new network neutrality rules to the other commissioners Thursday (April 24).

This is their third attempt at this particular power grab. It’s becoming fetishistic.

There was 2007.

Court Backs Comcast Over FCC on ‘Net Neutrality’

And there was 2010.

Verizon Wins Net Neutrality Court Ruling Against FCC

Yet here we remain – stuck in government overreach Groundhog Day.

We haven’t yet seen the Net Neutrality power grab order – but the fact that they’re trying again at all is at once obnoxious and pathetic.

Not yet having seen the order hasn’t stopped the Left from going apoplectic.  Because the Left never allows the facts to get in the way of a good beating.

The FCC Tosses Net Neutrality Out the Window

So This is How Net Neutrality Dies, Under a Democratic President

Thanks to the FCC Net Neutrality is Dead

Obama’s Second FCC Chairman Fails on Net Neutrality

Stop the FCC from Breaking the Internet

Not at all over the top.

To all of which I say:

It’s disappointing – but not surprising – that the Obama Administration is yet again going to impose the Internet power grab that is Network Neutrality.

The federal government is twice bitten – and not shy.  Having twice had their Net Neutrality impositions unanimously rejected by the courts – they remain undaunted in their desire to over-regulate the Web.

And it’s always a pleasure to watch the Leftist likes of Free Press and Public Knowledge preemptively freak out – over an order they haven’t even yet seen.

They are, as always, the perfect tool – the Leftist screeching visual aids that allow an overreaching government to pretend “See, we’re finding middle ground – both sides are angry with us” as it grabs even more Leviathan-sized handfuls of the once-private sector.

Don’t drive angry.

[Originally published at PJ Tatler]

Categories: On the Blog

Common Core and Communism

April 26, 2014, 12:28 PM

“A lie told often enough becomes the truth,” said Vladimir Lenin who led the revolution that imposed Communism on Russia.

When he wrote ‘Mein Kampf,’ Adolf Hitler said, “whoever has the youth has the future.” In his vision for the Nazi Party, education would be the key to ensuring that he had ‘the youth’ of Germany fully indoctrinated.

All dictators and authoritarian regimes know that what is taught in their schools offers the greatest opportunity to maintain control over their societies.

That is what has been occurring since the introduction of the Common Core standards that the Obama regime has imposed on our national education system and the good news is that protests against it from concerned parents and others are beginning to increase and gain momentum.

Teachers will tell you that “one size fits all” does not apply in the classroom and never has. Children learn at different paces, with some progressing rapidly while others need extra help and attention. Learning that is entirely dependent on ceaseless testing puts stress on every child, and that is the most common complaint about Common Core.

Education in America has been in decline since the 1960s when the teachers unions gained control over the process, putting themselves between the local boards of education and parents. Would it surprise anyone to learn that the Department of Education was established by President Jimmy Carter who signed it into law in 1979? It began operation on May 4, 1980. You will find no reference, no mention of education in the U.S. Constitution and it should not be a function of the federal government.

In a recent commentary by Joy Pullmann in The Daily Caller, she said, “The latest scheme is the field testing of Common Core assessments. This spring more than four million kids will be required to spend hours on tests that have little connection to what they learned in class this year and will provide their teachers and schools no information about what the kids know.”

“Parents who object to this scheme,” said Pullmann, “face bullying and harassment from public officials. From New York to Denver to California, some schools are responding by forcing kids who opt out to sit at their desks and do nothing during the several-hour tests. Normal people call that a ‘time out’ and it is a punishment.”

Wyoming has become the first State to block a new set of national science standards that address climate change. In Michigan last year a group of protesters stopped the State from adopting the science standards.

Here are some excerpts of what the science standards teach as “The Essential Principles of Climate Science.”

“The impacts of climate change may affect the security of nations. Reduced availability of water, food, and land can lead to competition and conflict among humans, potentially resulting in large groups of climate refugees.

“Humans may be able to mitigate climate change or lessen its severity by reducing greenhouse gas concentrations through processes that move carbon out of the atmosphere or reduce greenhouse gas emissions.

“The most immediate strategy is conservation of oil, gas, and coal, which we rely on as fuels for most of our transportation, heating, cooling, agriculture, and electricity. Short-term strategies involve switching from carbon-intensive to renewable energy sources, which also requires building new infrastructure for alternative energy sources.”

From “A Framework for K-12 Science Education” children are to be taught that “If Earth’s global mean temperature continues to rise, the lives of humans and other organisms will be affected in many different ways.” Only the Earth’s mean temperature is not rising! The planet is in a natural cooling cycle that is now seventeen years old, meaning that none of the students in today’s schools have ever experienced a single day of “global warming.”

By the end of grade 8, the Framework teaches that “Human activities have significantly altered the biosphere, sometimes damaging or destroying natural habitats and causing the extinction of many other species.” This, too, is untrue. The U.S. Endangered Species Act, despite listing thousands of species, has not officially “saved” more than a handful at best, and even this assertion is questionable.

By the end of grade 12, students are expected to believe that “Changes in the atmosphere due to human activity have increased carbon dioxide concentrations and thus affect climate.” While it is true that there has been an increase in the amount of carbon dioxide, this is a good thing because it is an essential factor in the growth of all vegetation, including food crops and healthier forests. Moreover, this increase does not play any role in the Earth’s climate.

The central theme of these “science standards” is to teach that we should be reducing our use of fossil fuels, the primary energy sources our nation and the world require. What these standards do in reality is repeat and reinforce the federal government’s laws and regulations, justifying its CO2 emissions rules with the current method of computing the Social Cost of Carbon, but these “costs” are pure fiction.

None of the computer models that have predicted global warming over the past four decades have been accurate. None are capable of representing the state of the Earth’s vastly complex climate.

The sooner Common Core is removed from the nation’s education system, the better.

 

[Originally published at Warning Signs]

Categories: On the Blog

The FCC Disincentive Auction

April 25, 2014, 1:28 PM

The Federal Communications Commission’s upcoming “incentive” auction of TV airwaves is already at war with itself.

Somehow the FCC imagines it can maximize the revenue necessary to incent TV broadcasters to sell their 600 MHz spectrum by minimizing actual revenue collection via dis-incenting, and even banning, some wireless companies’ bids.

Old habits die hard.

Apparently, in its zeal to solve a wish-list of secondary wireless goals, in one fell auction swoop the FCC has lost sight of first principles – the need to economically align all participants’ market incentives in order to maximize auction revenues.

It is obvious the FCC has never attempted such a complex two-sided auction, with sellers in a “reverse auction” and buyers in a traditional “forward auction.” If it had, it would give more than lip service to “economics” and focus much more on economically aligning the market incentives of both sellers and buyers.

If this were really to be a market “incentive” auction of spectrum, the collective incentive for broadcasters to sell their spectrum would be matched by the collective incentive of the wireless industry to bid for the broadcasters’ licenses. It is not.

The FCC has created Rube Goldberg auction rules to mask that the FCC will effectively ban the two largest wireless providers, Verizon and AT&T, from bidding and winning badly needed spectrum in roughly two-thirds of the country.

This is a big problem for the potential success of the incentive auction and most Americans because Verizon and AT&T have the highest network utilization and hence the highest real consumer-demand for this spectrum. They also have the most financial wherewithal to bid.

Any entry-level economics student understands that if most demand is eliminated from a market by banning the biggest bidders, potential prices will plummet. Predictably, broadcasters will suffer from earning a fraction of what a real market incentive auction would generate.

And the FCC-chosen wireless beneficiaries permitted to bid will enjoy the bargain of a lifetime – the best spectrum possible at a multi-billion dollar discount, unwittingly paid for by the American taxpayer.

Since the FCC does have excellent economists, the only explanation for the FCC’s obvious misalignment of economic incentives is that the FCC is not trying to maximize revenues at all.

It is trying to maximize its regulatory power to pick market winners and losers and to tilt the competitive playing field by politically reallocating the foregone, multi-billion-dollar market-surplus largely to Sprint and T-Mobile.

Another big way this auction is at war with itself is that it is proposing to act anti-competitively to ostensibly promote competition.

Just over two years ago the Department of Justice and FCC blocked the AT&T/T-Mobile merger because they determined it would reduce a four-competitor market to three competitors because Verizon, AT&T, Sprint, and T-Mobile comprised over 90 percent of the U.S. wireless market.

Using the government’s own recent wireless market definition, the DOJ and FCC plan to reduce the effective bidding competition for this 600 MHz spectrum auction market from four existing national competitors effectively to two.

Ironically the DOJ and FCC are creating and blessing a de facto, “wireless duopoly” of auction bidders for low-band spectrum – Sprint and T-Mobile.

A large reason that the DOJ and FCC ostensibly argued for a four-competitor market is that they believe a market of just two or three players creates a much higher opportunity and incentive to collude and carve up markets.

So on what basis do the DOJ and FCC believe that their artificially-created wireless bidding duopoly, Sprint and T-Mobile, which are widely-reported to be in behind-the-scenes talks to merge, would not have substantial opportunity and “incentive” to collude to win spectrum at the lowest possible price?

This DOJ-sanctioned auction bidding duopoly is in stark contrast to standard DOJ policy and precedent that normally opposes bid-rigging of government auctions.

In prosecuting a company that attempted a scheme to rig bids on military contracts, then-DOJ Antitrust Chief Thomas O. Barnett said, “The antitrust division is committed to protecting the competitive market for Americans. We will continue to bring to justice those who rig bids and thereby deprive the public of the benefits afforded by a competitive bidding process.”

How does DOJ square the circle that it’s ok for government agencies to do what is illegal for companies to do? How do their purported pro-competitive ends justify their anti-competitive means?

The problem here is doubly simple – unfairness.

First, it is unfair to shortchange the American taxpayer of many billions of dollars of deficit reduction so the FCC effectively can steer unauthorized, multi-billion dollar, implicit-bidding subsidies to Sprint and T-Mobile, who don’t even need them to bid in this auction. Sprint is owned by the wealthiest person in Japan and T-Mobile is one-third owned by the German government.

Second, without due process, the DOJ and FCC are unfairly punishing Verizon and AT&T financially and competitively for having done nothing wrong under the law. It’s not their fault Sprint and T-Mobile chose not to bid in the last “low-band” auction for 700 MHz spectrum.

This new contrived bifurcation of the spectrum market into lower-than, and higher-than, 1 GHz segments specifically for the purposes of this 600 MHz auction arbitrarily and capriciously punishes only two companies for legal competitive success without the due process of a judicial proceeding.

Hopefully, the FCC will reconsider these uneconomic and unfair auction rules.

[Originally published at Daily Caller]


The Failure of Oregon’s Welfare Program

April 25, 2014, 1:05 PM

Oregon just became another example of how excessive hand-outs are hurting needy families. A recent audit released by the Oregon Secretary of State’s Office found the state made “little to no progress” in moving clients off welfare and toward self-sufficiency. The 2011 Legislature decided to cut funding for programs that helped parents move toward work and job training. This left case managers with twice as many clients and very little time to help them.

Although Oregon officials are quick to blame a slow recovery from the recession, the state’s own recovery is significantly slower than the nation’s. In order to expand Temporary Assistance for Needy Families (TANF) and offer a lower threshold to receive checks, Oregon decided to cut the programs that would actually help people out of poverty. Oregon TANF clients now rank among the worst in finding and retaining jobs. The handouts were not enough to help the needy families, and with the cuts to work-related support programs, clients had few options. The legislature’s actions actually forced organizations that promoted self-sufficiency to close—a clear step in the wrong direction.

Oregon’s blunder shows why we must promote work-related support for the needy. States across the country are using One-Stop Career Centers to reduce unemployment and welfare dependency. These centers offer programs on how to gain computer skills (using Excel, LinkedIn, etc.), draft an effective resume, and prepare for interviews. Many also offer extensive job search tools and ways to purchase or borrow work-appropriate clothing. Programs like this get people back into the workforce. They give the unemployed the skills they need to find and keep a job, something monthly checks are unable to do. Most importantly, getting people back into the workforce will reduce unemployment and welfare costs for the state.

Some of the most effective programs include non-profits that establish themselves in underserved communities. Connections to Success has set up offices throughout Kansas City and St. Louis that cater to the specific needs of those in the community. They understand what their clients need and work with them to become self-sufficient—exactly what Oregon is working against. The organization works to break the cycle of poverty rather than perpetuate it. Its clients tout exceptional success stories that could have never been reached with a monthly check and no personal support.

The audit of Oregon’s TANF program actually recommends that the state start working with non-profits (like Connections to Success) to begin fixing the problem. The state’s extension and expansion of handouts has severely hurt those it was meant to help. Instead, Oregon must start following the lead of other states and begin prioritizing work-related support over cash assistance. By doing so, the state will be able to see long-term, permanent success.


The 2014 State of Wind Energy: Desperately Seeking Subsidies

April 25, 2014, 12:23 PM

With the growing story coming out of Ukraine, the ongoing search for the missing Malaysian jet, the intensifying Nevada cattle battle, and the new announcement about the additional Keystone pipeline delay, little attention is being paid to the Production Tax Credit (PTC) for wind energy—or any of the other fifty lapsed tax breaks the Senate Finance Committee approved earlier this month. But, despite the low news profile, the gears of government continue to grind up taxpayer dollars.

The Expiring Provisions Improvement Reform and Efficiency Act (EXPIRE) did not originally include the PTC; however, prior to the committee markup hearing on April 3, Senators Charles Grassley (R-IA), Michael Bennet (D-CO), and Maria Cantwell (D-WA) pushed for an amendment to add a two-year PTC extension. The tax extender package passed out of committee and has been sent to the Senate floor for debate. There, its future is uncertain.

“If the bill becomes law,” reports the Energy Collective, “it will allow wind energy developers to qualify for tax credits if they begin construction by the end of 2015.” The American Wind Energy Association’s (AWEA) website calls on Congress to: “act quickly to retroactively extend the PTC.”

The PTC is often the deciding factor in determining whether or not to build a wind farm. According to Bloomberg, wind power advocates fear: “Without the restoration of the subsidies, worth $23 per megawatt hour to turbine owners, the industry might not recover, and the U.S. may lose ground in its race to reduce dependence on fossil fuels driving global warming.” The National Renewable Energy Laboratory released a report earlier this month affirming the importance of the subsidies to the wind industry. It showed that the PTC has been critical to the development of the U.S. wind power industry. The report also found: PTC “extension options that would ramp down by the end of 2022 appear to be insufficient to support recent levels of deployment. …extending the production tax credit at its historical level could provide the best opportunity to sustain strong U.S. wind energy installation and domestic manufacturing.”

The PTC was originally part of the Energy Policy Act of 1992. It has expired many times—most recently at the close of 2013. The last-minute 2012 extension, as part of the American Taxpayer Relief Act, included an eligibility criteria adjustment that allows projects that began construction in 2013, and maintain construction through as long as 2016, to qualify for the ten-year tax credit designed to establish a production incentive. Previously, projects would have had to be producing electricity at the time the PTC expired to qualify.
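To give a rough sense of what the ten-year, $23-per-megawatt-hour credit is worth to a single project, here is a back-of-the-envelope sketch. The credit rate and ten-year term come from the figures cited above; the 100 MW nameplate capacity and 35 percent capacity factor are hypothetical assumptions chosen only for the arithmetic, not figures from the article.

```python
# Back-of-the-envelope value of the PTC to a hypothetical wind project.
# $23/MWh rate and ten-year term: cited in the text above.
# 100 MW nameplate and 35% capacity factor: illustrative assumptions.

PTC_PER_MWH = 23.0   # dollars per megawatt-hour of production
CREDIT_YEARS = 10    # the credit pays out for ten years of production

def annual_ptc_value(nameplate_mw, capacity_factor):
    """Annual PTC value, in dollars, for a wind farm of the given size."""
    hours_per_year = 8760
    mwh_per_year = nameplate_mw * capacity_factor * hours_per_year
    return mwh_per_year * PTC_PER_MWH

farm_annual = annual_ptc_value(100, 0.35)  # hypothetical 100 MW farm
print(f"Annual PTC value: ${farm_annual:,.0f}")
print(f"Ten-year (undiscounted) value: ${farm_annual * CREDIT_YEARS:,.0f}")
```

On these assumptions the credit is worth about $7 million a year, or roughly $70 million over the ten-year term, which helps explain why the subsidy is "often the deciding factor" in whether a wind farm gets built.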

Thomas Pyle, the president of the American Energy Alliance, which represents the interests of oil, coal, and natural gas companies, called the 2013 expiration of the wind PTC “a victory for taxpayers.” He explained: “The notion that the wind industry is an infant that needs the PTC to get on its feet is simply not true. The PTC has overstayed its welcome and any attempt to extend it would do a great disservice to the American people.”

As recently as 2006-2007, “the wind PTC had no natural enemies,” states a new report on the PTC’s future. The Declining Appetite for the Wind PTC report points to the assumption that “all extenders are extended eventually, and that enacting the extension is purely a matter of routine, in which gridlock on unrelated topics is the only source of uncertainty and delay.” The report then concludes: “That has been a correct view in past years.”

 

The report predicts that the PTC will follow “the same political trajectory as the ethanol mandate and the ethanol blenders’ tax credit before it.” The mandate remains—albeit in a slightly weakened state—and the tax credit is gone: “ethanol no longer needed the blenders’ tax credit because it had the strong support of a mandate (an implicit subsidy) behind it.”

The PTC once enjoyed support from some in the utility industry who needed it to bolster wind power development to meet the mandates. Today, utilities have met their state mandates—or come close enough, the report points out: “their state utility commissioners will not allow them to build more.” It is important to realize that the commissioners are appointed or elected to protect the ratepayers and ensure that the rates charged by the utilities are fair and as low as possible. Because of the increased cost of wind energy over conventional sources, commissioners won’t allow any more than is necessary to meet the mandates passed by the legislatures.

The abundance of natural gas and subsequent low price has also hurt wind energy’s predicted price parity. South Dakota’s Governor Dennis Daugaard (R), in Bloomberg, said: “If gas prices weren’t so cheap, then wind might be able to compete on its own.” David Crane, chief executive officer of NRG Energy Inc.—which builds both gas and renewable power plants—agrees: “Cheap gas has definitely made it harder to compete.” With the subsidy, companies were able to propose wind projects “below the price of gas.” Without the PTC, Stephen Munro, an analyst at New Energy Finance, confirms: “we don’t expect wind to be at cost parity with gas.”

The changing conditions combined with “wide agreement that the majority of extenders are special interest handouts, the pet political projects of a few influential members of Congress,” mean that “the wind PTC is not a sure bet for extension.” Bloomberg declares: “Wind power in the U.S. is on a respirator.” Mike Krancer, who previously served as secretary of the Pennsylvania Department of Environmental Protection, in an article in Roll Call, states: “Washington’s usual handout to keep the turbines spinning may be harder to win this time around.”

Despite the claim of “Loud support for the PTC” from North American Windpower (NAW), the report predicts “political resistance.” NAW points to letters from 144 members of Congress urging colleagues to “act quickly to revive the incentives.” Twenty-six Senate members signed the letter to Senate Finance Committee Chairman Ron Wyden (D-OR) and 118 signed a similar letter to Speaker John Boehner (R-OH). However, of the 118 House members, only six were Republicans—which, even if the PTC extension makes it out of the Senate, points to the difficulty of getting it extended in the Republican-controlled House.

Bloomberg cites AWEA as saying: “the Republican-led House of Representatives may not support efforts to extend the tax credits before the November election.” This supports the view stated in the report. House Ways & Means Committee Chairman David Camp (R-MI) held his first hearing on tax extenders on April 8. He only wants two of the 55 tax breaks continued: small business depreciation and the R&D tax credit. The report states: “Camp says that he will probably hold hearings on which extenders should be permanent through the spring and into the summer. He hasn’t said when he would do an extenders proposal himself, but our guess is that he will wait until after the fall elections. …We think the PTC is most endangered if Republicans win a Senate Majority in the fall.”

So, even if the PTC survives the current Senate’s floor debate (Senator Pat Toomey [R-PA] offered an amendment that would have entirely done away with the PTC), it is only the “first step in a long journey” and, according to David Burton, a partner at law firm Akin Gump Strauss Hauer & Feld, is “unlikely on its own to create enough confidence to spur investment in the development of new projects.” Plus, the House will likely hold up its resurrection.

Not to mention the growing opposition to wind energy due to the slaughter of birds and bats—including the protected bald and golden eagles. Or, growing fears about health impacts, maintenance costs and abandoned turbines.

All of these factors have likely led Jeffrey Immelt, chief executive officer of General Electric Co.—the biggest U.S. turbine supplier—to recently state: “We’re planning for a world that’s unsubsidized. Renewables have to find a way to get to the grid unsubsidized.”

Perhaps this time, the PTC is really dead, leaving smaller manufacturers desperately seeking subsidies.


Big Business’ Slush Fund Seeks Re-Authorization

April 25, 2014, 10:02 AM

Shortly, Congress will be debating the fate of the U.S. Export-Import Bank (Ex-Im). Its authorization — last extended in 2012 — will expire on September 30 unless reauthorized. Ex-Im was first incorporated in 1934 by President Franklin D. Roosevelt to finance trade with the Soviet Union. Under the Export-Import Bank Act of 1945, Congress established the bank as an independent agency. It provides loans and loan guarantees (as well as capital and credit insurance) to facilitate U.S. exports. Because those obligations are backed by the full faith and credit of the U.S. government, taxpayers are put on the hook.

According to Katherine Rosario writing for Heritage Action on April 1, 2014, “The United States Export-Import Bank is essentially a microcosm of some of Washington’s biggest problems, from the corruption it encourages, to putting taxpayers at risk, to the cronyism it facilitates.”

An example of the cronyism facilitated by Ex-Im to advance political ideologies was the Bank’s backing of Solyndra, the ailing solar panel company, which received an Ex-Im Bank loan guarantee of $10.3 million. As such, the Ex-Im Bank made possible the exchange of political favors under the guise of boosting the economy and growing American jobs.

Those who most benefit from Ex-Im financing are multinational corporations such as the construction and engineering firm Bechtel (ranked by Forbes as the fourth largest privately held company by revenue), Lockheed Martin, and its biggest beneficiary by far, Boeing. In the banking industry, the U.S. Export-Import Bank is commonly referred to as Boeing’s Bank. According to some estimates, Boeing receives upwards of 80 percent of the Ex-Im Bank’s taxpayer-backed loan guarantees.

The claim is made that Ex-Im financing is essential to fill “gaps” that exist when the economic or political risk is too great to secure private financing. This excuse is contradicted by the fact that U.S. exports hit a record high of $2.2 trillion in 2013, up from $1.4 trillion five years earlier, proving that there is no shortage of private export capital.

Because Boeing does not have to compete for financing and enjoys favorable terms, it is able to sell more planes overseas. The Ex-Im Bank gives money to foreign governments and airlines to buy Boeing products. In 2013, 85 percent of the Ex-Im Bank’s guarantees made to Russian companies were centered around Boeing. Specifically, $580 million was slated for the purchase of luxury airliners out of a total of $639 million Russia received.

Other countries like China benefited from Ex-Im Bank transactions during 2013 to the tune of $636 million. Australia’s richest person (reportedly worth $17.7 billion) was more than happy to accept $700 million in American taxpayer backing via the Bank for her company.

Supporters of Ex-Im claim that the bank is necessary to create a level playing field. If so, how is it that only 2.2% of all U.S. exports last year received Ex-Im financing, while 98% of American exporters had to compete without the bank’s intervention? And what about all the other domestic businesses that don’t get any Ex-Im funding?
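For scale, the 2.2 percent figure can be turned into dollars using only the export total the article cites above. This is a quick sanity-check sketch, not data from any outside source.

```python
# Implied scale of Ex-Im-backed exports, using only the article's figures:
# $2.2 trillion in total 2013 U.S. exports, of which 2.2% received
# Ex-Im financing. (The article's "98%" is the remaining 97.8%, rounded.)

total_exports = 2.2e12   # 2013 U.S. exports, in dollars
exim_share = 0.022       # share of exports receiving Ex-Im financing

exim_backed = total_exports * exim_share
print(f"Ex-Im-backed exports: ${exim_backed / 1e9:.1f} billion")
print(f"Exports competing without Ex-Im: {1 - exim_share:.1%}")
```

In other words, roughly $48 billion of a $2.2 trillion export economy flowed through Ex-Im, which is the basis for the "level playing field" question above.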

The debate over reauthorizing the Export-Import Bank is threatening to further escalate the tension between House conservatives and the Republican leadership. Eric Cantor, who struck a deal with Democrats in 2012 to reauthorize the Export-Import Bank, is now wary of battling conservatives angered by a number of his recent legislative moves. Cantor has privately told members of the House that he does not intend to get involved this time around.

In a recent private meeting of the conservative Republican Study Committee, Paul Ryan struck the right tone when pushing for the Ex-Im Bank charter to expire. Why should Republicans support what conservatives consider corporate welfare programs at the same time they advocate cuts to individual welfare programs like food stamps? And what about the earmark ban, which Republicans agreed to honor once again in the 113th Congress? Unfortunately, establishment Republicans with upcoming primaries in Georgia, Kentucky and Mississippi seem increasingly willing to disregard the earmark ban and instead brag about bringing home the bacon.

Prior to the 2012 vote by the House reauthorizing the Ex-Im Bank charter, Brian Darling set forth in writing “Top Ten Reasons to End The Export-Import Bank.” In the same article, Darling had the following to say about why the Export-Import Bank is bad for America.

“There is nothing free market about the idea of the United States government providing loans to private companies for the purpose of competing against other private companies. Especially when these large corporations are gaming the system with teams of lobbyists pushing Members of Congress to reauthorize this taxpayer-funded slush fund for big business. If the loans don’t pay off, taxpayers have to pick up the tab.”

Hear what then-Senator Obama had to say on the campaign trail in 2008.

“I’m not a Democrat who believes that we can or should defend every government program, just because it’s there. There are some that don’t work like we had hoped…[like] the Export-Import Bank that has become little more than a fund for corporate welfare…If we hope to meet the challenges of our time, we have to make difficult choices. As President…I will eliminate the programs that do not work and are not needed.”

Corporate welfare to large private corporations is not an easy thing for Democrats to explain to their constituents in an election year, despite President Obama’s pre-election musings.  As for Republicans, should corporate welfare be blessed in an election year when earmarks serving the same purpose are frowned upon and have been theoretically banned?

Instead, Republicans should enact policies that are free market in nature to produce jobs and grow the economy through cutting taxes, reforming the tax code, deregulation, and opening up energy exploration.

Congress must now decide whether to reauthorize billions of dollars in corporate welfare on the backs of taxpayers or to allow private investors to finance U.S. exports. This is not a partisan issue. There are good reasons for both Democratic and Republican legislators to allow the crony capitalist Export-Import Bank to expire (sunset) when it comes up for re-authorization.

 

[Originally published at the Illinois Review]


VIDEO: Heartland Talks Earth Day with John Stossel

April 24, 2014, 1:15 PM

Last week John Stossel hosted a segment on Earth Day featuring Heartland Senior Fellow James Taylor and Paul Gallay from the Riverkeeper environmentalist organization. Taylor (with some backup from Stossel) crossed swords with Gallay on a number of environmental subjects.

The discussion started on Gallay’s topic of interest, namely clean water, and he, Taylor and Stossel all happily agreed that our nation’s water is getting cleaner, though Gallay argued that it was not happening swiftly enough and that further government action was needed. Taylor deflated this line of reasoning by pointing out that current policies are working out quite well in an incremental fashion and that increasing government spending and power to accelerate the process would be wasteful and dangerous.

The program then turned its attention to alternative energy and the “sustainable future” of green energy lauded by Earth Day supporters and environmentalists the world over. Gallay’s position was that the only sustainable solution to our energy needs was to rely on “smart” energy, in other words green, renewable energy. Taylor rightly pointed out that the current green energy favorites, solar and wind, are both resource-intensive and often inefficient. Solar energy in particular, he explained, is “very land and water-intensive.” The water-intensiveness is a particular problem because solar farms tend to be built in high-heat, water-scarce environments.

Gallay tried to answer Taylor by saying that non-renewable energy receives more subsidies than green energy does. Taylor would not be trapped, however, answering that if one considers the subsidy per kilowatt-hour of energy produced, then wind and solar receive ten times the subsidies of natural gas and fifty times those of coal.

What the program really served to highlight was the major flaw in the mainstream environmentalist movement’s attitude, namely its fixation on what it perceives to be the problem at the expense of critically assessing the proposed solutions. Anti-carbon activists sing the praises of solar and wind energy while trying to ignore or obfuscate the issues surrounding those methods’ costs and efficiency. If supporters of green energy hope to win over a reluctant public, they need to be willing to debate the facts, not just feelings.

If an honest, informative debate is going to be had about America’s energy future, then the costs of alternatives must be honestly addressed. The fact is that lavish subsidies (often given as political favors) are the only thing that keeps most alternative energy suppliers in business. If those alternatives are to have a future in the energy sector, they need to stop suckling at the government teat and learn to compete in the real world economy. Then we might consider them a worthwhile addition to American energy production.


Video: The Government is a Horrible Private Sector Prognosticator

April 24, 2014, 12:54 PM

We have before us – and the government – Comcast’s acquisition of Time Warner Cable.

Which the Leviathan and the Left view as an opportunity to outright blockade the free market.  Or, barring that, unilaterally impose many, many a la carte regulations – that they otherwise can’t get imposed – in exchange for Mother-May-I government-transaction-approval.

These are known as regulatory “concessions.”  Absurdly called that because we pretend the imposed-upon companies happily concede to these regulations – the same way I happily concede my wallet to the guys with the masks and the guns.

These “concessions” are imposed – they claim – to try to mitigate what they think will be market harm resulting from the transaction.

But really – who is LESS qualified to prognosticate what will happen in the market…than bureaucrats?

Sadly, this is where we currently stand.

In this video, we assess the situation – and implore the government to not yet again try to guess what’s going to happen, and preemptively regulate predicated thereon.

Soothsayers they ain’t.

[Originally published at PJ Tatler]
