On the Blog

Misdirection Increasing in Size, Intensity & Coverage

Somewhat Reasonable - May 14, 2014, 9:06 AM

Once upon a time, oh, say 20 years ago, the talk was that the Pacific would be in a constant state of El Nino. Though this was an admission that the antics of the tropical Pacific control a large part of the global temperature, the idea of the El Nino and a forever warming planet was a global warming proponent’s dream come true. Because they ignored climate cycles and did not understand what Weatherbell.com meteorologist Joe D’Aleo, who also runs the climate blog “ICECAP,” showed plainly – that in the colder cycles of the Pacific, the La Ninas outdo El Ninos and vice versa – they assumed this would continue forever.

As the earth adjusted to the warmth supplied by this natural cycle, the warming that was occurring, combined with the change of the Atlantic cycle to warmer, led to a marked decrease in Arctic sea ice. It reached a crescendo in 2007, the year of the death spiral along with forecasts of no summer ice in 2014. Through it all, our side of the AGW argument said this is a natural phenomenon, and once the AMO flipped, the summer sea ice, which is the most obvious talking point for those advocating the Arctic death spiral, would come back. As always, the Southern Hemisphere ice, because it was above normal, was ignored.

So here we are, with the summer of 2014 approaching. Much is being made of the coming El Nino, including, for the fifth time since 1997, the dream of many of a “Super Nino” to get the badly busting global temp forecast back on track. We believe strongly this is a classic Multivariate ENSO Index (MEI) bounce-back event that spikes quickly, then retreats, as we are back in the period that favors this. We can plainly see this cycle by looking at the MEI chart below.

The theory is not rocket science. It simply says the strongest events come after prolonged warm run-ups, which happen when the Pacific in the overall sense is cool. You can plainly see the cyclical nature of the overall MEI and the spikes that occur, both when it’s been warm and cold. As I have said a thousand times, the explanation for the behavior of the oceans lies with Dr. William Gray’s ideas.

But here we are with the talk of a Super Nino, yet the far bigger event climate-wise is the increasingly positive summer sea ice anomaly being forecast that’s getting more impressive by the week. When combined with the major positive anomaly in the Southern Hemisphere, this offers a chance, in the summer of Al Gore’s no Arctic ice cap, for a record high global ice anomaly.

Heck of a way to run a global warm up, eh? There’s a chance of a record high global ice anomaly because of an above average summer sea ice anomaly in the north and what appears to be a Southern Hemisphere sea ice extent heading for a record high itself. As of this writing, the Southern Hemisphere looks like this:

The north, as you can see, is below average, and you see the two summer sea ice minimums that led to the hysteria. But while they were happening there was robust sea ice in the south (and I am all for thinking globally).

Average all this out, and here’s what you get.

Again, this is not rocket science. Given where we are globally now, with the Arctic still below average, a forecast for the winter around Antarctica as depicted on the graph below would mean the anomaly there would likely remain well above average through their winter.

Should the northern ice cap expand to above average, the global average would have to go up, perhaps breaking the record. And you have to love all this, as it would occur in the summer touted by Al Gore to perhaps see the Arctic ice cap disappear. Ouch, that is going to leave a mark. If only someone would actually watch it!

The reason for the increase in the Arctic ice is that the north Atlantic, at least for the time being, has cooled. Most of the reason for the decrease in ice is not the warmth of the winters but the fact that the warm cycle in the north Atlantic attacks the ice cap at the warm time of the year – both with warmer air temperatures and the warmer current below! But what happens when that changes for good? There were times in the 1950s when Arctic sea ice was very low, and though I have no satellite measurements, we do have panic reports like this from 1957.

20 years ago the idea of a constant El Nino warming the planet was a big deal, which is why we see the current fervor about the threat of a Super Nino. But the other, greater story is this canary in the coal mine: that the AMO will flip to cold for good by 2020, as Dr. Gray has opined, because of the cyclical nature of the oceans. This means that the darling of the warming crowd a few years ago – sea ice – will be the lipstick on a pig it always has been.

Think about it.

Super Ninos galore – NOT.

Ice caps decreasing. How did that work out given the Southern Hemisphere?

And now this?

Yet what are we hearing about? A likely overhyped event to get attention and whip up fervor, while the event that actually means something is ignored.

Now let me ask you this question. If a world with less than average global sea ice meant all this warmth, what should the same people who pushed that missive conclude now that both ice caps are above normal?

A question they probably do not want to answer.

Joe Bastardi is chief forecaster at WeatherBELL Analytics, a meteorological consulting firm.

© Copyright 2014 The Patriot Post

[Originally published at The Patriot Post]

Categories: On the Blog

Media Marxists Persuading the Government to Execute its Hugest Internet Power Grab Yet

Somewhat Reasonable - May 13, 2014, 1:24 PM

Allow me to (re)introduce you to the hardcore Leftists rabble rousing for a full-on government takeover of the World Wide Web.

Theirs is now an all-out push to have the Federal Communications Commission (FCC) “Reclassify” the Internet – so as to then impose the utterly unpopular Network Neutrality.

 

All 95 Pro-Net-Neutrality-Pledge Democrats Lost On Tuesday - and Pledge Group Raised Less Than $300 On It (in 2010)

 

Net Neutrality: The Kid Sitting By Himself in the High School Cafeteria

 
“Reclassification” means unilaterally shoving the Web out from under the existing light-touch Title I rules the 1996 Telecommunications Act placed upon it – which have allowed it to blossom into the free speech-free market Xanadu we all know and love.

And then slamming it into the Title II heavy-regulatory uber-structure that has for the last seventy-plus years crushed with regs and taxes landline telephones – that well-known bastion of technological and economic innovation.

Does the FCC have the authority to Reclassify?  Of course not.  But this is the Barack Obama Administration - when has that ever stopped them?

The 1996 Act gave the Web the freedom every person and entity needs to thrive – and it has thrived beyond any and everyone’s wildest dreams.

Because it left the Left bereft of a regulatory hook – by which they can reel it in sport-fish-style – they now want the government to seize the Web, enmeshing it in a landline regulatory nightmare mess.

The people calling for this are ridiculously Leftist.  They bear a striking resemblance to – and the heinous patchouli aroma of – the Occupy Wall Street radicals who in 2012 illegally befouled public places all across our nation.

Only these people are far more organized – and thus far more dangerous.

Call this iteration #OccupyTheInternet.  Led by the tiny band of Merry Media Marxists known as Free Press.

 

Activists Set Up Protest Camp Outside FCC Headquarters

 

FCC Protestors Vow To Remain Until Vote

 

Say nothing short of Title II classification for Internet access will do.

This is not Free Press’ first attempt at ridiculous Government Internet Power Grab Street Theatre.  The last time involved spatulas.

 

Waffle Breakfast at the FCC

Which was even more absurd than it sounds.

 

Who is Free Press?  For what do they stand?  Meet Free Press co-founder and current Board member Robert McChesney.

In addition to teaching college (Heaven help us) and having co-founded Free Press, he was the editor (2000-2004) and is a current board member of Monthly Review, which he himself describes as “one of the most important Marxist publications in the world, let alone the United States.”

 

McChesney describes their Internet objective thus:

 

“(T)he ultimate goal is to get rid of the media capitalists in the phone and cable companies and to divest them from control.”

 

How very Hugo Chavez of them.

These clowns are of course getting support from the Congressional Communist Progressive Caucus.

 

Progressive Caucus Leaders Call for Reclassification

 

The leaders of the Congressional Progressive Caucus are drafting a letter asking the FCC to reclassify broadband as a telecommunications service, a move that would give the agency more flexibility on net neutrality but may be legally or politically difficult.

Reps. Raul Grijalva and Keith Ellison plan to send the letter to the agency next week, and plan to send a dear colleague letter to fellow lawmakers in hopes of garnering more signatories. Their backing of reclassification is significant, since it endorses an alternative to Chairman Tom Wheeler’s proposal in addition to just criticizing the plan.

 

Good old Representative Grijalva.

Raul Grijalva’s first documented ties to the Communist Party USA date from 1993, when then-Pima County Board of Supervisors member Grijalva penned an article on NAFTA for the Party’s People’s Weekly World (now People’s World)’s November 13 issue.

These Marxists and their publications.

How do those of us over here in Reality view this government takeover of the Internet?  Not quite as highly.  Back in 2010 the following Democrat-festooned assemblage united against the same FCC power grab:

 

299 members of Congress - a large bipartisan majority - have asked you to not reclassify the Internet, and wait for Congress to first write law.  (And that was BEFORE the 2010 Republican wave election.)

 

More than 150 organizations, state legislators and bloggers have asked you to not reclassify the Internet, and wait for Congress to first write law.

 

Seventeen minority groups – that are almost always in Democrat lockstep – have asked you to not reclassify the Internet, and wait for Congress to first write law.

 

And many additional normally Democrat paragons have also asked you to not reclassify the Internet, and wait for Congress to first write law.  Including:

 

  • Large unions: AFL-CIO, Communications Workers of America (CWA), International Brotherhood of Electrical Workers (IBEW).

 

  • Racial grievance groups: League of United Latin American Citizens (LULAC), Minority Media and Telecom Council (MMTC), National Association for the Advancement of Colored People (NAACP) and the Urban League.

 

  • Anti-free market environmentalist group the Sierra Club….

 

Even (former) Massachusetts Democrat Senator (and now Secretary of State) John Kerry…at one point said you do not have the authority.

I would imagine very little has changed for these folks – since nothing about the Left’s Internet assault has.

So the question now is: To whom will President Barack Obama’s FCC Democrat appointees - Chairman Tom Wheeler and Commissioners Jessica Rosenworcel and Mignon Clyburn - listen?

A massive, overwhelming bipartisan swath of Washington and the nation – or a tiny, uber-radical, Communist-riddled cadre of government expansionists?

Our first glimpse at an answer is Thursday’s Net Neutrality vote.

 

Seton Motley is the founder and president of Less Government.  Please feel free to follow him on Twitter (@SetonMotley) and Facebook.  It’s his kind of stalking.

[Originally published at Human Events]

Categories: On the Blog

Manmade Climate Disruption – the Hype and Reality

Somewhat Reasonable - May 13, 2014, 11:25 AM

The White House has released its latest National Climate Assessment. An 829-page report and 127-page “summary” were quickly followed by press releases, television appearances, interviews and photo ops with tornado victims – all to underscore President Obama’s central claims:

Human-induced climate change, “once considered an issue for the distant future, has moved firmly into the present.” It is “affecting Americans right now,” disrupting their lives. Its effects “are already being felt in every corner of the United States.” Corn producers in Iowa, oyster growers in Washington, maple syrup producers in Vermont, crop-growth cycles in Great Plains states “are all observing climate-related changes that are outside of recent experience.” Extreme weather events “have become more frequent and/or intense.”

It’s pretty scary sounding. It has to be. First, it is designed to distract us from topics that the President and Democrats do not want to talk about: ObamaCare, the IRS scandals, Benghazi, a host of foreign policy failures, still horrid jobless and workforce participation rates, and an abysmal 0.1% first quarter GDP growth rate that hearkens back to the Great Depression.

Second, fear-inducing “climate disruption” claims are needed to justify job-killing, economy-choking policies like the endless delays on the Keystone XL pipeline; still more wind, solar and ethanol mandates, tax breaks and subsidies; and regulatory compliance costs that have reached $1.9 trillion per year – nearly one-eighth of the entire US economy.

Third, scary hyperventilating serves to obscure important realities about Earth’s weather and climate, and even in the NCA report itself. Although atmospheric carbon dioxide levels have been rising steadily for decades, contrary to White House claims average planetary temperatures have not budged for 17 years.

No Category 3-5 hurricane has made landfall in the United States since 2005, the longest such period since at least 1900. Even with the recent Midwestern twisters, US tornado frequency remains very low, and property damage and loss of life from tornadoes have decreased over the past six decades.

Sea levels are rising at a mere seven inches per century. Antarctic sea ice recently reached a new record high. A new report says natural forces could account for as much as half of Arctic warming, and warming and cooling periods have alternated for centuries in the Arctic. Even in early May this year, some 30% of Lake Superior was still ice-covered, which appears to be unprecedented in historical records. Topping it off, a warmer planet and rising CO2 levels improve forest, grassland and crop growth, greening the planet.

Press releases on the NCA report say global temperatures, heat waves, sea levels, storms, droughts and other events are “forecast” or “projected” to increase dangerously over the next century. However, the palm reading was done by computer models – which are based on the false assumption that carbon dioxide now drives climate change, and that powerful natural forces no longer play a role. The models have never been able to predict global temperatures accurately, and the divergence between model predictions and actual measured temperatures gets worse with every passing year. The models cannot even “hindcast” temperatures over the past quarter century, without using fudge factors and other clever tricks.

Moreover, much of the White House and media spin contradicts what the NCA report actually says. For example, it concludes that “there has been no universal trend in the overall extent of drought across the continental U.S. since 1900.” Other trends in severe storms, it states, “are uncertain.”

Climate change, Johnstown Floods, Dust Bowls, extreme weather events and forest fires have been part of Earth and human history forever – and no amount of White House spin can alter that fact. To suggest that any changes in weather or climate – or any temporary increases in extreme weather events – are due to humans is patently absurd. To ignore positive trends and the 17-year absence of warming is abominable.

Fourth, sticking to the “manmade climate disaster” script is essential to protect the turf, reputations, funding and power of climate alarmists and government bureaucrats. The federal government doles out some $2.6 billion annually in grants for climate research – but only for work that reflects White House perspectives. Billions more support subsidies and loans for renewable energy programs that represent major revenue streams for companies large and small, and part of that money ends up in campaign war chests for (mostly Democrat) legislators who support the climate regulatory-industrial complex.

None of them is likely to admit any doubts, alter any claims or policies, or reduce their increasingly vitriolic attacks on skeptics of “dangerous manmade global warming.” They do not want to risk being exposed as false prophets and charlatans, or worse. Follow the money.

Last, and most important, climate disruption claims drive a regulatory agenda that few Americans support. Presidential candidate Obama said his goal was “fundamentally transforming” the United States and ensuring that electricity rates “necessarily skyrocket.” On climate change, President Obama has made it clear that he “can’t wait for an increasingly dysfunctional Congress to do its job. Where they won’t act, I will.” His Environmental Protection Agency, Department of the Interior, Department of Energy and other officials have steadfastly implemented his anti-hydrocarbon policies.

Chief Obama science advisor John Holdren famously said: “A massive campaign must be launched to … de-develop the United States … bringing our economic system (especially patterns of consumption) into line with the realities of ecology and the global resource situation.… [Economists] must design a stable, low-consumption economy in which there is a much more equitable [re]distribution of wealth.”

(The President also wants to ensure that neither a Keystone pipeline approval nor a toned-down climate agenda scuttles billionaire Tom Steyer’s $100-million contribution to Democrat congressional candidates.)

This agenda translates into greater government control over energy production and use, job creation and economic growth, and people’s lives, livelihoods, living standards, liberties, health and welfare. It means fewer opportunities and lower standards of living for poor and middle class working Americans. It means greater power and control for politicians, bureaucrats, activists and judges – but with little or no accountability for mistakes made, damage done or penalties deliberately exacted on innocent people.

A strong economy, modern technologies, and abundant, reliable, affordable energy are absolutely essential if we are to adapt to future climate changes, whatever their cause – and survive the heat waves, cold winters, floods, droughts and vicious weather events that will most certainly continue coming.

The Obama agenda will reduce our capacity to adapt, survive and thrive. It will leave more millions jobless, and reduce the ability of families to heat and cool their homes properly, assure nutritious meals, pay their rent or mortgage, and pursue their American dreams.

America’s minority and blue collar families will suffer – while Washington, DC power brokers and lobbyists will continue to enjoy standards of living, housing booms and luxury cars unknown in the nation’s heartland. Think Hunger Games or the Politburo and nomenklatura of Soviet Russia.

Worst, it will all be for nothing, even if carbon dioxide does exert a stronger influence on Earth’s climate than actual evidence suggests. While the United States slashes its hydrocarbon use, job creation, economic growth and international competitiveness, China, India, Brazil, Indonesia – and Spain, Germany, France and Great Britain – are all increasing their coal use … and CO2 emissions.

President Obama and White House advisor John Podesta are convinced that Congress and the American people have no power or ability to derail the Administration’s determination to unilaterally impose costly policies to combat “dangerous manmade climate disruption” – and that the courts will do nothing to curb their executive orders, regulatory fiats and economic disruption.

If they are right, we are in for some very rough times – and it becomes even more critical that voters learn the facts and eject Harry Reid and his Senate majority, to restore some semblance of checks and balances.

Paul Driessen is senior policy analyst for the Committee For A Constructive Tomorrow (www.CFACT.org) and author of Eco-Imperialism: Green power – Black death.

[Originally published at Canada Free Press]

Categories: On the Blog

Urban Core Jurisdictions: Similar In Label Only

Somewhat Reasonable - May 13, 2014, 11:05 AM

The fortunes of U.S. core cities (municipalities) have varied greatly in the period of automobile domination that accelerated strongly at the end of World War II. This is illustrated by examining trends among the three categories of “historical core municipalities” (Figure 1). Since that time, nearly all metropolitan area (the functional or economic definition of the city) growth has been suburban, outside core municipality limits, or in the outer rings of existing core municipalities.

Approximately 26 percent of major metropolitan area population is located in the core municipalities. Yet, many of these municipalities include large areas of automobile orientation that are anything but urban core in their urban form. Most housing is single-detached, as opposed to the much higher share of multi-family in the urban cores, and transit use is just a fraction of that in the urban cores.

Even counting their essentially suburban populations, today’s core municipalities represent, with a few exceptions, a minority of their metropolitan area population. The exceptions (San Antonio, Jacksonville, Louisville, and San Jose) are all highly suburbanized and have annexed land area at a substantially greater rate than they have increased their population.

According to the 2010 census, using the 2013 geographic definitions, core cities accounted for from five percent of the metropolitan area population in Riverside-San Bernardino to 62 percent in San Antonio (Figure 2).

International Parallels

These kinds of differences are not limited to the United States. For example, the city (municipality) of Melbourne, Australia has little more than two percent of the Melbourne metropolitan area population. Indeed, the city of Melbourne is only the 23rd largest municipality in the Melbourne metropolitan area and has a population smaller than a single city council district in Columbus, Ohio.

These virtually random variations in core city sizes lead to misleading characterizations. For example, locals sometimes point out that San Antonio is the 6th largest city in the United States. True, San Antonio is the 6th largest municipality in the United States, but the genuine, classically defined city – the broader metropolitan area that is the urban organism – ranked only 26th in size in 2010. The suburbs and exurbs, as defined by municipal jurisdictions, are smaller than average in San Antonio, but the city itself stretches in a suburban landscape to more than 15 miles (24 kilometers) beyond its 1950 borders.

Core municipality mayors have been known to travel around the world as representatives of their metropolitan areas. In some cases core municipality mayors represent constituencies encompassing the entire metropolitan area (such as Auckland or soon to be major metropolitan Honolulu). Others have comparatively small constituencies. For example, the mayor of Paris presides over only 18 percent of the metropolitan area population, the mayor of Atlanta 8 percent, the mayor of Manila 6 percent, Melbourne 2 percent and Perth, Australia just 0.5 percent (Figure 3).

Core Municipalities in the United States

A remnant of U.S. core urbanization is evident within the city limits of municipalities that were already largely developed in 1940 and have not materially expanded their boundaries. These are the “Pre-World War II & Non-Suburban” category of core municipalities. Between 1950 and 2010 these core municipalities lost a quarter of their population, dropping from 24.5 million residents to 19.3 million (Figure 4). All but Miami lost population. Despite improved downtown population fortunes, the last decade saw a small further decline of 0.2 percent overall. Only two legacy cities, New York and San Francisco, now exceed their peak populations of the mid-20th Century.

Again, this is the typical pattern internationally. Throughout the high-income world, the urban cores that have not expanded their boundaries and had little greenfield space for suburban development have had declining populations for years. My review of 74 high-income world core municipalities that were fully developed in the 1950s and have not annexed materially showed that only one had increased in population by 2000 (Vancouver). Since that time, a few that had experienced more modest declines have recovered to record levels, such as Munich and Stockholm. Most others, such as London, Paris, Milan, Copenhagen and Zurich, remain below their peak populations.

In the United States, most of the strong growth has taken place in the “Pre-World War II & Suburban” classification, doubling from 10.1 million residents to 20.4 million since 1950. These include core cities with strong pre-war cores, but which have either annexed large areas or already contained large swaths of rural territory at that time (like Los Angeles, with its San Fernando Valley, which was largely agricultural) that later became heavily populated.

Many of these core cities experienced population declines within their 1950 boundaries (such as Portland, Seattle and Nashville between 1950 and 1990). Los Angeles, however, has been the exception. The more highly developed central area (as defined by the city Planning Department) within the city limits has increased in population by one-third since 1950. The continuing suburbanization of the city of Los Angeles, however, is indicated by the fact that the central area’s share of city population has fallen from 68 percent to 47 percent.

The “Post-World War II & Suburban” core cities are much smaller and their metropolitan areas are nearly all suburban. These include metropolitan areas like Phoenix and San Jose. The population of these metropolitan areas has increased more than sevenfold, from 700,000 to 5.2 million.

Land Area: The differences between the three historical core municipality classifications are most evident in land area. Among the “Pre-World War II & Non-Suburban” cores, land areas were almost unchanged from 1950, with much of the difference reflected in Chicago’s O’Hare International Airport annexation. In contrast, the “Pre-World War II & Suburban” cores more than tripled in size, adding an area larger than Connecticut to their city limits. The percentage increase was even larger in the “Post-World War II & Suburban” cores, which covered 10 times as much land in 2010 as in 1950 (Figure 5).

Population Density: Over the 60 year period, the population density of the “Pre-World War II & Non-Suburban” cores dropped from 15,300 per square mile to 11,400 (5,900 per square kilometer to 4,400). The “Pre-World War II & Suburban” and “post-World War II & Suburban” cores started with much lower densities and then fell farther. The core city densities in these municipalities are approximately one-half the population densities of Los Angeles suburbs (Figure 6).
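The density figures above mix per-square-mile and per-square-kilometer units. As a check on the conversion (a minimal sketch, not from the article; the helper name is my own, and the figures are the ones quoted in the paragraph above):

```python
# Sketch: verifying the article's square-mile to square-kilometer density conversions.
SQ_KM_PER_SQ_MI = 2.58999  # square kilometers in one square mile

def per_sq_km(density_per_sq_mi):
    """Convert a population density from per square mile to per square kilometer."""
    return density_per_sq_mi / SQ_KM_PER_SQ_MI

# 1950 and 2010 densities for the "Pre-World War II & Non-Suburban" cores,
# rounded to the nearest hundred as in the text.
for year, density in [("1950", 15300), ("2010", 11400)]:
    print(f"{year}: {density:,}/sq mi = {round(per_sq_km(density), -2):,.0f}/sq km")
```

Rounded to the nearest hundred, 15,300 per square mile works out to about 5,900 per square kilometer and 11,400 to about 4,400, matching the figures in the text.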

The Need for Caution

All of this indicates the importance of caution with respect to core versus suburban and exurban comparisons. For example, Atlanta, which represents only 8 percent of the urban organism (metropolitan area) in which it is located, is not comparable to San Antonio, with its 62 percent of the metropolitan population. These distinctions are important when we talk about different regions.

Wendell Cox is principal of Demographia, an international public policy and demographics firm. He is co-author of the “Demographia International Housing Affordability Survey” and author of “Demographia World Urban Areas” and “War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.” He was appointed to three terms on the Los Angeles County Transportation Commission, where he served with the leading city and county leadership as the only non-elected member. He was appointed to the Amtrak Reform Council to fill the unexpired term of Governor Christine Todd Whitman and has served as a visiting professor at the Conservatoire National des Arts et Metiers, a national university in Paris.

Chicago photo by Bigstock.

[Originally published at New Geography]

Categories: On the Blog

The Legally Problematic Nature of a Title II Reclassification of Internet Services

Somewhat Reasonable - May 13, 2014, 9:58 AM

I confess that I am more than a bit mystified at the way FCC Chairman Tom Wheeler and his Democrat colleagues, seemingly, are moving ever closer in the direction of embracing a Title II reclassification of Internet access services. No matter how loud the banging of pots and pans outside the FCC’s headquarters, it would be terribly unsound as a matter of policy to subject Internet services to the same Title II public utility regulatory regime that applied to last century’s POTS (“plain old telephone”) service.

The irony of the Free Press organization urging protesters to bring pots to the FCC to make noise in the cause of imposing on today’s Internet providers the same public utility regulation that applied to Ma Bell’s POTS-era service seems to have escaped the protesters.

But put aside my mystification as to why Chairman Wheeler and his Democrat colleagues would want to align themselves with such a backwards-looking policy.

What also mystifies me is how little discussion there has been concerning the likelihood of success, or not, that a Title II reclassification would be sustained. As I said in my May 9 blog, “Pots and Pans and the Neutrality Mess,” the “FCC’s legal case would be fairly problematic.”

Here is the way I explained why this is so:

“While it is true enough that, under established administrative law principles, an agency may change its mind, it nevertheless must provide a well-reasoned explanation for such a change. Pointing to the number of protesters banging on pots and pans outside the FCC’s doors is not likely to suffice. Neither is pointing to the agency’s disappointment at already having been twice rebuffed by the DC Circuit under alternative theories.

The main reason the FCC’s case for sustaining a Title II challenge would be problematic is this: In defending its decision to classify Internet service providers as information service providers – thereby removing them from the ambit of Title II regulation – the Commission argued that, from a consumer’s perspective, the transmission component of an information service is integral to, and inseparable from, the overall service offering. This functional analysis of ISPs’ service offerings was the principal basis upon which the Supreme Court upheld the FCC classification determination in 2005 in its landmark Brand X decision.

I don’t think the integrated, inseparable nature of ISPs’ service offerings, from a functional standpoint, and from a consumer’s perspective, has changed since the Brand X decision, so it won’t be easy for the Commission to argue that it is changing its mind about the proper classification based on changed consumer perceptions of the service offerings’ functionality. And to the extent that the Brand X Court cited favorably to the FCC’s claims concerning the then-emerging marketplace competition and the dynamism in the broadband marketplace, those factors, if anything, today argue even more strongly for a non-Title II common carrier classification.

I understand the role that so-called Chevron deference can play in upholding agency decisions. Indeed, it played an important role in the Court’s decision in Brand X. But invoking Chevron deference won’t relieve the FCC of the need to provide persuasive reasoning in support of an abrupt about-face on a point the agency litigated – successfully – all the way up to the Supreme Court.”

As I’ve been puzzling over the lack of comment concerning the lawfulness of a potential FCC switcheroo regarding Title II, I reviewed once again the FCC General Counsel’s Memorandum dated May 6, 2010, in which Austin Schlick, the then-GC, set out to bolster the case for a Title II reclassification of Internet services should the Commission choose to adopt that course. Of course, the then-commissioners did not choose the Title II route.

Nevertheless, given its clear intent to bolster the legal justification for a Title II reclassification, the General Counsel’s memorandum is instructive. As I acknowledged in my blog last Friday, Mr. Schlick rightly observes that the FCC may well receive substantial Chevron deference for a reclassification determination and that an agency is entitled to change its mind if it offers persuasive reasoning for doing so.

I agree with these points of administrative law. But I think if Mr. Schlick’s memo is read closely, it indicates that it will not be so easy for the Commission to supply such persuasive reasoning. This is because, as Mr. Schlick readily acknowledges, in his opinion for the Supreme Court in Brand X, Justice Thomas declared: “The entire question is whether the products here are functionally integrated (like the components of a car) or functionally separate (like pets and leashes).  That question turns not on the language of the Act, but on the factual particulars of how Internet technology works and how it is provided, questions Chevron leaves to the Commission to resolve in the first instance….”

Having already resolved in the first instance the question of “the factual particulars of how Internet technology works and how it is provided,” it won’t necessarily be so easy for the Commission now to do an about-face. For as Mr. Schlick went on to say, an agency reassessment of the classification issue would have to include:

“[A] fresh look at the technical characteristics and market factors that led Justice Scalia to believe there is a divisible telecommunications service within broadband Internet access.  The factual inquiry would include, for instance, examination of how broadband access providers market their services, how consumers perceive those services, and whether component features of broadband Internet access such as email and security functions are today inextricably intertwined with the transmission component.  If, after studying such issues, the Commission reasonably identified a separate transmission component within broadband Internet access service, which is (or should be) offered to the public, then the consensus policy framework for broadband access would rest on both the Commission’s direct authority under Title II and its ancillary authority arising from the newly recognized direct authority.”

In other words, as Mr. Schlick understood, it won’t suffice for the Commission simply to bemoan the fact that the D.C. Circuit twice has held that the agency lacked authority for its earlier forays into net neutrality regulation. Instead, the Commission will need to show, as a factual matter, from a functional standpoint and from the consumer’s perspective, why its earlier technical analysis concerning the integrated nature of Internet service – that is, the inseparability of the transmission and information services components – is no longer “operative.”

Mr. Schlick quotes heavily from Justice Scalia’s dissenting analysis to bolster his case. But Justice Scalia’s analysis was accepted by only two other Justices. He was on the losing side of a 6-3 decision.

I am not saying that the Commission could not prevail if it ever decides to go the Title II route – as unwise as such a decision would be. But I am not aware that the functional nature of Internet access services has changed since the Commission initially classified Internet access as an information service. Nor am I aware that consumers perceive the way these services are offered, from a functional standpoint, any differently today than they did at the time of the agency’s initial classification determination.

That being so, I remain mystified at how little discussion there has been concerning the lawfulness, or not, of a potential Title II reclassification.

[Originally published at The Free State Foundation]

Categories: On the Blog

Lessons from the Great Austrian Inflation

Somewhat Reasonable - May 13, 2014, 9:29 AM

This year marks one hundred years since the beginning of the First World War in the summer of 1914. The Great War, as it used to be called, brought great devastation in its wake. Millions of human lives were lost on the battlefields of Europe; vast amounts of accumulated wealth were consumed to cover the costs of combat; and battles and bombs left a large amount of physical capital in ruins. But the “war to end war,” as it was called, also resulted in another weapon of economic mass destruction – an orgy of paper-money inflations.

One of these tragic episodes that is worth recalling and learning from was the disintegration of the Austro-Hungarian Empire and the accompanying Great Austrian Inflation in the immediate postwar period in the early 1920s.

The Habsburg Monarchy and the Coming of World War I

In the summer of 1914, as clouds of war were forming, Franz Joseph (1830–1916) was completing the 66th year of his reign on the Habsburg throne. During most of his rule Austria-Hungary had basked in the nineteenth-century glow of the classical-liberal epoch. The constitution of 1867, which formally created the Austro-Hungarian “Dual Monarchy,” ensured that every subject in Franz Joseph’s domain had all the essential personal, political, and economic liberties of a free society.

The Empire encompassed a territory of 415,000 square miles and a total population of over 50 million. The largest linguistic groups in the Empire were the German-speaking and Hungarian populations, each numbering about 10 million. The remaining 30 million were Czechs, Slovaks, Poles, Romanians, Ruthenians, Croats, Serbs, Slovenes, Italians, and a variety of smaller groups of the Balkan region.

Austria-Hungary’s Wartime Inflation and Postwar Political Disintegration

Like all the other European belligerent nations, the Austro-Hungarian government immediately turned to the printing press to cover the rising costs of its military expenditures in the First World War. At the end of July 1914, just after the war had formally broken out, currency in circulation totaled 3.4 billion Austrian crowns. By the end of 1916 it had increased to over 11 billion crowns. And at the end of October 1918, shortly before the end of the war in early November 1918, the currency had expanded to a total of 33.5 billion crowns. From the beginning to the close of the war the Austro-Hungarian money supply in circulation had expanded by 977 percent. A cost-of-living index that had stood at 100 in July 1914 had risen to 1,640 by November 1918.

But the worst of the inflationary and economic disaster was about to begin. Various national groups began breaking away from the Empire, with declarations of independence by Czechoslovakia and Hungary, and the Balkan territories of Slovenia, Croatia, and Bosnia being absorbed into a new Serb-dominated Yugoslavia. The Romanians annexed Transylvania; the region of Galicia became part of a newly independent Poland; and the Italians laid claim to the southern Tyrol.

The last of the Habsburg emperors, Karl, abdicated on November 11, 1918, and a provisional government of the Social Democrats and the Christian Socials declared German-Austria a republic on November 12. Reduced to 32,370 square miles and 6.5 million people—one-third of whom resided in the city of Vienna—the new, smaller Republic of Austria now found itself cut off from the other regions of the former empire as the surrounding successor states (as they were called) imposed high tariff barriers and other trade restrictions on the Austrian Republic. In addition border wars broke out between the Austrians and the neighboring Czech and Yugoslavian armies.

Postwar Austria and Socialist Redistributive Policies

Within Austria the various regions imposed internal trade and tariff barriers on other parts of the country, including Vienna. The rural regions hoarded food and fuel supplies, with black marketeers the primary providers of many of the essentials for the citizens of Vienna. Thousands of Viennese would regularly trudge out to the Vienna Woods, chop down the trees, and carry cords of firewood back into the city to keep their homes and apartments warm in the winters of 1919, 1920, and 1921. Hundreds of starving children were seen every day begging for food at the entrances of Vienna’s hotels and restaurants.

The primary reason for the regional protectionism and economic hardship was the policies of the new Austrian government. The Social Democrats imposed artificially low price controls on agricultural products and tried to forcibly requisition food for the cities from the countryside. The rural population resisted the food-requisitioning police units sent from Vienna, sometimes opposing the confiscation of their harvests with armaments.

The only thing that prevented even more starvation was the effectiveness of a huge black market that got around the network of price controls and the provincial government restrictions that attempted to prevent the exporting of food to Vienna. Housewives in Vienna would refer to “my smuggler,” meaning the regular black-market provider of the essentials of life – and of course at prices far above the artificial prices set by the socialist government in the capital.

The Costs of Austrian Socialism and Hyperinflation

By 1921 over half the Austrian government’s budget deficit was attributable to food subsidies for city residents and the salaries of a bloated bureaucracy to manage an expanding welfare state. The Social Democrats also regulated industry and commerce, and imposed higher and higher taxes on the business sector and the shrinking middle class. One newspaper in the early 1920s called Social Democratic fiscal policy in Vienna the “success of the tax vampires.”

The Austrian government paid for its welfare state subsidies and expenditures through the monetary printing press. Between March and December 1919 the supply of new Austrian crowns increased from 831.6 million to 12.1 billion. By December 1920 it increased to 30.6 billion; by December 1921, 174.1 billion; by December 1922, it was 4 trillion; and by the end of 1923, it had increased to 7.1 trillion crowns. Between 1919 and 1923, Austria’s money supply had increased by 14,250 percent.

Prices rose dramatically during this period. The cost-of-living index, which had risen to 1,640 by November 1918, had gone up to 4,922 by January 1920; by January 1921 it had increased to 9,956; in January 1922 it stood at 83,000; and by January 1923 it had shot up to 1,183,600. The hypothetical consumer basket of goods that had cost 100 crowns in 1914 cost over one million crowns less than nine years later.

The foreign-exchange value of the Austrian crown also reflected the catastrophic depreciation. In January 1919 one dollar could buy 16.1 crowns on the Vienna foreign-exchange market; by May 1923, one dollar traded for 70,800 crowns.
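For readers who want to see the arithmetic behind these figures, the average annual rates of inflation and currency depreciation they imply can be checked in a few lines of Python. This is only an illustrative back-of-the-envelope sketch; the helper function and its name are my own, and the inputs are simply the index and exchange-rate numbers quoted above.

```python
# Illustrative check of the inflation figures quoted in the article.
# All input numbers come from the text; the helper is for illustration only.

def avg_annual_rate(start, end, years):
    """Average compound annual growth rate implied by two readings."""
    return (end / start) ** (1.0 / years) - 1.0

# Cost-of-living index: 100 (July 1914) -> 1,183,600 (January 1923), ~8.5 years.
col_rate = avg_annual_rate(100, 1_183_600, 8.5)

# Crowns per U.S. dollar: 16.1 (January 1919) -> 70,800 (May 1923), ~4.33 years.
fx_rate = avg_annual_rate(16.1, 70_800, 4.33)

print(f"Implied average inflation, 1914-1923: {col_rate:.0%} per year")
print(f"Implied average crown depreciation vs. the dollar: {fx_rate:.0%} per year")
```

Both rates work out to multiple hundreds of percent per year on average, which understates the terminal phase: as the figures above show, most of the damage was concentrated in 1921–1923.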

At first the black marketeers in Vienna would accept the depreciating Austrian crown as payment for smuggled goods from the rural areas. But by the autumn of 1923, they would only sell for other commodities considered of higher and more tradable value than that increasingly worthless paper money. A gold watch bought four sacks of potatoes; fifty cigars of a superior quality purchased four pounds of pork or ten pounds of lard.

During the worst of the inflation, the Austrian central bank’s printing presses were working night and day churning out vast quantities of currency. At the 1925 meeting of the German “Verein für Sozialpolitik” (the Society for Social Policy), Austrian economist Ludwig von Mises told the audience:

“Three years ago a colleague from Germany, who is in this hall today, visited Vienna and participated in a discussion with some Viennese economists . . . Later, as we went home through the still of the night, we heard in the Herrengasse [a main street in the center of Vienna] the heavy drone of the Austro-Hungarian Bank’s printing presses that were running incessantly, day and night, to produce new banknotes. Throughout the land, a large number of industrial enterprises were idle; others were working part-time; only the printing presses stamping out banknotes were operating at full speed.”

Ludwig von Mises and Ending the Austrian Inflation

Finally in late 1922 and early 1923 the Great Austrian Inflation was brought to a halt. This was due to a great extent to the efforts of Ludwig von Mises. Mises was a senior economic analyst at the Vienna Chamber of Commerce. He worked tirelessly to persuade those in political power that the food subsidies had to end. Finally, in 1922, he was able to arrange for several prominent business associations and the association of labor unions in Vienna to call for the elimination of the government’s costly food subsidies at the controlled prices.

To ameliorate the uneven effects of inflation, which often raised the prices of goods before any rise in money wages, Mises proposed, and had accepted, a price-indexation scheme linked to the value of gold, so that money wages on average would rise at the same rate as the general level of prices. This took pressure off the government to compensate for the rising cost of living with expensive food subsidies, which could be funded by no means other than further and further increases in the paper money supply.

Then Mises succeeded in persuading the Austrian Chancellor, Ignaz Seipel, that continuation of the inflation would lead to the economic and political ruin of the country. Mises warned Seipel that with an end to the inflation there would be a “stabilization crisis” during which the Austrian economy would have to go through an adjustment period. The market would have to rebalance itself due to the distortions and misdirection of labor, capital, and resources that the inflation had brought about.

Seipel accepted the fact that the readjustment consequences were necessary if a worse disaster, the total collapse of the Austrian monetary system, was to be avoided. The Austrian government appealed for help to the League of Nations, which arranged a loan to cover a part of the state’s expenditures. But the strings attached to the loan required an end to food subsidies and a 70,000-man cut in the Austrian bureaucracy to reduce government spending.

At the same time, the Austrian National Bank was reorganized, with the bylaws partly written by Ludwig von Mises. A gold standard was reestablished in 1925; a new Austrian shilling was issued in place of the depreciated crown; and restrictions were placed on the government’s ability to resort to the printing press again.

Austria’s Short-Lived Stability before Depression and Nazi Annexation

Unfortunately, Austria’s economic recovery was short-lived. In the second half of the 1920s, the Austrian government again increased expenditures, borrowed money to cover its deficits and raised taxes on the business sector and higher income individuals. This resulted in economic stagnation.

In 1931, Ludwig von Mises co-authored a report for the Austrian government that showed that fiscal policy had resulted in capital consumption. Business taxes, social insurance taxes and workers’ wages had increased so much between 1925 and 1929 relative to the rise in selling prices for manufactured goods that many enterprises had not had enough after-tax revenues to replace physical capital used up in production. Misguided Austrian fiscal policy had resulted in a partial “eating of the seed corn.”

With the coming of the Great Depression in the early 1930s Austria suffered a new financial crisis due to banking mismanagement. An attempted “bailout” to save some of Vienna’s leading banks created even more fiscal havoc with the Austrian government’s budget and led to a partial moratorium on payment of Austria’s international debt. Loans arranged through the League of Nations provided temporary stopgap remedies to the fiscal crisis.

But overshadowing even all of the economic chaos was a political crisis in 1933. A procedural voting dispute in the Austrian Parliament led the Austrian Chancellor, Engelbert Dollfuss, to suspend the country’s constitution and impose a one-party fascist-type dictatorship. In 1934, Austrian Nazis inspired by Hitler’s coming to power in Germany a year earlier murdered Dollfuss in a failed coup attempt.

Four years later, in March 1938, Hitler ordered the invasion of Austria, and the country was annexed into Nazi Germany. Austria’s previous monetary and fiscal mismanagement soon paled in comparison to its fall into the abyss of Nazi totalitarianism and then the destruction of World War II.

For those who say that such things as a hyperinflation, economic chaos, capital consumption, and political tyranny “can’t happen here,” it is worth remembering that a hundred years ago, in 1914, few in prewar Vienna could have imagined that it would happen there.

[Originally published at EpicTimes]

Categories: On the Blog

Environmental Shakedown

Somewhat Reasonable - May 12, 2014, 2:34 PM

Over a three-year period, 2009-2012, Department of Justice data shows American taxpayers footed the bill for more than $53 million in so-called environmental groups’ legal fees—and the actual number could be much higher. The real motivation behind the Endangered Species Act (ESA) litigation, perhaps, could have more to do with vengeance and penance than with a real desire to protect flora and fauna.

On May 7, I spoke at the Four Corners Oil and Gas Conference in Farmington, New Mexico. During the two-day event, I sat in on many of the other sessions and had conversations with dozens of attendees. I left the event with the distinct impression that the current implementation of the ESA is a major impediment to the economic growth, tax revenue, and job creation that comes with oil-and-gas development. I have written on ESA issues many times; most recently about the lesser prairie chicken’s proposed “threatened” listing (which the Fish and Wildlife Service [FWS] listed on March 27) and the Oklahoma Attorney General’s lawsuit against the federal government over the “sue and settle” tactics of FWS and the Department of the Interior.

While at the conference, I received an email announcing that FWS has asked a federal court for a six-month delay in making a final determination on whether to list the Gunnison sage grouse as an endangered species—moving the decision past the November elections. Up for re-election, Senator Mark Udall (D-CO) “cheered” the extension request. The E & E report states: Colorado elected leaders “fear the listing could have significant economic impacts.”

Kent Holsinger, a Colorado attorney specializing in lands, wildlife and water, posited: “Senator Udall is among those lauding the move—perhaps because a listing decision would affect his fate in the U.S. Senate. Gunnison sage grouse populations are stable, if not on the increase. In addition, myriad state, local and private conservation efforts have been put into place over the last decade. Those efforts, and the Gunnison sage grouse, are at risk if the FWS pursues listing.”

The report continues: “WildEarth Guardians is not opposing the latest extension after Fish and Wildlife agreed to some extensive new mitigation measures that will be made in the interim, including increasing buffer zones around sage grouse breeding grounds, called leks, and deferring coal, oil and gas leasing, said Erik Molvar, a wildlife biologist with WildEarth Guardians.” It goes on to say: “But the Center for Biological Diversity, which is a party to the settlement agreements with WildEarth Guardians, said the latest extension is a bad move for the grouse, which it says has needed ESA protections for years.”

Two important items to notice in the Gunnison sage grouse story. One, the power the environmental groups wield. Two, part of appeasing the environmental groups involves “deferring coal, oil and gas leasing.”

It is widely known that these groups despise fossil fuels. The Center for Biological Diversity (CBD) brags about its use of lawsuits to block development—but it is not just oil and gas they block; it is virtually all human activity.

In researching for this week’s column, I have talked to people from a variety of industry and conservation efforts. The conversations started because I read something they’d written about CBD. Whether I was talking to someone interested in protecting big horn sheep, a fishing enthusiast, or an attorney representing ranching or extractive industries, CBD seems to be a thorn in their side. All made comments similar to what Amos Eno, who has been involved in conservation for more than forty years, told me: “CBD doesn’t care about the critters. They are creating a listing pipeline and then making money off of it.” Environmental writer Ted Williams, in a piece on wolves, called CBD: “perennial plaintiffs.”

New Mexico rancher Stephen Wilmeth directed me to a CBD profile he’d written. In it he addressed how the CBD’s efforts targeted livestock grazing and sought “the removal of cattle from hundreds of miles of streams.” Wilmeth states: “CBD has elevated sue and settle tactics, injunctions, new species listings, and bad press surrounding legal action to a modern art form. Consent decrees more often than not result in closed door sessions with concessions or demands made on agency policy formulation.”

In a posting on the Society for Bighorn Sheep website titled “Legal tactics directly from the Center for Biological Diversity,” board member Gary Thomas states: “The Center ranks people second. By their accounting, all human endeavors, agriculture, clean water, energy, development, recreation, materials extraction, and all human access to any space, are subordinate to the habitat requirements of all the world’s obscure animals and plants. But these selfish people don’t care about any person, plant, or animal. The Center collects obscure and unstudied species for a single purpose, specifically for use in their own genre of lawsuits. They measure their successes not by quality of life for man nor beast, but by counting wins in court like notches in the handle of a gun.”

You’d expect someone like me, an energy advocate, to dis the CBD—and I have (CBD is not too fond of me)—but how’d it get such a broad-based collection of negativity from within the environmental community?

Ted Williams told me: “environmentalists who are paying attention are not happy with CBD.” He has written the most comprehensive exposé on CBD that can be found—for which he was threatened with a lawsuit. Without Williams’ work, one has to resort to bits and pieces off the internet to put together CBD’s modus operandi—but there is plenty to choose from!

One of the most interesting ones to catch my eye was a part of the post on SheepSociety.com. There, Thomas points out that the three founders of CBD are ex-Forest Service workers. He states: “To donors, their motives appear altruistic. To the informed, they look more like a 20-year quest for revenge for their firing.”

I am fairly well acquainted with CBD, but Thomas’ accusation was new to me—though it fit what I knew. (One of the very first pieces I ever wrote, when I originally got into this work seven-plus years ago, was on the one and only legal victory ever won against CBD. Arizona rancher Jim Chilton won a defamation suit against CBD with a $600,000 settlement. Nearly everyone I talked to as a part of my research for this story mentioned Chilton’s name with reverence.)

I dug around and found an interesting story from Backpacker Magazine that gave credence to Thomas’ claim. The February 2003 issue features a multi-page profile on Kieran Suckling, co-founder and executive director. Addressing the three founders, who were working for the Forest Service, Backpacker reports: “All three of them were frustrated by their agencies’ inaction.” The story goes on to explain how the threesome “hatched a plan” to petition the Forest Service and force it to list the spotted owl.

Then, I found a 2009 profile on Suckling in High Country News (HCN). It quotes Suckling describing how the roots of his full-time activism started while working for the Forest Service doing spotted owl surveys: “We had signed contracts saying we wouldn’t divulge owl locations, but we went the next day to the Silver City Daily Press, with a map that told our story. We were fired within seconds. That was the start of us becoming full-time activists.”

These snippets help explain Suckling’s animosity toward the Forest Service and other government agencies. CBD is gleeful over its results. It has sued government agencies hundreds of times and has won the majority of the cases—though many never go to court and are settled in a backroom deal (hence the term: “sue and settle”). Thomas writes: “They are extremely proud to report that single-handedly they deplete the U.S. Fish and Wildlife’s entire annual budget, approximately $5 million, for endangered species listings year after year by forcing them to use their limited funds defending lawsuits instead of their intended purpose.”

The HCN piece describes Suckling’s approach to getting what he wants—which he explains in the New Yorker, as “a new order in which plants and animals are part of the polity”: “The Forest Service needs our agreement to get back to work, and we are in the position of being able to powerfully negotiate the terms of releasing the injunction. … They [federal employees] feel like their careers are being mocked and destroyed—and they are. So they become much more willing to play by our rules and at least get something done. Psychological warfare is a very underappreciated aspect of environmental campaigning.”

“In CBD speak,” adds Wilmeth, “the suggestion of playing by the rules equates to its rules of manipulating positive outcomes for its mission.”

Putting the pieces together, it does appear, as Thomas asserts, that Suckling is on a 20+ year “quest for revenge” for being fired—vengeance that American taxpayers are funding.

Suckling is an interesting character. The Backpacker story cites his ex-wife, who said the following: “He’s not tethered on a daily basis to the same things you and I are tethered to.”

Tierra Curry is another name that comes up frequently in CBD coverage. CBD’s staff section of the website lists her as “senior scientist” and says she “focuses on the listing and recovery of endangered species.” As Warner Todd Huston reports: “Curry has an odd profile for an activist. She once claimed to have enjoyed dynamiting creek beds in rural Kentucky and taking perverse pleasure at sending fish and aquatic animals flying onto dry land and certain death. Now Curry spends her time filing petitions to ‘save’ some of the same animals she once enjoyed killing.”

Perhaps Curry’s frenetic listing efforts are her way of doing penance for her childhood penchant for killing critters.

The role vengeance and penance may play in CBD’s shakedown of the American public is just a hypothesis based on facts. But the dollars paid out are very real.

In an April 8, 2014, hearing before the House Committee on Natural Resources, Karen Budd-Falen, a fifth-generation rancher and attorney specializing in environmental litigation, talked about the need for ESA reform, as four different House bills propose: “Public information regarding payment of attorney’s fees for ESA litigation is equally difficult to access.” Addressing HR 4316—which requires a report on attorney’s fees and costs for ESA-related litigation—she says: “It should not be a radical notion for the public to know how much is being paid by the federal government and to whom the check is written.”

As she reports in her testimony, Budd-Falen’s staff did an analysis of the 276-page spreadsheet run released by the Department of Justice (DOJ) listing litigation summaries in cases defended by the Environment and Natural Resources Division, Wildlife Section. She explains: “The spreadsheets are titled ‘Endangered Species Defensive Cases Active at some point during FY09-FY12 (through April 2012).’ Although the DOJ release itself contained no analysis, my legal staff calculated the following statistics.” Budd-Falen then shows how she came up with the nearly $53 million figure of taxpayer money paid out over an approximately three-year period. However, she then shows how her own Freedom of Information Act requests have proven “that the DOJ does not keep an accurate account of the cases it defends”—making the actual dollar figure much higher.

Budd-Falen has stated: “We believe when the curtain is raised we’ll be talking about radical environmental groups bilking the taxpayer for hundreds of millions of dollars, allegedly for ‘reimbursement for attorney fees.’”

Budd-Falen’s research shows that for groups like CBD—which sue on process, not on substance—it really is about the money.

Eno believes that for the CBD, it isn’t about the critters: “CBD endangers the endangered species program on multiple fronts. First, their petitions and listing suits use up significant financial and personnel resources of both Office of Endangered Species and solicitors office in DOI. This means less funding and personnel devoted to species recovery. Second, CBD suits antagonize and jeopardize recovery programs of cooperating federal land management agencies, particularly USFS and BLM. Third, their suits have hampered forest and grassland management thereby inviting forest fires which endanger both human and wildlife (sage grouse) communities throughout the west. Fourth, CBD suits antagonize, alienate and create financial hardship for affected private land owners, thereby reducing both public support and initiatives and active assistance for listed species recovery.”

Despite numerous attempts, the ESA has not had any major revisions in more than 25 years. The Wall Street Journal states: “The ESA’s mixed record on wildlife restoration and its impact on business have made the law vulnerable to critics.” Groups like CBD have twisted the intent of the law. Reform is now essential—not just to save taxpayer dollars, but to put the focus back on actually saving the species rather than, as Wilmeth calls it: “the bastardized application of science, policy and education.”

The author of Energy Freedom, Marita Noon serves as the executive director for Energy Makes America Great Inc. and the companion educational organization, the Citizens’ Alliance for Responsible Energy (CARE). Together they work to educate the public and influence policy makers regarding energy, its role in freedom, and the American way of life. Combining energy, news, politics, and the environment through public events, speaking engagements, and media, the organizations’ combined efforts serve as America’s voice for energy.

[Originally published at Red State]

Categories: On the Blog

The Real State of the Economy–Not Obama’s Lies

Somewhat Reasonable - May 12, 2014, 1:13 PM

My Father was a Certified Public Accountant and so is my older brother, now comfortably retired in Florida. I tell you this because I would be hard-pressed to balance my checkbook.

Even so, you do not have to be smart with numbers to know that the real state of the U.S. economy is pathetic these days. You can thank Barack Obama for that because, dear reader, he is utterly clueless regarding America’s economy: how it works, and what it needs to work.

Peter Ferrara, a Senior Fellow at The Heartland Institute specializing in entitlement and budget policy and a contributor to Forbes magazine, is one of the people to whom I turn to understand the economy.

In a May 2 article titled “What Obama’s Growth Recession Is Stealing From Your Wallet,” Ferrara wrote: “Restoring that booming economic growth and prosperity (of past decades) is the core of solving all of our nation’s problems, not income or wealth redistribution, or addressing ‘inequality.’ But President Obama is not on the path of restoration. The latest report on real GDP growth estimates this year’s first quarter at a pitiful 0.01%. This is in the 6th year of Obama’s Presidency.”

The Heritage Foundation’s chief economist, Stephen Moore, writing on May 1st in the National Review, asked, “What happens to an economy when you do just about everything wrong?” Here’s his list:

• Say you spend $830 billion on a stimulus stuffed with make-work government-jobs programs and programs to pay people to buy new cars,

• you borrow $6 trillion,

• you launch a government-run healthcare system that incentivizes businesses not to hire more workers,

• you raise tax rates on the businesses that hire workers and on the investors that invest in the businesses that hire workers,

• you print $3 trillion of paper money,

• you shut down an entire industry (coal), and try to regulate and restrain the one industry that actually is booming (oil and gas).

“We made all of these imbecilic moves,” wrote Moore, “and the wonder of it all is that the U.S. economy is growing at all. It is a tribute to the indestructible Energizer Bunny that is the entrepreneurial U.S. economy that it keeps going and going even with all the obstacles.”  I want to argue with his use of “we”, but enough Americans elected Obama twice to justify it.

The Associated Press, much like most of the mainstream press, paused from protecting Obama in a May 2nd article that began “Despite the unemployment rate plummeting, more than 92 million Americans remain out of the labor force.”

As Harvard Ph.D., Jerome R. Corsi, a World Net Daily senior staff reporter, noted the same day as the AP article, “The Bureau of Labor Statistics (BLS) announcement that unemployment has dropped from 6.7 percent in March to 6.3 in April was partly attributed to some 800,000 workers dropping out of the labor force last month, reducing the labor participation rate to 62.8 percent, a new low for the Obama administration.”

When people stop looking for work, they are not counted as “unemployed.” Dr. Corsi put the actual unemployment rate in April at 12.3 percent! The numbers you read about from the BLS are “virtually meaningless.” They should just drop the “L” from their acronym.
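The arithmetic behind the headline number is worth seeing: the official rate counts only people actively looking for work, so when discouraged workers drop out, the rate falls even if no one finds a job. A minimal sketch in Python, using illustrative round numbers rather than official BLS figures:

```python
# Sketch of how the headline unemployment rate is computed.
# All figures are illustrative round numbers, not official BLS data.

def unemployment_rate(unemployed, labor_force):
    """Headline rate: the unemployed as a share of the labor force."""
    return 100 * unemployed / labor_force

# March: a hypothetical labor force of 156.0M with 10.5M unemployed.
march = unemployment_rate(10.5e6, 156.0e6)

# April: 0.8M people stop looking and leave the labor force entirely.
# They no longer count as unemployed, so both numbers shrink.
april = unemployment_rate(10.5e6 - 0.8e6, 156.0e6 - 0.8e6)

print(f"March: {march:.1f}%")  # March: 6.7%
print(f"April: {april:.1f}%")  # April: 6.2% -- lower, with zero new jobs
```

The rate drops roughly half a point purely because the denominator shrank, which is the mechanism Corsi is describing when he points to the falling participation rate.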

As the Wall Street Journal opined on May 3rd, “The Americans who left the workforce include older workers who retired before they wanted to, millions who have taken disability, and others who simply don’t find the job openings to be worth the cost of giving up public benefits.”

You don’t have to be an economist to know the truth that has finally sunk into the minds of millions of Americans, many of whom are unemployed or know someone who is. Obama has driven the economy into the toilet. He has foisted trillions of dollars of debt on future generations. In voting for “the first black President of America,” what those voters and the rest of us got was a man with no experience running so much as a sidewalk lemonade stand.

I think those voters will want a change in November when the midterm elections are held. Between now and then, I want the Republican Party to spend a little less time on the Benghazi scandal and a lot more time telling voters their plans to revive the economy because, in the end, that is the single most important issue facing all of us.

[Originally published at Warning Signs]

Categories: On the Blog

Thorner: Com Ed’s Smart Meters Poke Holes In Privacy Walls

Somewhat Reasonable - May 12, 2014, 9:39 AM

Historical background: On January 17, 2008, President Obama said, “Under my plan of a cap-and-trade system, electricity rates would necessarily skyrocket.” Cap-and-trade legislation was voted down in the U.S. Senate in April of 2009, despite the heaviest-in-the-nation lobbying effort by Chicago-based Exelon Corporation (John Rowe, CEO at the time) to pass it. Commonwealth Edison, commonly known as ComEd, is the largest electric utility in Illinois, serving Chicago and northern Illinois. It is a unit of Chicago’s Exelon Corporation. As such, Exelon supplies the power; ComEd sells the power.

With the failure of cap-and-trade legislation, so-called smart meters (representing a power takeover) are being forced upon consumers by electric utilities, including Illinois’ ComEd, as just another technology to achieve government-sponsored extortion of American citizens. In 2009 the U.S. government allocated $11 billion of taxpayer funds from the 2009 bailout package to develop a “smart” grid, including “smart” meters for every home’s electricity, gas and water. Accordingly, smart meters have become an integral part of the infrastructure to implement U.N. Agenda 21, the resulting document of the 1992 Rio Conference in Brazil (informally, the Earth Summit), whose principal themes are the environment and sustainable development. President George H. W. Bush represented this nation, along with 172 other nations, 108 of them represented at the level of head of state or government.

Smart meter defined: Smart meters are digital meters that utilities are using to replace current mechanical meters. Because smart meters can control the amount of energy used, their use ties in with the Obama administration’s efforts to rein in CO2 emissions as the cause of manmade global warming. While the two meters look similar, a mechanical meter has a rotating wheel to calculate energy usage, whereas a smart meter uses a digital readout. That is where the similarity ends. As this article cites:

Smart meters may be “smart” but they are not private. Once a smart meter is attached to a home, it can tell how many people live in the house, when they get up, when they go to bed, and when they aren’t home. It can tell how many showers they take and loads of laundry they do, how often they use the microwave, and how much and what kind of TV they watch. The information gathered from your house is sent to a neighborhood smart meter, which then wirelessly transmits your information to a municipal network and on to the national network, which is the Smart Grid.

Where smart meters have already been installed, some utility companies have established three main time categories at which kilowatt hours are billed: on-peak, mid-peak and off-peak, with peak time almost twice as expensive as off-peak time. As off-peak often does not begin until 9:00 at night, keeping electric bills down would mean waiting until after 9:00 p.m. to cook, bathe, run the washer or dryer, or use heat or air conditioning. There is every possibility that eventually the utility companies will control how much energy we use and when. Going a step further, it is not unreasonable to believe that smart chip-equipped appliances could one day be developed that could be shut off remotely.
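To see what time-of-use pricing can mean for a bill, here is a minimal sketch with hypothetical rates; the cents-per-kWh figures and usage splits below are invented for illustration and are not ComEd’s actual tariff:

```python
# Illustrative time-of-use bill calculation. Rates and usage splits are
# hypothetical round numbers, not ComEd's actual tariff.

RATES = {              # cents per kWh
    "on_peak": 14.0,   # roughly twice the off-peak rate
    "mid_peak": 10.0,
    "off_peak": 7.0,
}

def monthly_bill(usage_kwh):
    """usage_kwh maps each rate period to the kWh consumed in it."""
    return sum(kwh * RATES[period] for period, kwh in usage_kwh.items()) / 100

# A household that cooks, washes, and runs appliances during the day...
daytime_heavy = {"on_peak": 400, "mid_peak": 200, "off_peak": 100}
# ...versus one that shifts the same 700 kWh to after 9:00 p.m.
night_shifted = {"on_peak": 100, "mid_peak": 200, "off_peak": 400}

print(f"Daytime-heavy: ${monthly_bill(daytime_heavy):.2f}")  # $83.00
print(f"Night-shifted: ${monthly_bill(night_shifted):.2f}")  # $62.00
```

Same total consumption, yet the night-shifted bill is about 25% lower; that pricing lever is exactly what the author warns the utility will control.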

Evidence of health and property damage: There are already hundreds of thousands of cases against various utility companies over the detrimental health effects of smart meters. As stated by Dr. De-Kun Li, a respected Kaiser Permanente scientist, “Utilities and industry simply haven’t done any studies to show that ‘smart’ meters are safe. Not one.” Heed this report as to the evidence of health damage:

Tens of thousands of individuals are reporting officially, to governments and utilities, that they are experiencing illness or functional impairment following the installation of “smart” meters. Reported symptoms include headaches, sleep problems, ear ringing, focus difficulties, fatigue, heart palpitations, nausea and statistically abnormal recurrences of cancer. According to court-ordered documentation and independent testing, utilities have been proven to be lying about how often “smart” meters transmit bursts of microwave radiation. Depending on the utility, their claims are typically something like “4 to 6 times per day” (Pepco) or “45-60 seconds per day” (PG&E), whereas courts and independent testing reveal that meters are transmitting in the range of 10,000 to 190,000 pulsed microwave transmissions per day.

 The amount of transmitted microwave radiation has been measured up to 200 times greater (if one is standing next to the meter) than the Building Biology standard threshold for ‘extreme concern.’

Incredible as it may seem, almost none of the meters, mostly made in China, have been tested or approved by UL or an equivalent standards body, a failure linked to well over a thousand home fires and to tens of thousands of individuals experiencing appliance breakage in their homes. Because there is no UL approval of smart meters, homeowners have no guarantee of coverage and are often left to deal with the expense of damage and repair. Since everything else we plug in has been tested and checked before we purchase it, why would smart meters not require the same UL approval? It just so happens that the federal government has waived that requirement for the utilities.

As with the phasing out of the Edison light bulb, replaced largely by CFL bulbs imported from China that cannot be disposed of in the normal way because of the mercury they contain, the push for smart meters over mechanical meters has everything to do with controlling the amount of energy we use in order to limit CO2 emissions, which the Obama administration has made the culprit of manmade climate change, dovetailing with U.N. Agenda 21.

Commonwealth Edison is coming for you! ComEd’s smart meter roll-out will affect its customers throughout Illinois. While ComEd is accelerating the deployment of smart meters here in Illinois, the largest Massachusetts electric utility has declared smart meters “irrational.” As with most things our government gets involved with, it is a lose-lose situation for consumers and taxpayers. ComEd is offering a temporary opt-out, but the utility has made it very difficult for those wishing to do so, and a fee is imposed for opting out.

Bonnie O’Neil, a co-writer with Thorner of a series of six U.N. Agenda 21-related articles published at Illinois Review and Vice President of Eagle Forum in CA, sent me the following information about her hometown and smart meters.

Without any warning, a smart meter was put on all the homes in Newport Beach (and most of the county). It was located right outside of my office. The biggest health problems surrounding smart meters are usually associated with the meter being near a bed or in an area where there is a lot of contact. I decided to use the “opt out” program to have my meter replaced with one like I had been using previously. I had to pay a fee, and I continue to have an extra fee on my electric bill as a result. I believe the “opt out” program was allowed in California because there was a growing controversy about smart meters.

Smart meters are not mandated by the federal government.  According to the Energy Policy Act of 2005, the utilities may “offer” them and install them “upon customer request” but not force customers to have one installed.

Analog (mechanical) meters function as electromechanical timers and have been safely used for decades, and they still work. Smart meters are extremely hackable, and customers suffer increased utility bills virtually across the board immediately following a “smart” meter installation. A weather-proofed “Do Not Install Smart Meter” sign can be placed next to your mechanical meter. Six signs are available for purchase, among them “Say NO to Health Effects from Radiation” and “Say NO to Increased Utility Costs.”

Time for public action! A public meeting and screening of “Take Back Your Power,” Josh del Sol’s award-winning documentary, will be held at National Louis University, 850 Warrenville Road, Lisle, IL (Room #145) on Tuesday, May 27. The program is sponsored by the West Suburban Patriots and Americans for Prosperity. A Q&A workshop will follow the film. Showtime is 6:45 p.m.; doors open at 6:30 p.m. Although the event is free, donations will be gratefully accepted. DVDs of Take Back Your Power, which investigates the erosion of rights in the name of “smart” and “green,” will be available for purchase for $20. Check out this YouTube movie trailer.

The good news is that this master plan of control envisioned by U.N. Agenda 21, one aspect of which is to install smart meters in virtually all Western countries, cannot be achieved if enough individuals in this nation simply opt out of participating. The bad news is that Illinois legislators are not interested in addressing the issue. Why? Because their bread is often buttered by campaign donations from ComEd.

What’s at stake are our basic rights to life, health, and choice, and freedom itself. Say NO to ComEd when it wants to install a smart meter in your home. Also inform everyone you know about what smart meters are and why they too must opt out. And by all means, contact your Illinois House and Senate representatives!

[Originally published at Illinois Review]

Categories: On the Blog

Read AT&T’s FCC Filing that Totally Debunks Title II Reclassification

Somewhat Reasonable - May 12, 2014, 9:27 AM

Given the avalanche of misinformation and manufactured hysteria from net neutrality proponents over the FCC’s proposed rulemaking to bring the FCC’s Open Internet Order into compliance with the Appeals Court’s Verizon v. FCC decision, AT&T’s FCC filing here (and below) is a welcome and much-needed debunking of the call for Title II reclassification of broadband.

For any analyst, reporter, or other observer who cares to understand how Title II common-carrier law and regulation would actually play out in the real world, rather than in the nostalgic imaginations of people with no real-life experience in this matter, this filing eviscerates Title II proponents’ partial, over-simplified, inexperienced, and ill-informed thinking.

Proponents of Title II reclassification, beware: if you read this AT&T rebuttal, you will begin to comprehend the vacuousness of the arguments for reclassification of broadband, and you will realize that manufactured public perception is no match for facts, reality, and real-world experience.

Opponents of Title II reclassification will be heartened to read the wealth of strong, factual, and legally defensible arguments against this ill-conceived, ill-advised and ill-timed effort to reclassify broadband as a monopoly telephone service.

For those interested, be sure to read this filing and encourage others to read it as well.

May 9, 2014

VIA ELECTRONIC SUBMISSION

Marlene H. Dortch

Secretary

Federal Communications Commission

445 12th Street S.W., Washington, D.C. 20554

Re:      Open Internet, GN Docket No. 14-28

Dear Ms. Dortch:

On May 8, 2014, Hank Hultquist, Gary Phillips, Christopher Heimann, and I, on behalf of AT&T, met separately with Daniel Alvarez, Legal Advisor to Chairman Wheeler, Priscilla Delgado Argeris, Legal Advisor to Commissioner Rosenworcel, Nick Degani, Legal Advisor to Commissioner Pai, and Amy Bender, Legal Advisor to Commissioner O’Rielly, to discuss how the Commission should proceed in response to the D.C. Circuit’s vacatur and remand of the Commission’s Open Internet rules in Verizon v. FCC, 740 F.3d 623 (D.C. Cir. 2014). Consistent with our comments in this proceeding, we explained that the court’s decision requires only that the Commission fine-tune its prior rules insofar as they apply to fixed broadband by narrowly tailoring the nondiscrimination requirement to address only true threats to Internet openness and allowing ISPs to make individualized decisions whether and on what terms to deal with edge providers.1

We noted in particular that calls for reclassification of broadband Internet access services as a Title II telecommunications service would cause risks and harms that dwarf any putative benefits, all but scuttle the administration’s ambitious broadband agenda, and would not, in all events, preclude the paid prioritization arrangements that seem to be the singular focus of reclassification proponents.

As the FCC’s National Broadband Plan2 recognized, the nation’s overriding communications policy objective for the 21st century is to promote universal broadband deployment and adoption. Private investment, not prescriptive regulation, is the key to achieving that goal. According to the Plan, “the American broadband ecosystem has evolved rapidly” over the past decade, and this evolution has been “[f]ueled primarily by private sector investment and innovation.”3 Broadband providers are continuing to invest tens of billions of dollars each year

1 AT&T Comments, GN Docket No. 14-28 (filed March 21, 2014).

2 FCC, Connecting America: The National Broadband Plan (2010) (Broadband Plan).

3 Id. at XI.

 

in America’s broadband future, creating thousands of new jobs. But achieving the next phase of broadband deployment envisioned by the National Broadband Plan will require more—according to the Commission’s own estimates, $350 billion more.4 The National Broadband Plan thus wisely endorsed “actions government should take to encourage more private innovation and investment,” while emphasizing that “the role of government is and should remain limited.”5

When the Commission last considered reclassification proposals, industry analysts warned that such proposals, even when accompanied by forbearance and portrayed as “third way” alternatives to maximal dominant-carrier regulation, would create enormous investment-deterring regulatory uncertainty. For example:

  •  Craig Moffett of Bernstein Research observed, on the day the Commission proposed Title II reclassification, that “Markets abhor uncertainty. Today we got uncertainty in spades.” He added that “it is unclear what, precisely, this means for [other] information service providers, including Google”; that he “expect[ed] a profoundly negative impact on capital investment”; and that the “third way” was “an unequivocal negative development[.]”6

  •  Jonathan Chaplin of Credit Suisse explained, also in the aftermath of the reclassification proposal, that “[t]he biggest disconnect between Washington and Wall Street is on how the competitiveness of the industry is viewed. . . . Competition is doing its job and regulations would make it very difficult for companies to get reasonable return on investment. . . . The threat of regulation could discourage investment and cost jobs[.]”7
  •  Mike McCormack of J.P. Morgan agreed that investors were “extremely nervous about what’s coming” out of this proceeding, and added that “[b]roadband is a very competitive place so there’s no point [in] fixing it[.]”8
  •  Anna-Maria Kovacs of Regulatory Source Associates noted that it would “take years to know whether [any reclassification decision] is upheld in court. . . . [W]e would expect the industry—telco, wireless, and cable—to assess capital investments from this point in light of the potential for new and more extensive regulations.”9

4 Staff Presentation, September 2009 Commission Meeting, at 45 (Sept. 29, 2009), http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-293742A1.pdf.

5 Broadband Plan at 5.

6 Craig Moffett, Quick Take - U.S. Telecommunications, U.S. Cable & Satellite Broadcasting: The FCC Goes Nuclear, Bernstein Research, May 5, 2010 (“Moffett, Quick Take”) (emphasis added).

7 Yu-Ting Wang & Howard Buskirk, Reclassification Said to Pose Broad Risk to U.S. Economy, Communications Daily, at 1 (June 14, 2010) (some emphasis added and some omitted).

8 Id. (emphasis added).

9 Anna-Maria Kovacs, Telecom Regulatory Note: D.C. Circuit vacates FCC’s Comcast network-management order, Regulatory Source Associates, LLC, at 2 (Apr. 7, 2010) (emphasis added).


  •  Stanford tech analyst Larry Downes claimed that a reclassification “would be the worst example in history of a tail wagging the dog” and perhaps “the worst idea in communications policy to emerge in the last 75 years—that is, since the [FCC] was first created in 1934.”10
  •  PC Magazine commentator and MarketWatch analyst John Dvorak described the proposed Title II reclassification as “the worst possible outcome” of the net neutrality debate and “a terrible idea” that would “destroy the Internet as we know it.”11
  •  Former Chairman Michael Powell, then with Providence Equity Partners, “fear[ed] a prolonged period of uncertainty and instability” in the wake of any Title II reclassification decision that would “undermine the shared goal of intensifying our nation’s investment in broadband.”12
  •  The Washington Post editorial page explained that any attempted reclassification under Title II would be “a legal sleight of hand that would amount to a naked power grab” and “could damage innovation in what has been a vibrant and rapidly evolving marketplace.”13

A panel of financial experts convened at New York University law school echoed all of these concerns:

  •  Height Analytics Managing Director Tom Seitz warned that “the FCC could be inhibiting investment through its net neutrality and reclassification investigations” because “[i]nvestors hate uncertainty and clearly what is being created right now is uncertainty in the marketplace[.]”
  •  Citigroup Managing Director Mike Rollins expressed concern that reclassification would open the door for “a later FCC to . . . limit the number of Title II provisions from which it will forbear[.]” This risk, he added, would have an investment impact today, because “[w]hen investors are looking at policy decisions they’re not just looking at what the FCC wants to accomplish today but what those policies can do over time.”
  •  Wise Harbor founder Keith Mallinson noted that “people are hungry to have more capabilities [in their broadband connections] and the market has the capability to deliver

10 Larry Downes, What’s in a title? For broadband, it’s Oz vs. Kansas, CNET News, Mar. 11, 2010, http://news.cnet.com/8301-1035_3-20000267-94.html (“Oz vs. Kansas”) (emphasis added).

11 John Dvorak, Net neutrality becomes a dangerous issue, MarketWatch, Apr. 16, 2010, http://www.marketwatch.com/story/print?guid=2012C86A-55C5-4CA0-821F-F203C21E2B6E (emphasis added).

12 Michael K. Powell, My Take on the Appeals Court Decision, Broadband for America, Apr. 7, 2010, http://www.broadbandforamerica.com/blog/michael-powell-my-take-appeals-court-decision (“Powell, My Take on the Appeals Court Decision”) (emphasis added).

13 Editorial, Internet oversight is needed, but not in the form of FCC regulation, Wash. Post, Apr. 17, 2010, http://www.washingtonpost.com/wp-dyn/content/article/2010/04/16/AR2010041604610.html (emphasis added).


that, but increasing regulation has the risk of stifling that through the uncertainties but also by limiting some basic economic freedoms.”14

These concerns about the long-term investment-deterring effects of regulatory uncertainty are, if anything, understated. First, by themselves, the threshold legal challenges to the Commission’s reclassification decision could consume much of the next decade, depending on the number of judicial remands. The communications industry suffered through similar regulatory chaos following the Commission’s effort in 1996 to shape the industry around the UNE-P model of intramodal “competition” for voice telephony services. That model ultimately succumbed to judicial challenges—but only eight years later, in 2004, after multiple and increasingly skeptical remands by the Supreme Court and the courts of appeals.

Second, quite apart from direct legal challenges to the Title II regime itself, any reclassification decision would ignite multi-year regulatory controversies on a variety of issues, including what portions of Title II would and would not apply to Internet service providers, and which Internet services and service providers would be subject to Title II. Title II is a comprehensive regulatory framework put into place in 1934 to regulate monopoly telephone companies. Additional provisions were added over the years, including in 1996, with a spate of wholesale obligations applicable to telecommunications service providers. Title II reclassification would automatically trigger application of all these requirements to broadband Internet access services and Internet service providers.

To be sure, the Commission might attempt to minimize the disruption and calm the markets by proposing to forbear from most statutory provisions in Title II, as it did when it first proposed reclassification. But sorting through which of these provisions should apply and which would be subject to forbearance would itself ignite controversy, disagreement, and litigation, creating protracted regulatory uncertainty. And even if the Commission were to successfully exercise its forbearance authority, the new Title II regime would still be far more regulatory, and create far more regulatory uncertainty, than the pre-Comcast Title I regime – as the Commission itself recognized sixteen years ago in the Stevens Report. In that report, the Commission rejected a Title II classification for ISPs and, in the process, rejected claims that forbearance would eliminate the policy harms of such a classification. It explained:

Notwithstanding the possibility of forbearance, we are concerned that including information service providers within the “telecommunications carrier” classification would effectively impose a presumption in favor of Title II regulation of such providers. Such a presumption would be inconsistent with the deregulatory and procompetitive goals of the 1996 Act. In addition, uncertainty about whether the Commission would forbear from applying specific provisions could chill innovation.

Stevens Report, 13 FCC Rcd at 11525, para. 47.

14 Howard Buskirk, Regulatory Uncertainty Created by FCC Seen Limiting Network Investment, Communications Daily, July 15, 2010 (“Buskirk, Regulatory Uncertainty”) (emphases added); see also John Curran, Panelists: Neutrality, Title II Broadband Issues Breeding Investor Uncertainty, TR Daily, July 14, 2010 (“Curran, Panelists”).


Indeed, reclassification would raise a host of issues that reclassification proponents have completely ignored in their advocacy. For example, if broadband Internet access service is a telecommunications service, then broadband Internet access providers could be entitled to receive transport and termination fees under section 251(b)(5).15 The Commission could not avoid this occurrence by establishing a bill-and-keep regime because, unlike voice traffic, Internet traffic is asymmetric. And because Internet traffic would now be subject to reciprocal compensation, virtually every settlement-free peering arrangement would have to be replaced by newly negotiated arrangements implementing the reciprocal compensation provisions of the Communications Act. Moreover, in those instances in which reciprocal compensation does not apply, ISPs would be entitled to file tariffs for the collection of charges for terminating Internet traffic to their customers.

Section 222 obligations would also kick in, imposing new obligations on a host of entities and causing wholesale disruption of current Internet business models. ISPs at both edges of the network, as well as transit providers, content delivery networks and others, would appear to be statutorily required to take reasonable measures to prevent disclosure or use of information, such as IP addresses, websites visited, customer location information and other data, and they would be precluded from using this information without customer consent. Email providers and search engines, as telecommunications service providers in their own right, could likewise be subject to these requirements.

And on top of all this, entities classified as telecommunications service providers would have to assess Universal Service Fees on their customers. While the current 17% contribution factor would presumably be reduced, this would still amount to a substantial tax on Internet use.

Moreover, sections 201 and 202 would automatically apply once the Commission classified broadband Internet access services as telecommunications services. And since the flashpoint for this debate is “paid prioritization,” it is unlikely that the Commission would forbear from applying either of these provisions. But both sections contain vague and self-executing prohibitions that could make Internet service providers liable for any conduct that some future Commission, bowing to the same types of political pressures and irrational hysteria that we now see, decides to deem unreasonable. ISPs would thus have to assess litigation risk whenever, among other things, they engage in new anti-piracy measures, network-management techniques, or commercial arrangements with particular applications and content providers. The uncertainty could deter such initiatives to the detriment of broadband providers, application and content providers, and ultimately consumers.

Beyond all this, any forbearance decision today could be prone to judicial challenge and attempted reversal by future Commissions. No issue would ever be settled, and the Internet ecosystem would be subject to a state of perpetual regulatory uncertainty. As Commissioner

15 In its 2011 USF/ICC Transformation Order, the Commission held that all telecommunications traffic exchanged with a LEC is subject to the section 251(b)(5) obligation to establish reciprocal compensation arrangements. Connect America Fund, et al., WC Docket No. 10-90, et al., Report and Order and Further Notice of Proposed Rulemaking, 26 FCC Rcd 17663, para. 769 (2011) (USF/ICC Transformation Order), pets. for review pending sub nom. In re: FCC 11-161, No. 11-9900 (10th Cir. filed Dec. 8, 2011).


McDowell has noted, this would hardly be the “environment needed to attract up to $350 billion in private risk capital to build out America’s broadband infrastructure.”16

In this regard, it is by no means clear whether a decision now to forbear from particular Title II requirements could be reversed by this or a future Commission. Indeed, there have been a spate of petitions to overturn past Commission forbearance decisions, and the Commission has, conspicuously, declined to dismiss those petitions on the grounds that forbearance decisions are irreversible. Moreover, insofar as the Commission has forborne from applying Title II itself to Verizon’s broadband transmission services, the Commission would have to reverse that very forbearance decision in order to resurrect Title II regulation of the connectivity component of a broadband Internet access service. This action, in itself, would be inconsistent with any purported assurance that forbearance decisions are not readily reversible.

Moreover, it is foolish to think that the Commission could reclassify the provision of broadband Internet access to consumers as a telecommunications service without similarly reclassifying a broad array of other functionally analogous services in the Internet ecosystem. For example, there is no logical or legally sustainable basis to distinguish between ISPs serving consumer “eyeballs” and those serving content and other edge providers. Likewise, transit providers and content delivery networks (CDNs) would be telecommunications service providers subject to Title II, as would connected device customers. (The latter would be resellers of telecommunications services and thus telecommunications service providers in their own right.) Indeed, the logic behind reclassification would dictate that when a search engine connects an advertising network to a search request or effectuates a connection between a search user and an advertiser, it too would be providing a telecommunications service. And so too would an email provider that transmits an email or a social network that enables a messaging or chat session.

The point is, once the Commission separates transmission from information processing, there is no way logically to limit that rationale to one segment of the Internet and not others.  Every entity that provides an over-the-top communications capability, whether it’s voice, text, or video, becomes either a facilities-based provider or a reseller (or both) of a telecommunications service.

In this regard, any attempt to confine Title II reclassification to owners of last-mile transmission facilities would crash headlong into the statutory language, Supreme Court precedent, and 75 years of Title II jurisprudence.  The classification of any provider as a Title II “common carrier” has never depended on whether the provider owns transmission facilities, let alone last-mile facilities.  That is why standalone long-distance telephone companies, such as the legacy AT&T Corp., MCI, and Sprint, were always treated as Title II carriers even though they depended on local exchange carriers for their last-mile connectivity, and why even long-distance resellers are treated as Title II carriers even though they often own no facilities at all.  Here, the retail service that ISPs offer to consumer and business users encompasses end-to-end access to all points on the Internet, even though each user’s ISP must generally rely on other providers to supply some of the links to each of those points (for example, through peering and transit arrangements among Internet backbones).

16 Commissioner Robert McDowell, “The Best Broadband Plan for America: First, Do No Harm,” Free State Foundation Keynote Address, at 13 (Jan. 29, 2010), http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-296081A1.pdf.


The key legal rationales for any Title II reclassification decision would thus necessarily extend to any Internet provider that holds itself out to customers as arranging for the transmission of data from one point on the Internet to another, whether or not it owns transmission facilities.  As discussed above, this category would extend to ISPs such as Earthlink and AOL that do not own last-mile transmission facilities; to content delivery networks (“CDNs”) such as Akamai that hold themselves out to the commercial public as transporters of data to distant points on the Internet; to providers of e-readers like Amazon.com, which provides Internet access through the Kindle; to companies like Google that provide advertising-supported Internet search services and, on behalf of countless commercial customers, arrange for the transmission of advertising content to end users; and to a variety of other online transport providers ranging from Netflix to Level 3 to Vonage.  In short, Title II reclassification would be a sledgehammer, not a scalpel.

The supreme irony here is that Title II reclassification would not even preclude the paid prioritization arrangements that are purportedly animating reclassification proposals.  Title II does not require that all customers be treated the same, as reclassification proponents seem to believe.  Rather, by its express terms, Title II prohibits only “unjust and unreasonable” discrimination, and it is well established that Title II carriers may offer different pricing, different service quality, and different service quality guarantees to different customers so long as the terms offered are “generally available to similarly situated customers.”  For example, even telecommunications carriers considered “dominant” are permitted to negotiate contracts for special access services that include such preferential treatment as:  (1) service level guarantees, (2) expedited and prioritized service installation, and/or (3) expedited and prioritized repair.  Such offerings may be individually negotiated with the customer, along with the other terms on which the service is made available, and need not be provided to all customers — only those customers who execute the same contract as the first customer or who are able to negotiate the same terms in the context of another contract.  Indeed, telecommunications carriers are not even obligated under Title II to offer the same contract to every customer who might want it.  Rather, the contract (including the service level guarantee or prioritized installation or repair) must only be made available to “similarly situated customers,” and under well-established precedent, customers are not similarly situated if, among other things, they operate in different competitive environments or if the cost of serving them is higher than the cost of serving the first customer.

Nor does Title II require uniform pricing.  For example, the Commission has allowed dominant carriers to make the following types of price distinctions for years:

  •  Volume discounts – discounts available only to customers who commit to purchase services in larger volumes.
  •  Term discounts – discounts available only to customers who commit to purchase services for specified terms, with longer term commitments commanding bigger discounts.
  •  Multiple service discounts – discounts available only to customers who purchase multiple services.



  •  Competitive necessity discounts – discounts needed to respond to competition, which may be offered on a selective basis.

And it has provided nondominant carriers even broader latitude to negotiate individually tailored agreements regarding rates, terms, and conditions.  For example, the Commission has concluded that CMRS providers’ grant of discriminatory concessions to consumers who haggle was reasonable, benefited consumers, and thus was consistent with section 202’s non-discrimination clause.17

In short, reclassification of broadband Internet access services would impose a host of harms, including investment-killing uncertainty, without doing anything to remedy the alleged “problem” (i.e., paid prioritization) it purportedly is intended to address.  Calls for reclassification are not well thought out and should be promptly rejected.

Respectfully submitted,

/s/ Robert W. Quinn, Jr.

cc:

Jonathan Sallet

Daniel Alvarez

Priscilla Delgado Argeris

Nick Degani

Amy Bender

17 See Orloff v. Vodafone AirTouch Licenses LLC d/b/a Verizon Wireless, 17 FCC Rcd 8987 (2002), petition for review denied sub nom. Orloff v. FCC, 352 F.3d 415 (D.C. Cir. 2003), cert. denied, 542 U.S.

[Originally published at Precursor Blog]

Categories: On the Blog

Pots and Pans and the Net Neutrality Mess

Somewhat Reasonable - May 11, 2014, 3:04 PM

What a mess!

I am speaking, of course, of the “deliberation” concerning FCC Chairman Tom Wheeler’s plans for the FCC to consider adopting a new net neutrality rulemaking. I put “deliberation” in quotes, in part, because Free Press is calling on its supporters to “bring pots and pans and whatever else you can bang on” to make a lot of noise on the FCC’s doorstep.

Well, the “Pots and Pans” strategy is one way of thinking about what influences the FCC’s decision-making process. And this way of thinking doesn’t particularly go hand-in-hand with the notion of the agency’s commissioners calmly trying to figure out how to employ their so-called policymaking “expertise.”

As regular readers know, it has been my firm position that, after the DC Circuit’s Verizon decision, absent convincing evidence of market failure and demonstrable consumer harm, the FCC should not try to reinstate the net neutrality regulations the DC Circuit tossed out. Nevertheless, when Chairman Wheeler announced his intent to move forward with yet another net neutrality rulemaking, this time one based on a “commercial reasonableness” standard for assessing Internet providers’ practices, I said in a statement that “there appear to be elements in his proposal that may mitigate the otherwise potential harmful effects of unnecessary government intervention.”

I understand the pressures that Chairman Wheeler apparently felt as a result of the blowback and sloganeering from “consumer advocacy groups” and even the media. But I wish that he had offered a principled, consistent defense of his position regarding the commercial reasonableness standard. Instead, he seemed, at least rhetorically, to back away from articulating such a defense, rather quickly pivoting to a “get tough” mode of threatening Internet providers with Title II common carrier regulation — the regime devised early in the last century to regulate POTS, or “plain old telephone service.”

Chairman Wheeler’s noticeable shift to “Title II talk” didn’t at all mollify the net neutrality advocates. It only further fueled their Title II fires and calls for pots and pans to bang on.

With almost a decade’s worth – yes, you read that right – of pieces on the Free State Foundation website explaining why, at least absent evidence of market failure and consumer harm, adoption of net neutrality regulation is unwarranted, I don’t want to re-argue the case here. I just want to suggest two points for your consideration at this particular moment in time:

First, in my view, it is not nearly as clear as the pro-net neutrality advocates seem to believe that Title II classification of ISPs – that is, classifying them as common carriers – would survive a judicial challenge. Indeed, I think the FCC’s legal case would be fairly problematic.

While it is true enough that, under established administrative law principles, an agency may change its mind, it nevertheless must provide a well-reasoned explanation for such a change. Pointing to the number of protesters banging on pots and pans outside the FCC’s doors is not likely to suffice. Neither is pointing to the agency’s disappointment at already having been twice rebuffed by the DC Circuit under alternative theories.

The main reason the FCC’s case for sustaining a Title II challenge would be problematic is this: In defending its decision to classify Internet service providers as information service providers – thereby removing them from the ambit of Title II regulation – the Commission argued that, from a consumer’s perspective, the transmission component of an information service is integral to, and inseparable from, the overall service offering. This functional analysis of ISPs’ service offerings was the principal basis upon which the Supreme Court upheld the FCC classification determination in 2005 in its landmark Brand X decision.

I don’t think the integrated, inseparable nature of ISPs’ service offerings, from a functional standpoint, and from a consumer’s perspective, has changed since the Brand X decision, so it won’t be easy for the Commission to argue that it is changing its mind about the proper classification based on changed consumer perceptions of the service offerings’ functionality. And to the extent that the Brand X Court cited favorably to the FCC’s claims concerning the then-emerging marketplace competition and the dynamism in the broadband marketplace, those factors, if anything, today argue even more strongly for a non-Title II common carrier classification.

I understand the role that so-called Chevron deference can play in upholding agency decisions. Indeed, it played an important role in the Court’s decision in Brand X. But invoking Chevrondeference won’t relieve the FCC of the need to provide persuasive reasoning in support of an abrupt about-face on a point the agency litigated – successfully – all the way up to the Supreme Court.

The second point I wish to make is this: We have now come to a juncture where – again, assuming no present convincing evidence of market failure and consumer harm – the FCC ought to await further direction from Congress. I understand that in Verizon the majority, over a dissent, interpreted Section 706 in a way that arguably gives the agency authority to adopt new net neutrality rules, as long as they don’t, as a practical matter, amount to imposing common carrier obligations on ISPs. And I understand the DC Circuit’s decision represents the current “state of the law” on the question of Section 706 authority. But I think Commissioner Michael O’Rielly makes a persuasive argument – an argument that happens to reflect the Commission’s original position until it did a late switcheroo – that Congress never intended Section 706 to be interpreted as an affirmative grant of authority to allow the agency to adopt a net neutrality-like regime. I’m glad Commissioner O’Rielly articulated his position at the Free State Foundation’s annual conference.

Now that we’ve come to the present point – the “What a mess!” point – the FCC ought to give Congress an opportunity to act before moving forward. After all, the members of Congress actually were elected to make important policy decisions. In order to agree with me, you don’t have to believe the FCC couldn’t possibly succeed in adopting new net neutrality regulations. You simply have to agree that it is a good time for Chairman Wheeler and his colleagues to exercise a modicum of regulatory humility.

[Originally published at The Free State Foundation]

Categories: On the Blog

Government Policy: Save the Planet from the Plague of Hungry Humans

Somewhat Reasonable - May 10, 2014, 10:18 PM

Our friends at the Independent Women’s Forum sent a letter the other day to Agriculture Secretary Tom Vilsack expressing concern about the Dietary Guidelines Advisory Committee becoming overly ideological.

Brace yourself: The guidelines recommend taking great care to feed humanity while being mindful of the carbon footprint consuming food requires … no matter the cost.

IWF Senior Fellow Julie Gunlock wrote at National Review Online about the food nannies our First Lady has decided to direct:

Every five years, a committee of officials chosen by the U.S. Department of Agriculture and the U.S. Department of Health and Human Services reviews the federal dietary guidelines. This committee, called the Dietary Guidelines Advisory Committee, is mandated by Congress to work on “providing nutritional and dietary information and guidelines for the general public . . . based on the preponderance of scientific and medical knowledge currently available.”

In other words, these are the government-fat-camp counselors, and they’re here to tell you what to eat.

Gunlock notes the sketchy track record of the federal food police. For years they advocated a food pyramid, introduced in 1992, that was heavy on carbs. But under the guidance of our First Lady, we moved to the “Choose My Plate” program, which emphasized vegetables and fruits, lean proteins, whole grains, and healthy fats. Writes Gunlock:

The new plate was met with much optimism. Celebrity chef Padma Lakshmi gushed that the new plate was a “triumph for the first lady and the rest of us.” Marion Nestle, professor of Nutrition, Food Studies, and Public Health at New York University said, “The new design is a big improvement.” Others suggested the plate would finally knock some sense into us piggy Americans and make us eat better and lose weight.

Of course, reasonable people realize this is ludicrous because what normal person says, “You know, I really need to eat better. I think I’ll go check out the USDA website for diet info.”?

Only Washington bureaucrats could be oblivious enough to miss the utter uselessness of the DGAC. Only they could be unaware that the United States has a thriving, $60 billion diet and exercise industry (not to mention a whole host of independent bloggers) that already provides people with a variety of choices and advice on how to get fit and eat nutritiously. The DGAC members must avoid grocery stores altogether because if they did ever stand in the checkout lane, they’d be bombarded with magazine headlines promising guidance on dieting (along with pictures of bikini-clad hard bodies).

Great point by Gunlock. Liberals have zero faith in the public to make the right choices — and an equal lack of recognition that the free market endlessly urges Americans to shape up. But nothing matters to a liberal unless the government urges/mandates it.

Gunlock watched the live feed of the DGAC event so you don’t have to, and noticed something that caught her attention — and should have yours:

Kate Clancy, billed as a “food systems consultant” (yeah, so am I!) came to the podium and explained that the DGAC must integrate environmental concerns into the guidelines. As her speech went on, I heard phrases like “environmentally friendly food choices” and making “low impact food choices” and looking at things with an “ecological perspective.” Her point was clear: Americans must not only make nutritious food decisions, they must make environmentally responsible food decisions even if that means Americans’ food costs increase. And food prices most definitely will go up if her recommendations are included in the final guidelines.

The liberal elite shops at Whole Foods (despite it being headed by a libertarian), so everyone else should too!

While Clancy doesn’t say we have to swear off meat altogether, she envisions a population that procures protein from local sources, only buying line-caught fish, grass-fed beef, and organic milk. Again, she makes no mention of the added costs associated with this Whole Foods-style food shopping. Which should make us all wonder, do these folks understand that the highest rates of obesity are suffered by those who live under the poverty line? This administration, which portrays itself as looking out for the poor, might want to reconsider making recommendations that will needlessly hike the prices of healthy food for that very demographic.

Sure. That will happen. As soon as kids all across the country stop dumping their First-Lady-approved lunches into the garbage.

When will we be free of the food nannies? Maybe when we all agree to compost the crappy food they demand we eat.

Categories: On the Blog

Downsizing Australia’s Government and Repealing Green Laws

Somewhat Reasonable - May 10, 2014, 12:52 PM

Try to imagine a commission of the U.S. government recommending that it get rid of the Department of Education, the Department of Health and Human Services, countless agencies, and, for good measure, restructure Medicare so it doesn’t go broke. There are few Americans who will argue that our federal government isn’t big enough and many who trace our present problems to Big Government.

That is why what has been occurring in Australia caught my attention because its voters rid themselves of a political party that imposed both a carbon tax and renewable energy tax on them. The purpose of the latter was to fund the building of wind turbines and solar farms to provide electricity.

Taxing carbon emissions—greenhouse gases—said to be heating the Earth has happily died in the U.S. Senate, but in Australia the taxes were a major reason that the Liberal Party (which is actually politically conservative despite its name) took power after a former Prime Minister, Julia Gillard, pushed it and the renewable energy tax through its parliament.

Gillard became the first woman PM after she challenged then-PM Kevin Rudd to lead the Labor Party (which is politically liberal). Like John Kerry, Gillard was against the taxes before she was for them. How liberal is Rudd? In February he was named a senior fellow of Harvard’s John F. Kennedy School of Government. Like Obama, Rudd came out in favor of same-sex marriage when he was the PM.

Bjorn Lomborg, writing in The Australian in late April, noted that both of the taxes “have contributed to household electricity costs rising 110 percent in the past five years, hitting the poor the hardest.” I repeat—110 percent!

It didn’t take Australians long to discover what a disaster taxing carbon emissions was and how useless renewable energy is. In both cases the taxes were based on the notion that “fossil fuels” (coal, oil, and natural gas) are a threat to the environment. Despite an increase in the amount of carbon dioxide in the atmosphere, the Earth has been cooling for the last seventeen years. Mother Nature always has the last word.

As of this writing, the repeal of the two Green laws is in the Parliament’s Senate after having won assent in the lower House. A September 2013 election provided enough new Senate lawmakers to ensure the repeal.

The Commonwealth of Australia is the sixth largest nation by total area. It was claimed by Great Britain in 1770 and New South Wales was used as a penal colony initially. As the general population grew and the continent was explored, five more self-governing crown colonies were established. On January 1, 1901, the six colonies and several territories federated to form the Commonwealth. The population of approximately 23 million is highly urbanized and lives primarily in the eastern states.

Australia is the world’s 12th largest economy, making it one of the wealthiest nations in the world, but the environmentally inspired taxes had a deleterious impact on its economy, particularly the mining of coal and iron. As noted, the cost of electricity skyrocketed.

The present Prime Minister is Anthony John “Tony” Abbott. He has held the office since 2013 and has been the leader of the Liberal Party since 2009. A Member of Parliament, he was first elected in 1994 as the representative of Warringah. He made a lot of news when he protested a proposed Emissions Trade Scheme and forced a leadership ballot that defeated it, becoming in the process the Liberal Party leader and leader of the opposition to Rudd and Gillard’s Labor Party.

As reported in the April 30 edition of the Sydney Morning Herald, Abbott’s Commission of Audit “has recommended massive cuts to the size of government, with whole agencies to be abolished, privatized, or devolved to the states, in what would be the biggest reworking of the federation ever undertaken.”

The Commission, the Herald reported, has 86 recommendations, among which are “calls for the axing of multiple agencies and the surrender of huge swathes of responsibility back to the states in education, health, and other services.”

The Australian reported that Joseph Benedict “Joe” Hockey, Australia’s Treasurer in the Abbott government, said that the proposed budget would axe “the vast number of (environmental) agencies that are involved in doing the same thing.” Hockey is no fan of wind power, saying “If I can be a little indulgent, I drive to Canberra to go to parliament and I must say I find those wind turbines around Lake George to be utterly offensive. I think they are a blight on the landscape.” That kind of candid talk, if he were an American politician, would be considered astonishing.

The best “transformation” America could undergo is not President Obama’s version, but a return to the limits set forth in the U.S. Constitution, a document that reflected the Founders’ distinct distrust of a large central government and its allocation of civic responsibilities to the individual states to the greatest degree possible, and to “the people.”

Australia is way ahead of the U.S. in that regard, learning from the errors of environment laws and the expansion of its government into areas of health and education. We would do well to follow its example.

[Originally published at Warning Signs]

Categories: On the Blog

Free-Market Solutions to Financial Crises

Somewhat Reasonable - May 10, 2014, 12:38 AM

The banking crisis of 2008 and its attendant deep recession have been hailed by statists the world over as the ultimate demonstration of capitalist greed and a justification for more and more regulation and government control of the economy, particularly the financial sector. Their argument boils down to an accusation that private actors in the marketplace are incapable of dealing with systemic crises and that government is the only agent that can address the market as a whole in order to combat panics and economic shocks. That argument won out in the aftermath of the recession, leading to a raft of new regulations, most notably the voluminous Dodd-Frank Act.

But this is far from the correct lesson to take from the financial crisis. Much ink already has been spilled on the real causes of the crisis: government-caused distortions of the home-lending market through Fannie Mae and Freddie Mac, and firms operating with the tacit promise of bail-outs should things go south. These government policies created the systemic problems that devastated the economy. Yet there is still a great deal to say about the actual process of dealing with crises once they have started. Most of “orthodox” economics, both of the left and the right, places central banks at the heart of dealing with crises. In fact, it was a financial crisis in 1907 that prompted the formation of the Federal Reserve. However, the choice to promote the role of government in the economy in 1907, just as in 2008, represented a fundamental misunderstanding of the power of business leaders to resolve crises without any need of government interference.

Indeed, the crisis of 1907 serves as one of the most enduring examples of the private economy’s ability to save itself, even in the face of negative government interference. The Panic of 1907 began in October with a failed attempt to corner the market on the stock of the United Copper Company. The cornering effort had been financed by a number of banks, and when the effort failed, so did the overexposed banks. Other banks financially linked to the early failures swiftly fell, and soon there was a nationwide panic complete with bank runs and emergency closures.

At first glance, the story might sound like a cautionary tale about the excesses of unregulated capitalism, but that judgment is quashed by the incredible response that followed the initial panic.

Out of the panic, a leader emerged to take charge of the situation: John Pierpont Morgan. As the financial crisis seemed about to sink the stock market, and the economy, Morgan began holding meetings with the presidents of the country’s major banks in his library. For weeks, Morgan coordinated rescue efforts for institutions deemed capable of surviving, while measures were taken to untangle and insulate those solvent firms from those that could not be saved.

The result of Morgan’s efforts was not only a calming of the market, but the very survival of the American financial system. With only very limited injections of cash from the U.S. Treasury, the whole economy was preserved by the coordinated efforts of the private sector.

Morgan nearly failed in his efforts, thanks to the untimely prying of the government at the final stage of the rescue. One of the last steps taken to shore up the financial sector relied on the saving of the Tennessee Coal, Iron and Railroad Company (TC&I), the stock of which was being used as collateral by a major brokerage house. If the broker went belly-up, the panic could begin anew. Morgan’s solution was simple: merge TC&I with his own U.S. Steel. Doing so would save TC&I, which would save the brokerage firm, which would save the economy from further panic. To those involved, it seemed like a simple, non-invasive solution.

Then the government intervened. Theodore Roosevelt’s administration was adamantly opposed to anything that smelled of monopoly, and it nearly blocked the merger under the Sherman Antitrust Act. Fortunately, the president eventually listened to his advisors and allowed the merger to go through. Had that not happened, the United States could well have been plunged into a deep depression, the financial system in ruins.

It’s true that Morgan had a personal interest in staving off a financial panic that would inevitably affect his holdings, and he would soon become a key figure in the development of the Federal Reserve, which became increasingly intrusive and unaccountable in administering its vast, government-mandated power over the economy. But if there is one lesson to take from 1907, it is that not only are governments a chief cause of financial crises, they are also a frequent impediment to their resolution.

 Perhaps we should look to the example of Morgan and his private-sector compatriots who ended the panic and concurrent recession within a year, as opposed to the meddling Federal Reserve, which helped make the Depression great and has lately presided over the worst economic “recovery” ever.

[Originally published at the American Thinker]

Categories: On the Blog

John Coleman on WLS-AM Rips Obama’s Climate Report

Somewhat Reasonable - May 09, 2014, 9:07 PM

John Coleman, the 1983 American Meteorological Society’s Broadcast Meteorologist of the Year, is a living legend. Recently retired from full-time forecasting on KUSI-TV in San Diego, Coleman helped set the national standard on television while at WLS-TV in Chicago. He went on to be the first weatherman on ABC’s Good Morning America, and founded The Weather Channel.

John was on the Bruce Wolf & Dan Proft Show this week on WLS-AM to talk about the Obama administration’s latest Climate Assessment Report. John was not impressed, and laments that science from the government has become so politicized. “The theory of man-caused global warming is dead wrong,” Coleman said.

From the lake to the boonies … enjoy the soothing baritone and insightful words of John Coleman in the player above. Stay to the end so you can hear John sing, and be sure to also follow John on Twitter and Facebook.

You can see John in person at the 9th International Conference on Climate Change in Las Vegas July 7 – 9. Join him and Heartland there to learn what’s really happening to the climate.

Categories: On the Blog

Rushing to Regulate: Democrat Rosenworcel is Right – So She Should Vote ‘No’ on Net Neutrality

Somewhat Reasonable - May 09, 2014, 11:06 AM

Federal Communications Commission (FCC) Democrat Commissioner Jessica Rosenworcel yesterday made a very good point.

Rosenworcel: Delay Vote on Net Neutrality Rules

Democratic FCC Commissioner Jessica Rosenworcel has asked FCC Chairman Tom Wheeler to delay his planned May 15 vote on a draft of new network neutrality rules by at least a month….

“His proposal has unleashed a torrent of public response. Tens of thousands of e-mails, hundreds of calls, commentary all across the Internet….

“We need to respect that input and we need time for that input.  So while I recognize the urgency to move ahead and develop rules with dispatch, I think the greater urgency comes in giving the American public opportunity to speak right now, before we head down this road.

“I believe that rushing headlong into a rulemaking next week fails to respect the public response to his proposal.

She pointed out that the seven-day quiet period before the vote begins May 8. “That means we no longer accept public comment. I think it’s a mistake to cut off public debate right now as we head into consideration of the Chairman’s proposal. So again, at a minimum, we should delay the onset of our Sunshine rules.”

So as of today, the FCC stops listening to what we have to say.  And Commissioner Rosenworcel thankfully wants to continue listening.

Chairman Wheeler, sadly, remains impervious.

 An FCC source speaking on background said the vote would go on as planned….

Commissioner Rosenworcel’s impression and instincts are exactly right.  And there’s a way she can get the appropriate delay – by voting “No” next Thursday.

Her Nay – combined with the likely Nays of the two Republican Commissioners – would make a majority of three and stave off Net Neutrality’s imposition.

Would that mean Net Neutrality is dead and gone?  Of course not – its proponents are relentless.

Government-Imposed Net Neutrality: Twice-Bitten, Not Shy About a Third Try

It’s Groundhog Day – Again: Government Taking Third Stab at Net Neutrality Power Grab

Why Can’t the Government Take No for an Answer?

A Key Ingredient in the Left’s Wins: Persistence

What it would mean is We the People would have more time to weigh in on this huge government infliction on 1/6th of our nation’s economy.

Which is just what Commissioner Rosenworcel rightly wants.

Her No vote would give us that.

[Originally published at RedState]

Categories: On the Blog

VIDEO: How to Modernize Communications

Somewhat Reasonable - May 09, 2014, 10:54 AM

NetCompetition, an organization dedicated to improving competitiveness in the internet market, held a panel discussion and debate on April 4th on the topic “Thinking and Starting Anew: Modernizing Communications Law for American Consumers.” Scott Cleland, Heartland Institute policy advisor and president of the Precursor consultancy firm, was the first of the five guests to speak.

Cleland explained how regulations in the communications market have become increasingly obsolete due to the speed of innovation in the sector and how, rather than replacing the old regulations, new rules are simply layered on top of the old. Cleland likened these layers of regulation to an archeological site; the historical layers lose stability over time as external forces work their way into them.

Cleland asked the question, “How do you take something that obsolete and that dysfunctional, how do you make it work?” The answer is simple and elegant: “You want to build it on a lasting foundation of enduring American values.” These values, according to Cleland, are competition, consumer protection, universal connectivity, and public safety. By enacting a new regulatory regime that allows for innovation in the ever-shifting marketplace, the communications and technology industries will be able to grow and thrive.

Most of the panelists agreed in large part with Cleland’s analysis. Gene Kimmelman of Public Knowledge even acknowledged the detrimental effects of extremely complex regulation. Kimmelman proposed that a bipartisan group be established to reach agreement on what the “shared values and principles” are.

Hal Singer of the centrist Progressive Policy Institute likewise condemned the current regulatory regime as one that stifles innovation and beneficial competition, saying that “many of the existing regulations are no longer justified.”

The only panelist rigorously opposed to Cleland’s ideals was Mark Cooper, the research director of the Consumer Federation of America. Cooper contended that, “The layers we’ve added, that Scott laments, are, in fact, progress. And every time we’ve had a major revolution in America, in terms of the economy, we’ve expanded the rights of individuals, because that’s what progressive societies do.” Cooper also argued that the market would not solve some problems on its own and that some subsidies in the market would be essential.

The American Enterprise Institute’s Jeff Eisenbach struck back at this criticism, explaining that Cooper’s argument amounted to an upholding of “the tyranny of the status quo.” Eisenbach added a new layer of criticism of that status quo, arguing that a principle of zero-base regulatory budgeting would solve many of the problems of obsolete and overly-constricting regulations. In zero-base budgeting, the regulator must ask one question: “If we weren’t doing it, would we start?”

Eisenbach contends that this question could transform the regulatory regime: “If you asked that question about the vast majority of what the Federal Communications Commission does today, the answer is no.”

The arguments presented by the panelists show a growing movement in favor of cutting out and streamlining the grotesque regulatory regime of the communications sector. Communications technology is one of America’s most cutting-edge industries, but that can only remain the case if onerous regulations do not stifle it.

Categories: On the Blog

Net Neutrality III is Huge Government Heinous

Somewhat Reasonable - May 08, 2014, 11:45 AM

The Barack Obama Administration is back at it — yet another big government power grab is in the works.

The Administration’s Federal Communications Commission (FCC) is again resurrecting Network Neutrality — an all-encompassing Internet usurpation twice unanimously killed in court as an illegal overreach.

Court Backs Comcast Over FCC on ‘Net Neutrality’

Verizon Wins Net Neutrality Court Ruling Against FCC

Net Neutrality III does in fact acknowledge and allow for a basic economic precept: If you use more, you pay more.

It will allow Internet Service Providers (ISPs) to charge bandwidth hogs like Netflix and Google (owner of YouTube) for the Web-exploding bandwidth they use:

Netflix and YouTube Make Up Majority of US Internet Traffic

Netflix Now The Largest Single Source of Internet Traffic In North America 

Video Viewing on Netflix Accounts for Up to 30 Percent of Online Traffic

Right now,  all of us — including non-Netflix and non-YouTube users — subsidize their massive profit-models. Changing this, of course, upsets them.

Google And Netflix Are Considering An All-Out PR Blitz Against The FCC’s Net Neutrality Plan

Shocker.

This nod to economic reality also has the left rending garments and gnashing teeth in overwrought, overdramatic fashion.

Barack Macbeth’s ‘Murder’ of Net Neutrality 

So This is How Net Neutrality Dies, Under a Democratic President

Thanks to the FCC Net Neutrality is Dead 

Stop the FCC from Breaking the Internet

If these leftist bad actors had their way, the government would mandate that gas stations charge the same price for empty Escalades and Escorts.

Their reaction is part knee-jerk response to anything less than total government command-and-control, and part political theater.  Their screeching — combined with our reasonable objection to this third attempt at massive government overreach — allows FCC Chairman Tom Wheeler to say, See — both sides are angry with me.  My proposal must be reasonable.

Hardly.  The Chairman has circulated amongst the Commission his Net Neutrality Notice of Proposed Rule Making (NPRM) – the first step in a process that so often ends in really bad policy.  I spoke — on condition of anonymity — with someone inside the Beast.  And the first draft ain’t good.

A caveat: This is a first draft.  The final version, once the four Commissioners weigh in, may end up looking dramatically different.  But this is Chairman Wheeler left to his own devices — and it ain’t good.

The proposal possesses two over-arching characteristics.

  1. A preemptive Mother-May-I approach to Internet innovation. Anytime the marketplace develops a new way of doing just about anything, the innovators must first check with the government to see if they can implement it. Fairly command-and-control, is it not?  Not exactly a great way to run a constantly evolving, endlessly-faceted World Wide Web.
  2. Nebulousness about exactly how far the government’s regulatory reach extends. Just what nook or cranny of the private Web — if any — lies beyond the Leviathan’s tentacles?

The order’s lynchpin is how the government will now define “high speed broadband.”  It appears to mandate that everyone must be able to simultaneously download multiple movies while watching Game of Thrones and playing Call of Duty with everyone from their graduating class.

And if that ridiculously huge bandwidth demand slows you down ever so slightly — the government won’t consider it “high speed.”

See, the actual law — which the FCC is ignoring by imposing Net Neutrality — allows the government to stick its enormous proboscis even further into the Web if there is “market failure.”  So the government will absurdly define market “success” — and then claim it’s failing.  Then start ratcheting up the regulations.

Remember Obamacare’s nutritional information disclosure requirements? Obamacare Requires 34 Million Pizza Nutrition Signs 

The new Net Neutrality order dramatically ramps up the disclosure requirements for ISPs.  How?  In what forms and fashions?  Again, it’s nebulous — and open-ended.  If Obamacare’s multi-million menu amendments are anything like a precedent, it isn’t good.

What are the punishments for violating these absurd new regulations? Again — nebulous. And, again — the government’s omni-directional precedents aren’t good.

Uninsured Next Year? Here’s Your Obamacare Penalty 

Small Businesses Could Be Hit With Huge Obamacare Fines

Obamacare Springs New, Expensive Contractor Misclassification Penalty

Obamacare to Hit Smokers with Huge Penalties

Obamacare Work Disincentives Will Add $200 Billion To Cost 

Think this latest version of Web-sector-choking Net Neutrality won’t provide similarly costly disincentives?

That ain’t nebulous.


[Originally published at Daily Caller]

Categories: On the Blog

The Inequality Trap Distracts from the Real Issue of Freedom

Somewhat Reasonable - May 08, 2014, 11:43 AM

A new book by French economist Thomas Piketty, “Capital in the Twenty-First Century,” has recently caused a major stir on the opinion pages of newspapers and magazines. Piketty has resurrected from the ash heap of history Karl Marx’s claim that capitalism inescapably leads to a worsening unequal distribution of wealth, with dangerous consequences for human society.

Not that Professor Piketty is a Marxist in the traditional sense of believing that mankind follows a preordained trajectory through history that inevitably leads to a worldly utopia called socialism or communism. To the contrary, he believes that capitalism is a wondrously efficient economic system that produces more and better goods and services.

Seeing Income Inequality as a Social Danger

What morally irks him is the inequality of income and wealth that he believes the capitalist economy results in, and may continue to make worse looking to the years ahead. He admits that for a long time in the twentieth century income inequality had considerably diminished between “the rich” and the rest of society. The middle class grew with more people moving out of poverty in the Western nations of Europe and America, and middle class incomes were rising at a more rapid rate than upper income levels were increasing.

But he believes that an array of statistical data strongly suggests that this has been changing over the last several decades. Middle class incomes are now rising much more slowly relative to the increases in the higher income brackets. Piketty considers that this is likely to continue into the foreseeable future with the wealth of the society more and more concentrated in fewer and fewer hands at the expense of the middle and low income groups in society, as Karl Marx prophesied.

It is not surprising, therefore, that Piketty entitled his book, “Capital in the Twenty-First Century,” because Karl Marx’s major work in the nineteenth century was called “Capital,” and Piketty is offering what some are calling a twenty-first century update of Marx’s prediction that capitalism would make a few in society richer and richer as the many became relatively less well-off in comparison to them.

He even does part of his calculations to justify this assessment with an implicit variation on Marx’s labor theory of value. He argues that it is fairly easy to determine the productivity and therefore worth of a common laborer such as an assembly line worker or a server in a fast-food establishment. So much additional labor time and effort “in,” and so much additional valued physical production “out.”

But he argues that there is no equivalent way to objectively measure and justify the stratospheric salaries and bonuses of bankers, financiers, and giant corporate CEOs or high-placed executive managers. Since he thinks there is no way to measure the productivity of such multi-millionaire earners in society, then it must necessarily follow that their salaries and wealth positions in society can in no way be justified.

Hence, the incomes of such rich members of society should be considered as arbitrary and unjust. Since it cannot be demonstrated that they earned it through a measurable contribution to the “social” processes of production, it must be viewed as unfair that they be allowed to keep it.

Taxing Away Wealth as an End in Itself

His economic policy conclusions necessarily follow from his diagnosis of the “social malady” of income inequality. Incomes above $500,000 or a million dollars should be taxed at 80 percent, and those incomes between $200,000 and $500,000 should be taxed at 50-60 percent. In addition, he proposes an annual wealth tax of 10 percent precisely to prevent any further concentrated wealth accumulations looking to the future.

Clearly, once such a tax regime was imposed not much extra revenue would come from it, because after the first few years there would no longer be many in those higher income brackets; and hardly anyone any longer would have any motive to try to reach that income level since they would know it was going to be all taxed away if they were to succeed in acquiring it.

But that’s the point for Piketty. It is not that he hopes to create a perpetual redistribution machine with million-dollar and multi-million dollar incomes earned each year just waiting to be taxed away and transferred by government to presumably more deserving lower-income members of society. Its purpose, instead, is to prevent there ever being such high incomes.

For instance, if one man is making one million dollars, while 100 men are each making, say, around $40,000 a year, the millionaire’s income is 25 times that of the middle class income earners. However, if any million-dollar income earner had 80 percent of what he earned taxed away, then his after-tax income of $200,000 only would be five times as great as the middle-income earner. Hence, the income inequality gap will have been greatly reduced.

One senses that it is not so much that Piketty wishes to raise the lower and middle-income earners up (though, of course, he would like to see that happen), as he wants to pull down those whom he considers to be receiving far more than he thinks can be fairly or objectively justified.

So it is not so much to take from Peter to give to Paul, as it is to pull Peter down closer to Paul’s level, and prevent Peter from ever again rising significantly above Paul’s average economic earnings. It is a psychology of envy and an ideology of egalitarianism vengefully at work.

I would like to suggest that there are some fundamental misconceptions and errors in Thomas Piketty’s understanding of the nature and workings of a free market capitalist system.

Income Earned is Not an Arbitrary Process

To begin with, he falls into the all-too-frequent conceptual mistake of thinking that the “production” aspect of the market process is independent of, or at least separable from, the “distribution” of what is produced.

This is an old mistake that goes at least as far back as the mid-nineteenth century British economist, John Stuart Mill. In his 1848 “Principles of Political Economy,” Mill argued that the “laws” of production are more or less as fixed and unchangeable as the laws of nature, but the “laws” of distribution were a matter of the cultural and ethical values of a society at any moment in time.

This view conceives of total output as a large quantity of “stuff” produced by “society,” which then can be ladled out of the community production pot in any manner that “society” considers “good” and “right” and “deserving” to each of the members of that society.

Now, in fairness, neither John Stuart Mill nor Thomas Piketty denied that the level and amount of taxes affect people’s willingness to work and produce all that “stuff.” But Piketty certainly believes that since the multi-million dollar incomes earned by “the few” are not in any way rationally or objectively tied to an individual’s actual measurable productivity, a lot of it can be taxed away without any significant reduction in those people’s willingness, interest, or ability to go about their work.

Firstly, it must be remembered that “society” does not produce “stuff,” for the simple reason that there is no entity or thinking and acting “being” called “society.”

Everything that is produced is done so by individuals either working on their own or as the result of associative collaboration with others, as is more commonly the case in a complex market system of division of labor.

Entrepreneurial Profit and Executive Pay are Not Irrational

Secondly, what is produced does not just happen and the amount of production does not just, somehow, automatically grow year-after-year. A good portion of that seemingly immeasurable value that is reflected in those higher incomes that bothers Piketty so much is the result of entrepreneurial creativity, innovation and a capacity to better “read the tea-leaves” in anticipating the future directions and forms of consumer demands in the marketplace.

What to produce, how to produce, and where and when to produce are creative acts of a human mind in a world of uncertain and continuous change. As the Austrian economist, Ludwig von Mises, once expressed it, “An enterprise without entrepreneurial spirit and creativity, however, is nothing more than a pile of rubbish and old iron.”

The market gives a clear and “objective” measure of what the achievements of an entrepreneur are worth: Did he succeed in earning a profit or did he suffer a loss? In an open, free competitive market, an entrepreneur must successfully demonstrate this capability each and every day, and better than his rivals who are also attempting to gain a part of the consumer’s business. If he fails to do so the profits he may have earned yesterday become losses suffered tomorrow.

Not every high-income earner, of course, is an entrepreneur in this sense. The market also rewards people who have the skills, abilities and talents to perform various tasks that enterprises need and consumers desire.

Again, in an open and competitive free market, no one is paid more than some employer or consumer thinks their services are worth. The executive manager, whether in the manufacturing or financial sectors, for instance, is only offered the salary he receives because others value his services enough to outbid some competing employer also desiring to hire him for a job to be done.

The Tax Code Distorts the Workings of the Market

Piketty’s proposed penalties on salaries through confiscatory or near-confiscatory taxes can only be viewed as a form of maximum wage control. That is, a ceiling above which the incentive for people to seek out higher-paying salaries is greatly diminished due to the 80 percent tax rate on anything earned above the Piketty-specified income thresholds.

But how shall these individuals’ valuable and scarce human labor skills and abilities be allocated among their alternative executive employments in the business world if the government uses its tax code to distort the market price system for workers?

It will threaten, over time, to result in mismatches between where the market thinks people should most effectively be employed to help properly guide enterprise activities and the actual places where some of these rare talents actually find themselves working. As with any mismatch between supply and demand, consumer demands and production efficiencies will be less satisfactorily fulfilled.

If such a confiscatory range of tax rates were to be employed, it should not be surprising to see the people in the market attempting to get around the government’s interference with wage determination via the tax code.

In the 1960s and 1970s, Great Britain was considered to be the “sick man of Europe” due to its anemic economic performance. One of the reasons was the extremely high tax rates on corporate and executive salaries that resulted in a British “brain drain,” as some of the “best and the brightest” in business moved to the United States and a few other places where the tax penalty for their success was noticeably less than in their home country.

To get around the implicit “wage ceiling” under the tax code, British companies kept or attracted the executive talent they needed through “in-kind” additions to their money salaries. Companies would provide valued executives with high-end luxury automobiles for their own use, or give them company paid-for luxury apartments in the choice areas of London, or allow them very liberal expense accounts on which many personal purchases could be made under the rationale that they were in some loose way business-related. These companies would then write off all these expenses from their corporate income taxes.

The Price System and a Rational Use of People’s Skills

The market economy has a steering mechanism: the price system. Through prices consumers inform businesses what goods and services they might be interested in buying and the value they would place on getting them. The prices for what the economist loosely calls “the factors of production” – land, labor, capital, resources and raw materials – inform those same businesses what other enterprises think those factors of production are worth in producing and bringing to market the various alternative goods the consuming public wants.

The prices offered and paid for the ordinary assembly line workers or the senior corporate managers are not arbitrary or “irrational.” Given their respective skills, abilities and talents to perform tasks, they are paid what others think they are worth to do the job. If it turns out that this was a misestimate, those salaried workers in the “higher” or “lower” position will either be let go or have their wage revised down to what is now considered to be their more reasonable worth.

Whether it be formal price and wage controls that freeze the price system into what politicians and bureaucrats think people should be paid, or a tax code that discriminates against and penalizes threshold levels of success as well as the everyday incentives and ability to work, save, and invest, such policies slowly grind the market economy down, if not to a halt, then into directions and patterns of supply and demand significantly different from what a truly free enterprise could and would generate.

In a recent interview Thomas Piketty said that he had no formula to determine how much is too much in terms of “socially necessary” inequality. But he considered a variety of multi-million and multi-billion dollar wealth positions to be socially unnecessary and harmful.

Viewing the Individual as Owned and Used by the Collective

Notice his phrasing and the mindset that it implies. The society, the community, the tribe is presumed to own the individual and collectively decide how much the material “carrot” should be in terms of unequal incomes that may be “necessary” to induce people to be productive or innovative “enough.”

Enough for what? To make the communal economic pie grow at a rate that he and others like himself may think is the “optimal” amount – not “too much” and not “too little” – from out of which the government will then decide what “share” each will receive.

On the basis of what standard all of this will be determined, he clearly admits he has no clue, other than what he may subjectively think is “right” or “just” or “fair.” Like the difficulty of offering an objective definition of pornography, he just knows it when he sees it.

We have already seen to where this leads in modern democratic society. It is a mutual plunder land of individuals and special interest groups rationalizing every tax, redistribution, and regulation to serve their own purposes at the expense of others who are forced to directly or indirectly foot the bill through the use of government power.

A Society of Peaceful Production or Political Plunder

This gets us, perhaps, to an understanding of where unjust and unfair inequality can arise. It is precisely when the political system is used to rig the game so as to manipulate the outcomes inside and outside of the market. It suggests the basis for conceiving of a non-Marxist notion of potential group or “class” conflict in society.

A French classical liberal named Charles Dunoyer (1786-1862) explained this long ago in the early nineteenth century. In an article published in 1817, Dunoyer distinguished between two groups in society. One of them he called the “Nation of Industrious Peoples”; the other comprises those who wish to use government to live at the expense of peaceful and productive others.

“The Nation of Industrious Peoples,” Dunoyer said, is made up of “farmers, merchants, manufacturers, and scholars, the industrious people of all classes and all nations. In the other, there are the major portions of all the old and new aristocracies of Europe, office holders and professional soldiers, the ambitious do-nothings of all ranks and all nations who demand to be enriched and advanced at the expense of those who labor.”

“The aim of the first,” Dunoyer went on to say, “is to extirpate from Europe the three scourges of war, despotism and monopoly, to ensure that men of every nation may freely exercise their labors, and, finally, to establish forms of government most able to guarantee these advantages at the least cost. The unique object of the second is to exercise power, to exercise it with greatest possible safety and profit, and, thus, to maintain war, despotism and monopoly.”

In other words, Dunoyer was saying that society is composed of one set of people who diligently work and conscientiously save and who honestly produce and peacefully trade; and there is another set of people who wish to acquire and consume what others have saved and produced. The latter group acquires the wealth produced by those others through political means – taxation, regulation, and government-bestowed privileges that interfere with the competitive free market. This source of injustice and exploitation is the same in every country.

The Plunder Land of Modern Democratic Politics

In our own times, those who want “to be enriched and advanced at the expense of those who labor” are, of course, the welfare statists, the economic interventionists, and the proponents and supporters of every other form of collectivism.

They are the crony-capitalists who use their influence with political power to obtain subsidies, regulations limiting competition, and bailouts and profit-guarantees at the expense of the taxpayers, their potential rivals who are locked out of markets, and the consumers who end up paying more and having fewer choices than if the market was free and competitive.

They are the swarm of locust-like lobbyists who lucratively exist for only one purpose: to gain for the special interest clients who handsomely pay them large portions of the wealth and income of those taxpayers and producers whose peaceful and productive efforts are the only source of the privileges and favors the plunderers wish to obtain.

They are the political class of career politicians and entrenched bureaucrats who have incomes, wealth and positions simply due to their control of the levers of government power; power that gives them control over the success or failure, the life and death, of every honest, hardworking, peaceful, and productive worker, businessman, and citizen, all of whom are squeezed to feed the financial trough at which the political plunderers gorge themselves.

Regulated markets help preserve the wealth of the politically connected and hinder the potentially productive and innovative from rising up and out of a lower income or even poverty. Wealth and position may not be completely frozen, but they are rigidified to the extent to which they are politically secured and protected from open competition.

Welfare dependency locks people into a social status of living off what the government redistributes to them from the income and wealth of others, and makes escape from this modern-day pauperism difficult and costly. An underclass of intergenerational poverty is created that reduces upward mobility and makes improvement difficult for those caught in this trap; at the same time, it serves the interests of those in political power who justify their position and role as the needed caretakers of those whose dependency they live off.

Most of Human History Based on Plundered Inequality

For most of human history, those with certain inherited endowments and developed skills used their superior physical strength and mental agility to conquer, kill, plunder and enslave their fellow men. Whatever meager wealth may have been produced during those thousands of years ended up concentrated in the hands of the few through coercion and political cunning.

There was little justice in a world in which the “strong” stole from and controlled the “weak.” But slowly men began to revolt against this “unnatural” inequality under which what some had produced was forcefully taken by a handful of powerful others.

Against this imposed system of political and wealth inequality slowly developed a counter conception of a just and good society. Its hallmark was a call for a new vision of man and society based on the alternative notion of individual rights equally respected and enforced for all, rather than privileges and favors for some at the expense of the rest.

A New Society of Equal Individual Rights Before the Law

In a political system under which individuals had the same equal rights to their life, liberty, and honestly acquired property, each person could then rise to his own level based on his natural abilities and those he had acquired through experience and determination.

The outcomes and positions that individuals reached would inevitably and inescapably be different – unequal in terms of material and social achievement. But if some men earned and accumulated more wealth than others, its bases would be peaceful production and free trade.

In such a world of freedom and rights, some men’s skills and abilities would give them materially more successful positions not through plunder and privilege, but as the result of creatively producing and offering for sale in the marketplace what others may wish to buy, as the method through which each non-violently pursues his own self-interest.

If it were possible to give any reasonable meaning to the ambiguous and manipulated phrase, “social justice,” it would be:

Are the material differences among members of a society the result of the peaceful and voluntary associations and trades among individuals on an open and competitive free market? Or are the material inequalities coming from the use of political power to coercively obtain by taxation, regulation and forced redistribution what the recipients had not been able to obtain through mutually beneficial agreement with their fellow men?

The Appropriate Question is: Wealth from Production or Politics?

The only important and relevant ethical and political issue in a free society should be: How has the individual earned and accumulated his material wealth? Has he done so through peaceful production and exchange or through government-assisted plunder and privilege?

Rather than asking the source or origin of that accumulated wealth – production or plunder – the egalitarians like Thomas Piketty merely see that some have more wealth than others and condemn such an “unequal distribution” in itself.

By doing so, they punish through government taxation and wealth confiscation the “innocent” as well as the “guilty.” This, surely, represents an especially perverse inequality of treatment among the citizenry of the country, especially since those who have obtained their ill-gotten gains through the political process usually know how to work their way through the labyrinth of the tax code and the regulatory procedures to see that they keep what they have unethically acquired.

The modern egalitarians like Thomas Piketty are locked into a pre-capitalist mindset when, indeed, accumulated wealth was most often the product of theft, murder, and deception. They, and the socialists who came before them, seemingly find it impossible to understand that classical liberalism and free market capitalism frees production and wealth from political power.

And that any income and wealth inequalities in a truly free market society are the inequalities that inescapably emerge from the natural diversities among human beings, and their different capacities in serving the ends of others in the peaceful competitive process as the means to improve their own individual circumstances.

[Originally published at epictimes.com]

Categories: On the Blog

Climate Change Is a Clear and Present Danger, Says Landmark US Report

Somewhat Reasonable - May 08, 2014, 12:20 AM

This is the title of an article in the May 5 Internet edition of The Guardian written by Suzanne Goldenberg, US environment correspondent.  The article is about the release May 6 at the White House of the National Climate Assessment Report (NCA) with a great deal of fanfare.

The article states “Climate change has moved from distant threat to present-day danger and no American will be left unscathed, according to a landmark report due to be unveiled on Tuesday.  The National Climate Assessment, a 1,300-page report compiled by 300 leading scientists and experts, is meant to be the definitive account of the effects of climate change on the US.”

The article further states “Gary Yohe, an economist at Wesleyan University and vice-chair of the NCA advisory committee, said the US report would be unequivocal that the effects of climate change were occurring in real-time and were evident in every region of the country.  ‘One major take-home message is that just about every place in the country has observed that the climate has changed,’ he told the Guardian. ‘It is here and happening, and we are not cherrypicking or fearmongering.’

“The draft report notes that average temperature in the US has increased by about 1.5F (0.8C) since 1895, with more than 80% of that rise since 1980. The last decade was the hottest on record in the US.  Temperatures are projected to rise another 2F over the next few decades, the report says. In northern latitudes such as Alaska, temperatures are rising even faster.”

It takes a very astute observer to note that climate change has been happening in the United States over the past hundred years; or, for that matter, over the planet’s 4-billion-year existence.  The country is blessed to have such people working on the NCA.  Surely these individuals will state that climate change is the normal state of affairs for the nation.

The National Weather Service and the National Oceanic and Atmospheric Administration (NOAA) have been collecting data since the late 19th century on all types of weather events: temperatures, rainfall, drought, snowfall, wildfires, sea level rise, tornadoes, hurricanes, etc.  The data show little change in event occurrence over the period of observation.  If anything, some events have occurred less frequently in the past twenty years, when atmospheric carbon dioxide has been increasing at its highest rate.

The graph that follows shows the monthly average of all daily high and low temperatures at all NOAA U.S. Historical Climatology Network (USHCN) stations.


It is hard to see a continuous rise in U.S. temperatures from 1895 to 2013 in these data.  The planet has been in a global warming cycle, called the Current Warming Period, since about 1850, so some warming over this roughly 160-year period is to be expected.   This warming cannot simply be attributed to increases in atmospheric carbon dioxide from burning fossil fuels.  Does the NCA report the pause in global warming since 1998?

Based on The Guardian article, it appears the NCA is another report similar to the United Nations’ latest Intergovernmental Panel on Climate Change report, the Fifth Assessment, released in installments over the past eight months.  To counteract omissions, half-truths, and false statements in these reports, the Nongovernmental International Panel on Climate Change (NIPCC) was formed in 2003.  Since 2009, the NIPCC has released six reports that give authoritative, easily read information about the vast amount of experimental data showing the negligible influence of carbon dioxide from burning fossil fuels on climate, the financial losses from mitigation, and the proper role of adapting to climate change.

If the material in the NCA contains the information cited in The Guardian, my only comment is a quote from attorney Joseph Welch protesting Senator Joseph McCarthy’s actions on June 9, 1954: “Have you no sense of decency?”  After Mr. Welch’s statement, Senator McCarthy’s credibility was ruined, and he died a lonely man three years later.

Let us hope the NCA will reveal the illogical reasoning behind calls to stop using the nation’s abundant, economical fossil fuel resources of coal, oil, and natural gas.  The attempts so far are among the reasons for the economic malaise besetting the country the past six years.  This agony must come to a halt, and perhaps the illogic of the NCA will awaken the public to the mass of false reasoning presented over the past 25 years.
Categories: On the Blog