Somewhat Reasonable

The Policy and Commentary Blog of The Heartland Institute

The Government Assaults ‘Big Dogs’ – To Advantage the Biggest Dog of Them All

November 25, 2014, 9:24 AM
One of the largest myths going is that government helps the Little Guy.

On its face this is patently absurd.  More government – taxes and/or regulations – raises the costs of everything for everyone.  The Big Guys are far better equipped to absorb the punishment – while the Little Guys are pummeled out of existence.

Then there’s the Crony Socialism – it’s not Crony Capitalism, because it has very little to do with capitalism.  Wherein Big Guys – who have the wherewithal – bend government policy to their will.  To their advantage – and against that of the Little Guys seeking to compete with them.  For instance:

Green Scam: 80% of Green Energy Loans Went to (President Barack) Obama Donors

Crony Socialists Looking to Ban Online Gambling Don’t Seem to Realize It’s a WORLD WIDE Web

Obama Donor’s Firm Hired to Fix Health Care Web Mess It Created

Obama Crony Wins Contract to Give Phones to Jobless

Obama’s United Auto Workers Bailout

Which brings us to the ridiculous Network Neutrality political rhetoric being extruded by the Obama Administration.

President Obama his own self recently said this:

“(N)et neutrality”…says that an entrepreneur’s fledgling company should have the same chance to succeed as established corporations….

Then there’s Tom Wheeler, the Chairman of the President’s allegedly politics-free, independent Federal Communications Commission (FCC).

FCC Chief on Net Neutrality: ‘The Big Dogs Are Going to Sue, Regardless’

First – why are these lawsuits inevitable?  Because the FCC has already twice unilaterally imposed Net Neutrality – and twice the D.C. Circuit Court has unanimously overturned the orders as outside the bounds of its authority.

Rather than complaining about additional suits to again fend off the Leviathan – perhaps the Leviathan should pull in its tentacles.  Especially when it has already had two lopped off by Courts.  As Jonah Goldberg has said: Don’t just do something – stand there.

But wait a minute – which “Big Dogs” does Wheeler mean?  The Internet Service Providers (ISPs) the government intends to yet again assault.

To be sure, Verizon, Comcast, AT&T, et al. are big companies.

Verizon: ~ $207 billion.

Comcast: ~ $140 billion.

AT&T: ~ $183 billion.

But they aren’t looking for Crony Socialist favors from government – merely protection from its monumental overreaches.

Then there’s this plucky little upstart for whom the Obama Administration is fighting.

Google: ~ $370 billion.

Get that?  Google is bigger than Verizon and Comcast – combined.

Google has spent the last decade-plus shoving Net Neutrality down our throats.

Google…Support(s) Net Neutrality, Call(s) For Extension To Mobile Providers

Google has uber-generously funded pro-Net Neutrality Leftist efforts.  It twice helped President Obama get elected.  Google CEO Eric Schmidt was one of the first Obama Administration “adviser” hires.

The relationship really is that syrupy:

Obama & Google – A Love Story

So this isn’t a galloping shock:

Who Wins the Net Neutrality Debate? Google, of Course

No matter how the FCC rules next year, Google can move forward with fiber rollouts, even if they’re restricted, because it will still be earning far-healthier revenues from carrying content.

Google’s two-pronged strategy has been obvious for a long time, but lately it has looked genius given the net neutrality battle….

(I)t’s a strategy only a very large company could undertake….

Get that?  Google is more than Big Guy enough to absorb the government hit – the Little Guys looking to compete with them aren’t.

“It’s a strategy only a very large company could undertake” – using government to make the marketplace untenable for anyone but themselves.

Creating for Google a for-all-intents-and-purposes government-mandated monopoly.

The very thing the Obama Administration – with its gi-normous Internet overreach – alleges it is attempting to address/prevent.

To paraphrase George Orwell: All monopolies are equal – but some are more equal than others.

To paraphrase Franklin Delano Roosevelt: Google will be a son-of-a-bitch monopoly – but it’ll be our son-of-a-bitch monopoly.

“Don’t be evil.”  Enjoy the Crony Socialism, All.

[Originally published at Human Events]


Categories: On the Blog

Gates and Pearson Partner to Reap Tens of Millions from Common Core

November 25, 2014, 1:05 AM

Follow the money. It all ends up in the hands of a very few. The Pearson Foundation is getting the contracts because of its partnership with the Gates Foundation. Greed, secrecy, deceptions, and lies … and to think Democrats accuse Republicans of these very things, while Democrats are the ones using government to get richer. The deceptions run very deep. It’s time for exposure.

The saga begins on a summer day in 2008, when Gene Wilhoit, director of a national group of state school chiefs, and David Coleman (known as the architect of Common Core), knowing they needed tens of millions of dollars and a champion to overcome the politics that had thwarted previous attempts to institute national standards, approached Bill Gates at his headquarters near Seattle to convince Gates and his wife to sign on to their idea.  Gates, upon asking whether states were serious about common educational standards, was assured that they were. Gates signed on, and the remarkable shift in education policy known as Common Core was born.

The Gates Foundation has spent over $170 million to manipulate the U.S. Department of Education into imposing the CCSS, knowing it would realize a return on this investment as school districts and parents rush to buy the technology products they’ve been convinced are vital to improving education.  Bill Gates’ Microsoft will make a fortune from the sale of new technology products.  According to the Gates Foundation, CCSS is a “step to greater excellence in education.”

On April 27, 2011, the Gates Foundation joined forces with the Pearson Foundation, the nonprofit arm of Pearson, a British multinational conglomerate that is the largest private business maneuvering for U.S. education dollars. Pearson executives saw the potential to secure lucrative contracts in testing, textbooks, and software worth tens of millions of dollars.

Its partnership with the Gates Foundation was to support America’s teachers by creating a full series of digital instructional resources. Online courses in Math and Reading/English Language Arts would offer a coherent and systemic approach to teaching the new Common Core State Standards. The aim: to create an online curriculum for those standards in mathematics and English language arts that spans nearly every year of a child’s pre-collegiate education. This aim has already been realized and is in practice in Common Core states.

The Pearson and Gates foundations also fund the Education Development Center (EDC) based in Waltham, Massachusetts. It is a global nonprofit organization that designs teacher evaluation policy.  Both stand to benefit from EDC recommendations. The center is involved in curriculum and materials development, research and evaluation, publication and distribution, online learning, professional development, and public policy development.

Through its alignment with the Gates Foundation and Common Core, Pearson dominates education testing and is raking in profits as school districts are pushed to replace paper textbooks with digital technology.  For example, the Los Angeles school system, with roughly 651,000 students, spent over $1 billion in 2013 to purchase iPads loaded with Pearson content.  Additionally, the Los Angeles schools purchased Pearson’s Common Core Systems of Courses to provide all the primary instructional material for math and English/language arts for K-12, even though the materials were incomplete in 2013.

Pearson’s profits will continue to increase as it has billions of dollars in long-term contracts with education departments in a number of states and municipalities to introduce both testing software and the teacher training software and textbooks it claims are necessary to prepare for the tests. For example, Illinois has paid Pearson $138 million to produce standardized tests; Texas, $50 million; and New York, $32 million.

Pearson is really raking in the dough now that Pearson VUE, the assessment services wing of Pearson, has acquired the examination software development company Exam Design.  CTB/McGraw-Hill is Pearson’s main competitor in the rise of standardized testing.

Corporations, finding they can profit from turning students into unimaginative machines, are newly discovering they can likewise profit from standardizing teachers as well. Starting in May 2014, Pearson Education will take over teacher certification in New York State as a way of fulfilling the state’s promised “reforms” in its application for federal Race to the Top money. The evaluation system, known as the Teacher Performance Assessment or TPA, was developed at Stanford University with support from Pearson, but it will be administered solely by Pearson, and prospective teachers will be evaluated entirely by Pearson and its agents.

A small cloud did fall over the Pearson Foundation (the nonprofit arm of educational publishing giant Pearson Inc.) in December of 2013, when a $7.7 million fine was levied for using its charitable work to promote and develop course materials and software to benefit its for-profit corporate sister.  After the investigation began, the Pearson Foundation sold the courses to Pearson for $15.1 million.

New York Attorney General Eric T. Schneiderman determined that the foundation had created Common Core products to generate “tens of millions of dollars” for its corporate sister. According to the settlement: “Pearson used its nonprofit foundation to develop Common Core product in order to win an endorsement from the Bill and Melinda Gates Foundation, which helped fund the creation of the Common Core standards, having announced in 2011 that it would work with the Pearson foundation to write reading and math courses aligned with the new standard.”

Since Pearson is the world’s largest education company and book publisher, with revenues of more than $9 billion annually, the $7.7 million fine was not a hardship. Pearson wasn’t always so big.  The British multinational was a relatively minor player in U.S. education in the early 2000s. Pearson started to grow when it embraced No Child Left Behind as its business plan and began rapidly buying up U.S. companies.

On June 10 of this year, The Bill & Melinda Gates Foundation announced its support for a two-year moratorium on tying results from assessments aligned to the Common Core State Standards to teacher evaluations or student promotions to the next grade level.

Although the Gates Foundation’s director of college-ready programs stated that Common Core was having a very positive impact on education, he acknowledged that teachers do need more time to adjust.

The moratorium came one day after Diane Ravitch, research professor of education at New York University and author of “Reign of Error,” sounded the alarm on June 9 over the implementation of Common Core and called for a congressional investigation, noting, “The idea that the richest man in [the U.S.] can purchase and — working closely with the U.S. Department of Education — impose new and untested academic standards on the nation’s public schools is a national scandal.”

It would be folly to suggest that either Bill Gates or Pearson, despite the temporary tactical retreat by Gates, will stop pushing for Common Core with its required educational technology. This nation spends over $500 billion annually on K-12 education.  When colleges and career-training programs are included, the education sector represents almost 9 percent of U.S. gross domestic product.  Companies like Pearson and Microsoft stand to profit greatly as they develop and administer the tests and sell the teacher-training material.

It is not unreasonable to suspect that companies like Pearson stand to gain when tests designed to measure the Common Core State Standards make most public schools look bad.  If widespread failure on Common Core tests materializes, school districts and parents will be pushed to purchase even more training technology, teachers in low-ranked schools will be fired, and schools will be turned over to private management.

As a textbook publisher, Pearson Education buckled to activists’ demands in Texas and replaced the scientific understanding of climate change with the politically driven claim that humans are causing climate change.  Because Texas is a large state, it has considerable influence on the national textbook market.

Might Common Core State Standards be the latest in the grand corporate scheme to profit from privatized public education?  In the interim, Bill Gates’ Microsoft and Pearson reap big CCSS profits.  Certainly neither teachers nor students are benefiting.

Categories: On the Blog

Heartland Daily Podcast: Sean Parnell – Obamacare After the Midterms and Gruber Comments

November 24, 2014, 12:23 PM

Research Fellow and Managing Editor of Health Care News Sean Parnell sits down with host Donald Kendal to discuss the latest health care news. Parnell talks about the election’s impact on Obamacare, the proposed 2017 Project, and the comments by Jonathan Gruber.

With the Republicans taking control of Congress, what is the likelihood of seeing a repeal of Obamacare? What is the 2017 Project, and does it have a chance of replacing Obamacare? These questions and more are answered in today’s podcast.

[Subscribe to the Heartland Daily Podcast for free at this link.]

Categories: On the Blog

The Evolving Urban Form: Tianjin

November 23, 2014, 9:49 AM

Tianjin is located on Bohai Gulf, approximately 75 miles (120 kilometers) from Beijing. It was the imperial port of China by virtue of that proximity. Tianjin also served as one of the most important “treaty ports,” occupied and/or controlled by western nations and Japan at various times before 1950.

Tianjin is pivotally located along the east coast corridor between “Dongbei” – the northeast (the provinces of Heilongjiang, Jilin, and Liaoning, also referred to as Manchuria) – and Jinan, Nanjing, Shanghai, and points south. Both the most direct expressway route (interstate standard) and the high-speed rail line from Shanghai to Dongbei pass through Tianjin rather than larger Beijing.

Tianjin is one of four centrally administered provincial level municipalities, along with Shanghai, Beijing, and Chongqing. While Tianjin has grown strongly in recent years, it has been one of China’s largest cities for decades. According to the United Nations, the 1950 Tianjin urban area was the second largest in China, with 2.5 million residents, trailing only Shanghai which had 4.3 million. Beijing trailed Tianjin by a third, at 1.7 million.

Population and Growth

Since 1982, the total population of Tianjin has expanded by nearly 90 percent, from 7.9 million to 14.7 million in 2013 (Exhibit 1).  Population growth has accelerated over that time. Between 2000 and 2010, the population rose 2.7 percent annually, more than double the 1.2 percent rate of the 1990s. The rate of increase was even higher between 2010 and 2013, at 4.5 percent.
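Those annual rates are compound growth rates between census counts. As a quick sketch of the arithmetic (the 2000 and 2010 totals of roughly 9.9 and 12.9 million are my own back-calculation from the rates above, not figures stated in the article):

```python
def annualized_growth(p_start: float, p_end: float, years: int) -> float:
    """Compound annual growth rate between two population counts."""
    return (p_end / p_start) ** (1 / years) - 1

# Roughly 9.9 million (2000) growing to 12.9 million (2010)
# reproduces the ~2.7 percent annual rate cited above.
rate = annualized_growth(9.9e6, 12.9e6, 10)
print(f"{rate:.1%}")  # ~2.7% per year
```

The same function applied to 7.9 million (1982) and 14.7 million (2013) yields about 2 percent per year over the full period, consistent with the near-90-percent cumulative expansion.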

Between the 2000 and 2010 censuses, the inner core district (Heping qu) experienced a population loss of 12 percent. But the rest of the municipality grew, accounting for 101 percent of the net growth. The balance of the core captured 18 percent of the growth, while the suburban ring attracted 27 percent. By far the greatest growth was in the outer districts, which accounted for a solid majority of the increase (Exhibit 2). This peripheral domination of growth mirrors the experience of other large Chinese cities, such as Shanghai, Beijing, and Chongqing, which have seen their core areas decline in population, with most growth occurring in the outer sectors.

A New Megacity

Tianjin is one of the world’s newest megacities (urban area over 10 million population). This has occurred because of the strong post-2010 population growth. In the next Demographia World Urban Areas (early 2015), Tianjin will have an estimated built up urban area population of 10.9 million. With an urban expanse covering 775 square miles (2,007 square kilometers), Tianjin has an urban population density of 14,100 per square mile (5,400 per square kilometer).
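The density figures follow directly from the population and land area quoted above; a one-line check in Python (using only the article's own numbers):

```python
pop = 10.9e6              # built-up urban area population
sq_mi, sq_km = 775, 2007  # urban land area in each unit

# Rounded to the nearest hundred, matching the article's figures.
print(round(pop / sq_mi, -2))  # 14100.0 per square mile
print(round(pop / sq_km, -2))  # 5400.0 per square kilometer
```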

With the urban area expanding geographically, Tianjin fits the international trend of cities, in growing strongly, yet experiencing declining overall urban densities. Chinese urban planners have told me that it has been an intended objective of policy to reduce population densities, to give people more living space. This is despite the preachments of US and European urban planners for whom higher densities often are embraced as an “Article of Faith.”

Tianjin’s Urban Form

Despite their comparatively high density, Chinese cities are anything but compact. Most are polycentric in urban form, with central districts that have widely spaced commercial buildings (the most notable exceptions may be Shanghai, Chongqing, and Dalian, but even these are somewhat polycentric). Tianjin, along with the “in situ” urbanization of Quanzhou, may be the least compact of the major cities.

Tianjin has a broad central business district (CBD), populated with tall commercial buildings and residential structures (Exhibits 3 & 4). As is the case in many Asian cities (such as Bangkok, Guangzhou-Foshan, Xi’an, and Beijing), the tall commercial buildings tend to be highly dispersed, rather than close together as is the custom in Canadian and American cities. In between the dispersed tall buildings are lower-rise buildings, both commercial and residential.

Currently the tallest building in the CBD is the Tianjin World Financial Center (Exhibit 5), at 76 stories (1,105 feet or 337 meters). This is somewhat taller than New York’s Chrysler Building, which was the second tallest in Gotham for years. However, another, taller building is nearing completion, the Tianjin R&F Guangdong Tower (Exhibit 6), which is well on the way to its 91 floors (1,535 feet or 468 meters). Even this building is not as tall as three others under construction in other Tianjin centers.

A second central business district is developing in the Binhai new area, near the port and 30 miles (50 kilometers) south of the Tianjin CBD. The Rose Rock International Financial Center will reach 100 floors (1,929 feet or 588 meters). This, however, is only the second tallest under construction. The CTF Tower is also under construction and will reach 96 floors (1,740 feet or 530 meters), nearly as tall as the new World Trade Center in New York (1,776 feet or 541 meters).

Finally, the tallest building in Tianjin, Goldin Finance 117, is under construction approximately 9 miles (15 kilometers) west of the Tianjin CBD in a virtually new business center. This building will exceed the heights of all but three of the completed skyscrapers in the world (lead photo).

Altogether, Tianjin will soon have five buildings of more than 90 floors, a record few if any cities will soon equal.


Tianjin has more than its share of modern Chinese high-rise commercial structures and residential buildings. But, perhaps more than any other Chinese city, Tianjin exhibits the architecture of the foreign powers, to a greater degree than other former treaty ports (such as Fuzhou, Dalian, and Wuhan). The city of Tianjin has meticulously preserved many of these structures, not only commercial and residential buildings but also churches.

The Tianjin CBD has a number of low rise streets with European architecture. Some of the most impressive are across the Hai River from the Tianjin Railway Station. There is also a long pedestrian street beyond with considerable western architecture. Virtually throughout the urban core there are examples of classic western architecture, some as ornate as in central Buenos Aires (Exhibit 7).

Perhaps the most distinctive feature is a large area of western residences just to the south of the Tianjin CBD (Exhibits 8 & 9).

In the Beijing Orbit: An Advantage

Tianjin is clearly in the orbit of larger Beijing, which has recently announced plans for a 7th ring road and other infrastructure to tie together not only the city but also the adjacent provincial-level jurisdictions of Tianjin and Hebei. With a strong policy interest in limiting Beijing’s population growth, and with plenty of rural land available, Tianjin could receive a substantial share of the growth that otherwise would go to Beijing.

Top photo: Goldin Finance 117 under construction as of November 6, 2014 (by author).

Wendell Cox is principal of Demographia, an international public policy and demographics firm. He is co-author of the “Demographia International Housing Affordability Survey” and author of “Demographia World Urban Areas” and “War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.” He was appointed to three terms on the Los Angeles County Transportation Commission, where he served with the leading city and county leadership as the only non-elected member. He was appointed to the Amtrak Reform Council to fill the unexpired term of Governor Christine Todd Whitman and has served as a visiting professor at the Conservatoire National des Arts et Metiers, a national university in Paris.


[Originally published at New Geography]

Categories: On the Blog

If We Had Some Global Warming … (Song Parody)

November 22, 2014, 8:03 PM

Via Minnesotans for Global Warming

The video below is from back in 2007, by the Freezing Emperors from Minnesotans for Global Warming — who wrote and produced the most famous global warming parody song of all time, “Hide the Decline,” which Michael Mann tried to get taken down from YouTube, with limited success.

Someone shared the song below with Heartland tonight on Twitter, and I don’t know what took them so long! It’s a song called “If We Had Some Global Warming,” a parody of “If I Had a Million Dollars” by Barenaked Ladies.

I think it is even colder now than in 2007 — at least judging by last winter in the Midwest, and the early arrival of another dread “polar vortex” this month. So, yeah. We could use some global warming right about now. The experts keep promising it, but when will it arrive?

Enjoy. This song is cheeky fun.

Oh, what the heck. Watch the great “Hide the Decline,” too. The warmists will not be mocked! … or at least that’s what they want. Happily, we still have free speech in this country, and the Internet is forever. Sorry, Mike.

Categories: On the Blog

Heartland Daily Podcast: Drew Johnson – Proposed Global Tobacco Tax

November 20, 2014, 4:41 PM

Washington Times columnist and editor Drew Johnson joins The Heartland Institute’s Budget and Tax News managing editor Jesse Hathaway to talk about the World Health Organization’s (WHO) “Article 6,” a proposed global tax aimed at making tobacco products prohibitively expensive.

Johnson, ejected from covering the public meetings for reporting on the WHO proceedings in Moscow, talks about the United Nations health organization’s misguided priorities, and the undemocratic nature of the proposed tariff rules.

[Subscribe to the Heartland Daily Podcast for free at this link.]

Categories: On the Blog

The U.N., The Ultimate Narc

November 20, 2014, 3:57 PM

Last week, the U.N. anti-narcotics chief, Yury Fedotov, made headlines when Reuters reported he said moves by American states to end the prohibition of marijuana were illegitimate under existing international drug conventions. He added that he may take action against these states as well.

The drug convention Fedotov referred to is the 1961 Single Convention on Narcotic Drugs. This 50-plus-year-old agreement limits the production and consumption of cannabis to medical purposes only. So the dozens of states that have passed medical marijuana laws are still in compliance. However, Colorado, Oregon, Washington, Alaska, and D.C. passed laws through ballot initiatives that legalized marijuana for recreational use. These are the cases that caught the attention of Fedotov. “I don’t see how (the new laws) can be compatible with existing conventions,” said Fedotov.

But what can the U.N. do to fight this? Luckily, not much. In response to a question about what the U.N. could do about it, Fedotov stated he would discuss the issue in the near future in Washington.

While Fedotov and the United Nations Office on Drugs and Crime (UNODC) may have no real ability to combat these moves, the hubris alone is overwhelming. To think an international governmental organization like the U.N. can change the policies enacted by individual states in America is frightening. Marijuana prohibition should not be the responsibility of an international governing body. In fact, it should not even be a concern of the federal government.

The actions taken by these states do fly in the face of U.N. drug conventions; they are also inconsistent with federal law. Fortunately for liberty advocates, the federal government has condoned these moves in order to avoid conflict and potential political fallout. Individual states have been allowed the freedom to craft their own recreational drug policies. This, however, does not rule out a reversal on this position in the future.

Hopefully nothing will materialize from all of this. The U.N. should take a lesson from the federal government on this matter and keep its nose out of the business of these states.

Categories: On the Blog

“Where to Watch” Piracy Decrease

November 20, 2014, 3:02 PM

The Internet ecosystem just added a new tool to preserve the property of rights holders while encouraging greater use of broadband. The Motion Picture Association of America has announced the launch of a new search engine called WhereToWatch.com.

As Variety has reported, “MPAA — upping efforts to help consumers find legal sources of content instead of pirating it — has rolled out WhereToWatch.com, an advertising-free entertainment search engine designed to point people to TV shows and movies from authorized sources. WhereToWatch.com includes info and links from providers including Netflix, Apple’s iTunes, and Hulu as well as smaller sites like SnagFilms and WolfeOnDemand. MPAA said it expects to expand its list of partners in the coming months.”

Great, but what does this have to do with public policy? Rather than relying on another years-long legislative battle, which may fail to reach any sort of resolution, the industry got to work creating a solution to help protect its property. This sort of industry self-help should be lauded and encouraged across the digital ecosystem.

More success will come as all parties understand that they must do their part and that an economically thriving digital ecosystem requires good faith cooperation, within the bounds of the law, with an eye towards what is best for the broader ecosystem. Less infringement combined with great legal choices available in many places for consumers is in the best interest of all.


[Originally published at Madery Bridge]

Categories: On the Blog

How Republicans Can Push Back Against Immigration Executive Orders

November 20, 2014, 2:21 PM

In a segment on a recent episode of Your World with Neil Cavuto, Heartland Institute research fellow David Applegate outlined the options Republicans can use to push back against Obama’s executive orders on immigration. Applegate says some options won’t yield much but others have the potential to produce results.

In a very revealing montage to begin the clip above, Obama is shown repeatedly saying over four years that he has no legal ability to legislate by executive order in this manner. This seems to have been forgotten in light of recent announcements by the Obama administration. Republicans, however, have a few options that may block these executive orders.

The first two options mentioned by Cavuto may not have much success. As Applegate says, taking the matter to court would likely not work: “Suing in the courts is something that the courts really do not want to handle.” Whatever case there may be would likely be ignored by the courts. Applegate says the other option, impeachment, is a legitimate constitutional option Congress has. However, this is also unlikely to go anywhere. Regarding impeachment, Applegate says, “politically that would go nowhere.” There is some hope for the Republicans, however.

Applegate says Mitch McConnell and John Boehner need “to realize they still have two very strong cards to play.” One is to use the power of the purse. According to Applegate, Republicans could use the threat of defunding specific government programs as leverage in negotiating with the Obama administration. Another option would be for Republicans to compromise with the president and agree upon a more bipartisan approach to illegal immigration.

It will be interesting to see what options are pursued by Republicans in the coming weeks. Stay tuned for more insight and information on this developing situation.

Categories: On the Blog

A Lot of “Folks” And “Just Some Guys”

November 20, 2014, 1:07 PM

Apart from his halting, staccato, eight-to-ten-word phrase delivery when not reading off a TelePrompTer, President Barack Obama has two noticeable and telling verbal tics. The first is “folks”; the second is “just some guy.”  The first is just an annoying and apparently insincere way of trying to show that, despite being President, he’s really, you know, just one of us.  But the second is a tell-tale sign that he’s throwing somebody under the bus.

Perhaps “folks” is the way that Harvard-educated lecturers in law at the University of Chicago are taught to talk about their fellow Americans, but I rather doubt it. Having attended an Ivy League school myself and having studied law at the University of Chicago for three years, I’m pretty sure I never heard the word “folks” once.  Even Tennessee Ernie Ford used “people,” as do the U.S. Constitution’s opening three words, “We the People….”

Obama uses the word “folks” whenever he wants to sound sage and, well, folksy; usually when about to make a patronizing observation about the American people that justifies, in his mind, his administration’s increasingly one-party top-down style of governing.

“That’s just how white folks will do you,” he wrote in Dreams from My Father: A Story of Race and Inheritance, referring to what he perceived as white arrogance and cruelty.  “These are folks who are strong allies and supporters of me,” he said in an October 20, 2014, interview with Al Sharpton, referring to Democratic candidates who were running away from him in the recent midterm elections.  “We need to internalize this idea of excellence,” he said on another occasion.  “Not many folks spend a lot of time trying to be excellent.”  And, in a particularly portentous and lecturing moment, “Folks haven’t been reading their Bibles.”

Most infamously, Obama awkwardly claimed during an impromptu Friday, August 1, 2014, White House news conference regarding the War on Terror that “We tortured some folks.”  That struck many “folks” as inappropriate, leading one commentator on Twitter to ask incredulously, “Wow.  How does the supposedly rhetorically great Obama use ‘torture’ and ‘folks’ in the same sentence?”

But it’s Obama’s use of “just some guy” that signifies when someone has outlived his usefulness to the president, at least for public consumption.

Obama’s political mentor in Chicago’s Hyde Park neighborhood for many years was American terrorist Bill “Guilty as hell, free as a bird” Ayers, a founder of the radical Weathermen group.  Ayers is widely suspected of having ghost-written at least large portions of Obama’s two books for him, and Obama and Ayers worked closely together on the Chicago Annenberg Challenge, a five-year failed philanthropic venture for which Ayers wrote the grants and Obama chaired the board that distributed the money.   But when Obama ran for President and Sarah Palin called him out for “palling around with terrorists,” Ayers became “just a guy who lives in my neighborhood” who hasn’t been publicly seen in the President’s company since.

Obama’s most recent use of the phrase is in reference to Jonathan Gruber, the now-infamous MIT professor who was one of Obamacare’s architects.  While working to help get Obamacare passed, Gruber was highly regarded and highly rewarded.  The administration cited Gruber frequently in hearings and White House blogs, dedicated a webpage to his analysis, met with him repeatedly at the White House, and paid him $380,000 of taxpayer money in 2009 alone.

Now that videos have surfaced in which Gruber calls his fellow Americans not “folks” but “stupid” and brags that Obamacare was founded and sold on deliberate lies, Gruber has become just “some adviser who never worked on our staff,” which even Politifact rates as “mostly false.”

Even worse for the administration, perhaps, Gruber is also on record having said that Obamacare was deliberately designed so that, to encourage states to set up health care exchanges, residents of states that declined to do so would be ineligible for income tax subsidies.  Now that 36 states have declined to set up exchanges and Obama has directed his IRS to provide subsidies anyway, Gruber’s comments have become just a misquoted typo taken out of context, and Gruber himself, in the President’s own words, has become just another guy.

It turns out that the Obama administration may be the most transparent in history, just not in the way that it meant.  As Yogi Berra once said, you can observe a lot just by watching.

Categories: On the Blog

Top Ten Questions to Ask About Title II Utility Regulation of Internet

November 20, 2014, 10:48 AM

If Congress or the media seek incisive oversight/accountability questions to ask the FCC about the real world implications and unintended consequences of its Title II net neutrality plans, here are ten that fit the bill.

1. Authority? If the FCC truly needs more legal authority to do what it believes necessary in the 21st century, why doesn’t the FCC start the FCC modernization process and ask Congress for the legitimacy of real modern legislative authorities? Or is it the official position of the FCC that its core 1934 and 1996 statutory authorities are sufficiently timeless, modern and flexible to sustain the legitimacy of FCC regulation for the remainder of the 21st century?

2. Growth & Job Creation? While it may be good for the FCC’s own power in the short-term to impose its most antiquated authority and restrictive Title II regulations on the most modern part of the economy, how would that heavy-handed regulation be good or positive for net private investment, economic growth and job creation?

3. Zero-price? Does the FCC seek new legal theories and authority for the purposes of setting a de facto permanent zero-price for some form of downstream Internet traffic, or not?

4. Consumers? How is it neutral, equal or fair under FCC net neutrality regulations for consumers to pay for faster Internet speed tiers/lanes and their Internet usage, but it is somehow a violation of net neutrality for Silicon Valley giants to pay anything other than a price of zero for delivery of their hugely-outsized downstream Internet traffic? (And why would FCC Title II reclassification also not have the unintended consequence of triggering large new fees and taxes on unsuspecting consumers?)

5. UN-ITU? Would the FCC’s reclassifying Internet traffic as “telecommunications” give the U.N.’s International Telecommunication Union the legal authority and cover to assert governance over the Internet, as it has long had over international telecommunications and international telecommunications trade settlements? (And in imposing the most restrictive American regulatory regime available to prevent potential problems, wouldn’t the FCC be leading, and giving political cover to, autocratic nations that seek to impose similar maximal regulation of their Internet for the autocratic purposes of censoring, spying on, and controlling their people?)

6. Cost-Benefit? In any potential Title II action will the FCC abide by the President’s 2011 Executive Order 13563 that requires the FCC to use “the least burdensome tools for achieving regulatory ends,” and to “adopt a regulation only upon a reasoned determination that its benefits justify its costs?”

7. Forbearance? Under a “hybrid” (Title II/Section 706) approach, how does the FCC square the circle of justifying forbearance from almost all Title II regulations by showing there is enough competition to protect consumers, while simultaneously justifying reclassification of the Internet as a Title II utility because of insufficient competition to protect consumers?

8. Deployment Barriers? Since Section 706 is about removing barriers to broadband deployment, how would Title II’s Section 214, which requires the FCC’s prior approval to upgrade any “telecommunications” facilities (a process that routinely takes months at a minimum), not be considered a barrier to broadband deployment under Section 706?

9. Internet Backbone? What is different competitively in the Internet backbone market now, after 20 years of no FCC regulation, that warrants maximal FCC regulation under Title II for the first time since the Internet was privatized in the early 1990s?

10. Supreme Court? Wouldn’t a June 2014 Supreme Court precedent (Util. Air Reg. Grp. v. EPA), which establishes that FCC rules “must be ‘ground[ed] … in the statute,’ rather than on ‘reasoning divorced from the statutory text,’” disallow the FCC from reclassifying services solely for the purpose of evading other statutory provisions that Congress passed to restrict FCC authority?

[First published at the Precursor blog.]

Categories: On the Blog

You’ve Been Gruber’d, Stupid!

November 20, 2014, 8:23 AM

“No.  I — I did not.  Uhhh, I just heard about this… I — I get well briefed before I come out here.  Uh, th-th-the fact that some advisor who never worked on our staff, uhh, expressed an opinion that, uhh, I completely disagree with wuh, uhh, in terms of the voters, is no reflection on the actual process that was run.” — President Obama replying to a question about Jonathan Gruber at the conclusion of the G-20 Conference in Brisbane, Australia. 

Will the last name of the MIT professor identified as the “architect of ObamaCare” become a verb some day? Will people say, “I’ve been Gruber’d”? Or, “The government is Grubering again”?

 After all, when he admitted that ObamaCare’s passage was achieved by deceiving the Congressional Budget Office and the entire American public, turning his name into a synonym for lying is not unthinkable. Adding insult to injury, he said the voters were “stupid.”

How stupid was it for the Democrat-controlled Congress to pass a two-thousand-page piece of legislation that none of them had read? (No Republican in Congress voted for it.) ObamaCare took over one-sixth of the U.S. economy and did something that makes me wonder why we even have a Supreme Court: it required people to buy a product whether they wanted it or not. If they didn’t, they would be subject to a penalty.

One way or the other, the federal government was going to squeeze you. The Court did conclude early on that ObamaCare was a tax, but don’t expect the mainstream media to tell you about all the other taxes hidden within it.

What surprises me about the Gruber revelations—available on YouTube to any journalist who wanted to investigate, but none did—is that there appears to be so little public outrage. An arrogant MIT professor who received $400,000 from the government and made millions as a consultant to the states that needed to understand ObamaCare calls voters stupid, and the initial reaction of the mainstream media was to ignore the story.

At the heart of the Gruber affair is the fact that Obama and his administration have been lying to the voters from the moment he began to campaign for the presidency. In virtually every respect, everything he has said for public consumption has been and is a lie.

 In one scandal after another, Obama would have us believe he knew nothing about it. That is the response one might expect from a criminal rather than a President.

 One has to ask why it would be difficult to repeal in full a piece of legislation that the President said would not cause Americans to lose their healthcare insurance if they preferred their current plan, that would not cause them to lose the care of a doctor they knew and trusted, and would save them money for premiums. The initial deception was to name the bill the Affordable Care Act.

 Repeal would help ensure the solvency of Medicare and restore the private sector market for healthcare insurance.

This is a President who was elected twice, so maybe Prof. Gruber is right when he speaks of stupid voters. Not all, of course, but more than voted for Obama’s two opponents. As this is written, more than 45% of those polled continue to express approval of Obama’s performance in office. How stupid is that?

 What is so offensive about Gruber’s own revelations about the manner in which the bill was written and the lies that were told to get it passed is the incalculable misery it has caused millions of Americans.

 It has caused the loss of jobs. It has forced others into part-time employment. It has caused companies to reconsider expanding to grow the economy. It has driven up the cost of healthcare insurance. It has impacted local hospitals and clinics to the point where some have closed their doors. It has caused many healthcare professionals to retire or cease practicing medicine.

 I invite you to make a list of all the things you think the government should require you to purchase whether you want it or need it. Should you be required to own a bike and use it as an alternative to a car? (Yes, you must own auto insurance to defray the cost of accidents, just as you must pay a tax on gasoline to maintain our highway system.)  Should you be required to wear a certain style or item of clothing? Should you be required to get married by a certain age? Should you be required to eat certain foods and avoid others?

A new study by the Legatum Institute in London ranked citizens’ perceptions of their personal freedom in a number of nations. Americans ranked way down the list, at 21 out of 25, well below Canada, France, and Costa Rica, to name just three. The study was based on a 2013 poll.

What is at stake here is (1) the absolute need for a trustworthy federal government and (2) the need to repeal a piece of legislation based entirely on lies. On a larger scale, the right to make your own decisions on matters not relevant to the governance of the nation should be regarded as sacred; it’s called liberty.

 The Republican-controlled Congress and the Supreme Court are the two elements of our government that can and must provide a measure of protection against the deception that is practiced every day by President Obama and members of his administration. Let’s hope neither is “stupid” in the two years that remain.


Categories: On the Blog

Heartland Daily Podcast – David Schweikert: Secret Science Reform Act of 2014

November 19, 2014, 3:33 PM

Congressman David Schweikert, a Republican representing Arizona’s 6th district, is the chairman of the Subcommittee on the Environment of the House Science, Space and Technology Committee. In that capacity, Rep. Schweikert introduced the Secret Science Reform Act of 2014 (H.R. 4012), and with the support of Texas’s own Lamar Smith, chairman of the full committee, it was passed out of committee.

The bill requires the EPA to disclose all the science, research, models and data used to justify regulations, and the results would have to be reproducible by independent researchers. Schweikert argues research used to make rules imposed on the public, especially when it is funded directly or indirectly by taxpayers, should be transparent.

[Subscribe to the Heartland Daily Podcast for free at this link.]

Categories: On the Blog

Heartland’s Jim Lakely Crushes ACLU in Net Neutrality Debate

November 19, 2014, 2:52 PM

The Internet isn’t broken, and doesn’t need the government to fix it. That was my overriding message in a debate on Chicago’s PBS station WTTW Tuesday night with Illinois ACLU Executive Director Colleen K. Connell.

In an excellent discussion led by “Chicago Tonight” host Phil Ponce, Connell and I talked about President Obama’s attempt to strong-arm the FCC into regulating the Internet like a public utility under “Title II” of the Communications Act. That’s the only sure-fire way to ruin the vibrant digital economy and the online experience we all now take for granted on our computers and mobile devices.

If Obama had been president in 1996, Netflix and Hulu as we know them today wouldn’t exist. Why? Because anyone who thought it was a good idea to stream content directly to consumers — circumventing the bundled-channels model of cable TV, and even creating original content — would have had to invest enormous human and monetary capital into convincing the FCC that it was something consumers wanted. All of that creative energy would have been wasted on government rent-seeking instead of creating what we have today.

Thankfully, a Republican Congress and Democratic President Bill Clinton in 1996 put then-nascent broadband under “Title I,” which did not give the FCC the power to micromanage the Internet “on our behalf.” As a result, dozens of ISPs (Comcast, AT&T, and Verizon are the Big Three, but not the only ones), content providers (Google’s YouTube, Netflix, Hulu, ESPN), millions of app creators, and billions of consumers enjoy the ever-evolving wonders of the digital age. It might be the best example Adam Smith and Milton Friedman could have conceived of free-market capitalism providing the best services at the lowest prices and at the fastest possible speeds — both in terms of innovation and delivery.

Yet Connell, my debate opponent from the Illinois ACLU, was arguing that now — right now — the government must get its molasses-filled hands deep into the structures of all these wonders and start mucking around. Why? Because of potential “anti-consumer” actions that might be taken by the big players in the digital economy. It’s a hard argument to make, which is why I got the better of these exchanges.


Categories: On the Blog

Exchange on Texas textbook controversy

November 19, 2014, 12:35 PM

Recently there has been a great deal of controversy over the adoption of new social studies and history textbooks in Texas. Global warming alarmists have successfully pressured textbook publishers into removing any trace of the undeniable fact that the causes and consequences of global warming are still open for debate, or that any scientists still question the theory that humans are causing catastrophic climate change. This is unfortunate: in removing all reference to skepticism concerning warming, the publishers have forgone the truth for political expediency. Should these textbooks be adopted as is, it will be a disservice to Texas students. It is my sincere hope that the Texas State Board of Education will reject all the proposed books, reopen the selection process, and make it clear that any textbooks to be adopted, if they address global warming (and they need not do so), must acknowledge the ongoing debate concerning the causes and consequences of climate change and air the views of both sides.

I have recently written some pieces about the Texas schoolbook controversy in Human Events and The Dallas Morning News.

A reader wrote me in response to the Dallas Morning News article. I have reprinted our exchange below (leaving out the name of the writer to protect his anonymity). I post it to answer questions others might have but haven’t asked.

First e-mail:

Mr. Burnett, I agree with you that we all should have an open mind on this subject and we should lead our students into critical thinking, (about any subject for that matter).
But you contradict that statement in your own editorial. You flatly state: “The models are wrong.” You also cherry pick and mislead by stating: “…the facts show and failed to predict the current lack of increase in Earth’s average temperature…”

Those statements do not show someone who is keeping an open mind and using critical thinking.

No model that I have seen (and I have seen probably just about every one) do not support your statements unless one cherry picks the information provided and doesn’t take a look at all the information provided.

The Koch brothers funded a study with a team of skeptical scientists. They provide all their data with total transparency. Check the results for yourself in the second link.

(Name withheld)

My response:

Mr. (Name withheld):

Thank you for your attention to my article and for your thoughtful response. However, I believe we will have to agree to disagree. Though stating “the models are wrong” is a strong statement concerning current temperatures and temperature trends, I believe it is wholly accurate; see Dr. Roy Spencer’s post.

Concerning the really important matters, the projected harmful results from AGW, it’s certainly true that one can pick and choose models to fit whatever current weather/climate trends or events happen and then claim that this hurricane or that drought, etc., is consistent with the models, but that doesn’t seem to me to be a sound scientific process. One critical feature of the scientific method is falsifiability, the ability to disprove a theory. A theory that can’t be disproven is a religion. I have no problem with religion, but religion should not be confused with science. Michael Mann once told me that for AGW to be wrong, everything he knew about physics (including fundamental laws of gravity, thermodynamics, and laws concerning the conservation of energy and entropy) would have to be wrong. That is sheer hubris in the face of the numerous climate factors of which scientists are ignorant. I have written about the problems with selective choice of models a number of times in the past, for instance at National Review and Human Events.

Nothing I’ve seen since I wrote these pieces has led me to change my considered opinion.

There are simply too many factors that climate modelers admittedly don’t understand, or don’t know how to plug into their models, to have any confidence in the models’ projections of future harms when they cannot even accurately model temperature trends.

If there is a model that has accurately projected the current lull in temperature, let me know which one it is; then let’s check the projections that flow from it concerning various climate phenomena and trends and see how they comport with what has actually been measured or recorded. A model that projected temperatures and most recorded phenomena and measured trends accurately I would count as one worth serious consideration.

Concerning the Koch brothers, I have never met them, and to my knowledge my work has never been funded by them. More importantly, even if I, or the scientists you discuss, were wholly paid employees of the Koch brothers, the question would remain: is what we have written wrong? To indict or praise someone’s work because of whom they associate with, who pays them, or because it is supported by a majority (or consensus) is a logical fallacy.

Arguments should be analyzed strictly on their own merits, regardless of whether one likes or trusts the persons or groups making them.

Just my two cents.

Take care,

H. Sterling Burnett, Ph.D.


I had thought that would be the end of the exchange, but it was not. Instead I received this reply:

So you’re not going to keep an open mind and you haven’t studied the lag affect yet? Looking at only the last 16 years of data reminds me of my liberal friends who post the Federal Budget Deficit from 2009 to present and they show how much it has come down under Obama. But of course they fail to acknowledge and show the previous years budget deficit prior to 2009 and the enourmous jump it made right after Obama took office.

(Name withheld)

To which I responded:

(Name withheld):

You misunderstand me or misstate my position. I don’t have any problem with the idea of a lag effect; just show me where the models we are supposed to trust build it in. They treat emissions as if they worked geometrically, not logarithmically. I never said that I looked at only the last 16 years of data; I just pointed out that the models missed the actual measurements for the past 18 years. The models and their promoters don’t accurately portray the state of our knowledge about climate sensitivity.

To the extent that the models have accurately hindcast past temperatures and trends, they have done so only after much tinkering and building in adjustment factors. The fact that the models’ temperature projections have error bars as great as or greater than the actual expected temperature rise tells me they aren’t very useful for decision making. If I hired a caterer for a party and informed him that I expected 30 people to attend, but there could be as few as zero (not counting me) or as many as 75, I wouldn’t really be providing him with much useful information for planning purposes.

I’ve got an open mind; I admit I could be wrong. And all I ask in order to trust a model’s projections is a model that gets temperatures and expected climatic events mostly right. You seem to think that that is too much to ask on my part. Unlike AGW believers, I take data and evidence to be paramount, not unconfirmed (perhaps unconfirmable) theory. In addition, I don’t require the scientific method or the basic rules of physics, biology, chemistry, or any other core set of principles to be overturned to prove me wrong.

Having said this, even if I am wrong and potentially dangerous human-caused warming is in the offing, I would still argue that the actual harms, today and in the future, from denying widespread access to and use of fossil fuels would be greater than the harms from warming. Leaving billions in grinding poverty and want today, and leaving future generations poorer than they otherwise would be, by stopping fossil fuel use is immoral, since it sacrifices present generations based on possible harms to future generations that will be much wealthier and better able to adapt to or mitigate climate change due to their greater wealth. It’s the difference between the impacts of cyclones and tidal waves on Indonesia or the Indian subcontinent compared to the impacts in the U.S. or Japan, or the difference between the harms from widespread drought in the Southwestern U.S. compared to a similar drought in Sudan or Ethiopia.

Take care,

H. Sterling Burnett, Ph.D.

Any thoughts?

Categories: On the Blog

Thinking the Unthinkable – Part IV

November 19, 2014, 11:40 AM

I was gratified by the excellent attendance at the Free State Foundation’s program last Friday titled, “Thinking the Unthinkable: Imposing the ‘Utility Model’ on Internet Providers.” If you weren’t there, you missed what was a very important event – one that, in light of the substantive discussions that occurred, likely will play an important role going forward in the debate over the Federal Communications Commission’s consideration of the imposition of new net neutrality mandates.

With the release of President Obama’s video on Monday, what he calls “President Obama’s Plan for a Free and Open Internet,” directly urging the FCC to classify Internet service providers as common carriers – that is, to impose the Title II public utility model of regulation – the FCC’s proceeding has now become even more highly politicized than before. More and more, in the absence of any present market failure or consumer harm, the proceeding is looking like a textbook case study in administrative agency overreach. Or put more bluntly, an administrative agency power grab.

And with President Obama interjecting himself so directly into the net neutrality rulemaking, the proceeding is also providing a textbook example of the problematic nature of so-called independent agencies like the FCC, which in any event occupy an odd place in our tripartite constitutional system. After all, in the interests of accountability, the Constitution vests all powers in the legislative, executive, and judicial branches, not in agencies, commonly referred to as the “headless fourth branch” of government, that blend together quasi-executive, quasi-legislative, and quasi-judicial powers.

Witnessing what is transpiring at the FCC calls to mind for me a famous statement by Roscoe Pound, the distinguished American legal scholar and long-time Dean of the Harvard Law School, concerning the rise of administrative agencies like the FCC. In 1920, he said: “[T]he whole genius of administrative action through commissions endangers the supremacy of law. Not the least task of the common-law lawyers of the future will be to impose a legal yoke on these commissions, as Coke and his fellows did upon the organs of executive justice in Tudor and Stuart England.”

It is possible that, when all is said and done, the FCC’s actions in the net neutrality proceeding might lead to a fundamental rethinking by Congress of the proper institutional role in the Digital Age of an agency created in 1934 (or, in part, in 1927 if you wish to go back to the Federal Radio Commission). To its credit, the House Commerce Committee early this year began such a #CommActUpdate process in earnest, and when the new Congress convenes in January, the Senate should follow suit.

In the meantime, it is at least somewhat encouraging that FCC Chairman Tom Wheeler appears to recognize the need to take a pause in the net neutrality rulemaking. This would be a good idea, especially if Mr. Wheeler and his Commission colleagues use such a pause as a time for some serious rethinking concerning the proper exercise of the agency’s putative authority with respect to Internet regulation.

On the assumption that the willingness to engage in reflection and rethinking should always be in order, those at the FCC and elsewhere would do well to review the remarks delivered by Commissioners Ajit Pai and Michael O’Rielly at the Free State Foundation’s event on November 14th. Regardless of any predispositions regarding what actions – or not – you believe the FCC should take, their statements at the least warrant careful consideration.

Commissioner Pai’s remarks are here

Commissioner Michael O’Rielly’s remarks are here

And the remarks of Rep. Bob Latta, the Vice Chairman of the House Communications and Technology Committee, are important as well. They are here.

I’m going to go back and re-read each of these, and I hope you will read them too.

PS – The video of the opening remarks and the lively panel discussion featuring Robert Crandall, Gerald Faulhaber, Deborah Taylor Tate, and Michael Weinberg will be posted shortly.

[Originally published at The Free State Foundation]

Categories: On the Blog

And the Records Keep Falling

November 19, 2014, 11:38 AM

Weather map used with the kind permission of Weatherbell Analytics.

America is experiencing record-setting cold; anywhere you go or live, you just can’t avoid it.

Weather Bell says it best:

The Lower-48 or CONUS spatially averaged temperature plummeted overnight to only 19.4°F, typical of mid-winter, not November 18th!

An astounding 226-million Americans will experience at or below freezing temperatures (32°F) on Tuesday as well — if you venture outdoors.

More than 85% of the surface area of the Lower-48 reached or fell below freezing Tuesday morning. All 50-states saw at or below freezing temperatures on Tuesday.

With record lows from Idaho to Nebraska and Iowa, south to Texas, and east through the Great Lakes, the eastern two-thirds of the U.S. will shatter decades-long and, in some cases, century-long records. Temperatures east of the Rockies will be 20-40°F below climate normals.

Compared to normal, temperatures over the past several days have dropped off a cliff — to 10°C below climate normal — more anomalous than even during the polar vortex of early January.

How cold is it? During the past week, 1,360 daily low maximum records were set — meaning those 1,360 cities and towns saw their coldest daily highs ever recorded on that particular date.

In addition, snow covers more than 50 percent of the country, which is more than twice the coverage the U.S. usually experiences by mid-November.

CNN reports that areas in Buffalo, New York, among other cities along the Great Lakes, have experienced a year’s worth of snow in just three days this past week.

To repeat: Every state in the Union had at least one location within its borders registering temperatures below freezing (yes, including Hawaii), and more than 1,360 cities and towns set record-low high temperatures.

I know all this record cold naturally makes alarmists think of global warming, as, in fact, it does me — though for me as I shiver in Dallas, I want to see some warming.


Categories: On the Blog

Textbooks Proposed for Texas Schools Open Can of Worms

November 19, 2014, 11:03 AM

Controversy continues over the adoption of new schoolbooks in Texas, as environmental lobbyists fight to have sound science concerning global warming removed from the curriculum. With the ability to influence millions of schoolchildren regarding climate change, environmental alarmists are trying to ensure their message is the only one heard.

Alarmists claim the science is certain: Humans are causing catastrophic climate change and governments must force people to use less energy to prevent disaster. To be clear, climate change is occurring; the climate is always changing. However, there is an ongoing, heated and widespread scientific debate over whether human activities are responsible for all, some or none of the recent climate change. In addition, there is certainly no agreement a warmer climate will result in more dangerous weather patterns or climate conditions than we already experience.

The predictions of catastrophe are based on models that ignore the facts and failed to predict the current 18-year lack of increase in Earth’s average temperature, which has happened despite rising CO2 levels. All the models have assumed and continue to assume the increase in CO2 is the culprit causing temperature increases. The models are wrong.

The textbooks in question don’t deny human-caused global warming is happening; they just accurately report scientists are still debating the question. They present the evidence and ask the students to make up their own minds.

Having an open mind is what climate alarmists really object to.

A couple of textbook publishers, including Pearson Education just last week, buckled to the activists’ demands and replaced the scientific understanding of climate with the politically driven, dogmatic claim humans are causing dangerous climate change. Reasonable people will praise McGraw-Hill for, so far, resisting the alarmists’ pressure tactics.

The Texas Board of Education is justifiably acting cautiously to ensure its textbooks rigorously present the best science available and accurately portray ongoing debates, including those over climate change. They are right to do so and should endorse only textbooks that uphold critical thinking and skepticism in the face of unsupportable claims of pending climate disaster.


[Originally published at Dallas News]

Categories: On the Blog

The GoogleNet Playbook & Zero Pricing – A Special Report

November 19, 2014, 9:41 AM

GoogleNet is Google’s vision to leverage its proliferating dominance by offering global, near-free Internet access, mobile connectivity, and Internet-of-Things connectivity via a global, largely wireless, Android-based “GoogleNet” that is subsidized by Google’s search and search-advertising dominance and by “open Internet” zero pricing of downstream Internet traffic.

A near-free global GoogleNet would follow the Google Playbook, which offers Android, Maps, YouTube, and others’ content for free globally to disrupt and commoditize competitors, in order to maintain and extend Google’s search and search-advertising dominance throughout the economy.

The GoogleNet Playbook matters competitively because it is Google’s new disruptive strategy to disintermediate and commoditize physical-world industries’ direct relationships with their customers (ISPs, energy utilities, automobile manufacturers, big-box stores, banks, package delivery services, realty, and their networks, vehicles, inventory, ATMs, credit cards, appliances, devices, etc.), just as Google has been disintermediating and commoditizing the paid content, app, and software industries’ direct relationships with their customers.

Tellingly, Google explains that Google proper is mostly about digital bits in the virtual world i.e. computer science and Internet technologies, whereas its next generation GoogleX research lab is mostly about atoms in the physical world, i.e. physical objects like driverless cars, satellites, drones, networks, devices, sensors, etc.

In a nutshell, this analysis spotlights: Google’s much-underappreciated, global-connectivity plans — GoogleNet; how GoogleNet neatly fits into the Google Playbook; and how zero-price-defined net neutrality is necessary to subsidize and accelerate Google’s grandiose ambitions to broadly extend its dominance of the virtual world into the physical world.



This analysis first applies the Google Playbook of “open-dominate-close” to GoogleNet’s global connectivity ambitions.

Next it shows how GoogleNet neatly ties together Google’s unique technology vision, company mission, “serial-moon-shot” ambitions, and its core beliefs in digital information commons cyber-ideology, and abundance economics.

Next, for the first time, it charts the much-underappreciated, exceptional comprehensiveness of GoogleNet’s progress: from Google’s dominant Android mobile operating system, to Android devices, satellites, high-altitude balloons, drones, dark fiber, undersea cables, data center construction, server points-of-presence, fiber broadband, wireless backhaul, WiFi mesh-networking, etc.

Then it explains the exceptional value and advantage Google gains from a government “net neutrality” industrial policy that prevents a two-sided free market from evolving in the large-enterprise segment, by permanently banning any charges for high-volume downstream Internet traffic under the guise of “no fast lanes” or no “paid prioritization” for the Internet.


The Google Playbook

FairSearch clearly and cohesively describes The Google Playbook, Google’s plan to build and maintain its dominance via its predatory strategy of: “open-dominate-close.”

First, under the guise of “openness,” Google offers free, or deeply cross-subsidized, products and services to induce fast mass adoption and “disrupt” existing business models. Second, Google proliferates its dominance based on promises of “openness.” Third, once dominant in the new cross-subsidized market, Google then closes its products/services and excludes competitors, so it can discriminate in favor of itself.

To put this in perspective, this analysis explains and documents how GoogleNet is Google’s strategy to eventually dominate global Internet access and connectivity for mobile and the Internet-of-Things, much like I explained and documented Google’s anti-competitive strategy to extend its dominance to YouTube in my Google-YouTube’s Internet Video Distribution Dominance analysis last year.


GoogleNet’s Technological Vision, Mission, Ambitions, Ideology and Economics

GoogleNet neatly ties together Google’s unique technology vision, company mission, “moonshot” ambitions, digital information commons cyber-ideology, and abundance economics.

Google’s Unique Technology Vision is summarized by Google Chairman Eric Schmidt in his recent book: “How Google Works.”

Page 11: “Three powerful technology trends have converged to fundamentally shift the playing field in most industries. First, the Internet has made information free, copious, and ubiquitous – practically everything is online. Second, mobile devices and networks have made global reach and continuous connectivity widely available. And third, cloud computing has put practically infinite computing power and storage and a host of sophisticated tools and applications at everyone’s disposal, on an inexpensive pay-as-you-go basis.”

Mr. Schmidt then lays out the implicit vision for GoogleNet: “Today, access to these technologies is still unavailable to much of the world’s population, but it won’t be long before that situation changes and the next five billion people come on line.”

Simply, GoogleNet is Google’s global vision of a fully-integrated network of digital information, connectivity and computing power that combined is “10x” better than the existing Internet.

Mr. Schmidt continues: “we are entering what lead Google economist Hal Varian calls a new period of ‘combinatorial innovation.’ This occurs when there is a great availability of different component parts that can be combined or recombined to recreate new inventions. … Today the components are all about information, connectivity and computing.”

The genius of this insight is that Google can be more “innovative” than anyone else simply because it dominates, or will dominate, most of the necessary fundamental component parts of “combinatorial innovation” long term: information, connectivity, and computing.

Mr. Schmidt recently told the CBC: “The concept of having every human reachable by every other human is an extraordinarily valuable thing.” He is echoing Metcalfe’s Law of network effects, which, per Wikipedia, posits that “the value of a telecommunications network is proportional to the square of the number of connected users of the system.”
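Metcalfe’s Law is easy to illustrate with a quick back-of-the-envelope sketch. The proportionality constant k and the two-million-user example below are illustrative assumptions, not Google figures; the 538-million and one-billion Android user counts are the ones cited later in this analysis:

```python
def metcalfe_value(users: int, k: float = 1.0) -> float:
    """Network value under Metcalfe's Law: proportional to the
    square of the number of connected users."""
    return k * users ** 2

# Doubling a network's user base quadruples its Metcalfe value:
assert metcalfe_value(2_000_000) / metcalfe_value(1_000_000) == 4.0

# Android's growth from 538m to over 1b active users implies
# roughly a 3.5x gain in network value:
gain = metcalfe_value(1_000_000_000) / metcalfe_value(538_000_000)
print(round(gain, 2))  # 3.45
```

The quadratic growth is the whole point: user-base leads compound into value leads far larger than the raw user-count gap suggests.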

Simply, Mr. Schmidt and Google get that Google’s dominance in search, search advertising, Android, Maps, YouTube, and Chrome grows with more users.

Google’s Mission & Ambitions: If Google’s mission “to organize the world’s information and make it universally accessible and useful” sounded grandiose, consider that Google has already achieved most of it in just fifteen years, as I documented recently in my Google’s WorldWideWatch of the WorldWideWeb analysis, which charted the vastness of Google’s Internet empire and data hegemony for the first time.

What is the effective “mars-shot” to scale Google’s ambitions “10x” beyond the mere “moonshot” of organizing the world’s information? When the FT recently asked Google CEO Larry Page if Google’s mission statement needed updating, he responded: “I think we do, probably. We’re still working that out.”

Just two years ago, Google CEO Larry Page lamented that “We’re still 1 percent to where we should be…what I’m trying to do is… really scale our ambition.”

Given all of Google’s GoogleNet-related acquisitions and activities that will be documented later in this analysis, it appears that Mr. Page’s mission is actually expanding to something like this: “Inter-connecting everyone, every “thing,” and the world’s information over one universally accessible and useful cloud computing GoogleNet.” Or as Google simply calls it internally: “The Google computer.”

Google’s Cyber-ideology: No one can fully understand the boundlessness of Google’s ambitions without understanding why Google CEO Larry Page promised shareholders in his 2004 Founder’s letter that: “Google is not a conventional company. We do not intend to become one.” He promised that because he knows Google is driven by a very different ideology than most of the world would recognize.

Google’s mission and ambitions are not merely technological but also deeply political, a natural outgrowth of its codism cyber-ideology of a digital information commons where “information wants to be free.” For those struggling to understand Google’s geopolitical world view, see a detailed explanation of the Codism movement, of which Google increasingly is the de facto global leader, in “What Is the Code War?”

“Abundance Economics” is Key to Google’s Dominance: Google is the world’s largest adherent of the theory of abundance economics, which holds that because the marginal cost of computing, storage, and bandwidth approaches zero, whatever is on the Internet should be free to use. Abundance economics generally ignores the reality of fixed and total costs and of property rights, because those realities don’t support the notion and aspiration that the economics of abundance has supplanted the traditional economics of scarcity.

The ultimate prize for abundance economics and a digital information commons would be dominating the three biggest disruptive technological trends: universal and near-free data accessibility, connectivity, and computing power.

Simply, Google’s CEO Larry Page singularly gets the implications of digital hyper-centralization – omni-scale wins.

Whoever gets first-mover advantage in combining data-aggregation, connectivity, and computing power wins – it’s winner-takes-all.

Competitors can’t compete if Google’s proliferating dominance allows it to create an unmatchable, fully integrated, super-high-cost essential facility of one global client-server network (and proverbial Tower of Babel) – that is the only network where eventually one can go for the world’s information, universal Internet access and connectivity, and the lowest-cost computing power.

Tellingly, Google, a world leader in multi-language translation services, originally named its global Google Hangout video chat and video conferencing service “Babel.”

Moreover, Mr. Page gets that the Internet’s web-server infrastructure is a basic client-server model that ultimately will turn out to have more in common with IBM’s mainframe dominance of the 1950s-1970s than with Microsoft’s dominance of PC client software in the 1990s. (Note: The “server” in the traditional “client-server” model has morphed over the decades into globally-virtualized data centers of hundreds of thousands of virtualized server blades that function like one unitary server or mainframe computer did in the IBM-dominant era.)

And to enforce Google’s ideological position that information should be free, i.e., no cost, Google has been a most hostile entity when it comes to disrespecting users’ privacy rights and disrespecting others’ intellectual property rights.

And to commoditize cloud computing, take note that Google has precipitated a price war with Amazon’s AWS, cutting cloud prices 38% in 2014 alone, a price war that it already knows it ultimately will win.


The Evidence of the Exceptional Comprehensiveness of the GoogleNet Domination Effort:

A very big public indicator that GoogleNet is a real, urgent, and major strategic priority for Google came in June, when CEO Larry Page made Craig Barratt Senior Vice President for Access and Energy, on par with the SVPs for Android, Ads, and YouTube, per WSJ reporting.

Importantly, the dominant core or “spine” on which GoogleNet is being built is Google’s very-fast-growing Android mobile operating system, which already commands an 85% share of global smart-phone shipments, a 62% share of tablets, a 93% share of mobile searches, and over one billion active users, up from 538m in June 2013.

The Android mobile operating system is rapidly becoming the default operating system for much of the consumer Internet of Things marketplace because it is the only one that is free and “open,” and because the smart-phone has become the default remote controller for home networking via Google’s Nest acquisition, for autos via its dominant Open Automotive Alliance, and for wearables, among other categories of “things” in the consumer Internet of Things.

Google is clearly serious about being the first mover to reach what Mr. Schmidt calls the Internet’s next five billion users coming on line from the developing world, via its supply of a free mobile operating system and its low-cost Chromebooks, tablets, smart-phones, wearables, and sensors.

To provide these next five billion Internet users free or near-free connectivity, Google is piloting three different technology approaches to offering a free global GoogleNet service.

Google bought Skybox Imaging for $500m and plans to spend $1-3 billion on “180 small, high capacity satellites at lower altitudes than traditional satellites” to enable two-way Internet access. Google also bought Titan Aerospace – which makes solar-powered, high-flying drones that Titan calls “atmospheric satellites” – for Internet access to remote areas. And Google CEO Larry Page shared his ambition that Project Loon “could build a world-wide mesh of these balloons that can cover the whole planet” to provide Internet access. Any one of these very different physical technologies could work, or they could be meshed together, depending on which works best in which circumstance.

Since as early as 2005, Google has been buying massive amounts of dark fiber (i.e., fiber in the ground that has not yet been “lit” with optical devices on each end). The tech bubble produced a global overbuilding of fiber networks, and when the bubble burst and companies like WorldCom, Global Crossing, and PSINet went bankrupt, dark fiber became dirt cheap.

This August, Google invested $300m in a trans-Pacific undersea cable with Chinese, Japanese, and Singaporean companies. This October, Google announced it is building a new undersea cable system connecting the U.S. and Brazil, to be completed in 2016. The trans-Pacific and Brazil undersea cables are Google’s third and fourth undersea cable investments.

In the last few years Google has globalized its GoogleNet investments in Internet infrastructure. Google led the world in data center cap-ex, with about $28b from 2006-2014. It now has 1,400 global server points-of-presence in 140 countries (68% of the world’s countries), per USC research that mapped Google’s global serving infrastructure. Google-YouTube also reports that it has localized YouTube on servers in 61 countries and in 61 languages.

Add this entire fiber infrastructure together and it suggests Google has already assembled its own de facto private Internet backbone, handling traffic that could rival the traffic routed by a Tier 1 backbone provider.

Google has also invested more than any other entity to create the only proprietary global Internet “phone/address book” of Internet addresses, the functional, economic, and market-power equivalent of the old Bell System phone book and yellow pages, but this time for the whole world and every device with an Internet address. In 2012, Google claimed to be the world’s leading domain name system (DNS) resolver, handling 70 billion requests daily. Google also offers a Cloud DNS service.
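To make concrete what a DNS resolver handles billions of times a day, here is a minimal sketch that encodes a standard A-record query in the RFC 1035 wire format. The hostname is an arbitrary example, and actually sending the packet (e.g. to Google’s public resolver at 8.8.8.8, port 53) is omitted; the sketch only builds and inspects the bytes.

```python
import struct

def build_dns_query(hostname: str, query_id: int = 0x1234) -> bytes:
    """Encode a standard DNS A-record query (RFC 1035 wire format)."""
    # Header: ID; flags 0x0100 (standard query, recursion desired);
    # QDCOUNT=1; zero answer/authority/additional counts.
    header = struct.pack(">HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    # Question name: each label length-prefixed, terminated by a zero byte.
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

query = build_dns_query("example.com")
assert query[12] == 7                           # first label "example" is 7 bytes
assert query.endswith(b"\x00\x00\x01\x00\x01")  # root byte, QTYPE=A, QCLASS=IN
```

A resolver answers each such 29-byte datagram with the matching address records; at 70 billion queries a day, even this tiny packet format adds up to enormous traffic and, more importantly, an enormous window into what the world is looking up.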

Google is also experimenting with various local ISP access technologies.

The best known is Google Fiber, which has build-outs in Kansas City MO/KS, Provo UT, and Austin TX. Google has also targeted nine more metro areas, encompassing 34 cities, for 1-Gigabit local access build-outs: Nashville TN, Phoenix AZ, Portland OR, Raleigh-Durham NC, Charlotte NC, Salt Lake City UT, San Antonio TX, San Jose CA, and Atlanta GA.

Here it is important to discuss Google’s various technological solutions and efforts to create a free large-scale, WiFi-based cloud network to disrupt and ultimately replace paid-ISP service.

It is important to note that Google’s new SVP for Access and Energy, Craig Barratt, comes from a WiFi wireless-chip background, not a traditional ISP background of any kind. It is also important to note that Google publicly reminds us “We don’t make money from peering or collocation,” because Google makes its money from advertising.

The lesser-known effort, alongside Google Fiber, was Google’s acquisition of Alpental Technologies, whose 60GHz wireless technology can provide wireless connections of up to a mile at potential speeds of seven Gigabits per second. The founders describe Alpental’s technology as “self-organizing, ultra-low power gigabit wireless technology” that can extend the reach of fiber to create WiFi networks.

A potential game-changer here is that the Alpental technology leverages a new Android application, probably a peer-to-peer approach, which automatically transfers a user to its WiFi hotspots whenever they come in range, an operational attribute similar to seamless handoffs on cellular networks.

Google is also working with Ruckus Wireless “trialing a new software-based wireless controller that virtualizes the management functions of the Wi-Fi network in the cloud… The end result would be a nationwide — or even global — network that any business could join and any Google customer could access,” per Gigaom.

This analysis of Google’s global GoogleNet plans would not be complete without mentioning the potential for a Google acquisition of, or some kind of partnership with, SoftBank’s Sprint.

Google does not need to acquire a company to reap most of the benefits of integration. Google Chairman Eric Schmidt uses the term “merge without merging”: “The web allows you to do that, where you can get the web systems of both organizations fairly well-integrated, and you don’t have to do it on an exclusive basis.”

Something could be afoot at Sprint with Google. To start with, Softbank and Google have long had exceptionally close leadership ties and aligned interests – documented in detail here.

It is unlikely that SoftBank CEO Masayoshi Son would have been able to poach Google’s Chief Business Officer, Nikesh Arora, to be SoftBank’s Vice Chairman, and unlikely that Google would have paid Mr. Arora a full-term bonus he was not contractually due upon his departure, if there were not something more going on in this close strategic relationship.

Last April, Amir Efrati of The Information reported that Google was talking to wireless providers about an MVNO wholesale relationship to provide Google with wireless services. In that context, it is noteworthy that Mr. Arora just joined Sprint’s board.  

In addition to the close ties between SoftBank and Google, Sprint needs rescuing or a big long-term wholesale contract. Google could provide either, and could put Sprint’s woefully-underutilized, massively-WiFi-compatible spectrum holdings to work better and more fully than any other entity could.

In short, no other entity is as serious and determined as Google to create a global de facto shadow Internet of global information, connectivity, and computing — soonest.


Conclusion: GoogleNet Dominance Depends in Part on Net Neutrality Zero-Pricing

Google effectively defines net neutrality as a permanent Government-set price of zero for all downstream Internet traffic to the consumer.

Google has been the real behind-the-scenes power pushing for net neutrality zero-pricing because Google dominates downstream Internet traffic to users.

Google’s cloud client-server model (ad-serving, video streaming, software on demand, app downloads, and cloud-computing services) involves sending vastly more downstream traffic to American and international users than those users send upstream to Google.

Consider Google’s world-leading stats to grasp how much downstream Internet traffic Google alone generates, and how much users subsidize Google’s profits when Google does not have to pay for much of the cost of its downstream Internet traffic.

Per Deepfield research: 60% of Internet devices and users exchange traffic with Google’s servers daily; more than 50% of websites’ traffic involves Google analytics, hosting, and ads daily; and roughly 25% of the Internet’s daily traffic is Google’s.

A billion users receive very bandwidth-intensive videos from Google-YouTube, maps from Google Maps, and content via Google’s Chrome browser. Google uniquely serves display ads to two million websites.

No other entity in the world generates this amount of downstream Internet traffic because Google alone controls five of the world’s six billion-user web platforms.

GoogleNet’s ambition to be the global multi-party video-conferencing network via Google Hangouts means that Internet users will help fund Google’s dominance whether or not they use Google’s services at all.

At core, zero-price-defined net neutrality provides Google a substantial anti-competitive advantage: Google can shift its Internet infrastructure cost obligations onto users and infrastructure providers, and then provide free or near-free global connectivity as a way to disrupt, disintermediate, and commoditize physical-world industries’ direct relationships with their customers (ISPs, energy utilities, automobile manufacturers, big-box stores, banks, package delivery services, realty, and their networks, vehicles, inventory, ATMs, credit cards, appliances, devices, etc.), just as Google has been disintermediating and commoditizing the content, app, and software industries’ direct relationships with their customers via free, cross-subsidized products and services.

The only way Google’s Playbook works is if Google can use “openness” to offer free, or near-free, products that drive rapid adoption while keeping Google’s operational costs lowest. Pushing for zero-price-defined net neutrality clearly fits this bill, as it shifts most of the Internet infrastructure costs Google causes off of Google and onto consumers and potential competitors.

To sum up, a global GoogleNet that provides free, or near-free, universally accessible and useful Internet access and mobile connectivity, in order to offer for free more of the content, products, and services consumers currently pay for, would enable Google to disrupt, disintermediate, and commoditize almost all of its potential competitors before they even know what hit them.

In short, a global GoogleNet could become the quintessential essential facility.




Googleopoly Research Series

Googleopoly I: The Google-DoubleClick Anti-competitive Case – 2007

Googleopoly II: Google’s Predatory Playbook to Thwart Competition – 2008

Googleopoly III: Dependency: Crux of the Google-Yahoo Ad Agreement Problem – 2008

Googleopoly IV: Google Extends its Search Monopoly to Monopsony over Info – 2009

Googleopoly V: Why the FTC Should Block Google-AdMob – 2009

Googleopoly VI: Seeing the Big Picture: Google’s Monopolizing Internet Media – 2010

Googleopoly VII: Monopolizing Location Services – Skyhook is Google’s Netscape – 2011

Googleopoly VIII: Google’s Deceptive and Predatory Search Practices – 2011

Googleopoly IX: Google-Motorola’s Patents of Mass Destruction – 2012

Googleopoly X: Google’s Dominance is Spreading at an Accelerating Rate – 2013

Googleopoly XI: A Satire: Grading Google’s Search Antitrust Remedies in EU Test – 2013

Googleopoly XII: Google-YouTube’s Internet Video Distribution Dominance – 2013

Googleopoly XIII: Let’s Play Pretend: a Satire of Google’s Second EU Search Remedy Proposal – 2013

Googleopoly XIV: Google’s WorldWideWatch over the WorldWideWeb – 2014


[Originally published at Precursor Blog]

Categories: On the Blog

Heartland Daily Podcast – Jessica Sena: Challenges Associated with Increasing Oil and Natural Gas Production

November 18, 2014, 8:00 PM

Hydraulic fracturing has unleashed a boom in both oil and natural gas production, which has made the United States the largest producer of oil and natural gas in the world. While hydraulic fracturing has been used since 1947, its wide-scale application caught many industries and policymakers by surprise. Even former Federal Reserve Chairman Alan Greenspan once suggested the United States increase its imports of liquefied natural gas (LNG) to keep natural gas prices low. As a result, certain sectors are experiencing some “growing pains” associated with competing with the energy sector for transportation services such as trucking and hauling freight by train.

One of the sectors that has felt some of this pain is agriculture, as trains hauling frac sand and oil to and from North Dakota have resulted in delays in the transportation of grain and other agricultural products. Research Fellow Isaac Orr and Jessica Sena from the Montana Petroleum Association discuss some of the challenges associated with the “growing pains” of increasing oil and natural gas production in the United States, and some of the potential solutions, including building more pipelines.

[Subscribe to the Heartland Daily Podcast for free at this link.]

Categories: On the Blog