On the Blog

A European Revolution Against Google’s Virtual Colonization?

Somewhat Reasonable - 5 hours 22 min ago

The European Parliament reportedly is scheduled to vote this week on a non-binding political resolution urging the European Commission to “enforce EU competition rules decisively” against search engines, i.e. Google.

What is going on?

In a nutshell, this vote has three big implications. It is a political revolt and a Declaration of Independence from Google’s virtual hegemony. It is a rejection of former EC Vice President Almunia’s gross mishandling of the Google competition case. And it is a vote for a European “single digital market” to promote European economic growth and job creation.

A Political Revolt & Declaration of Independence

First, this European Parliament vote is effectively a political revolt and a European Declaration of Independence from Google’s de facto virtual colonization of the European digital marketplace and from Google’s exceptional dominance, and hegemony, over Europeans’ personal information.

Google holds more than 90% of the European search market, and apparently has abused that search dominance to extend its dominance over personal data, digital advertising, mobile, maps, and other key European digital markets.

Personal data “is the new currency of the Internet” and Google is “a business with a huge, huge, huge market share,” testified Margrethe Vestager, the new EC Vice President for Competition before the European Parliament, who has sweeping enforcement power over how Google competes, operates, and transacts in the European Union. Vice President Vestager added that “particular alertness is needed to ensure that dominant players respect the rules” in the digital sector.

The European Parliament’s vote serves as a political referendum on Google’s treatment of European citizens as virtual vassals of a virtual sovereign Google – without any real rights to know, own or control their private data that Google routinely seizes without European consumers’ meaningful consent.

Europeans appreciate that Google has repeatedly defied enforcement efforts by Europe’s data protection authorities, which hold that Europeans have a right to opt out of Google’s consolidation of their personal data across sixty Google services.

They also are aware of Google’s passive-aggressive opposition to the European High Court’s right-to-be-forgotten ruling, effectively encouraging Europeans to evade the Court’s decision by going to Google.com rather than their own countries’ domains. And they appreciate Google’s strong opposition to storing Europeans’ private data physically in the EU under EU sovereign jurisdiction.

Rejection of VP Almunia’s Mishandling of Google Competition Case

Second, this European Parliament vote is also an opportunity for Europe’s leaders to formally and strongly repudiate the EC’s mishandling of the Google competition case over the last four years under former EC VP for Competition Joaquin Almunia. VP Almunia repeatedly declined to prosecute Google, despite his repeated public warnings that he would do so over Google’s four alleged abuses of dominance.

Mr. Almunia’s proposed Google settlement mimicked the U.S. FTC’s weak Google settlement, which was based on weaker U.S. antitrust law.

It also appeared to be a unilateral surrender of European sovereignty to Google’s virtual sovereignty because: first it did not conclude Google’s >90% share was in fact dominant; second it did not conclude Google did anything wrong despite its findings of four abuses of dominance; third it would impose no fine or sanction that would deter future abuses; fourth it would provide no relief for those harmed by Google’s many abuses; and finally it would have handcuffed future EC enforcement going forward by shutting down any additional Google search investigation for five years despite growing search complaints of Google abuses.

Support for a European Single Digital Market

In order to form a more perfect European Union, one of the top priorities of the new European Commission is the creation of a European “single digital market” in order to generate economic and job growth in the years ahead.

EU nations, including their citizens, businesses, and media, now understand that for all practical purposes they are virtually-dependent on Google to an exceptional extent to find and monetize information online, and for many mobile and map services.

This exceptional online dependency, which makes EU nations effective digital colonies of Google, stands in stark contrast to other sovereign nations that have chosen to develop indigenous search engines: the Czech Republic (Seznam has 45% share), South Korea (Naver has 72% share), China (Baidu has 76% share), and Russia (Yandex has 62% share).

Simply, this vote for a single digital market is about Europe reclaiming sovereign control over Europe’s economy, culture, and data protection, because Google has effectively seized virtual sovereign control over much of Europe via its dominance, abuses of dominance and seizure of private data without the meaningful consent of individual Europeans.

In sum, Europe’s political leaders are wisely creating political unity and legitimacy around addressing the grand problem of reining in Google’s proliferating dominance and abuses of dominance before the European Commission seriously challenges Google with legal enforcement action in the months and years ahead.

European leaders understand this is effectively an epic battle over who will rule Europe’s virtual world and digital marketplace: Europeans or Google?

It also is a fight that neither Europe nor Google can afford to lose.

No one should expect Google to acquiesce. Doing so is antithetical to everything in its being.

With this upcoming vote for European political consensus around a single digital market, Europe is making it much harder for Google to continue its longstanding chauvinism and imperiousness: the presumption that Google knows better than Europeans what is best for Europe and Europeans.

Categories: On the Blog

Dear Northeast, How’s That Solar Working Out For Ya?

Somewhat Reasonable - 5 hours 39 min ago
A couple of months ago, effective in November, National Grid, one of Massachusetts’ two dominant utilities, announced rate increases of a “whopping” 37 percent over last year. Other utilities in the region are expected to follow suit. It’s dramatic headlines like these that make rooftop solar sound so attractive to people wanting to save money. In fact, embedded within the online version of the Boston Globe story “Electric rates in Mass. set to spike this winter” is a link to another article: “How to install solar power and save.” The solar story points out: “By now everyone knows that solar power can save homeowners big money on utility bills.” It claims that solar works even in New England’s dreary winters and cites Henry K. Vandermark, founder and president of Solar Wave Energy in Cambridge, as saying: “Even snow doesn’t matter if your panels have a steep angle. It just slides right off them.”

Solar is not the panacea it is promoted to be, though it is true that—after a substantial investment, heavy government subsidies (funded by all taxpayers), and generous net-metering programs (that raise costs for non-solar customers)—solar systems can save money on the typical homeowner’s monthly bill. (An unsubsidized system averages about $24,000.)

New England has seen one big power plant close within the past year—Salem Harbor Power Station in Salem, Massachusetts went “dark” on June 1, in part due to tightening federal regulations. Another major closure will take place within weeks: the Vermont Yankee nuclear plant.

A new, state-of-the-art natural gas plant on 18 acres of the 65-acre Salem site will replace the Salem Harbor plant. The remaining 47 acres will see redevelopment, including renewable energy. But that plan has received pushback from environmental groups that want the plant fully replaced with renewables.
The Boston Globe states: “A decade ago, replacing the aging plant with a far cleaner natural gas facility would have thrilled environmental and public health advocates.” The Conservation Law Foundation filed a lawsuit against the project’s approval, claiming the state “failed to adequately consider its own climate change law when state energy officials approved the Salem plant.” In February, the group settled the suit after it caused construction delays and reliability concerns.

Just days before the plant closed, a report from The Daily Climate addressed the controversy over usage of the Salem Harbor site: “Many activists pushed back, arguing for wind or solar generation or non-energy uses, such as a marine biotechnology research facility.” One activist group, HealthLink, “has marshaled opposition to running a gas line to the new plant,” and another, Grassroots Against Another Salem Plant (GAASP), “has pledged to use peaceful civil disobedience to block construction of the gas plant.” The state of Massachusetts has offered three closed, or scheduled-to-be-closed, coal-fueled power plant sites $6 million to pursue renewable energy projects—even though wind and solar require full backup from fossil-fuel power plants so electricity is available in the frigid Northeast winters. Additionally, two Stanford Ph.D.s who spent four years trying to prove renewables can ultimately replace fossil fuels have had to admit defeat in a new report: “Renewable energy technologies simply won’t work; we need a fundamentally different approach.”

Having lived with the 63-year-old Salem Harbor plant in her back yard for 20 years, Linda Haley doesn’t, according to WGBH News, “understand why Salem would encourage use of a non-renewable fossil-fuel resource like natural gas when alternative investments in green technology finally seem possible.”

These stories reveal the snow job that has been perpetrated on the general public regarding renewable energy.
They don’t understand the need for power or how it works. They seem to believe that when a rule passes a magic wand waves replacing older, but still fully functional, power plants with wind or solar—that doesn’t produce electricity 24/7/365 as do the decommissioned coal or nuclear plants and which requires far more land to produce the same amount of, albeit intermittent, electricity. An iced up wind turbine or a solar panel covered in seven feet of snow—even if some of it slides off—doesn’t generate electricity. And the cold days of a Northeast winter create one of the times when energy demand peaks. Remember last winter’s polar vortex, when freezing weather crippled the Northeast for days and put a tremendous strain on the electric supply? Congress, following the near crisis, brought in utility executives to explain the situation. Regarding the nation’s electrical output last winter, Nicholas Akins, the CEO of the biggest generator of coal-fueled electricity in the U.S., American Electric Power (AEP), told Congress: “This country did not just dodge a bullet—we dodged a cannon ball.” Similarly, Michael Kormos, Executive VP of Operations for PJM Interconnection (the largest grid operator in the U.S. overseeing 13 states), commented on operations during the polar vortex: PJM was “never—as some accounts have portrayed—700 megawatts away from rolling blackouts. … On the worst day,January 7, our next step if we had lost a very large generator would have been to implement a small voltage reduction”—industry speak for the last option before power outages. About last winter’s grid reliability, Glenn Beck claims: “I had an energy guy come to me about three weeks ago. …He said, ‘We were one power plant away from a blackout in the east all winter long… We were using so much electricity. We were at the top of the grid. There’s no more electricity. We’re at the top.’”   This winter’s extreme weather—with new records set for November power demand—has already arrived. 
Come January, there will be not one but two fewer Northeast power plants than last year—not because they had to be retired, but because of EPA regulations and public sentiment. In a November 17 op-ed, former Senators Evan Bayh (D-IN) and Judd Gregg (R-NH) said: “Vermont Yankee produced 26 percent of New England’s power during the peak of last year’s frigid weather.” The Northeast won’t have Vermont Yankee’s power this January.

Without these two vital power plants, what will the Northeast do?

For several months, since I had a chat with Weather Bell Analytics’ Joe Bastardi at the International Conference on Climate Change, I’ve continued to say that I fear people will have to die from power outages that prevent them from heating their homes in the winter cold before the public wakes up to the damage of these policies. AEP’s Akins seems to agree. He told Columbus Business First: “Truth be known, something’s probably going to have to happen before people realize that there is an issue.”

“New England is in the midst of an energy crisis,” claims WGBH News. The report continues: “residents and businesses are facing a future that may include ‘rolling blackouts’ on days when usage is highest.” ISO New England, the agency that oversees the power grid, warns, in the Boston Globe: “Boston and northeast Massachusetts are ‘expected to face an electricity capacity shortage’ that could lead to rolling blackouts or the use of trailer-mounted diesel generators—which emit far more pollutants than natural gas—to fill the gap.” Ray Hepper, the lawyer for ISO New England, wrote in a court filing: “The ISO simply cannot make megawatts of generation materialize that are not on the system.” In an interview, he added: “We’re really, as a region, at the point of needing new power plants.”

As the Salem Harbor story illustrates, natural gas will likely fuel those new power plants, and environmental groups are expected to challenge construction. Plus, natural gas faces cost volatility.
On November 20, in the wake of November cold not experienced since the 1970s—when global cooling was predicted—the Wall Street Journal (WSJ) featured an article titled “Chill pushes up natural-gas prices,” which stated: “Natural-gas stockpiles shrank by more than expected last week reflecting surging demand.” As in the ’70s, many are now projecting, based on solar activity and other natural variables, a long global cooling trend.

While the Boston Globe said in September, “The upcoming winter is not expected to be as cold as last season,” Bastardi told me otherwise. He said: “This winter could be as cold and nasty as last year and in a worst case go beyond that to some of the great winters of the late 1970s, lasting all the way into April. As it is, we still have a winter comparable to last year forecasted, though the position of the worst, relative to averages, may be further southeast than last year.” During a November 19 appearance with Neil Cavuto, Bastardi suggested that we may see a bit of warming after November, but will have one or two very cold months after that.

The WSJ quoted Brian Bradshaw, portfolio manager at BP Capital in Dallas: “‘Everyone thinks it’s not possible’ to have another winter like last year ‘But the weather does impossible things all the time.’” The WSJ added: “the natural-gas market is setting up for a repeat of last winter.”

So why, when natural gas prices sit at historic lows that experts predicted would lower electricity rates, is the Northeast facing double-digit increases? The answer: there is no magic wand. The changes have been mandated, but the replacements aren’t ready yet.
Ray Gifford, former commissioner with the Colorado Public Utility Commission, told me: “I don’t see how the gas infrastructure in New England can be built fast enough to replace retiring baseload capacity.”

Within the past decade, natural gas went from supplying less than a fifth of New England’s power to one half—which could be great if New England had natural gas, but the region is, as Tim Maverick, commodities correspondent for Wall Street Daily, says, “gas-starved.” After last winter’s freezing weather, Maverick wrote: “The Northeast was slapped in the face with the reality that there’s not sufficient pipeline infrastructure to provide it with the mega-energy pull it draws in the colder season. This is probably because not one new pipeline infrastructure has been introduced in over 40 years. Natural gas consumption in the Northeast has grown more than 20% in the last decade, and not one new pipeline has been built. Current pipelines are stuffed and can carry no more supply.”

At the Edison Electric Institute financial conference on November 11, AEP’s Akins confirmed that the EPA’s proposed timeline for cutting pollution will shutter coal plants before construction is completed on new power plants using other fuels, or on the infrastructure to move the needed natural gas around.

The lack of available supply results in higher prices. The Boston Globe explains: “gas supplies for home heating are purchased under long-term contracts arranged far in advance, so utilities have the advantage of locking in lower rates. Power plants, on the other hand, often buy shorter-term and are more exposed to price movements in the spot markets.” In winter’s cold weather, the gas goes to people’s homes first. Unlike coal, which is shipped by train and can easily be stockpiled in a thirty-day supply at the point of use, natural gas leaves power plants struggling to meet demand and paying higher prices.
Addressing the 2013/2014 winter, Terry Jarrett, a former public service commissioner and a nationally recognized leader in energy, utility, and regulatory issues, said: “Natural gas couldn’t shoulder that burden, due in part to a shortage of infrastructure to deliver gas where it was needed—this despite record-setting production in the Marcellus Shale and elsewhere. But more importantly, whereas coal’s sole purpose is to generate electricity, natural gas is also used for home heating. And when push comes to shove, heating gets priority over generation.”

Last winter, coal and nuclear met the demand to keep the lights on and heat homes and businesses. AEP reports that 89 percent of its coal plants now slated for retirement ran at capacity just to meet the peak demand.

These shortages in the Northeast occur before the implementation of Obama’s Clean Power Plan, which experts believe will shut down hundreds of coal-fueled power plants nationwide by 2016. New pipelines and new plants need to be built, but “not-in-my-backyard” attitudes and environmental activists will probably further delay and prevent construction, as they have done in the Northeast—which will result in higher electric bills nationwide.

“Because less-expensive coal generation is retiring and in part is being replaced by demand-response or other potential high energy cost resources, excess generation will narrow and energy prices could become more volatile due to the increasing reliance on natural gas for electricity generation,” PJM’s Kormos told Congress.

The lessons for America’s energy supply from the Northeast’s far-reaching experiment—which has resulted only in price increases and potential energy shortages—are twofold. First, don’t shut down existing supply until the replacement is ready, as legal action and local attitudes can slow its development.
Second, you can cover every square inch of available land with wind and solar, but when extreme weather hits, a reliable energy supply—best met by coal and nuclear—is required.

Current policy direction will have all of America, not just the Northeast, freezing in the dark. I hope it can be turned back before it is too late.

[A version of this content was originally published at Breitbart.com]

 


The Government Assaults ‘Big Dogs’ – To Advantage the Biggest Dog of Them All

Somewhat Reasonable - 6 hours 40 min ago
One of the largest myths going is that government helps the Little Guy.

On it’s face this is patently absurd.  More government – taxes and/or regulations – raises the costs of everything for everyone.  The Big Guys are far better equipped to absorb the punishment – while the Little Guys are pummeled into un-existence.

Then there’s the Crony Socialism – it’s not Crony Capitalism, because it has very little to do with capitalism.  Wherein Big Guys – who have the wherewithal – bend government policy to their will.  To their advantage – and against that of the Little Guys seeking to compete with them.  For instance:

Green Scam: 80% of Green Energy Loans Went to (President Barack) Obama Donors

Crony Socialists Looking to Ban Online Gambling Don’t Seem to Realize It’s a WORLD WIDE Web

Obama Donor’s Firm Hired to Fix Health Care Web Mess It Created

Obama Crony Wins Contract to Give Phones to Jobless

Obama’s United Auto Workers Bailout

Which brings us to the ridiculous Network Neutrality political rhetoric being extruded by the Obama Administration.

President Obama his own self recently said this:

“(N)et neutrality”…says that an entrepreneur’s fledgling company should have the same chance to succeed as established corporations….

Then there’s Tom Wheeler, the Chairman of the President’s allegedly politics-free, independent Federal Communications Commission (FCC).

FCC Chief on Net Neutrality: ‘The Big Dogs Are Going to Sue, Regardless’

First – why are these lawsuits inevitable?  Because the FCC has already twice unilaterally imposed Net Neutrality – and twice the D.C. Circuit Court has unanimously overturned the orders as outside the bounds of their authority.

Rather than complaining about additional suits to again fend off the Leviathan – perhaps the Leviathan should pull in its tentacles.  Especially when it has already had two lopped off by Courts.  As Jonah Goldberg has said: Don’t just do something – stand there.

But wait a minute – which “Big Dogs” does Wheeler mean?  The Internet Service Providers (ISPs) the government intends to yet again assault.

To be sure, Verizon, Comcast, AT&T, et al. are big companies.

Verizon: ~ $207 billion.

Comcast: ~ $140 billion.

AT&T: ~ $183 billion.

But they aren’t looking for Crony Socialist favors from government – merely protection from its monumental overreaches.

Then there’s this plucky little upstart for whom the Obama Administration is fighting.

Google: ~ $370 billion.

Get that?  Google is bigger than Verizon and Comcast – combined.

Google has spent the last decade-plus shoving Net Neutrality down our throats.

Google…Support(s) Net Neutrality, Call(s) For Extension To Mobile Providers

Google has uber-generously funded pro-Net Neutrality Leftist efforts.  It twice helped President Obama get elected.  Google CEO Eric Schmidt was one of the first Obama Administration “adviser” hires.

The relationship really is that syrupy:

Obama & Google – A Love Story

So this isn’t a galloping shock:

Who Wins the Net Neutrality Debate? Google, of Course

No matter how the FCC rules next year, Google can move forward with fiber rollouts, even if they’re restricted, because it will still be earning far-healthier revenues from carrying content.

Google’s two-pronged strategy has been obvious for a long time, but lately it has looked genius given the net neutrality battle….

(I)t’s a strategy only a very large company could undertake….

Get that?  Google is more than Big Guy enough to absorb the government hit – the Little Guys looking to compete with them aren’t.

“It’s a strategy only a very large company could undertake” – using government to make the marketplace untenable for anyone but themselves.

Creating for Google a for-all-intents-and-purposes government-mandated monopoly.

The very thing the Obama Administration – with its gi-normous Internet overreach – alleges it is attempting to address/prevent.

To paraphrase George Orwell: All monopolies are equal – but some are more equal than others.

To paraphrase Franklin Delano Roosevelt: Google will be a son-of-a-bitch monopoly – but it’ll be our son-of-a-bitch monopoly.

“Don’t be evil.”  Enjoy the Crony Socialism, All.

[Originally published at Human Events]

 


Gates and Pearson Partner to Reap Tens of Millions from Common Core

Somewhat Reasonable - 14 hours 59 min ago

Follow the money. It all ends up in the hands of a very few. The Pearson Foundation is getting the contracts because of its partnership with the Gates Foundation. Greed, secrecy, deceptions, and lies … and to think Democrats accuse Republicans of these very things, while Democrats are the ones using government to get richer. The deceptions run very deep. It’s time for exposure.

The saga begins on one summer day in 2008, when Gene Wilhoit, director of a national group of state school chiefs, and David Coleman (known as the architect of Common Core), knowing they needed tens of millions of dollars and a champion to overcome the politics that had thwarted previous attempts to institute national standards, approached Bill Gates at his headquarters near Seattle to convince Gates and his wife to sign on to their idea. Gates, upon asking if states were serious about common educational standards, was assured that they were. Gates signed on, and the remarkable shift in education policy known as Common Core was born.

The Gates Foundation has spent over $170 million to manipulate the U.S. Department of Education into imposing the CCSS, knowing it would realize a return on this investment as school districts and parents rush to buy the technology products they’ve been convinced are vital to improving education. Bill Gates’ Microsoft will make a fortune from the sale of new technology products. According to the Gates Foundation, CCSS is a “step to greater excellence in education.”

On April 27, 2011, the Gates Foundation joined forces with the Pearson Foundation, the nonprofit arm of Pearson, a British multinational conglomerate and the largest private business maneuvering for U.S. education dollars. Pearson executives saw the potential to secure lucrative contracts in testing, textbooks, and software worth tens of millions of dollars.

The partnership with the Gates Foundation was to support America’s teachers by creating a full series of digital instructional resources. Online courses in math and reading/English language arts would offer a coherent and systemic approach to teaching the new Common Core State Standards. The aim: to create an online curriculum for those standards in mathematics and English language arts that spans nearly every year of a child’s pre-collegiate education. This aim has already been realized and is in practice in Common Core states.

The Pearson and Gates foundations also fund the Education Development Center (EDC) based in Waltham, Massachusetts. It is a global nonprofit organization that designs teacher evaluation policy.  Both stand to benefit from EDC recommendations. The center is involved in curriculum and materials development, research and evaluation, publication and distribution, online learning, professional development, and public policy development.

Through its alignment with the Gates Foundation and Common Core, Pearson dominates education testing and is raking in profits as school districts are pushed to replace paper textbooks with digital technology. For example, the Los Angeles school system, with about 651,000 students, spent over $1 billion in 2013 to purchase iPads bundled with Pearson curriculum. Additionally, the district purchased Pearson’s Common Core Systems of Courses to provide all the primary instructional material for math and English/language arts for K-12, even though the materials were incomplete in 2013.

Pearson’s profits will continue to increase, as it has billions of dollars in long-term contracts with education departments in a number of states and municipalities to introduce both testing software and the teacher-training software and textbooks it claims are necessary to prepare for the tests. For example, Illinois has paid Pearson $138 million to produce standardized tests; Texas, $50 million; and New York, $32 million.

Pearson is really raking in the dough now that Pearson VUE, the assessment services wing of Pearson, has acquired the examination software development company Exam Design. CTB/McGraw-Hill is Pearson’s main competitor in standardized testing.

Corporations, finding they can profit from turning students into unimaginative machines, are now discovering they can likewise profit from standardizing teachers. Starting in May 2014, Pearson Education will take over teacher certification in New York State as a way of fulfilling the state’s promised “reforms” in its application for federal Race to the Top money. The evaluation system, known as the Teacher Performance Assessment, or TPA, was developed at Stanford University with support from Pearson, but it will be administered solely by Pearson and its agents, who will entirely evaluate prospective teachers.

A small cloud did fall over the Pearson Foundation (the nonprofit arm of educational publishing giant Pearson Inc.) in December of 2013, when a $7.7 million fine was levied for using its charitable work to promote and develop course materials and software to benefit its corporate profit-making. After the investigation began, the Pearson Foundation sold the courses to Pearson for $15.1 million.

New York Attorney General Eric T. Schneiderman determined that the foundation had created Common Core products to generate “tens of millions of dollars” for its corporate sister. According to the settlement: “Pearson used its nonprofit foundation to develop Common Core product in order to win an endorsement from the Bill and Melinda Gates Foundation, which helped fund the creation of the Common Core standards, having announced in 2011 that it would work with the Pearson foundation to write reading and math courses aligned with the new standard.”

Since Pearson is the world’s largest education company and book publisher, with annual revenues of more than $9 billion, the $7.7 million fine was not a hardship. Pearson wasn’t always so big. As a British multinational corporation, Pearson was just starting out in the U.S. market in the early 2000s. Pearson started to grow when it embraced No Child Left Behind as its business plan and began rapidly buying up U.S. companies.

On June 10 of this year, The Bill & Melinda Gates Foundation announced its support for a two-year moratorium on tying results from assessments aligned to the Common Core State Standards to teacher evaluations or student promotions to the next grade level.

Although the Gates Foundation’s director of college-ready programs stated that Common Core was having a very positive impact on education, he acknowledged that teachers do need more time to adjust.

The moratorium came after Diane Ravitch, research professor of education at New York University and author of “Reign of Error,” sounded the alarm on June 9 over the implementation of Common Core and called for a congressional investigation, noting: “The idea that the richest man in [the U.S.] can purchase and — working closely with the U.S. Department of Education — impose new and untested academic standards on the nation’s public schools is a national scandal.”

It would be folly to suggest that either Bill Gates or Pearson, despite Gates’ temporary tactical retreat, will stop pushing for Common Core and its required educational technology. This nation spends over $500 billion annually on K-12 education. When colleges and career-training programs are included, the education sector represents almost 9 percent of U.S. gross domestic product. Companies like Pearson and Microsoft stand to profit greatly as they develop and administer the tests and sell the teacher-training material.

It is not unreasonable to suspect that companies like Pearson stand to gain when tests designed to measure the Common Core State Standards make most public schools look bad. If the widespread failure they are counting on materializes, school districts and parents will be pushed to purchase even more training technology, teachers in low-ranked schools will be fired, and schools will be turned over to private management.

As a textbook publisher, Pearson Education buckled to activists’ demands in Texas and replaced the scientific understanding of climate change with the politically driven claim that humans are causing climate change. Because Texas is a large state, it has considerable influence on the national textbook market.

Might Common Core State Standards be the latest in the grand corporate scheme to profit from privatized public education?  In the interim, Bill Gates’ Microsoft and Pearson reap big CCSS profits.  Certainly neither teachers nor students are benefiting.

Categories: On the Blog

Heartland Daily Podcast: Sean Parnell – Obamacare After the Midterms and Gruber Comments

Somewhat Reasonable - November 24, 2014, 12:23 PM

Research Fellow and Managing Editor of Healthcare News Sean Parnell sits down with host Donald Kendal to discuss the latest healthcare news. Parnell talks about the election’s impact on Obamacare, the proposed 2017 Project, and the comments by Jonathan Gruber.

With Republicans taking control of Congress, what is the likelihood of seeing a repeal of Obamacare? What is the 2017 Project, and does it have a chance of replacing Obamacare? These questions and more are answered in today’s podcast.

[Subscribe to the Heartland Daily Podcast for free at this link.]

Categories: On the Blog

If We Had Some Global Warming … (Song Parody)

Somewhat Reasonable - November 22, 2014, 8:03 PM

Via Minnesotans for Global Warming

This video below is from back in 2007 by the Freezing Emperors from Minnesotans for Global Warming — who wrote and produced the most-famous global warming parody song of all time, “Hide the Decline,” which Michael Mann tried to get taken down from YouTube with limited success.

Someone shared the song below with Heartland tonight on Twitter, and I don’t know what took them so long! It’s a song called “If We Had Some Global Warming,” a parody of “If I Had a Million Dollars” by Barenaked Ladies.

I think it is even colder now than in 2007 — at least judging by last winter in the Midwest, and the early arrival of another dreaded “polar vortex” this month. So, yeah. We could use some global warming right about now. The experts keep promising it, but when will it arrive?

Enjoy. This song is cheeky fun.

Oh, what the heck. Watch the great “Hide the Decline,” too. The warmists will not be mocked! … or at least that’s what they want. Happily, we still have free speech in this country, and the Internet is forever. Sorry, Mike.

Categories: On the Blog

Heartland Daily Podcast: Drew Johnson – Proposed Global Tobacco Tax

Somewhat Reasonable - November 20, 2014, 4:41 PM

Washington Times columnist and editor Drew Johnson joins The Heartland Institute’s Budget and Tax News managing editor Jesse Hathaway to talk about the World Health Organization’s (WHO) “Article 6,” a proposed global tax aimed at making tobacco products prohibitively expensive.

Johnson, ejected from covering the public meetings for reporting on the WHO proceedings in Moscow, talks about the United Nations health organization’s misguided priorities, and the undemocratic nature of the proposed tariff rules.

[Subscribe to the Heartland Daily Podcast for free at this link.]

Categories: On the Blog

The U.N., The Ultimate Narc

Somewhat Reasonable - November 20, 2014, 3:57 PM

Last week, the U.N. anti-narcotics chief, Yury Fedotov, made headlines when Reuters reported he had said moves by American states to end the prohibition on marijuana were illegitimate under existing international drug conventions. He added that he might take action against these states as well.

The drug convention Fedotov cited is the 1961 Single Convention on Narcotic Drugs. That 50-plus-year-old agreement limits the production and consumption of cannabis to medical purposes only. So the dozens of states that have passed medical marijuana laws are still in compliance. However, Colorado, Oregon, Washington, Alaska, and D.C. passed ballot initiatives that legalized marijuana for recreational use. Those are the cases that caught Fedotov’s attention. “I don’t see how (the new laws) can be compatible with existing conventions,” said Fedotov.

But what can the U.N. do to fight this? Luckily, not much. In response to a question about what the U.N. could do, Fedotov stated he would discuss the issue in the near future in Washington.

While Fedotov and the United Nations Office on Drugs and Crime (UNODC) may have no real ability to combat these moves, the hubris alone is overwhelming. To think an international governmental organization like the U.N. can change the policies enacted by individual states in America is frightening. Marijuana prohibition should not be the responsibility of an international governing body. In fact, it should not even be a concern of the federal government.

The actions taken by these states do fly in the face of U.N. drug conventions; they are also inconsistent with federal law. Fortunately for liberty advocates, the federal government has condoned these moves in order to avoid conflict and potential political fallout. Individual states have been allowed the freedom to craft their own recreational drug policies. This, however, does not rule out a reversal on this position in the future.

Hopefully nothing will materialize from all of this. The U.N. should take a lesson from the federal government on this matter and keep its nose out of the business of these states.

Categories: On the Blog

“Where to Watch” Piracy Decrease

Somewhat Reasonable - November 20, 2014, 3:02 PM

The Internet ecosystem just added a new tool to preserve the property of rights holders even while encouraging greater use of broadband. The Motion Picture Association has announced the launch of a new search engine called WheretoWatch.com.

As Variety has reported, “MPAA — upping efforts to help consumers find legal sources of content instead of pirating it — has rolled out WheretoWatch.com, an advertising-free entertainment search engine designed to point people to TV shows and movies from authorized sources. WheretoWatch.com includes info and links from providers including Netflix, Apple’s iTunes, Amazon.com and Hulu as well as smaller sites like SnagFilms and WolfeOnDemand. MPAA said it expects to expand its list of partners in the coming months.”

Great, but what does this have to do with public policy? Rather than relying on another years-long legislative battle, which may fail to reach any sort of resolution, the industry got to work creating a solution to help protect its property. This sort of industry self-help should be lauded and encouraged across the digital ecosystem.

More success will come as all parties understand that they must do their part and that an economically thriving digital ecosystem requires good faith cooperation, within the bounds of the law, with an eye towards what is best for the broader ecosystem. Less infringement combined with great legal choices available in many places for consumers is in the best interest of all.

 

[Originally published at Madery Bridge]

Categories: On the Blog

How Republicans Can Push Back Against Immigration Executive Orders

Somewhat Reasonable - November 20, 2014, 2:21 PM

In a segment on a recent episode of Your World with Neil Cavuto, Heartland Institute research fellow David Applegate outlined the options Republicans can use to push back against Obama’s executive orders on immigration. Applegate says some options won’t yield much but others have the potential to produce results.

In a very revealing montage to begin the clip above, Obama is shown repeatedly saying over four years that he has no legal ability to legislate by executive order in this manner. This seems to have been forgotten in light of recent announcements by the Obama administration. Republicans, however, have a few options that may block these executive orders.

The first two options mentioned by Cavuto may not have much success. As Applegate says, taking the matter to court would likely not work: “Suing in the courts is something that the courts really do not want to handle.” Whatever case there may be would likely be ignored by the courts. Applegate says the other option, impeachment, is a legitimate constitutional tool Congress has. However, it is also unlikely to go anywhere. Regarding impeachment, Applegate says, “politically that would go nowhere.” There is some hope for Republicans, however.

Applegate says Mitch McConnell and John Boehner need “to realize they still have two very strong cards to play.” One is to use the power of the purse. According to Applegate, Republicans could use the threat of defunding specific government programs as leverage against the Obama administration. Another option would be for Republicans to compromise with the president and agree upon a more bipartisan move on illegal immigration.

It will be interesting to see what options are pursued by Republicans in the coming weeks. Stay tuned for more insight and information on this developing situation.

Categories: On the Blog

A Lot of “Folks” And “Just Some Guys”

Somewhat Reasonable - November 20, 2014, 1:07 PM

Apart from his halting, staccato, eight-to-ten-word phrase delivery when not reading off a TelePrompTer, President Barack Obama has two noticeable and telling verbal tics. The first is “folks”; the second is “just some guy.”  The first is just an annoying and apparently insincere way of trying to show that, despite being President, he’s really, you know, just one of us.  But the second is a tell-tale sign that he’s throwing somebody under the bus.

Perhaps “folks” is the way that Harvard-educated lecturers in law at the University of Chicago are taught to talk about their fellow Americans, but I rather doubt it. Having attended an Ivy League school myself and having studied law at the University of Chicago for three years, I’m pretty sure I never heard the word “folks” once. Even Tennessee Ernie Ford used “people,” as do the U.S. Constitution’s opening three words, “We the People …”.

Obama uses the word “folks” whenever he wants to sound sage and, well, folksy; usually when about to make a patronizing observation about the American people that justifies, in his mind, his administration’s increasingly one-party top-down style of governing.

“That’s just how white folks will do you,” he wrote in Dreams from My Father: A Story of Race and Inheritance, referring to what he perceived as white arrogance and cruelty. “These are folks who are strong allies and supporters of me,” he said in an October 20, 2014, interview with Al Sharpton, referring to Democratic candidates who were running away from him in the recent midterm elections. “We need to internalize this idea of excellence,” he said on another occasion. “Not many folks spend a lot of time trying to be excellent.” And, in a particularly portentous and lecturing moment, “Folks haven’t been reading their Bibles.”

Most infamously, Obama awkwardly claimed during an impromptu Friday, August 1, 2014, White House news conference regarding the War on Terror that “We tortured some folks.”  That struck many “folks” as inappropriate, leading one commentator on Twitter to ask incredulously, “Wow.  How does the supposedly rhetorically great Obama use ‘torture’ and ‘folks’ in the same sentence?”

But it’s Obama’s use of “just some guy” that signifies when someone has outlived his usefulness to the president, at least for public consumption.

Obama’s political mentor in Chicago’s Hyde Park neighborhood for many years was American terrorist Bill “Guilty as hell, free as a bird” Ayers, a founder of the radical Weathermen group. Ayers is widely suspected of having ghost-written at least large portions of Obama’s two books for him, and Obama and Ayers worked closely together on the Chicago Annenberg Challenge, a five-year failed philanthropic venture for which Ayers wrote the grants and Obama chaired the board that distributed the money. But when Obama ran for President and Sarah Palin called him out for “palling around with terrorists,” Ayers became “just a guy who lives in my neighborhood,” and he hasn’t been publicly seen in the President’s company since.

Obama’s most recent use of the phrase is in reference to Jonathan Gruber, the now-infamous MIT professor who was one of Obamacare’s architects. While working to help get Obamacare passed, Gruber was highly regarded and highly rewarded. The administration cited Gruber frequently in hearings and White House blogs, dedicated a webpage to his analysis, met with him repeatedly at the White House, and paid him $380,000 of taxpayer money in 2009 alone.

Now that videos have surfaced in which Gruber calls his fellow Americans not “folks” but “stupid” and brags that Obamacare was founded and sold on deliberate lies, Gruber has become just “some adviser who never worked on our staff,” which even Politifact rates as “mostly false.”

Even worse for the administration, perhaps, Gruber is also on record having said that the intent of Obamacare’s design was that, to encourage states to set up health care exchanges, if a state did not do so then its residents would not be eligible for income tax subsidies.  Now that 36 states have declined to set up exchanges and Obama has directed his IRS to provide subsidies anyway, Gruber’s comments have become just a misquoted typo taken out of context and Gruber himself, in the President’s own words, has become just another guy.

It turns out that the Obama administration may be the most transparent in history, just not in the way that it meant.  As Yogi Berra once said, you can observe a lot just by watching.

Categories: On the Blog

Top Ten Questions to Ask About Title II Utility Regulation of Internet

Somewhat Reasonable - November 20, 2014, 10:48 AM

If Congress or the media seek incisive oversight/accountability questions to ask the FCC about the real world implications and unintended consequences of its Title II net neutrality plans, here are ten that fit the bill.

1. Authority? If the FCC truly needs more legal authority to do what it believes necessary in the 21st century, why doesn’t the FCC start the FCC modernization process and ask Congress for the legitimacy of real modern legislative authorities? Or is it the official position of the FCC that its core 1934 and 1996 statutory authorities are sufficiently timeless, modern and flexible to sustain the legitimacy of FCC regulation for the remainder of the 21st century?

2. Growth & Job Creation? While it may be good for the FCC’s own power in the short-term to impose its most antiquated authority and restrictive Title II regulations on the most modern part of the economy, how would that heavy-handed regulation be good or positive for net private investment, economic growth and job creation?

3. Zero-price? Does the FCC seek new legal theories and authority for the purposes of setting a de facto permanent zero-price for some form of downstream Internet traffic, or not?

4. Consumers? How is it neutral, equal or fair under FCC net neutrality regulations for consumers to pay for faster Internet speed tiers/lanes and their Internet usage, but it is somehow a violation of net neutrality for Silicon Valley giants to pay anything other than a price of zero for delivery of their hugely-outsized downstream Internet traffic? (And why would FCC Title II reclassification also not have the unintended consequence of triggering large new fees and taxes on unsuspecting consumers?)

5. UN-ITU? Would the FCC reclassifying Internet traffic as “telecommunications” give the U.N.’s International Telecommunication Union the legal authority and cover to assert governance over the Internet, as it has long had over international telecommunications and international telecommunications trade settlements? (And in imposing the most restrictive American regulatory regime available to prevent potential problems, wouldn’t the FCC be leading, and giving political cover to, autocratic nations that seek to impose similar maximal regulation of their Internet for the autocratic purposes of censoring, spying on, and controlling their people?)

6. Cost-Benefit? In any potential Title II action will the FCC abide by the President’s 2011 Executive Order 13563 that requires the FCC to use “the least burdensome tools for achieving regulatory ends,” and to “adopt a regulation only upon a reasoned determination that its benefits justify its costs?”

7. Forbearance? Under a “hybrid” (Title II/Section 706) approach, how does the FCC square the circle of justifying forbearance from most Title II regulations by showing there is enough competition to protect consumers, while simultaneously justifying reclassification of the Internet as a Title II utility because of insufficient competition to protect consumers?

8. Deployment Barriers? Since Section 706 is about removing barriers to broadband deployment, how would Title II Section 214, which requires carriers to get prior FCC approval to upgrade any “telecommunications” facilities (a process that can routinely take months at a minimum), not be considered a barrier to broadband deployment under Section 706?

9. Internet Backbone? What is different competitively in the Internet backbone market now from the last 20 years of no FCC regulation, that warrants maximal FCC regulation under Title II for the first time since the Internet was privatized in the early 1990s?

10. Supreme Court? Wouldn’t a June 2014 Supreme Court precedent (Util. Air Reg. Grp. v. EPA), which establishes that agency rules must be “ground[ed] … in the statute” rather than on “reasoning divorced from the statutory text,” disallow the FCC from reclassifying services solely for the purpose of evading other statutory provisions that Congress passed to restrict FCC authority?

[First published at the Precursor blog.]

Categories: On the Blog

You’ve Been Gruber’d, Stupid!

Somewhat Reasonable - November 20, 2014, 8:23 AM

“No.  I — I did not.  Uhhh, I just heard about this… I — I get well briefed before I come out here.  Uh, th-th-the fact that some advisor who never worked on our staff, uhh, expressed an opinion that, uhh, I completely disagree with wuh, uhh, in terms of the voters, is no reflection on the actual process that was run.” — President Obama replying to a question about Jonathan Gruber at the conclusion of the G-20 Conference in Brisbane, Australia. 

Will the last name of the MIT professor identified as the “architect of ObamaCare” become a verb some day? Will people say “I’ve been Gruber’d”? Or “The government is Grubering again”?

 After all, when he admitted that ObamaCare’s passage was achieved by deceiving the Congressional Budget Office and the entire American public, turning his name into a synonym for lying is not unthinkable. Adding insult to injury, he said the voters were “stupid.”

 How stupid was it for the Democrat-controlled Congress to pass a two-thousand page piece of legislation that none of them had read? (No Republican in Congress voted for it.) ObamaCare took over one-sixth of the U.S. economy and did something that makes me wonder why we even have a Supreme Court. It required people to buy a product whether they wanted to or not. If they didn’t, they would be subject to a penalty.

One way or the other, the federal government was going to squeeze you. The Court did conclude early on that ObamaCare was a tax, but don’t expect the mainstream media to tell you about all the other taxes hidden within it.

What surprises me about the Gruber revelations—available on YouTube to any journalist who wanted to investigate, but none did—is that there appears to be so little public outrage. An arrogant MIT professor who received $400,000 from the government and made millions as a consultant to states that needed to understand ObamaCare called voters stupid, and the initial reaction of the mainstream media was to ignore the story.

At the heart of the Gruber affair is the fact that Obama and his administration have been lying to the voters from the moment he began to campaign for the presidency. In virtually every respect, everything he has said for public consumption has been and is a lie.

 In one scandal after another, Obama would have us believe he knew nothing about it. That is the response one might expect from a criminal rather than a President.

 One has to ask why it would be difficult to repeal in full a piece of legislation that the President said would not cause Americans to lose their healthcare insurance if they preferred their current plan, that would not cause them to lose the care of a doctor they knew and trusted, and would save them money for premiums. The initial deception was to name the bill the Affordable Care Act.

 Repeal would help ensure the solvency of Medicare and restore the private sector market for healthcare insurance.

This is a President who was elected twice, so maybe Prof. Gruber is right when he speaks of stupid voters. Not all, of course, but more than voted for Obama’s two opponents. As this is written, over 45 percent of those polled continue to express approval of Obama’s performance in office. How stupid is that?

 What is so offensive about Gruber’s own revelations about the manner in which the bill was written and the lies that were told to get it passed is the incalculable misery it has caused millions of Americans.

 It has caused the loss of jobs. It has forced others into part-time employment. It has caused companies to reconsider expanding to grow the economy. It has driven up the cost of healthcare insurance. It has impacted local hospitals and clinics to the point where some have closed their doors. It has caused many healthcare professionals to retire or cease practicing medicine.

 I invite you to make a list of all the things you think the government should require you to purchase whether you want it or need it. Should you be required to own a bike and use it as an alternative to a car? (Yes, you must own auto insurance to defray the cost of accidents, just as you must pay a tax on gasoline to maintain our highway system.)  Should you be required to wear a certain style or item of clothing? Should you be required to get married by a certain age? Should you be required to eat certain foods and avoid others?

A new study by the Legatum Institute in London ranked citizens’ perceptions of their personal freedom in a number of nations. Americans ranked far down the list, at 21 out of 25, well below Canada, France, and Costa Rica, to name just three. The study was based on a 2013 poll.

What is at stake here is (1) the absolute need for a trustworthy federal government and (2) the need to repeal a piece of legislation based entirely on lies. On a larger scale, the right to make your own decisions on matters not relevant to the governance of the nation should be regarded as sacred; it’s called liberty.

 The Republican-controlled Congress and the Supreme Court are the two elements of our government that can and must provide a measure of protection against the deception that is practiced every day by President Obama and members of his administration. Let’s hope neither is “stupid” in the two years that remain.

 

Categories: On the Blog

Heartland Daily Podcast – David Schweikert: Secret Science Reform Act of 2014

Somewhat Reasonable - November 19, 2014, 3:33 PM

Congressman David Schweikert, a Republican representing Arizona’s 6th District, is the chairman of the Subcommittee on the Environment of the House Science, Space and Technology Committee. In that capacity, Rep. Schweikert introduced the Secret Science Reform Act of 2014 (H.R. 4012), and with the support of Texas’s own Lamar Smith, chairman of the full committee, it was passed out of committee.

The bill requires the EPA to disclose all the science, research, models and data used to justify regulations, and the results would have to be reproducible by independent researchers. Schweikert argues research used to make rules imposed on the public, especially when it is funded directly or indirectly by taxpayers, should be transparent.

[Subscribe to the Heartland Daily Podcast for free at this link.]

Categories: On the Blog

Heartland’s Jim Lakely Crushes ACLU in Net Neutrality Debate

Somewhat Reasonable - November 19, 2014, 2:52 PM

The Internet isn’t broken, and doesn’t need the government to fix it. That was my overriding message in a debate on Chicago’s PBS station WTTW Tuesday night with Illinois ACLU Executive Director Colleen K. Connell.

In an excellent discussion led by “Chicago Tonight” host Phil Ponce, Connell and I talked about President Obama’s attempt to strong-arm the FCC into regulating the Internet like a public utility under “Title II” of the Telecommunications Act of 1996. That’s the only sure-fire way to ruin the vibrant digital economy and the online experience we all now take for granted on our computers and mobile devices.

If Obama had been president in 1996, Netflix and Hulu as we know them today wouldn’t exist. Why? Because anyone who thought it was a good idea to stream content directly to consumers — circumventing the bundled-channels model of cable TV, and even creating original content — would have had to invest enormous human and monetary capital in convincing the FCC that it was something consumers wanted. All of that creative energy would have been wasted on government rent-seeking instead of creating what we have today.

Thankfully, a Republican Congress and Democratic President Bill Clinton in 1996 put then-nascent broadband under “Title I,” which did not give the FCC the power to micromanage the Internet “on our behalf.” As a result, dozens of ISPs (Comcast, AT&T, and Verizon are the Big Three, but not the only ones), content providers (Google’s YouTube, Netflix, Hulu, ESPN), millions of app creators, and billions of consumers enjoy the ever-evolving wonders of the digital age. It might be the best example Adam Smith and Milton Friedman could have conceived of free-market capitalism providing the best services at the lowest prices and at the fastest possible speeds — both in terms of innovation and delivery.

Yet Connell, my debate opponent from the Illinois ACLU, was arguing that now — right now — the government must get its molasses-filled hands deep into the structures of all these wonders and start mucking around. Why? Because of potential “anti-consumer” actions that might be taken by the big players in the digital economy. It’s a hard argument to make, which is why I got the better of these exchanges.

Enjoy.

Categories: On the Blog

Exchange on Texas textbook controversy

Somewhat Reasonable - November 19, 2014, 12:35 PM

Recently there has been a great deal of controversy over the adoption of new social studies and history textbooks in Texas. Global warming alarmists have successfully pressured textbook publishers into removing any trace of the undeniable fact that the causes and consequences of global warming are still open for debate, or that any scientists still question the theory that humans are causing catastrophic climate change. This is unfortunate because, in removing all reference to skepticism concerning warming, the publishers have forgone the truth for political expediency. Should these textbooks be adopted as is, it will be a disservice to Texas students. It is my sincere hope that Texas’s State Board of Education rejects all the proposed books, reopens the selection process, and makes it clear that any textbooks to be adopted, if they address global warming (and they need not do so), must acknowledge the ongoing debate concerning the causes and consequences of climate change and air the views of both sides.

I have recently written some pieces about the Texas schoolbook controversy in Human Events and The Dallas Morning News.

A reader wrote me in response to the Dallas Morning News article. I have reprinted our exchange below (leaving out the name of the writer to protect his anonymity). I post it to answer questions others might have but haven’t asked.

First e-mail:

Mr. Burnett, I agree with you that we all should have an open mind on this subject and we should lead our students into critical thinking, (about any subject for that matter).
But you contradict that statement in your own editorial. You flatly state: “The models are wrong.” You also cherry pick and mislead by stating: “…the facts show and failed to predict the current lack of increase in Earth’s average temperature…”

Those statements do not show someone who is keeping an open mind and using critical thinking.

No model that I have seen (and I have seen probably just about every one) do not support your statements unless one cherry picks the information provided and doesn’t take a look at all the information provided.

The Koch brothers funded a study with a team of skeptical scientists. They provide all their data with total transparency. Check the results for yourself in the second link. http://www.examiner.com/article/koch-funded-study-finds-global-warming-is-real-and-humans-are-the-cause

http://berkeleyearth.org/

(Name withheld)

My response:

Mr. (Name withheld):

Thank you for your attention to my article and for your thoughtful response. However, I believe we will have to agree to disagree. Though stating “the models are wrong” is a strong statement concerning current temperatures and temperature trends, I believe it is wholly accurate; see Dr. Roy Spencer’s post.

Concerning the really important matters, the projected harmful results from AGW, it’s certainly true that one can pick and choose models to match whatever current weather/climate trends or events happen and then claim that this hurricane or that drought, etc., is consistent with the models, but that doesn’t seem to me to be a sound scientific process. One critical feature of the scientific method is falsifiability, the ability to disprove a theory. A theory that can’t be disproven is a religion. I have no problem with religion, but religion should not be confused with science. Michael Mann once told me that for AGW to be wrong, everything he knew about physics (including fundamental laws of gravity, thermodynamics, and the laws concerning the conservation of energy and entropy) would have to be wrong. That is sheer hubris in the face of the numerous climate factors of which scientists are ignorant. I have written about the problems with the selective choice of models a number of times in the past, for instance in National Review and Human Events.

Nothing, I’ve seen since I wrote these pieces has led me to change my considered opinion.

There are simply too many factors that climate modelers admittedly don’t understand or know how to plug into their models, to have any confidence in models projections of future harms when they cannot even accurately model temperature trends.

If there is a model that has accurately projected the current lull in temperatures, let me know which one it is; then let’s check the projections that flow from it concerning various climate phenomena and trends and see how they comport with what has actually been measured or recorded. A model that projected temperatures and most recorded phenomena and measured trends accurately, I would count as one worth serious consideration.

Concerning the Koch brothers, I have never met them, and to my knowledge my work has never been funded by them. More importantly, even if I, or the scientists you discuss, were wholly paid employees of the Koch brothers, the question would remain: is what we have written wrong? To indict or praise someone’s work because of whom they associate with, who pays them, or because it is supported by a majority (or consensus) is a logical fallacy.

Arguments should be analyzed strictly on their own merits, regardless of whether one likes or trusts the persons or groups making them.

Just my two cents.

Take care,

H. Sterling Burnett, Ph.D.

I had thought that that would be the end of the exchange but it was not. Instead I received this reply:

So you're not going to keep an open mind, and you haven't studied the lag effect yet? Looking at only the last 16 years of data reminds me of my liberal friends who post the federal budget deficit from 2009 to the present and show how much it has come down under Obama. But of course they fail to acknowledge and show the previous years' budget deficits prior to 2009 and the enormous jump right after Obama took office.

(Name withheld)

To which I responded:

(Name withheld):

You misunderstand me or misstate my position. I don't have any problem with the idea of a lag effect; just show me where the models we are supposed to trust build it in. They treat emissions as if they worked geometrically, not logarithmically. I never said that I looked at only the last 16 years of data; I just pointed out that the models missed the actual measurements for the past 18 years. The models and their promoters don't accurately portray the state of our knowledge about climate sensitivity.

To the extent that the models have accurately hindcast past temperatures and trends, they have only done so after much tinkering and building in adjustment factors. The fact that the models' temperature projections have error bars as great as or greater than the actual expected temperature rise tells me they aren't very useful for decision making. If I hired a caterer for a party and informed him that I expected 30 people to attend, but there could be as few as zero (not counting me) or as many as 75, I wouldn't really be providing him with much useful information for planning purposes.

I've got an open mind; I admit I could be wrong. And all I ask before trusting a model's projections is a model that gets temperatures and expected climatic events mostly right. You seem to think that is too much to ask. Unlike AGW believers, I take data and evidence to be paramount, not unconfirmed (perhaps unconfirmable) theory. In addition, I don't require the scientific method or basic rules of physics, biology, chemistry, or any other core principle or set of principles to be overturned to prove me wrong.

Having said this, even if I am wrong and potentially dangerous human caused warming is in the offing, I would still argue that the danger, the actual harms, resulting today and in the future, would be greater from denying widespread access to and use of fossil fuels than the harms from warming. Leaving billions in grinding poverty and want today, and leaving future generations poorer than they otherwise would be, by stopping fossil fuel use, is immoral, since it sacrifices present generations based on possible harms to future generations that will be much wealthier and able to adapt to or mitigate climate change much better due to their greater wealth. It’s the difference between the impacts of cyclones and tidal waves on Indonesia or the Indian subcontinent compared to the impacts in the U.S. or Japan. The difference between harms from widespread drought in the Southwestern U.S. compared to a similar drought in Sudan or Ethiopia.

Take care,

H. Sterling Burnett, Ph.D.

Any thoughts?

Categories: On the Blog

Thinking the Unthinkable – Part IV

Somewhat Reasonable - November 19, 2014, 11:40 AM

I was gratified by the excellent attendance at the Free State Foundation’s program last Friday titled, “Thinking the Unthinkable: Imposing the ‘Utility Model’ on Internet Providers.” If you weren’t there, you missed what was a very important event – one that, in light of the substantive discussions that occurred, likely will play an important role going forward in the debate over the Federal Communications Commission’s consideration of the imposition of new net neutrality mandates.

With the release of President Obama’s video on Monday, what he calls “President Obama’s Plan for a Free and Open Internet,” directly urging the FCC to classify Internet service providers as common carriers – that is, to impose the Title II public utility model of regulation – the FCC’s proceeding has now become even more highly politicized than before. More and more, in the absence of any present market failure or consumer harm, the proceeding is looking like a textbook case study in administrative agency overreach. Or put more bluntly, an administrative agency power grab.

And with President Obama interjecting himself so directly into the net neutrality rulemaking, the proceeding is also providing a textbook example of the problematic nature of so-called independent agencies like the FCC, which in any event occupy an odd place in our tripartite constitutional system. After all, in the interests of accountability, the Constitution vests all powers in the legislative, executive, and judicial branches, not in agencies, commonly referred to as the “headless fourth branch” of government, that blend together quasi-executive, quasi-legislative, and quasi-judicial powers.

Witnessing what is transpiring at the FCC calls to mind for me a famous statement by Roscoe Pound, the distinguished American legal scholar and long-time Dean of the Harvard Law School, concerning the rise of administrative agencies like the FCC. In 1920, he said: “[T]he whole genius of administrative action through commissions endangers the supremacy of law. Not the least task of the common-law lawyers of the future will be to impose a legal yoke on these commissions, as Coke and his fellows did upon the organs of executive justice in Tudor and Stuart England.”

It is possible that, when all is said and done, the FCC’s actions in the net neutrality proceeding might lead to a fundamental rethinking by Congress of the proper institutional role in the Digital Age of an agency created in 1934 (or, in part, in 1927 if you wish to go back to the Federal Radio Commission). To its credit, the House Commerce Committee early this year began such a #CommActUpdate process in earnest, and when the new Congress convenes in January, the Senate should follow suit.

In the meantime, it is at least somewhat encouraging that FCC Chairman Tom Wheeler appears to recognize the need to take a pause in the net neutrality rulemaking. This would be a good idea, especially if Mr. Wheeler and his Commission colleagues use such a pause as a time for some serious rethinking concerning the proper exercise of the agency’s putative authority with respect to Internet regulation.

On the assumption that the willingness to engage in reflection and rethinking should always be in order, those at the FCC and elsewhere would do well to review the remarks delivered by Commissioners Ajit Pai and Michael O’Rielly at the Free State Foundation’s event on November 14th. Regardless of any predispositions regarding what actions – or not – you believe the FCC should take, their statements at the least warrant careful consideration.

Commissioner Pai’s remarks are here

Commissioner Michael O’Rielly’s remarks are here

And the remarks of Rep. Bob Latta, the Vice Chairman of the House Communications and Technology Committee, are important as well. They are here.

I'm going to go back and re-read each of these, and I hope you will read them too.

PS – The video of the opening remarks and the lively panel discussion featuring Robert Crandall, Gerald Faulhaber, Deborah Taylor Tate, and Michael Weinberg will be posted shortly.

[Originally published at The Free State Foundation]

Categories: On the Blog

And the Records Keep Falling

Somewhat Reasonable - November 19, 2014, 11:38 AM

Weather map used with the kind permission of Weatherbell Analytics.

America is experiencing record-setting cold; anywhere you go or live, you just can't avoid it.

Weather Bell says it best:

The Lower-48 or CONUS spatially averaged temperature plummeted overnight to only 19.4°F, typical of mid-winter, not November 18th!

An astounding 226-million Americans will experience at or below freezing temperatures (32°F) on Tuesday as well — if you venture outdoors.

More than 85% of the surface area of the Lower-48 reached or fell below freezing Tuesday morning. All 50-states saw at or below freezing temperatures on Tuesday.

With record lows from Idaho to Nebraska and Iowa, south to Texas, and east through the Great Lakes, the eastern two-thirds of the US will shatter decades-long and, in some cases, century-long records. Temperatures east of the Rockies will be 20-40°F below climate normals.

Compared to normal, temperatures over the past several days have dropped off a cliff, to 10°C below climate normal, more anomalous than even during the polar vortex of early January.
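Since the reports above mix Fahrenheit and Celsius figures, it may help to note that a temperature anomaly (a departure from normal) converts between scales by the 9/5 factor alone; the familiar +32 offset applies only to absolute temperatures, not to differences. A minimal sketch:

```python
def c_anomaly_to_f(delta_c: float) -> float:
    """Convert a temperature *difference* (anomaly) from Celsius to Fahrenheit.

    Differences scale by 9/5 only; the +32 offset in the usual conversion
    formula applies to absolute temperatures, not to departures from normal.
    """
    return delta_c * 9.0 / 5.0

# The 10 degC anomaly quoted above is an 18 degF departure from normal,
# on the same order as the 20-40 degF figures cited for the coldest areas.
print(c_anomaly_to_f(10.0))  # 18.0
```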

How cold is it? Boston.com reports that during the past week, 1,360 daily low-maximum records were set, meaning those 1,360 cities and towns saw their coldest daily highs ever recorded on that particular date.

In addition, snow covers over 50 percent of the country, which is more than twice the coverage the U.S. usually experiences for mid-November.

CNN reports that areas in Buffalo, New York, among other cities along the Great Lakes, have experienced a year's worth of snow in just three days this past week.

To repeat: Every state in the Union had at least one location within its borders registering temperatures below freezing, yes, including Hawaii, and more than 1,360 cities and towns set record-low high temperatures.

I know all this record cold naturally makes alarmists think of global warming, as, in fact, it does me, though as I shiver in Dallas, I want to see some warming.

Categories: On the Blog

Textbooks Proposed for Texas Schools Open Can of Worms

Somewhat Reasonable - November 19, 2014, 11:03 AM

Controversy continues over the adoption of new schoolbooks in Texas, as environmental lobbyists fight to have sound science concerning global warming removed from the curriculum. With the ability to influence millions of schoolchildren regarding climate change, environmental alarmists are trying to ensure their message is the only one heard.

Alarmists claim the science is certain: Humans are causing catastrophic climate change and governments must force people to use less energy to prevent disaster. To be clear, climate change is occurring; the climate is always changing. However, there is an ongoing, heated and widespread scientific debate over whether human activities are responsible for all, some or none of the recent climate change. In addition, there is certainly no agreement a warmer climate will result in more dangerous weather patterns or climate conditions than we already experience.

The predictions of catastrophe are based on models that ignore the facts and that failed to predict the current 18-year lack of increase in Earth's average temperature, which has occurred despite rising CO2 levels. All the models have assumed, and continue to assume, that the increase in CO2 is the culprit causing temperature increases. The models are wrong.

The textbooks in question don’t deny human-caused global warming is happening; they just accurately report scientists are still debating the question. They present the evidence and ask the students to make up their own minds.

Having an open mind is what climate alarmists really object to.

A couple of textbook publishers, including Pearson Education just last week, buckled to the activists' demands and replaced the scientific understanding of climate with the politically driven, dogmatic claim that humans are causing dangerous climate change. Reasonable people will praise McGraw-Hill for, so far, resisting the alarmists' pressure tactics.

The Texas Board of Education is justifiably acting cautiously to ensure its textbooks rigorously present the best science available and accurately portray ongoing debates, including those over climate change. They are right to do so and should endorse only textbooks that uphold critical thinking and skepticism in the face of unsupportable claims of pending climate disaster.

 

[Originally published at Dallas News]

Categories: On the Blog

The GoogleNet Playbook & Zero Pricing – A Special Report

Somewhat Reasonable - November 19, 2014, 9:41 AM

GoogleNet is Google's vision of a global, largely wireless, Android-based network offering near-free Internet access, mobile connectivity, and Internet-of-Things connectivity, subsidized by Google's search and search-advertising dominance and by "open Internet" zero pricing of downstream Internet traffic.

A near-free global GoogleNet would follow the Google Playbook, which offers Android, Maps, YouTube, and others' content for free globally to disrupt and commoditize competitors, in order to maintain and extend Google's search and search-advertising dominance throughout the economy.

Why the GoogleNet Playbook matters competitively is that it is Google's new disruptive strategy to disintermediate and commoditize physical-world industries' direct relationships with their customers (ISPs, energy utilities, automobile manufacturers, big-box stores, banks, package delivery services, realty, and their networks, vehicles, inventory, ATMs, credit cards, appliances, devices, etc.), just as Google has been disintermediating and commoditizing the paid content, app, and software industries' direct relationships with their customers.

Tellingly, Google explains that Google proper is mostly about digital bits in the virtual world i.e. computer science and Internet technologies, whereas its next generation GoogleX research lab is mostly about atoms in the physical world, i.e. physical objects like driverless cars, satellites, drones, networks, devices, sensors, etc.

In a nutshell, this analysis spotlights: Google’s much-underappreciated, global-connectivity plans — GoogleNet; how GoogleNet neatly fits into the Google Playbook; and how zero-price-defined net neutrality is necessary to subsidize and accelerate Google’s grandiose ambitions to broadly extend its dominance of the virtual world into the physical world.

 

Summary

This analysis first applies the Google Playbook of “open-dominate-close” to GoogleNet’s global connectivity ambitions.

Next it shows how GoogleNet neatly ties together Google’s unique technology vision, company mission, “serial-moon-shot” ambitions, and its core beliefs in digital information commons cyber-ideology, and abundance economics.

Next, for the first time, it charts the much-underappreciated, exceptional comprehensiveness of GoogleNet's progress: from Google's dominant Android mobile operating system, to Android devices, satellites, high-altitude balloons, drones, dark fiber, undersea cables, data center construction, server points-of-presence, fiber broadband, wireless backhaul, WiFi mesh-networking, etc.

Then it explains the exceptional value and advantage to Google of a government "net neutrality" industrial policy that blocks the evolution of a two-sided free market for the large-enterprise market, by permanently banning any charges for high-volume downstream Internet traffic under the guise of "no fast lanes" or no "paid prioritization" for the Internet.

 

The Google Playbook

FairSearch clearly and cohesively describes The Google Playbook, Google’s plan to build and maintain its dominance via its predatory strategy of: “open-dominate-close.”

First, under the guise of “openness,” Google offers free, or deeply cross-subsidized, products and services to induce fast mass adoption and “disrupt” existing business models. Second, Google proliferates its dominance based on promises of “openness.” Third, once dominant in the new cross-subsidized market, Google then closes its products/services and excludes competitors, so it can discriminate in favor of itself.

To put this in perspective, this analysis explains and documents how GoogleNet is Google’s strategy to eventually dominate global Internet access and connectivity for mobile and the Internet-of-Things, much like I explained and documented Google’s anti-competitive strategy to extend its dominance to YouTube in my Google-YouTube’s Internet Video Distribution Dominance analysis last year.

 

GoogleNet’s Technological Vision, Mission, Ambitions, Ideology and Economics

GoogleNet neatly ties together Google’s unique technology vision, company mission, “moonshot” ambitions, digital information commons cyber-ideology, and abundance economics.

Google’s Unique Technology Vision is summarized by Google Chairman Eric Schmidt in his recent book: “How Google Works.”

Page 11: “Three powerful technology trends have converged to fundamentally shift the playing field in most industries. First, the Internet has made information free, copious, and ubiquitous – practically everything is online. Second, mobile devices and networks have made global reach and continuous connectivity widely available. And third, cloud computing has put practically infinite computing power and storage and a host of sophisticated tools and applications at everyone’s disposal, on an inexpensive pay-as-you-go basis.”

Mr. Schmidt then lays out the implicit vision for GoogleNet: “Today, access to these technologies is still unavailable to much of the world’s population, but it won’t be long before that situation changes and the next five billion people come on line.”

Simply, GoogleNet is Google’s global vision of a fully-integrated network of digital information, connectivity and computing power that combined is “10x” better than the existing Internet.

Mr. Schmidt continues: "we are entering what lead Google economist Hal Varian calls a new period of 'combinatorial innovation.' This occurs when there is a great availability of different component parts that can be combined or recombined to create new inventions. … Today the components are all about information, connectivity and computing."

The genius of this insight is that Google can be more "innovative" than anyone else simply because it dominates, or will dominate, most of the necessary fundamental component parts of "combinatorial innovation" long term: information, connectivity, and computing.

Mr. Schmidt recently told the CBC:  “The concept of having every human reachable by every other human is an extraordinarily valuable thing.” He is echoing Metcalfe’s Law of network effects which posits that “the value of a telecommunications network is proportional to the square of the number of connected users of the system” — per Wikipedia.
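Metcalfe's Law is easy to see in miniature. Here is a toy sketch (the proportionality constant k is arbitrary, and real networks only approximate the users-squared rule):

```python
def metcalfe_value(users: int, k: float = 1.0) -> float:
    """Approximate network value under Metcalfe's Law: proportional to users**2."""
    return k * users ** 2

# Doubling the user base roughly quadruples the network's value,
# which is why platforms race for scale rather than margin per user.
ratio = metcalfe_value(2_000_000) / metcalfe_value(1_000_000)
print(ratio)  # 4.0
```

This quadratic growth in value with user count is the economic logic behind the scale-seeking strategy described in the rest of this analysis.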

Simply, Mr. Schmidt and Google get that its dominance in search, search advertising, Android, Maps, YouTube, and Chrome, grows with more users.

Google's Mission & Ambitions: If one thought Google's mission "to organize the world's information and make it universally accessible and useful" was grandiose, consider that Google has already achieved most of it in just fifteen years, as I documented recently in my Google's WorldWideWatch of the WorldWideWeb analysis, which charted the vastness of Google's Internet empire and data hegemony for the first time.

What is the effective “mars-shot” to scale Google’s ambitions “10x” beyond the mere “moonshot” of organizing the world’s information? When the FT recently asked Google CEO Larry Page if Google’s mission statement needed updating, he responded: “I think we do, probably. We’re still working that out.”

Just two years ago, Google CEO Larry Page lamented that “We’re still 1 percent to where we should be…what I’m trying to do is… really scale our ambition.”

Given all of Google’s GoogleNet-related acquisitions and activities that will be documented later in this analysis, it appears that Mr. Page’s mission is actually expanding to something like this: “Inter-connecting everyone, every “thing,” and the world’s information over one universally accessible and useful cloud computing GoogleNet.” Or as Google simply calls it internally: “The Google computer.”

Google’s Cyber-ideology: No one can fully understand the boundlessness of Google’s ambitions without understanding why Google CEO Larry Page promised shareholders in his 2004 Founder’s letter that: “Google is not a conventional company. We do not intend to become one.” He promised that because he knows Google is driven by a very different ideology than most of the world would recognize.

Google's mission and ambitions are not merely technological but also very political, a natural outgrowth of its codism cyber-ideology of a digital information commons where "information wants to be free." For those struggling to understand Google's geopolitical worldview, see a detailed explanation of the Codism movement, of which Google increasingly is the de facto global leader, in "What Is the Code War?"

"Abundance Economics" Is Key to Google's Dominance: Google is the world's largest adherent of the theory of abundance economics, which holds that because the marginal cost of computing, storage, and bandwidth approaches zero, whatever is on the Internet should be free or cost nothing to use. Abundance economics generally ignores the reality of fixed and total costs and of property rights, because those realities don't support the notion and aspiration that the economics of abundance have supplanted the traditional economics of scarcity.

The ultimate prize for abundance economics and a digital information commons would be dominating the three biggest disruptive technological trends of universal and near-free data accessibility, connectivity, and computing power.

Simply, Google’s CEO Larry Page singularly gets the implications of digital hyper-centralization – omni-scale wins.

Whoever gets first-mover advantage in combining data aggregation, connectivity, and computing power wins; it's winner-take-all.

Competitors can’t compete if Google’s proliferating dominance allows it to create an unmatchable, fully integrated, super-high-cost essential facility of one global client-server network (and proverbial Tower of Babel) – that is the only network where eventually one can go for the world’s information, universal Internet access and connectivity, and the lowest-cost computing power.

Tellingly, Google, a world leader in multi-language translation services, originally named its global Google Hangout video chat and video conferencing service “Babel.”

Moreover, Mr. Page gets that the Internet's web-server infrastructure is a basic client-server model that ultimately will turn out to have more in common with IBM's mainframe dominance of the 1950s-1970s than with Microsoft's dominance of PC client software in the 1990s. (Note: The "server" in the traditional "client-server" model has morphed over the decades into a data center of hundreds of thousands of virtualized server blades in globally virtualized data centers that function like the one unitary server or mainframe computer did in the IBM-dominant era.)

And to force Google's ideological position that information should be free, i.e., cost nothing, Google has been a most hostile entity when it comes to disrespecting users' privacy rights and disrespecting others' intellectual property rights.

And to commoditize cloud computing, note that Google has precipitated a price war with Amazon's AWS, cutting cloud prices 38% in 2014 alone, a price war Google already knows it ultimately will win.

 

The Evidence of the Exceptional Comprehensiveness of the GoogleNet Domination Effort:

A very big public indicator that GoogleNet is a real, urgent and major strategic priority for Google was in June when CEO Larry Page made Craig Barratt Senior Vice President for Access and Energy, on par with the SVPs for Android, Ads and YouTube per WSJ reporting.

Importantly, the dominant core or "spine" on which GoogleNet is being built is Google's very-fast-growing Android mobile operating system, which already commands 85% share of global smartphone shipments, 62% share of tablets, 93% share of mobile searches, and over one billion active users, up from 538m in June 2013.

The Android mobile operating system is rapidly becoming the default operating system for much of the consumer Internet-of-Things marketplace because it is the only one that is free and "open," and because the smartphone has become the default remote controller: for home networking via Google's Nest acquisition; for autos via its dominant Open Automotive Alliance; and for wearables, among other categories of "things" in the consumer Internet of Things.

Google is clearly serious about being the first mover to reach what Mr. Schmidt calls the Internet's next five billion users coming online from the developing world, via its supply of a free mobile operating system and its low-cost Chromebooks, tablets, smartphones, wearables, and sensors.

To provide these next five billion Internet users free or near-free connectivity, Google is piloting three different technology approaches to offering a free global GoogleNet service.

Google bought Skybox Imaging for $500m and plans to spend $1-3 billion on “180 small, high capacity satellites at lower altitudes than traditional satellites” to enable two-way Internet access. Google also bought Titan Aerospace – which makes solar-powered, high-flying drones that Titan calls “atmospheric satellites” — for Internet access to remote areas. And Google CEO Larry Page shared his ambitions that Project Loon “could build a world-wide mesh of these balloons that can cover the whole planet” to provide Internet access. Any one of these very different physical technologies could work, or be meshed together depending on which ones work best in what circumstance.

Since as early as 2005, Google has been buying massive amounts of dark fiber (i.e., fiber in the ground that has not yet been "lit" with optical devices on each end). The tech bubble produced a global overbuilding of fiber networks; when it burst, dark fiber became dirt cheap as companies like WorldCom, Global Crossing, and PSINet went bankrupt.

This August Google invested $300m in a trans-Pacific undersea cable with Chinese, Japanese and Singaporean companies. This October, Google announced it was building a new U.S.-Brazil undersea cable system with Brazil to be completed in 2016. The trans-Pacific and Brazil undersea cables are Google’s third and fourth undersea cable investments.

In the last few years Google has globalized its GoogleNet investments in Internet infrastructure. Google led the world in data center cap-ex, with about $28b from 2006-2014. It now has 1,400 global server points-of-presence in 140 countries, 68% of the world's countries, per USC research that mapped Google's global serving infrastructure. Google-YouTube also reports that it has localized YouTube on servers in 61 countries in 61 languages.

Add this entire fiber infrastructure together and it suggests Google already has assembled its own de facto private Internet backbone that handles traffic that could rival the traffic routed by a Tier I backbone provider.

Google has also invested more than any entity to create the only proprietary global Internet “phone/address book” of Internet addresses, the functional, economic, and market power equivalent of the old Bell system phone book and yellow pages, but this time for the whole world and all devices with an Internet address. In 2012, Google claimed to be the world’s leading domain name service (DNS) resolver handling 70b requests daily. Google also offers a Cloud DNS service.

Google is also experimenting with various local ISP access technologies.

The best known is Google Fiber, which has build-outs in Kansas City MO/KS, Provo UT, and Austin TX. Google has also targeted nine metro areas and 34 cities for additional 1-gigabit local-access build-outs, including Nashville TN, Phoenix AZ, Portland OR, Raleigh-Durham NC, Salt Lake City UT, San Antonio TX, San Jose CA, and Atlanta GA.

Here it is important to discuss Google’s various technological solutions and efforts to create a free large-scale, WiFi-based cloud network to disrupt and ultimately replace paid-ISP service.

It is important to note that Google’s new SVP for Access and Energy Craig Barratt is from a wifi-wireless chip background and not a traditional ISP background of any kind. It is also important to note that Google publicly reminds us “We don’t make money from peering or collocation,” because Google makes its money from advertising.

A lesser-known effort than Google Fiber was Google's acquisition of Alpental Technologies, whose 60GHz wireless technology can provide wireless connections of up to a mile at potential speeds of seven gigabits a second. The founders describe Alpental's technology as "self-organizing, ultra-low power gigabit wireless technology" that can extend the reach of fiber to create WiFi networks.

A potential game-changer here is that the Alpental technology leverages a new Android application, probably a peer-to-peer approach, which automatically transfers a user to its WiFi hotspots whenever they come in range, an operational attribute similar to seamless handoffs on cellular networks.

Google is also working with Ruckus Wireless “trialing a new software-based wireless controller that virtualizes the management functions of the Wi-Fi network in the cloud… The end result would be a nationwide — or even global — network that any business could join and any Google customer could access,” per Gigaom.

This analysis of Google's global GoogleNet plans would not be complete without mentioning the potential for a Google acquisition of, or partnership of some kind with, SoftBank's Sprint.

Google does not need to acquire a company to reap most of its integration benefits. Google Chairman Eric Schmidt uses the term "merge without merging": "The web allows you to do that, where you can get the web systems of both organizations fairly well-integrated, and you don't have to do it on an exclusive basis."

Something could be afoot at Sprint with Google. To start with, Softbank and Google have long had exceptionally close leadership ties and aligned interests – documented in detail here.

It is unlikely that SoftBank CEO Son would have been able to poach Google's Chief Business Officer, Nikesh Arora, to be SoftBank's Vice Chairman, and it is unlikely that Google would have paid Mr. Arora a full-term bonus that was not contractually due to him upon his departure, if there were not something else going on in this close strategic relationship.

Last April, Amir Efrati of The Information reported that Google was talking to wireless providers about an MVNO wholesale relationship to provide Google with wireless services. In that context, it is noteworthy that Mr. Arora just joined Sprint’s board.  

In addition to close ties between Softbank and Google, Sprint needs rescuing or a big long-term wholesale contract, and Google could do that and put Sprint’s woefully-underutilized, and massively-WiFi-compatible spectrum holdings to work better and more fully than any other entity could.

In short, no other entity is as serious and determined as Google to create a global de facto shadow Internet of global information, connectivity, and computing — soonest.

 

Conclusion: GoogleNet Dominance Depends in Part on Net Neutrality Zero-Pricing

Google effectively defines net neutrality as a permanent Government-set price of zero for all downstream Internet traffic to the consumer.

Google has been the real power pushing behind the scenes for net-neutrality zero pricing because Google dominates downstream Internet traffic to users.

Google's cloud client-server model (ad serving, video streaming, software on demand, app downloads, and cloud-computing services) involves sending vastly more downstream traffic to American and international users than those users send upstream to Google.

Consider Google's world-leading stats to grasp how much downstream Internet traffic Google alone generates, and how much users subsidize Google's profits when Google does not have to pay for much of the cost of its downstream Internet traffic.

Per Deepfield research: 60% of Internet devices and users exchange traffic daily with Google’s servers; >50% of websites’ traffic involves Google analytics, hosting and ads daily; and ~25% of the Internet’s daily traffic is Google.

A billion users receive very bandwidth-intensive videos from Google-YouTube, maps from Google Maps, and content via Google’s Chrome browser. Google uniquely serves display ads to two million websites.

No other entity in the world generates this amount of downstream Internet traffic because Google alone controls five of the world’s six billion-user web platforms.

GoogleNet’s ambition, via Google Hangouts, to become the global multi-party video-conferencing network means that Internet users will help fund Google’s dominance whether or not they use Google’s services at all.

At core, zero-price-defined net neutrality gives Google a substantial anti-competitive advantage: it can shift its Internet infrastructure cost obligations onto users and infrastructure providers, and then offer free or near-free global connectivity as a way to disrupt, disintermediate, and commoditize physical-world industries’ direct relationships with their customers (ISPs, energy utilities, automobile manufacturers, big-box stores, banks, package delivery services, and realty, along with their networks, vehicles, inventory, ATMs, credit cards, appliances, and devices). This is just as Google has been disintermediating and commoditizing the content, app, and software industries’ direct relationships with their customers via free, cross-subsidized products and services.

Google’s playbook works only if Google can use “openness” to offer free, or near-free, products that drive rapid adoption while keeping Google’s operational costs lowest. Pushing for zero-price-defined net neutrality clearly fits this bill: it shifts most of the Internet infrastructure costs Google causes off of Google and onto consumers and potential competitors.
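The cost-shifting argument above can be sketched with a toy calculation. All figures below are hypothetical illustrations chosen for readability, not data from this post: under a sender-pays-zero regime, the full delivery cost of downstream traffic is recovered from consumers and infrastructure providers rather than from the sender.

```python
# Toy sketch of the zero-pricing cost-shifting argument.
# All numbers are hypothetical; the point is the allocation, not the magnitudes.

def delivery_cost(downstream_gb: float, cost_per_gb: float) -> float:
    """Total network cost of delivering a volume of downstream traffic."""
    return downstream_gb * cost_per_gb

# Hypothetical monthly figures for a very large content sender.
downstream_gb = 1_000_000_000   # downstream traffic volume, in GB
cost_per_gb = 0.01              # hypothetical delivery cost per GB, in dollars

total_cost = delivery_cost(downstream_gb, cost_per_gb)

# Under zero-pricing, the sender's share of delivery cost is fixed at zero,
# so the remainder is borne by consumers and infrastructure providers.
sender_share = 0.0
shifted_to_consumers = total_cost - sender_share

print(f"Total delivery cost:      ${total_cost:,.0f}")
print(f"Borne by the sender:      ${sender_share:,.0f}")
print(f"Shifted to consumers:     ${shifted_to_consumers:,.0f}")
```

The sketch simply makes explicit what a price of zero implies: whatever the true per-gigabyte delivery cost is, the sender’s share of it is zero by construction, so the entire cost lands elsewhere.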

To sum up, a global GoogleNet that provides free, or near-free, universally accessible and useful Internet access and mobile connectivity, in order to offer for free more of the content, products, and services that consumers currently pay for, would enable Google to disrupt, disintermediate, and commoditize almost all of its potential competitors before they even know what hit them.

In a word, a global GoogleNet could become the quintessential essential facility.


***

Googleopoly Research Series

Googleopoly I: The Google-DoubleClick Anti-competitive Case – 2007

Googleopoly II: Google’s Predatory Playbook to Thwart Competition – 2008

Googleopoly III: Dependency: Crux of the Google-Yahoo Ad Agreement Problem – 2008

Googleopoly IV: Google Extends its Search Monopoly to Monopsony over Info – 2009

Googleopoly V: Why the FTC Should Block Google-AdMob – 2009

Googleopoly VI: Seeing the Big Picture: Google’s Monopolizing Internet Media – 2010

Googleopoly VII: Monopolizing Location Services – Skyhook is Google’s Netscape – 2011

Googleopoly VIII: Google’s Deceptive and Predatory Search Practices – 2011

Googleopoly IX: Google-Motorola’s Patents of Mass Destruction – 2012

Googleopoly X: Google’s Dominance is Spreading at an Accelerating Rate – 2013

Googleopoly XI: A Satire: Grading Google’s Search Antitrust Remedies in EU Test – 2013

Googleopoly XII: Google-YouTube’s Internet Video Distribution Dominance – 2013

Googleopoly XIII: Let’s Play Pretend: a Satire of Google’s Second EU Search Remedy Proposal – 2013

Googleopoly XIV: Google’s WorldWideWatch over the WorldWideWeb [9-14]

 

[Originally published at Precursor Blog]
