Cox Cable Congestion Management Plan Challenged, Defended

Published May 1, 2009

As Cox Communications seeks to relieve intermittent congestion on its networks by giving higher priority to time-sensitive traffic while allowing minor delays in other traffic, the company's policy has drawn the ire of self-described public-interest groups pushing for government intervention to stop it.

Traffic designated as time-sensitive includes Web browsing, voice over Internet Protocol (VoIP) telephone calls, streaming video, and online gaming. Traffic Cox considers non-time-sensitive includes peer-to-peer (P2P) file transfers. Cox will test its traffic management technology in its Kansas and Arkansas markets.
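Cox has not published implementation details, but prioritization of this kind is commonly built on strict-priority queueing at congested links: time-sensitive packets always leave first, while bulk traffic is delayed rather than dropped. A minimal sketch, with hypothetical traffic-class names mirroring the categories above:

```python
from collections import deque

# Hypothetical traffic classes, mirroring Cox's stated categories.
TIME_SENSITIVE = {"web", "voip", "streaming", "gaming"}

class PriorityScheduler:
    """Strict-priority queueing: time-sensitive packets are always
    dequeued before bulk (e.g. P2P) packets, which wait out congestion."""

    def __init__(self):
        self.urgent = deque()
        self.bulk = deque()

    def enqueue(self, packet, traffic_class):
        if traffic_class in TIME_SENSITIVE:
            self.urgent.append(packet)
        else:
            self.bulk.append(packet)

    def dequeue(self):
        # Bulk traffic is delayed, not dropped: it drains once the
        # urgent queue is empty.
        if self.urgent:
            return self.urgent.popleft()
        if self.bulk:
            return self.bulk.popleft()
        return None

sched = PriorityScheduler()
sched.enqueue("p2p-chunk", "p2p")
sched.enqueue("voip-frame", "voip")
print(sched.dequeue())  # voip-frame leaves first despite arriving second
```

Under light load both queues drain immediately and the policy is invisible; only during a congestion spike does the bulk queue actually back up.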

“Congestion management is certainly not a substitution for investment in our network,” said David Deliman, a Cox Communications public relations representative, in a written response to questions. “An Internet event that draws an enormous audience, such as the recent presidential inauguration, can drive a temporary spike in activity that could impact our customers’ service level.”

Cox Facing Lawsuits

Vuze, a company specializing in the distribution of streaming media using P2P technology, has filed a lawsuit against Cox, arguing delays in transmission of Vuze’s streaming media could degrade service quality for its 10 million users.

Experts say Vuze has a case because Cox is also in the content distribution business and its actions could be viewed as discriminatory. But that might not be enough to stop the Cox policy.

“Cox has a well-established reputation for consumer-friendly business practices,” said Robert C. Atkinson, director of policy research at the Columbia Institute for Tele-Information in New York City.

Spikes Critical to Case

Cox’s case that its network management practices are reasonable would rest on the company establishing the existence of dramatic, momentary spikes in traffic and the need to manage them.

“While average utilization of networks is below full capacity, burst traffic does exceed full capacity at the core and at the edge,” said Aleksandar Kuzmanovic, who leads research on network congestion at the Northwestern Network Group at Northwestern University in Evanston, Illinois.

“As the capacity at the edge has increased and more ISPs connect with each other, the incidence of network congestion has increased with significant packet loss,” Kuzmanovic said.

Tough Road for Cox

Timothy B. Lee, of the Cato Institute in Washington, DC, thinks government has little justification for regulatory action against Cox. He expects the lawsuit to play out first.

“Cox will likely fail in its bid to manage the flow of traffic on its network,” Lee said.

Lee says the unwieldy nature of the Internet itself dooms Cox’s attempts to manage traffic, or at least makes it cheaper for the company simply to increase its broadband capacity.

“The incremental cost of equipment to manage traffic flow will be about equal to the cost of increasing bandwidth,” Lee said. “The Internet is not a single network but a combination of several of them, and all of them will have to agree to upgrade their technology to manage the flow of traffic.”

Individuals Stymie Throttling

Unlike the switches in legacy networks such as basic telephone service, the routers that carry Internet traffic into the home do not have enough intelligence to control the flow of traffic effectively, Lee said.

In addition, Lee said, “activist geeks” tend almost instantly to create ways to circumvent Internet service providers’ traffic management policies.

“It is possible to disguise high-throughput traffic as low-bandwidth traffic,” Lee said. “The only realistic possibility for congestion management is assigning bandwidth for each user.”
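Lee does not describe a mechanism, but per-user bandwidth assignment is typically implemented with something like a token bucket per subscriber: each account gets a fixed allowance regardless of what kind of traffic it carries, so disguising P2P as Web traffic gains nothing. A minimal sketch under that assumption, with illustrative rate numbers:

```python
import time

class TokenBucket:
    """Per-user rate limiter: each subscriber spends tokens to send
    bytes, and tokens refill at a fixed rate, capping average bandwidth
    independently of traffic type."""

    def __init__(self, rate_bytes_per_sec, burst_bytes):
        self.rate = rate_bytes_per_sec
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def allow(self, packet_bytes):
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, up to the burst cap.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True
        return False  # over allowance: delay or drop the packet

# One bucket per subscriber, keyed by account rather than traffic type
# (rate and burst figures here are purely illustrative).
buckets = {"subscriber-1": TokenBucket(rate_bytes_per_sec=125_000,
                                       burst_bytes=50_000)}
print(buckets["subscriber-1"].allow(10_000))  # True: within the allowance
```

Because the cap is tied to the account, not to packet inspection, it sidesteps the circumvention problem Lee describes, at the cost of treating urgent and bulk traffic identically.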

Market Regulation Working

Atkinson points to Comcast’s attempts to manage its network congestion—and the mostly trumped-up controversy they generated—as a reason to limit government regulation of the Internet.

“The regulatory practices of the Federal Communications Commission have evolved with increasing recourse to market-friendly common law rather than rigid rules which could stifle innovation on the Internet,” Atkinson said. “Companies are anxious to avoid crossing the line which could attract strictures from watchdogs and regulatory action.

“In the case of the Internet, the line is fuzzy,” Atkinson added. “The fact that very few cases have been brought before the FCC is proof that current regulatory practices are effective.”

Kishore Jethanandani ([email protected]) writes from San Francisco.