Exaflood Threatens to Swamp the Internet

Published July 1, 2007

More than 100 million YouTube videos are being downloaded every day … and that, as they say, is not the half of it.

Video has become a standard feature on most news sites, from CNN to the news page for your local network affiliate. Even small blogs carry video.

The video explosion has touched off discussion of how the nation’s collective network infrastructure will handle the “exaflood,” the near-exponential year-over-year growth of Internet traffic.

We are almost there. The term exaflood derives from exabyte, which equals 1 quintillion bytes, or 1 followed by 18 zeros. As of December 2006, the Internet was handling 700 million gigabytes of traffic a month, according to the University of Minnesota’s Digital Technology Center. A gigabyte is 1 billion bytes, so 700 million gigabytes equals 700 quadrillion bytes, or 0.7 exabytes.
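The unit conversion above can be checked with a few lines of arithmetic (the figures are the article's own; the variable names are illustrative):

```python
# Byte-unit arithmetic from the article's figures
GB = 10**9   # gigabyte: 1 billion bytes
EB = 10**18  # exabyte: 1 quintillion bytes (1 followed by 18 zeros)

monthly_traffic_bytes = 700_000_000 * GB  # 700 million GB per month
print(monthly_traffic_bytes / EB)         # fraction of an exabyte: 0.7
```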

In and of itself, the exaflood does not necessarily present a crisis. Right now the global Internet has the capacity to handle the traffic. The question, as a short video from the Fiber to the Home Council discusses, is this: When the amount of Internet data truly begins to reach the capacity of the network, as it inevitably will, how will the industry be able to respond?

One obvious answer is to build more infrastructure. Optical transmission technology continues to improve, and faster processors make for faster Internet switches and routers. Carriers have been doing as much all along. However, a second, complementary solution could be applied to what engineers call the transmission layer: the internal software of the network that handles Internet data as it flows through.

While it is true that Internet transmission is all bits and bytes, intelligence in the transmission layer already can discern video from voice and text from image and prioritize them differently. Just as with physical infrastructure, scores of U.S. manufacturers are working to improve the performance of the transmission layer.

Yet proposed network neutrality legislation, if allowed to pass, stands to short-circuit these efforts. Network neutrality would prohibit carriers from enhancing the quality, reliability, or performance of Internet applications as they move through the transmission layer. The law would require carriers to treat every bit of data the same, even if the overarching applications are vastly different. Likewise, application providers who want to create a better experience for their customers could not ask carriers to assure quality or reliability, regardless of their willingness to pay for the service.

Trouble is, the costs of the exaflood cannot be avoided. In March 2006, Henry Kafka, chief architect at BellSouth (now AT&T), told attendees at the National Fiber Optic Engineers Conference that the average residential broadband user was consuming about 2 gigabytes of data per month, which Kafka estimated cost the service provider about $1. As downloading feature films becomes more popular, users might consume an average of 9 gigabytes per month, costing carriers $4.50.

The average IPTV user, however, will likely consume about 224 gigabytes per month, Kafka said, at a monthly cost to carriers of $112. If that content were high-definition video, the average user would be consuming more than 1 terabyte per month, at a cost to carriers of $560 per month.

“Clearly that’s not what the average user is going to pay per month for their video service,” Kafka said.
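Kafka's figures imply a roughly constant carrying cost of about $0.50 per gigabyte, a rate inferred from the numbers above rather than stated in the source. A quick sketch shows the quoted monthly costs all fall out of that one rate (the 1,120 GB figure for the high-definition case is likewise an inference from "$560 at $0.50 per gigabyte"):

```python
# Implied per-gigabyte carrying cost from Kafka's figures (inferred, not quoted)
cost_per_gb = 1.00 / 2  # $1 for 2 GB/month -> $0.50 per GB

# (monthly GB consumed, monthly carrier cost quoted in the article)
scenarios = [
    (9, 4.50),      # feature-film downloading
    (224, 112.00),  # average IPTV user
    (1120, 560.00), # high-definition video (GB figure inferred)
]

for gb, quoted_cost in scenarios:
    print(f"{gb} GB/month -> ${cost_per_gb * gb:.2f} (article: ${quoted_cost:.2f})")
```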

Network neutrality would close off an important revenue stream for carriers: the quality, reliability, and partitioning services that very large application providers will need for their services to work properly. This will chill investment and slow deployment. The overall utility of the Internet declines as it becomes clogged. Consumers would pay the price, because the cost of managing congestion could not be transferred to the largest users of bandwidth.

Although attacked as a “toll lane” on the Web, such paid partitioning will keep the standard transmission lanes, still extremely fast, cleared for less commercial and less bandwidth-intensive applications, resulting in a better functioning Internet for all. This will do more to keep the Internet equally useful for everyone than regulating or banning Internet quality control would.


Steven Titch ([email protected]) is senior fellow for IT and telecom policy at The Heartland Institute and managing editor of IT&T News.


For more information …

Fiber to the Home Council video, http://www.fromtheheartland.org/live/audio.html

“Today’s IT News in Pictures,” IT&T News, May 2007, http://www.heartland.org/Article.cfm?artId=21058

Ed Gubbins, “OFC: BellSouth Chief Architect warns of HD VOD costs,” Telephony Online, March 7, 2006, http://telephonyonline.com/iptv/news/BellSouth_VOD_costs_030706/.