My last issue, WI #88 on bandwidth, brought the following commentary from Scott Welch. Scott (scott@softarc.com) is one of the founders of SoftArc, makers of the FirstClass email and collaboration software. Take it away, Scott.
We are all real beneficiaries of Moore's Law, whereby our computers keep getting faster and faster, with more and more memory, for about the same price.
There is one notable exception: bandwidth.
Bandwidth does NOT double every 18 months. Networks CANNOT be upgraded like memory. Wholesale changes of underlying technology CANNOT be made to networks, the way they can to individual machines.
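To put rough numbers on that gap (my arithmetic, not Scott's): a doubling every 18 months compounds to about a 16-fold gain over six years, while dial-up modems went from 14.4 kbps to 56 kbps over roughly the same stretch, closer to 4-fold.

```python
# Back-of-the-envelope arithmetic (my illustration, not Scott's):
# compare Moore's-Law compute growth with dial-up modem-speed growth
# over roughly the six years from 1991 to 1997.

years = 6
compute_growth = 2 ** (years / 1.5)  # one doubling every 18 months
modem_growth = 56_000 / 14_400       # 14.4 kbps (1991) to 56 kbps (1997)

print(f"Compute: ~{compute_growth:.0f}x")  # ~16x
print(f"Modems:  ~{modem_growth:.1f}x")    # ~3.9x
```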
My issue is that site designers and operators think otherwise, and build their sites as if bandwidth will continue to get better. It won't. Things will get worse. You hear the line "... and next year, when the bandwidth is doubled...." I've got some news for you: Next year is here, and bandwidth still looks pretty finite to me.
This problem is unlikely to go away anytime soon, because it's a fairly classic bit of economics: none of us is really paying for the bandwidth we are consuming now, so there is absolutely no incentive for us not to consume all we want. So we've got PointCast and RealAudio and CNN all running... that's what we pay our $19.95 a month for. Taken as an individual problem, this means you may or may not decide it is worthwhile to listen to your $3000 computer sounding like a $5.99 transistor radio.
But the bandwidth problem isn't one of adding more capacity. It is purely one of human behavior. The very act of you, and a million others like you, doing what you want will have the effect of making the entire organism of the Internet unusable for everyone.
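To make that concrete, here is a small illustrative model, a sketch of my own rather than anything from Scott's commentary, with invented capacity and demand figures: a single fixed-capacity link shared evenly among flat-rate users, each of whom wants a full modem's worth of traffic.

```python
# Illustrative model only (mine, not Scott's; the capacity and demand
# figures are invented): a fixed-capacity link shared evenly among
# flat-rate users, each of whom wants a full modem's worth of traffic.

LINK_CAPACITY_KBPS = 1_544.0   # one T1 line, a common 1997 backbone unit
DEMAND_PER_USER_KBPS = 28.8    # one modem user pulling audio at full tilt

for users in (10, 100, 1_000, 10_000):
    total_demand = users * DEMAND_PER_USER_KBPS
    # The link delivers at most its capacity, split evenly among users.
    per_user = min(DEMAND_PER_USER_KBPS, LINK_CAPACITY_KBPS / users)
    utilization = min(total_demand, LINK_CAPACITY_KBPS) / LINK_CAPACITY_KBPS
    print(f"{users:>6} users: {per_user:6.2f} kbps each "
          f"({utilization:.0%} of the link busy)")
```

The link stays saturated while per-user throughput collapses: everyone's usage is individually rational and collectively ruinous, which is the commons problem in miniature.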
One of the reasons the Internet is so slow is that almost everyone assumes the slowdown is temporary. This makes them reluctant to write smaller pages, set bandwidth limits, or charge based on the amount of the resource consumed. Instead, they hope that through some miracle the finite resource will somehow be doubled, or tripled, or made infinite. I see no evidence that such a miracle is scheduled for the near future.
Indeed, the true cost of Net access will soon become apparent, as ISPs realize that they are not charities and raise their prices. At this point, people will have to decide whether the value they receive from the Net is greater than the true cost of access, and many will close their accounts.
Others have proposed charging by the email message or by the packet, but that isn't really the answer either. Perhaps what is needed is a set of different pricing models for differing degrees of latency. In other words, I pay more to get my downloads sooner.
This problem has been studied to death in other industries: think of peak-load pricing for electric power grids, differential rates for phone systems, and the various classes of postal delivery. For some reason, when it comes to the Internet, we refuse to believe that its resources are finite. In the words of Pogo, "We have met the enemy, and he is us."
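Scott doesn't propose a specific rate card, but a toy model of latency-tiered pricing along the lines of those postal classes might look like the following sketch; the tier names, delays, and prices are all invented for illustration.

```python
# Toy latency-tiered rate card (tier names, delays, and prices are all
# invented for illustration; nothing here comes from Scott's commentary).
# Faster delivery classes cost more per megabyte, like postal classes.

TIERS = {
    # name: (target delivery delay in seconds, price in cents per MB)
    "priority": (1, 10.0),    # interactive traffic, delivered first
    "standard": (30, 2.0),    # ordinary web browsing
    "bulk": (3600, 0.2),      # overnight downloads, sent when lines are idle
}

def price_in_cents(megabytes: float, tier: str) -> float:
    """Return the charge, in cents, for a transfer at the given tier."""
    _delay, cents_per_mb = TIERS[tier]
    return megabytes * cents_per_mb

# A 10 MB download costs a dollar if you want it now, but only
# two cents if you can wait for it until the middle of the night.
for tier in TIERS:
    print(f"10 MB at {tier!r}: {price_in_cents(10, tier):.1f} cents")
```

The point of such a scheme is the incentive rather than the exact prices: bulk traffic waits for idle capacity, and only traffic that is genuinely worth paying for gets delivered first.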
In the meantime, think about this when you do your next download.
Thanks, Scott. Indeed, you are not the only one thinking about this. You can read an interview with Bernardo Huberman, a research fellow at Xerox Palo Alto Research Center, in PC Week.
But not everyone agrees. Writes John Quarterman, an Internet old hand:
"There is an Internet quality of service research community. Every one in it whom I have talked to thinks Huberman has got it wrong. The basic problem is exactly the same as above: Huberman assumes the Internet is a strictly limited resource. He then assumes the Garett Hardin's tragedy of the commons scenario applies to the Internet. For neither assumption does he give any proof. Here is why both assumptions are incorrect."
David Strom
david@strom.com
+1 (516) 944-3407
entire contents copyright 1997 by David Strom, Inc.
Web Informant is a registered trademark with the U.S. Patent and Trademark Office.