Steve Joyce, a friend of mine who works for Ganymede Software, a new startup that sells network test tools (sjoyce@ganymedesoftware.com), was recently moving to a new town and was out house shopping with his real estate agent. He describes discovering the perfect 3-bedroom house and being almost ready to make an offer:
Steve: "My office is downtown. Can you give me some idea of how long it will take to get to and from work during rush hour?"
Agent: "Not really. The only information I have is that the roads from your house to the highway are all rated at 1000 cars per hour, and the stoplight at the highway has an average wait time of 1.5 minutes."
Steve: "What? I don't care about all that. How long will it take me to get to work? And besides, who thinks they can get 1000 cars per hour down this little street?"
Agent: "I'm not sure. It does seem like it would be a bit much for Sleepy Hollow Lane."
Steve's agent offered her sympathies and then made her best guess at how long the drive downtown probably took. Undaunted, he pursued the matter further and met with an engineer at the Department of Transportation:
Steve: "Can you give me a clue as to why you provide us with information about a road in cars per hour?"
Engineer: "Sure. We needed a measure that was simple and reproducible. Ever since federal funding was handed out to the best road crews, we've made sure that our roads are all rated for high volumes."
Steve: "How can you say that a little road like Sleepy Hollow Lane can handle 1000 cars per hour? It's a tiny residential street!"
Engineer: "First, we say that it CAN handle 1000 vehicles per hour. We don't ever imply that it should or even will. Second, notice that I said vehicles per hour, not cars. We started by trying to simulate normal traffic. You know, mostly cars but some trucks, motorcycles, and even a few mopeds. The problem was, different roads in different parts of the city had different mixes of traffic. So we created a new, generalized technique. First we measure just trucks, then cars, then motorcycles, and finally the mopeds. Then we assume that everyone is traveling on one-way streets and going as fast as they possibly can, ignoring traffic rules. That's how we came up with 1000 vehicles per hour for your road."
Thanks, Steve. There is a lesson here for those of us who are trying to measure Web server performance, and I continue to do my best to stay away from the traffic models. Rather than develop YAWSB (yet another web server benchmark), I've joined forces with Nick Lippis' Strategic Networks Consulting, Inc. to test web servers under conditions as close to the real world as possible. This will extend some of the original work that I did for c|net at Keylabs, Inc. and examine a variety of server features, performance, and usability. Lippis' group will charge each web server vendor for doing the tests, and we hope to have as many of them participating as we can. We'll report on the results this fall. Contact Cathy Lerner (lerner@snci.com, 800 999 7621) for more information about these tests.
Speaking of the real world, one interesting trendlet I've seen is what WebWeek calls Spamdexing, where webmasters add various keywords or duplicated phrases to their pages in the hopes of fooling the various robots and index engines so that their pages will rise to the top of the list when users enter search words. Of course, the porn sites have been leading the way here.
Danny Sullivan at Maxonline deserves a Big Duck award for his nice analysis of search engines, along with tips for Webmasters on how to get the best results from them. His study comes from making changes to his own web site over a period of months and observing how the search engines react. Another good study, covering five search engines, was done by Nicholas Tomaiuolo of Central Connecticut State University.
While on the subject of web servers, I've purchased the content and domain name of a site called WebCompare, which Paul Hoffman has been running for several months. The site has detailed feature comparisons of both web servers and browsers, and covers products across a wide range of operating systems and abilities. Paul wanted to move on to other challenges, and I agreed to maintain the site. The site does carry advertising; if you are interested, let me know. I'd also appreciate hearing about any improvements you'd like to see or ways the site could be made more useful.
And to round out the action on servers, in this week's Infoworld I've written about a new product from Compact Devices called TopSpin. It is the simplest web server yet: you just plug in a SCSI CD-ROM drive and a network cable and off you go.
One small note about how to do PR right: I met the folks from Compact Devices at Interop and gave them my card. A box was in my hands within a week, and the review was accepted by Infoworld shortly thereafter. So many times I see stuff at shows and there is no follow-up. Sigh.
This week's BeHereNow award goes to a website that I've gotten some use out of lately: Greet Street. For those of us who can never seem to find the right greeting card, or worse, forget to send one in a timely fashion, go there now and schedule which cards will be sent to which people, with all sorts of customization features. It is one of the more innovative and practical commercial web sites I've seen.
This essay is composed in HTML and can be read in your browser. This is not always a simple process, and I'll be happy to help if I can. I am working on getting the MIME stuff to work, so please be patient. If you are getting this directly from me, or if someone is forwarding it to you, and you want to change that situation, let me know. Subscriptions are always free of charge.
David Strom
+1 516 944 3407
Link to this essay on our site
Back issues of Web Informant essays