…or, the difference between 80% uptime and 99.78% uptime.
Last week, a badly worded NPD report led various web sites to say that NPD had pegged Casual Gaming as a $52 million industry (in fact, NPD was only talking about subscription revenue – relatively small potatoes within this biz).
Here’s another example, made a little worse by the fact that the bad wording is repeated by Robert Scoble, who should know better.
The Yankee Group released a report stating that Windows Server 2003 had “nearly 20% more annual uptime” than Red Hat Enterprise Linux. This was then reported by Yahoo, and in turn picked up by Scoble.
Scoble, at least, should know better. Claiming that Windows Server had 20% more uptime than Linux implies the average Linux server was down about 20% of the time, roughly 73 days a year, or nearly 5 hours a day. That’s ludicrous, and anybody who knows anything about tech should know it.
The brief report synopsis on the Yankee site does not clarify things any further, but the Yahoo article contains more details, presumably from the full report. Further down in the article, it states that the various servers (Unix servers were also tested), “experienced 3 to 5 failures per server per year in 2005, generating 10 to 19.5 hours of annual downtime for each server”.
So I think it’s safe to assume that all the articles should have stated that Linux had 20% more downtime (annoying, but not alarming), rather than 20% less uptime.
i.e. With 24 × 365 = 8760 hours in a year, the worst case is that Linux servers averaged about (8760 − 19.5) / 8760 ≈ 99.78% uptime, versus approximately 99.82% uptime for Windows 2003 servers. Rather less alarming, and rather less worthy of Scoble’s trumpeting post.
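The arithmetic is easy to sanity-check. Here’s a quick Python sketch: the 19.5 hours/year worst-case downtime comes from the Yahoo article, and treating Windows downtime as Linux’s divided by 1.2 is my assumption about how the “20% more downtime” comparison applies (taking 19.5 × 0.8 instead gives a nearly identical result, in the 99.81–99.82% range):

```python
# Sanity-check the uptime percentages implied by the report's figures.
HOURS_PER_YEAR = 24 * 365  # 8760 hours

linux_downtime = 19.5                    # worst case from the article, hours/year
windows_downtime = linux_downtime / 1.2  # assumes Linux has 20% MORE downtime

linux_uptime = (HOURS_PER_YEAR - linux_downtime) / HOURS_PER_YEAR
windows_uptime = (HOURS_PER_YEAR - windows_downtime) / HOURS_PER_YEAR

print(f"Linux:   {linux_uptime:.2%}")    # ~99.78%
print(f"Windows: {windows_uptime:.2%}")  # ~99.81%
```

Either way, the gap between the two platforms is a few hundredths of a percentage point, not twenty points.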