Scientists say that all of the search engines -- AltaVista,
Google, Excite, and the rest -- have absolutely MISERABLE
performance, and they are getting WORSE. Not a single
engine has indexed more than 16% of the Web's estimated 800
million pages.
Even when you combine the results of all the major search engines together,
they only cover approximately 40% of the 15 terabytes of data available on the
Web.
How are you supposed to find anything with coverage like that?! Metacrawler,
Dogpile, and the other metasearch engines aren't comprehensive enough either.
You would have to write your own search tools or something! And that would
make you CRAZY!
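For what it's worth, the core trick behind a metasearch tool like Metacrawler is simple: query several engines and merge the results, dropping duplicate URLs. Here's a minimal sketch of that merge step; the fetch functions are hypothetical stand-ins, not any engine's real API.

```python
def fetch_engine_a(query):
    # Stand-in for one engine's results (not a real API)
    return ["http://example.edu/a", "http://example.com/b"]

def fetch_engine_b(query):
    # Stand-in for another engine's results
    return ["http://example.com/b", "http://example.ca/c"]

def metasearch(query, engines):
    """Merge results from several engines, keeping first-seen order
    and discarding duplicate URLs."""
    seen = set()
    merged = []
    for engine in engines:
        for url in engine(query):
            if url not in seen:
                seen.add(url)
                merged.append(url)
    return merged

results = metasearch("web coverage", [fetch_engine_a, fetch_engine_b])
```

Of course, this only widens your reach to the union of the engines' indexes -- which, per the study, is still well under half the Web.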
When the same scientists (researchers at NEC) looked at search engines in 1997,
coverage was significantly BETTER. Last time around, some search engines
managed to cover as much as a third of the web. So basically, the web has been
growing so fast, and getting so OBESE, that the search engines have not been
able to keep up for one reason or another...
Meanwhile, HotBot, Excite, Lycos, etc., keep adding new gadgets and features,
and redesigning their interfaces and home pages.
AltaVista, for instance, recently added "paid listings" (read: bribes to skew
search results) and they redid their home page again this month. That's
another thing that the study points out -- search engines are badly BIASED in
favor of commercial stuff in the United States. They don't index stuff at
universities (.edu) or in Canada (.ca) as much. So "paid listings" are
probably not going to help matters there.
Maybe the search engines should try concentrating on COVERAGE. Barry Rubinson,
AltaVista's VP of Engineering, says their index currently contains only 150
million pages. Now AltaVista has a new "Search Freshness Guarantee,"
reminiscent of Budweiser's "Born On Dating," which promises that links are only
a month old at most. That's great, but how are they going to live up to their
stated goal of indexing the ENTIRE WEB?
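Just to put those two numbers from the article side by side -- 150 million pages indexed against an estimated 800 million on the Web:

```python
# Back-of-envelope using the figures quoted in this piece
web_pages = 800_000_000        # NEC's estimated size of the Web
altavista_pages = 150_000_000  # AltaVista's reported index size

coverage = altavista_pages / web_pages
print(f"AltaVista covers roughly {coverage:.0%} of the Web")
```

Call it just under a fifth of the Web. Fresh links are nice, but that's a long way from "the ENTIRE WEB."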