Internet search tools fall into two camps: search engines, such as HotBot and AltaVista, and online directories, such as Yahoo and Lycos. The difference between the two is related to how they compile their site listings. Of course, there are exceptions to every rule. Some search utilities, such as Ask Jeeves, combine the search engine and directory approaches into a single package, hoping to provide users with the best of both worlds.
In directory-based search services, the Web site listings are compiled manually. For example, the ever-popular Yahoo dedicates staff resources to accept site suggestions from users, review and categorize them, and add them to a specific directory on the Yahoo site.
You can usually submit your Web site simply by filling out an online form. On Yahoo, for example, you'll find submission information at www.yahoo.com/docs/info/include.html. Because human intervention is necessary to process, verify, and review submission requests, expect a delay before your site secures a spot in a directory-based search service.
On the flip side, search engines completely automate the compilation process, removing the human component entirely.
A software robot, called a spider or crawler, automatically fetches sites all over the Web, reading pages and following associated links. By design, a spider will return to a site periodically to check for new pages and changes to existing pages.
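At its core, this spidering process is a simple breadth-first traversal: fetch a page, record it, extract its links, and queue any links not yet visited. The sketch below is a toy illustration of that idea, not the code any real engine uses; to keep it self-contained, the hypothetical `fetch` callable reads pages from an in-memory `web` dictionary rather than making real HTTP requests.

```python
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, limit=100):
    """Breadth-first crawl from start_url; returns a dict of pages seen."""
    seen = {start_url}
    queue = [start_url]
    index = {}
    while queue and len(index) < limit:
        url = queue.pop(0)
        html = fetch(url)
        if html is None:
            continue
        index[url] = html          # record the page in the "index"
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:  # follow links not yet visited
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

# A three-page toy "web" standing in for the real thing.
web = {
    "/home": '<a href="/about">About</a> <a href="/news">News</a>',
    "/about": '<a href="/home">Home</a>',
    "/news": '<a href="/home">Home</a>',
}
index = crawl("/home", web.get)
print(sorted(index))  # ['/about', '/home', '/news']
```

A real spider layers much more on top of this loop: politeness delays between requests, respect for robots.txt, and the periodic revisits mentioned above to catch new and changed pages.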
Results from spidering are recorded in the search engine's index, or catalog. Given the wealth of information available on the Internet, it is not surprising that indexes grow to very large sizes. For example, the AltaVista index was recently expanded to top out at 350 million pages. This may seem like a mammoth number, but by all estimates it still represents less than 35 percent of all pages on the Web.