Web Crawlers

A web crawler is a computer program or automated script that browses the World Wide Web (WWW) in a methodical, ordered manner. A crawler is also called a web spider, web robot, ant, automatic indexer, or bot.

Web crawlers are mainly used to create a copy of the pages they visit for later processing by a search engine, which indexes the downloaded pages to provide fast searches.
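As a rough illustration of this fetch-and-follow process, here is a minimal sketch in Python using only the standard library. The seed URL, page limit, and same-host restriction are assumptions made to keep the example small, not part of any particular search engine's crawler.

```python
import urllib.request
from urllib.parse import urljoin, urlparse
from html.parser import HTMLParser
from collections import deque

# Hypothetical starting point; replace with a site you are allowed to crawl.
SEED_URL = "https://example.com/"
MAX_PAGES = 20


class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags found on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed, max_pages):
    """Breadth-first crawl: fetch a page, keep a copy, queue its links."""
    seen = set()
    queue = deque([seed])
    pages = {}  # url -> downloaded HTML, ready for later indexing
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip unreachable pages
        pages[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            # Stay within the seed's host to keep the example bounded.
            if urlparse(absolute).netloc == urlparse(seed).netloc:
                queue.append(absolute)
    return pages


if __name__ == "__main__":
    copies = crawl(SEED_URL, MAX_PAGES)
    print(f"Downloaded {len(copies)} pages")
```

A real crawler would also respect robots.txt, rate-limit its requests, and hand the stored pages to an indexer rather than keeping them in memory.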

Moreover, web crawlers can be used to gather specific types of information from web pages, such as harvesting e-mail addresses. Crawlers can also automate maintenance tasks on a website, such as checking links or validating HTML code.
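For the link-checking case, a small sketch might simply issue a HEAD request for each link and report its status. The URL list below is hypothetical; in practice the links would be extracted from your own site's pages.

```python
import urllib.request
import urllib.error

# Hypothetical links to verify; replace with links harvested from your site.
LINKS = [
    "https://example.com/",
    "https://example.com/missing-page",
]


def check(url):
    """Issue a HEAD request and return the HTTP status, or an error note."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code  # e.g. 404 for a broken link
    except Exception as exc:
        return f"unreachable ({exc})"


if __name__ == "__main__":
    for link in LINKS:
        print(check(link), link)
```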

This process is called web crawling or spidering. Many legitimate sites, in particular search engines, use spidering to keep their data up to date and their searches fast.
