Sunday, March 22, 2009

What is a web crawler, and how does it index pages?

A web crawler, or spider, is a computer program that browses the World Wide Web in a systematic, automated way. Other common names include worms, bots, ants, and web robots; the process itself is called web spidering. Search engines rely on crawlers to keep their results up to date: a crawler downloads copies of web pages, and the search engine then processes and indexes those downloaded pages so that searches can be answered quickly.

Most crawlers run automatically and also perform maintenance tasks. A crawler parses the HTML of each page it visits, extracts the links and the content, and records them in an index so that results can be served when someone searches. A crawler typically starts from a list of seed URLs. When it visits a URL, it finds the hyperlinks on that page, checks for new content, and adds any links it has not seen before to the list of pages to visit next; this list is called the crawl frontier.
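The crawl loop described above, with a frontier of URLs to visit and a set of pages already seen, can be sketched in a few lines of Python. This is a minimal illustration, not a production crawler: the `fetch` callback, the `PAGES` dictionary standing in for the web, and the page names are all invented for the example, and real crawlers would add politeness delays, robots.txt checks, and URL normalization.

```python
from collections import deque
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect the href value of every anchor tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed, fetch, limit=100):
    """Breadth-first crawl starting from `seed`.

    `fetch(url)` returns the page's HTML (or None on failure).
    The deque is the crawl frontier; `seen` prevents revisiting
    pages; `index` stands in for the search engine's index.
    """
    frontier = deque([seed])
    seen = {seed}
    index = {}
    while frontier and len(index) < limit:
        url = frontier.popleft()
        html = fetch(url)
        if html is None:
            continue
        index[url] = html  # "index" the downloaded page
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            if link not in seen:  # only enqueue new URLs
                seen.add(link)
                frontier.append(link)
    return index


# Tiny in-memory "web" so the sketch runs without a network;
# the page names here are purely illustrative.
PAGES = {
    "page-a": '<a href="page-b">B</a><a href="page-c">C</a>',
    "page-b": '<a href="page-a">back to A</a>',
    "page-c": "no links here",
}

result = crawl("page-a", PAGES.get)
```

Starting from `page-a`, the crawler discovers and indexes all three pages exactly once, even though `page-b` links back to `page-a`; the `seen` set is what keeps the loop from revisiting it.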
