A web crawler is a computer program that browses the World Wide Web in a methodical, automated manner. Many search engines use this process, also called spidering, to keep their data up to date. Web crawlers are mainly used to create a copy of every visited page for later processing by a search engine, which indexes the downloaded pages to provide fast searches.
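As a rough illustration of this crawl-and-index loop, the Python sketch below downloads pages breadth-first from a hypothetical START_URL, keeps a copy of each visited page, and queues the links it finds. It is a minimal sketch only: robots.txt handling, politeness delays, and the actual indexing step are omitted.

```python
# Minimal crawl sketch: fetch a page, store its HTML, follow same-site links.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

START_URL = "https://example.com/"   # hypothetical starting point
MAX_PAGES = 50                       # keep the sketch bounded

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=MAX_PAGES):
    """Breadth-first crawl that stores each visited page's HTML for later indexing."""
    seen, queue, pages = {start_url}, deque([start_url]), {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue                      # skip unreachable pages
        pages[url] = html                 # the "copy of every visited page"
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            # stay on the same host and avoid revisiting pages
            if urlparse(absolute).netloc == urlparse(start_url).netloc and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return pages

if __name__ == "__main__":
    downloaded = crawl(START_URL)
    print(f"Downloaded {len(downloaded)} pages")
```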
DeepTrawl is primarily a link checker for websites, with which a user can easily find and correct bad URLs on a site. Beyond URL checking, it offers additional features such as a spell checker and an HTML error checker.
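To make the link-checking idea concrete, here is a small sketch of the general technique, not DeepTrawl's actual implementation: each URL is requested and flagged if the request fails or returns an HTTP error status. The example URLs are hypothetical.

```python
# Generic link-checking sketch: flag URLs that fail or return an error status.
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

def check_links(urls):
    """Return a list of (url, problem) pairs for links that appear broken."""
    bad = []
    for url in urls:
        try:
            req = Request(url, method="HEAD")   # HEAD avoids downloading the body
            with urlopen(req, timeout=10) as resp:
                if resp.status >= 400:
                    bad.append((url, f"HTTP {resp.status}"))
        except HTTPError as err:
            bad.append((url, f"HTTP {err.code}"))
        except (URLError, OSError) as err:
            bad.append((url, str(err)))
    return bad

if __name__ == "__main__":
    # hypothetical URLs for illustration
    for url, problem in check_links(["https://example.com/", "https://example.com/missing"]):
        print(f"BROKEN: {url} ({problem})")
```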