From Wikipedia
A Web crawler, sometimes called a spider, is an Internet bot that systematically browses the World Wide Web, typically for the purpose of Web indexing.
A Web crawler starts with a list of URLs to visit, called the seeds. As the crawler visits these URLs, it identifies the hyperlinks in each page and adds them to the list of URLs to visit, called the crawl frontier.
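As an illustrative sketch of this seed-and-frontier loop, the following Python script (standard library only) starts from a list of seed URLs, fetches each page, extracts its hyperlinks, and appends newly discovered URLs to the frontier. The seed URL, page limit, and one-second politeness delay are assumptions added for illustration, not details from the excerpt above.

    # Minimal sketch of the seed-and-frontier crawl loop described above.
    # The seed URL, max_pages limit, and politeness delay are illustrative
    # assumptions, not part of the quoted text.
    import time
    import urllib.parse
    import urllib.request
    from collections import deque
    from html.parser import HTMLParser


    class LinkExtractor(HTMLParser):
        """Collects the href targets of <a> tags found in a page."""

        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)


    def crawl(seeds, max_pages=50, delay=1.0):
        """Breadth-first crawl starting from the seed URLs."""
        frontier = deque(seeds)   # URLs still to visit (the crawl frontier)
        visited = set()           # URLs already fetched

        while frontier and len(visited) < max_pages:
            url = frontier.popleft()
            if url in visited:
                continue
            visited.add(url)

            try:
                with urllib.request.urlopen(url, timeout=10) as response:
                    html = response.read().decode("utf-8", errors="replace")
            except Exception as exc:
                print(f"skipped {url}: {exc}")
                continue

            # Identify hyperlinks in the page and add new ones to the frontier.
            parser = LinkExtractor()
            parser.feed(html)
            for link in parser.links:
                absolute = urllib.parse.urljoin(url, link)
                if absolute not in visited:
                    frontier.append(absolute)

            print(f"visited {url} ({len(parser.links)} links found)")
            time.sleep(delay)     # simple politeness delay between requests

        return visited


    if __name__ == "__main__":
        crawl(["https://example.com/"])

A real crawler would add the concerns the article goes on to discuss, such as robots.txt handling, URL normalization and deduplication, and a revisit policy; this sketch only shows the basic frontier mechanism.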