A spider is a program or script that browses the World Wide Web in a systematic manner to index websites. The first web robot, the World Wide Web Wanderer, was created in 1993; today a spider may also be called a web bot, web crawler, or web robot.
Spiders are often used to gather keywords from web pages, which are then indexed so users can locate those pages through an Internet search engine. Other spiders and web robots scrape specific types of information from websites.
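To make the idea concrete, below is a minimal sketch of a crawler using only Python's standard library. It fetches pages breadth-first from a starting URL, checks the site's robots.txt before each request, stays on the starting host, and follows the links it finds. The names crawl, LinkParser, and max_pages are illustrative, not part of any particular crawler; a real search-engine spider would also extract and index keywords where the comment notes.

import urllib.request
import urllib.robotparser
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collects href values from <a> tags as a page is parsed."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, max_pages=10):
    """Breadth-first crawl from seed, honoring robots.txt."""
    robots = urllib.robotparser.RobotFileParser()
    robots.set_url(urljoin(seed, "/robots.txt"))
    robots.read()

    host = urlparse(seed).netloc
    queue = deque([seed])
    visited = set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited or not robots.can_fetch("*", url):
            continue
        visited.add(url)
        try:
            with urllib.request.urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load
        parser = LinkParser()
        parser.feed(html)
        # A search-engine spider would extract keywords from `html`
        # here for indexing; this sketch only follows the links.
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc == host:  # stay on one site
                queue.append(absolute)
    return visited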
Related terms: Bots, Crawl, Internet terms, Robot, Robots.txt, Slurp, Web page, Web scraping, Website