A search engine spider, also called a robot or web crawler, is a program that automatically fetches Web pages and feeds them to a search engine. It is called a spider because it crawls over the Web, following links from one page to the next.
http://www.webopedia.com/TERM/s/spider.html
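The crawling process the definition describes can be sketched briefly: start from a seed page, download it, extract its links, and repeat for each newly discovered page until some limit is reached. The Python sketch below is a minimal illustration under those assumptions; the seed URL and page limit are placeholders, and a real search engine spider would add politeness rules (robots.txt), large-scale deduplication, and many other refinements.

```python
# Minimal breadth-first crawler sketch using only the Python standard library.
# The seed URL and max_pages value are illustrative placeholders.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collect the href values of <a> tags found on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed, max_pages=10):
    """Fetch pages breadth-first starting from the seed URL."""
    queue = deque([seed])
    seen = {seed}
    pages = {}  # URL -> raw HTML, ready to hand off to a search engine indexer

    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to download

        pages[url] = html

        # Extract links and queue any new, absolute HTTP(S) URLs.
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return pages


if __name__ == "__main__":
    # "http://example.com/" is just a placeholder seed page.
    for url in crawl("http://example.com/", max_pages=5):
        print(url)
```

The key design point is the queue of unvisited URLs plus a set of already-seen ones: the queue gives the crawl its breadth-first order, and the seen-set keeps the spider from fetching the same page twice.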