A bot, short for robot, is simply a program that runs on a web server. Bots are also called spiders or crawlers when they visit web sites to gather information; by visiting millions of sites in a short time, they can collect a great deal of data.
Search engines use spiders to gather information about web pages in a process called indexing. The information is added to the search engine's index, which is consulted whenever people search for a term. When someone submits a web site to a search engine, they are asking for that site to be indexed.
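The core of a spider's job — reading a page's HTML and recording the links it should visit next — can be sketched with Python's standard library. This is only an illustrative sketch (the `LinkExtractor` class and the sample page are made up for this example, not any search engine's actual code), but it shows the mechanism:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets so a crawler knows which pages to visit next."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Every <a href="..."> on the page is a candidate for the crawl queue.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A tiny sample page standing in for one fetched from the web.
page = '<html><body><a href="/about">About</a> <a href="https://example.com">Home</a></body></html>'

parser = LinkExtractor()
parser.feed(page)
print(parser.links)
```

A real spider would repeat this loop: fetch a URL, extract its links, add the new ones to a queue, and continue until the queue is empty or a limit is reached.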