The bots are some of our best friends and a welcome sight.
They are search engine scripts that crawl the Web, checking lists of Web site URLs and indexing the material they find there. When you type a search phrase into one of the search engines (Google, Bing, Yahoo, etc.), the search engine looks through all of the pages these bots have crawled for the sites whose content best matches your phrase.
Like most things, there are 'good' bots/crawlers/spiders and some that are not so 'good'. Most bots will follow instructions in a file they look for on a Web site's server, so what the bots 'see' and how quickly they 'see' it can be controlled. I use that file to block a couple of the 'bad' bots and to limit how quickly another bot can make requests to the server. I then tailor the areas I want the bots to ignore, so they do not waste time crawling pages that are unhelpful to our search engine indexing. I want the bots to crawl contextual pages, so I block them from things like the log-in page, the lost password request form, the directory where all the member avatars are stored, etc. I also use the same file to point the bots to a compressed sitemap that is generated in the background and stored on the server, so they can find clear links to new content.
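For anyone curious, the file in question is commonly named robots.txt and sits at the root of the server. Here is a rough sketch of the kind of rules described above; the bot names, paths, and domain are made up for illustration, not copied from our actual file:

```text
# Block a misbehaving crawler entirely
User-agent: BadBot
Disallow: /

# Ask another crawler to pause 10 seconds between requests
# (honoured by some crawlers, ignored by others)
User-agent: SlowBot
Crawl-delay: 10

# Everyone else: skip the pages that are not useful to index
User-agent: *
Disallow: /login.php
Disallow: /lostpassword.php
Disallow: /avatars/

# Point the bots at the compressed sitemap
Sitemap: https://www.example.com/sitemap.xml.gz
```

Note that these rules are polite requests rather than enforcement; well-behaved crawlers follow them, but a truly 'bad' bot can ignore the file completely.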
Once the bots are somewhat herded onto the pages we want them to 'see', we then use heading tags in the code to give certain words more weight, or to make them more noticeable. It is a bit like the headings in a newspaper article, where the attention-grabbing headline is in a very large, very bold font, and the sub-headings are smaller but still bolder than the font used for the general content. If you were to see how this particular page is formatted in HTML, you would see the thread subject is the blaring headline on the page, for instance.
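To make that concrete, here is a simplified sketch of how a forum page might be structured with heading tags; the tag names are standard HTML, but the wording is invented for the example:

```html
<!-- The thread subject: the big, bold, top-level headline -->
<h1>How the search engine bots crawl our site</h1>

<!-- Sub-headings: smaller, but still bolder than body text -->
<h2>Keeping the bots on useful pages</h2>
<p>General content goes in ordinary paragraph text like this...</p>

<h2>Pointing the bots at new content</h2>
<p>...</p>
```

The search engines treat the words inside the h1 and h2 tags as more important than the surrounding paragraph text, which is why the thread subject carries so much weight.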
There are two secrets to getting search engines to rank Web sites well. One is to provide regular, new and copious content for the bots to crawl and the other is to make pages as friendly for the bots as possible. So whilst you are submitting posts to provide the content, I am often in the back room, trying to make the site more robot-friendly.