Robots.txt is a plain text file you place on your website to tell search engine robots which pages you would prefer they not visit. Robots.txt is not compulsory for search engines, but well-behaved crawlers generally honor what they are asked not to do. It is important to understand that robots.txt is not a way of preventing search engines from crawling your site; it is only a request, not an enforcement mechanism.
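As a minimal sketch, a robots.txt file placed at the root of a site might look like the following (the /private/ path is a hypothetical example; the asterisk means the rule applies to all crawlers):

```
User-agent: *
Disallow: /private/
```

This asks every crawler to skip anything under /private/ while leaving the rest of the site open. Remember that compliance is voluntary, so sensitive content should be protected by authentication rather than robots.txt alone.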