If you’re searching for your website in Google and get an error saying:
“A description for this result is not available because of this site’s robots.txt”
you’ll need to make a quick change to your website to allow Google to index the website correctly.
What is robots.txt?
robots.txt is a small text file which sits in the root of your website hosting area and tells search engine spiders what to crawl (add to their index) and what to ignore. By default, spiders will crawl everything they can find; the error above appears when your robots.txt file is actively blocking them from crawling your pages, so Google can list the URL but can’t show a description for it.
How to fix the problem
The simple way to fix the problem is to upload a standard robots.txt file to your website. I’ve attached a standard version to this blog.
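For reference, a standard permissive robots.txt — one that allows all crawlers to index everything — looks like this:

```text
# Allow all crawlers to index the whole site
User-agent: *
Disallow:
```

Note that an empty Disallow line means “block nothing”; by contrast, `Disallow: /` would block the entire site, which is a common cause of the error above.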
Simply upload it to the root area of your website (i.e. the same folder as your index or home page) and then wait for Google to re-crawl and re-index your website.