If you’ve suddenly received a message from Google saying ‘Googlebot for smartphones found an increase in authorisation permission errors on’ your website, it means that for some reason your website has blocked access to Googlebot, so it’s unable to crawl and index part (or all) of your site. There are a few possible causes, so it’s best to start with the easiest and most obvious fixes first.
1. Try Renewing Your .htaccess File
Sometimes the .htaccess file within your hosting account can become corrupted. In such instances, it’s easy to replace the file with a new version, which will fix the error in the majority of cases.
WordPress users can do this very easily:
- Create a new blank .htaccess file in your WordPress directory via FTP or SSH, or by logging in to the control panel of your hosting account (rename the old version .htaccessold if you want to back it up).
- Change the permissions of .htaccess to 777.
- Log in to your dashboard, navigate to Settings > Permalinks, and update your permalinks (choose any option there and select Save Changes).
- When done, change the .htaccess permissions back to 644.
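For those comfortable on the command line, the file-handling steps above can be sketched over SSH roughly as follows (this assumes you’re in your WordPress root directory; adjust paths to suit your host):

```shell
# Back up the existing .htaccess, if there is one, as .htaccessold
if [ -f .htaccess ]; then mv .htaccess .htaccessold; fi

# Create a fresh, blank .htaccess
touch .htaccess

# Make it temporarily writeable so WordPress can regenerate the rewrite rules
chmod 777 .htaccess

# ...now log in to the dashboard and re-save your permalinks...

# When done, restore safe permissions
chmod 644 .htaccess
```

The dashboard step in the middle is still manual: re-saving permalinks is what prompts WordPress to write fresh rules into the new file.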
Now Fetch as Google in the Search Console and see if the problem has been solved.
2. Check Permissions for Your Hosting Account Haven’t Been Changed
Occasionally, file and folder permissions are accidentally altered, which can block access to certain files on a server. Make sure that all permissions are set correctly; if you’re not sure what to do, ask your hosting provider. As a general rule of thumb, the following permissions apply:
644: Files with permissions set to 644 are readable by everyone and writeable only by the file/folder owner.
755: Files with permissions set to 755 are readable and executable by everyone, but only writeable by the file/folder owner.
777: Files with permissions set to 777 are readable, writeable, and executable by everyone. For security reasons, you shouldn’t use this set of permissions on your web server unless absolutely necessary.
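If you have shell access, you can check and set these permissions with chmod; a quick sketch (the file and folder names here are just examples):

```shell
# A typical file: readable by everyone, writeable only by the owner
touch example.php
chmod 644 example.php
stat -c '%a' example.php     # prints 644 on GNU/Linux

# A typical directory: needs the execute bit so its contents can be listed
mkdir -p example-dir
chmod 755 example-dir

# Flag any world-writeable (777) files that may be worth tightening up
find . -type f -perm 777
```

If you only have FTP access, most FTP clients expose the same numbers through a “File Permissions” or “CHMOD” dialog.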
You can read detailed information on folder permissions at Understanding FTP File Permissions in Linux (inc CHMOD).
3. WordPress user? Try disabling plugins
If you’re a WordPress user, it’s possible that a rogue plugin is causing Googlebot problems when it tries to crawl your website. Try disabling all plugins and then Fetch the website. If the problem has been solved and Googlebot succeeds, there’s a problem with a plugin.
To find out which one, enable one plugin at a time and then Fetch as Google each time. When the fetch starts failing again, the plugin you just activated is the one to blame.
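If you have shell access and WP-CLI installed (an assumption; it isn’t part of a default WordPress install), the enable-one-at-a-time process can be semi-automated:

```shell
# Deactivate every plugin, then Fetch as Google to confirm the site crawls
wp plugin deactivate --all

# Re-enable plugins one at a time, pausing so you can Fetch after each one
for plugin in $(wp plugin list --field=name); do
  wp plugin activate "$plugin"
  echo "Activated $plugin - run Fetch as Google, then press Enter to continue"
  read -r
done
```

The first plugin whose activation makes the fetch fail again is the culprit.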
4. Make sure your robots.txt file isn’t blocking access to files or folders
If it exists, a robots.txt file will tell Googlebot what it should (or should not) index within your website. Configured incorrectly, robots.txt may be the cause of your problems.
The easiest way to check whether robots.txt is blocking Google’s access is to rename it to robots.old or delete it completely and then attempt to Fetch your website. If the fetch is successful, you’ve found the cause of the problem.
Note: You don’t need a robots.txt file to get your site listed on Google, so it’s safe in most instances to delete it, unless you want to block Googlebot from indexing certain folders or files. You can learn more about robots.txt and how it can help you at How To Create And Configure Your Robots.txt File.
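As an example of how easily this can go wrong, a robots.txt containing the following (note the single slash) blocks every crawler, Googlebot included, from the entire site:

```
User-agent: *
Disallow: /
```

By contrast, an empty Disallow line (`Disallow:` with nothing after it) permits crawling of everything, so double-check any Disallow rules before blaming other settings.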
5. Check your host hasn’t changed anything
Occasionally, hosting companies make changes to security settings without telling their customers, often in response to a brute-force attack or the like. If you’ve tried the above suggestions and are still having a problem, contact your hosting company.
Still having problems?
If you’ve tried the suggestions above and you’re still having problems, there may be something in your code which is blocking Google from accessing and indexing your website. Post your problem below (along with a link to your website) and we’ll see if we can help.