How To Add a Disclaimer to WordPress Comments

If you’re looking for information on How To Add a Disclaimer to WordPress Comments, you’re probably concerned that someone or some business may try to take comments on your website literally or even sue you.

Whilst it’s true that you have responsibility for everything hosted on your website – including comments – a disclaimer goes some way towards showing that you’ve made sure people understand the comments may not be representative of your own views.

Thankfully there’s a really easy way to add a disclaimer to your comments section, and here’s how.


How To Add a Disclaimer to WordPress Comments

Here’s a step-by-step guide on How To Add a Disclaimer to WordPress Comments:

  1. Log into your WordPress website
  2. In the Dashboard, find the ‘Appearance’ tab and hover over it – then select ‘Editor’
  3. On the left hand side you’ll see a number of pages – click on the ‘Comments’ link
  4. You’ll now be presented with some code. Start a new line right above the <?php text and type your disclaimer (see the example below)
  5. Scroll down and click the ‘Update File’ button. That’s it!
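
For illustration, here’s what such a disclaimer might look like when typed above the opening <?php line of the Comments template. The wording and the ‘comment-disclaimer’ class name are only examples – adapt both to suit your theme:

    <!-- Example disclaimer for comments.php, placed above the opening
         <?php line so it prints before the comments themselves. -->
    <p class="comment-disclaimer">
      Disclaimer: Comments are the opinions of their authors and do not
      necessarily represent the views of this website or its owner.
    </p>

You can then style the paragraph from your theme’s stylesheet, for example by giving .comment-disclaimer a smaller font size or a muted colour.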



Fix ‘Googlebot for Smartphones Found an Increase in Authorisation Permission Errors’

If you’ve suddenly received a message from Google which says ‘Googlebot for smartphones found an increase in authorisation permission errors on’ your website, it means that for some reason your website has blocked access to Googlebot, so it’s unable to crawl and index part (or all) of your site. There are a few possible reasons for this, so it’s best to start with the easiest and most obvious methods first.


1. Try Renewing Your .htaccess File

(Image: an example of a successful and an unsuccessful Google Fetch and Render.)

Sometimes the .htaccess file within your hosting space can become corrupted. In such instances, it’s easy to replace the file with a new version, which will fix the error in the majority of cases.

WordPress users can do this very easily:

  • Create a new blank .htaccess file in your WordPress directory via FTP, SSH, or the control panel of your hosting account (rename the old version to .htaccessold if you want to keep a backup)
  • Change the permissions of .htaccess to 777.
  • Log in to your dashboard, navigate to ‘Settings’ and then ‘Permalinks’, and update your permalinks (choose any option there and click Save Changes).
  • When done, change the .htaccess permissions back to 644 (see the command-line sketch after this list).
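
If you have SSH access, the same steps can be run from the command line. This is only a sketch – the /var/www/html path is an assumption, so substitute your own WordPress directory:

    cd /var/www/html            # your WordPress root (illustrative path)
    mv .htaccess .htaccessold   # keep the old file as a backup
    touch .htaccess             # create a new, blank .htaccess
    chmod 777 .htaccess         # temporarily writeable by everyone
    # ...now save your permalinks in the WordPress dashboard...
    chmod 644 .htaccess         # restore safe permissions when done

Saving the permalinks makes WordPress write a fresh set of standard rewrite rules into the new file, replacing whatever was corrupted.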

Now Fetch as Google in the Search Console and see if the problem has been solved.


2. Check Permissions for Your Hosting Account Haven’t Been Changed

Occasionally, file and folder permissions are accidentally altered, which can block access to certain files on a server. You should make sure that all permissions are set correctly. If you’re not sure what to do, ask your hosting provider. As a general rule of thumb, the following permissions apply:

  • 644: Files with permissions set to 644 are readable by everyone and writeable only by the file/folder owner.

  • 755: Files with permissions set to 755 are readable and executable by everyone, but only writeable by the file/folder owner.

  • 777: Files with permissions set to 777 are readable, writeable, and executable by everyone. You shouldn’t use this set of permissions, for security reasons, on your web server unless absolutely necessary.

You can read detailed information on folder permissions at Understanding FTP File Permissions in Linux (inc CHMOD).
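
If you have SSH access and your host confirms the standard scheme above applies to your setup, you can reset everything in one pass with find. Treat this as a sketch, and run it from your WordPress root only:

    find . -type d -exec chmod 755 {} \;   # directories: 755
    find . -type f -exec chmod 644 {} \;   # files: 644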


3. WordPress user? Try disabling plugins

If you’re a WordPress user, it’s possible that a rogue plugin is causing Googlebot problems when it tries to crawl your website. Try disabling all plugins and then Fetch the website again. If Googlebot now succeeds, one of the plugins is to blame.

To find out which one, enable one plugin at a time and Fetch as Google after each. As soon as you activate the problem plugin, the fetch will fail again and you’ll know which plugin is to blame.
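
If your host provides WP-CLI, you can speed this process up from the command line. The plugin name below is just a placeholder:

    wp plugin deactivate --all       # switch everything off, then Fetch as Google
    wp plugin activate some-plugin   # re-enable one at a time, fetching after each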


4. Make sure your robots.txt file isn’t blocking access to files or folders

If it exists, a robots.txt file will tell Googlebot what it should (or should not) index within your website. Configured incorrectly, robots.txt may be the cause of your problems.

The easiest way to check whether robots.txt is blocking access to Google is to rename it to robots.old or delete it completely and then attempt to Fetch your website. If the fetch is successful, you’ve found the cause of the problem.

Note: You don’t need a robots.txt file to get your site listed on Google, so it’s safe in most instances to delete it, unless you want to block Googlebot from indexing certain folders or files. You can learn more about robots.txt and how it can help you at How To Create And Configure Your Robots.txt File.
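
As an illustration, a robots.txt containing either of the following rules would stop Googlebot reading part or all of your site (the /blog/ folder is just an example – check your own file for Disallow lines covering pages you want indexed):

    User-agent: *
    Disallow: /           # blocks every crawler from the whole site

    User-agent: Googlebot
    Disallow: /blog/      # blocks only Googlebot, and only from /blog/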


5. Check your host hasn’t changed anything

Occasionally, hosting companies make changes to security settings without telling their customers, often in response to a brute force attack or the like. If you’ve tried the above suggestions and are still having a problem, contact your hosting company.


Still having problems?

If you’ve tried the suggestions above and you’re still having problems, there may be something in your code which is blocking Google from crawling and indexing your website. Post your problem below (along with a link to your website) and we’ll see if we can help.


Fix “A description for this result is not available because of this site’s robots.txt”

If you’re searching for your website in Google and get an error saying:

A description for this result is not available because of this site’s robots.txt

you’ll need to make a quick change to your website to allow Google to index the website correctly.


What is robots.txt?

robots.txt is a small file which sits in your website’s hosting area and tells search engine spiders what to crawl (add to their index) and what to ignore. Left alone, spiders will normally crawl everything they can find; the message above appears when your robots.txt actively blocks them from reading your pages, so Google can’t show a description for them.
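
For example, this robots.txt tells every spider to skip the /private/ folder but crawl everything else (the folder name is purely illustrative):

    User-agent: *
    Disallow: /private/

The error at the top of this post usually means the Disallow line covers the whole site (a lone ‘/’) rather than a single folder.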


How to fix the problem

The simple way to fix the problem is to upload a standard robots.txt file to your website. I’ve attached a standard version to this blog.

Simply upload it to the root area of your website (i.e. where your index or home page is) and then wait for Google to re-index your website.
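
If you’d rather create the file yourself, a standard permissive robots.txt is only two lines – an empty Disallow value means nothing is blocked:

    User-agent: *
    Disallow:

Once it’s in place, the ‘description not available’ message will disappear the next time Google re-crawls your pages.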


Please support this blog

Updating this blog costs me time and money for website hosting. If you’ve been helped by this blog or saved some money thanks to my work, please consider making a donation to support it via Paypal. Even the smallest amount will help. Thank you.

Donate £1
Donate £5
Donate £10
Donate £15
Donate £20
Donate any other amount

Paypal won’t tell me your email address if you do decide to donate, so if you don’t hear back from me please accept my thanks in advance.