Robots.txt file in WordPress


Most WordPress websites have a robots.txt file, but few people know what this file actually is and does. Because it is an important part of your WordPress website, today we focus on this small text file, also known as the Robots Exclusion Protocol.

What is the Robots.txt file?

The robots file is a .txt file in the root directory of your website, with which you communicate your WordPress website's rules of engagement to search engines. Search engine robots use this file to find out which parts of your website they are allowed to crawl. This may sound like some kind of hacking, but these robots have no bad intentions: the bots crawl the web and, among other things, help search engines index your WordPress website.

Why use Robots.txt?

If these bots are, in general, good for the internet, why would you need robots.txt? Well, with the robots.txt file you can exclude certain areas of your WordPress website from crawling, such as the admin area or a folder containing customer data. That way you only send the search engine robots to the pages that matter, and unwanted pages do not end up in the search results.

Better indexing

Using robots.txt thus saves the crawl bots a lot of time and prevents unnecessary pages from being indexed. This is essential for your WordPress website's position in the search results.

Check your robots.txt file

Make sure there are no errors in your robots.txt file! During our SEO audits we unfortunately come across incorrect directives on a regular basis. These errors can unintentionally block the crawl bots, which in turn can mean that fewer visitors find your WordPress website. Check what your robots.txt file currently looks like via the following link: ””. Replace ‘yourdomainname’ with the domain name of your website.

How to use Robots.txt in WordPress?

You know best how to adapt the robots.txt file to your WordPress website: you have the best feel for your site and know which pages may and may not be crawled. But how do you specify that in the robots.txt file? Our WordPress SEO specialists have put together a step-by-step list. Can’t figure it out? Please contact us.

  • The structure and standard rules of robots.txt

    The content of a robots.txt file of course looks different for every website, but the structure is always as follows:

    1 | User-agent: [name of the user-agent]
    2 | Disallow: [URL string that may not be crawled]

    Line 1 indicates which robot the rule applies to; with an asterisk (*) you indicate that it applies to all robots. Line 2 states which URLs those robots may not crawl. In such a robots.txt file you can also specify multiple URLs for a specific bot. See some examples below:

  • Block all content for robots

    1 | User-agent: *
    2 | Disallow: /

    Beware: with this rule there is a good chance that your WordPress website will disappear from Google in the short term.

  • Allow all content for all robots

    1 | User-agent: *
    2 | Disallow:

    Because nothing follows ‘Disallow:’, you give all robots access to crawl all content on your WordPress website. Please note: duplicate content will then also be indexed, and Google is not in favor of this.

  • Block a specific robot from a specific part of the website

    1 | User-agent: Googlebot
    2 | Disallow: /example-subfolder/

    With this rule you indicate that Google’s robot is not allowed to look in the folder mentioned.
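You can verify rules like the ones above without touching your live site, using Python's standard-library `urllib.robotparser`. This is a small sketch that parses the last example directly; the page paths are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# The example rules from above: Googlebot may not crawl /example-subfolder/.
rules = [
    "User-agent: Googlebot",
    "Disallow: /example-subfolder/",
]

parser = RobotFileParser()
parser.parse(rules)  # against a live site you would use set_url(...) and read()

# Googlebot is blocked from the subfolder, but other paths stay crawlable,
# and bots not named in the file are unaffected.
print(parser.can_fetch("Googlebot", "/example-subfolder/page.html"))  # False
print(parser.can_fetch("Googlebot", "/blog/"))                        # True
print(parser.can_fetch("Bingbot", "/example-subfolder/page.html"))    # True
```

This is a quick way to sanity-check a draft robots.txt before uploading it, in addition to the tester in Google Search Console.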

Customizing robots.txt in WordPress

Do you want to modify your WordPress website’s existing robots.txt file? You can do this via FTP: manually add your wishes to the file. It is also possible to use the Yoast SEO plugin; chances are you already use it for other SEO activities. Go to the ‘SEO’ menu in WordPress and choose ‘Tools’. Click the ‘File editor’ link, and you can adjust the robots.txt file directly. Don’t forget to test your adjustments afterwards! This can be done, for example, with the Google Search Console tool.
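As an illustration, a commonly used starting point for a WordPress robots.txt looks like the fragment below. It keeps crawlers out of the admin area while still allowing admin-ajax.php, which front-end features may depend on. Treat it as a sketch: the sitemap URL is a placeholder, and you should adapt the paths to your own site.

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap_index.xml
```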

Customization is not a guarantee

Finally, we would like to point out that the robots.txt file is a request to the robots, not a guarantee. Most search engines respect the rules in robots.txt files, but you do not fully protect your data with this file.

Need help? Get in touch!

Do you suspect that something is not right with your robots.txt file, can’t you figure it out, or would you like to outsource this part? Feel free to contact us! Start a live chat or call us on 030 20 72 488. Our WordPress specialists are ready to help you. Don’t want to worry about these technical matters at all? Then take a look at our WordPress maintenance packages, so that you no longer have to worry about your WordPress website in the future.

© 2011 - 2024 | All rights reserved | WordPress Maintenance is part of Artitex