How to edit a Robots.txt file in WordPress

This week, we tackle yet another query that beginner WP users have – how to edit a robots.txt file in WordPress. The reason behind this is simple enough – it benefits your SEO efforts. A robots.txt file is an incredibly powerful SEO tool because it works as a guide for search engine crawl bots. That is why WordPress experts recognize its importance and know how best to implement it when optimizing their sites. Here is our step-by-step take on the entire topic.

To edit a robots.txt file in WordPress, you only need a simple guide.

We begin with a more detailed definition of the robots.txt file

Search engine bots need instructions on how to crawl and index the inner pages of your site. That is where the robots.txt file comes into play. It tells bots which URLs they may and may not crawl, making your site easier for them to process.

You can usually find the robots.txt file in the root directory of your site, also known as the main folder of your entire website. Before we get into how to edit a robots.txt file in WordPress, we first need to learn how to recognize it. So, here is an example of how a typical robots.txt file looks:

User-agent: [user-agent name]
Disallow: [URL string not to be crawled]

User-agent: [user-agent name]
Allow: [URL string to be crawled]

Sitemap: [URL of your XML Sitemap]

The file itself offers plenty of room for customization, such as allowing/disallowing particular URLs or adding multiple sitemap files. So, if you have a URL that you do not want search engine bots to crawl, you need to disallow it through robots.txt. Here is an example of allowing/disallowing search engine bots to crawl particular URLs:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml

Lastly, we add our sitemap to the mix, making the URL visible to search engine bots.
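If you want to check how such rules are interpreted before deploying them, Python's standard `urllib.robotparser` module evaluates Allow/Disallow lines much like compliant crawlers do. This is a minimal sketch using the hypothetical example.com rules shown above:

```python
from urllib import robotparser

# The same hypothetical rules shown in the example above.
rules = """\
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Uploads are explicitly allowed; the admin area is not.
print(parser.can_fetch("*", "https://example.com/wp-content/uploads/logo.png"))  # True
print(parser.can_fetch("*", "https://example.com/wp-admin/options.php"))         # False
```

This is handy for sanity-checking a draft file locally before uploading it to your site.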

The importance of having a robots.txt file in WordPress

The important thing to understand is that even if you don’t have a robots.txt file, search engine bots will still crawl your pages. However, they will crawl all of them; you will be unable to control which pages they should and shouldn’t crawl.

For fresh WP websites, this might not present too big of a problem. However, for sites with a lot of content, you will want better control over how it is crawled and indexed. A robots.txt file provides a proper way of managing how search engine bots see your WordPress website.

Why is this so important, to begin with?

Each website has a crawl quota (also called a crawl budget) – that is how search engine bots function. Bots can only crawl a limited number of pages per session; if they fail to finish crawling all your pages during one session, they will resume crawling the next time around. This can significantly slow down the indexing of your site.

However, when you edit a robots.txt file in WordPress to disallow certain pages from crawling, you save your crawl quota. The pages WP users most commonly disallow include WordPress admin pages, plugins, and themes. Once you eliminate these pages from crawling, you give crawlers more room to index the relevant pages on your site.

What should the perfect robots.txt file in WordPress look like?

Popular blog sites opt for a simple form of the robots.txt file – one that varies depending on the needs of the site:

User-agent: *
Disallow:

Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml

This example allows bots to index all content and gives them links to the website’s XML sitemaps in the process.

However, for WordPress websites, we recommend the following form:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml

Here, you can clearly see that bots are allowed to crawl all WP images and uploaded files. However, the file also disallows search engine bots from crawling the following:

  • WP plugin files
  • WP admin area
  • WP readme file
  • Affiliate links

The reason you should add a link to your XML sitemap is to make it easier for bots to find all the pages on your site.

How to create and edit a robots.txt file in WordPress

As with most things in WordPress, there are several solutions for each task. In fact, there are two ways to create and edit a robots.txt file in WordPress:

Solution #1: Using Yoast SEO to edit a robots.txt file in WordPress

If you are using the Yoast SEO plugin, you are in luck! The plugin comes with a robots.txt file generator, and you can use it to create and edit the robots.txt file directly from your admin dashboard.

All you have to do is go to the SEO >> Tools page as an Admin. Once there, click on the File Editor link. The page it leads to should contain your existing robots.txt file. In case it doesn’t, Yoast SEO will generate a file for you. Some versions of Yoast SEO generate a default file in the following form:

User-agent: *
Disallow: /

Once this happens, make sure to delete this text, because it will otherwise block all bots from crawling your website. Once you remove the default text, input your own version of the robots.txt script. You can use the template we shared earlier.
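To see why that default file is so dangerous, you can feed it to Python's standard `urllib.robotparser` module: with `Disallow: /` in place, every URL on the site is off-limits to compliant bots. A small sketch, using hypothetical example.com URLs:

```python
from urllib import robotparser

# The default file some Yoast SEO versions generate.
default_rules = """\
User-agent: *
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(default_rules.splitlines())

# Every page, including the homepage, is blocked from crawling.
print(parser.can_fetch("*", "https://example.com/"))            # False
print(parser.can_fetch("*", "https://example.com/blog/post/"))  # False
```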

Once you finish editing the file, simply click on ‘Save robots.txt file’ to store your changes.

Solution #2: Using FTP to manually edit a robots.txt file in WordPress

This particular approach requires an FTP client to edit the file. So, the first thing you want to do is use an FTP client to connect to your WordPress hosting account. Once connected, you will find the robots.txt file in the root folder of your website.

If you are unable to find a robots.txt file, then your WP site most likely doesn’t have one. In that case, you can go ahead and create one. Because robots.txt is a plain text file, you can simply download it and edit it in any plain text editor such as Notepad or TextEdit. Once you finish editing, upload the file back to your website’s root folder.
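Since the file is plain text, you can also generate it with a short script before uploading it over FTP. This is a minimal sketch that writes the recommended rules from earlier to a local robots.txt file; the example.com sitemap URL is a placeholder for your own:

```python
from pathlib import Path

# The recommended rules from earlier; swap in your own sitemap URL.
rules = "\n".join([
    "User-Agent: *",
    "Allow: /wp-content/uploads/",
    "Disallow: /wp-content/plugins/",
    "Disallow: /wp-admin/",
    "Disallow: /readme.html",
    "Disallow: /refer/",
    "",
    "Sitemap: https://example.com/sitemap_index.xml",
])

# Write the file locally, then upload it to the site's root folder.
Path("robots.txt").write_text(rules + "\n", encoding="utf-8")
```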

Once you edit a robots.txt file in WordPress, make sure to test it

Testing is always the next logical step to follow editing. And this is particularly important in cases that involve robots.txt, for reasons we mentioned previously. Not to worry – there are tools that you can use to test this. The one we recommend most is Google Search Console.

Create an account if you don’t have one and, once logged in, switch to the old Google Search Console interface. Once you have the old interface in front of you, launch the robots.txt tester, located under the ‘Crawl’ menu. The tool will automatically pull up your website’s robots.txt file and highlight any errors it finds.

Final Thoughts

In conclusion, a robots.txt file is a very important tool for furthering your SEO strategy. Hence, learning how to create and edit a robots.txt file in WordPress is definitely a step forward. For any additional questions or tutorials related to WordPress, visit our blog or contact us directly.

Let us provide you with a tailored maintenance plan for your website