
How to Edit & Optimize WordPress Robots.txt File for SEO

Robots.txt is a very important file for WordPress. Have you optimized it yet?

If not, you are ignoring an essential aspect of SEO. The robots.txt file plays an important role in your WordPress website’s SEO.

You can consider yourself lucky, because WordPress automatically generates a robots.txt file for you. That means half of the battle is already done. But you must make sure the robots.txt file is properly optimized to get the full benefit.

The robots.txt file tells search engine bots which pages to crawl and which to avoid. In this article, I will cover how to edit and optimize the robots.txt file in WordPress.


What is Robots.txt File?

First, let’s understand what the robots.txt file is.

As the file extension suggests, robots.txt is a plain text file. It tells search engine bots how to crawl and index a website. Whenever a search engine bot visits your site, it first reads the robots.txt file and follows the instructions it contains. Using the robots.txt file, you can tell bots which parts of your site to crawl and which parts to avoid. However, if no robots.txt file is present, search engine bots will still crawl and index your site.
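For instance, a minimal robots.txt, served from the site root, that allows all crawling looks like this (an empty Disallow value means nothing is blocked):


User-agent: *
Disallow:
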

Editing & Understanding Robots.txt in WordPress

As I mentioned earlier, every WordPress site has a default robots.txt file in its root directory. You can check your robots.txt file simply by going to http://yourdomain.com/robots.txt. If you don’t have a robots.txt file in your root folder, you need to create one. It’s an easy task: create a text file on your computer, rename it robots.txt, and upload it to your root folder via your FTP client or the cPanel File Manager.

Now, let’s look at how to edit the robots.txt file.

You can edit your robots.txt file using an FTP client or the cPanel File Manager, but this approach is time-consuming and a bit cumbersome.

The best way is to use a plugin to edit the robots.txt file. There are many WordPress robots.txt plugins available; I recommend Yoast SEO, the best SEO plugin I have seen for WordPress. Yoast SEO lets you modify the robots.txt file from your WordPress admin panel. If you don’t want to use Yoast, there are alternatives, such as WP Robots Txt.

Once you have successfully installed and activated the Yoast SEO plugin, just go to WordPress Admin Panel > SEO > Tools.
Optimize WordPress Yoast SEO Dashboard Menu

Then simply click on “File editor”.
Tools of Yoast SEO

Then click on “Create robots.txt file”.

Robots.txt yoast seo

After that, you will see the robots.txt file editor. You can configure or customize your robots.txt file from this panel.

Editing Robots.txt

Before editing the robots.txt file, you must understand its directives. There are three main directives.


  • User-agent – Specifies which search engine bot the rules apply to, such as Googlebot or Bingbot. You can also use an asterisk (*) to refer to all search engine bots.
  • Disallow – Tells search engine bots not to crawl and index certain parts of your website.
  • Allow – Tells search engine bots which parts of your site they may crawl and index.


Here is a sample robots.txt file:


User-agent: *
Disallow: /wp-admin/
Allow: /


This robots.txt file instructs all search engine bots to crawl the website. The second line tells them not to crawl or index the /wp-admin/ section, while the third line allows them to crawl and index the rest of the website.
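To see these rules in action, here is a short sketch using Python’s standard urllib.robotparser module. The example.com domain is just a placeholder:

```python
from urllib.robotparser import RobotFileParser

# The sample robots.txt rules from above, as a list of lines
rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Any bot ("*") may crawl the homepage and regular posts...
print(parser.can_fetch("*", "https://example.com/"))           # True
print(parser.can_fetch("*", "https://example.com/my-post/"))   # True

# ...but not the admin area
print(parser.can_fetch("*", "https://example.com/wp-admin/"))  # False
```

This mirrors how a well-behaved crawler interprets the file: it finds the rule matching the requested path and obeys it.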

Configuring & Optimizing Robots.txt File for SEO

A single misconfiguration in the robots.txt file can completely deindex your site. For example, if you use the directive “Disallow: /”, search engines will stop crawling your site and eventually drop it from their index entirely. So you have to be very careful when configuring the robots.txt file.
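As a quick illustration of how destructive that one line is, the same urllib.robotparser check shows that “Disallow: /” blocks every URL on the site (example.com is again a placeholder):

```python
from urllib.robotparser import RobotFileParser

# A dangerous robots.txt that blocks the entire site
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /",
])

# Every page is now off limits to all bots
print(parser.can_fetch("*", "https://example.com/"))           # False
print(parser.can_fetch("*", "https://example.com/any-page/"))  # False
```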

Another important aspect is optimizing the robots.txt file for SEO. Before you configure or customize it, let me warn you about some bad practices to avoid:

  • Don’t use the robots.txt file to hide low-quality content. The best practice is to use noindex and nofollow meta tags, which you can add easily with the Yoast SEO plugin.
  • Don’t use the robots.txt file to stop search engines from indexing your website’s Categories, Tags, Archives, Author pages, etc. You can add noindex and nofollow meta tags to those pages with the Yoast SEO plugin.
  • Don’t use the robots.txt file to handle duplicate content on your WordPress site. There are better ways, such as canonical URLs.

Now, let’s see how to make your robots.txt file SEO friendly.

  1. First, decide which parts of your site you do not want search engine bots to crawl. I recommend disallowing /wp-admin/, /wp-content/plugins/, /readme.html, and /trackback/.
  2. Adding an “Allow: /” directive is not strictly necessary, since bots will crawl your site by default anyway. But you can use it to target a particular bot.
  3. Adding your sitemaps to the robots.txt file is also good practice.


Here is a simple example of an ideal Robots.txt file for WordPress.


User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /trackback/
Disallow: /go/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Sitemap: https://www.yourdomain.com/post-sitemap.xml
Sitemap: https://www.yourdomain.com/page-sitemap.xml


After updating your website’s robots.txt file, you should test it to check whether any content is affected by the update.

You can use Google Search Console to check for any “Error” or “Warning” in your robots.txt file. Simply log in to Google Search Console and select your site. Then go to Crawl > robots.txt Tester and click the “Submit” button.

robots.txt tester

A pop-up box will be shown. Just click on the “Submit” button.
Google Search Console

Then reload the page and check whether the file has been updated. The robots.txt file may take some time to update.

If the robots.txt file has not updated yet, you can paste your robots.txt code into the box manually to check for errors or warnings; any issues will be shown there.
Robots.txt code

If you see any errors or warnings, you must fix them by editing the robots.txt file.

Final Thoughts

I hope this article helped you optimize your WordPress robots.txt file. If you have any further questions, feel free to ask us. We will be happy to help.
If this article was helpful to you, please share it.
