Test Your Robots.txt File

Kyle Van Deusen
June 29, 2020


Your robots.txt file tells web crawlers which parts of your website they are allowed to crawl.

During the development of your website, you likely turned off any options for indexing your website so Google (and other search engines) wouldn’t index it before you were ready.

People will most often do this from the WordPress dashboard under ‘Settings’ → ‘Reading’ → ‘Discourage search engines from indexing this site’.

Of course, you’ll want that turned off when it’s time to launch your website, and checking what your robots.txt file says will tell you whether you’re still blocking crawlers or not.

How can I see my robots.txt file?

The easiest way to get to your robots.txt file is to type your root domain into the address bar, followed by ‘/robots.txt’. Here’s what a typical robots.txt file looks like for reference:

[Image: Typical robots.txt file]
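If you’d rather check from a script than a browser, the sketch below fetches the file with Python’s standard library. The domain is a placeholder — swap in your own root domain:

```python
from urllib.request import urlopen

# Placeholder domain - replace with your own root domain
url = "https://example.com/robots.txt"

try:
    with urlopen(url, timeout=10) as resp:
        # Print the raw contents of the robots.txt file
        print(resp.read().decode("utf-8"))
except OSError as err:  # covers HTTP errors, DNS failures, and timeouts
    print(f"Could not fetch {url}: {err}")
```

If the request comes back with a 404, most crawlers will treat your site as fully crawlable, so a missing file usually isn’t a problem by itself.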

How do I edit my robots.txt file?

You have a couple of ways to edit your robots.txt file if you need to.

First is with your SEO plugin.

Most SEO plugins (free and paid) allow you to edit your robots.txt in an easy way right from your WordPress dashboard. If you can’t find it in your plugin, try searching your SEO plugin’s documentation.

The other option is to go in and edit the file in your hosting file manager or via FTP.

If you are comfortable, you can jump into the files on your server and open and edit the robots.txt file by hand. The robots.txt file is located in the root directory of your domain.

What should be in my robots.txt file?

This ultimately depends on what you want to have indexed on your website and what you do not.

Here’s a brief overview of how the robots.txt file works (using the example from above).

The ‘User-agent:’ line in the example is marked with an asterisk (*), saying that the following rules apply to any crawlers.

The ‘Disallow:’ line is telling the user agent to not visit ‘/wp-admin/’ so that it does not crawl the back-end of your website.

The ‘Allow:’ line in the example carves out one exception: a specific file inside the ‘/wp-admin/’ directory we just told crawlers not to visit.

Some plugins or features of your website might add more lines than the ones in this example, but typically you are safe to use this default (which comes standard in WordPress):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
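You can sanity-check rules like these with Python’s built-in robotparser module. A minimal sketch, using placeholder URLs:

```python
from urllib.robotparser import RobotFileParser

# The standard WordPress rules from above
rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The back-end is blocked for every crawler...
print(rp.can_fetch("*", "https://example.com/wp-admin/"))    # False
# ...while ordinary pages stay crawlable (placeholder URL)
print(rp.can_fetch("*", "https://example.com/blog/hello/"))  # True
```

One caveat: the standard-library parser applies rules in file order (first match wins), so it won’t honor the ‘Allow:’ exception the way Google’s longest-match rule does. It’s fine for checking what is blocked, but treat edge cases like ‘admin-ajax.php’ with care.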

If you want to restrict certain pages from being crawled, you can list each one on its own ‘Disallow:’ line. However, this is most commonly done with your SEO plugin by giving a specific page a ‘noindex’ tag.
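For example, to keep crawlers out of a hypothetical ‘/thank-you/’ page (the path here is made up for illustration), the file might look like this:

User-agent: *
Disallow: /wp-admin/
Disallow: /thank-you/
Allow: /wp-admin/admin-ajax.php

Keep in mind that a ‘Disallow:’ rule only blocks crawling — a page can still appear in search results if other sites link to it, which is why the ‘noindex’ tag is usually the more reliable option.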

