3 Free Basics to Include in Robots.txt and How It Helps SEO
Take advantage of this little-known file for some easy search engine optimization!
What is Robots.txt?
At the root of a website sits a plain-text file that tells search engines which parts of the site they may crawl. The reason a robots.txt file exists is to tell Google, Bing, and other search engines which content should not be crawled; well-behaved crawlers skip that content when visiting your website. (Note that robots.txt controls crawling, not indexing: a blocked page can still appear in search results if other sites link to it.)
How to find a robots.txt file
Whenever a website is built, the designer or developer should add a robots.txt file. To check whether this was done, type this URL into your browser's address bar: domain.com/robots.txt. You should see a short plain-text file with a few directives.
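For example, a typical WordPress site's robots.txt looks something like the sketch below; the paths and sitemap URL are placeholders that will differ on your site:

```txt
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://domain.com/sitemap_index.xml
```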
3 things to include in a Robots.txt file
A Sitemap directive is essential to include in a robots.txt file, either at the beginning or the end, because it tells search engines where to find your sitemap. If a sitemap has already been submitted through Google Search Console, then Google already knows about it, but other search engines use the robots.txt reference as well.
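The directive is a single line, and you can list more than one if you have multiple sitemaps; the URLs below are placeholders for your own sitemap locations:

```txt
Sitemap: https://domain.com/sitemap.xml
Sitemap: https://domain.com/video-sitemap.xml
```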
A Disallow rule blocks crawlers from specific paths. On WordPress sites, for example, it is common to disallow the /wp-admin/ pages and their files.
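You can sanity-check a Disallow rule with Python's standard-library `urllib.robotparser`; the rules and URLs below are hypothetical stand-ins for your own:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules blocking the WordPress admin area.
rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
]

rp = RobotFileParser()
rp.parse(rules)

# The admin area is blocked; regular content is not.
print(rp.can_fetch("*", "https://domain.com/wp-admin/options.php"))  # False
print(rp.can_fetch("*", "https://domain.com/blog/"))                 # True
```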
Each crawler identifies itself with a different user agent. Because there are hundreds of crawlers, we suggest using the wildcard (*) so that your rules apply to all of them.
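A sketch of how user-agent groups work; Googlebot is a real user agent, but the paths here are placeholders:

```txt
# Rules for every crawler
User-agent: *
Disallow: /staging/

# Rules that apply only to Google's crawler
User-agent: Googlebot
Disallow: /experiments/
```

Note that most crawlers follow only the most specific group that matches them, so in this sketch Googlebot would obey its own block rather than the wildcard rules.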
Do you need a robots.txt file?
Everyone should have a robots.txt file. Having one gives the website owner control over where and how search engines navigate the site. It is especially important to have one if you have the following:
- Duplicate pages
- Staging sites
- Submission forms
- Images and videos
Remember, you are in control of what you want search engines to crawl. Each website has a crawl budget; by disallowing low-value pages, you save your site's crawl budget. (Note that the noindex directive belongs in a page's meta tags, not in robots.txt, where Google no longer supports it.)
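If the goal is to keep a page out of the index rather than to save crawl budget, a meta robots tag on the page itself is the right tool; a minimal sketch:

```html
<!-- Placed in the page's <head>. The crawler must be able to
     fetch the page to see this tag, so do not also block the
     page in robots.txt. -->
<meta name="robots" content="noindex, follow">
```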
How to change a Robots.txt file
In past blogs, we have mentioned how great the Rank Math plugin is. We recommend this plugin because of all the functions it offers. Inside Rank Math's general settings, you will see an option labeled "Edit robots.txt"; this is where you can make changes to the file.
Auditing Robots.txt for errors
The easiest way to check your robots.txt file for errors is the "Coverage" section of the Google Search Console report. If an indexing issue is caused by robots.txt, the report will state "Blocked by robots.txt." Some of the recommendations Google Search Console makes need to be carried out by a web developer who knows how to read a PHP file.
Having a clear outline of how you want search engines to crawl your website is vital for all SEO professionals. Submit a form on our contact page if you are interested in seeing if your robots.txt file is correct. We are available to answer any digital marketing questions you may have.