Blogger Custom Robots.txt File

Adding a custom robots.txt file in your Blogger settings is one of the most important steps for SEO. A well-configured file helps search engine bots find and crawl your pages properly and regularly, which can improve your rankings. If your blog has no customized, SEO-friendly robots file, you are handicapping yourself: crawlers do not read your whole blog at once, they look for a good robots.txt file that points to an optimized blog sitemap.

Add Blogger Custom Robots.txt

Many newbie bloggers ask how to add a custom robots.txt file in Blogspot. It is the right question, and it deserves a proper answer.

But what are the actual benefits of having this important file?

– There are several benefits to having this customized file. You can tell robots which links to follow and which blocked links to ignore, and you can include your Blogger sitemap here, which helps every search engine discover your whole blog at once.

You can also block crawlers from specific pages or posts. But if you set it up incorrectly or make mistakes, it can harm your blog's SEO health. So be careful, and follow the tips and tricks below for the best result.

What is Robots.txt?

Every Blogger blog ships with a default robots.txt file containing a few simple rules. It makes your blog visible and crawlable to search engines such as Google, Bing, and Yandex.

Before a search engine starts crawling your blog, its bot reads this robots file first. So the settings here must be correct; do not make any mistakes.

If you are using a free Blogspot blog, you will see the following default robots file in your settings:

User-agent: Mediapartners-Google
User-agent: *
Disallow: /search
Allow: /

Now let's discuss what these lines mean.

In this robots file you can see four directives. They are the foundation of a custom robots.txt. You can use them as they are by default, but for better SEO you will usually want to customize them. Customizing is optional, though, not required.

User-agent: Mediapartners-Google

If you use Google AdSense to monetize your blog, keep this line. It lets the AdSense crawler (Mediapartners-Google) analyze your pages so that relevant ads can be shown on your blog.

Disallow: /search

When anyone searches on your blog, they land on this search page. If you do not disallow it in robots.txt, randomly generated search-result URLs get sent to search engines, which creates duplicate pages.

Note that Allow: / means your homepage (and everything not explicitly disallowed) stays open to Google, Bing, and the rest for better updates.

User-agent: *

Every search engine treats the asterisk (*) as a wildcard matching its crawler, so this one simple line applies the rules that follow to all search engines. If your file does not have it, add it now to allow all search engines to crawl your website.
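The combined effect of the directives above can be checked locally with Python's standard-library urllib.robotparser. This is just a sketch to illustrate the rules, not part of Blogger itself, and the post paths are made up:

```python
from urllib.robotparser import RobotFileParser

# The default Blogger robots.txt rules, exactly as shown above.
rules = """\
User-agent: Mediapartners-Google
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Search-result pages are blocked, so crawlers skip the
# randomly generated duplicate URLs...
print(parser.can_fetch("Googlebot", "/search?q=seo"))          # False
# ...while ordinary posts and the homepage stay crawlable.
print(parser.can_fetch("Googlebot", "/2019/05/my-post.html"))  # True
print(parser.can_fetch("Googlebot", "/"))                      # True
```

Running this shows why disallowing /search prevents duplicate search pages from being indexed while leaving the rest of the blog open.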


This is where your Blogspot sitemap goes. A Sitemap line lets search engines find all of your content in one place. Without this important part, much of your SEO effort is wasted, so get this line right; it can bring the biggest gains for your blog.

If you rely on the default feed as a sitemap, search engines can only discover roughly the 25 most recent posts. If your blog has more than 25 posts, you must use a sitemap that covers all of them.
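As an illustration, a Sitemap line in a Blogger robots.txt typically points at the blog's auto-generated XML sitemap. The domain below is a placeholder, so substitute your own address:

```
Sitemap: https://example.blogspot.com/sitemap.xml
```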

How to add Blogger custom robots.txt?

We have discussed what the file is, how to modify it, and how to use it. Now it's time to submit it in Blogger, because without this step all your work is wasted. Just follow these steps to add the robots file in Blogspot:
1. Open your blog from the Blogger dashboard
2. Go to Settings > Search preferences > Custom robots.txt > Yes
3. Paste your robots code in the box
4. Click on Save changes
5. You're done!
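Putting the pieces together, the custom robots.txt you paste in step 3 might look like the following sketch, which simply combines the directives discussed above. The domain is a placeholder; use your own:

```
User-agent: Mediapartners-Google

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```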

How to check the robots file:

After saving, open https://yourblog.blogspot.com/robots.txt in a browser (with your own subdomain in place of yourblog) and confirm that your custom rules appear. Every site serves its robots file at this root path.

Final Thoughts

If you want your blog to be visible to search engines, you must maintain a proper robots file. The tips above will help you set up a Blogger custom robots.txt file for better SEO. Just follow this step-by-step guide and boost your rankings. Cheers!
