How to add a robots.txt file to your Blogger blog to help search engine crawlers.

The robots.txt file helps with SEO (search engine optimization) because it guides search engine crawlers (bots), telling them which parts of the blog to index and which to skip. Used correctly, it can improve your ranking; used incorrectly, it can hurt your ranking and even cause search engines to ignore your blog.

Read: How robots.txt helped me increase my traffic and ranking.


Steps to add a robots.txt file to a Blogger blog:

1. Go to Blogger.
2. Select your blog.
3. Go to Settings.
4. Go to Search preferences.
5. Under "Crawlers and indexing", look for "Custom robots.txt" and enable it.
6. Then paste the following code.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
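If you are curious what these rules actually permit, you can evaluate them locally with Python's standard-library `urllib.robotparser` before enabling them on your blog. This is just a sketch; the blog address below is a placeholder, not a real domain:

```python
import urllib.robotparser

# The default Blogger rules from the post above.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Ordinary crawlers may not fetch /search pages...
print(rp.can_fetch("*", "http://yourblogname.blogspot.com/search?q=test"))   # False
# ...but regular posts are allowed.
print(rp.can_fetch("*", "http://yourblogname.blogspot.com/2014/01/post.html"))  # True
# Mediapartners-Google (the AdSense crawler) has its own record with an
# empty Disallow, which means it may fetch everything, including /search.
print(rp.can_fetch("Mediapartners-Google", "http://yourblogname.blogspot.com/search?q=test"))  # True
```

The blank line between the two records matters: it is what keeps the `Disallow: /search` rule from applying to the Mediapartners-Google crawler.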


To check your robots.txt file, enter http://yourblogname.blogspot.com/robots.txt in your browser
(replace yourblogname.blogspot.com with your own domain).


The code I provided is the default Blogger robots.txt file, and Blogger serves it even when the custom robots.txt option is disabled. So if you want to make changes to the code, first learn how robots.txt rules are written.
Use it at your own risk: as I have already mentioned, incorrect use can cause harm and decrease your ranking.
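To see how easily a wrong rule can hurt you, here is a quick sketch (again using the standard-library `urllib.robotparser` and a placeholder domain) of the single most common mistake, `Disallow: /`, which blocks every crawler from your entire blog:

```python
import urllib.robotparser

# A common mistake: "Disallow: /" with no Allow line blocks the whole site.
bad_rules = """\
User-agent: *
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(bad_rules.splitlines())

# Not even the homepage can be crawled under these rules.
print(rp.can_fetch("*", "http://yourblogname.blogspot.com/"))  # False
```

One character separates this from the safe default: `Disallow: /search` blocks only search result pages, while `Disallow: /` blocks everything.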

You may also like: What is link building and how to build backlinks
Back up your complete Blogger blog to stay safe
