How to optimize Robots.txt SEO standard for Blogger

Hey, it’s been a while. Today I sat down to optimize SEO for my blog. Besides placing keywords in your articles, many other factors matter, such as robots.txt. The robots.txt file tells search engine crawlers which pages or files on your site they may or may not request. In this post I will show you how to optimize the robots.txt file for standard SEO on Blogger.

What is Robots.txt?

Robots.txt is a file containing directives that guide how search engines crawl a website. It lets search engine crawlers and indexers know which pages or files on your site they can and cannot request.

As written above, robots.txt can be very good for SEO when you use it to keep unwanted links from being crawled and indexed by search engines.
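For example, a minimal robots.txt for a blog might look like the sketch below (the blocked path is only an illustration):

User-agent: *
Disallow: /search
Allow: /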

Benefits of using robots.txt

Adding a robots.txt file is optional, but it is worth doing because it lets you:

  • Block crawling of unnecessary resources (e.g. videos, PDF files, …), as sketched after this list.
  • Block crawling of unnecessary pages.
  • Declare the location of your sitemap.
  • Optimize the crawl budget: Google only crawls a limited number of pages on a site within a certain period of time, so we should keep the necessary pages crawlable and exclude the unnecessary ones to make the most of that budget.
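As a quick sketch of the first three points (the paths are made up, and the *.pdf$ pattern relies on the wildcard support of major crawlers such as Googlebot):

User-agent: *
Disallow: /*.pdf$
Disallow: /private-page.html
Sitemap: https://example.com/sitemap.xml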

Basic commands of robots.txt

The most common directives and their functions:

  • User-agent: [Required, at least one per group] The name of the search engine crawler the rules apply to, for example Googlebot.
  • Allow: Allows search engine robots to crawl the matching path.
  • Disallow: Prevents search engine robots from crawling the matching path.
  • Crawl-delay: How long (in seconds) a bot must wait between requests (this directive is rarely used, and Google ignores it).
  • Sitemap: Declares the location of the website’s sitemap.
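Putting these directives together, a robots.txt file might look like the sketch below (the paths and the 10-second delay are only examples):

User-agent: Bingbot
Crawl-delay: 10
Disallow: /private/
Allow: /private/public-page.html

User-agent: *
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml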

Edit robots.txt for Blogger

Step 1: Go to your blog’s management page > Settings.


Step 2: Scroll down to the Crawlers and indexing section.


Turn on Enable custom robots.txt, then click Custom robots.txt below it to edit the file.

Configure standard robots.txt for Blogspot

Here is a standard robots.txt configuration for those of you who are using Blogspot:
User-agent: *
Allow: /
User-agent: Googlebot
Allow: /
Allow: /search/label
Disallow: /search
Allow: /search$
Allow: /search/$
Disallow: *archive.html$
Sitemap: https://www.bbloger.net/atom.xml?redirect=false&start-index=1&max-results=500
Note: Change https://www.bbloger.net to your own domain name, and if your blog has more than 500 posts, change 500 to a larger number.

Explanation of this configuration

First is User-agent: *. This line says that the rules in the group below it apply to every crawler, such as the bots of Google, Bing, and so on.

Allow: / means that every URL on the site may be crawled.

I want Google to skip some unnecessary pages that other bots may still crawl, so I write a separate group for Googlebot by adding the line User-agent: Googlebot.

Allow crawling of label pages: Allow: /search/label.

Block crawling of search pages, which may have little or no content (Disallow: /search), but still allow the plain /search and /search/ listing pages to be crawled (Allow: /search$ and Allow: /search/$).

Disallow: *archive.html$ blocks crawling of archive pages whose URLs end in archive.html. The * wildcard matches any prefix, and the $ character anchors the match to the end of the URL.
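To make the wildcard behaviour concrete, here is a small sketch (the URLs are made up; as far as I know, major crawlers only treat * and $ as special characters):

Disallow: *archive.html$
# Blocked:     https://example.blogspot.com/2023/05/2023_05_archive.html
# Not blocked: https://example.blogspot.com/2023/05/my-post.html
# Not blocked: https://example.blogspot.com/2023_05_archive.html?m=1 (does not end in archive.html)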

Finally, Sitemap: https… declares the sitemap address of the blog.

Epilogue

So now you can create and edit a robots.txt file to optimize SEO for Blogger. If you have any questions, please leave a comment below the article. I hope this article helps you.
