
How do I edit robots.txt in BigCommerce?

Last updated on September 25, 2022 @ 1:20 am

When creating or editing the robots.txt file for your BigCommerce store, it is important to understand the different sections of the file and what they do.

The robots.txt file is a plain text file served from the root of your BigCommerce store; you can view it by visiting /robots.txt on your store's domain.

It is used to control how search engine crawlers access and crawl your pages.

This article covers four main sections of the file:

1. User-agent

2. Disallow

3. Allow

4. Noindex

User-agent

The first section of the robots.txt file is the user-agent section.

This section specifies which crawler (robot) the rules that follow apply to. To target a specific crawler, name it on a User-agent line like this:

User-agent: Googlebot
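
If you want the same rules to apply to every crawler rather than a single one, use the wildcard user-agent; the Disallow and Allow lines that follow it then apply to all robots:

User-agent: *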

Disallow

The next section of the robots.txt file is the Disallow section.

This section specifies which pages search engine crawlers should not crawl. You can add one or more Disallow lines to block specific paths from being crawled.

PRO TIP: If you are not familiar with editing code, we recommend that you consult with a developer before editing your robots.txt file. Making an error in this file can result in your website being blocked from search engines, which can have a negative impact on your traffic and sales.

For example, the following line blocks crawlers from your entire store, including the homepage, so use it with care:

Disallow: /
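
To block a single folder or page instead of the whole store, list just that path. The path below is only an illustration; substitute a real path from your own store:

Disallow: /example-private-folder/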

Allow

The next section of the robots.txt file is the Allow section.

This section specifies paths that crawlers are explicitly allowed to visit. It is most useful for making an exception, re-allowing a specific path inside a folder that is otherwise disallowed.

For example, the following line explicitly permits crawlers to access every page of the store (which is also the default behavior when nothing is disallowed):

Allow: /
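
A more realistic pattern pairs Allow with Disallow to re-open one path inside a blocked folder. The paths below are illustrative only:

User-agent: *
Disallow: /example-private-folder/
Allow: /example-private-folder/public-page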

Noindex

The last section of the robots.txt file is the Noindex section.

This section was intended to keep specific pages out of search engine results. Be aware, however, that Noindex was never part of the official robots.txt standard, and Google stopped honoring it in robots.txt in September 2019, so it should not be relied on.

For example, the following line was meant to keep every page out of search engine results:

Noindex: /
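
If you need to keep a page out of search results, the reliable approach today is a robots meta tag in that page's HTML (or an equivalent X-Robots-Tag HTTP header), for example:

<meta name="robots" content="noindex">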

Conclusion

In summary, the robots.txt file is used to control how search engine crawlers access your BigCommerce store.

The Disallow section lists paths that crawlers should not crawl, the Allow section lists exceptions that crawlers may still visit, and the Noindex section was intended to keep pages out of search results but is no longer honored by Google.
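
Putting it all together, a simple robots.txt might look like the sketch below. The paths and sitemap URL are placeholders for illustration; BigCommerce ships with a sensible default file, so only change lines you understand:

User-agent: *
Disallow: /example-private-folder/
Allow: /example-private-folder/public-page
Sitemap: https://www.example.com/sitemap.xml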

Kathy McFarland

Devops woman in trade, tech explorer and problem navigator.