Recommended robots.txt for OpenCart

Some of our customers who use the OpenCart shopping cart software have reported that their sites seem slow and generate very high server “load”.

This happens because search engines try to index each page of the site in hundreds of different ways, following every possible link that searches, sorts, or filters the results. OpenCart creates far more of these links than most other shopping cart programs do, so the search engine indexers can end up making hundreds of times as many requests as human visitors do.
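For example, a single category page can be reachable under many URLs that differ only in their query parameters. The parameter names below (route, path, sort, order, limit, filter) are typical of OpenCart, but the specific values are hypothetical illustrations:

index.php?route=product/category&path=20
index.php?route=product/category&path=20&sort=p.price&order=ASC
index.php?route=product/category&path=20&sort=p.price&order=ASC&limit=100
index.php?route=product/category&path=20&filter=5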

Since these links all lead to virtually identical content, and since you want each search engine to see only one version of each page (to avoid “diluting” your rankings), you should use a robots.txt file to prevent search engines from indexing the extra versions.

The robots.txt file most commonly recommended on the OpenCart forums is:

User-agent: *
Disallow: /*&filter
Disallow: /*&limit
Disallow: /*&sort
Disallow: /*?route=account/
Disallow: /*?route=affiliate/
Disallow: /*?route=checkout/
Disallow: /*?route=product/search
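
If you want to double-check which URLs these rules block before deploying them, one rough approach is to translate each Disallow pattern into a regular expression using Google-style wildcard matching, where “*” matches any run of characters and a pattern matches as a prefix of the URL path. Here is a minimal sketch in Python (the sample URLs are hypothetical, and this only approximates how real crawlers interpret robots.txt):

import re

DISALLOW_PATTERNS = [
    "/*&filter",
    "/*&limit",
    "/*&sort",
    "/*?route=account/",
    "/*?route=affiliate/",
    "/*?route=checkout/",
    "/*?route=product/search",
]

def is_blocked(url_path):
    """Return True if url_path matches any Disallow pattern.

    Uses Google-style semantics: '*' matches any characters,
    and each pattern matches as a prefix of the path.
    """
    for pattern in DISALLOW_PATTERNS:
        regex = "^" + re.escape(pattern).replace(r"\*", ".*")
        if re.search(regex, url_path):
            return True
    return False

# A sorted category listing is blocked; the plain category page is not.
print(is_blocked("/index.php?route=product/category&path=20&sort=p.price"))  # True
print(is_blocked("/index.php?route=product/category&path=20"))               # False

Finally, remember that a robots.txt file only takes effect when it is placed at the top level of your site (for example, https://example.com/robots.txt).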