
Optimized Robots.txt for WooCommerce


During a review, a WooCommerce website was found to cause high CPU usage because bots frequently request non-cacheable pages containing the parameter ?add_to_wishlist=. Crawlers such as Googlebot and Ahrefs hit these URLs repeatedly, which slows down the website. To prevent this, it is recommended to add rules to the /robots.txt file that block bots from crawling these links. Regardless of whether the "add-to-cart" function is executed via JavaScript or directly in HTML, blocking the crawling of such parameters is a necessary measure.

Optimized robots.txt

# Block WooCommerce cart, checkout, account, filter and sort URLs
User-agent: *
Disallow: /cart/
Disallow: /warenkorb/
Disallow: /checkout/
Disallow: /kasse/
Disallow: /my-account/
Disallow: /mein-konto/
Disallow: /*?orderby=price
Disallow: /*?orderby=rating
Disallow: /*?orderby=date
Disallow: /*?orderby=price-desc
Disallow: /*?orderby=popularity
Disallow: /*?filter
Disallow: /*add-to-cart=*
Disallow: /*?add_to_wishlist=*

# Block internal search and preview URLs
Disallow: /search
Disallow: /*?s=*
Disallow: /*?p=*
Disallow: /*&p=*
Disallow: /*&preview=*

With these rules in place, crawlers no longer request the "add-to-cart" links and other non-cacheable pages, which saves CPU, memory, and bandwidth.
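To check whether the rules behave as intended, a small script can simulate how wildcard-aware crawlers such as Googlebot evaluate the Disallow patterns, where * matches any sequence of characters and $ anchors the end of the URL. The following Python sketch is only an illustration; the test URLs are made-up examples, and the matching is implemented by hand because not every robots.txt parser supports wildcards.

import re

def robots_pattern_to_regex(pattern):
    # Translate robots.txt wildcards into a regular expression:
    # '*' matches any sequence of characters, '$' anchors the end of the URL.
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.compile(regex)

# A selection of the Disallow rules from the robots.txt above.
disallow = ["/cart/", "/*?orderby=price", "/*add-to-cart=*",
            "/*?add_to_wishlist=*", "/*?s=*"]

# Hypothetical URL paths for testing.
paths = ["/?add_to_wishlist=123",
         "/shop/?orderby=price",
         "/?add-to-cart=42",
         "/?s=shoes",
         "/product/sample-product/"]

for path in paths:
    blocked = any(robots_pattern_to_regex(rule).match(path) for rule in disallow)
    print(path, "-> blocked" if blocked else "-> allowed")

Running the sketch should report the first four paths as blocked and the regular product page as allowed; for an authoritative check, the crawler vendors' own robots.txt testing tools should be preferred.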

Conclusion

For hosting plans that are priced by CPU usage, this measure can significantly reduce monthly costs. Even with traditional web hosting, it results in a noticeable reduction in CPU load. These robots.txt rules also conserve valuable crawl credits when using Ahrefs. Last but not least, Google's crawling of the site benefits as well, since crawl budget is preserved and Googlebot can focus on the essential content.
