Cocolyzebot

Cocolyzebot is a web crawler (sometimes called a spider or bot). A crawler is a computer program that visits websites in order to analyze and/or index them in a search engine. Cocolyzebot crawls websites to analyze their SEO and generate reports.

Why does Cocolyzebot crawl my site?

Cocolyzebot crawls your site to generate SEO reports for our users. It is used for page analysis, site analysis, backlink verification for a client's site, and content extraction for our Writing tool. The data collected include technical data (loading speed, third-party resources, etc.), links, and content for semantic optimization analysis.

How do I block access to my site?

The Cocolyzebot crawlers are identifiable by their user-agent:
- Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Cocolyzebot/1.0; +https://cocolyze.com/bot)
- Mozilla/5.0 (compatible; Cocolyzebot/1.0; https://cocolyze.com/bot)
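
If you want to spot Cocolyzebot requests in your own access logs or server code, matching the "Cocolyzebot" token in the User-Agent header is usually enough. Below is a minimal sketch in Python; the function name is purely illustrative and is not part of any Cocolyze tooling.

def is_cocolyzebot(user_agent: str) -> bool:
    # Both user-agent strings above contain the token "Cocolyzebot",
    # so a case-insensitive substring match is enough for log filtering.
    return "cocolyzebot" in user_agent.lower()

# Example: a user-agent string taken from an access log line.
ua = "Mozilla/5.0 (compatible; Cocolyzebot/1.0; https://cocolyze.com/bot)"
print(is_cocolyzebot(ua))  # prints: True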

Cocolyzebot respects the directives in the robots.txt file (more information about the robots.txt file). Every time it needs to access a site for an analysis, it first checks that it is allowed to do so. If it is not authorized, it reports an error to the user and does not perform the analysis. To block access to a page or to your entire site, use the user-agent token "cocolyzebot".

For example, to block access to the page "http://website.com/private", add the following directive to the site's robots.txt file (located at "http://website.com/robots.txt"):

User-agent: cocolyzebot
Disallow: /private
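
The effect of these directives can be checked locally before relying on them. The sketch below uses Python's standard urllib.robotparser to evaluate the example robots.txt above against the "cocolyzebot" token; it only illustrates how such rules are interpreted and is not Cocolyzebot's own implementation.

from urllib.robotparser import RobotFileParser

# The same directives as in the example robots.txt above.
rules = [
    "User-agent: cocolyzebot",
    "Disallow: /private",
]

parser = RobotFileParser()
parser.parse(rules)

# /private is blocked for cocolyzebot; the rest of the site stays reachable.
print(parser.can_fetch("cocolyzebot", "http://website.com/private"))  # False
print(parser.can_fetch("cocolyzebot", "http://website.com/page"))     # True

To block Cocolyzebot from the entire site rather than a single section, replace "Disallow: /private" with "Disallow: /".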