Robots.txt and SEO Footprint

Hello, I’m wondering why so many folders are blocked in robots.txt while only 3 of them are allowed. Why not keep it simple and serve a normal robots.txt file like

User-agent: *
Disallow:
Sitemap: http://www.my-domaine.com/sitemap.xml

Do GoogleBot and other crawlers try to crawl these blocked folders? Any examples? As an SEO, this is a big footprint for Google and for mass PBN deployment.

I just want to know if there is a good reason for this, or if it’s simply there to prevent indexation.

If I understand your question correctly:
Yes, these are mainly folders that a bot doesn’t need to get into, and if it does reach them, it should be told that it isn’t meant to be in there, so that the chance of those folders getting indexed is smaller.
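
For example, a restrictive robots.txt along those lines might look something like this (just a hypothetical sketch; the folder names and domain below are placeholders, not the actual paths in question):

User-agent: *
Disallow: /admin/
Disallow: /includes/
Disallow: /cache/
Sitemap: http://www.example.com/sitemap.xml

Well-behaved crawlers such as Googlebot won’t fetch the disallowed paths. Keep in mind that Disallow only stops crawling, not indexing: a blocked URL can still show up in search results if it is linked from elsewhere, which is why blocking only makes indexing less likely rather than impossible.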

The pages on your site that aren’t in a folder blocked by robots.txt are generally the only pages you want visitors to reach anyway :slight_smile:

Hope that explains what you wanted to know, good luck!