Hello, I’m wondering why so many folders are blocked in robots.txt while only 3 of them are allowed. Why not simply serve a normal robots.txt file (something like the second sketch below)?
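For clarity, here is roughly what I mean. The folder names are just hypothetical placeholders, not the actual paths from this project. The current file looks like an allowlist, blocking everything by default:

User-agent: *
Disallow: /
Allow: /folder-a/
Allow: /folder-b/
Allow: /folder-c/

whereas a "normal" robots.txt would only disallow the few folders that actually need blocking:

User-agent: *
Disallow: /admin/
Disallow: /tmp/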
Do GoogleBot and other crawlers actually try to crawl these blocked folders? Any examples? As an SEO, a robots.txt like this is a big footprint for Google when deploying PBNs at scale.
I just want to know if there is a good reason for it, or if it is only there to prevent indexing.