Hello, I’m wondering why so many folders are blocked in robots.txt while only three of them are allowed. Why not simply serve a normal robots.txt file instead?
If I understand your question correctly,
Yes, this mainly covers folders a bot doesn’t need to get into. If a bot does reach one of them, the robots.txt rule tells it that it isn’t meant to be in there, so the chance of those pages getting indexed is smaller.
The pages on your site that aren’t in a folder disallowed by robots.txt are mostly the only pages you want visitors to see anyway.
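For illustration, a robots.txt that blocks most folders while explicitly allowing a few might look like this (the folder names here are made up, not taken from the actual site):

```
User-agent: *
# Block internal folders bots have no reason to crawl
Disallow: /admin/
Disallow: /tmp/
Disallow: /includes/
# Explicitly allow the public content folders
Allow: /blog/
Allow: /images/
Allow: /docs/
```

Note that robots.txt is advisory: well-behaved crawlers respect it, but it doesn’t actually prevent access, and disallowed URLs can still be indexed if other sites link to them.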