Does anyone have any advice on which directories to hide from Google and other search engine spiders?



It really depends on your site. For example, you might have a page called `system`, which would normally be a directory you want to hide, and Grav will serve a page first if it finds one matching the requested path. This is why we ship no predefined robots.txt.
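For reference, a robots.txt that blocks Grav's internal directories while still letting crawlers reach assets under `user/` might look something like this. This is a sketch, not the file that ships with Grav; the exact directory list depends on your install, so check the paths against your own site:

```
User-agent: *
Disallow: /backup/
Disallow: /bin/
Disallow: /cache/
Disallow: /grav/
Disallow: /logs/
Disallow: /system/
Disallow: /vendor/
Disallow: /user/
Allow: /user/pages/
Allow: /user/themes/
Allow: /user/images/
```

Note that `Allow` rules are honored by the major crawlers (Google, Bing) but are not part of the original robots.txt convention, so very old or simplistic bots may ignore them.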

One thing I have noticed is that if you disable any pages from a skeleton (rather than deleting them) by renaming the folder from something like 01.example to 01.example-old, they can still get picked up by the sitemap plugin and could still potentially be crawled.
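If you want to keep a page around but exclude it from the sitemap, the sitemap plugin can skip individual pages via their frontmatter. The exact key is my recollection of the plugin's option, so verify it against the plugin's README:

```yaml
---
title: Old Example
sitemap:
    ignore: true
---
```

That keeps the page servable at its URL while leaving it out of the generated sitemap.xml.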

I launched a new site last week with the default robots.txt that shipped with Grav, and you can see that Google has indexed only the pages: nothing internal, and no other weird and wonderful files in the index.

So unless there are some pages you don’t want indexed, I reckon you should be good with the default robots.txt.
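For individual pages you'd rather keep out of search results, a `noindex` robots meta tag is generally more reliable than robots.txt, since robots.txt only blocks crawling, not indexing of URLs discovered via links. In Grav you can emit that tag per page through the page's metadata header; the key names below are my assumption from the Grav docs, so double-check them there:

```yaml
---
title: Private Page
metadata:
    robots: noindex, nofollow
---
```

One caveat: for the crawler to see the meta tag, the page must not also be disallowed in robots.txt, otherwise the crawler never fetches it to read the `noindex`.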

But it’s like rhukster said…it really depends on your site.