Hello
I am working on a Grav-based site that hosts a large content library: thousands of pages with images, videos, and downloadable resources.
While Grav handles smaller sites well, I am seeing slower load times and heavier server usage as the site scales.
I would like to explore strategies for optimizing Grav in cloud hosting environments so it can handle high traffic and large datasets efficiently.
Possible approaches include serving static assets through a CDN, implementing smarter caching strategies, and splitting content into modular sections that can be loaded on demand.
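On the caching side, most of the relevant knobs live in Grav's `system.yaml`. Here is a sketch of the settings I have been experimenting with; the driver choice and lifetime value are my own assumptions for a cloud setup, not recommendations:

```yaml
# user/config/system.yaml (excerpt)
cache:
  enabled: true
  check:
    method: file    # invalidate when files change; 'none' is faster but needs manual clears
  driver: auto      # auto-detects APCu; 'redis' or 'memcached' may suit multi-server setups
  lifetime: 604800  # cache lifetime in seconds (one week); 0 means never expire
  prefix: 'g'       # cache key prefix, useful if several sites share one cache backend
```

I am unsure whether `redis` is worth the extra moving parts over `auto`/APCu on a single instance, so input on that trade-off would be especially welcome.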
While researching cloud infrastructure options, I came across the role of a cloud architect and wondered what that role looks like in the context of designing a CMS deployment that balances speed, cost, and scalability.
This could be an interesting angle for Grav developers working on enterprise-level projects.
Has anyone here built a large-scale Grav deployment in the cloud and found effective optimization techniques?
A good starting point might be the Grav Performance & Caching Guide, but I would love to hear real-world experiences and configurations that worked well for demanding sites.
Thank you!