Hello. Wanted to ask if Grav is enterprise ready. We have a WordPress-based site with a very large DB: 4GB if dumped to a SQL text file, plus 300GB worth of images. We get around 70-80m pageviews per month on average.
Check out this issue https://github.com/getgrav/grav/issues/1099 for some pointers.
Basically with the right tweaks and caching strategy, it’s possible.
That said, I don’t know a single site that does those numbers (might be out there, but I am not aware of it), and there are no guidelines yet on how to achieve top performance with a huge amount of pages.
In general, handling high traffic is not a problem: Grav can serve large amounts of traffic out of the box thanks to its high throughput. However, a large number of pages makes things trickier. Grav is best suited to sites of fewer than 1000 pages, although it can handle more if you are willing to jump through some hoops to optimize things.
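For reference, the "right tweaks" mostly live in Grav's `system.yaml`. A sketch of the cache-related settings (values here are illustrative, not a recommendation tuned for your traffic):

```yaml
cache:
  enabled: true
  check:
    method: none      # skip filesystem change checks in production (re-enable or clear cache on deploy)
  driver: redis       # 'auto' picks APCu/file; a dedicated driver like redis scales better
  prefix: 'g'
  lifetime: 604800    # cache lifetime in seconds (one week)
```

On top of that, most high-traffic setups put a full-page HTTP cache (Varnish, a CDN, or a managed equivalent) in front of Grav so the majority of those pageviews never hit PHP at all.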
Thanks for your replies. In my previous setups, I used scaling Varnish servers to handle the load. But since I don’t want to manage it on my own, I opted for a managed service. The problem with that is that I don’t have a say in how caching is handled; they have their own solution and their own way of doing things.
Been reading the GitHub thread. Am I correct to assume that Grav would only need that much memory if it’s loading thousands of pages simultaneously? If the pages are kept in memory, shouldn’t there be some way for older pages to be evicted once they reach a certain age, or once the cache is full?
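The eviction policy being described here is essentially an LRU (least-recently-used) cache. This is not how Grav itself is implemented (the names and structure below are purely illustrative), but a minimal sketch of the idea looks like this:

```python
from collections import OrderedDict

class PageCache:
    """Toy size-bounded LRU cache: once full, the least recently
    used page is evicted to make room for new ones."""

    def __init__(self, max_pages):
        self.max_pages = max_pages
        self._pages = OrderedDict()  # insertion/access order tracks recency

    def get(self, path):
        if path in self._pages:
            self._pages.move_to_end(path)  # mark as recently used
            return self._pages[path]
        return None

    def put(self, path, html):
        self._pages[path] = html
        self._pages.move_to_end(path)
        if len(self._pages) > self.max_pages:
            self._pages.popitem(last=False)  # evict least recently used

cache = PageCache(max_pages=2)
cache.put("/home", "<html>home</html>")
cache.put("/blog", "<html>blog</html>")
cache.get("/home")                          # touch /home so it survives
cache.put("/about", "<html>about</html>")   # cache full: /blog is evicted
```

Cache backends like Redis or Memcached give you exactly this behavior via a memory limit plus an LRU eviction policy, which is one reason they are suggested as Grav cache drivers for larger sites.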