Hello friends, this is my third night in a row learning Grav. I imported 10,000 pages, and since then the site takes a very long time to load. How can I restore the site's loading speed?
Friends, I really need an answer to this question.
10k pages is possible, but you need a setup that can support it. I was curious, so here's how I got it working.
Hardware
- Google Compute Engine n1-standard-2 machine type VPS: 2 vCPU & 7.5GB RAM
  - vCPU = hardware hyperthread on a 2.2GHz Broadwell platform
Server software
- FreeBSD 11
- NGINX 1.12.1
- PHP-FPM 7.1.10
- APCu 5.1.8
- PHP YAML parser extension (unknown if this has any impact)
php.ini adjustments
- memory_limit=1024M <- modify existing
- apc.shm_size=512M <- likely need to append; 512M is probably overkill
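If it helps, here's a quick sanity check I'd run after restarting php-fpm to confirm the settings took effect. Note that the CLI can read a different php.ini than php-fpm, so check `php --ini` if the values don't match what you set:

```sh
# Check that the APCu extension is loaded
php -m | grep -i apcu

# Check that the new values are active (should match php.ini)
php -r 'echo ini_get("memory_limit"), PHP_EOL;'
php -r 'echo ini_get("apc.shm_size"), PHP_EOL;'
```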
Grav
- Default Blog Site skeleton with plugins enabled
- Admin plugin installed
- Note: I also tried the Precache plugin for testing purposes, but it did not impact the results and ultimately isn't required
- 10,000 blog posts with 3 paragraphs of lorem ipsum (1 visible as summary); see the generation sketch below
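In case it's useful, here's roughly how one could generate that many dummy posts. This is a hypothetical sketch, assuming the Blog Site skeleton (blog pages under user/pages/01.blog) and a lorem.txt file in the current directory holding three paragraphs of lorem ipsum:

```sh
#!/bin/sh
# Hypothetical sketch: create 10,000 dummy blog posts for load testing.
# Assumes the Blog Site skeleton, run from the Grav root, with lorem.txt
# containing 3 paragraphs of lorem ipsum.
BLOG_DIR=user/pages/01.blog
i=1
while [ "$i" -le 10000 ]; do
  mkdir -p "$BLOG_DIR/test-post-$i"
  {
    printf -- '---\ntitle: Test Post %d\ndate: 2017-10-01\n---\n' "$i"
    cat lorem.txt
  } > "$BLOG_DIR/test-post-$i/item.md"
  i=$((i + 1))
done
```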
Results
- ~2.5s page load time (~2s TTFB)
- Using Random or SimpleSearch takes a bit longer
- ~100MB APCu usage
Admittedly this isn't as 'snappy' as some other Grav sites I've built, but it has an acceptable level of functionality. I didn't try increasing the hardware specs beyond this, but I'm sure additional gains can be made there. For example, I noticed a significant performance increase when I upgraded from 1 vCPU to 2, since the php-fpm process is the bottleneck.
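Since php-fpm is the bottleneck, the FPM pool settings are worth a look too. A minimal sketch of a starting point; the file path (typically /usr/local/etc/php-fpm.d/www.conf on FreeBSD) and the numbers are assumptions, so tune pm.max_children to your RAM and Grav's observed per-process memory:

```ini
; Hypothetical starting point for a 2 vCPU / 7.5GB RAM box;
; adjust pm.max_children based on observed per-process memory usage
pm = dynamic
pm.max_children = 10
pm.start_servers = 4
pm.min_spare_servers = 2
pm.max_spare_servers = 6
```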
Now, I'm curious: what is your specific use case, and how do you define 'very long time'? Have you compared Grav's performance with 10k pages against another flat-file CMS?
You will probably want to look at the TNTSearch plugin rather than SimpleSearch, as it uses a SQLite-based full-text search index. Much more efficient and faster for a large dataset.
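If it's not already installed, something like this should get it going; the index command is from memory, so double-check the plugin's docs:

```sh
# Install the TNTSearch plugin via Grav's package manager
bin/gpm install tntsearch

# Build the full-text index for all pages (command per the plugin docs)
bin/plugin tntsearch index
```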