Optimisation Tips for Large Site?

Hi

Anyone have any tips for large sites? 5000+ pages…

I have NGINX running along with PHP 5.6 (latest), and I have also installed and enabled OPcache and APCu. The site runs fine with this setup; my main issue is that when I push my daily changes, the site can go down for anything between 1 minute and 30 minutes, and during this time I get 504 errors. CPU is not maxed out, but memory usage is quite high, and I'm not sure what to do. On my local machines it takes about 60-90 seconds to build the pages. On the web server I'm not sure why it's sometimes quite fast and other times takes around 30 minutes to get everything back up and running.

Right now the site is still down.

I’ve done multiple nginx and php-fpm restarts and nothing…

Google PageSpeed Insights might be helpful?

What does your hosting look like? Are you using a shared instance or a dedicated server? I would suggest setting up a small farm using AWS or maybe even Heroku, so that you can update one server at a time and they don't all go down.
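Something like this, as a rough sketch of the rolling-update idea. The server names are made up, and `set_in_rotation` / `deploy_to` are placeholders for whatever your load balancer and deploy tooling actually provide:

```python
import urllib.request

# Hypothetical host names -- replace with your own farm.
SERVERS = ["web1.example.com", "web2.example.com"]

def set_in_rotation(server, enabled):
    """Placeholder: add/remove this server from your load balancer
    (nginx upstream edit, AWS ELB API call, etc.)."""
    raise NotImplementedError

def deploy_to(server):
    """Placeholder: push the new build to one server (rsync, scp, ...)."""
    raise NotImplementedError

def healthy(server):
    # Simple HTTP check against the front page before re-adding the box.
    try:
        with urllib.request.urlopen(f"http://{server}/", timeout=10) as resp:
            return resp.status == 200
    except OSError:
        return False

def rolling_deploy():
    for server in SERVERS:
        set_in_rotation(server, False)  # drain traffic from this box
        deploy_to(server)               # update it while it's out
        if not healthy(server):
            raise RuntimeError(f"{server} failed its health check")
        set_in_rotation(server, True)   # put it back into rotation
```

Even with just two servers, the site as a whole never goes down while one of them is rebuilding.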

Are you using FTP to manually deploy the pages? If so, I would also look at a lightweight deployment setup. That way you can zip your site up, upload a single compressed file, and extract it in place.
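A minimal sketch of that idea, assuming FTP since you mentioned it; the paths and credentials are made up and would need to be yours:

```python
import ftplib
import shutil

# Hypothetical paths and credentials -- replace with your own.
BUILD_DIR = "/home/me/site/output"
FTP_HOST = "example.com"
FTP_USER = "deploy"
FTP_PASS = "secret"

# Pack the whole build into one archive; a single big transfer avoids
# the per-file overhead of pushing 5000+ pages individually over FTP.
archive = shutil.make_archive("/tmp/site", "zip", BUILD_DIR)

with ftplib.FTP(FTP_HOST, FTP_USER, FTP_PASS) as ftp:
    with open(archive, "rb") as f:
        ftp.storbinary("STOR site.zip", f)

# On the server, extract into a fresh directory (e.g. /var/www/site-new)
# and then rename or re-point a symlink to it, so visitors never see a
# half-extracted site.
```

The extract-then-swap step at the end is what keeps the downtime window tiny: the switch itself is near-instant.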

The best way, however, is to only push the files that have actually changed; I would still look at a deployment package to do this, though.
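One way to work out which files actually changed is a hash manifest, something like this rough sketch (paths are made up):

```python
import hashlib
import json
from pathlib import Path

BUILD_DIR = Path("/home/me/site/output")      # hypothetical path
MANIFEST = Path("/home/me/site/manifest.json")

def build_manifest(root):
    # Map each file's path (relative to the build root) to a
    # SHA-256 digest of its contents.
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*") if p.is_file()
    }

current = build_manifest(BUILD_DIR)
previous = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}

# Only the files whose digest differs (or is new) need to be pushed.
changed = [path for path, digest in current.items()
           if previous.get(path) != digest]

print(f"{len(changed)} of {len(current)} files changed")
MANIFEST.write_text(json.dumps(current))
```

If you have SSH access rather than just FTP, rsync does essentially this comparison for you and transfers only the differences.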

Have you considered pushing the changes incrementally, or in parallel? For example, push pages 1-100 before 101-200, etc., rather than all at once. As DarryllD says above, pushing just the changed files would be a huge benefit.
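Something along these lines; `push_file` is a stand-in for however you actually upload one file:

```python
from concurrent.futures import ThreadPoolExecutor

def chunks(items, size):
    # Split the file list into fixed-size batches.
    for i in range(0, len(items), size):
        yield items[i:i + size]

def push_file(path):
    """Placeholder: upload a single file (FTP, SCP, whatever you use)."""
    raise NotImplementedError

def push_in_batches(files, batch_size=100, workers=4):
    for batch in chunks(files, batch_size):
        # Finish each batch before starting the next, using a few
        # parallel connections within the batch. If something fails
        # partway, you're left with whole batches done, not a random mix.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            list(pool.map(push_file, batch))
```

Batching also means that if the push dies halfway, you know exactly which pages made it and which didn't.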