Caching many images and responsiveness

I am working on a gallery system for a site and am concerned about performance. Consider the following: I gather images from a multitude of posts into a gallery, which for argument’s sake currently consists of ~300 images at ~2.5 MB each. Offsetting the page load isn’t particularly difficult: I use lazy loading and compressed thumbnails for that.

However, if I do an upgrade that requires clearing the cache, and someone then visits the gallery page, all ~300 images will need to be re-cached. Even with generous execution-time, process, and memory limits, this could easily crash both the page load and the caching.

What are your thoughts on strategies for resolving this? Ajax would be one approach, but even with preloading it would be a slow process to load items into the gallery.

Also, could someone elaborate on the “Sizes with media queries” part of the documentation? I followed the sample there and wrote this:

{{ image.derivatives(640, 1600, 320).sizes('640px, 960px, 1280px', '1600px').url }}

But the returned URL still refers to the original (viewed in a viewport 1030px wide).
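If I’m reading the “Sizes with media queries” sample right, sizes() expects media-query/width pairs rather than a plain list of widths, and the derivatives only show up in the srcset attribute of the rendered tag - so .url will always point at a single fallback file, and the browser only picks a derivative from the markup produced by .html(). My tentative corrected version (the breakpoints are just examples, and I may still be misreading the expected format):

```twig
{# derivatives(min, max, step) generates the alternate files; sizes() takes
   media-query/width pairs; the srcset only appears in the rendered <img> tag #}
{{ image.derivatives(640, 1600, 320).sizes('(max-width: 640px) 100vw, 1600px').html()|raw }}
```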

You might want to look at the precache plugin. It basically uses an out-of-process event to cycle through all your pages and call their content() method, which has the effect of pre-caching everything.

In the latest version of the plugin there is even a CLI command that lets you kick this off from the command line (or a script) after you do your update.

The plugin solves the issue of initial caching, but does the precache process load pages sequentially or in parallel? That is, could it still potentially fail on execution time, process limits, or memory use?

It’s sequential, but it happens in the background, so it doesn’t slow your site down. It also won’t cause any sync problems, because it’s basically just calling the content() method on each page. Internally, Grav checks whether each page is already in the cache: if it is, the cached copy is used; if not, it gets added.

One thing I’ve noticed using both the “regular” caching and the precache plugin is that any excessively large image (a raw 8134x5562 file at 6.83 MB, for example) seems to crash the caching mechanism regardless of method.

For testing I used a set of 36 images resized to a max-width of 1280, reducing file sizes to 100-400 KB. The exception was the image of the dimensions given above, which Grav would not render in a simple loop running cropResize(320,160) for thumbnails.
More worryingly, when caching (and thus rendering) fails, no error is given: no “Crikey! Twig done blew up”, no message in the Grav log, nothing in the Apache log. Is error handling set up for the caching mechanisms?
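For reference, the test loop is essentially the following (simplified - the real template wraps it in the gallery markup):

```twig
{# Simplified test: render one 320x160 thumbnail per image on the page #}
{% for image in page.media.images %}
    {{ image.cropResize(320, 160).html }}
{% endfor %}
```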

I did the test again, now with a set of 301 images, also constrained to a max-width of 1280. With meta.yaml files this amounts to about 55 MB, and caching all of it the “regular” way of course takes a good while (~2.3 min, with an increased execution time). Using precache is faster, but it seemingly is not a solution here, because the images are not rendered through content() but by a Twig loop in the template (based on the [image gallery recipe](learn.getgrav.org/cookbook/general-recipes#creating-a-simple-gallery)).

The most practical approach at this point seems to be either manual resizing - like I did to constrain the max-width - to generate the needed sizes (thumbnails, small, medium, large) ahead of time, or calling Grav’s caching mechanism through Ajax. Doing the resizing outside of the server would at least utilize my computer’s power, though it would of course be somewhat bloated in the number of files generated locally.
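A template-side variant of the Ajax idea - touching each derivative once so Grav writes it to the image cache - would look something like the sketch below. The sizes are just examples, and I have not verified that this behaves any better than the plain gallery loop under the same limits:

```twig
{# Hypothetical warm-up loop: calling .url forces Grav to generate and
   cache each resized file, without emitting any markup for it #}
{% for image in page.media.images %}
    {% set warmed = [
        image.cropResize(320, 160).url,
        image.cropResize(640, 640).url,
        image.cropResize(1280, 1280).url
    ] %}
{% endfor %}
```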

I would assume the caching failure originates in PHP and ImageMagick, or some other library being utilized, but the failure should really be surfaced at some point, or at least logged.

Of course, it may be that my server configuration is not set up to do the caching as efficiently as possible. I’m running a vanilla install of Bitnami WAMP locally for development.

Actually, this is just a limitation of GD in PHP.

Any large image is first decompressed into a totally uncompressed bitmap in memory, which takes up orders of magnitude more space than the original JPG or PNG - not to mention a huge amount of time. I would recommend limiting your source images to a reasonable maximum: the largest size you will ever use on your website. I usually keep my images under 1600x1600px.
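To put rough numbers on it: GD stores a truecolor image at about 4 bytes per pixel, so the 8134x5562 example above needs at least 8134 × 5562 × 4 bytes ≈ 180 MB for the decompressed bitmap alone - far beyond a typical 128M PHP memory_limit. A memory-exhaustion fatal of that kind is not catchable by the application in the normal way, which would at least be consistent with nothing appearing in Grav’s own log; whether it reaches the PHP/Apache error log depends on the server’s error-logging configuration.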

From a ticket on the Gregwar/Image library (which Grav uses for image processing) I saw there was development toward using ImageMagick where available, as this would improve speed and resource usage. That would certainly help with caching large numbers of images.

Yes, ImageMagick support would be great. Unfortunately there doesn’t seem to be much progress on that front, but I keep checking back.