Website search without PHP?

Hi everyone,

I’m looking for workable ideas on how to do a search without using PHP for a Grav website.

Is a search using only JavaScript viable, in terms of performance and feasibility?

If so, does anyone have an approach for how best to proceed? Prepare the data beforehand in a JSON package and then search that?

Would it be better to involve a third-party search?

I would appreciate a few ideas and food for thought.

Best regards,

I think the main question is: where would the search happen?

If your search bar is on every page, then the contents of your whole website would also have to be loaded on every page for the JS search to use. JS runs on the client side, so that would consume the user’s resources, which isn’t good.

What’s your use case? What’s wrong with server side search?

The final delivered website is a static export of rendered pages (via the Blackhole plugin). There is no PHP on the live server; that’s a security requirement that cannot be changed.

@christiana83, The fact that the resulting site is a static site makes quite a difference…

Have you tried Google? Here is an article that mentions different options, like Algolia, Lunr, …

Thank you for your answer! The article is very good!

There are several search solutions available, including jQuery-based ones. I want to find the most suitable of them.
Third-party providers put me off a little… Precisely because a search function is usually not a resource-saving thing, a long lifespan for these search services is not a given. External tools often have too much overhead for a simple search… but I haven’t ruled them out yet.

For a similar situation I looked at and tried several solutions, and ended up with Lunr, mainly because it is fast and supports multiple languages via so-called “stemmers”.

The site consists of about 200 pages. All content is saved in a JSON file for Lunr to index which is about 500kB. The search index file Lunr creates is around 1.6MB.

The index file grows quickly in size, which is in my opinion the biggest downside. A search page needs to download that index before the JS part of Lunr on the page can perform the search.

If you want to try a real-life search experience, have a look at
and, for example, try “water” as the search term. It’s all in Dutch, though, and concerns monitoring, mainly of environmental policy, by a local government (the province of Groningen) in the Netherlands.

@bleutzinn Thank you for sharing your experiences!

To use Lunr, do you have to create an extra search index JSON file storing all the pages and the information that should be searchable? Do you use a scheduler / cron job to run this at a set interval?

Correct. All content, i.e. the HTML body text of each page, is exported by a PHP script to a JSON file in the format required by Lunr. The export is triggered by a shell (Bash) script which runs at regular intervals via cron. That script also starts the indexing by calling the Lunr Node.js app.

I put together a search-engine comparison with Lunr, FlexSearch, MiniSearch, and others to determine which yields the best search results with the best performance. MiniSearch currently ranks highest, largely because it is actively maintained and uses modern standards. For searching, I’d advise separating the index from the full contents. Even with full contents and 57 pages, it can come out at 187 KB unminified.

The Static Generator plugin can create both for Grav effectively, and the Scholar theme currently uses a FlexSearch implementation. You could offset the load for the content to be searched, but it’s comparatively low in the context of a full website. Searching just the index (the metadata) is acceptable in my view, and rendering the results with JS is fairly easy.

Performance and feasibility are as good as, if not better than, more involved solutions in PHP or other services. You perform fewer external queries, need less processing, and can handle debouncing and throttling more effectively. If you wanted to keep it “live”, you could attach it to any Fetch response from any API, but at that point it would be sensible to cache results and data.
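Debouncing needs no library at all. A plain-JS sketch, where the 200 ms delay, the element id, and `runSearch` are arbitrary choices for illustration:

```javascript
// Debounce: only run the search after the user has paused typing, so
// each individual keystroke doesn't trigger a full index query.
function debounce(fn, delayMs) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);                               // cancel the pending call
    timer = setTimeout(() => fn.apply(this, args), delayMs); // reschedule it
  };
}

// Usage in a page (element id and runSearch are assumptions):
// const input = document.querySelector('#search');
// input.addEventListener('input', debounce(e => runSearch(e.target.value), 200));
```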

I’m intrigued by the possibility of adding Algolia to my Grav websites, but the Premium listing is forever stuck at ‘Coming Soon’. I know that @christiana83 is looking for a non-PHP solution, but FWIW, on my 1000+ page text-heavy static sites I use either Swiftype or an excellent PHP package called Live Search from CodeCanyon.

I can’t use PHP because of the environment… I’m also considering a solution with jQuery and an index JSON to avoid loading more and more JavaScript libraries…

Index everything into one JSON file and load one library: MiniSearch. It’s all you need, together with a callback from a search field that queries the index. No bloated jQuery, minimal added JavaScript.

The overhead is tiny compared to other solutions, and even many large Pages are just text.
