Hello, all.
I’m building a site for a client using Grav. They’d like to see my progress as I go, so I was thinking about spinning up a dev server to post my work instead of just doing it locally.
Once it is done, however, I am wondering if there is any easy way to “package” the site and then install it in a new Grav instance on their production server?
Is there a way to do this with Grav itself? I found this plugin but I am not sure if that can install from a backup.
I hope that all makes sense. Any help would be greatly appreciated.
Update: I found the “Backup” button right there on the admin dashboard. Now I just need to figure out how to install from a backup.
@johnfrenchxyz According to the docs on Creating a Backup:
Backing up your project is nothing more than creating an archive of the ROOT of Grav. No Database, no complications.
You can create the backup through the admin dashboard as you mentioned or using the CLI:
cd ~/workspace/portfolio
bin/grav backup
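To get that archive over to the production server, something like this would do (just a sketch; the hostname and paths are placeholders, and you should point the source at wherever the zip actually ended up):
scp grav-backup.zip user@production.example.com:/var/www/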
That is great. Thanks, @anon76427325
I discovered that you can also do it from the dashboard in the admin panel, but it’s awesome that you can do it from the CLI, too.
Now I’m just trying to figure out how to install that backup on a new Grav instance.
I understand how to put the backup on the server. I guess I wasn’t thinking about the backup as a complete instance of Grav itself, but if it is, then that might answer my question. I was wondering whether you were supposed to install a new instance of Grav first and then do something to install the backup.
I’ll try it and update this thread if it is that simple.
@johnfrenchxyz The zip contains a complete backup of the root of your site. Just tried it myself and it works instantly after unzipping.
You may have to consider some configuration settings that are different from your dev server:
- Think of caching, pipelining, the debugger, etc. in user/config/system.yaml.
- How about site settings in user/config/site.yaml?
- Does .htaccess need some tweaking?
- Clearing /cache and /assets (see the sketch below).
- …
If there is already a Grav installation, I only copy required /user/… folders and files.
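To tie that together, a rough sketch of the restore on the production box could look like this (paths are placeholders, and it assumes the archive unpacks straight into the site root and that you can run Grav’s CLI there):
# unpack the backup into the web root, then flush the caches
cd /var/www/example.com
unzip ~/grav-backup.zip
bin/grav clearcache    # or empty the /cache and /assets folders by hand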
That’s awesome! Thank you so much for the help!
I make a git repository of the user folder locally, and sync that with a remote private repo (Bitbucket has free ones; I think GitLab does too).
I also track the remote repository on my server (test and production environments, but you could skip test). On the server, I simply run git pull when I am ready to update. This could be automated even more, but I haven’t bothered.
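Roughly, the setup looks something like this (just a sketch; the repo URL, paths, and branch name are placeholders):
# locally: version only the user folder and push it to a private remote
cd ~/workspace/portfolio/user
git init
git add .
git commit -m "Initial commit of user folder"
git remote add origin git@bitbucket.org:you/portfolio-user.git
git push -u origin master
# on the server: clone into the Grav root once (move the stock user folder
# aside first), then just pull whenever you want to update
cd /var/www/example.com
mv user user.bak
git clone git@bitbucket.org:you/portfolio-user.git user
cd user
git pull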
Much easier and more foolproof than zip+FTP if you are comfortable setting this up. Works a treat, actually, so easy!
In all environments, I have core Grav installed, containing the user folder of course. There’s no point version controlling the core itself.
Brief description today but hope that helps!
@hughbris - that is awesome. I am building a custom Grav theme right now, and I have that in a git repo, which makes it super easy to push changes. But your setup is a great idea, too!
Great! I’m meaning to document my process in detail and I want to encourage others to do the same. There’s no right answer and I see a lot of questions about it.
The Git approach is really nice and works great. The only issue I have with it is that only one of the three hosting providers I deal with supports SSH and Git on shared hosting.
A step-by-step description of this Git approach can be found here: Grav Development with GitHub - Part 2 (by Andy Miller). It includes a description of how to pull the repo automatically once GitHub gets updated from the dev server.
If your host doesn’t support Git, you can try https://github.com/dg/ftp-deployment
It’s not a perfect solution, but it can be useful.
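For what it’s worth, the usual way to run that tool is roughly this (a sketch under the assumption that you have PHP locally, have downloaded deployment.phar from the project’s releases, and have written a deployment.ini with your FTP credentials and the local folder to upload):
php deployment.phar deployment.ini
As far as I remember, it keeps track of what it has already uploaded, so later runs only push changed files over FTP.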