Are you hitting the save button in the lower right of the screen?
How do you maintain the backup routine of your forum?
Working on a server without the conveniences WHM/cPanel provides for configuring simple tasks is a brand new thing for me... and I'm still getting used to it.
The forum will be running on a Linux/Nginx/MongoDB server, and now I'm planning the backup routine.
Shall we discuss it?
P.S.: I'm not exactly asking what the best backup routine for me would be. It's just a topic to share our ideas and what we're doing to keep our backups working well.
The only thing you really need to save is the database; everything else should be in a version control system. I know there are scripts that back up MySQL databases, work kind of like logrotate, and can do things like e-mail you the dump or save it somewhere else... there might be something similar for MongoDB / Redis, but I haven't looked. Even a simple script run from a cronjob will probably suffice.
There might (hopefully) be more elegant solutions to DB backups, though...
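As a deliberately simple sketch of such a cron script for Redis: Redis persists its dataset to a `dump.rdb` file, so a backup can be as small as compressing a date-stamped copy of it. All the paths here are placeholders, and the demo scaffolding stands in for a real Redis data directory:

```shell
# Minimal sketch of a cron-able Redis backup (all paths are placeholders).
# Redis writes dump.rdb atomically (temp file + rename), so copying it is
# a valid point-in-time backup.
RDB="${RDB:-/tmp/redis-demo/dump.rdb}"
DEST="${DEST:-/tmp/redis-backups}"
mkdir -p "$(dirname "$RDB")" "$DEST"

echo "REDIS0009" > "$RDB"   # demo stand-in for a real dump.rdb; remove in real use

# A real script might first run "redis-cli bgsave" and wait for it to finish
# before copying, then scp the result off-host.
gzip -c "$RDB" > "$DEST/dump-$(date +%F-%H%M).rdb.gz"
```

Point the script at your real Redis data directory, drop it in a crontab, and you have the "simple script in a cronjob" version of this.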
We dump Redis every 30 minutes and send it over to another server, and file uploads get backed up every day.
The backups are kept for a month or so...
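A retention window like that can be enforced with a single `find` sweep in the same cronjob. A sketch, using a placeholder directory and demo files standing in for real backups (the `touch -d` scaffolding assumes GNU coreutils):

```shell
# Hypothetical retention sweep: keep roughly a month of backups, delete older.
# BACKUP_DIR is a placeholder; point it at your real backup directory.
BACKUP_DIR="${BACKUP_DIR:-/tmp/backup-retention-demo}"
mkdir -p "$BACKUP_DIR"

# Demo scaffolding so the sweep has something to act on (remove in real use):
touch "$BACKUP_DIR/fresh.bson.gz"
touch -d "40 days ago" "$BACKUP_DIR/stale.bson.gz"   # GNU touch

# Delete backup archives whose modification time is older than 30 days.
find "$BACKUP_DIR" -name '*.bson.gz' -mtime +30 -delete
```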
Like @BDHarrington7 we deploy backups automatically to a staging server where new features can be tested (with email plugin disabled!).
I posted the script I use for MongoDB a while back. I'll see if I can find it.
https://github.com/micahwedemeyer/automongobackup is the script I use.
Cron that script to back up your MongoDB every day. Then you can move the backups to another location.
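For example, a crontab entry along these lines would run it nightly at 03:00; the script path and log file are assumptions, so adjust them to wherever you installed it:

```shell
# Example crontab entry (edit with "crontab -e").
# m  h  dom mon dow  command
0 3 * * * /usr/local/bin/automongobackup.sh >> /var/log/automongobackup.log 2>&1
```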
The NodeBB docs also suggest you back up your avatar and icon files, so I would suggest doing this too.
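A minimal sketch of such a files backup, assuming the usual `public/uploads` layout of a NodeBB install; the paths are placeholders, with demo scaffolding so the example is self-contained:

```shell
# Sketch of a files backup for the NodeBB upload directories (avatars,
# attachments). NODEBB_DIR is an assumption; point it at your real install.
NODEBB_DIR="${NODEBB_DIR:-/tmp/nodebb-demo}"
mkdir -p "$NODEBB_DIR/public/uploads"                # demo scaffolding; remove in real use
echo "avatar" > "$NODEBB_DIR/public/uploads/avatar-1.png"

# Archive the uploads tree under a date-stamped name.
tar -czf "/tmp/nodebb-uploads-$(date +%F).tar.gz" -C "$NODEBB_DIR/public" uploads
```

The resulting tarball can then be shipped off-host alongside the database dumps.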
For security and safety, I won't elaborate completely on how we back up our files (or maybe I should?)
But here's how we extract our hosted instances' MongoDB data:
$ /usr/bin/mongodump --db $DBNAME --collection objects -o - | gzip > /tmp/db.objects.bson.gz
This takes the binary representation of the MongoDB data, pipes it to gzip, and writes the compressed data to disk.
Then back up and store it as necessary.
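The restore path is just that pipeline in reverse. A sketch, assuming the same `$DBNAME` and the `-` stdin convention that older `mongorestore` versions support; it is guarded so it does nothing on a machine without the MongoDB tools or the dump file:

```shell
# Hypothetical restore of the compressed dump produced above.
DBNAME="${DBNAME:-nodebb}"   # placeholder database name
if command -v mongorestore >/dev/null 2>&1 && [ -f /tmp/db.objects.bson.gz ]; then
  # Decompress and feed the BSON stream to mongorestore on stdin.
  gunzip -c /tmp/db.objects.bson.gz | mongorestore --db "$DBNAME" --collection objects -
  echo "restore attempted" > /tmp/restore-demo.log
else
  echo "MongoDB tools or dump file missing; skipping" > /tmp/restore-demo.log
fi
```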
Question: wouldn't the service (NodeBB) need to be stopped in order to back up a database? Honestly, I don't know too much about Mongo / Redis. I'm using Redis but hear Mongo is better in the long run.
If not, why not include, or have someone write, a plugin to back up the database? It could be as simple as import/export or as elaborate as timed backups to a remote FTP server... Just wondering; I myself have timed VM backups on a XenServer.
It is best practice to stop NodeBB during the backup period, as it is theoretically possible that some data could change during the backup process.
I'm not sure of the intricacies, or whether MongoDB would allow something like that to happen, so I cannot say with certainty.
I wasn't sure if the Mongo / Redis databases worked any differently. That is what I thought: backing up without stopping the service, you're risking corrupting the database. But for those making a backup every 30 minutes, stopping the service that often is kind of drastic. From a VM standpoint, I can take a filesystem and memory snapshot as often as I like without affecting the live service.
There are a number of strategies you could utilise to minimise the risks... e.g.
- Placing your forum in maintenance mode while doing the backup, but this is a very manual process
- If using containers (LXC or maybe docker), then freezing the container while the backup is done
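The container-freeze approach might look something like this for LXC; the container name "forum" and the paths are assumptions, and the commands are guarded so nothing runs on a machine without LXC installed:

```shell
# Sketch of the freeze/backup/thaw approach for an LXC container.
# "forum" and the rootfs path are hypothetical; adjust to your setup.
if command -v lxc-freeze >/dev/null 2>&1; then
  lxc-freeze -n forum                                   # suspend all processes
  tar -czf "/tmp/forum-$(date +%F).tar.gz" -C /var/lib/lxc/forum rootfs
  lxc-unfreeze -n forum                                 # resume the container
  echo "frozen backup taken" > /tmp/freeze-demo.log
else
  echo "LXC not installed; skipping freeze demo" > /tmp/freeze-demo.log
fi
```

Freezing pauses every process in the container for the duration of the archive step, so writes cannot land mid-backup, at the cost of a brief outage.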