I bought a second-hand NUC to have a little more horsepower to run various services. I have it connected to my NAS, but almost all of the Docker volumes reside on the SSD in the NUC.

It would be nice to be able to back up those volumes to my NAS in case the NUC fails. I have Debian 12 running on it.

What are my options? Should I just back up my Docker volumes, or does it make more sense to back up the entire NUC? (I'm less tech-savvy than I might appear. Please be generous with your explanation, I still have a lot to learn.)

1 point

Nope, Docker volumes give me anxiety since I have no clue where the files are located. I always go with bind mounts: I retain full control, master of my files, captain of my soul.
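
For context, the difference shows up right in a compose file: a named volume's data lives under Docker's own directory (`/var/lib/docker/volumes/<name>/_data`), while a bind mount is a host path you pick yourself. A minimal sketch (the service name, image, and paths are placeholders):

```yaml
services:
  app:
    image: nginx:alpine    # placeholder image
    volumes:
      # bind mount: host path on the left, so the files sit exactly where
      # you put them and a plain file copy backs them up
      - /opt/docker/app:/usr/share/nginx/html
      # a named volume would instead look like:
      # - appdata:/usr/share/nginx/html
      # with its data tucked away under /var/lib/docker/volumes/appdata/_data
```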

1 point

Those of you backing up bind-mounted volumes: what user does your backup program run as? The data inside bind mounts can have fairly arbitrary user IDs depending on the container, including files owned by root.
Does your backup have to run as root in order to capture all files and retain permissions?
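
For what it's worth, one common approach is to run the backup as root and keep numeric ownership, e.g. with tar. A minimal sketch (the helper name and example paths are my own, hypothetical):

```shell
#!/bin/sh
# Hypothetical helper: archive a bind-mounted folder while preserving
# ownership. Run it as root so files owned by arbitrary container UIDs are
# readable; --numeric-owner stores raw UIDs/GIDs so a restore recreates
# ownership exactly even if those IDs map to no user on the host.
backup_dir() {
    src="$1"    # directory to archive, e.g. /opt/docker
    dest="$2"   # destination tarball, e.g. /mnt/nas/docker-backup.tar.gz
    tar --numeric-owner -czpf "$dest" \
        -C "$(dirname "$src")" "$(basename "$src")"
}

# Example (as root): backup_dir /opt/docker /mnt/nas/docker-backup.tar.gz
```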

1 point

If you just need to back up Docker volumes, I recommend Nautical.

You can use it to back up to an NFS share if you need to go between machines.

3 points

I don't use volumes at all and instead use bind mounts, so I can just back up those folders.

1 point

Wow people are recommending a lot of things I don’t do and now I’m worried I’m doing something wrong.

I just have a folder on my Ubuntu boot drive called Docker with all of the persistent data from my containers, and I tell Duplicati to back up that folder to Backblaze. I don't stop the containers to do that. Am I doing something wrong?

1 point

Some people would say you're doing something wrong by using Duplicati, because they've had problems restoring data with it and it's very slow. If you've never had to restore data before, you should test that it actually works, and maybe switch to something else like Borg to be safe.

Also, backing up the folder without stopping the containers first might leave any backed-up databases corrupted, so if you're running anything that uses a database, you should stop those containers before backing up the folder.

1 point

How would you automate the stopping of the containers?

1 point

I'm not sure, but I think you'd just need to create a script that stops them, runs the backup, and then restarts them.
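
Something along these lines (the container names and paths are assumptions; `DOCKER` defaults to `echo` as a dry-run placeholder, so set `DOCKER=docker` and add your real backup command before using it):

```shell
#!/bin/sh
# Sketch of a stop -> back up -> restart wrapper for containers that hold
# databases. Dry-runs by default: DOCKER=echo just prints the commands.
set -eu

DOCKER="${DOCKER:-echo}"                   # set DOCKER=docker for real use
CONTAINERS="${CONTAINERS:-nextcloud db}"   # hypothetical container names
DATA_DIR="${DATA_DIR:-/opt/docker}"        # hypothetical data folder

$DOCKER stop $CONTAINERS
echo "backing up $DATA_DIR"                # replace with borg/restic/etc.
$DOCKER start $CONTAINERS
```

Dropping a script like this into cron (or a systemd timer) is the usual way to automate it.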

1 point

IMHO not really.

There is a slight chance that databases end up inconsistent when you back up hot DB files, but in a homelab with minimal load this is usually not an issue. The same goes for NFS.

Just make sure you keep older backups too, in case the last backup was bad.


Self-Hosted Main

!main@selfhosted.forum


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don’t control.

For Example

  • Service: Dropbox - Alternative: Nextcloud
  • Service: Google Reader - Alternative: Tiny Tiny RSS
  • Service: Blogger - Alternative: WordPress

We welcome posts that include suggestions for good self-hosted alternatives to popular online services, how they are better, or how they give back control of your data. Also include hints and tips for less technical readers.

