Backups for GNU/Linux?

How does Zig Forums back up their Chinese girl cartoons? I had been using Time Machine on the Mac, which worked with no questions asked, but I'm looking for something less gay now.

What I am looking for is an automated solution which will back up my files to a local hard drive that's plugged into my computer. Also, should I back up the entire machine or just my home directory? I think the home directory should be sufficient, because the system itself can just be rebuilt by reinstalling the packages.

I would also like to have several versions of my files, so I can restore some file I deleted last week even if I have already made a new hourly backup. And of course the backup needs to be encrypted or otherwise protected somehow. Really important things like GPG keys or passwords would need to be stored somewhere off-site in addition to the local backup.

What are my options? Is there something you guys are using? Should I roll my own thing with rsync? And just to be clear, I want the functionality, I don't care about the stupid space animations of Time Machine.

Attached: serveimage.jpeg (500x335, 107.74K)

Your home directory is probably full of dumping ground trash because users are the lowest of faggots.

I use Borg. There's an exclude file option and I use that to ignore a chunk of crap.
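For reference, the exclude file is just a list of patterns, one per line. A minimal sketch, assuming a repo already initialized at /mnt/backup/borg and an exclude file at ~/backup-excludes (both paths are made up):

```shell
# Hypothetical exclude list; add whatever dumping-ground trash you have.
cat > ~/backup-excludes <<'EOF'
*.pyc
*/.cache
*/Downloads
EOF

# Snapshot the home directory into the repo, skipping the patterns above.
borg create --exclude-from ~/backup-excludes \
    /mnt/backup/borg::'{hostname}-{now}' ~
```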

borg indeed

/thread

It's just a worse rsync. Everything Apple does seems to be crap, glad I don't have to deal with it all the time.
If for some reason rsync is not enough for your needs, take a look at borg.

I always just use rsync. Does borg do anything special?

snapshots, dedup and encryption.
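In case it helps, a sketch of what those three look like in practice; the repo path and retention numbers are assumptions, not recommendations:

```shell
# Assumed repo location; any path on the backup drive works.
export BORG_REPO=/mnt/backup/borg

# One-time: create an encrypted, deduplicating repository.
borg init --encryption=repokey "$BORG_REPO"

# Each run makes a new snapshot; unchanged chunks are deduplicated,
# so hourly runs are cheap.
borg create ::'home-{now}' ~

# Keep a rolling window of old versions -- this is what lets you
# restore a file you deleted last week.
borg prune --keep-hourly=24 --keep-daily=7 --keep-weekly=4
```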

All of these can be done with less bloat by choosing an appropriate FS on the backup machine.

lol

Are you retarded? Something like ZFS (as bloated as it is, which is a lot) is better than your Python bloatware, especially since you need a FS anyway.

rdiff-backup. All other software I tried was just too shitty. The downside is that rdiff-backup takes forever on large files.

Nigga what? A filesystem will not save you from disk failure. That's why you make snapshots and store them in different disks.

ZFS can replicate the data on different disks.

...

Choosing an exotic FS is going to bite you in the ass when you go to restore backups and find out that the tools for said FS have been discontinued and no longer work.

ZFS is a meme and RAID isn't a filesystem.

How about you give your hardware to someone that is not mentally handicapped

Build a server, install FreeNAS, create a ZFS volume, set up SMB and accounts, then configure the end devices to back up automatically using the credentials for the accounts you just made. Set and forget.
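For the ZFS side of that, a rough sketch from the shell; the pool names "tank" and "spare" are placeholders, and the FreeNAS UI does the same thing with clicks:

```shell
# Dataset for backups on an existing pool ("tank" is a placeholder).
zfs create tank/backups
zfs set compression=lz4 tank/backups

# Periodic snapshots give you the "restore last week's version" history.
SNAP="tank/backups@$(date +%F)"
zfs snapshot "$SNAP"

# Replicate the snapshot to a second pool -- the array itself isn't a backup.
zfs send "$SNAP" | zfs receive spare/backups
```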

Attached: freenas.jpg (900x900, 29.08K)

so like .gitignore then?

bittorrent

rsync or tar.
or dump and restore ;^)


this.

Did you read the post I replied to?
Didn't mention backup; you're supposed to use ZFS or btrfs on the backup machine.

at least this guy is funny

ok

What chink shit are you buying?

Thanks, I'll give Borg a look.


I'll keep this one in the back of my mind if Borg doesn't do the trick.

Attached: 2kn8d3.jpg (600x315, 36.48K)

You don't rely on the array as a backup; it's just an extra precaution.
Back up your RAIDs.

no shit, sherlock

If your collection of bootleg hentai is worth the investment for you, yes

I'm trying to make an rsync script that backs up a directory full of other directories (with files and subdirectories) to multiple external drives. Currently I have something like:
rsync -avP --delete-before --include "E*" --include "F*" --include "G*" --include "H*" --include "I*" --include "J*" --include "K*" --exclude "*" /mnt/source/ /mnt/backup/E-K/
But the include filters apply to all files and subdirectories, where I just want them to apply to /mnt/source/. Can this be done?

Which brings us back to OP's question.

That's offline backup, which is a completely different question from OP's.

1) I don't want to think about your problem.
2) Use --dry-run and test a lot.
3) You might have to generate the --include option list dynamically (I doubt it, though).

I'm a normie-tier user, so I just back up to a separate LUKS-encrypted HDD every week or two. Not as good as a continual backup or an off-site one, but at least the HDD resides in a fire-resistant safe. It's convenient for me and it's better than no backup.
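For the record, that routine scripts easily; the device name /dev/sdb1, the mapper name, and the mount point are all assumptions:

```shell
# Unlock and mount the backup drive (prompts for the LUKS passphrase).
cryptsetup open /dev/sdb1 backupdrive
mount /dev/mapper/backupdrive /mnt/backup

# Mirror the home directory; --delete makes the copy exact.
rsync -a --delete "$HOME"/ /mnt/backup/home/

# Unmount and lock again before the drive goes back in the safe.
umount /mnt/backup
cryptsetup close backupdrive
```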

#!/bin/bash
DRYRUN=--dry-run
if [ "${1}" = "--really" ]; then
    dialog --yesno "Run for real?" 0 0 || exit
    unset DRYRUN
else
    echo DRY RUN
fi
cat

No one has mentioned dar in this whole thread?

What this guy said. Upload it, sharing is caring.

Rsync and zfs if you have the space to waste with slightly inferior compression
tar and 7z if you want maximum compression but sacrifice stability
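The tar + 7z pipe, for anyone who wants to try it; the archive name is arbitrary, and -si/-so are 7z's stdin/stdout switches:

```shell
# Pack: stream a tar of the source into 7z's stdin.
tar -cf - /mnt/source | 7z a -si backup.tar.7z

# Restore: stream the tar back out of the archive.
7z x -so backup.tar.7z | tar -xf -
```

This is the stability trade-off mentioned above: one solid archive compresses well, but there's no incremental update, and corruption can take the whole thing with it.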

pipe viewer

rsync obviously