Comparison of backup tools
This question exists because it has historical significance, but it is not considered a good, on-topic question for this site, so please do not use it as evidence that you can ask similar questions here. While you are encouraged to help maintain its answers, please understand that "big list" questions are not generally allowed on Ask Ubuntu and will be closed per the help center.
Backup is incredibly important. Obviously there's no single best backup tool, but a comparison of the options would be very interesting.
- Graphical Interface? Command line?
- Incremental backups?
- Automatic backups?
- Install method: In standard repositories? PPA?
software-recommendation backup
I would say the backup solution depends on what you use the machine you are backing up for. A collection of work/school critical projects/code has a far different set of needs from a computer storing an ungodly amount of porn and music. On my home setup I have a small script that backs up a couple of folders I wouldn't like to lose; it does this incrementally. My work laptop gets everything backed up to a server and never has mission-critical stuff left on it anyway.
– Toby
Aug 18 '10 at 21:15
It's not a features comparison, but this poll might help: webupd8.org/2010/05/best-linux-backup-tool-software.html Read the comments too!
– Alin Andrei
Aug 18 '10 at 21:19
33 Answers
Déjà Dup
Déjà Dup is installed by default (from Ubuntu 11.10 onwards). It is a GNOME tool intended for the casual desktop user that aims to be a "simple backup tool that hides the complexity of doing backups the Right Way".
It is a front end to duplicity that performs incremental backups, where only changes since the prior backup are stored. It has options for encrypted and automated backups. It can back up to local folders, Amazon S3, or any server to which Nautilus can connect.
Integration with Nautilus is superb, allowing for the restoration of files deleted from a directory and for the restoration of an old version of an individual file.
Note that as of February 2016 this project appears to be almost completely ignoring bug reports, with only minor triage activity; the last bugfix dates to 2014, though there are new releases with minor changes.
I don't quite understand? You can't restore specific versions of individual files very easily. However you can restore the entire backed up content to a specific backup. For instance I can restore to last week, or to the week before, or the week before that, etc
– 8128
Aug 30 '10 at 7:12
It can connect to anything nautilus can see. So if you can mount it in the file system that's one option. There's also then the ability to connect to ftp, ssh, webdav or a windows share. My samba knowledge is limited I'm afraid.
– 8128
Sep 8 '10 at 19:28
You can restore specific versions of individual files. It includes a nautilus extension. All you need to do is right click on a file and select "Revert to previous version."
– andrewsomething
Oct 13 '10 at 21:44
is there a command line interface for Deja Dup?
– brillout
Oct 24 '11 at 20:18
@brillout.com Deja Dup is based on Duplicity, which provides a command line interface. Another choice is duply.
– nealmcb
Jun 29 '12 at 5:46
Back in Time
I have been using Back in Time for some time, and I'm very satisfied.
All you have to do is configure:
- Where to save snapshots
- Which directories to back up
- When backups should be made (manually, every hour, every day, every week, every month)
And forget about it.
To install (tested on Ubuntu 16.04, GNOME version):
sudo add-apt-repository ppa:bit-team/stable
sudo apt-get update
sudo apt-get install backintime-gnome
The program's GUI can be opened by searching Ubuntu for "backintime".
The project is active as of April 2018.
Is there a way to get this to backup to a remote server? When you select a target directory, all non-local directories are hidden, and typing it into the location bar doesn't work.
– zacharyliu
Dec 5 '10 at 7:23
There's a "gotcha" with backintime - "dot" files are excluded by default. If you want your home directory's dot files, use backintime's Settings->Exclude and remove .*
– user8290
Feb 16 '11 at 17:49
To back up to a remote server you can use the ~/.gvfs folder, which is where remote servers are mounted by Nautilus. But Déjà Dup can do backups faster than Back in Time, while Back in Time is better for browsing files individually.
– desgua
Mar 27 '11 at 15:33
I like the feature to define separate profiles. This helps me define different profiles for different partitions of my drive and update the backups of only the partitions I need to. Also the first backup operation will take less time.
– Chethan S.
May 18 '11 at 12:28
@Lii BackInTime uses plain file copies which are hard-linked between snapshots. You can browse them with every tool you like.
– Germar
Mar 12 '16 at 0:25
rsnapshot vs. rdiff-backup
I often refer to this comparison of rsnapshot and rdiff-backup:
Similarities:
- both use an rsync-like algorithm to transfer data (rsnapshot actually uses rsync; rdiff-backup uses librsync, a C library accessed through Python bindings)
- both can be used over ssh (though rsnapshot cannot push over ssh without some extra scripting)
- both use a simple copy of the source for the current backup
Differences in disk usage:
- rsnapshot uses actual files and hardlinks to save space. For small files, storage size is similar.
- rdiff-backup stores previous versions as compressed deltas to the current version similar to a version control system. For large files that change often, such as logfiles, databases, etc., rdiff-backup requires significantly less space for a given number of versions.
Differences in speed:
- rdiff-backup is slower than rsnapshot
Differences in metadata storage:
- rdiff-backup stores file metadata, such as ownership, permissions, and dates, separately.
Differences in file transparency:
- For rsnapshot, all versions of the backup are accessible as plain files.
- For rdiff-backup, only the current backup is accessible as plain files. Previous versions are stored as rdiff deltas.
Differences in backup levels made:
- rsnapshot supports multiple levels of backup such as monthly, weekly, and daily.
- rdiff-backup can only delete snapshots earlier than a given date; it cannot delete snapshots in between two dates.
Differences in support community:
- Based on the number of responses to my post on the mailing lists (rsnapshot: 6, rdiff-backup: 0), rsnapshot has a more active community.
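To give a feel for the two workflows, here is a minimal sketch (the paths and the retention setting are illustrative placeholders, not from the original comparison). rsnapshot is driven by a config file plus cron, while rdiff-backup is invoked directly:
# rsnapshot: intervals and sources are defined in /etc/rsnapshot.conf;
# cron then triggers the named intervals, e.g.
rsnapshot daily     # rotate and create today's hard-linked snapshot
rsnapshot weekly    # promote a daily snapshot into the weekly set
# rdiff-backup: mirror plus reverse deltas in one command
rdiff-backup /home/user /mnt/backup/home
# prune increments older than four weeks (only older-than pruning is possible)
rdiff-backup --remove-older-than 4W /mnt/backup/home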
Do either support data deduplication?
– intuited
Feb 5 '11 at 21:48
So it sounds like rsnapshot is just generally better.
– mlissner
Apr 30 '11 at 6:26
librsync is not a Python library but a C library. It is based on the rsync algorithm and used by rdiff-backup directly from Python, so it doesn't have to call an external utility and parse the output as rsnapshot does.
– Anthon
Feb 21 '14 at 7:05
A huge pro of rdiff-backup is the accessibility of the files in the current backup, so you can abuse rdiff-backup as a file transfer tool. If you have two computers, you can back up the Desktop directories to two folders on a (sufficiently large) USB stick, "Desktop A" and "Desktop B". To edit files on the other computer, you simply copy the file from the backup and put it into the active Desktop folder.
– user258532
13 hours ago
rsync
If you're familiar with command-line tools, you can use rsync to create (incremental) backups automatically. It can mirror your directories to other machines. There are lots of scripts available on the net showing how to do it. Set it up as a recurring task in your crontab. There is also a GUI front end for rsync called Grsync that makes manual backups easier.
In combination with hard links, it's possible to make backups in such a way that deleted files are preserved.
See:
- http://www.sanitarium.net/golug/rsync_backups_2010
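As a concrete illustration of the hard-link approach, here is a minimal sketch (all paths are placeholders, and this is an assumption about one common setup rather than a canonical script). Each run creates a dated snapshot directory; unchanged files are hard-linked against the previous snapshot, so they take no extra space:
#!/bin/sh
SRC="/home/user/"
DEST="/mnt/backup"
TODAY=$(date +%F)
# Most recent dated snapshot, if any (snapshot names contain no spaces)
LAST=$(ls -1d "$DEST"/20* 2>/dev/null | tail -n 1)
# Unchanged files become hard links into $LAST instead of new copies
rsync -a --delete ${LAST:+--link-dest="$LAST"} "$SRC" "$DEST/$TODAY/"
Run it daily from cron (e.g. 0 3 * * * /usr/local/bin/snapshot.sh) and each dated directory behaves like a full backup while only changed files consume space.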
rsync is a useful tool, but it isn't great for backup. It doesn't keep historic versions.
– Erigami
Aug 19 '10 at 18:32
I've changed this to talk about rsnapshot, which is what I think the author was referring to.
– 8128
Aug 19 '10 at 18:53
@fluteflute: No, I did not mean rsnapshot. So your changes completely changes the meaning of my post. I replaced rsnapshot by a link explaining a bit more about rsync using as a backup.
– Roalt
Aug 23 '10 at 11:00
Using "cp --archive --link --verbose /MAKE_SNAPSHOT{,_date '+%Y-%m-%d'
}/" and "rsync -avz --link-dest=../OLD_BACKUP_DIR SOURCE_DIR NEW_BACKUP_DIR" ist just plain simple. rsnapshot adds some convenience, but maybe you don't need it. personal preference..
– webwurst
Aug 23 '10 at 12:53
There is a GUI frontend for rsync called Grsync (opbyte.it/grsync) that makes manual backups easier. I use it for making backups to my portable hard drive.
– Dmitry
Jun 11 '11 at 17:58
Duplicity
Duplicity is a feature-rich command line backup tool.
Duplicity backs up directories by producing encrypted tar-format volumes and uploading them to a remote or local file server. It uses librsync to record incremental changes to files, gzip to compress them, and gpg to encrypt them.
Duplicity's command line can be intimidating, but there are many frontends to duplicity, from command line (duply), to GNOME (deja-dup), to KDE (time-drive).
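For a taste of the command line, here is a minimal sketch (the SFTP URL and file names are illustrative assumptions). Duplicity encrypts with gpg by default and reads the passphrase from the PASSPHRASE environment variable:
export PASSPHRASE="correct horse battery staple"   # illustrative only
# Incremental, encrypted backup of a home directory over SFTP
duplicity /home/user sftp://backupuser@backuphost/backups/home
# List the backed-up files, then restore one file as it was 3 days ago
duplicity list-current-files sftp://backupuser@backuphost/backups/home
duplicity -t 3D --file-to-restore Documents/notes.txt sftp://backupuser@backuphost/backups/home /tmp/notes.txt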
There are also a number of GUI frontends to duplicity, such as Time Drive
– Ryan Thompson
Aug 25 '10 at 23:10
Time-Drive no longer has PPAs for current versions of Ubuntu (Precise), and source only seems to be available if you donate. This stopped me from evaluating it, and I now use duplicity from the command line to do backups as root (as Deja-Dup doesn't handle root backups well) and can still use Deja-Dup's nice restore GUI options (from within Nautilus).
– Chris Good
Mar 29 '13 at 6:07
According to the duplicity website, it is still in beta. I'm not sure I'd recommend that anyone use beta software to back up or restore critical data, even if it's family photos.
– bloudraak
May 28 '13 at 4:05
Dropbox
A cross-platform (proprietary) cloud sync service for Windows, Mac, and Linux. 2GB of online storage is free, with paid options. Advertised as a way to "store, sync, and share files online", but it could be used for backup purposes too.
Note that even on paid accounts revision history is limited to one year, and on free accounts it is only one month.
Note also that restoring a large number of files may be very time-consuming, as Dropbox was not built as a backup tool.
Synchronisation tools should not be confused with backup tools. A synchronisation tool can help make a backup more efficient, like rsync can spare bandwidth for example. But it is not a solution for backup unless it has strong revision history. Why? Imagine you get a virus which infects and modifies your files. The modified files will get synced, and you will lose the originals. Dropbox has some kind of revision history, so it could serve as an ersatz for backup. But keep in mind that it is not guaranteed that you can restore your files when the need arises!
– Huygens
Oct 13 '10 at 20:14
Spideroak provides unlimited revision history with free accounts.
– intuited
Jan 9 '11 at 5:09
Note that Dropbox fails badly if you need to restore a large number of files, as Dropbox will only let you restore one at a time, at the cost of several page loads each.
– Scott Severance
May 29 '12 at 10:21
Note that Dropbox dropped support for encrypted Linux filesystems. Alternatives exist (basically LUKS and full-disk encryption, maybe Cryptomator or CryFS), or better, move to a Dropbox alternative.
– Pablo Bianchi
Oct 15 '18 at 22:34
luckyBackup
It's not been mentioned before, so I'll pitch in that LuckyBackup is a superb GUI front end for rsync that makes taking simple or complex backups and clones a total breeze.
Note that this tool is no longer developed.
The all-important screenshots can be found on the project's website.
For me it is the most configurable option, and it includes an option to back up to a remote FAT32 partition (for those who, like me, have an old and poorly made NAS...). Wonderful!
– desgua
Jun 23 '11 at 16:15
BackupPC
If you want to back up your entire home network, I would recommend BackupPC running on an always-on server in your basement/closet/laundry room. From the backup server, it can connect via ssh, rsync, SMB, and other methods to any other computer (not just Linux computers), and back up all of them to the server. It implements incremental storage by merging identical files via hardlinks, even if the identical files were backed up from separate computers.
BackupPC runs a web interface that you can use to customize it, including adding new computers to be backed up, initiating immediate backups, and most importantly, restoring single files or entire folders. If the BackupPC server has write permissions to the computer that you are restoring to, it can restore the files directly to where they were, which is really nice.
BackupPC is a very nice solution for home / home office / small business. Works great for servers too and mixed Windows / Linux environment.
– Amala
Apr 21 '11 at 23:16
I'm surprised at how many issues I've run into with backuppc in Precise 12.04. The documentation is geared towards doing config by hand, not via the pretty web interface. It is confusing to configure. They have no convenient upstream bug tracker, just a mailing list, but I've run across many unresolved bugs, including those mentioned at issues with BackupPC on Ubuntu 12.04 | tolaris.com and at bugs.launchpad.net/ubuntu/+source/backuppc/+bug/497732/comments/…
– nealmcb
Jun 29 '12 at 1:42
Note also that it installs apache to run the web site, opening port 80 for outside access. Worse, it requires a password to do web config, but sends the password over the network in the clear by default. See other security issues at SourceForge.net: Configuring BackupPC for secure backups and access controls - backuppc
– nealmcb
Jun 29 '12 at 1:57
bup
A "highly efficient file backup system based on the git packfile format. Capable of doing fast incremental backups of virtual machine images."
Highlights:
- It uses a rolling checksum algorithm (similar to rsync) to split large files into chunks. The most useful result of this is you can back up huge virtual machine (VM) disk images, databases, and XML files incrementally, even though they're typically all in one huge file, and not use tons of disk space for multiple versions.
- Data is "automagically" shared between incremental backups without having to know which backup is based on which other one - even if the backups are made from two different computers that don't even know about each other. You just tell bup to back stuff up, and it saves only the minimum amount of data needed.
- Bup can use "par2" redundancy to recover corrupted backups even if your disk has undetected bad sectors.
- You can mount your bup repository as a FUSE filesystem and access the content that way, and even export it over Samba.
A KDE-based front-end (GUI) for bup is available, namely Kup Backup System.
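The basic workflow looks roughly like this (branch name and paths are placeholders; a sketch, not a complete reference). bup first indexes files, then saves them into a git-style repository (~/.bup by default):
bup init                          # create the repository
bup index /home/user              # scan for new and changed files
bup save -n mylaptop /home/user   # store a snapshot on the "mylaptop" branch
# Browse old snapshots through FUSE
mkdir -p /tmp/bup
bup fuse /tmp/bup
ls /tmp/bup/mylaptop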
Some nice features, for sure. But note that so far it doesn't save file metadata (ownership, permissions, dates) and that you can't delete old backups, so it eventually runs out of space. See a review: Git-based backup with bup - LWN.net and the README: apenwarr/bup - GitHub
– nealmcb
Jul 1 '11 at 20:28
Now metadata seems to be supported, see https://github.com/apenwarr/bup : 'bup save' and 'bup restore' have immature metadata support. On the plus side, they actually do have support now, but it's new, and not remotely as well tested as tar/rsync/whatever's. If you'd like to help test, please do (see t/compare-trees for one comparison method).
– student
Mar 20 '13 at 18:22
CrashPlan
CrashPlan is a company providing business backup, with no plans for individual users.
Features
- $10/month/device fee
- Triple destination data storage and protection
- Silent and continuous
- Generous retention and versioning
- Deleted file protection
I had considered a bunch of options and configurations (using rdiff-backup, duplicity, backup-ninja, amazon s3, remote server). What it finally came down to was simplicity.
CrashPlan is cross platform, but not open source.
It's also worth noting that with a (paid) CrashPlan Central 'family' plan you can backup all the computers you own.
CrashPlan could be good, but it is insanely slow to back up.
– Goddard
Oct 21 '16 at 21:20
Do note that Crashplan is stopping their service to non-enterprise customers: crashplan.com/en-us/consumer/nextsteps
– Ours
Aug 28 '17 at 17:23
Bacula
I used Bacula a long time ago. Although you would have to learn its architecture, it's a very powerful solution. It lets you do backups over a network and it's multi-platform. You can read here about all the cool things it has, and here about the GUI programs that you can use for it. I deployed it at my university. When I was looking for backup solutions I also came across Amanda.
One good thing about Bacula is that it uses its own implementation for the files it creates. This makes it independent from a native utility's particular implementation (e.g. tar, dump...).
When I used it there weren't any GUIs yet. Therefore, I can't say if the available ones are complete and easy to use.
Bacula is very modular at its core. It consists of 3 configurable, stand-alone daemons:
- file daemon (takes care of actually collecting files and their metadata in a cross-platform way)
- storage daemon (takes care of storing the data - be it on HDDs, DVDs, tapes, etc.)
- director daemon (takes care of scheduling backups and central configuration)
There is also an SQL database involved for storing metadata about Bacula and backups (Postgres, MySQL and SQLite are supported).
The bconsole binary is shipped with Bacula and provides a CLI for Bacula administration.
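To make the modular design more concrete, here is a minimal, hypothetical FileSet resource for the director's configuration (the resource name and paths are invented for illustration; a real deployment pairs such a FileSet with Job, Client and Storage resources):
# bacula-dir.conf (sketch): what to back up, with checksums and compression
FileSet {
  Name = "HomeDirs"
  Include {
    Options {
      signature = MD5      # store a checksum for later verification
      compression = GZIP   # compress data at the file daemon
    }
    File = /home
  }
  Exclude {
    File = /home/lost+found
  }
}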
pls explain 2nd paragraph: "This makes it independent..."
– Tshepang
Jan 11 '11 at 23:31
There is a web interface written in python: readthedocs.org/docs/almir/en/latest
– iElectric
Apr 25 '12 at 16:00
@Tshepang meaning it doesn't rely on tools installed on operating system itself.
– iElectric
Jul 8 '12 at 20:09
Simple Backup
Simple Backup is another tool to back up your files and keep a revision history. It is quite efficient (with full and incremental backups) and does not take up too much disk space for redundant data. So you can have historical revisions of files à la Time Machine (a feature Back in Time - mentioned earlier - also offers).
Features:
- easy to set up with already pre-defined backup strategies
- external hard disk backup support
- remote backup via SSH or FTP
- revision history
- clever auto-purging
- easy scheduling
- user- and/or system-level backups
As you can see, the feature set is similar to the one offered by Back in Time.
Simple Backup fits well in the Gnome and Ubuntu Desktop environment.
Simple backup has failed for me multiple times, one time resulting in some pretty upsetting data loss. I would not recommend it.
– Alex Launi
Nov 1 '10 at 3:16
@Alex I'm interested... I use Back in Time, but I had tried Simple Backup before. I chose the former because I can browse the backups. Could you be more specific about the problem you encountered? Just out of curiosity.
– Huygens
Nov 1 '10 at 21:57
The tarball it created had tons of invalid data in it, leaving it unextractable. This happened more than once.
– Alex Launi
Nov 2 '10 at 15:17
I would not recommend this tool; it's very hard to use it as root (by default it will save everything in your home directory meaning that a bad rm command will purge everything) and it keeps generating bad compressed files (though it gives a warning) and the GUI is not as nice as that of back in time.
– user2413
Nov 8 '10 at 13:00
@Huygens: Sorry for my poorly worded comment. My experience is that, by default, the current version of sbackup does not save the backups in a root-protected directory. If you do not change the default, your backups will obviously not survive a bad rm command. This second point is not related to Alex's point on bad tar.gz's and is linked to the choice of default behavior of sbackup, not to its intrinsic qualities.
– user2413
Nov 9 '10 at 16:53
Use tar.
It is a simple and robust method, yet rather outdated. Today we have better and faster backup tools with more useful features.
Create a full backup of your home directory: cd to the directory where you want to store the backup file, and then:
tar --create --verbose --file backup.tar <path to the home directory>
For subsequent backups, we want to avoid a full backup because it takes too much time, so we simply update the files in backup.tar. Again, cd to the directory where the backup file is, and then use --update:
tar --update --verbose --file backup.tar <path to the home directory>
All files that are either new or have been modified will be saved in backup.tar. Deleted files will be kept. To restore the most recent backup, right-click on the file and choose "Extract to...". To retrieve older versions of your files, you have to open backup.tar and find the files (and versions) you want to restore.
Note: You cannot use --update on a compressed tar file (e.g. .tar.gz).
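Restores can also be done from the command line; a small sketch (the file name is a placeholder). Because --update appends new versions, the archive can hold several copies of a file, and GNU tar's --occurrence option selects which one to extract:
# List everything, including multiple versions of updated files
tar --list --verbose --file backup.tar
# Extract the most recent version of one file
tar --extract --file backup.tar home/user/notes.txt
# Extract the first (oldest) archived version instead
tar --extract --occurrence=1 --file backup.tar home/user/notes.txt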
DAR
DAR - the Disk ARchive program - is a powerful command line backup tool supporting incremental backups and restores. If you want to back up a lot of files, it may be considerably faster than rsync-like (rolling checksum) solutions.
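Basic usage, as a hedged sketch (archive base names and paths are placeholders): dar writes archives as a base name plus slice numbers, and a differential backup references the previous archive's catalogue with -A:
# Full backup of /home into slices full.1.dar, full.2.dar, ...
dar -c /mnt/backup/full -R /home
# Differential backup of only what changed since "full"
dar -c /mnt/backup/diff1 -R /home -A /mnt/backup/full
# Restore: extract the full archive, then apply the differential on top
dar -x /mnt/backup/full -R /home
dar -x /mnt/backup/diff1 -R /home -w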
Spideroak
A Dropbox-like backup/syncing service with comparable features.
- Access all your data in one de-duplicated location
- Configurable multi-platform synchronization
- Preserve all historical versions & deleted files
- Share folders instantly in web ShareRooms w/ RSS
- Retrieve files from any internet-connected device
- Comprehensive 'zero-knowledge' data encryption
Listed supported systems: Debian Lenny, OpenSUSE, RPM-Based (Fedora, etc.), CentOS/RHEL, Ubuntu Lucid Lynx, Ubuntu Gutsy Gibbon, Ubuntu Karmic Koala, Ubuntu Maverick Meerkat, Ubuntu Intrepid Ibex, Debian Etch, Ubuntu Hardy Heron, Slackware 12.1, Ubuntu Jaunty Jackalope
More info at https://spideroak.com
Note that there's no automatic way to delete old backups. Thus, unless you're fond of manually hunting through their clunky UI, there'll be no end to the amount of space required. SpiderOak says that you should never need to delete old backups thanks to their deduplication. I disagree. Also, SpiderOak omits symlinks, claiming that they're complicated to handle due to the possibility of symlink loops.
– Scott Severance
May 29 '12 at 10:33
This really isn't a backup tool. I used SpiderOak in 2009 and it failed in multiple ways: failed to backup whole directory trees, never finished syncing properly, and I couldn't recover much of the data it did back up. Don't depend on SpiderOak for backup or sync is my view - even if they have fixed these bugs the architecture is still syncing all files to all PCs, and simply not suitable for backup.
– RichVel
Nov 1 '12 at 12:19
as mentioned for dropbox: backup and syncing are two different tasks!
– DJCrashdummy
Jun 18 '17 at 19:39
Attic Backup
Attic is a deduplicating backup program written in Python. The main goal of Attic is to provide an efficient and secure way to back up data. The data deduplication technique used makes Attic suitable for daily backups, since only the changes are stored.
Main Features:
- Easy to use
- Space-efficient storage: Variable block size deduplication is used to reduce the number of bytes stored by detecting redundant data.
- Optional data encryption: All data can be protected using 256-bit AES encryption, and data integrity and authenticity are verified using HMAC-SHA256.
- Off-site backups: Attic can store data on any remote host accessible over SSH.
- Backups mountable as filesystems: Backup archives are mountable as userspace filesystems for easy backup verification and restores.
Requirements:
Attic requires Python >= 3.2. Besides Python, Attic also requires msgpack-python and OpenSSL (>= 1.0.0). In order to mount archives as filesystems, llfuse is required.
Note:
There is also now a fork of Attic called Borg.
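A minimal sketch of the workflow (repository path, encryption mode and archive naming are illustrative; the Borg fork uses essentially the same verbs):
# One-time: create an encrypted repository
attic init --encryption=passphrase /mnt/backup/attic.repo
# Daily: create a deduplicated archive named after the date
attic create /mnt/backup/attic.repo::home-$(date +%F) /home/user
# Inspect and restore
attic list /mnt/backup/attic.repo
attic extract /mnt/backup/attic.repo::home-2016-01-01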
FlyBack
Warning: Unmaintained, last update in 2010.
Similar to Back in Time.
"Apple's Time Machine is a great feature in their OS, and Linux has almost all of the required technology already built in to recreate it. This is a simple GUI to make it easy to use."
Note that this software is not actively maintained: its last update was in 2010 (that's what I call back in time).
– Jealie
Jul 21 '15 at 17:23
Jungledisk
Jungledisk is a winner as far as I'm concerned. It backs up remotely to an optionally-encrypted Amazon S3 bucket, it's customisable, and it can run in the background (there are various guides available for setting that up). There's a decent UI, or you can hack an XML file if you're feeling so inclined.
I back up all of my home machines with the same account, no problem. I can also remotely access my backed-up data via myjungledisk.com.
It's not free, but in US terms it's certainly cheap enough (I pay around $8 a month). I feel that's more than acceptable for an offsite backup where someone else deals with hardware and (physical) security issues.
I can't recommend it enough.
I've been using this one for years, and I agree. This is a very good product, and one bonus for me is that it is cross platform. You can use the same product across all platforms you use, be it Linux, Mac or Windows.
– sbrattla
Oct 4 '15 at 19:19
The big "$4" with small "As Jungle Disk is designed for 2-250 employee businesses each customer account is subject to a minimum monthly charge of $8 per month." below is a very discouraging start.
– Mateusz Konieczny
Aug 7 '18 at 5:58
Areca Backup
Warning: Unmaintained, last release in 2015.
Areca Backup is also a very decent GPL program for making backups easily.
Features
- Archive compression (Zip & Zip64 format)
- Archive encryption (AES128 & AES256 encryption algorithms)
- Storage on local hard drive, network drive, USB key, FTP / FTPS server (with implicit and explicit SSL / TLS)
- Source file filters (by extension, subdirectory, regular expression, size, date, status, with AND/OR/NOT logical operators)
- Incremental, differential and full backup support
- Support for delta backup (store only modified parts of your files)
- Archive merges: You can merge contiguous archives into one single archive to save storage space.
- As-of-date recovery: Areca allows you to recover your archives (or single files) as of a specific date.
- Transaction mechanism: All critical processes (such as backups or merges) are transactional. This guarantees your backups' integrity.
- Backup reports: Areca generates backup reports that can be stored on your disk or sent by email.
- Post-backup scripts: Areca can launch shell scripts after backup.
- File permissions, symbolic links and named pipes can be stored and recovered. (Linux only)
I run a custom Python script which uses rsync to save my home folder (less trash etc.) onto a folder labelled "current" on a separate backup HDD (connected by USB), and then the copy (cp) command to copy everything from "current" onto a date-time stamped folder on the same HDD.
The beautiful thing is that each snapshot has every file in your home folder as it was at that time, and yet the HDD doesn't just fill up unnecessarily. Because most files never change, there is only ever one actual copy of those files on the HDD; every other reference to it is a hard link. And if a newer version of a file is added to "current", the older snapshots simply keep pointing at the single copy of the original version. Modern HDD file systems take care of that by themselves.
Although there are all sorts of refinements in the script, the main commands are simple. Here are a few of the key ingredients:
EXCLUSIONS="/home/.../exclusions.txt"   # don't back up trash etc.
MEDIA="/media/..."                      # long path with the HDD details and the "current" folder
rsync -avv --progress --delete --exclude-from="$EXCLUSIONS" /home/username/ "$MEDIA"
CURRENT="..."                           # the "current" folder on the HDD
DEST="..."                              # the date-time stamped folder on the HDD
cp -alv "$CURRENT" "$DEST"
I had some custom needs as well. Because I have multiple massive (e.g. 60GB) VirtualBox disk images, I only ever wish to have one copy of those, not snapshot versions. Even a 1 or 2 TB HDD has limits.
Here are the contents of my exclusions file. The file is very sensitive to missing terminal slashes etc:
/.local/share/Trash/
/.thumbnails/
/.cache/
/Examples/
A tool that does something very similar for you (always having complete snapshots, using hard links to not waste disk space) is rsnapshot -- maybe you should give it a try
– Marcel Stimberg
Sep 2 '10 at 9:08
Dirvish
Dirvish is a nice command-line snapshot backup tool which uses hardlinks to reduce disk space. It has a sophisticated way to purge expired backups.
Here is a nice tutorial for it: http://wiki.edseek.com/howto:dirvish
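Typical operation, as a hedged sketch (the vault name is a placeholder, and the bank/vault must first be declared in /etc/dirvish/master.conf and the vault's default.conf):
# The first image of a vault must be initialized explicitly
dirvish --vault home --init
# Afterwards, nightly runs and expiry are usually handled by cron
dirvish-runall     # back up every vault listed in master.conf
dirvish-expire     # purge images whose expiry time has passed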
This is a real good way to get rsync incremental backups to work!
– Nanne
May 20 '13 at 8:26
Duplicati
An open source, gratis backup application running on Linux, with a GUI, that "securely stores encrypted, incremental, compressed backups on cloud storage services and remote file servers. It works with Amazon S3, Windows Live SkyDrive, Google Drive (Google Docs), Rackspace Cloud Files or WebDAV, SSH, FTP (and many more)".
Version 1.0 is considered stable; there is a version 2 in development with considerable internal changes that is currently working (though I wouldn't use it for production). There are standard or custom filter rules to select files to back up.
I have been using it for years, though infrequently, on both a Windows laptop and my Ubuntu 14.04 install (I'm not connected to the project, but as a developer I have considered looking at the API to add a backend).
It is a fork of duplicity.
PING is a no-nonsense free backup tool that will let you make backups of entire partitions. It is a standalone utility that should be burned to a CD.
What I like about this program is that it copies the entire partition.
Imagine this: while modifying your Ubuntu as a superuser, you changed a vital part and Ubuntu won't start up anymore.
You could format the hard disk and reinstall Ubuntu. While backup solutions such as Dropbox, Ubuntu One etc. might be useful for retrieving the important files, they won't restore your wallpaper, Unity icons and other stuff that made your Ubuntu the way you liked it.
Another option is to ask for help on the internet. But why not just restore the whole system to the way it was a few days ago? PING will do exactly this for you.
Pros:
- Will not only back up documents, but system files as well
- It's easy to use
- It is possible to back up other (non-Linux) partitions as well
- It will compress the backup in gzip or bzip2 format, saving disk space
Cons:
- The PC will have to be restarted before being able to back up
- PING will make a backup of an entire partition, even when only a few files have been modified
- You'll need an external hard drive or some free space on your PC to put your backups
An excellent Dutch manual can be found here.
s3ql is a more recent option for using Amazon S3, Google Storage or OpenStack Storage as a file system. It works on a variety of Linux distros as well as Mac OS X.
Using it with rsync, you can get very efficient incremental offsite backups since it provides storage and bandwidth efficiency via block-level deduplication and compression. It also supports privacy via client-side encryption, and some other fancy things like copy-on-write, immutable trees and snapshotting.
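The pattern described above looks roughly like this (bucket name and mount point are placeholders; treat it as a sketch of one plausible setup). The bucket is formatted once as an S3QL file system, mounted like a disk, and rsync then runs against the mount point:
# One-time: create the file system inside the bucket
mkfs.s3ql s3://my-backup-bucket
# Mount, back up with rsync, unmount
mkdir -p /mnt/s3ql
mount.s3ql s3://my-backup-bucket /mnt/s3ql
rsync -a --delete /home/user/ /mnt/s3ql/home/
umount.s3ql /mnt/s3ql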
See Comparison of S3QL and other S3 file systems for comparisons with PersistentFS, S3FS, S3FSLite, SubCloud, S3Backer and ElasticDrive.
I've been using it for a few days, starting from s3_backup.sh (which uses rsync), and am quite happy. It is very well documented and seems like a solid project.
TimeVault
Warning: unmaintained
TimeVault is a tool to make snapshots of folders and comes with Nautilus integration. Snapshots are protected from accidental deletion or modification since they are read-only by default.
Can be downloaded from Launchpad.
inosync
A Python script that offers a more-or-less real-time backup capability.
Note that this software is not maintained anymore.
"I came across a reference to the “inotify” feature that is present in recent Linux kernels. Inotify monitors disk activity and, in particular, flags when files are written to disk or deleted. A little more searching located a package that combines inotify's file event monitoring with the rsync file synchronization utility in order to provide the real-time file backup capability that I was seeking. The software, named inosync, is actually a Python script, effectively provided as open-source code, by the author, Benedikt Böhm from Germany (http://bb.xnull.de/)."
http://www.opcug.ca/public/Reviews/linux_part16.htm
Obnam
Warning: Software is no longer maintained, authors recommend not using it
'Obnam is an easy, secure backup program. Backups can be stored on local hard disks, or online via the SSH SFTP protocol. The backup server, if used, does not require any special software, on top of SSH.
Some features that may interest you:
- Snapshot backups. Every generation looks like a complete snapshot, so you don't need to care about full versus incremental backups, or rotate real or virtual tapes.
- Data de-duplication, across files, and backup generations. If the backup repository already contains a particular chunk of data, it will be re-used, even if it was in another file in an older backup generation. This way, you don't need to worry about moving around large files, or modifying them.
- Encrypted backups, using GnuPG.'
An old version can be found in the Ubuntu software sources; for the newest version refer to Chris Cormack's PPA or Obnam's website.
saybackup and saypurge
There is a nice script called saybackup which allows you to do simple incremental backups using hardlinks. From the man page:
This script creates full or reverse incremental backups using the rsync(1) command. Backup directory names contain the date and time of each backup run to allow sorting and selective pruning. At the end of each successful backup run, a symlink '*-current' is updated to always point at the latest backup. To reduce remote file transfers, the '-L' option can be used (possibly multiple times) to specify existing local file trees from which files will be hard-linked into the backup.
The corresponding script saypurge provides a clever way to purge old backups. From the home page of the tool:
Saypurge parses the timestamps from the names of this set of backup directories, computes the time deltas, and determines good deletion candidates so that backups are spaced out over time most evenly. The exact behavior can be tuned by specifying the number of recent files to guard against deletion (-g), the number of historic backups to keep around (-k) and the maximum number of deletions for any given run (-d). In the above set of files, the two backups from 2011-07-07 are only 6h apart, so they make good purging candidates...
backup2l
Warning: unmaintained, last commit on 2017-02-14
From the homepage:
backup2l is a lightweight command line tool for generating, maintaining and restoring backups on a mountable file system (e.g. hard disk). The main design goals are low maintenance effort, efficiency, transparency and robustness. In a default installation, backups are created autonomously by a cron script.
backup2l supports hierarchical differential backups with a user-specified number of levels and backups per level. With this scheme, the total number of archives that have to be stored only increases logarithmically with the number of differential backups since the last full backup. Hence, small incremental backups can be generated at short intervals while time- and space-consuming full backups are only sparsely needed.
The restore function allows you to easily restore the state of the file system or arbitrary directories/files of previous points in time. The ownership and permission attributes of files and directories are correctly restored.
An integrated split-and-collect function allows you to comfortably transfer all or selected archives to a set of CDs or other removable media.
All control files are stored together with the archives on the backup device, and their contents are mostly self-explaining. Hence, in the case of an emergency, a user does not only have to rely on the restore functionality of backup2l, but can - if necessary - browse the files and extract archives manually.
For deciding whether a file is new or modified, backup2l looks at its name, modification time, size, ownership and permissions. Unlike other backup tools, the i-node is not considered, in order to avoid problems with non-Unix file systems like FAT32.
boxbackup
From the homepage:
Box Backup is an open source, completely automatic, on-line backup system. It has the following key features:
- All backed-up data is stored on the server in files on a filesystem - no tape, archive or other special devices are required.
- The server is trusted only to make files available when they are required - all data is encrypted and can be decoded only by the original client. This makes it ideal for backing up over an untrusted network (such as the Internet), or where the server is in an uncontrolled environment.
- A backup daemon runs on systems to be backed up, and copies encrypted data to the server when it notices changes - so backups are continuous and up-to-date (although traditional snapshot backups are possible too).
- Only changes within files are sent to the server, just like rsync, minimising the bandwidth used between clients and server. This makes it particularly suitable for backing up between distant locations, or over the Internet.
- It behaves like tape - old file versions and deleted files are available.
- Old versions of files on the server are stored as changes from the current version, minimising the storage space required on the server. Files on the server are also compressed to minimise their size.
- Choice of backup behaviour - it can be optimised for document or server backup.
- It is designed to be easy and cheap to run a server. It has a portable implementation, and optional RAID implemented in userland for reliability without complex server setup or expensive hardware.
http://www.boxbackup.org/
1 2
next
protected by Community♦ Nov 1 '16 at 9:22
Thank you for your interest in this question.
Because it has attracted low-quality or spam answers that had to be removed, posting an answer now requires 10 reputation on this site (the association bonus does not count).
Would you like to answer one of these unanswered questions instead?
33 Answers
33
active
oldest
votes
33 Answers
33
active
oldest
votes
active
oldest
votes
active
oldest
votes
1 2
next
Déjà Dup
Déjà Dup is (from Ubuntu 11.10) installed by default. It is a GNOME tool intended for the casual Desktop user that aims to be a "simple backup tool that hides the complexity of doing backups the Right Way".
It is a front end to duplicity that performs incremental backups, where only changes since the prior backup was made are stored. It has options for encrypted and automated backups. It can backup to local folders, Amazon S3, or any server to which Nautilus can connect.
Integration with Nautilus is superb, allowing for the restoration of files deleted from a directory and for the restoration of an old version of an individual file.
Note that as of February 2016 this project appears to be almost completely ignoring bug reports with only minor triage activity and the last bugfix dates back to 2014, though there are new releases with minor changes.
4
I don't quite understand? You can't restore specific versions of individual files very easily. However you can restore the entire backed up content to a specific backup. For instance I can restore to last week, or to the week before, or the week before that, etc
– 8128
Aug 30 '10 at 7:12
2
It can connect to anything nautilus can see. So if you can mount it in the file system that's one option. There's also then the ability to connect to ftp, ssh, webdav or a windows share. My samba knowledge is limited I'm afraid.
– 8128
Sep 8 '10 at 19:28
8
You can restore specific versions of individual files. It includes a nautilus extension. All you need to do is right click on a file and select "Revert to previous version."
– andrewsomething
Oct 13 '10 at 21:44
2
is there a command line interface for Deja Dup?
– brillout
Oct 24 '11 at 20:18
3
@brillout.com Deja Dup is based on Duplicity, which provides a command line interface. Another choice is duply.
– nealmcb
Jun 29 '12 at 5:46
|
show 5 more comments
Déjà Dup
Déjà Dup is (from Ubuntu 11.10) installed by default. It is a GNOME tool intended for the casual Desktop user that aims to be a "simple backup tool that hides the complexity of doing backups the Right Way".
It is a front end to duplicity that performs incremental backups, where only changes since the prior backup was made are stored. It has options for encrypted and automated backups. It can backup to local folders, Amazon S3, or any server to which Nautilus can connect.
Integration with Nautilus is superb, allowing for the restoration of files deleted from a directory and for the restoration of an old version of an individual file.
Note that as of February 2016 this project appears to be almost completely ignoring bug reports with only minor triage activity and the last bugfix dates back to 2014, though there are new releases with minor changes.
4
I don't quite understand? You can't restore specific versions of individual files very easily. However you can restore the entire backed up content to a specific backup. For instance I can restore to last week, or to the week before, or the week before that, etc
– 8128
Aug 30 '10 at 7:12
2
It can connect to anything nautilus can see. So if you can mount it in the file system that's one option. There's also then the ability to connect to ftp, ssh, webdav or a windows share. My samba knowledge is limited I'm afraid.
– 8128
Sep 8 '10 at 19:28
8
You can restore specific versions of individual files. It includes a nautilus extension. All you need to do is right click on a file and select "Revert to previous version."
– andrewsomething
Oct 13 '10 at 21:44
2
is there a command line interface for Deja Dup?
– brillout
Oct 24 '11 at 20:18
3
@brillout.com Deja Dup is based on Duplicity, which provides a command line interface. Another choice is duply.
– nealmcb
Jun 29 '12 at 5:46
|
show 5 more comments
Déjà Dup
Déjà Dup is (from Ubuntu 11.10) installed by default. It is a GNOME tool intended for the casual Desktop user that aims to be a "simple backup tool that hides the complexity of doing backups the Right Way".
It is a front end to duplicity that performs incremental backups, where only changes since the prior backup was made are stored. It has options for encrypted and automated backups. It can backup to local folders, Amazon S3, or any server to which Nautilus can connect.
Integration with Nautilus is superb, allowing for the restoration of files deleted from a directory and for the restoration of an old version of an individual file.
Note that as of February 2016 this project appears to be almost completely ignoring bug reports with only minor triage activity and the last bugfix dates back to 2014, though there are new releases with minor changes.
Déjà Dup
Déjà Dup is (from Ubuntu 11.10) installed by default. It is a GNOME tool intended for the casual Desktop user that aims to be a "simple backup tool that hides the complexity of doing backups the Right Way".
It is a front end to duplicity that performs incremental backups, where only changes since the prior backup was made are stored. It has options for encrypted and automated backups. It can backup to local folders, Amazon S3, or any server to which Nautilus can connect.
Integration with Nautilus is superb, allowing for the restoration of files deleted from a directory and for the restoration of an old version of an individual file.
Note that as of February 2016 this project appears to be almost completely ignoring bug reports with only minor triage activity and the last bugfix dates back to 2014, though there are new releases with minor changes.
edited Feb 29 '16 at 10:05
community wiki
12 revs, 8 users 63%
8128
4
I don't quite understand? You can't restore specific versions of individual files very easily. However you can restore the entire backed up content to a specific backup. For instance I can restore to last week, or to the week before, or the week before that, etc
– 8128
Aug 30 '10 at 7:12
2
It can connect to anything nautilus can see. So if you can mount it in the file system that's one option. There's also then the ability to connect to ftp, ssh, webdav or a windows share. My samba knowledge is limited I'm afraid.
– 8128
Sep 8 '10 at 19:28
8
You can restore specific versions of individual files. It includes a nautilus extension. All you need to do is right click on a file and select "Revert to previous version."
– andrewsomething
Oct 13 '10 at 21:44
2
is there a command line interface for Deja Dup?
– brillout
Oct 24 '11 at 20:18
3
@brillout.com Deja Dup is based on Duplicity, which provides a command line interface. Another choice is duply.
– nealmcb
Jun 29 '12 at 5:46
|
show 5 more comments
4
I don't quite understand? You can't restore specific versions of individual files very easily. However you can restore the entire backed up content to a specific backup. For instance I can restore to last week, or to the week before, or the week before that, etc
– 8128
Aug 30 '10 at 7:12
2
It can connect to anything nautilus can see. So if you can mount it in the file system that's one option. There's also then the ability to connect to ftp, ssh, webdav or a windows share. My samba knowledge is limited I'm afraid.
– 8128
Sep 8 '10 at 19:28
8
You can restore specific versions of individual files. It includes a nautilus extension. All you need to do is right click on a file and select "Revert to previous version."
– andrewsomething
Oct 13 '10 at 21:44
2
is there a command line interface for Deja Dup?
– brillout
Oct 24 '11 at 20:18
3
@brillout.com Deja Dup is based on Duplicity, which provides a command line interface. Another choice is duply.
– nealmcb
Jun 29 '12 at 5:46
4
4
I don't quite understand? You can't restore specific versions of individual files very easily. However you can restore the entire backed up content to a specific backup. For instance I can restore to last week, or to the week before, or the week before that, etc
– 8128
Aug 30 '10 at 7:12
I don't quite understand? You can't restore specific versions of individual files very easily. However you can restore the entire backed up content to a specific backup. For instance I can restore to last week, or to the week before, or the week before that, etc
– 8128
Aug 30 '10 at 7:12
2
2
It can connect to anything nautilus can see. So if you can mount it in the file system that's one option. There's also then the ability to connect to ftp, ssh, webdav or a windows share. My samba knowledge is limited I'm afraid.
– 8128
Sep 8 '10 at 19:28
It can connect to anything nautilus can see. So if you can mount it in the file system that's one option. There's also then the ability to connect to ftp, ssh, webdav or a windows share. My samba knowledge is limited I'm afraid.
– 8128
Sep 8 '10 at 19:28
8
8
You can restore specific versions of individual files. It includes a nautilus extension. All you need to do is right click on a file and select "Revert to previous version."
– andrewsomething
Oct 13 '10 at 21:44
You can restore specific versions of individual files. It includes a nautilus extension. All you need to do is right click on a file and select "Revert to previous version."
– andrewsomething
Oct 13 '10 at 21:44
2
2
is there a command line interface for Deja Dup?
– brillout
Oct 24 '11 at 20:18
is there a command line interface for Deja Dup?
– brillout
Oct 24 '11 at 20:18
3
3
@brillout.com Deja Dup is based on Duplicity, which provides a command line interface. Another choice is duply.
– nealmcb
Jun 29 '12 at 5:46
@brillout.com Deja Dup is based on Duplicity, which provides a command line interface. Another choice is duply.
– nealmcb
Jun 29 '12 at 5:46
|
show 5 more comments
Back in Time
I have been using Back in Time for some time, and I'm very satisfied.
All you have to do is configure:
- Where to save snapshots
- What directories to back up
- When backups should be made (manual, every hour, every day, every week, every month)
And forget about it.
To install (working on Ubuntu 16.04 with GNOME):
sudo add-apt-repository ppa:bit-team/stable
sudo apt-get update
sudo apt-get install backintime-gnome
The GUI can be opened by searching Ubuntu for "backintime".
The project is active as of April 2018.
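Back In Time also installs a command-line tool, which is handy for running the configured profile from a script or cron job. A minimal sketch (subcommand names vary between Back In Time versions, so check backintime --help on your install):
backintime backup          # run the backup job for the configured profile
backintime snapshots-list  # list the snapshots taken so far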
edited Aug 9 '18 at 6:41 · community wiki · 14 revs, 9 users 44% · Decio Lira
2
Is there a way to get this to backup to a remote server? When you select a target directory, all non-local directories are hidden, and typing it into the location bar doesn't work.
– zacharyliu
Dec 5 '10 at 7:23
23
There's a "gotcha" with backintime - "dot" files are excluded by default. If you want your home directory's dot files, use backintime's Settings->Exclude and remove .*
– user8290
Feb 16 '11 at 17:49
1
To back up to a remote server you can use the ~/.gvfs folder, which is where remote servers are mounted by Nautilus. But Déjà Dup can do backups faster than Back In Time, while Back In Time is better for viewing files individually.
– desgua
Mar 27 '11 at 15:33
1
I like the feature to define separate profiles. This helps me define different profiles for different partitions of my drive and update the backups of only the partitions I need to. Also the first backup operation will take less time.
– Chethan S.
May 18 '11 at 12:28
3
@Lii BackInTime uses plain file copies which are hard-linked between snapshots. You can browse them with every tool you like.
– Germar
Mar 12 '16 at 0:25
rsnapshot vs. rdiff-backup
I often refer to this comparison of rsnapshot and rdiff-backup:
Similarities:
- both use an rsync-like algorithm to transfer data (rsnapshot actually uses rsync; rdiff-backup uses the python librsync library)
- both can be used over ssh (though rsnapshot cannot push over ssh without some extra scripting)
- both use a simple copy of the source for the current backup
Differences in disk usage:
- rsnapshot uses actual files and hardlinks to save space. For small files, storage size is similar.
- rdiff-backup stores previous versions as compressed deltas to the current version similar to a version control system. For large files that change often, such as logfiles, databases, etc., rdiff-backup requires significantly less space for a given number of versions.
Differences in speed:
- rdiff-backup is slower than rsnapshot
Differences in metadata storage:
- rdiff-backup stores file metadata, such as ownership, permissions, and dates, separately.
Differences in file transparency:
- For rsnapshot, all versions of the backup are accessible as plain files.
- For rdiff-backup, only the current backup is accessible as plain files. Previous versions are stored as rdiff deltas.
Differences in backup levels made:
- rsnapshot supports multiple levels of backup such as monthly, weekly, and daily.
- rdiff-backup can only delete snapshots earlier than a given date; it cannot delete snapshots in between two dates.
Differences in support community:
- Based on the number of responses to my post on the mailing lists (rsnapshot: 6, rdiff-backup: 0), rsnapshot has a more active community.
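To make the comparison concrete, here is a minimal sketch of each tool's basic usage, assuming a local backup target under /mnt/backup (paths and retention values below are illustrative):
# rsnapshot is driven by /etc/rsnapshot.conf (fields must be separated by tabs):
#   snapshot_root   /mnt/backup/rsnapshot/
#   retain          daily   7
#   backup          /home/  localhost/
# then run, typically from cron:
rsnapshot daily

# rdiff-backup keeps a plain mirror plus reverse deltas in one command:
rdiff-backup /home /mnt/backup/rdiff/home
# restore a file as it was 3 days ago:
rdiff-backup -r 3D /mnt/backup/rdiff/home/user/notes.txt /tmp/notes.txt
# drop increments older than 4 weeks:
rdiff-backup --remove-older-than 4W /mnt/backup/rdiff/home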
answered Sep 7 '10 at 19:29 · community wiki · ændrük
Do either support data deduplication?
– intuited
Feb 5 '11 at 21:48
So it sounds like rsnapshot is just generally better.
– mlissner
Apr 30 '11 at 6:26
2
librsync is not a Python library but a C library. It is based on the rsync algorithm and used by rdiff-backup directly from Python, so it doesn't have to call an external utility and parse the output as rsnapshot does.
– Anthon
Feb 21 '14 at 7:05
A huge pro of rdiff-backup is the accessibility of the files in the current backup, so you can abuse rdiff-backup as a file transfer tool. If you have two computers, you can back up the Desktop directories to two folders on a (sufficiently large) USB stick, "Desktop A" and "Desktop B". To edit files on the other computer, you simply copy the file from the backup and put it into the active Desktop folder.
– user258532
13 hours ago
rsync
If you're familiar with command-line tools, you can use rsync to create (incremental) backups automatically. It can mirror your directories to other machines. There are lots of scripts available on the net showing how to do it; set it up as a recurring task in your crontab. There is also a GUI frontend for rsync called Grsync that makes manual backups easier.
In combination with hard links, it's possible to make backups in such a way that deleted files are preserved.
See:
- http://www.sanitarium.net/golug/rsync_backups_2010
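As a minimal sketch of the hard-link approach (directory names below are illustrative): each run creates a dated snapshot directory, and files unchanged since the previous snapshot are hard-linked via --link-dest, so they take almost no extra space.
#!/bin/sh
SRC=/home/user/
DEST=/mnt/backup
NEW="$DEST/$(date '+%Y-%m-%d')"
# unchanged files become hard links into the previous snapshot
# (the very first run is simply a full copy)
rsync -a --delete --link-dest="$DEST/latest" "$SRC" "$NEW"
# point "latest" at the snapshot we just made
ln -sfn "$NEW" "$DEST/latest"
Deleted files remain in the older snapshots, which is what preserves them.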
edited Feb 15 '14 at 21:06 · community wiki · 8 revs, 5 users 37% · fluteflute
6
rsync is a useful tool, but it isn't great for backup. It doesn't keep historic versions.
– Erigami
Aug 19 '10 at 18:32
I've changed this to talk about rsnapshot, which is what I think the author was referring to.
– 8128
Aug 19 '10 at 18:53
@fluteflute: No, I did not mean rsnapshot, so your change completely altered the meaning of my post. I replaced rsnapshot with a link explaining a bit more about using rsync as a backup tool.
– Roalt
Aug 23 '10 at 11:00
1
Using "cp --archive --link --verbose /MAKE_SNAPSHOT{,_date '+%Y-%m-%d'
}/" and "rsync -avz --link-dest=../OLD_BACKUP_DIR SOURCE_DIR NEW_BACKUP_DIR" ist just plain simple. rsnapshot adds some convenience, but maybe you don't need it. personal preference..
– webwurst
Aug 23 '10 at 12:53
3
There is a GUI frontend for rsync called Grsync (opbyte.it/grsync) that makes manual backups easier. I use it for making backups to my portable hard drive.
– Dmitry
Jun 11 '11 at 17:58
Duplicity
Duplicity is a feature-rich command line backup tool.
Duplicity backs up directories by producing encrypted tar-format volumes and uploading them to a remote or local file server. It uses librsync to record incremental changes to files, gzip to compress them, and gpg to encrypt them.
Duplicity's command line can be intimidating, but there are many frontends to duplicity, from command line (duply), to GNOME (deja-dup), to KDE (time-drive).
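For reference, a minimal sketch of using duplicity directly (the SFTP URL below is illustrative; duplicity reads a GPG passphrase from the PASSPHRASE environment variable or prompts for one):
# incremental, encrypted backup of a home directory over SFTP
duplicity /home/user sftp://user@backup.example.com/backups/home
# restore the latest backup to a new location
duplicity restore sftp://user@backup.example.com/backups/home /home/user/restored
# prune backup sets older than two months
duplicity remove-older-than 2M --force sftp://user@backup.example.com/backups/home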
edited Mar 7 '16 at 23:33 · community wiki · 5 revs, 5 users 38% · vh1
1
There are also a number of GUI frontends to duplicity, such as Time Drive
– Ryan Thompson
Aug 25 '10 at 23:10
Time-Drive no longer has PPAs for current versions of Ubuntu (Precise), and source only seems to be available if you donate. This stopped me from evaluating it; I now use duplicity from the command line to do backups as root (as Deja-Dup doesn't handle root backups well) and can still use Deja-Dup's nice restore GUI options (from within Nautilus).
– Chris Good
Mar 29 '13 at 6:07
According to the duplicity website, it is still in beta. I'm not sure I'd recommend that anyone use beta software to back up or restore critical data, even if it's family photos.
– bloudraak
May 28 '13 at 4:05
Dropbox
A cross-platform (proprietary) cloud sync service for Windows, Mac, and Linux. 2 GB of online storage is free, with paid options. Advertised as a way to "store, sync, and share files online", but it can be used for backup purposes too.
Note that even on paid accounts revision history is limited to one year, and on free accounts it is only one month.
Note also that restoring a large number of files may be very time-consuming, as Dropbox was not built as a backup tool.
edited Aug 7 '18 at 5:44 · community wiki · 5 revs, 4 users 35% · Derek
35
Synchronisation tools should not be confused with backup tools. A synchronisation tool can help make a backup more efficient; rsync can spare bandwidth, for example. But it is not a solution for backup unless it has strong revision history. Why? Imagine you get a virus which infects your files and modifies them. The modified files will get synced, and you will lose the originals. Dropbox has some kind of revision history, so it could serve as an ersatz backup. But keep in mind that it is not guaranteed that you can restore your files when the need arises!
– Huygens
Oct 13 '10 at 20:14
7
Spideroak provides unlimited revision history with free accounts.
– intuited
Jan 9 '11 at 5:09
3
Note that Dropbox fails badly if you need to restore a large number of files, as Dropbox will only let you restore one at a time, at the cost of several page loads each.
– Scott Severance
May 29 '12 at 10:21
Note that Dropbox dropped support for encrypted Linux filesystems. Alternatives exist, basically LUKS full-disk encryption, and maybe Cryptomator or CryFS; or better, move to a Dropbox alternative.
– Pablo Bianchi
Oct 15 '18 at 22:34
luckyBackup
It hasn't been mentioned before, so I'll pitch in that LuckyBackup is a superb GUI front end for rsync and makes taking simple or complex backups and clones a total breeze.
Note that this tool is no longer developed.
The all-important screenshots can be found on their website.
edited Feb 24 '16 at 13:12 · community wiki · 8 revs, 6 users 50% · Scaine
For me it is the most configurable option, and it includes an option to back up to a remote FAT32 partition (for those who, like me, have an old and poorly made NAS...). Wonderful!
– desgua
Jun 23 '11 at 16:15
BackupPC
If you want to back up your entire home network, I would recommend BackupPC running on an always-on server in your basement/closet/laundry room. From the backup server, it can connect via ssh, rsync, SMB, and other methods to any other computer (not just Linux computers), and back up all of them to the server. It implements incremental storage by merging identical files via hardlinks, even if the identical files were backed up from separate computers.
BackupPC runs a web interface that you can use to customize it, including adding new computers to be backed up, initiating immediate backups, and most importantly, restoring single files or entire folders. If the BackupPC server has write permissions to the computer that you are restoring to, it can restore the files directly to where they were, which is really nice.
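As a rough sketch of what adding a machine looks like (the host name, owner, and share list below are hypothetical; paths are the usual Debian/Ubuntu ones): you list the host in /etc/backuppc/hosts and optionally give it a per-host config file, which BackupPC reads as Perl.
# /etc/backuppc/hosts -- one line per machine: host, dhcp flag, owner
mylaptop    0    alice

# /etc/backuppc/mylaptop.pl -- per-host overrides
$Conf{XferMethod}     = 'rsync';
$Conf{RsyncShareName} = ['/home', '/etc'];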
edited Mar 11 '17 at 18:56 · community wiki · 6 revs, 4 users 39% · 8128
1
BackupPC is a very nice solution for home / home office / small business. It works great for servers too, and in a mixed Windows / Linux environment.
– Amala
Apr 21 '11 at 23:16
1
I'm surprised at how many issues I've run into with backuppc in Precise 12.04. The documentation is geared towards doing config by hand, not via the pretty web interface. It is confusing to configure. They have no convenient upstream bug tracker, just a mailing list, but I've run across many unresolved bugs, including those mentioned at issues with BackupPC on Ubuntu 12.04 | tolaris.com and at bugs.launchpad.net/ubuntu/+source/backuppc/+bug/497732/comments/…
– nealmcb
Jun 29 '12 at 1:42
Note also that it installs apache to run the web site, opening port 80 for outside access. Worse, it requires a password to do web config, but sends the password over the network in the clear by default. See other security issues at SourceForge.net: Configuring BackupPC for secure backups and access controls - backuppc
– nealmcb
Jun 29 '12 at 1:57
bup
A "highly efficient file backup system based on the git packfile format. Capable of doing fast incremental backups of virtual machine images."
Highlights:
It uses a rolling checksum algorithm (similar to rsync) to split large
files into chunks. The most useful result of this is you can backup huge
virtual machine (VM) disk images, databases, and XML files incrementally,
even though they're typically all in one huge file, and not use tons of
disk space for multiple versions.
Data is "automagically" shared between incremental backups without having
to know which backup is based on which other one - even if the backups
are made from two different computers that don't even know about each
other. You just tell bup to back stuff up, and it saves only the minimum
amount of data needed.
Bup can use "par2" redundancy to recover corrupted backups even if your
disk has undetected bad sectors.
You can mount your bup repository as a FUSE filesystem and access the
content that way, and even export it over Samba.
A KDE-based front-end (GUI) for bup is available, namely Kup Backup System.
edited Aug 1 '12 at 16:03
community wiki
2 revs, 2 users 94%
ændrük
Some nice features, for sure. But note that so far it doesn't save file metadata (ownership, permissions, dates) and that you can't delete old backups so it eventually runs out of space. See a review: Git-based backup with bup -LWN.net and the README: apenwarr/bup - GitHub
– nealmcb
Jul 1 '11 at 20:28
Now metadata seems to be supported, seehttps://github.com/apenwarr/bup
: 'bup save' and 'bup restore' have immature metadata support. On the plus side, they actually do have support now, but it's new, and not remotely as well tested as tar/rsync/whatever's. If you'd like to help test, please do (see t/compare-trees for one comparison method).
– student
Mar 20 '13 at 18:22
add a comment |
Some nice features, for sure. But note that so far it doesn't save file metadata (ownership, permissions, dates) and that you can't delete old backups so it eventually runs out of space. See a review: Git-based backup with bup -LWN.net and the README: apenwarr/bup - GitHub
– nealmcb
Jul 1 '11 at 20:28
Now metadata seems to be supported, seehttps://github.com/apenwarr/bup
: 'bup save' and 'bup restore' have immature metadata support. On the plus side, they actually do have support now, but it's new, and not remotely as well tested as tar/rsync/whatever's. If you'd like to help test, please do (see t/compare-trees for one comparison method).
– student
Mar 20 '13 at 18:22
Some nice features, for sure. But note that so far it doesn't save file metadata (ownership, permissions, dates) and that you can't delete old backups so it eventually runs out of space. See a review: Git-based backup with bup -LWN.net and the README: apenwarr/bup - GitHub
– nealmcb
Jul 1 '11 at 20:28
Some nice features, for sure. But note that so far it doesn't save file metadata (ownership, permissions, dates) and that you can't delete old backups so it eventually runs out of space. See a review: Git-based backup with bup -LWN.net and the README: apenwarr/bup - GitHub
– nealmcb
Jul 1 '11 at 20:28
Now metadata seems to be supported, see
https://github.com/apenwarr/bup
: 'bup save' and 'bup restore' have immature metadata support. On the plus side, they actually do have support now, but it's new, and not remotely as well tested as tar/rsync/whatever's. If you'd like to help test, please do (see t/compare-trees for one comparison method).– student
Mar 20 '13 at 18:22
Now metadata seems to be supported, see
https://github.com/apenwarr/bup
: 'bup save' and 'bup restore' have immature metadata support. On the plus side, they actually do have support now, but it's new, and not remotely as well tested as tar/rsync/whatever's. If you'd like to help test, please do (see t/compare-trees for one comparison method).– student
Mar 20 '13 at 18:22
add a comment |
CrashPlan
CrashPlan is a company providing business backup, with no plans for individual users.
Features
- $10/month/device fee
- Triple destination data storage and protection
- Silent and continuous
- Generous retention and versioning
- Deleted file protection
I had considered a bunch of options and configurations (using rdiff-backup, duplicity, backup-ninja, Amazon S3, a remote server). What it finally came down to was simplicity.
CrashPlan is cross-platform, but not open source.
It's also worth noting that with a (paid) CrashPlan Central 'family' plan you can back up all the computers you own.
CrashPlan could be good, but is insanely slow to backup.
– Goddard
Oct 21 '16 at 21:20
Do note that Crashplan is stopping their service to non-enterprise customers: crashplan.com/en-us/consumer/nextsteps
– Ours
Aug 28 '17 at 17:23
Bacula
I used Bacula a long time ago. Although you would have to learn its architecture, it's a very powerful solution. It lets you do backups over a network and it's multi-platform. You can read here about all the cool things it has, and here about the GUI programs that you can use for it. I deployed it at my university. When I was looking for backup solutions I also came across Amanda.
One good thing about Bacula is that it uses its own implementation for the files it creates. This makes it independent from a native utility's particular implementation (e.g. tar, dump...).
When I used it there weren't any GUIs yet. Therefore, I can't say if the available ones are complete and easy to use.
Bacula is very modular at its core. It consists of 3 configurable, stand-alone daemons:
- file daemon (takes care of actually collecting files and their metadata in a cross-platform way)
- storage daemon (takes care of storing the data - be it on HDDs, DVDs, tapes, etc.)
- director daemon (takes care of scheduling backups and central configuration)
There is also an SQL database involved for storing metadata about Bacula and its backups (with support for PostgreSQL, MySQL and SQLite).
The bconsole binary is shipped with Bacula and provides a CLI for Bacula administration.
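To give a feel for how the director daemon ties the pieces together, here is a minimal, illustrative excerpt of a director configuration; the resource names (BackupHome, HomeSet, myhost-fd, WeeklyCycle) are made up for this sketch, not taken from the answer:
# bacula-dir.conf (illustrative excerpt)
Job {
  Name = "BackupHome"
  Type = Backup
  Level = Incremental        # full, differential and incremental levels are supported
  Client = myhost-fd         # the file daemon on the machine being backed up
  FileSet = "HomeSet"
  Schedule = "WeeklyCycle"
  Storage = File             # which storage daemon resource receives the data
  Pool = Default
  Messages = Standard
}
FileSet {
  Name = "HomeSet"
  Include {
    Options {
      signature = MD5
      compression = GZIP
    }
    File = /home
  }
}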
pls explain 2nd paragraph: "This makes it independent..."
– Tshepang
Jan 11 '11 at 23:31
There is a web interface written in python: readthedocs.org/docs/almir/en/latest
– iElectric
Apr 25 '12 at 16:00
2
@Tshepang meaning it doesn't rely on tools installed on operating system itself.
– iElectric
Jul 8 '12 at 20:09
Simple Backup
Simple Backup is another tool to back up your files and keep a revision history. It is quite efficient (with full and incremental backups) and does not take up too much disk space for redundant data. So you can have historical revisions of files à la Time Machine (a feature Back In Time - mentioned earlier - also offers).
Features:
- easy to set up with pre-defined backup strategies
- external hard disk backup support
- remote backup via SSH or FTP
- revision history
- clever auto-purging
- easy scheduling
- user- and/or system-level backups
As you can see, the feature set is similar to the one offered by Back In Time.
Simple Backup fits well into the GNOME and Ubuntu desktop environments.
6
Simple backup has failed for me multiple times, one time resulting in some pretty upsetting data loss. I would not recommend it.
– Alex Launi
Nov 1 '10 at 3:16
@Alex I'm interested... I use back in time, but I had tried Simple Backup before. I choose the first because I can browse the backups. Could you be more specific about the problem encounter? Just out of curiosity.
– Huygens
Nov 1 '10 at 21:57
2
The tarball it created had tons of invalid data in it, leaving it unextractable. This happened more than once.
– Alex Launi
Nov 2 '10 at 15:17
2
I would not recommend this tool; it's very hard to use it as root (by default it will save everything in your home directory meaning that a bad rm command will purge everything) and it keeps generating bad compressed files (though it gives a warning) and the GUI is not as nice as that of back in time.
– user2413
Nov 8 '10 at 13:00
1
@Huygens:> Sorry, for my poorly worded comment. My experience is that, by default, the current version of sbackup does not save the back ups in a root-protected directory. If you do not change the default, your back ups will obviously not survive a bad .rm command. This second point is not related to Alex's point on bad tar.gz's and is linked to the choice of default behavior of sbackup, not to its intrinsic qualities.
– user2413
Nov 9 '10 at 16:53
Use tar.
It is a simple and robust method, yet rather outdated: today we have better and faster backup tools with more useful features.
Create a full backup of your home directory: cd to the directory where you want to store the backup file, and then:
tar --create --verbose --file backup.tar <path to the home directory>
For subsequent backups we want to avoid a full backup, because it takes too much time. So we simply update the files in backup.tar: again, cd to the directory where the backup file is, and then use --update:
tar --update --verbose --file backup.tar <path to the home directory>
All files that are either new or have been modified will be saved in backup.tar. Deleted files will be kept. To restore the most recent backup, right-click on the file and choose "Extract to...". To retrieve older versions of your files, you have to open backup.tar and find the files (and versions) you want to restore.
Note: You cannot use --update on a compressed tar file (e.g. .tar.gz).
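As a concrete illustration, assuming the archive lives on an external drive mounted at /media/backup (both the mount point and the user name are assumptions for this sketch):
cd /media/backup
tar --create --verbose --file backup.tar /home/username   # initial full backup
tar --update --verbose --file backup.tar /home/username   # later: append new/changed files
tar --list --verbose --file backup.tar                    # inspect the archive, older versions included
Because --update appends rather than rewrites, the archive keeps every superseded copy of a file; that is what makes the crude version history possible, and also why the archive only ever grows.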
DAR
DAR - the Disk ARchive program - is a powerful command-line backup tool supporting incremental backups and restores. If you want to back up a lot of files, it may be considerably faster than rsync-like (rolling checksum) solutions.
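For a flavour of the command line, here is a minimal full-plus-differential cycle; the archive base names and paths are illustrative, not from the answer:
dar -c monday_full -R /home/username                  # create a full archive "monday_full"
dar -c tuesday_diff -R /home/username -A monday_full  # differential archive against the full one
dar -x tuesday_diff                                   # extract (restore) the differential archive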
SpiderOak
A Dropbox-like backup/syncing service with comparable features.
- Access all your data in one de-duplicated location
- Configurable multi-platform synchronization
- Preserve all historical versions & deleted files
- Share folders instantly in web ShareRooms w/ RSS
- Retrieve files from any internet-connected device
- Comprehensive 'zero-knowledge' data encryption
Listed supported systems: Debian Lenny, OpenSUSE, RPM-based (Fedora, etc.), CentOS/RHEL, Ubuntu Lucid Lynx, Ubuntu Gutsy Gibbon, Ubuntu Karmic Koala, Ubuntu Maverick Meerkat, Ubuntu Intrepid Ibex, Debian Etch, Ubuntu Hardy Heron, Slackware 12.1, Ubuntu Jaunty Jackalope
More info at https://spideroak.com
1
Note that there's no automatic way to delete old backups. Thus, unless you're fond of manually hunting through their clunky UI, there'll be no end to the amount of space required. SpiderOak says that you should never need to delete old backups thanks to their deduplication. I disagree. Also, SpiderOak omits symlinks, claiming that they're complicated to handle due to the possibility of symlink loops.
– Scott Severance
May 29 '12 at 10:33
5
This really isn't a backup tool. I used SpiderOak in 2009 and it failed in multiple ways: failed to backup whole directory trees, never finished syncing properly, and I couldn't recover much of the data it did back up. Don't depend on SpiderOak for backup or sync is my view - even if they have fixed these bugs the architecture is still syncing all files to all PCs, and simply not suitable for backup.
– RichVel
Nov 1 '12 at 12:19
1
as mentioned for dropbox: backup and syncing are two different tasks!
– DJCrashdummy
Jun 18 '17 at 19:39
Attic Backup
Attic is a deduplicating backup program written in Python. The main goal of Attic is to provide an efficient and secure way to back up data. The data deduplication technique used makes Attic suitable for daily backups, since only the changes are stored.
Main Features:
- Easy to use
- Space-efficient storage: variable block size deduplication is used to reduce the number of bytes stored by detecting redundant data.
- Optional data encryption: all data can be protected using 256-bit AES encryption, and data integrity and authenticity are verified using HMAC-SHA256.
- Off-site backups: Attic can store data on any remote host accessible over SSH.
- Backups mountable as filesystems: backup archives are mountable as userspace filesystems for easy backup verification and restores.
Requirements:
Attic requires Python >= 3.2. Besides Python, Attic also requires msgpack-python and OpenSSL (>= 1.0.0). In order to mount archives as filesystems, llfuse is required.
Note:
There is now a fork of Attic called Borg.
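A minimal sketch of the day-to-day Attic workflow; the repository path, archive name and mount point are illustrative assumptions:
attic init /media/backup/home.attic                      # create the repository once
attic create /media/backup/home.attic::monday ~/         # later archives store only the changes
attic list /media/backup/home.attic                      # list the archives in the repository
attic mount /media/backup/home.attic::monday /mnt/attic  # browse an archive as a filesystem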
FlyBack
Warning: Unmaintained, last update in 2010.
Similar to Back in Time:
Apple's Time Machine is a great feature in their OS, and Linux has almost all of the required technology already built in to recreate it. This is a simple GUI to make it easy to use.
1
Note that this software is not actively maintained: its last update was in 2010 (that's what I call back in time).
– Jealie
Jul 21 '15 at 17:23
Jungle Disk
It's a winner as far as I'm concerned. It backs up remotely to an optionally-encrypted Amazon S3 bucket, it's customisable, and it can run in the background (there are various guides available for setting that up). There's a decent UI, or you can hack an XML file if you're feeling so inclined.
I back up all of my home machines with the same account, no problem. I can also remotely access my backed-up data via myjungledisk.com.
It's not free, but in US terms it's certainly cheap enough (I pay around $8 a month). I feel that's more than acceptable for an offsite backup where someone else deals with hardware, (physical) security, and similar issues.
I can't recommend it enough.
I've been using this one for years, and I agree. This is a very good product, and one bonus for me is that it is cross platform. You can use the same product across all platforms you use, be it Linux, Mac or Windows.
– sbrattla
Oct 4 '15 at 19:19
The big "$4" with small "As Jungle Disk is designed for 2-250 employee businesses each customer account is subject to a minimum monthly charge of $8 per month." below is a very discouraging start.
– Mateusz Konieczny
Aug 7 '18 at 5:58
Areca Backup
Warning: Unmaintained, last release in 2015.
Areca Backup is also a very decent GPL program for making backups easily.
Features
- Archive compression (Zip & Zip64 formats)
- Archive encryption (AES128 & AES256 encryption algorithms)
- Storage on a local hard drive, network drive, USB key, or FTP/FTPS server (with implicit and explicit SSL/TLS)
- Source file filters (by extension, subdirectory, regular expression, size, date, status, with AND/OR/NOT logical operators)
- Incremental, differential and full backup support
- Support for delta backup (store only modified parts of your files)
- Archive merges: you can merge contiguous archives into one single archive to save storage space.
- As-of-date recovery: Areca allows you to recover your archives (or single files) as of a specific date.
- Transaction mechanism: all critical processes (such as backups or merges) are transactional. This guarantees your backups' integrity.
- Backup reports: Areca generates backup reports that can be stored on your disk or sent by email.
- Post-backup scripts: Areca can launch shell scripts after backup.
- File permissions, symbolic links and named pipes can be stored and recovered (Linux only).
I run a custom Python script which uses rsync to save my home folder (less trash etc.) into a folder labelled "current" on a separate backup HDD (connected by USB), and then the copy (cp) command to copy everything from "current" into a date-time-stamped folder on the same HDD.
The beautiful thing is that each snapshot contains every file in your home folder as it was at that time, yet the HDD doesn't fill up unnecessarily. Because most files never change, there is only ever one actual copy of those files on the HDD; every other reference to them is a hard link. And if a newer version of a file replaces the one in "current", the existing snapshots still share the single stored copy of the older version between them. The filesystem takes care of that by itself. Although there are all sorts of refinements in the script, the main commands are simple. Here are a few of the key ingredients:
exclusion_path = "/home/.../exclusions.txt"  # don't back up trash etc.
media_path = "/media/..."  # a long path with the HDD details and the "current" folder
current = "..."   # the "current" folder on the HDD
dest = "..."      # the timestamped folder on the HDD
# the script then runs, with those variables substituted in:
#   rsync -avv --progress --delete --exclude-from=<exclusion_path> /home/username/ <media_path>
#   cp -alv <current> <dest>
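Put together as an actual runnable script, the core looks something like the following sketch; the mount point, exclusions path and timestamp format here are illustrative placeholders, not the exact ones from my script:
#!/usr/bin/env python3
# Sketch: rsync into "current", then snapshot it via hard links.
import subprocess
from datetime import datetime
from pathlib import Path

home = str(Path.home()) + "/"          # trailing slash: rsync copies contents, not the folder
backup_root = Path("/media/backup")    # placeholder mount point of the backup HDD
exclusions = Path.home() / "exclusions.txt"

current = backup_root / "current"
snapshot = backup_root / datetime.now().strftime("%Y-%m-%d_%H-%M-%S")

# 1. Mirror the home folder into "current" (--delete prunes files removed at source).
subprocess.run(["rsync", "-a", "--delete", f"--exclude-from={exclusions}",
                home, str(current)], check=True)

# 2. Hard-link "current" into a timestamped snapshot: unchanged files share a
#    single copy on disk, so each snapshot costs almost no extra space.
subprocess.run(["cp", "-al", str(current), str(snapshot)], check=True)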
I had some custom needs as well. Because I have multiple massive (e.g. 60GB) VirtualBox disk images, I only ever wish to have one copy of those, not snapshot versions. Even a 1 or 2 TB HDD has limits.
Here are the contents of my exclusions file. The file is very sensitive to missing trailing slashes and the like:
/.local/share/Trash/
/.thumbnails/
/.cache/
/Examples/
A tool that does something very similar for you (always having complete snapshots, using hard links to not waste disk space) is rsnapshot -- maybe you should give it a try
– Marcel Stimberg
Sep 2 '10 at 9:08
Dirvish
Dirvish is a nice command-line snapshot backup tool that uses hard links to reduce disk space. It has a sophisticated way to purge expired backups.
Here is a nice tutorial for it: http://wiki.edseek.com/howto:dirvish
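For a flavour of the setup, here is a minimal sketch of dirvish's two configuration files, written from memory of its documentation; the paths and schedule are placeholders, so check the tutorial above for the real details:
# /etc/dirvish/master.conf (sketch)
bank:
    /backups
exclude:
    lost+found/
    *~
Runall:
    home    22:00
expire-default: +30 days

# /backups/home/dirvish/default.conf -- per-vault config for a "home" vault
client: localhost
tree: /home
xdev: 0
index: gzip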
This is a really good way to get rsync incremental backups to work!
– Nanne
May 20 '13 at 8:26
Duplicati
An open-source, gratis backup application that runs on Linux, with a GUI; it "securely stores encrypted, incremental, compressed backups on cloud storage services and remote file servers. It works with Amazon S3, Windows Live SkyDrive, Google Drive (Google Docs), Rackspace Cloud Files or WebDAV, SSH, FTP (and many more)".
Version 1.0 is considered stable; there is a version 2 in development with considerable internal changes that is currently working (though I wouldn't use it for production yet). There are standard or custom filter rules to select which files to back up.
I have been using it on and off for years (I'm not affiliated with the project, though as a developer I have considered looking at the API to add a backend), on both a Windows laptop and my Ubuntu 14.04 install.
It is a fork of duplicity.
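If you prefer the terminal, version 2 also ships a command-line interface. A rough sketch with a local-folder target (flags from memory, so verify against the docs before relying on them):
# back up the home folder to a local target, encrypted with a passphrase
duplicati-cli backup file:///media/backup/duplicati ~/ --passphrase="my secret"
# list the backup sets stored at that target
duplicati-cli list file:///media/backup/duplicati --passphrase="my secret"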
PING is a no-nonsense free backup tool that lets you make backups of entire partitions. It is a standalone utility that should be burnt to a CD.
What I like about this program is that it copies the entire partition.
Imagine this: while modifying your Ubuntu as a superuser, you changed a vital part and Ubuntu won't start up anymore.
You could format the hard disk and reinstall Ubuntu. While backup solutions such as Dropbox, Ubuntu One etc. might be useful for retrieving the important files, they won't restore your wallpaper, Unity icons and other stuff that made your Ubuntu the way you liked it.
Another option is to ask for help on the internet. But why not just restore the whole system to the way it was a few days ago? PING will do exactly this for you.
Pros:
- Will back up not only documents, but system files as well
- It's easy to use
- It is possible to back up other (non-Linux) partitions as well
- It will compress the backup in gzip or bzip2 format, saving disk space
Cons:
- The PC has to be restarted before you can back up
- PING makes a backup of an entire partition, even when only a few files have been modified
- You'll need an external hard drive or some free space on your PC to put your backups in
An excellent Dutch manual can be found here.
s3ql is a more recent option for using Amazon S3, Google Storage or OpenStack Storage as a file system. It works on a variety of Linux distros as well as Mac OS X.
Using it with rsync, you can get very efficient incremental offsite backups, since it provides storage and bandwidth efficiency via block-level deduplication and compression. It also supports privacy via client-side encryption, plus some other fancy things like copy-on-write, immutable trees and snapshotting.
See Comparison of S3QL and other S3 file systems for comparisons with PersistentFS, S3FS, S3FSLite, SubCloud, S3Backer and ElasticDrive.
I've been using it for a few days, starting from s3_backup.sh (which uses rsync), and am quite happy. It is very well documented and seems like a solid project.
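The basic workflow is roughly the following; the bucket name and mount point are placeholders, and the exact storage-URL syntax depends on your s3ql version:
mkfs.s3ql s3://my-backup-bucket              # one-time: create the file system
mount.s3ql s3://my-backup-bucket /mnt/s3ql   # mount the bucket
rsync -a --delete ~/ /mnt/s3ql/current/      # mirror the home folder
s3qlcp /mnt/s3ql/current /mnt/s3ql/snap-$(date +%F)   # copy-on-write snapshot
umount.s3ql /mnt/s3ql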
TimeVault
Warning: unmaintained
TimeVault is a tool for making snapshots of folders and comes with Nautilus integration. Snapshots are protected from accidental deletion or modification since they are read-only by default.
Can be downloaded from Launchpad.
inosync
A Python script that offers a more-or-less real-time backup capability.
Note that this software is not maintained anymore.
"I came across a reference to the “inotify” feature that is present in recent Linux kernels. Inotify monitors disk activity and, in particular, flags when files are written to disk or deleted. A little more searching located a package that combines inotify's file event monitoring with the rsync file synchronization utility in order to provide the real-time file backup capability that I was seeking. The software, named inosync, is actually a Python script, effectively provided as open-source code, by the author, Benedikt Böhm from Germany (http://bb.xnull.de/)."
http://www.opcug.ca/public/Reviews/linux_part16.htm
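The idea is easy to sketch. The following is not inosync itself, just an illustration of the inotify-plus-rsync combination it describes, using Python's watchdog library (an inotify wrapper on Linux) and placeholder paths:
import subprocess
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

SRC = "/home/username/Documents/"    # placeholder: folder to watch
DEST = "/media/backup/Documents/"    # placeholder: rsync target

class MirrorOnChange(FileSystemEventHandler):
    def on_any_event(self, event):
        # any create/modify/move/delete triggers a full mirror run
        subprocess.run(["rsync", "-a", "--delete", SRC, DEST])

observer = Observer()
observer.schedule(MirrorOnChange(), SRC, recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)    # keep the process alive; events arrive on a worker thread
except KeyboardInterrupt:
    observer.stop()
observer.join()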
Obnam
Warning: Software is no longer maintained, authors recommend not using it
'Obnam is an easy, secure backup program. Backups can be stored on local hard disks, or online via the SSH SFTP protocol. The backup server, if used, does not require any special software, on top of SSH.
Some features that may interest you:
- Snapshot backups. Every generation looks like a complete snapshot, so you don't need to care about full versus incremental backups, or rotate real or virtual tapes.
- Data de-duplication, across files, and backup generations. If the backup repository already contains a particular chunk of data, it will be re-used, even if it was in another file in an older backup generation. This way, you don't need to worry about moving around large files, or modifying them.
- Encrypted backups, using GnuPG.'
An old version can be found in the Ubuntu software sources; for the newest version refer to Chris Cormack's PPA or Obnam's website.
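Basic usage went along these lines; the repository path is a placeholder and the commands are written from memory of the manual, so double-check them before relying on them:
obnam backup -r /media/backup/obnam-repo $HOME        # back up the home folder
obnam generations -r /media/backup/obnam-repo         # list snapshots ("generations")
obnam restore -r /media/backup/obnam-repo --to /tmp/restored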
saybackup and saypurge
There is a nice script called saybackup which allows you to do simple incremental backups using hard links. From the man page:
This script creates full or reverse incremental backups using the
rsync(1) command. Backup directory names contain the date and time
of each backup run to allow sorting and selective pruning. At the
end of each successful backup run, a symlink '*-current' is updated
to always point at the latest backup. To reduce remote file
transfers, the '-L' option can be used (possibly multiple times) to
specify existing local file trees from which files will be
hard-linked into the backup.
The corresponding script saypurge provides a clever way to purge old backups. From the home page of the tool:
Sayepurge parses the timestamps from the names of this set of backup
directories, computes the time deltas, and determines good deletion
candidates so that backups are spaced out over time most evenly. The
exact behavior can be tuned by specifying the number of recent files
to guard against deletion (-g), the number of historic backups to keep
around (-k) and the maximum number of deletions for any given run
(-d). In the above set of files, the two backups from 2011-07-07 are
only 6h apart, so they make good purging candidates...
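A hypothetical invocation, inferred purely from the options quoted above (I have not verified the exact argument order against the script itself):
# guard the 3 newest backups, keep 10 historic ones, delete at most 2 per run
saypurge -g 3 -k 10 -d 2 /backups/myhost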
backup2l
Warning: unmaintained, last commit on 2017-02-14
From the homepage:
backup2l is a lightweight command line tool for generating,
maintaining and restoring backups on a mountable file system (e. g.
hard disk). The main design goals are low maintenance effort,
efficiency, transparency and robustness. In a default installation,
backups are created autonomously by a cron script.
backup2l supports hierarchical differential backups with a
user-specified number of levels and backups per level. With this
scheme, the total number of archives that have to be stored only
increases logarithmically with the number of differential backups
since the last full backup. Hence, small incremental backups can be
generated at short intervals while time- and space-consuming full
backups are only sparsely needed.
The restore function allows to easily restore the state of the file
system or arbitrary directories/files of previous points in time. The
ownership and permission attributes of files and directories are
correctly restored.
An integrated split-and-collect function allows to comfortably
transfer all or selected archives to a set of CDs or other removable
media.
All control files are stored together with the archives on the backup
device, and their contents are mostly self-explaining. Hence, in the
case of an emergency, a user does not only have to rely on the restore
functionality of backup2l, but can - if necessary - browse the files
and extract archives manually.
For deciding whether a file is new or modified, backup2l looks at its
name, modification time, size, ownership and permissions. Unlike other
backup tools, the i-node is not considered in order to avoid problems
with non-Unix file systems like FAT32.
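Configuration lives in /etc/backup2l.conf, which is sourced as a shell fragment. A hypothetical minimal setup, with variable names from memory of the sample config and placeholder paths:
SRCLIST=(/etc /home)          # directories to back up
BACKUP_DEV=/media/backup      # mounted file system that receives the archives
VOLNAME="mybox"               # prefix for the archive names
MAX_LEVEL=3                   # depth of the differential hierarchy
MAX_PER_LEVEL=8               # backups kept per level
MAX_FULL=2                    # full backups to keep around

# a manual backup run (normally triggered by the cron script) is then just:
# backup2l -b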
boxbackup
From the homepage:
Box Backup is an open source, completely automatic, on-line backup
system. It has the following key features:
- All backed-up data is stored on the server in files on a filesystem - no tape, archive or other special devices are required.
- The server is trusted only to make files available when they are required - all data is encrypted and can be decoded only by the original client. This makes it ideal for backing up over an untrusted network (such as the Internet), or where the server is in an uncontrolled environment.
- A backup daemon runs on systems to be backed up, and copies encrypted data to the server when it notices changes - so backups are continuous and up-to-date (although traditional snapshot backups are possible too).
- Only changes within files are sent to the server, just like rsync, minimising the bandwidth used between clients and server. This makes it particularly suitable for backing up between distant locations, or over the Internet.
- It behaves like tape - old file versions and deleted files are available.
- Old versions of files on the server are stored as changes from the current version, minimising the storage space required on the server. Files on the server are also compressed to minimise their size.
- Choice of backup behaviour - it can be optimised for document or server backup.
- It is designed to be easy and cheap to run a server. It has a portable implementation, and optional RAID implemented in userland for reliability without complex server setup or expensive hardware.
http://www.boxbackup.org/