Do you (sorry, but just checking) repeatedly test backups? E.g., pull one monthly and bit-verify that it's correct? Are you aware of anyone testing in this way?
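To be concrete about what I mean by "bit verify", here's a rough sketch: restore a snapshot into a scratch directory with whatever tool you use, then hash-compare it against the live data. The paths below are hypothetical:

    import hashlib
    from pathlib import Path

    def sha256(path: Path, chunk: int = 1 << 20) -> str:
        # Stream the file so multi-GB files don't need to fit in memory.
        h = hashlib.sha256()
        with path.open("rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    def verify_restore(source_dir: str, restore_dir: str) -> list[str]:
        # Bit-verify: every file under source_dir must exist under
        # restore_dir with an identical SHA-256. Returns failing paths.
        src, dst = Path(source_dir), Path(restore_dir)
        bad = []
        for f in src.rglob("*"):
            if not f.is_file():
                continue
            rel = f.relative_to(src)
            restored = dst / rel
            if not restored.is_file() or sha256(f) != sha256(restored):
                bad.append(str(rel))
        return bad

    if __name__ == "__main__":
        # Hypothetical paths: live data vs. a freshly restored snapshot.
        mismatches = verify_restore("/data/documents", "/tmp/restore-check")
        print("all good" if not mismatches
              else f"{len(mismatches)} bad: {mismatches[:10]}")

(Of course, anything that legitimately changed since the snapshot will show up as a mismatch, so in practice you'd compare against data that isn't being modified, or just spot-check.)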
I can only compare from a user-experience point of view. I tried Duplicati for my Windows laptop and was never quite happy. Kopia just worked from day one. The front end still has a few bugs here and there, particularly if you're on Windows (Electron eating sockets, WebDAV mounts not always working), but the backend seems very reliable (I only did one full restore, but I also haven't seen any reports of problems).
It still has a lot of potential, IMHO. For example, the docs include some hints on how to use it with AWS storage tiering.
I haven't looked at Duplicati in a while, and it has evolved. While Duplicati's feature set looks similar now, I would need to benchmark both tools for efficiency and final backup sizes.
And while that's not quite the bit-verification you describe, I know a number of companies, including mine, that do test restores all the time.
Not the previous poster, and I don't use Kopia, but after reading Kopia's features and docs, the two seem to be on par. I use Duplicati quite extensively for personal backups and haven't really had any issues.
Duplicati has a web interface, so with proper authentication in place, you can use it to remotely monitor and manage backups.
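If all you want is a quick remote liveness check before setting up anything fancier, a minimal sketch like this works (the hostname is hypothetical; 8200 is Duplicati's default web UI port; real monitoring should of course go through the authenticated path):

    import urllib.request

    # Minimal liveness probe: just confirms the Duplicati web UI answers.
    url = "http://backup-host.example:8200/"
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            print("web UI reachable, HTTP status", resp.status)
    except OSError as exc:
        print("web UI unreachable:", exc)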
Duplicati doesn't keep a local cache of your data. It uses SQLite files for file metadata, but not for the file contents themselves.
I like Duplicati's snapshotting mechanism. You can specify how long, or how many, snapshots to keep, and my anecdotal evidence is that it's archival-storage-friendly. I imagine S3 and its lifecycle management rules could make for a decent and cost-effective backup solution.
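To sketch what I mean (bucket name, prefix, and day counts below are hypothetical, not something I've battle-tested): let S3 itself age the backup volumes into cheaper tiers, tuned to your retention settings:

    import boto3

    s3 = boto3.client("s3")
    # Hypothetical bucket/prefix: transition backup volumes to cheaper
    # storage classes as they age, and expire them past retention.
    s3.put_bucket_lifecycle_configuration(
        Bucket="my-backup-bucket",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "age-out-backup-volumes",
                    "Filter": {"Prefix": "duplicati/"},
                    "Status": "Enabled",
                    "Transitions": [
                        {"Days": 30, "StorageClass": "STANDARD_IA"},
                        {"Days": 90, "StorageClass": "GLACIER"},
                    ],
                    "Expiration": {"Days": 365},
                }
            ]
        },
    )

One caveat I'd watch for: as far as I understand, Duplicati periodically compacts old volumes, which means reading them back, so you'd want to limit compaction or keep the archive-tier transition late enough that it never touches objects Duplicati still needs.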
I'm using a Google Drive 2 TB plan, and I didn't see Kopia supporting Google Drive out of the box.
Hmm. I'm asking because I've had some trouble with Duplicati. I use it on a laptop, and it does not like being interrupted during a backup. It also doesn't fail the backup, which would be fine; instead, it gets jammed on files, particularly the large (multi-GB) SQLite file it generates to track state. It stays jammed even once the network is restored, reporting upload speeds of single-digit bytes per second. I end up having to force-kill it after multiple abort requests fail to stop the jammed backup, and there are multiple warnings that this can corrupt data / you shouldn't kill the process...
So anyway, I'm looking for alternatives.
Duplicati also, somewhat annoyingly, pins the CPU at 100% for a while during backup, which spins up the fans and gets my laptop very hot. I've been meaning to see if there's a simple way to modify the code to prevent this, but I'm very unfamiliar with C#.
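In the meantime, an OS-level workaround that doesn't require touching the C# at all is to drop the process priority so the backup yields to everything else. The process name match below is a guess, and note this keeps the machine responsive rather than reducing the total work done, so it may not help the fans much; actual throttling would need something like cpulimit:

    import sys
    import psutil

    def lower_backup_priority(name_fragment: str = "duplicati") -> None:
        # Find processes whose name contains the fragment (a guess; the
        # real name may be e.g. "Duplicati.Server") and lower priority.
        for proc in psutil.process_iter(["name"]):
            name = proc.info["name"] or ""
            if name_fragment not in name.lower():
                continue
            try:
                if sys.platform == "win32":
                    proc.nice(psutil.BELOW_NORMAL_PRIORITY_CLASS)
                else:
                    proc.nice(10)  # Unix niceness: higher = lower priority
                print(f"lowered priority of PID {proc.pid} ({name})")
            except psutil.AccessDenied:
                print(f"no permission for PID {proc.pid}; run elevated")

    if __name__ == "__main__":
        lower_backup_priority()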
Thanks so much!