> I disagree, cloud backups are more reliable.
Cloud backups, maybe. Data that's only in the cloud (e.g. on GitHub) is not more reliable than data that is in the cloud and also backed up by you to somewhere else (to a different cloud, if you want).
> If the repo goes down it's already going to be a lot harder to use, but it seems easier to fork a repo.
Which means you have to maintain an up-to-date fork, you lose the issues etc. from the original repo, and if a repo is DMCA'd your GitHub fork is gone as well.
> When have you not been able to work on a local repo locally?
Whenever you wanted to read the issues, or PRs by other people, that you hadn't manually copied.
I really like the project; it should make it easier to ensure I always have a copy of important stuff available. I just need to figure out the best way of telling it which repos I care enough about (I star too much stuff).
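One way to whittle a too-long star list down to repos worth backing up: a minimal sketch, assuming you've dumped your stars from GitHub's `GET /user/starred` endpoint (the `full_name`, `pushed_at`, and `stargazers_count` fields below are real fields from that API; the heuristic and cutoff are my own arbitrary choices).

```python
from datetime import datetime, timezone

# Hypothetical sample of entries, trimmed to the fields used here,
# shaped like GitHub's GET /user/starred response.
starred = [
    {"full_name": "alice/important-tool",
     "pushed_at": "2024-05-01T12:00:00Z", "stargazers_count": 5000},
    {"full_name": "bob/abandoned-demo",
     "pushed_at": "2018-01-15T08:30:00Z", "stargazers_count": 12},
]

# Arbitrary cutoff: anything pushed after this date counts as "active".
CUTOFF = datetime(2023, 1, 1, tzinfo=timezone.utc)

def worth_backing_up(repo, min_stars=100):
    """Crude heuristic: keep repos that are popular or recently active."""
    pushed = datetime.strptime(
        repo["pushed_at"], "%Y-%m-%dT%H:%M:%SZ"
    ).replace(tzinfo=timezone.utc)
    return pushed > CUTOFF or repo["stargazers_count"] >= min_stars

keep = [r["full_name"] for r in starred if worth_backing_up(r)]
print(keep)  # → ['alice/important-tool']
```

You could then feed `keep` to the backup tool instead of your whole star list; tweak the cutoff and star threshold to taste.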
EDIT: take a look at the issues, though; it has some annoying limitations.