
Git-Annex

39 comments · August 25, 2025

nolist_policy

I use git-annex to manage all my data on all my drives. It automatically keeps track of which files are on which drives, ensures there are enough copies, and checksums everything. It works perfectly with offline drives.

git-annex can be a bit hard to grasp, so I suggest creating a throw-away repository, following the walkthrough [1], and trying things out. See also the workflows page [2].

[1] https://git-annex.branchable.com/walkthrough/

[2] https://git-annex.branchable.com/workflow/
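A minimal throw-away session along the lines of the walkthrough might look like this (the repository description, file name, and numcopies value are made up for illustration):

    git init annex-test && cd annex-test
    git annex init "laptop"
    git annex numcopies 2          # require at least two copies of each annexed file
    cp ~/big.iso .
    git annex add big.iso          # checksums the file and replaces it with a symlink
    git commit -m "add big.iso"
    git annex whereis big.iso      # lists which repositories have the content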

albertzeyer

How much data do you have? I'm using git-annex for my photos, which is around 100k-1M files and several TB of data, on ZFS. In the beginning everything was fine, but it has become increasingly slow, to the point that every operation takes several minutes (5-30 min or so).

I wonder whether that is ZFS, or git-annex, or maybe my disk, or something else.

Borg3

Why? WHY?! Why the heck are you using a (D)VFS on your immutable data? What is the reasoning? That stuff is immutable and usually incremental. Just throw a proper syncing algorithm at it and sync with backups; that's all. I wonder about the logic behind this...

Docs and other files you change often are a completely different story. This is where a DVFS shines. I wrote my own very simple DVFS exactly for that case: you just create a directory, init the repo manager, and voilà. A disk-wide VFS is kind of useless, as most of the data there just sits.

warp

My experience is the same: git-annex just doesn't work well with lots of small files. With annexes on slow USB disks connected to a Raspberry Pi 3 or 4, I'm already annoyed when working with my largest annex (by file count) of 25,000 files.

However, I mostly use annex as a way to archive stuff and make sure I have enough copies in distinct physical locations. So for photos I now just tar them up, with one .tar file per family member per year. This works fine for me for any data I want to keep safe but don't need to access directly very often.
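A sketch of that per-person, per-year archiving (the paths and names are hypothetical):

    # bundle one person's photos for one year into a single annexed file
    tar -cf alice-2024.tar photos/alice/2024/
    git annex add alice-2024.tar
    git commit -m "archive alice 2024"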

egwor

One thing to check is whether any security/monitoring software might be causing issues. With so many files in a git repo, operations can put a lot of load on that kind of software.

matrss

I tested a git-annex repository with about 1.5M files and it got pretty slow as well. The plain git repo grew to multiple GiB and plain git operations were super slow, so I think this was mostly a git limitation. DataLad's approach of nested subdatasets (in practice, git submodules where each submodule is a git-annex repository) can help, if it fits the data and workflows.
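For reference, splitting a large tree into nested DataLad datasets looks roughly like this (the dataset names are made up; see the DataLad handbook for the real workflow):

    datalad create photos            # top-level dataset (a git-annex repository)
    cd photos
    datalad create -d . 2023         # subdataset, registered as a git submodule
    datalad create -d . 2024         # each year gets its own annex and git history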

riedel

It would be great to have comprehensive benchmarks for git-lfs, git-annex, DVC, and the like. I am also always getting annoyed with one or the other, e.g. due to the hashing overhead. In many cases, though, the annoyances come from bad filesystem integration, on Windows in my case.

rurban

My guess is the Windows virus scanner.


_Algernon_

I have thought about doing this in the past but ran into issues (one of them being the friction of permanently deleting files once they've been added). I'd be curious how you use it, if you have time to share.

internet_points

The page doesn't say it, but git-annex was created by https://www.patreon.com/joeyh, who also made the wonderful https://joeyh.name/code/moreutils/ and https://etckeeper.branchable.com/

BrandiATMuhkuh

Does this also work if I have data on SharePoint, Dropbox, etc. and want to pull it (sync with a local machine)?

My use case is mostly ETL-related: I want to pull all customer data (enterprise customers) so I can process it, but also keep the data updated, hence the pull.

ttiurani

Relevant discussion from 9 days ago about the new native git large object promisors, in "The future of large files in Git is Git":

https://news.ycombinator.com/item?id=44916783

avar

Thanks, though it's also not so relevant, for the reasons I noted in a comment in that thread: https://news.ycombinator.com/item?id=44922405

I.e. annex is really in a different problem space than "big files in git", despite the obvious overlap.

A good way to think about it is that git-annex is a git-native, distributed solution to the storage problem on the "other side" ("server side") of something like LFS, and to reason about it from there.

andunie

I used this for years, but to me the big selling point was integration with cloud storage providers as a means of backup. That, however, was always flaky and dependent on unmaintained third-party plugins. I think there was also a bug at some point that caused some data inconsistencies, so eventually I stopped.

Does anyone know if the situation has improved on that front in the past five years?

matrss

It depends on the cloud storage provider, I think. The best chances are with those that support the more standard protocols like S3, WebDAV, or SFTP. A relatively new development is the special remote support built into rclone, which should be better maintained than some other third-party special remotes and provides access to all rclone-supported backends.
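For a provider that speaks S3, setting up a special remote looks roughly like this (the remote and bucket names are made up; see git-annex's S3 special remote documentation for the full parameter list):

    # git-annex reads the credentials from the environment
    export AWS_ACCESS_KEY_ID=...
    export AWS_SECRET_ACCESS_KEY=...
    git annex initremote mycloud type=S3 encryption=shared bucket=my-annex-bucket
    git annex copy big.iso --to mycloud   # upload the content
    git annex drop big.iso                # safe: a verified copy now exists on mycloud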

goku12

My only problem with git-annex is Haskell. I don't hate the language itself, but the sheer number of dependencies it has to install is staggering. Many of those dependencies are not used by anything else, or may be needed in incompatible versions when more than one application uses them. The pain comes when you install them using the system package manager. Just two Haskell applications, annex and pandoc, are enough to fill your daily updates with maybe a dozen little Haskell packages. God forbid you're on a distro that installs from source!

It's quite safe to just statically link most, if not all, of them directly into the application, even when some of them are shared by other applications. I have seen this complaint repeated a few times. The reply from the Haskellers seems to be that this is for the fine-grained modularity of the library ecosystem. But why do they treat it like everything starts and ends with Haskell? Sometimes there are other priorities, like system administration. None of the other compiled languages have this problem: Rust, Go, Zig, ... Even plain old C and C++ aren't this frustrating with dependencies.

I should clarify that I'm not hostile towards the Haskell language, its ecosystem, or its users. It's something I plan to learn myself. But why does this problem exist? And is there a solution?

IsTom

> It's quite safe to just statically link most, if not all of them directly into the application

If you're talking about a distro's repos, isn't this a matter of distro and package-manager policy?

aragilar

Which package manager are you using? I've not seen any issues with Haskell on apt-based systems.

goku12

I used to have issues on Arch/pacman. Now I'm on Gentoo/ebuilds.

kajika91

I'm using my self-hosted Forgejo. I don't see any benefit of git-annex over LFS so far, and I'm not even sure I could set up annex as easily.

Digging a little, I found that git-annex is written in Haskell (not a fan) and seems to be 50% slower (expected from Haskell, but with only one source so far, that's not really reliable).

I don't see the appeal of the complexity of the commands; they probably serve a purpose. Once you've opened a .gitattributes file from git-lfs, you pretty much know all you need and barely need any commands anymore.

Also, I like how setting up a .gitattributes file makes everything transparent, the same way .gitignore works. I don't see any equivalent with git-annex.
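For what it's worth, git-annex can also be driven from .gitattributes via the annex.largefiles attribute, a rough analogue of the LFS filter lines (the patterns here are illustrative):

    # git-lfs: route matching files through the LFS filter
    *.psd filter=lfs diff=lfs merge=lfs -text

    # git-annex: annex anything larger than 100kb when it is added
    * annex.largefiles=largerthan=100kb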

Lastly, any tutorial or guide about git-annex that won't show me an equivalent of 'git lfs ls-files' will definitely not appeal to me. I'm a big user of 'git status' and 'git lfs ls-files' to check and re-check everything.

avar

Annex isn't slow because it's written in Haskell; it tends to be slow because of I/O and paranoia that's warranted as the default behavior in a distributed backup tool.

E.g. if you drop something, it will by default check the remotes it has access to for that content in real time. It can be many orders of magnitude faster to use --fast etc. to (somewhat unsafely) skip all that and trust whatever metadata you have a local copy of.
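A sketch of the difference described above (the file name is made up):

    # default: contacts reachable remotes to verify enough copies exist before dropping
    git annex drop big.iso

    # trust the local location log instead of checking remotes (faster, less safe)
    git annex drop --fast big.iso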

seanparsons

LFS and git-annex have subtly different use cases in my experience. LFS is for users developing something with git that has large files in the repo, like the classic game-development example. git-annex is something you'd use to keep some important stuff backed up that happens to involve large files, like a home folder with music or whatever in it. In my case I do the latter.

aragilar

What it works really well for is storing research data. LFS can't upload to an arbitrary WebDAV/S3/SharePoint/other random cloud service.

stv0g

There is a soft-fork of Forgejo which adds support for git-annex:

https://codeberg.org/forgejo-aneksajo/forgejo-aneksajo

aragilar

How big are your repos? The largest git-annex repo I have is multiple TB (spread across multiple systems), with some files in the 10s of GB.

I'm not sure what you are doing, but from looking at the git-lfs-ls-files manpage, `git annex list --in here` is likely what you want.
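For illustration, `git annex list` prints a per-file presence matrix across repositories, so the output looks roughly like this (the remote names are made up):

    $ git annex list --in here
    here
    |web
    ||backup-usb
    |||
    X__ big.iso
    X_X photos/2024.tar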

EmilStenstrom

Happy to see use cases front and center in command-line documentation. Man pages seem to always start with "obscure command flag that you'll probably never use".

Munksgaard

Git-annex is a cool piece of technology, but my impression is that it works best for single-user repositories: for instance, as @nolist_policy described in a sibling comment, managing all your personal files, documents, music, etc. across many different devices.

I tried using it for syncing large files in a collaborative repository, and the use of "magic" branches didn't seem to scale well.


ygritte

Could this be abused to simulate something like SVN externals? I always found git submodules to be a very bad replacement for those.

fragmede

GitHub really embraced the Microsoft-esque NIH with LFS, instead of adopting git-annex.

mathstuf

While I also find git-annex more elegant, its cross-platform story is weaker. Note that LFS was originally a collaboration between GitHub and Bitbucket (maybe? some forge vendor, I think): one had the implementation and the other had the name. They met at a Git conference, and we have what we have today. My main gripes these days are the woefully inadequate limits GitHub has in place for larger projects. Coupled with the "must have all objects locally to satisfy an arbitrary push" behavior, any decently sized developer community will blow through the limits fairly quickly.

FD: I have contributed to git-lfs.

keepamovin

To its absolute detriment.

Here is a talk by someone who adores it: Yann Büchau, "Staying in Control of your Scientific Data with Git Annex": https://www.youtube.com/watch?v=IdRUsn-zB2s

codemac

While Yann has built many things with git-annex, we should be clear that git-annex has essentially a single creator: Joey Hess.

keepamovin

Here is a comment about Joey: https://news.ycombinator.com/item?id=14908529

And an interview ("When power is low, I often hack in the evenings by lantern light."): https://usesthis.com/interviews/joey.hess/