
Git without a forge


88 comments · March 5, 2025

wahern

> People can ‘git clone’ my code, and there’s a web-based browsing interface (the basic gitweb) for looking around without having to clone it at all.

I host my own public Git repositories, but statically--read-only, no HTML views. I don't want to run any specialized server-side code, whether dynamic or preprocessed, as that's a security vector and a system-administration headache I don't want to deal with. You can host a Git repository as a set of static files using any web server, without any special configuration. Just clone a bare repo into an existing visible directory. `git update-server-info` generates the necessary index files for `git clone https://...` to work transparently. I add a post-receive hook to my read-write Git repositories that does `cd /path/to/mirror.git && git fetch && git --bare update-server-info` to keep the public repo updated.
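For what it's worth, a rough sketch of that hook, slightly expanded (the paths are placeholders, and it assumes the public mirror is a bare clone whose `origin` points back at the read-write repo):

    #!/bin/sh
    # hooks/post-receive in the read-write repo: refresh the public mirror on
    # every push so a plain `git clone https://...` against it stays current.
    unset GIT_DIR   # otherwise git keeps operating on the repo that ran the hook
    cd /var/www/html/project.git || exit 1
    git fetch origin '+refs/*:refs/*'   # bring all of the mirror's refs up to date
    git --bare update-server-info       # regenerate info/refs etc. for dumb-HTTP clients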

In theory something like gitweb could be implemented purely client-side, using JavaScript or WASM to fetch the Git indices and packs on-demand and generate the HTML views. Some day I'd like to give that a try if someone doesn't beat me to it. You could even serve it as index.html from the Git repository directory, so the browser app and the Git clone URL are identical.

johannes1234321

I haven't looked into the details of git's packing much, but for a big repo you probably don't want a client to download it all, nor make a huge number of HTTP range requests, just to show the summary page. Do you think that would work well, or would this need at least some caching layer?

dgl

Getting the tree (at the root) is only a few requests (get the refs or HEAD, get the relevant commit, get the tree ID from that). If pack files aren't involved then that's literally 3 requests. Pack files make this more complex, but careful caching and range requests could reduce that to around 5-8 extra requests (doing binary search via HTTP range requests). I don't think even libgit2 has APIs for range requests on pack files though, so this would need a special library (or patching libgit2...).

There are also various optimizations on the git side, like bitmaps[1] and commit-graphs[2]. If this is a bare repo on the server side, it shouldn't be a problem to make sure it's in a particular format with receive hooks.

That's just displaying a file listing, though. Displaying what GitHub shows, with the last change for each file, is more complex; maybe the commit graph could be used so the client wouldn't have to fetch everything itself.

  [1]: https://git-scm.com/docs/bitmap-format
  [2]: https://git-scm.com/docs/commit-graph
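For a concrete sense of it, a rough sketch of those requests against a statically served repo, using git's "dumb" HTTP layout (the URL is a placeholder):

    # Locate the root tree of a repo served as plain static files.
    REPO=https://example.com/project.git
    curl -s "$REPO/HEAD"        # e.g. "ref: refs/heads/main"
    curl -s "$REPO/info/refs"   # ref -> commit ID map written by update-server-info
    # With loose objects, the commit lives at objects/<first 2 hex>/<remaining 38>,
    # zlib-compressed; inflate it, read its "tree <id>" line, and fetch that tree
    # object the same way. With pack files, consult objects/info/packs and use
    # HTTP Range requests against the .idx and .pack files instead.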

sneak

You could write a static site generator to generate a website from the repo, so you’re still just serving static files.

jarofgreen

https://git.trevorbentley.com/itsy-gitsy/ (I don't use it, so I'm not sure how good it is; just noting it)

nhanb

I once explored doing just that: https://github.com/nhanb/boast

However, I never got around to finishing it, mainly because I couldn't decide where to stop: should I also generate pages for commits from all non-master branches, etc.?

I flirted with the idea of a browser-side repo viewer too, but re-implementing git packfile parsing in js didn't seem like something I'd want to spend my time on, so I moved on. Glad to see others pondering the same thing though.

JamesLeonis

I like this idea quite a bit. It's lightweight and read-only, which makes it far easier to host. I'm in the process of publishing some of my own repositories so I'm going to give this a try.

ndegruchy

This is what I love about Fossil[1]. You get all of those extra tools (wiki, chat, forums, bug tracker) in a server that's built into the same binary you use to manage the repo.

So, if you want to serve it, just `fossil server file.fossil` or serve a whole directory of them. Or, if you want, you can just `fossil ui` and muck around yourself, locally. The server supports SSH and HTTPS interactions for cloning and pushing.

All by the same people who make SQLite, Pikchr, and more.

[1]: https://fossil-scm.org

AceJohnny2

(tongue-in-cheek:) This is what I hate about Fossil: you get all this extra cruft (wiki, chat, forums, bug tracker), each a worse version of the dedicated software for that job.

It really depends on what you're optimizing for.

yellowapple

Thankfully, there's not much forcing you to use much of it; the "template repo" I use outright disables forum/wiki/chat/bugtracker access to non-admin users (i.e. anyone who hasn't cracked my password), since most of the time when I throw code out into the world I don't really care all that much about supporting it - and in the cases where I do care, I can always re-enable exactly the things I want.

ndegruchy

Fair enough. I usually deny permissions to the features I don't care about if I'm pushing it up on the web. It's nice to have them there if I need them, though.

system33-

Yup. +1 for fossil. I wanted an issue tracker that wasn’t text files in the repo. Lots of git-based things that were heavier (gitea and friends) or hackier than I wanted. Decided to finally try out fossil and I think it’s really really neat.

Lyngbakr

In terms of everyday workflow, does Fossil differ radically from git? If so, what's the learning curve like?

yellowapple

The biggest workflow differences I've noticed:

- The repo normally lives outside of the worktree, so remembering to 'fossil new foo.fossil && mkdir foo && cd foo && fossil open ../foo.fossil' took some getting used to. Easy enough to throw into some 'fossil-bootstrap' script in my ~/.local/bin to never have to remember again (a rough sketch of such a script follows this list).

- For published repos, I've gotten in the habit of creating them directly on my webserver and then pulling 'em down with 'fossil clone https://${FOSSIL_USER}@fsl.yellowapple.us/foo'

- The "Fossil way" is to automatically push and pull ("auto-sync") whenever you commit. It feels scary coming from Git, but now that I'm used to it I find it nice that I don't have to remember to separately push things; I just 'fossil ci -m "some message"' and it's automatically pushed. I don't even need to explicitly stage modified files (only newly-created ones), because...

- Fossil automatically stages changed files for the next commit - which is a nice time-saver in 99% of cases where I do want to commit all of my changes, but is a slight inconvenience for the 1% of cases where I want to split the changes into separate commits. Easy enough to do, though, via e.g. 'fossil ci -m "first change" foo.txt bar.txt && fossil ci -m "everything else"'.

- 'fossil status' doesn't default to showing untracked files like 'git status' does; 'fossil status --differ' is a closer equivalent.
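A rough sketch of the sort of 'fossil-bootstrap' wrapper mentioned above (the name and layout are just my own convention, not anything Fossil ships):

    #!/bin/sh
    # fossil-bootstrap: create a new Fossil repo plus a matching worktree.
    # Usage: fossil-bootstrap foo   -> creates ./foo.fossil and opens it in ./foo
    set -e
    name=$1
    [ -n "$name" ] || { echo "usage: fossil-bootstrap <name>" >&2; exit 1; }
    fossil new "$name.fossil"
    mkdir "$name"
    cd "$name"
    fossil open "../$name.fossil"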

ndegruchy

Not tremendously. You still commit, you still push and pull. There are history and ignore options.

There is a guide written for Git users:

https://fossil-scm.org/home/doc/trunk/www/gitusers.md

yellowapple

That's exactly why I've been using Fossil exclusively for new projects, and have been (very slowly) migrating existing ones over as well. It ain't like my FOSS projects get a whole lot of outside contributions anyway (though I've been tinkering a bit with bidirectional Git syncs so that people can submit GitHub PRs / GitLab MRs / generic Git bundles / etc. in those rare cases).

ackyshake

> A particular thing I don’t like about git forge websites is the way they make you create an account.

Exactly. I used to have a GitHub account but as soon as it got bought out by Microsoft, I was gone.

I still refuse to create an account, even though there have been bugs I wanted to report or patches I wanted to contribute. Some maintainers still have email addresses on their profiles, but many don't. Even if they do, I just can't muster the motivation to email them.

People like to complain about email a lot, but I enjoy the various mailing lists for open source software. You could have discussions with other users of that software, or keep track of development by following the "-devel" list. All you need is something you already have: email. Sadly, they're becoming less popular. Even Python moved to Discourse, which (dun dun dun) requires an account. Grumble grumble.

I like SourceHut for many reasons: it's the fastest forge I've used, it's FOSS, and it doesn't try to copy the GitHub UI like every other Git forge these days. But by far the biggest reason I love it is _because_ it doesn't require me to create an account to contribute. I think of it as gitweb, but nicer.

rlpb

> Multiple files to herd. When I get an email with five patch attachments, I have five files to copy around instead of one, with long awkward names.

That’s not correct. You can write the email to an mbox file (your MUA lets you do that, right?) and then use `git am` to pull it all into git.

> Why I don’t like it: because the patch series is split into multiple emails, they arrive in my inbox in a random order, and then I have to save them one by one to files, and manually sort those files back into the right order by their subject lines.

The patch series arrives threaded, so your MUA should be able to display them in order. Export them to an mbox file, and then use `git am` again.

Of course, there might be ways for someone to send you email like this with the patches broken such that `git am` won't work. I take no issue with that part of the argument.
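In practice that amounts to something like this (the mbox filename is a placeholder):

    # Save the whole patch-series thread from the MUA into one mbox, then:
    git am --3way series.mbox   # applies each patch in the mbox in order
    # If a patch fails to apply cleanly, fix things up and continue:
    git am --continue           # or bail out entirely with `git am --abort`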

LegionMammal978

Not everyone has a fancy client-side MUA that gives them trivial access to mbox files. E.g., a typical webmail service will make exporting mboxes into a whole process at best. (And on the sending side, have fun with the lack of git send-email integration. I've spent far more time than I'd like manually patching up References and In-Reply-To headers.)

Of course, the classic response is "get a better MUA you luser", but that just adds even more steps for people who use webmail services for 99.9% of their email needs.

rlpb

People can use webmail for regular email, but then connect a “better” MUA for patch handling. I get that this would be more steps, but those who don't want to do that probably just use GitHub PRs anyway, and that's fine; they can carry on doing that :-)

I’m just completing the picture by pointing out that for those who choose to use emails to jockey patches around by mutual agreement, including patches in emails really shouldn’t be a problem.

djha-skin

For those that don't have an MUA, I have made git-receive-mail[1]. It really is quite doable these days to use the email workflow, on both ends.

1: https://github.com/djha-skin/git-receive-mail

layer8

Yes, this all stands or falls with using a competent email client. There are some hints regarding email clients here, though focused on sending patches: https://github.com/torvalds/linux/blob/master/Documentation/...

MaxBarraclough

This reminds me of Drew DeVault's advocacy for the traditional email-driven git workflow. [0][1] (Drew is the creator of SourceHut, an email-oriented git forge.)

I think his (Simon's) objection to git send-email emails could be addressed with better tooling, or better use of them. It's 'just' a matter of exporting the emails into a single mailbox file, right? (I'm not experienced with git's email-driven features.)

It seems to work smoothly for the Linux kernel folks; clearly it doesn't have to be clunky.

> because git format-patch doesn’t mention what commit the patches do apply against, I’m more likely to encounter a conflict in the first place when trying to apply them.

This is what the --base flag is for. [2]

[0] https://git-send-email.io/

[1] https://git-am.io/

[2] https://git-scm.com/docs/git-format-patch#Documentation/git-...
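A quick sketch of what that looks like (the branch names are placeholders):

    # Record what the series applies against; adds a "base-commit:" trailer
    # that tells the recipient which commit the patches apply on top of.
    git format-patch --base=origin/main origin/main..my-feature
    # or let git compute the base from the branch's upstream:
    git format-patch --base=auto origin/main..my-feature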

bjackman

It doesn't work smoothly for Linux kernel folks. It's a huge pain in the arse to review code. Some subprojects have CI, but mostly it's too much work to set it up. You never know if the code that gets merged is truly what got reviewed. Half the time, if you wanna test out someone's patches, you have to spend 20 minutes trying to figure out what base commit they even apply to. Old-fashioned mail clients are a huge pain to deal with (and mail servers? Fuck). Raw text with no concept of "review thread resolved" wastes loads of review energy on manually tracking what feedback has or hasn't been addressed. Comparing old and new versions of a patchset (the equivalent of a pull request) means manually searching your mailbox for the different versions, manually applying them to your local repo (hopefully you can find the base commit the old version applies to, which by now might not even exist any more if it's an unstable maintainer branch), and then manually running git-range-diff.
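For reference, that last manual step looks roughly like this (branch names are placeholders):

    # Compare v1 and v2 of a series once each has been applied to its own branch:
    git range-diff series-v1-base..series-v1 series-v2-base..series-v2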

As someone who works with a git-send-email workflow every day, I can tell you, it sucks. Email is not a good substrate for development.

If I were Linus I would be pestering the Linux Foundation to set up a bunch of managed Gerrit instances or something.

imiric

Don't they have tooling to help with these issues? Surely a project the size of Linux, for which Git was originally created, would arrive at a workflow that is optimal for most contributors. This won't align with everyone's preferences, of course, but I can't imagine someone like Linus not being happy with it.

JoshTriplett

> Surely a project the size of Linux, for which Git was originally created, would arrive at a workflow that is optimal for most contributors.

No, it really, really wouldn't. The Linux workflow is optimized for the preferences of (a subset of) maintainers, notably Linus; it is not optimized for contributors.

cortesoft

I think it is funny to think of the email-driven git workflow as 'traditional' when it had such a short period of time as the standard way of doing things with git.

Git was created in 2005, and Github was created in 2008. So we had 3 years of email-driven git and 17 years of Github style development.

PhilipRoman

Patches were being mailed long before that, since they work perfectly fine without Git.

sneak

Email, like nntp and gopher, lost to the web.

People won’t use it if it ain’t on the web.

If you force them to use it anyway, very few people will use it.

DdV’s advocacy stems from the fact that he is a lone-wolf dev, building tooling for other lone-wolf devs. The social and collaborative features sucking is a feature, not a bug.

It falls flat on its face for larger projects and communities.

bigstrat2003

> If you force them to use it anyway, very few people will use it.

And that's perfectly fine. Not all things are for all people, nor should they even try to be.

IshKebab

> Sometimes people just can’t work out how to send me patches at all.

Yeah, indeed. I have written but not submitted patches to a project (OpenSBI) because it made the submission process super complicated, requiring signing up to a mailing list and learning how to set up git send-email.

I don't see how he can think creating a GitHub account (which almost everyone already has) is a big barrier when he freely admits his process is incomprehensible.

I don't buy that GitHub locks you in, either. It's pretty easy to copy issues and releases elsewhere if it really comes to it. Or use GitLab or Codeberg if you must.

https://docs.codeberg.org/advanced/migrating-repos/

Any of those are far better than random mailing lists, bugzilla, and emailed patch files.

PuTTY is great, but please don't listen to this.

q0uaur

Back when I checked out sr.ht, I really liked the idea of git-send-email precisely because it doesn't require making an account for everything. There's a nice tutorial for how to set it up: https://git-send-email.io/#step-1

It worked easily enough for me; I could see myself using it for small patches here and there.

I did end up installing Forgejo in my homelab after all, but back then it sounded like federation was much closer than it actually was. I did kind of expect that, though; federation gets complex pretty quickly.

Every time I log into Forgejo, I do see that juicy "proceed with OpenID" button though, and I've looked into running my own OpenID provider a few times - sadly, not seeing anything that would work for me yet. Honestly, I can't believe we went from "Facebook sign-in" to "Google sign-in" and are now going to "GitHub sign-in" without a single open standard that's gotten some adoption.

kazinator

> requiring signing up to a mailing list

A traditionally configured mailing list allows posts from non-subscribers.

All the mailing lists I operate are like this.

If you have good anti-spam-fu, you can get away with it. Plus, it's possible to have posts from non-members be held for moderation, and you can whitelist non-members who write legitimate posts.

Projects which require people to sign up to their mailing lists to participate are erecting a barrier that costs them users; it's a stupid thing to do, and unnecessary.

Whenever I have to interact with some mailing list, I begin by just sending my query to the list address. If it bounces due to nonmembership, I usually move on, unless it's some important matter.

By the way, some modern lists allow posts from non-members but then rewrite the headers in such a way that the non-member doesn't receive replies! This happens in more than one way, I think. One of them is Reply-To munging: the list robot sets the Reply-To: header so that replies are directed to the list address, and replying to that throws away the content of the original To and Cc fields.

When this happens to me, I usually refrain from further interaction with the list. I check for replies in their archive. If I'm satisfied with what they said, that's the end of it.

devnullbrain

>when he freely admits his process is incomprehensible.

Which one of the 4 preferred processes, not including the maligned git send-email, and infinite other accepted processes?

Brian_K_White

I submitted at least one patch to PuTTY some time ago and it was such a nothingburger I don't even remember what it took. Somehow it neither baffled nor infuriated nor even annoyed me a little.

All these complaints and critiques sound like so much baby crying over nothing to me.

agf

Gotta plug the Portable Puzzle Collection, by the same author as this post: https://www.chiark.greenend.org.uk/~sgtatham/puzzles/

neilv

I can't tell you how many waiting rooms, grocery checkout lines, and delayed public transit rides his puzzle collection has gotten me through. On every handheld and laptop starting with the Symbian-based Nokia E61.

aard

I very much support this sentiment! If we want a decentralized internet, we need to stop relying on large companies to manage everything for us. Git was designed to be a p2p system, but we very quickly centralized it with forges like Github. It is very discouraging. Most of the internet is like this now--managed by a handful of very powerful organizations. There is no end to the problems this will cause.

sneak

Gitea is a foss clone of GitHub that is implementing federation features. It’s absolutely excellent.

duskwuff

One thing which is particularly excellent about Gitea (and its fork, forgejo) is that it's quite lightweight for what it does. It's a single-process Go application with low memory requirements, and it can use a SQLite database (so you don't need a separate database server).

etaweb

> In particular, your project automatically gets a bug tracker – and you don’t get a choice about what bug tracker to use, or what it looks like. If you use Gitlab, you’re using the Gitlab bug tracker. The same goes for the pull request / merge request system.

With Forgejo (Codeberg) you can toggle features such as pull requests, issues, etc.

Apparently you can also configure any external issue tracker or wiki, though I've never tried it, because the ones included with the forge are good enough for me.

rglullis

Forgejo (once federation is implemented) also fixes the main issue of forcing people to have an account at every forge.

kelnos

It does solve that problem, but then creates a huge spam problem. Over at gitlab.xfce.org we are constantly fighting new spam accounts that get created daily. We've done all the recommended/paranoid things, like not allowing new accounts to fork or create repos until we manually give them permission (among other things), and we have a script that runs hourly to shut down suspicious accounts. But it's still nuts.

If anyone with an account on any other gitlab instance could automatically do things on our gitlab instance, it would be a nightmare. We'd probably disable federation if gitlab offered it.

pabs3

The only way to deal with that is to disable registration and have people contact you to be allowed to register.

aitchnyu

Umm, why does a spammer bother with getting a Gitlab account in the first place?

ambigious7777

I do hope that Forgejo Federation is implemented, but it seems to me progress has stalled on it?

snthpy

There are radicle.xyz and tangled.sh.

AndrewDucker

For some context, Simon is the maintainer of PuTTY, as well as various other cool bits of software.

spudlyo

He mentions in TFA that he had a bug tracker for PuTTY before git forges were a thing; one might have gotten this from context.

mellosouls

The author is obviously entitled to whatever workflow he chooses to use, and he isn't proselytizing here, but by choosing to sidestep industry standards he is ultimately just putting obstacles in front of other coders who might like to contribute.

Unless you're really into git, it's just tiresome and a time sink having to work all this out.

Fwiw I've used putty in the past and appreciate his efforts, if not his obscurantist tendencies.