Llamafile Returns
18 comments
· October 29, 2025
thangalin
chrismorgan
> # Avoid issues when wine is installed.
> sudo su -c 'echo 0 > /proc/sys/fs/binfmt_misc/status'
Please don’t recommend this. If binfmt_misc is enabled, it’s probably for a reason, and disabling it will break things. I have a .NET/Mono app installed that it would break, for example—it’s definitely not just Wine.
If binfmt_misc is causing problems, the proper solution is to register the executable type. https://github.com/mozilla-ai/llamafile#linux describes steps.
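The conf lines below use the kernel's binfmt_misc format, `:name:type:offset:magic:mask:interpreter:flags`, where type `M` matches by magic bytes (APE binaries start with `MZqFpD`). A one-off registration sketch, assuming `/usr/bin/ape` is already installed; systemd-binfmt applies `/usr/lib/binfmt.d/*.conf` files the same way at boot:

```shell
# Register the APE format at runtime (needs root); fields are
# :name:type:offset:magic:mask:interpreter:flags, with type M = match by magic bytes.
echo ':APE:M::MZqFpD::/usr/bin/ape:' | sudo tee /proc/sys/fs/binfmt_misc/register
```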
I made myself a package containing /usr/bin/ape and the following /usr/lib/binfmt.d/ape.conf:
:APE:M::MZqFpD::/usr/bin/ape:
:APE-jart:M::jartsr::/usr/bin/ape:
michaelgiba
I’m glad to see llamafile being resurrected. A few things I hope for:
1. Curate a continuously extended inventory of prebuilt llamafiles for models as they are released.
2. Create both flexible builds (with dynamic backend loading for CPU and CUDA) and slim minimalist builds.
3. Upstream as much as possible into llama.cpp and partner with the project.
michaelgiba
Crazier ideas would be:
- Extend the concept to also have some sort of “agent mode” where the llamafiles can launch with their own minimal file system or isolated context.
- Detailed profiling of main supported models to ensure deterministic outputs.
njbrake
Love the idea!
swyx
justine tunney gave a great intro to Llamafile at AIE last year if it helps anyone: https://www.youtube.com/watch?v=-mRi-B3t6fA
jart
Really exciting to see Mozilla AI starting up and I can't wait to see where the next generation takes the project!
FragenAntworten
The Discord link is broken, in that it links to the server directly rather than to an invitation to join the server, which prevents new members from joining.
synergy20
how is this different from ollama? for me the more/open the merrier.
ricardobeat
Ollama is a model manager and pretty interface for llama.cpp; llamafile is a cross-platform packaging tool, also based on llama.cpp, for distributing and running individual models.
benatkin
> As the local and open LLM ecosystem has evolved over the years, time has come for llamafile to evolve too. It needs refactoring and upgrades to incorporate newer features available in llama.cpp and develop a refined understanding of the most valuable features for its users.
It seems people have moved on from Llamafile. I doubt Mozilla AI is going to bring it back.
This announcement didn't even come with a new code commit, just a wish. https://github.com/mozilla-ai/llamafile/commits/main/
apitman
This is great news. Given the proliferation of solid local models, it would be cool if llamafile had a way to build your own custom versions with the model of your choice.
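For what it's worth, the original llamafile README documented a flow along these lines: copy the base `llamafile` binary, then embed your own GGUF weights and a `.args` file of default flags using the `zipalign` tool shipped in the llamafile releases. A hedged sketch; tool names and flags may have changed since, and `mymodel.gguf` is a placeholder:

```shell
# Package a custom model into a self-contained llamafile (sketch; assumes the
# llamafile release binary and its zipalign tool are on PATH).
cp "$(command -v llamafile)" mymodel.llamafile
printf -- '-m\nmymodel.gguf\n...\n' > .args   # default CLI args baked into the archive
zipalign -j0 mymodel.llamafile mymodel.gguf .args
chmod +x mymodel.llamafile
```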
behindsight
Great stuff. I'm working on something around agentic tooling and hope to collab with Mozilla AI in the future as well, since they share the same values I have.
throawayonthe
go get that investor money i guess?