
Show HN: Gerbil – an open source desktop app for running LLMs locally

2 comments · November 11, 2025

Gerbil is an open source app that I've been working on for the last couple of months. Development is now largely done and I'm unlikely to add any more major features. Instead I'm focusing on bug fixes, small QoL features, and dependency upgrades.

Under the hood it runs llama.cpp (via koboldcpp) backends and allows easy integration with popular modern frontends such as Open WebUI, SillyTavern, ComfyUI, StableUI (built-in), and KoboldAI Lite (built-in).
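For readers curious what "easy integration" means in practice: koboldcpp exposes an OpenAI-compatible API (by default on port 5001), which is why frontends like Open WebUI can point at it directly. A minimal sketch of the kind of request such a frontend sends — the port and the placeholder model name here are assumptions based on koboldcpp's defaults, not anything Gerbil-specific:

```python
import json
import urllib.request

def build_chat_request(prompt: str,
                       base_url: str = "http://localhost:5001/v1") -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for a local
    koboldcpp backend (port 5001 is koboldcpp's default; adjust as needed)."""
    payload = {
        # Local backends typically serve whatever model is loaded,
        # so the model field is a placeholder here.
        "model": "local-model",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello from a local frontend!")
print(req.full_url)

# Actually sending it requires a running backend, e.g.:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Any frontend that speaks the OpenAI chat completions format can be pointed at the same base URL, which is what makes mixing and matching frontends straightforward.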

Why did I create this? I wanted an all-in-one solution for running local LLMs for text and image generation. I got fed up with having to manage separate tools for the various LLM backends and frontends. In addition, as a Linux Wayland user I needed something that would work and look great on my system.

throwaway81998

Serious question, not a "what's the point of this" shitpost... My experience with local LLMs is limited.

Just installed LM Studio on a new machine today (2025 Asus ROG Flow Z13, 96GB VRAM, running Linux). Haven't had the time to test it out yet.

Is there a reason for me to choose Gerbil instead? Or something else entirely?

A4ET8a8uTh0_v2

Not OP, but I am running ollama as a testing ground for various projects (separately from gpt sub).

<< Is there a reason for me to choose Gerbil instead? Or something else entirely?

My initial reaction is positive, because it seems to integrate everything without sacrificing the ability to customize further if need be. That said, I haven't tested it yet, but now I will.