
Jan – Ollama alternative with local UI

apitman

I really like Jan, especially the organization's principles: https://jan.ai/

Main deal breaker for me when I tried it was I couldn't talk to multiple models at once, even if they were remote models on OpenRouter. If I ask a question in one chat, then switch to another chat and ask a question, it will block until the first one is done.

Also Tauri apps feel pretty clunky on Linux for me.

_the_inflator

Yep. I really see them as an architecture blueprint with a reference implementation and not so much as a one size fits all app.

I stumbled upon Jan.ai a couple of months ago when I was considering a similar app approach. I was curious because Jan.ai went way beyond what I considered to be limitations.

I haven’t tried Jan.ai in depth yet; I see it as an implementation, not a solution.

diggan

> Also Tauri apps feel pretty clunky on Linux for me.

All of them, or this one specifically? I've developed a bunch of tiny apps for my own use on Linux with Tauri (the largest is maybe 5-6K LoC) and they've always felt snappy to me; I mostly do the data processing in Rust and the UI with ClojureScript+Reagent.

c-hendricks

Yeah, webkit2gtk is a bit of a drag

klausa

So this is how women named Siri felt in 2011.

lagniappe

Hello Jan ;)

biinjo

I'm confused. Isn't the whole premise of Ollama that it's locally run? What's the difference or USP when comparing the two?

hoppp

I think it's an alternative because Ollama has no UI, and it's hard to use for non-developers who will never touch the CLI.

simonw

Ollama added a chat UI to their desktop apps a week ago: https://ollama.com/blog/new-app

apitman

Their new app is closed source right?

moron4hire

That's not the actual tagline being used in the repo. The repo calls itself an alternative to ChatGPT. Whoever submitted the link changed it.

mathfailure

Is this an alternative to OpenWebUI?

apitman

Not exactly. OWUI is a server with a web app frontend. Jan is a desktop app you install. But it does have the ability to run a server for other apps like OWUI to talk to.
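As a sketch of what "other apps talking to it" looks like: local runners like this typically expose an OpenAI-compatible chat completions route. The base URL and port below are assumptions for illustration (check the app's server settings), not documented defaults:

```python
import json
import urllib.request

# Hypothetical local endpoint; the port is configurable in the app's
# settings, so 1337 here is an assumption, not a documented default.
JAN_BASE_URL = "http://127.0.0.1:1337/v1"

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With the local server actually running, you could send it like this:
# with urllib.request.urlopen(build_chat_request(JAN_BASE_URL, "some-model", "Hello")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the wire format is OpenAI-compatible, the same request shape works against any such local backend; only the base URL changes.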

ekianjo

Open WebUI does not include a server.

apitman

I was referring to Jan.

roscas

Tried to run Jan, but it does not start the llama server. It also tries to allocate 30 GB, which is the size of the model, but my VRAM is only 10 GB and my machine has 32 GB, so it does not make sense. Ollama works perfectly with 30B models. Another thing that is not good: it makes constant connections to GitHub and other sites.

hoppp

It probably loads the entire model into RAM at once, while Ollama avoids this with a better loading strategy.
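The "better loading strategy" usually means memory-mapping the weights file, as llama.cpp does for GGUF models: pages are faulted in from disk only when actually read, instead of the whole file being copied into RAM up front. A toy illustration of the mechanism (the file here is a tiny stand-in for a model file):

```python
import mmap
import os
import tempfile

def read_slice_mmapped(path: str, offset: int, length: int) -> bytes:
    """Read `length` bytes at `offset` without loading the whole file.

    mmap maps the file into the address space; the OS pages in only the
    regions that are actually touched.
    """
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
            return m[offset:offset + length]

# Demo with a small stand-in "model file": read 7 bytes from the middle
# without ever materializing the full file contents in our process.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"\x00" * 1024 + b"weights" + b"\x00" * 1024)
    path = tmp.name

print(read_slice_mmapped(path, 1024, 7))  # b'weights'
os.remove(path)
```

Whether a given app allocates the full model size up front also depends on quantization and GPU offload settings, so this is only one plausible explanation for the behavior described above.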

SilverRubicon

Did you see the feature list? It does not deny that it makes connections to other sites.

- Cloud Integration: Connect to OpenAI, Anthropic, Mistral, Groq, and others

- Privacy First: Everything runs locally when you want it to

trilogic

[flagged]

woadwarrior01

> If you looking for privacy there is only 1 app in the whole wide internet right now, HugstonOne

That's a tall claim.

I've been selling a macOS and iOS private LLM app on the App Store for over two years now. It is:

a) fully native (not Electron.js)

b) not a llama.cpp / MLX wrapper

c) fully sandboxed (none of Jan, Ollama, or LM Studio are)

I will not promote. It's quite shameless of you to shill your Electron.js-based llama.cpp wrapper here.

trilogic

Yes, it is a bold claim, but one I defend. I have been looking for something with full privacy like HugstonOne and couldn't find it. Like it or not, I am proud of it. You still couldn't mention one good GUI, and you call me shameless. I don't really care whether it is fully native or not, as long as it works for my research. I use it mostly for personal medical/health research, and my data/history/pics stay mine; there is nothing wrong with that. I accept every challenge to prove that HugstonOne is worth the claim.

imiric

You mean this[1]?

It's not open source, has no license, runs on Windows only, and requires an activation code to use.

Also, the privacy policy on their website is missing[2].

Anyone remotely concerned about privacy wouldn't come near this thing.

Ah, you're the author, no wonder you're shilling for it.

[1]: https://github.com/Mainframework/HugstonOne

[2]: https://hugston.com/privacy

trilogic

The app has a licence that is clearly visible when you install it. The rest is written on the website and on GitHub. As for requiring an activation code: it is made for ethical research purposes, so yes, I am distributing it responsibly. You can see how it works in the videos on the YouTube channel. But the most important point is that you can easily verify with a firewall that it does not leak bytes like all the rest. That's what I call privacy: it has a button that cuts all connections. You can say what you want, but that's it, that's all.

kgeist

>I challenge everyone to find another local GUI with that privacy

Llama.cpp's built-in web UI.

trilogic

This is from the Open WebUI docs: "Once saved, Open WebUI will begin using your local Llama.cpp server as a backend!" So you see, a llama server, not the CLI. That's a big flag there. I repeat: no app in the whole world takes privacy as seriously as HugstonOne. This is not advertisement; I am just making a point.

reader9274

Tried to run gpt-oss:20b in Ollama (runs perfectly), then tried to connect Ollama to Jan, but it didn't work.

thehamkercat

Exactly: https://github.com/menloresearch/jan/issues/5474

Can't make it work with the Ollama endpoint.

This seems to be the problem, but they're not focusing on it: https://github.com/menloresearch/jan/issues/5474#issuecommen...

bogdart

I tried Jan last year, but the UI was quite buggy. Maybe they've fixed it since.

diggan

Please do try it out again. If things used to be broken but no longer are, that's a good signal that they're gaining stability :) And if it's still broken, that's a signal they're not addressing bugs, which would be worse.

esafak

So you're saying bugs are good?!

diggan

No, but their opinion would be a lot more insightful with a comparison between then and now, instead of leaving it at "it was like that before, now I don't know".

semessier

Still waiting for vLLM to support Apple Silicon (Metal) GPUs.

baggiponte

Yeah. The docs tell you that you should build it yourself, but…

venkyvb

How does this compare to LM Studio?

rmonvfer

I use both, and Jan is basically the OSS version of LM Studio with some added features (e.g., you can use remote providers).

I first used Jan some time ago and didn't really like it, but it has improved a lot, so I encourage everyone to try it. It's a great project.

angelmm

For me, the main difference is that LM Studio's main app is not OSS. They are similar in terms of features, although I haven't used LM Studio that much.

azyc

[dead]