Show HN: I built a full multimodal LLM by merging multiple models into one
8 comments
· February 2, 2025
willwade
You know, I think the word "multimodal" is being used badly here. It's multi-model, not multimodal, which certainly suggests a completely different thing.
yoeven
It's a framework that uses the best part of each LLM, e.g. multimodal support from Gemini, tool calling from GPT-4o, and reasoning from o3-mini, by chaining them dynamically. From a user's perspective there is no model selection or routing: just write a prompt or upload a file and it works, so it feels like you're working with a single LLM, but under the hood it does all this work to get you the best output :) Sorry if the title felt misleading, but I hope you give it a shot!
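The dynamic routing described above could be sketched roughly like this. This is an illustrative toy, not OmiAI's actual implementation: the `pickModel` and `detectCapability` functions are hypothetical, though the model identifiers are real API model names.

```typescript
// Toy capability-based router: inspect the request, pick the model
// whose strength matches, and let the caller stay model-agnostic.

type Capability = "vision" | "tools" | "reasoning" | "general";

interface LLMRequest {
  prompt: string;
  hasImage?: boolean;
  tools?: string[];
  needsDeepReasoning?: boolean;
}

// Decide which capability the request actually needs.
function detectCapability(req: LLMRequest): Capability {
  if (req.hasImage) return "vision";
  if (req.tools && req.tools.length > 0) return "tools";
  if (req.needsDeepReasoning) return "reasoning";
  return "general";
}

// Map each capability to the model that handles it best.
const MODEL_FOR: Record<Capability, string> = {
  vision: "gemini-2.0-flash", // multimodal input
  tools: "gpt-4o",            // tool calling
  reasoning: "o3-mini",       // long-form reasoning
  general: "gpt-4o-mini",     // cheap default
};

function pickModel(req: LLMRequest): string {
  return MODEL_FOR[detectCapability(req)];
}
```

A real router would also chain calls (e.g. describe an image with the vision model, then hand the text to the reasoning model), but the selection step is the core idea.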
danielbln
The problem with that phrasing is that actual model merging exists, where you merge the weights. So people reading the title might (and apparently do) expect that, rather than an LLM router.
madduci
And a similar product already exists, Langdock
upghost
I'll jump in before the haterade engine wakes up -- great bit of engineering work here! I can't imagine a better level of abstracting away the unnecessary stuff while still retaining that level of manual control.
The only thing I don't see is setup for local/in-house LLMs, but it's easy enough to spoof OpenAI calls if necessary.
yoeven
Thank you!! Local models are something I'm looking into with GGUF files & llama.cpp; it's still pretty experimental, but you can check out the branch here: https://github.com/JigsawStack/omiai/tree/feat/local-models
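For the OpenAI-compatible route mentioned below: llama.cpp's `llama-server` already exposes an OpenAI-style `/v1/chat/completions` endpoint, so a local GGUF model can be reached with a plain HTTP client. A minimal sketch, assuming a server on `localhost:8080`; the base URL and model name here are illustrative, not anything OmiAI defines.

```typescript
// Minimal OpenAI-compatible chat call against a local llama.cpp server.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the endpoint URL and request body (kept separate so it can be
// inspected/tested without a running server).
function buildChatRequest(baseUrl: string, model: string, messages: ChatMessage[]) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/v1/chat/completions`,
    body: { model, messages, stream: false },
  };
}

async function chatLocal(prompt: string): Promise<string> {
  const { url, body } = buildChatRequest("http://localhost:8080", "local-gguf", [
    { role: "user", content: prompt },
  ]);
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const data = (await res.json()) as {
    choices: { message: { content: string } }[];
  };
  return data.choices[0].message.content;
}
```

Because the wire format matches OpenAI's, the same shim works for any OpenAI-compatible backend (vLLM, Ollama, LM Studio, etc.), which is exactly the "just point at an endpoint" setup described in the next comment.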
upghost
Nice! To be clear, for my use case I didn't mean calling the local LLM directly; simply being able to point at an OpenAI-compatible API would be fine. Wrangling a native model seems like a lot of extra complexity, but you have a very good abstraction story here. I'll probably take a peek at the code to see how you're doing these abstraction layers, because the user-facing API is certainly very clean...!
kouteiheika
I clicked expecting a single full multimodal LLM made by merging multiple existing models into one, like the title suggests (which sounds very interesting), and I found... a library which is an LLM router: it calls a bunch of LLM web APIs and exposes them under a unified, easy-to-use interface?
With all due respect, sorry, but this title is very misleading. I'd expect "build an LLM" to mean, well, actually building an LLM, and while it's a very nice library it's definitely not what the title suggests.