Launch HN: Hyprnote (YC S25) – An open-source AI meeting notetaker
97 comments · July 29, 2025
crashabr
Do you intend to reach feature parity with something like MacWhisper? I'd love to switch to something open source, but automated meeting detection and push-to-transcribe (with custom rewrite actions) are two features I've learned to love, besides the basic transcript. I also enjoy the automatic transcription from an audio file, a video, or even a YouTube link.
But because MacWhisper does not store transcripts or do much with them (other than giving you export options), there are some missed opportunities: I'd love to be able to add project tags to transcripts, so that any new transcript is summarized with the context of all previous transcript summaries that share the same tag. Thinking about it, maybe I should build a Logseq extension to do that myself, as I store all my meeting summaries there anyway.
Speaker detection is not great in MacWhisper (at least in my context, where I work mostly with non-native English speakers), so that would be a good differentiator too.
itsalotoffun
I'm always amazed at these relatively tiny projects that "launch" with a "customers" list that reads like they've spent 10 years doing hard outbound enterprise sales: Google, Intel, Apple, Amazon, Deloitte, IBM, Ford, Meta, Uber, Tencent, etc.
johntopia
have to admit we did some logo play. but our users really are all over the place and we just wanted to show that off! not sure how it came across, but to be honest that's why we avoided terms like "teams" or "customers" while still showing some validation.
Lionga
"Logo play" is such a YCombinator word for Lie.
anyg
This is perfect timing! I just cancelled my fireflies.ai subscription yesterday because it just felt unnecessary. I prefer using fewer platforms and more tools, especially ones that can work under the surface.
yonl
Congrats on the launch. I never understood why an AI meeting notetaker needed SOTA LLMs and subscriptions (talking about literally all the other notetakers) - thanks for making it local-first. I use a locally patched-up whisperx + qwen3:1.7 + nomic embed (of course with a Swift script that picks up the audio buffer from the microphone) and it works just fine. Rarely, I create next steps / SOPs from the transcript - for that I use Gemini 2.5 and export as a PDF. I'll give Hyprnote a try soon.
I hope, since it's open source, you are thinking about exposing APIs/hooks for downstream tasks.
yujonglee
What kind of APIs/hooks would you expect us to expose? We're down to do that.
sjayasinghe
The ability to receive live transcripts via a webhook, including speaker diarization metadata, would be super useful.
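(Hyprnote hasn't published a webhook schema, so purely for illustration, here is a minimal Swift sketch of what a live-transcript event with diarization metadata could look like on the receiving end; every field name is hypothetical.)

    import Foundation

    // Hypothetical shape for a live-transcript webhook event.
    // None of these field names come from Hyprnote; they only illustrate the idea.
    struct TranscriptSegment: Codable {
        let speaker: String   // diarization label, e.g. "SPEAKER_01"
        let text: String      // transcribed words for this segment
        let startMs: Int      // segment start, ms from meeting start
        let endMs: Int        // segment end
        let isFinal: Bool     // false while the segment may still be revised
    }

    struct TranscriptEvent: Codable {
        let meetingId: String
        let segments: [TranscriptSegment]
    }

    // Decoding a POSTed event body on the receiving end:
    func handleWebhook(body: Data) throws -> TranscriptEvent {
        try JSONDecoder().decode(TranscriptEvent.self, from: body)
    }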
satvikpendem
Can you share the Swift script? I was thinking of doing something similar but was banging my head against the audio side of macOS.
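(Not the parent's script, but a minimal sketch of the microphone half using AVAudioEngine. Capturing other apps' audio is the harder part and needs ScreenCaptureKit or a Core Audio tap plus the relevant permissions; that part is not shown here.)

    import AVFoundation

    // Minimal mic capture on macOS: tap the input node and hand PCM buffers to a callback.
    // Requires the microphone permission (NSMicrophoneUsageDescription in Info.plist).
    // System audio from other apps is NOT covered here.
    final class MicCapture {
        private let engine = AVAudioEngine()

        func start(onBuffer: @escaping (AVAudioPCMBuffer) -> Void) throws {
            let input = engine.inputNode
            let format = input.outputFormat(forBus: 0)
            input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
                onBuffer(buffer)   // e.g. resample to 16 kHz mono and feed a local STT model
            }
            engine.prepare()
            try engine.start()
        }

        func stop() {
            engine.inputNode.removeTap(onBus: 0)
            engine.stop()
        }
    }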
Zardoz84
I would like to try this on Linux
btown
How are you balancing accuracy vs. time-to-word-on-live-transcript? Is this something you're actively balancing, or something you can let an end user tune?
I find myself often using otter.ai - because while it's inferior to Whisper in many ways, and anything but on-device, it's able to show words on the live transcript with minimal delay, rather than waiting for a moment of silence or for a multi-second buffer to fill. That's vital if I'm using my live transcription both to drive async summarization/notes and for my operational use in the same call, to let me speed-read to catch up to a question that was just posed to me while I was multitasking (or doing research for a prior question!)
It sometimes boggles me that we consider the latency of keypress-to-character-on-screen to be sacrosanct, but are fine with waiting for a phrase or paragraph or even an entire conversation to be complete before visualizing its transcription. Being able to control this would be incredible.
yujonglee
It's more of an AI model problem than an app-logic one. Decoding more frequently requires more computation, though things like speculative decoding can help.
Doing it locally is hard, but we expect to ship it very soon. Please join our Discord (https://hyprnote.com/discord) if you are interested in hearing more from us.
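(For context on the trade-off being discussed: a common pattern is to re-decode a short rolling audio window frequently and only display the prefix that stayed stable across passes. The sketch below is generic, not Hyprnote's pipeline; `decode` stands in for any local STT call.)

    // Sketch of the latency/accuracy trade-off: decode the current audio window often,
    // but only treat words as committed once two consecutive passes agree on them.
    // Shorter decode intervals mean lower perceived latency but more computation.
    struct StreamingTranscriber {
        var decode: ([Float]) -> [String]   // audio samples -> words (placeholder for a local STT call)
        var lastHypothesis: [String] = []   // output of the previous decoding pass
        var committed: [String] = []        // words stable enough to display immediately

        mutating func onAudioWindow(_ samples: [Float]) -> String {
            let hypothesis = decode(samples)
            // Keep the longest prefix that matches the previous pass; the unstable tail
            // keeps changing until the model settles on it.
            committed = zip(hypothesis, lastHypothesis)
                .prefix(while: { $0.0 == $0.1 })
                .map { $0.0 }
            lastHypothesis = hypothesis
            return committed.joined(separator: " ")
        }
    }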
teiferer
Nice!
Would be great if you could include in your launch message how you plan to monetize this. Everybody likes open-source software, and local-first is excellent too. But once you mention YC, everybody also knows there's no free lunch, so knowing what's coming down the line would be good before deciding whether to give it a shot or just move on.
yujonglee
For individuals:
We have a Pro license implemented in our app. Some non-essential features like custom templates or multi-turn chat are gated behind a paid license. (A custom STT model will also be included soon.) There's still no sign-up required. We use keygen.sh to generate offline-verifiable license keys. Currently, it's priced at $179/year.
For business:
If they want to self-host some kind of admin server with integrations, access control, and SSO, we plan to sell a business license.
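(On the offline-verifiable keys mentioned above: keygen.sh supports cryptographically signed license keys that can be checked without any network call. The sketch below assumes an Ed25519-signed "payload.signature" key layout, both parts base64-encoded; the actual Hyprnote key format may differ.)

    import Foundation
    import CryptoKit

    // Offline license check: verify a signature against a public key shipped inside the app.
    // The "payload.signature" layout and base64 encoding are assumptions for illustration.
    func verifyLicense(key: String, publicKeyBase64: String) -> Bool {
        let parts = key.split(separator: ".", maxSplits: 1)
        guard parts.count == 2,
              let payload = Data(base64Encoded: String(parts[0])),
              let signature = Data(base64Encoded: String(parts[1])),
              let publicKeyData = Data(base64Encoded: publicKeyBase64),
              let publicKey = try? Curve25519.Signing.PublicKey(rawRepresentation: publicKeyData)
        else { return false }
        // A valid signature proves the key was issued by the vendor; entitlements
        // (plan, expiry) would then be read from the decoded payload.
        return publicKey.isValidSignature(signature, for: payload)
    }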
teiferer
Does that mean the admin server is not open source?
thedevilslawyer
Another sso.tax candidate.
Let's actively not support software that chooses anti-security.
johntopia
totally fair concern. we’re actually on the same side when it comes to promoting good security practices like SSO.
the reason we’re gating the admin server under a business license is less about profiting off sso and more about drawing a line between individual and organizational use. it includes a bunch of enterprise-specific features (sso, access control, integrations, ...) that typically require more support and maintenance.
that said, the core app is fully open-source and always will be - so individuals and teams who don’t need the admin layer can still use it freely and privately, without compromising security.
we’ll keep listening and evolving the model - after all, we're still very early and flexible. appreciate the pushback.
(edit: added some more words to reinforce our flexibility)
p2hari
I just downloaded it on a Mac mini M4 Pro. I installed the Apple Silicon version, but when I try to launch it, it fails. No error message or anything; the icon just keeps bouncing in the Dock. I assumed it needed some privacy, screen-recording, and audio permissions and explicitly granted them, but it still just bounces in the Dock and the app does not open. (OS: macOS Sequoia 15.5)
yujonglee
Seems like this (https://github.com/fastrepl/hyprnote/blob/d0cb0122556da5f517...) is invalid on the Mac mini. Should be fixed today.
johntopia
working on identifying the problem! could you come over to our discord, where we can better support you? https://hyprnote.com/discord
yujonglee
That is very strange. Can you launch it from the command line and share what you got?
/Applications/Hyprnote.app/Contents/MacOS/Hyprnote
theodorewiles
Looks really cool - I noticed Enterprise has smart consent management?
The thing I think some enterprise customers are worried about in this space is that in many jurisdictions you legally need to disclose recording - having a bot join the call can do that disclosure - but users hate the bot and it takes up too much visibility on many of these calls.
Would love to learn more about your approach there
johntopia
yes, we’re rolling out flexible consent options based on legal needs - like chat messages, silent bots, blinking backgrounds, or consent links before/during meetings. but still figuring out if there's a more elegant way to do this. would love to hear your take as well.
theodorewiles
Please shoot me a note - I'm trying to figure this out for my enterprise now, would love to figure out a way to get you in / trial it out.
johntopia
can i send you a follow-up to the email that's on your profile?
rushingcreek
Congrats on the launch! I'm very bullish on how powerful <10B-param models are becoming, so the on-device angle is cool (and great for your bottom line too, as it's cheaper for you to run).
Something that I think is interesting about AI note taking products is focus. How does it choose what's important vs what isn't? The better it is at distinguishing the signal from the noise, the more powerful it is. I wonder if there is an in-context learning angle here where you can update the model weights (either directly or via LoRA) as you get to know the user better. And, of course, everything stays private and on-device.
yujonglee
> How does it choose what's important vs what isn't?
The idea of Hyprnote is that you write a chicken-scratch raw note during the meeting (whatever you think is important), and the AI enhances it based on that.
On-device learning is interesting too. For example, Gboard: https://arxiv.org/abs/2305.18465
And yes - we are open to this too
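(A toy sketch of that "enhance" step, only to make the idea concrete: the raw note decides what is important, the transcript supplies the detail. `runLocalLLM` is a placeholder for whatever local inference call is used, HyprLLM in Hyprnote's case; the prompt wording is illustrative.)

    // Toy version of note enhancement: the user's raw note acts as the signal selector,
    // and the transcript is only supporting context.
    func enhanceNote(rawNote: String,
                     transcript: String,
                     runLocalLLM: (String) -> String) -> String {
        let prompt = """
        You are enhancing a meeting note. Keep only what the raw note marks as important.

        Raw note (what the user thought mattered):
        \(rawNote)

        Transcript (for supporting detail):
        \(transcript)

        Rewrite the raw note as a clear, structured summary.
        """
        return runLocalLLM(prompt)
    }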
manveerc
Congratulations! Is there a mobile version as well, especially for Android?
nashashmi
I was talking about this a week ago. One person wanted to make a PDF tutorial on how to use a piece of software. I asked him to record himself in Teams, share his screen, and have AI take notes. It creates a fabulous summary with snapshots of everything he goes over.
Hi HN! We're Yujong, John, Duck, and Sung from Hyprnote (https://hyprnote.com). We're building an open-source, privacy-first AI note-taking app that runs fully on-device. Think of it as an open-source Granola. No Zoom bots, no cloud APIs, no data ever leaves your machine.
Source code: https://github.com/fastrepl/hyprnote
Demo video: https://hyprnote.com/demo
We built Hyprnote because some of our friends told us that their companies banned certain meeting notetakers due to data concerns, or they simply felt uncomfortable sending data to unknown servers. So they went back to manual note-taking - losing focus during meetings and wasting time afterward.
We asked: could we build something just as useful, but completely local?
Hyprnote is a desktop app that transcribes and summarizes meetings on-device. It captures both your mic input and system audio, so you don't need to invite bots. It generates a summary based on the notes you take. Everything runs on local AI models by default, using Whisper and HyprLLM. HyprLLM is our proof-of-concept model fine-tuned from Qwen3 1.7B. We learned that summarizing meetings is a very nuanced task and that a model's raw intelligence (or weight) doesn't matter THAT much. We'll release more details on evaluation and training once we finish the 2nd iteration of the model (it's still not that good; we can make it a lot better).
Whisper inference: https://github.com/fastrepl/hyprnote/blob/main/crates/whispe...
AEC inference: https://github.com/fastrepl/hyprnote/blob/main/crates/aec/sr...
LLM inference: https://github.com/fastrepl/hyprnote/blob/main/crates/llama/...
We also learned that for some folks, having full data controllability was as important as privacy. So we support custom endpoints, allowing users to bring in their company's internal LLM. For teams that need integrations, collaboration, or admin controls, we're working on an optional server component that can be self-hosted. Lastly, we're exploring ways to make Hyprnote work like VSCode, so you can install extensions and build your own workflows around your meetings.
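(To make the custom-endpoint idea concrete, a sketch that assumes the internal LLM is exposed through an OpenAI-compatible /v1/chat/completions API, which is a common convention for such gateways but not a confirmed detail of Hyprnote's configuration.)

    import Foundation

    // Sketch of "bring your own endpoint": send the transcript to a company-internal
    // LLM gateway instead of a local model. Assumes an OpenAI-compatible API.
    func summarize(transcript: String, endpoint: URL, apiKey: String) async throws -> String {
        var request = URLRequest(url: endpoint.appendingPathComponent("v1/chat/completions"))
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
        request.httpBody = try JSONSerialization.data(withJSONObject: [
            "model": "internal-llm",   // whatever model name the gateway serves
            "messages": [["role": "user", "content": "Summarize this meeting:\n\(transcript)"]]
        ])
        let (data, _) = try await URLSession.shared.data(for: request)
        let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
        let message = (json?["choices"] as? [[String: Any]])?.first?["message"] as? [String: Any]
        return message?["content"] as? String ?? ""
    }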
We believe privacy-first tools, powered by local models, are going to unlock the next wave of real-world AI apps.
We're here and looking forward to your comments!