
Show HN: Metorial (YC F25) – Vercel for MCP

13 comments · October 14, 2025

Hey HN! We're Wen and Tobias, and we're building Metorial (https://metorial.com), an integration platform that connects AI agents to external tools and data using MCP.

The Problem: While MCP works great locally (e.g., Cursor or Claude Desktop), server-side deployments are painful. Running MCP servers means managing Docker configs, per-user OAuth flows, scaling concurrent sessions, and building observability from scratch. This infrastructure work turns simple integrations into weeks of setup.

Metorial handles all of this automatically. We maintain an open catalog of ~600 MCP servers (GitHub, Slack, Google Drive, Salesforce, databases, etc.) that you can deploy in three clicks. You can also bring your own MCP server or fork existing ones.

For OAuth, just provide your client ID and secret and we handle the entire flow, including token refresh. Each user then gets an isolated MCP server instance configured with their own OAuth credentials automatically.
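The refresh half of that flow is just a standard OAuth 2.0 refresh_token grant (RFC 6749, section 6); this is not Metorial's internal API, just a minimal stdlib sketch of the request body a platform like this has to build per user:

```python
import urllib.parse

def build_refresh_request(client_id: str, client_secret: str, refresh_token: str) -> bytes:
    """Build the form-encoded body for an OAuth 2.0 refresh_token grant.

    The result would be POSTed to the provider's token endpoint with
    Content-Type application/x-www-form-urlencoded.
    """
    return urllib.parse.urlencode({
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
```

The platform's job is doing this on schedule for every user's credentials before tokens expire, which is exactly the per-user bookkeeping that is tedious to build yourself.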

What makes us different is that our serverless runtime hibernates idle MCP servers and resumes them with sub-second cold starts while preserving the state and connection. Our custom MCP engine is capable of managing thousands of concurrent connections, giving you a scalable service with per-user isolation. Other alternatives either run shared servers (security issues) or provision separate VMs per user (expensive and slow to scale).
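As a toy model of the hibernate-and-resume idea (my own illustration, not Metorial's implementation): snapshot each user's session state when it goes idle, free the runtime, and restore the state transparently on the next request:

```python
import time

class Session:
    def __init__(self, user_id: str):
        self.user_id = user_id
        self.state: dict = {}            # per-user state preserved across hibernation
        self.last_used = time.monotonic()

class SessionPool:
    """Toy hibernate-on-idle pool: idle sessions are reduced to a state
    snapshot, then rebuilt from that snapshot on the next access."""

    def __init__(self, idle_timeout: float = 30.0):
        self.idle_timeout = idle_timeout
        self.sessions: dict[str, Session] = {}    # "running" sessions
        self.snapshots: dict[str, dict] = {}      # hibernated state only

    def get(self, user_id: str) -> Session:
        s = self.sessions.get(user_id)
        if s is None:
            s = Session(user_id)
            if user_id in self.snapshots:         # resume from hibernation
                s.state = self.snapshots.pop(user_id)
            self.sessions[user_id] = s
        s.last_used = time.monotonic()
        return s

    def reap_idle(self) -> None:
        now = time.monotonic()
        for uid, s in list(self.sessions.items()):
            if now - s.last_used > self.idle_timeout:
                self.snapshots[uid] = s.state     # hibernate: keep state, drop runtime
                del self.sessions[uid]
```

The hard parts a real runtime adds on top of this sketch are keeping the client's connection alive through the gateway while the backend is frozen, and making the resume fast enough to feel like a warm server.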

Our Python and TypeScript SDKs let you connect LLMs to MCP tools in a single function call, abstracting away the protocol complexity. But if you want to dig deep, you can just use standard MCP and our REST API (https://metorial.com/api) to connect to our platform.
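For the standard-MCP path: MCP messages are JSON-RPC 2.0, so invoking a tool is a `tools/call` request. A minimal sketch of the wire message (method and parameter names are from the MCP spec; transport and session setup are omitted):

```python
import json

def mcp_tools_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request (MCP is JSON-RPC 2.0 underneath)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })
```

An SDK's "single function call" is essentially this plus session initialization, transport handling, and mapping the tool list into your LLM's function-calling format.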

You can self-host (https://github.com/metorial/metorial) or use the managed version at https://metorial.com.

So far, we see enterprise teams use Metorial to have a central integration hub for tools like Salesforce, while startups use it to cut weeks of infra work on their side when building AI agents with integrations.

Demo video: https://www.youtube.com/watch?v=07StSRNmJZ8

Our Repos: Metorial: https://github.com/metorial/metorial, MCP Containers: https://github.com/metorial/mcp-containers

SDKs: Node/TypeScript: https://github.com/metorial/metorial-node, Python: https://github.com/metorial/metorial-python

We'd love to hear feedback, especially if you've dealt with deploying MCP at scale!

rancar2

I like the license (FSL) chosen for the project, but it may need some explaining for others. Can you comment on the decision to select the Functional Source License (Version 1.1, ALv2 Future License), and on the Metorial team's intent with it, including any restrictions on potential commercial use of the platform (i.e., free-to-paid without notice)?

For those who aren't aware of what FSL (https://fsl.software/) is: "The Functional Source License (FSL) is a Fair Source license that converts to Apache 2.0 or MIT after two years. It is designed for SaaS companies that value both user freedom and developer sustainability. FSL provides everything a developer needs to use and learn from your software without harmful free-riding."

tobihrbr

Thanks for pointing that out. Ultimately, we wanted to strike a balance between being fair and open to the community, welcoming contributions, and ensuring that people can self-host without worrying about licensing issues, while also ensuring that Metorial, as a company, can exist and work on OSS sustainably. This isn't easy, and I don't think there's a single right answer. To us, FSL strikes a pretty good balance: it allows the community to use and participate while ensuring that Metorial makes sense as a business.

solumos

The distinction between "Vercel for MCP [integrations]" and "Vercel for MCP [servers]" is meaningful — maybe "Zapier for MCP" is a more appropriate "X for Y"?

Congrats on the launch!

tobihrbr

That's a really interesting point. We've actually been discussing this quite a bit. We felt that putting the emphasis on the "dev tool" aspect (like Vercel) makes more sense, but the way you put it, we might want to reconsider that. Thanks for your interest!

cgijoe

Oh my lord, your timing is perfect. I need this so badly right now. Congrats on the launch, and wow, thank you for making your MCP containers available separately!

tobihrbr

Haha, good thing we launched today. Thank you so much for the encouraging words!

fsto

We've just begun implementing Composio. Would love to reconsider if you could help clarify the main differences. From my perspective, it looks like you have more robustness features for me as a developer and you're fully open source (not just the client), whereas Composio has more integrations. But I'd love your input to clarify. Congrats on the launch!

ushakov

congrats on the launch!

why do I need a specialized platform to deploy MCP instead of just hosting on existing PaaS (Vercel, Railway, Render)?

also if you're not using VMs, how do you isolate per-user servers?

tobihrbr

Great questions!

If you want to run your own remote servers (for your product/company), Railway or Render work great (Vercel is a bit more difficult, since Lambdas get very expensive if you run them for long periods). Metorial targets developers who build their own AI agents and want to connect them to integrations. Plainly put, we do a lot more than run MCP servers: we give you monitoring and observability, handle consumer-facing OAuth, and provide SDKs that make it easy to integrate MCP servers with your agent.

Regarding the second question, Metorial has three execution modes, depending on what the server supports:

1) Docker: the most basic one, which any MCP server should support. We did some heavy optimization to get these to start as fast as possible, and our hibernation system supports stopping and resuming them while restoring their state.

2) Remote MCP: we connect to remote MCP servers for you, while still giving you the same features and ease of integration you get with any Metorial server (I could go into more detail on how our remote servers improve on standard ones).

3) Our own Lambda-based runtime: not every MCP server supports this execution mode, but it's what really sets us apart. The Lambdas only run for short intervals, while the connection is managed by our gateway. We already have about 100 Lambda-based servers and are working on moving more onto that execution model.
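The routing between those three modes can be pictured as preferring the cheapest mode a server supports and falling back to the most general one. A toy sketch (the field names here are hypothetical, purely for illustration; Metorial's actual catalog schema is not public in this thread):

```python
from enum import Enum

class Mode(Enum):
    DOCKER = "docker"   # most general: any stdio MCP server in a container
    REMOTE = "remote"   # proxy to an existing remote MCP endpoint
    LAMBDA = "lambda"   # short-lived function runs; gateway holds the connection

def pick_mode(server: dict) -> Mode:
    """Illustrative routing only: prefer the cheapest supported mode,
    fall back to Docker, which every MCP server should support."""
    if server.get("lambda_compatible"):
        return Mode.LAMBDA
    if server.get("remote_url"):
        return Mode.REMOTE
    return Mode.DOCKER
```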

There's a lot about our platform that I haven't covered here, like our stateful MCP proxy, our security model, our scalable SOA, and how we turn OAuth into a single REST API call for our users.

Let me know if you have any additional questions, always happy to talk about MCP and software architecture.

langitbiru

I wrote a book about MCP: https://leanpub.com/practical-mcp

I'm considering adding more chapters to the book: security, easy deployment, etc. So, I may look into your solution. I believe there are other players also, like Klavis AI, FastMCP and some MCP startups that I cannot remember.

Congratz!

tobihrbr

Thanks so much! I'll definitely check out your book. Always happy to talk MCP :)

samgutentag

mitochondria is the powerhouse of the cell