MCP: An (Accidentally) Universal Plugin System
114 comments · June 28, 2025 · phh
visarga
The main benefit is not that it made interoperability fashionable, or that it makes things easy to interconnect. It is the LLM itself, if it knows how to wield tools. It's like you build a backend and the front end is no longer your job; AI does it.
In my experience Claude and Gemini can take over tool use, and all we need to do is tell them the goal. This is huge: before, we always had to specify the steps to achieve anything on a computer. Writing a fixed program to deal with a dynamic process is hard, while an LLM can adapt on the fly.
sshine
Hype, certainly.
But the way I see it, AI agents created incentives for interoperability. Who needs an API when everyone is job secure via being a slow desktop user?
Well, your new personal assistant who charges by the watt-hour NEEDS it. Just as the CEO will personally drive to get pizzas for that hackathon because it's practically free labor, everyone now wants everything connected.
For those of us who rode the API wave before integrating became hand-wavey, it sure feels like the world caught up.
I hope it will last, but I don’t know either.
mh-
Unfortunately, I think we're equally likely to see shortsighted lock-in attempts like this [0] one from Slack.
I tried to find a rebuttal to this article from Slack, but couldn't. I'm on a flight with slow wifi though. If someone from Slack wants to chime in that'd be swell, too.
I've made the argument to CFOs multiple times over the years why we should continue to pay for Slack instead of just using Teams, but y'all are really making that harder and harder.
[0]: https://www.reuters.com/business/salesforce-blocks-ai-rivals...
ebiester
It's going to take more people willing to move away from Slack for those purposes.
As it is, I'm going to propose that we move more key conversations outside of Slack so that we can take advantage of feeding them into AI. It's a small jump from that to looking for alternatives.
dgacmu
I'm happier we went with Zulip each day.
adregan
How ironic, given the number of APIs that were locking down access in response to AI training!
Though the general API lockdown was started long before that, and like you, I’m skeptical that this new wave of open access will last if the promise doesn’t live up to the hype.
TimTheTinker
MCP is supposed to grant "agency" (whatever that means), not merely expose curated data and functionality.
In practice, the distinction is little more than the difference between different HTTP verbs, but I think there is a real difference in what people are intending to enable when creating an MCP server vs. standard APIs.
adregan
Might be another reflection of McLuhan's "the medium is the message", in that APIs are built with the intended interface in mind.
To this point, GUIs; going forward, AI agents. While the intentions rhyme, the meanings of these systems diverge.
mellosouls
> the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned
Perhaps, but we see current hypes like Cursor using MCP only one way: you can feed into Cursor (e.g. browser tools), but not out (e.g. conversation history, context, etc.).
I love Cursor, but this "not giving back" mentality, originally reflected in its closed-source forking of VS Code, leaves an unpleasant taste in the mouth, and I believe it will ultimately cost Cursor developer credibility.
Lock-in still seems to be locked in.
bitwize
Remember Web 2.0? Remember the semantic web? Remember folksonomies? Mash-ups? The end of information silos? The democratizing power of HTTP APIs? Anyone? Anyone?
apgwoz
I think we found a new backronym for MCP: Mashup Context Protocol.
(The mashup hype was incredible, btw. Some of the most ridiculous web contraptions ever.)
kasey_junk
Yes. Pieces of all of those things surround us now. And where we are with regard to lock-in and interop is far beyond where we were when each of those fads happened.
MCP is a fad; it's not long-term tech. But I'm betting that shoveling data at LLM agents isn't. The benefits are too high for companies to allow vendors to lock the data away from them.
karaterobot
I don't understand your point. Some of those things were buzzwords, some were impossible dreams, some changed the way the web works completely. Are you just saying that the future is unknown?
klabb3
No. What they are saying is best said with a quote from Battlestar Galactica:
> All of this has happened before, and all of this will happen again.
"It" here being the boom and inevitable bust of interop and open API access between products, vendors, and so on. As a millennial, my flame of hope was lit during the API explosion of Web 2.0. If you're older, your dreams were probably crushed already by something earlier. If you're younger, and you're genuinely excited about MCP for the potential explosion in interop, hit me up for a bulk discount on napkins.
iLoveOncall
> I don't know how long it'll last
I'm just baffled no software vendor has already come up with a subscription to access the API via MCP.
I mean, obviously paid API access is nothing new, but "paid MCP access for our enterprise users" is surely in the pipeline everywhere, after which the openness will die down.
pininja
Mapbox is just a small step away from that with their MCP server wrapping their pay-by-use API. I wouldn’t be surprised to see a subscription offering with usage limits if that somehow appealed to them. MapTiler already offers their service as a subscription so they’re even closer if they hosted a server like this on their own.
Bjartr
And I expect there'll eventually be a way for an AI to pay for an MCP use microtransaction style.
Heck, if AIs are at some point given enough autonomy to simply be given a task and a budget, there'll be efforts to try to trick AIs into thinking paying is the best way to get their work done! Ads (and scams) for AIs to fall for!
adamesque
I think for enterprise it’s going to become part of the subscription you’re already paying for, not a new line item. And then prices will simply rise.
Optionality will kill adoption, and these things are absolutely things you HAVE to be able to play with to discover the value (because it’s a new and very weird kind of tool that doesn’t work like existing tools)
jadar
I don’t want to undermine the author’s enthusiasm for the universality of the MCP. But part of me can’t help wondering: isn’t this the idea of APIs in general? Replace MCP with REST and does that really change anything in the article? Or even an Operating System API? POSIX, anyone? Programs? Unix pipes? Yes, MCP is far simpler/universal than any of those things ended up being — but maybe the solution is to build simpler software on good fundamental abstractions rather than rebuilding the abstractions every time we want to do something new.
Jonovono
MCP is not REST. In your comparison, it's more that MCP is a protocol for discovering endpoints at runtime and letting users configure which endpoints should be used at runtime.
Say I'm building an app and I want my users to be able to play Spotify songs. Yeah, I'll hit the Spotify API. But now say I've launched my app and I want my users to be able to play a song from sonofm when they hit play. Alright, now I have to open up the code, add some if statements to hard-code the sonofm API, ship a new version, and show some update messages.
MCP is literally just a way to make this extensible, so that instead of being hard-coded, it can be configured at runtime.
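That runtime discovery can be sketched in a few lines of Python. Everything here is invented for illustration (the two toy "servers" and the `host_play` helper are not real MCP APIs); it only shows why discovery-at-runtime beats hard-coded if-statements:

```python
# Hypothetical sketch: the host app discovers tools at runtime instead of
# hard-coding each music provider behind if-statements.

def spotify_server():
    # Each "server" advertises its tools; the host never hard-codes them.
    return {"play_song": lambda title: f"spotify playing {title}"}

def sonofm_server():
    return {"play_song": lambda title: f"sonofm playing {title}"}

def host_play(servers, title):
    # The host only knows the protocol ("list your tools"), not the vendors.
    for server in servers:
        tools = server()  # runtime discovery, analogous to MCP's tools/list
        if "play_song" in tools:
            return tools["play_song"](title)
    raise LookupError("no server offers play_song")

# Swapping providers is configuration, not a code change and redeploy:
print(host_play([spotify_server], "Song A"))  # spotify playing Song A
print(host_play([sonofm_server], "Song A"))   # sonofm playing Song A
```

The point of the sketch: the host's dispatch loop never changes when a new provider appears; only the configured server list does.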
layer8
HATEOAS was supposed to be that.
Jonovono
Heh, there was a good convo about HATEOAS and MCP on HN a while back:
jaredsohn
Feels like segment.com but for calling APIs rather than adding libraries to the frontend.
kvdveer
The main difference between MCP and REST is that MCP is self-described from the very start. REST may have OpenAPI, but that is a later add-on, and we haven't quite standardized on using it. The first step of exposing an MCP server is describing it; for REST, that is an optional step that's often omitted.
xg15
Is it "self-described" in the sense I can get a list of endpoints or methods, with a human- (or LLM-) readable description for each - or does it supply actual schemata that I could also use with non-AI clients?
(Even if only the former, it would of course be a huge step forward, as I could have the LLM generate schemata. Also, at least, everyone is standardizing on a base protocol now, and a way to pass command names, arguments, results, etc. That's already a huge step forward in contrast to arbitrary Rest+JSON or even HTTP APIs)
Spivak
For each tool you get the human description as well as a JSON schema for the parameters needed to call the function.
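Concretely, a tool entry carries a name, a description readable by a human or an LLM, and plain JSON Schema for its parameters under `inputSchema`. The weather tool below is an invented example, and the validator is a deliberately minimal sketch of what a non-AI client could do with the schema:

```python
# Roughly the shape of one entry returned by MCP's tools/list
# (the "get_forecast" tool itself is an invented example):
tool = {
    "name": "get_forecast",
    "description": "Get tomorrow's weather forecast for a city.",
    "inputSchema": {  # plain JSON Schema, usable by non-AI clients too
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

def check_args(schema, args):
    # Minimal required-field check; a real client would run a full
    # JSON Schema validator here.
    return [k for k in schema.get("required", []) if k not in args]

print(check_args(tool["inputSchema"], {"city": "Oslo"}))  # []
print(check_args(tool["inputSchema"], {}))                # ['city']
```

So the answer to the question above is the latter: the schema is machine-usable, not just prose for the model.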
Szpadel
isn't SOAP also self-described?
souldeux
And gRPC with reflection, yeah?
light_hue_1
But you're describing it in a way that is useless to anything but an LLM. It would have been much better if the description language had been more formalized.
Majromax
> It would have been much better if the description language had been more formalized.
To speculate about this, perhaps the informality is the point. A full formal specification of something is somewhere between daunting and Sisyphean, and we're more likely to see supposedly formal documentation that nonetheless is incomplete or contains gaps to be filled with background knowledge or common sense.
A mandatory but informal specification in plain language might be just the trick, particularly since vibe-APIing encourages rapid iteration and experimentation.
0x696C6961
The description includes an input and output json schema.
caust1c
In my mind, the only novel thing about MCP is requiring that the schema be provided as part of the protocol. Sure, it's convenient that the shape of the request/response wrappers is the same everywhere; that helps with libraries that wrap dynamic types in static types. But everyone was already doing that with APIs; we just never agreed on what the envelope's shape should be. With the schema required by the protocol, plus the carrot of AI models seamlessly consuming it, there was finally enough of an impetus.
marcosdumay
> the only thing novel about MCP is requiring the schema is provided as part of the protocol
You mean, like OpenAPI, gRPC, SOAP, and CORBA?
sneak
You can’t connect to a gRPC endpoint and ask to download the client protobuf, but yes.
null
spenczar5
Honestly, yes. But MCP includes a really simple 'reflection' endpoint to list the capabilities of an API, with human-readable docs on methods and types. That is something gRPC, OpenAPI, and friends have supported as an optional extension for ages, but it has largely been a toy. MCP makes it central, and maybe that makes all the difference.
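For the curious, that discovery call is ordinary JSON-RPC 2.0. A rough sketch of the request/response pair, written as Python dicts, with an invented one-tool server:

```python
import json

# MCP's discovery call is plain JSON-RPC 2.0; a client sends:
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# ...and the server answers with every tool it exposes, docs included
# (this one-tool response is an invented example):
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "echo",
                "description": "Echo back the given text.",
                "inputSchema": {
                    "type": "object",
                    "properties": {"text": {"type": "string"}},
                    "required": ["text"],
                },
            }
        ]
    },
}

# Any client, AI or not, can enumerate capabilities from this:
names = [t["name"] for t in response["result"]["tools"]]
print(json.dumps(names))  # ["echo"]
```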
spudlyo
At a previous job most of our services supported gRPC reflection, and exploring and tinkering with these APIs using the grpc_cli tool was some of the most fun I had while working there. Building and using gRPC services in golang left a strong positive impression on me.
lobsterthief
I had the same experience working with GQL :)
TZubiri
Damn, I just read this and it's comforting to see how similar it is to my own response.
To elaborate on this: I don't know much about MCP, but when people speak about it, it's usually in a buzzword-seeking kind of way, and the people interested in it make these kinds of conceptual snafus.
Second, and this applies not just to MCP but also to things like JSON, Rust, and MongoDB: there's a phenomenon where people learn the complex stuff before learning the basics. It's not the first time I've cited this video of Homer studying marketing, where he reads the books out of order: https://www.youtube.com/watch?v=2BT7_owW2sU
It makes sense that this mistake is so common. The amount of literature and resources is like an inverted pyramid: there are so few classical foundations and A LOT of new stuff, most of which will not stand the test of time. Typically universities lead the way and establish a classical corpus and path, but this is such a young discipline; 70 years in, we still haven't found much stability. Universities have gone from teaching C, to Java, to Python (at least in intro to CS); maybe they will teach Rust next. But this buzzwording seems more like trying to predict the future, and there will be far more losers than winners in that game. The winners will have learned the classics in addition to the new technology; learning the new stuff without the classics is a recipe for disaster.
bayesianbot
My first thought as well. But maybe people wanting to plug their apps into their AI will at least force developers to actually implement the interface, unlike APIs, which are mostly unheard of among the general population and thus not offered?
jampa
I don't want to sound like a skeptic, but I see way more people talking about how awesome MCP is rather than people building cool things with it. Reminds me of blockchain hype.
MCP seems like a more "in-between" step until the AI models get better. I imagine in 2 years, instead of using an MCP, we will point to the tool's documentation or OpenAPI, and the AI can ingest the whole context without the middle layer.
qsort
Regardless of how good a model gets, it can't do much if it doesn't have access to deterministic tools and information about the state of the world. And that's before you take into account security: you can't have a model running arbitrary requests against production, that's psychotic.
I don't have a high opinion of MCP, and the hype it's generating is ridiculous, but the problem it supposedly solves is real. If it can work as an excuse to get providers to expose an API for their functionality, like the article hopes, that's exciting for developers.
mtkd
It's very different from blockchain hype.
I had similar skepticism initially, but I would recommend you dip a toe in the water before passing judgement.
The conversational/voice AI tech now dropping + the current LLMs + MCP/tools/functions to mix in vendor APIs and private data/services etc. really feels like a new frontier
It's not 100%, but it's close enough for a lot of use cases now, and it's going to change a lot about how we build apps going forward.
moooo99
Probably my judgement is a bit fogged. But if I get asked about building AI into our apps just one more time I am absolutely going to drop my job and switch careers
mtkd
That's likely because OG devs have been seeing the hallucination stuff, unpredictability, etc., and questioning how that fits with their carefully curated, perfect system.
What blocked me initially was watching NDA'd demos a year or two back from a couple of big software vendors on how agents were going to transform the enterprise. What they were showing was a complete non-starter to anyone who had worked in a corporate environment, because of security, compliance, HR, silos, etc., so I dismissed it.
This MCP stuff solves that: it gives you (the enterprise) control in your own walled garden while getting the gains from LLMs, voice, etc. The sum of the parts is massive.
It more likely wraps existing apps than integrates directly with them, the legacy systems becoming data or function providers (I know you've heard that before ... but so far this feels different when you work with it)
TZubiri
I wasn't able to find a good source on it, but I've read a couple of times that Anthropic (builders of MCP) do astroturfing/shilling/growth hacking/SEO/organic advertisement. Everything I've read so far about MCP and Claude, and the hype I see on social media, is consistent with that: hype and no value.
bryancoxwell
But this whole post is about using MCP sans AI
iLoveOncall
MCP without AI is just APIs.
MCP is already a useless layer between AIs and APIs, using it when you don't even have GenAI is simply idiotic.
The only redeeming quality of MCP is actually that it has pushed software vendors to expose APIs to users, but just use those directly...
ricardobeat
And that’s the whole point - it’s APIs we did not have. Now app developers are encouraged to have a public, user friendly, fully functional API made for individual use, instead of locking them behind enterprise contracts and crippling usage limits.
arbuge
I could see that happening... perhaps instead of plugging in the URL of the MCP server you'd like to use, you'd just put in the URL of their online documentation and trust your AI assistant of choice to go through all of it.
dghlsakjg
> we will point to the tool's documentation or OpenAPI
You can already do this, as long as your client has access to an HTTP MCP.
You can give the current generation of models an OpenAPI spec and they will know exactly what to do with it.
nikolayasdf123
You don't even need MCP for that, just access to a hosted swagger file.
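What a model (or any client) does with such a spec is mechanical enough to sketch. The tiny inline spec below is invented; the point is that `paths` already enumerates every callable operation:

```python
# Enumerating callable operations from a tiny, invented OpenAPI spec;
# this is the structure an LLM walks when handed a swagger file.
spec = {
    "openapi": "3.0.0",
    "paths": {
        "/songs": {
            "get": {"summary": "List songs"},
            "post": {"summary": "Add a song"},
        },
        "/songs/{id}": {
            "get": {"summary": "Fetch one song"},
        },
    },
}

def operations(spec):
    # Flatten paths into (METHOD, path, summary) tuples.
    return [
        (method.upper(), path, op.get("summary", ""))
        for path, methods in spec["paths"].items()
        for method, op in methods.items()
    ]

for op in operations(spec):
    print(op)
# first line printed: ('GET', '/songs', 'List songs')
```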
caust1c
It's incredible for investigating audit logs. Our customers use it daily.
https://blog.runreveal.com/introducing-runreveal-remote-mcp-...
MontagFTB
Bret Victor had an old video where he talked about a world in which computers very organically figured out how to interoperate. MCP feels like the first realization of that idea.
neoden
So much scepticism in the comments. I spent last week implementing an MCP server, and I must say that "well-designed" is probably an overstatement. One of the principles behind MCP is that "an MCP server should be very easy to implement". I don't know, maybe it's a skill issue, but it's not that easy at all. What is important, IMO, is that so many eyes are looking in one direction right now. That means it has a good chance of getting all its problems solved very quickly. And second, it's often so hard to gather a critical mass of attention around something to create an ecosystem, but that is happening right now. I wish all the participants patience and luck)
newtwilly
It's pretty easy if you just use the MCP Python library: you put an annotation on a function and there's your tool. I was able to do it, and it works great without my knowing anything about MCP. Maybe it's a different story if you actually need to know the protocol and implement more yourself.
klabb3
> One of the principles behind MCP is that "an MCP server should be very easy to implement".
I’m not familiar with the details but I would imagine that it’s more like:
"An MCP server which re-exposes an existing public/semi-public API should be easy to implement, with as few changes as possible to the original endpoint."
At least that’s the only way I can imagine getting traction.
mattmanser
We've done this before; it hasn't worked before, and it's only a matter of years, if not months, before apps start locking down the endpoints so ONLY ChatGPT/Claude/etc. servers can use them.
Interoperability means user portability. And no tech bro firm wants user portability, they want lock in and monopoly.
sureglymop
I've thought of this as well, but in reality, aren't MCP servers mostly just clients for pre-existing APIs?
For example, the Kagi MCP server interacts with the Kagi API. Wouldn't you have a better experience just using that API directly then?
On another note, as the number of python interpreters running on your system increases with the number of MCP servers, does anyone think there will be "hosted" offerings that just provide a sort of "bridge" running all your MCP servers?
vinkelhake
While reading this, the old ARexx (Amiga Rexx) popped into my head. It was a scripting language that in itself wasn't very noteworthy. However, it also made it easy for applications to expose functionality through an ARexx port. And again, offering up an API itself isn't noteworthy either. But it shipped by default in the system and if an application wanted to open itself up for scripting, ARexx was the natural choice. As a result, a ton of applications did have ARexx ports and there was a level of universality that was way ahead of its time.
Come to think of it - I don't know what the modern equivalent would be. AppleScript?
layer8
PowerShell with COM interfaces.
bigmattystyles
I thought MCPs just ‘figured out’ using docs how to call a program’s API. Won’t it matter that many APIs just suck?
mudkipdev
Anyone else feel like this article was written with ChatGPT?
neuronic
Not in this particular case. At this point I am starting to wonder if the
> Anyone else feel like this article was written with ChatGPT
comments are actually written by ChatGPT.
Workaccount2
I know this is nit-picky and not really relevant to the actual meat of the story, but a toaster (outside of a gag gift or gimmick) cannot run on USB-C, since a typical toaster draws ~1 kW and the USB-C power spec tops out at 240 W.
hnlmorg
A car lighter also cannot run a pizza oven, for the same reason.
But you’re right, it does kind of miss the point.
I agree with the article, and I love how the author is (mis-)using MCP. I just want to rephrase what the accident actually is.
The accident isn't that we somehow got a protocol to do things we couldn't do before. As other comments point out, MCP (the specification) isn't anything new or interesting.
No, the accident is that the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned.
I don't know how long it'll last, but I sure appreciate it.