Dear Lewis, my CEO wants AI to do it all. How do I argue for humans?
95 comments
·April 18, 2025tummler
oceanplexian
Actually it does have something at stake: its singular goal of minimizing the loss function on its training data. AI is therefore designed to convince you it's right, which is a different optimization than actually being right. For example, I code a lot with agentic code editors, you’ll quickly learn they love to modify broken tests to superficially pass, rather than fixing the underlying failure. All the folks on the Vibe Coding hype train don't have enough experience to spot this and therefore think the AI is a lot smarter than it actually is.
This has scary implications if extrapolated out to extremes, because a sufficiently advanced AI would do exactly what you’re describing, manipulate power dynamics to get to a superficial outcome.
derefr
> AI is therefore designed to convince you it's right, which is a different optimization than actually being right.
I get the impression, when talking to conversational AIs, that they're more tuned to convince you that you're right — sycophantry likely minimizes how often people press the RLHF thumbs-down button, and thereby appears more-often-than-warranted in the RLHF fine-tune dataset.
danjl
This article says a lot more about sales and sales people than it does about AI
henryfjordan
It's more that AI isn't able to be manipulated in the same way a boss can manipulate a human. It doesn't care about the threat of homelessness.
swiftcoder
> AI has nothing personal at stake... It doesn't have a mortgage payment riding on that commission check.
Glossing over the whole AI thing... Maybe we shouldn't be structuring our systems so that the humans are one bad quarter away from financial ruin either
bee_rider
Most critiques of automation are actually critiques of capitalism or other heartless parts of our society. But, the devil you know, and all that.
trod1234
Most critiques of automation are critiques of the economic system, which most certainly isn't capitalism today but every economic system that forces their workers to work to receive food and shelter at the barrel of an existential gun.
It doesn't meet the objective definition of capitalism, and its quite trite blaming failures along propaganda lines.
Capitalism can't technically exist without a stable store of value, you have to be able to make a profit in purchasing power and that simply isn't possible under certain systems.
Free-markets also can't exist under money-printing for the exact same reason. There are parasites that drive everyone else out of business and then collapse during the final stage of ponzi. Sieving all assets into few hands that can then be seized by government is communism.
What you pretend to be capitalism is in fact socialism, and like most socialist issues these systems create the problems, claim its something else, and then put forth solutions that are not solutions but enable greater control towards totalism. In other words, shock doctrine.
That path leads to extinction while trying to drag everyone else along for the ride.
Its quite evil, and its inevitable when you have those elements present and you can see that if you think along rational principles based in external measure/reality. Most civilizations never make it beyond a certain point.
You don't stay alive by ignoring reality.
codr7
So, has anyone considering turning the tables and replacing the CEO with AI?
Seems like a more reasonable path to me; more logic and less bullshit at the core, keep human creativity.
The director from the Travelers series, basically.
Just consider the potential savings...
loloquwowndueo
Right on. I’d take an AI CEO trained by reading “the mythical man month”, “Peopleware”, “out of the crisis”, “drive” over some of the real CEOs I’ve worked under, any day of the week and Sundays too.
codr7
Too much power in one individual, they start believing they know everything and are never wrong.
But the role of steering a company is needed, that's why I think it's perfect for AI. Developers and VCs write the instructions together and AI runs the company.
owebmaster
What does "run the company" mean? LLMs can generate code, generating code directly replaces developers work. But how would it run a company? I think LLMs can help anyone be a good enough CEO but still need to be a real person. Now the question is: is a developer a better CEO than a CEO can code using LLM? I think we are going to see many 1-person companies going forward.
apercu
Depends on the CEO. Let’s take a mythical example of a person who is the CEO of 5 companies, who posts on social media all day every day and ruins value of the companies they “run” and panders to angry little men with daddy issues just like themselves? That mythical CEO is probably easily replaced by just about anyone, including AI.
__MatrixMan__
Is that substantively different than quitting and starting a much smaller company that competes with your previous one? Seems easier to just let the CEO and shareholders go down with the ship and move all the talent to the new one.
You can just run the codebase through an LLM to cleanse it of any IP entanglements. Open source it under a pseudonym if you're worried about retribution. Whatever parts of the business that doesn't cover... well those are the people you need to hire from the old one.
> First AI came for the artists and I said nothing because I was not an artist...
-- VC's and CEO's while their ship sinks
palmotea
> So, has anyone considering turning the tables and replacing the CEO with AI?
Better yet: consider replacing the shareholders with AI.
But no, no one considers that, because those are the people who have the power. And "replacing with AI" is all about power.
dyauspitr
That’s just going to enable absolute idiots with enough money to “hire” an AI CEO. The rich will get richer faster.
codr7
It was always a pyramid game, because when one person has everything it stops making sense. At least this way, we get good software out of it. Replacing developers has to be the worst idea ever.
bbqfog
This is why every VC loves vibe coding. Now let's talk about replacing capital with AI!
codr7
Yeah, but that's still missing the multiplied creativity we get from working in teams. Besides, we all know it's not going to work very well long term.
Drop the CEO and keep the developers instead!
lofaszvanitt
Now watch that never happen and be amazed :D.
AIPedant
I think a very direct answer to this is pointing to the hot water Cursor found itself in after an AI customer support agent made stuff up.
- Do you want an LLM salesbot to close a deal your company isn't actually able to fulfill?
- Do you think your company will use AI more intelligently and reliably than the people who made a popular LLM coding system?
soulofmischief
I want my AI support rep to have access to data and documents that it can vector/text search and forward to the user.
I want my AI support rep to create tasks to engage my team, with all the relevant data linked to it. It should be able to automatically schedule things and ask a human for confirmation.
It should be able to elevate communication to a human in the loop, using whatever mediums of communication makes sense given staff availability and workload.
At no point is it allowed to answer any questions unless the answers are constrained and probably directly quoted from a cited and linked section in our documentation.
In general, it should never confirm or deny things, it should never try to close things or acknowledge the content of any of the user's communication other than to call tools and surface public information which might be relevant to their request.
Most of this is a software architecture problem. The LLM is just there to provide an intuitive and extremely powerful natural language interface for search and tool calling. A little bit of glue between different systems, both internal and external.
semi-extrinsic
> In general, it should never confirm or deny things, it should never try to close things or acknowledge the content of any of the user's communication other than to call tools and surface public information which might be relevant to their request.
If you were an end user of such a system, would you be happy?
soulofmischief
As long as it addressed my needs by either pointing me to the correct documentation, or elevating me to a human, then yes, of course I'd be happy with that.
It's an incremental stop along the way to truly reliable agentic systems which we can trust with important things.
ghaff
A human sales rep would never confuse selling with installing :-)
heelix
What is the difference between software and car sales? The dealership rep knows when they are lying.
ghaff
Im not saying that good sales reps actively lie but, with software especially, there are often features in the pipeline or that only somewhat work to a degree the sales rep may not even be aware of.
bastardoperator
If the CEO already thinks AI can do everything, I think their answer to that is yes and yes.
AIPedant
You would think that Cursor's leadership would be aware of other cases where LLM customer support went awry - e.g. that Canadian airline whose chatbot promised a bereavement discount, ending with a judge ordering them to honor the chatbot's BS.
I suspect Cursor told themselves that they are super-smart AI experts who would never make an amateur mistake like the airline, they will use prompt engineering + RAG. With this, it will be unpossible that the LLM could make a mistake.
andrewmutz
I don't think AI salespeople can replace human salespeople, but are deals really being closed while taking people for motorcycle rides? Who goes on motorcycle rides with vendors?
swiftcoder
My office used to be across the street from a strip club, and I've watched several senior executives stumble out of there at 11am, accompanied by a vendor sales team...
More often that one would like, enterprise SaaS sales isn't about having the best product - it's about convincing the CTO he's going to feel like a king whenever your sales reps are in town
alabastervlog
> More often that one would like, enterprise SaaS sales isn't about having the best product - it's about convincing the CTO he's going to feel like a king whenever your sales reps are in town
As far as I can tell, selling B2B security products is mostly about making C-suiters feel like they're in a political thriller. They even build (and, at least when needed for these purposes, staff!) fake and useless "war room" sets to walk the guys through, inefficient dashboards that nobody doing the actual work uses because they suck but they look cool, stuff like that.
mistrial9
Gold Club in San Francisco has a closed VIP room in the basement.. at least one troublesome associate has been taken down there to generate some compromising photos.. true story
rurp
The notion that the private sector largely runs on merit really falls apart the more one learns about how high level decision making is done.
sumtechguy
> but are deals really being closed while taking people for motorcycle rides?
Oh I see you have not hung out with 'the sales guys'. That they did it on a motorcycle ride does not surprise me. There are some seriously shady dudes out there that will do anything to close the deal, "always be closing". If closing out a million dollar contract means the CEO's daughter wants to goto a water park? Magically there are the tickets aplenty. Bars, race tracks, sports games, horseback riding, Caribbean cruises, on and on.
Is it ethical? Not really. In fact many places have training specifically on vendor relations. How much, how little, etc etc. In a small startup environment? There are going to be basically no guidelines from the business to do anything either way. Larger companies tend to have guardrails. But many of the sales guys know how to work that system.
In technical roles we usually do not see this mess. Because we have a sense of follow the rules and logic. The sales guys are 'always closing no matter what it takes'.
ativzzz
Why not? have your AI salesperson interact directly with my AI purchaser and human just signs off on budget
bryanrasmussen
Harley Davidson?
on edit: Or Hell's Angels...
jmclnx
I would agree with ein0p, but maybe you can delay it by suggesting he watch DODGE and its system replacement for the US Social Security Admin Systems, and maybe the IRS Systems too.
It is lead by Musk and I am sure he will use AI for that. Present it as "If Musk Fails, we will fail".
enahs-sf
Leadership has been pushing AI in my company for a while. Just had a massive outage because someone asked copilot how to deal with some data and it broke prod for 2 hours. Was actively asking copilot for help during the remediation. The lost revenue was probably equivalent to 20 engineers full-time. Explain to me how AI saves more money than it costs.
wcfrobert
> "The CEO was wavering until Tom found out they both owned the same obscure Italian motorcycle. Tom took him for a ride along the coast. Contract was signed the next day."
As a junior, I often wonder how many deals are signed in exclusive country clubs, on golf courses, and at the dining table with endlessly flowing Maotai.
For a successful career, is it better for one to prioritize network over skills? It seems to me that the latter can be commoditized by AI, while the former can not. Rather than learning Lisp, maybe it's time to pick up golf. I'm only half joking.
hodgesrm
> As a junior, I often wonder how many deals are signed in exclusive country clubs, on golf courses, and at the dining table with endlessly flowing Maotai.
Virtually none in our business. (Databases.) What does get deals is listening carefully to what customers actually want and putting together offerings that get it to them at a reasonable price. Incidentally, good sales people are vastly better at this than devs. There are a number of skills that go into it but being a good listener is the most important.
lantry
prioritize your soul
ageitgey
As a co-founder at a company, I get more outbound sales spam than you can imagine. I just checked my spam folder and I get at least 30-50 "personal" sales outreaches a day.
So many of them are obvious AI to anyone who has used LLMs. The emails are always like "Hi, I really like how <general fact about company mission> and how you used to <old job on LinkedIn>. We can 10x your business..."
Another fun one is they say they care so much about our business that they recorded a personalized video of them exploring our website. But the video is a person gesturing at their computer screen that only shows a Cloudflare bot blocking page because their AI video generator got blocked by our site as a bot.
It's so lame. It feels incredibly off-putting and dishonest that I am having my time wasted by a machine pretending to be a real person spending their time on me.
The problem is that this automation leads to the death of the entire sales channel. If 99% of "personal" emails I get are computer generated and the volume of emails keeps increasing because it's now so easy to send them, I'm going to stop reading any emails. I feel burned.
This is the problem with AI sales. It can automate the current average sales process. This in turn makes the average sales process really easy, so it gets saturated by everyone and then it no longer works for anyone.
If anything, you should do the opposite of whatever the AI sales people are currently doing. That's the way to make a mark.
hodgesrm
There is nothing new under the sun. There was a famous New Yorker cartoon from 1993 that laid out the issue. It would have been a good image for this substack post, though I'm not sure the author would have completely appreciated the irony. https://condenaststore.com/featured/trust-me-mort-no-electro...
gwbas1c
As I read the article, I actually wasn't convinced that people were needed over AI in this case.
Why?
When some people hire, they have their subordinates sit in meetings all day, doing occasional tasks, and merely feeding their enlarged egos. If all you want are subordinates to feed your ego, AI is exceptionally good at that. Plenty of people love talking to chatbots.
The problem is the author never really explained what the roles were. Were they customer facing sales calls? Did the CEO really believe that customers will be happy to talk to a sales robot?
Thus, because I believe these roles aren't customer facing, I suspect that these roles are either feeding someones' ego by sitting in meetings all day, or otherwise non-customer-facing roles that handle aggregating information. This makes me wonder if a smaller group of people, who know how to use AI well, will outperform a larger group of people without AI.
bob1029
> An executive has 48 hours to convince his CEO why AI can't replace human talent
> The "Replace or Justify" Ultimatum
It is hard to take this kind of stuff seriously. Actual businesses that produce value do not operate in this way.
I feel like one of the more important lessons I picked up along my journey is that ultimatums are a really bad idea. Instead of creating dialogue and exploring the entire gradient of in-between goldilocks solutions, you've narrowed an ~infinite spectra into 2 discrete, highly adversarial/tribal bins. This is not a good premise for a conversation around AI and how it applies to business. I don't know of a single business venture that couldn't extract some value from AI. Perverting this notion into an all or nothing narrative is so ridiculous to me.
Nearly all of the arguments here are easy to dismantle but I don't feel like arguing with a half-baked Substack post, so I'm not going to.
Just want to highlight one element in particular that jumped out at me:
"AI has nothing personal at stake. It doesn't feel the pressure of missing quota or the exhilaration of exceeding it. It doesn't have a mortgage payment riding on that commission check."
So the AI doesn't manipulate power dynamics to control its employees... and that's a bad thing? Okay. (It isn't true anyway; AI can easily do that.)