
Hypercapitalism and the AI talent wars

bix6

“The AI capital influx means that mega-projects no longer seem outlandishly expensive. This is good for the world!”

Is it? This whole piece just reads like mega funds and giga corps throwing ridiculous cash at pay-to-win. Nothing new there.

We can’t train more people? I didn’t know universities were suddenly producing waaaay less talent and that intelligence fell off a cliff.

Things have gone parabolic! It’s giga mega VC time!! Adios early stage, we’re doing $200M Series Seed pre revenue! Mission aligned! Giga power law!

This is just M2 expansion and wealth concentration. Plus a total disregard for 99% of employees. The 1000000x engineer can just do everyone else’s job and the gigachad VCs will back them from seed to exit (who even exits anymore, just hyper scale your way to Google a la Windsurf)!

pj_mukh

I think the author should've clarified that this is purely a conversation about the platform plays. There will be hundreds of thousands of companies on the application layer, and mini-platform plays, that will have your run-of-the-mill new grad or 1x engineer.

Everything else is just executives doing a bit of a dance for their investors a la "We won't need employees anymore!"

anovikov

Ironically, there's been no M2 expansion since the Covid days: M2-to-GDP is back to what it was pre-Covid, and overall it hasn't increased much at all even since the GFC. It's only 1.5x what it was at the 1997 bottom, when the cost of capital was much higher than today. I think this concern is misplaced.

walterbell

https://medium.com/@villispeaks/the-blitzhire-acquisition-e3...

> Blitzhires are another form of an acquisition.. not everybody may be thrilled of the outcome.. employees left behind may feel betrayed and unappreciated.. investors may feel founders may have broken a social contract. But, for a Blitzhire to work, usually everybody needs to work together and align. The driver behind these deals is speed. Maybe concerns over regulatory scrutiny are part of it, but more importantly speed. Not going through the [Hart-Scott-Rodino Antitrust Act] HSR process at all may be worth the enormous complexity and inefficiency of foregoing a traditional acquisition path.

From comment on OP:

> In 2023–2024, our industry witnessed massive waves of layoffs, often justified as “It’s just business, nothing personal.” These layoffs were carried out by the same companies now aggressively competing for AI talent. I would argue that the transactional nature of employer-employee relationships wasn’t primarily driven by a talent shortage or human greed. Rather, those factors only reinforced the damage caused by the companies’ own culture-destroying actions a few years earlier.

2014, https://arstechnica.com/tech-policy/2014/06/should-tech-work...

> A group of big tech companies, including Apple, Google, Adobe, and Intel, recently settled a lawsuit over their "no poach" agreement for $324 million. The CEOs of those companies had agreed not to do "cold call" recruiting of each others' engineers until they were busted by the Department of Justice, which saw the deal as an antitrust violation. The government action was followed up by a class-action lawsuit from the affected workers, who claimed the deal suppressed their wages.

alganet

> If the top 1% of companies drive the majority of VC returns, why shouldn’t the same apply to talent? Our natural egalitarian bias makes this unpalatable to accept, but the 10x engineer meme doesn’t go far enough – there are clearly people that are 1,000x the baseline impact.

https://www.youtube.com/watch?v=0obMRztklqU

ngruhn

> AI catch-up investment has gone parabolic, initially towards GPUs and mega training runs. As some labs learned that GPUs alone don't guarantee good models, the capital cannon is shifting towards talent.

So, no more bitter lesson?

Xcelerate

I find the current VC/billionaire strategy a bit odd and suboptimal. If we consider the current search for AGI as something like a multi-armed bandit seeking to identify “valuable researchers”, the industry is way over-indexing on the exploitation side of the exploitation/exploration trade-off.

If I had billions to throw around, instead of siphoning large amounts of it to a relatively small number of people, I would attempt to incubate new ideas across a very large base of generally smart people from interdisciplinary backgrounds. Give anyone who shows genuine interest some amount of compute resources to test their ideas, in exchange for X% of the payoff should their approach lead to some step-function improvement in capability. The current “AI talent war” is very different from sports because, unlike with a star tennis player, it’s not clear at all whose novel approach to machine learning is ultimately going to pay off the most.
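To make the exploration/exploitation framing concrete, here is a minimal epsilon-greedy bandit sketch. It is purely illustrative and not from the comment: the arms stand in for research approaches with unknown payoffs, and all the numbers are made up.

    import random

    # Epsilon-greedy multi-armed bandit sketch. Each "arm" stands in for a
    # research approach with an unknown payoff; all numbers are illustrative.
    def run_bandit(true_payoffs, budget, epsilon):
        estimates = [0.0] * len(true_payoffs)  # running payoff estimate per arm
        pulls = [0] * len(true_payoffs)        # times each arm was funded
        total = 0.0
        for _ in range(budget):
            if random.random() < epsilon:
                arm = random.randrange(len(true_payoffs))  # explore a new idea
            else:
                arm = max(range(len(true_payoffs)), key=lambda i: estimates[i])  # exploit the current star
            reward = random.gauss(true_payoffs[arm], 1.0)  # noisy payoff
            pulls[arm] += 1
            estimates[arm] += (reward - estimates[arm]) / pulls[arm]  # update running mean
            total += reward
        return total

    # 95 mediocre ideas, 4 decent ones, 1 rare step-function improvement.
    payoffs = [0.1] * 95 + [0.5] * 4 + [3.0]
    for eps in (0.05, 0.5):
        avg = sum(run_bandit(payoffs, 10_000, eps) for _ in range(5)) / 5
        print(f"epsilon={eps}: average total payoff ~ {avg:.0f}")

With a tiny epsilon the budget piles onto whichever arm looked good early; with heavier exploration the rare high-payoff arm gets found more reliably, which is the over-indexing point above.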

entropi

Agreed, and I suspect the explanation is that these plays are done not to search for a true AGI, but to drive up hype (and 'the line').

mlsu

The full-bodied palate of this AI market mirrors the sharp nose of 2023 AI doomerism.

The argument goes: if AI is going to destroy humanity, even if that is a 0.001% chance, we should all totally re-wire society to prevent that from happening, because the _potential_ risk is so huge.

Same goes for these AI companies. What they are shooting for is to replace white-collar workers completely. Every single white-collar employee, with their expensive MacBooks, great healthcare and PTO, and lax 9-5 schedule, is to be eliminated completely. IF this is to happen, even if it's a 0.001% chance, we should totally re-wire capital markets, because the _potential reward_ is so huge.

And indeed, this idea is so strongly held (especially in silicon valley) that we see these insanely frothy valuations and billion dollar deals in what should be a down market (tremendous macro uncertainty, high interest rates, etc).

AI doomerism seemed to lack finish, though. Anyone remember Eliezer Yudkowsky? Haven't heard from him in a while.

danieltanfh95

These "talent wars" are overblown and a result of money having nowhere else to go. People are banking on AI and robotics for human progress to take off and that's just a result of all other ventures fizzling out with this left for capital to migrate to.

If you talked to any of these folks worth billions, they aren't particularly smart and their ideas aren't really interesting. It took a few years to go from GPT-3 to DeepSeek V3, and then far less time to go from Sonnet 4 to Kimi K2, both open-source models built on way lower funding. This hints at a deeper problem than what "hypercapitalism" suggests. In fact, it suggests that capital distribution in its current state is highly inefficient and we are simply funding the wrong people.

Smart AI talent aren't going to be out there constantly trying to get funding or the best deals; they want to work. Capital has gotten too used to not doing the groundwork to seek them out. Capital needs to be more tech-savvy.

VCs and corporate development teams don't actually understand the technology deeply enough to identify who's doing the important work.

the_precipitate

I think one of the main issues is that the 10x or 100x talents in AI have not really shown their value yet. None of these AI companies are making any money, and they are valued above highly successful and profitable companies because of their "potential". ARR is nothing if you sell goods valued at 1 dollar for 90 cents.

zer00eyz

> If the top 1% of companies drive the majority of VC returns

The author brings this up and fails to realize that the behavior of current staff shows we have hit, or have passed, peak AI.

Moore's Law is dead and it isn't going to come through and make AI any more affordable. Look at the latest GPUs: IPC is flat. And no one is charging enough to pay for the running costs (bandwidth, power) of the compute being used, never mind turning NVIDIA into a 4-trillion-dollar company.

> Meta’s multi-hundred million dollar comp offers and Google’s multi-billion dollar Character AI and Windsurf deals signal that we are in a crazy AI talent bubble.

All this signals is that those in the know have chosen to take their payday. They don't see themselves building another Google-scale product, and they don't see themselves delivering on sama's vision. They KNOW that they are never going to be the 1% company, the unicorn. It's a stark admission that there is NO breakout.

The math isn't there in the products we are building today: to borrow a Bay Area quote, there is no there there. And you can't spend your way to market capture / a moat the way you could in every past VC gold rush.

Do I think AI/ML is dead? No, but I don't think that innovation is going to come out of the big players or the dominant markets. It's going to take a bust, cheap and accessible compute (a fire sale on used processing), and a new generation of kids coming in hungry and willing to throw away a few years on a big idea. Then you might see interesting tools and scaling down (to run locally).

The first team to deliver a model that can run on a GPU alongside a game, so that there is never an "I took an arrow to the knee" meme again is going to make a LOT of money.

ythiscoyness

> the 10x engineer meme doesn’t go far enough – there are clearly people that are 1,000x the baseline impact.

There are plenty of people out there who want authors like this to believe it enough to write it.

harimau777

Obviously the specifics are going to depend on exactly how a team pegs story points, but if an average engineer delivers 10 story points during a two week sprint, then that would mean that a 1000x engineer would deliver 10000 story points, correct? I don't see how someone can actually believe that.

woah

These companies spend hundreds of millions of dollars to train these models and (hope to) make billions from them. The researchers are the people who know how to do it. These aren't guys cranking out React buttons.

seanp2k2

Ladies and gentlemen, the problem with The Valley in 2025.

asdf6969

1000x revenue, not 1000x developer productivity, is sometimes possible. There are lots of jobs where developers also decide on the roadmap and requirements along with the execution, instead of just being a ticket monkey, and a good idea executed well could easily be worth 1000x changing button colours and adding pagination to an API.

nopinsight

impact != story points

Kapura

> The first team to deliver a model that can run on a GPU alongside a game, so that there is never an "I took an arrow to the knee" meme again is going to make a LOT of money.

This feels like a fundamental misunderstanding of how video game dialogue writing works. It's actually important that a player understands when the mission-critical dialogue is complete. While the specifics of a line becoming a meme may seem undesirable, it's far better that a player hears a line they know means "I have nothing to say" 100 times than to have AI slop generated every time the player passes a guard.

zer00eyz

> this feels like a fundamental misunderstanding of how video game dialogue writing works.

Factorio, Dwarf Fortress, Minecraft.

There are plenty of games where the whole story is driven by cut scenes.

There are plenty of games that shove your quests into a journal/Pip-Boy to let you know how to drive gameplay.

Don't get me wrong, I loved Zork back in the day (and still do), but we have evolved past that, and the tools to move us further could be there.

Kapura

I am not sure what point you're trying to make here; none of the games you mentioned contain the famous "arrow to the knee" line.

Dwarf Fortress, in fact, shows just how much is possible by committing to deep systemic synthesis. Without souped-up chatbots, Dwarf Fortress creates emergent stories about cats that tread in beer and cause the downfall of a fortress, or lets players define their own objectives and solutions, like flooding a valley full of murderous elephants with lava.

My original point is that papering over important affordances with AI slop may actually work against the goals of the game. If the technology were good and fun, there is no reason a company like Microsoft couldn't have added it to Starfield.

goopypoop

procedurally generated content is the most onanistic form of art

Avicebron

> The first team to deliver a model that can run on a GPU alongside a game, so that there is never an "I took an arrow to the knee" meme again is going to make a LOT of money.

"Local Model Augmentation", a sort of standardized local MCP that serves as a layer between a model and a traditional app like a game client. Neat :3