
Builder.ai Collapses: $1.5B 'AI' Startup Exposed as 'Indians'?

dang

Two claims are being made here, one boring and one lurid.

The boring claim is that the company inflated its sales through a round-tripping scheme: https://www.bloomberg.com/news/articles/2025-05-30/builder-a... (https://archive.ph/1oyOw). That's consistent with other recent reporting (e.g. https://news.ycombinator.com/item?id=44080640).

The lurid claim is that the company's AI product was actually "Indians pretending to be bots". From skimming the OP and https://timesofindia.indiatimes.com/technology/tech-news/how..., the only citation seems to be this self-promotional LinkedIn post: https://www.linkedin.com/feed/update/urn:li:activity:7334521... (https://web.archive.org/web/20250602211336/https://www.linke...).

Does anybody know of other evidence? If not, then it looks bogus, a case of "il faudrait l'inventer" (if it didn't exist, it would have to be invented) which got traction by piggybacking on an old-fashioned fraud story.

To sum up: the substantiated claim is boring and the lurid claim is unsubstantiated. When have we ever seen that before? And why did I waste half an hour on this?

(Thanks to rafram and sva_ for the links in https://news.ycombinator.com/item?id=44172409 and https://news.ycombinator.com/item?id=44175373.)

kamikazechaser

There are personal testimonials in the indiandevelopers subreddit from quite a while ago, if those are to be believed.

pyman

The news about BuilderAI using 700 devs instead of AI is false. Here's why.

I've seen a lot of posts coming out of India claiming "we were the AI". So I looked into it to see if Builder AI was lying, or if this was just a case of unpaid developers from India spreading rumours after the company went bust.

Here's what some of the devs are saying:

> "We were the AI. They hired 700 of us to build the apps"

Sounds shocking, but it doesn't hold up.

The problem is, BuilderAI never said development was done using AI. Quite the opposite. Their own website explains that a virtual assistant called "Natasha" assigns a human developer to your project. That developer then customises the code. They even use facial recognition to verify it's the same person doing the work.

> "Natasha recommends the best suited developer for your app project, who then customises your code on our virtual desktop. We also use facial recognition to check that the developer working on your code is the same one Natasha picked."

Source: https://www.builder.ai/how-it-works

I also checked the Wayback Machine. No changes were made to that site after the scandal. Which means: yes, those 700 developers were probably building apps, but no, they weren't "the AI". Because the company never claimed the apps were built by AI to begin with.

Verdict: FAKE NEWS

pyman

I couldn't find any reference on the BuilderAI website claiming they use GenAI to build software. So the second claim lacks evidence.

Update: They mention AI to assemble features, not to generate code. So it's impossible to know whether they were actually using ML (traditional AI) to resolve dependencies and pull packages from a repo.

ivape

Speculating, but don't they offer dev services that are supposed to be done by AI? If the dev services were delivered by devs, then that would be the scam. Now that I've said the second part, it does seem lurid, because who the hell is paying for AI-first code deliverables?

—-

Message to HN:

Instead of founding yet another startup, please build the next Tech Vice News and fucking go to the far corners of the tech world like Shane Smith did with North Korea with a camera. I promise to be a founding subscriber at whatever price you got.

Things you’ll need:

1) Credentialed Ivy League grad. Make sure they are sporadic like that WeWork asshole.

2) Ex VC who exudes wealth with every footstep he/she takes

3) The camera

4) And as HBO Silicon Valley suggests, the exact same combination of white guy, Indian guy, Chinese guy to flesh out the rest of the team.

See, I need to know what it's like working for a scrum master at Tencent, for example, during crunch time. Also, whatever the fuck goes on inside a DeFi company in executive meetings. And of course, find the next Builder.ai, or at least the Microsoft funding round discussions. We've yet to even get a camera inside those Arab money meetings where Sam Altman begs for a trillion dollars. We shouldn't live without such journalism.

pyman

The short answer is no, their website doesn't claim that development is done using AI.

My gut feeling is that a lot of people, including developers, are posting hate messages and spreading fake news because of their fear of AI, which they see as a threat to their jobs.

If you look at their website, builder.ai, they tell customers that their virtual assistant, "Natasha", assigns a developer (I assume from India):

> Natasha recommends the best suited developer for your app project, who then customises your code on our virtual desktop. We also use facial recognition to check that the developer working on your code is the same one Natasha picked.

Source: https://www.builder.ai/how-it-works

They also have another page explaining how they use deep learning and transformers for speech-to-text processing. They list a bunch of libraries like MetaPath2Vec, Node2Vec, GraphSage, and Flair:

Source: https://www.builder.ai/under-the-hood

It sounds impressive, but listing libraries doesn't prove they built an actual LLM.
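For context, the libraries named above are all publicly available open-source packages, so citing them proves nothing about a proprietary model. Here's a minimal, hypothetical sketch (nothing to do with Builder.ai's actual code) of how anyone can call one of them, Flair, with a pretrained tagger:

```python
# Hypothetical sketch: Flair is an off-the-shelf NLP library ("pip install flair").
# Loading and running a pretrained model requires no in-house research at all.
from flair.data import Sentence
from flair.models import SequenceTagger

# Downloads a pretrained named-entity-recognition model published by the Flair project.
tagger = SequenceTagger.load("ner")

sentence = Sentence("Natasha assigned a developer in London to the project.")
tagger.predict(sentence)

# Print the entity spans the pretrained model found (e.g. PER, LOC).
for span in sentence.get_spans("ner"):
    print(span)
```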

So, the questions that remain unanswered are:

1. Did Craig Saunders, the Head of AI at Builder.ai (and ex-Director of AI at Amazon), ever show investors or clients a working demo of Natasha, or a product roadmap? How do we know Natasha was actually an LLM and not just someone sitting in a call centre in India?

2. Was there a technical team behind Saunders capable of building such a model?

3. Was the goal really to build a domain-specific foundation model, or was that just a narrative to attract investment?

Having said that, the company went into insolvency because the CEO and CFO were misleading investors by significantly inflating sales figures through questionable financial practices. According to the Financial Times, BuilderAI reportedly engaged in "round-tripping" with VerSe Innovation. This raised red flags for investors, regulators and prosecutors, and led to bankruptcy proceedings.

paxys

> Less than two months ago, Builder.ai admitted to revising down core sales numbers and engaging auditors to inspect its financials for the past two years. This came amidst concerns from former employees who suggested sales performance had been inflated during prior investor briefings.

I was hoping for something interesting, but it is just plain old-fashioned accounting fraud.



pyman

This is fake news. Builder.ai, like any other dev shop, had clients and was building apps using developers in India, pretty much like Infosys or any other Indian dev shop. Nothing wrong with that.

From what I read online, the real issue was "Natasha", their virtual assistant powered by a dedicated foundation model. They ran out of money before it got anywhere.

profsummergig

This is so obviously fake news that it's a good litmus test of the people who are boosting it.

There's no way that a team of programmers can ever produce code quickly enough to mimic anything close to the response time of a coding LLM.

threeseed

But it's not just about coding quickly; it's also about coding correctly.

Coding LLMs still hallucinate, use antiquated libraries and technologies, and mangle large codebases because of their limited context size.

Given a well-architected component library and set of modules, I would bet that on average I could build a correct website faster.

bartread

> This is fake news. Builder.ai, like any other dev shop, had clients and was building apps using developers in India, pretty much like Infosys or any other Indian dev shop. Nothing wrong with that.

Yeeeah... that's a fairly disingenuous take.

The difference between every other offshore dev shop backed by developers in India and Builder.ai is that - and I say this as someone who thinks Infosys is a shit company - Infosys and all those other dev shops are at least up front about how their business works and where, and by whom, your app will be built. Whereas Builder.ai spent quite a long time pretending they had AI doing the work when actually it was a lot of devs in India.

That is deliberately misleading and it is not OK. It's fraudulent. It's literally what Theranos did with their Edison machines that never worked: while they claimed they had this wondrous new blood-testing technology, they were actually running tests on Siemens machines, diluting blood samples, etc. The consequences of Theranos's actions were much more serious (misdiagnoses and, indeed, missed diagnoses of thousands of patients) than apps being built by humans rather than AI, but lying and fraud is lying and fraud.

pyman

I don't agree. Even Infosys markets AI as part of their offering; just look at their "AI for Infrastructure" pitch:

https://www.infosys.com/services/cloud-cobalt/offerings/ai-i...

Every big dev shop does this. Overselling tech happens all the time in this space. The line between marketing and misleading isn't always so clear. The difference is Builder.ai pushed the AI angle harder, but that doesn't make it Theranos-level fraud.

osigurdson

It doesn't matter to customers, but investors would be interested in whether it's AI being used or a bunch of devs, because of the difference in scaling potential.

dang

Thanks! It took an annoying amount of time to try to sort this out, but I made a consolidated reply here: https://news.ycombinator.com/item?id=44176241.

fazkan

This is so weird; it's not that hard to actually build an app builder. There are multiple open-source repos (bolt etc.), so they could have just paid their "AI engineer" to actually build an AI engineer.

Shameless plug, but we built https://v1.slashml.com in roughly two weeks. Granted, it's not as mature, but we don't have billions :)

driverdan

> its not that hard to actually build an app builder

Besides simple one page self-contained apps, yes, it's quite hard. So hard that it's still an unsolved problem.

fazkan

Not really. Lovable, v0, and Bolt are all multipage. They connect to Supabase for DB and auth. Replit can spin up custom DBs on demand and has a full-fledged IDE.

I did my research before jumping into this space :)

aitchnyu

Which ones prevent anybody with a browser from accessing other users' data? I have been discussing vibe coding and Supabase's Postgres row-level security misconfiguration.
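To make that misconfiguration concrete, here's a minimal, hypothetical sketch (project URL, key, and table name are invented, not from any product mentioned above) of why a table with row-level security disabled is readable by anyone holding the public anon key that ships with the frontend:

```python
# Hypothetical sketch of the misconfiguration: with row-level security (RLS)
# disabled on a table, the public anon key that every browser client receives
# can read all rows, not just the current user's. URL, key and table are placeholders.
from supabase import create_client

client = create_client(
    "https://example-project.supabase.co",  # project URL (public)
    "public-anon-key",                      # anon key (public, shipped to the browser)
)

# With RLS off, this returns every user's rows.
all_orders = client.table("orders").select("*").execute()
print(all_orders.data)

# The fix is enabling RLS plus a policy, e.g.:
#   alter table orders enable row level security;
#   create policy "own rows only" on orders
#     for select using (auth.uid() = user_id);
```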

glutamate

They launched in 2016

throwaway314155

Should have pivoted faster.

xkcd-sucks

It's plausible they started with a typical software consultancy and its crappy in-house app-builder scripts, and recast it as an AI thing in order to inflate its value?

downrightmike

That'd be shameful and a complete disgrace. It'd be like adding "bitcoin" to your company name or your 10-K filings a few years ago to boost your stock.

mikestew

In case anyone thinks parent is speaking hypothetically:

Insider trading charges filed over Long Island Iced Tea’s blockchain ‘pivot’ https://www.cnn.com/2021/07/10/investing/blockchain-long-isl...

fazkan

I mean, Zapier is also calling their workflows "agents". I remember someone ranting about it on Twitter.

hnuser123456

Nice, I'll try this out tonight.

fazkan

Thanks, do ping me if you run into any issues: faizank@slashml.com

bartread

This is not news, or at least not fresh news. The FT reported the collapse ~9 days ago and it was discussed here: https://news.ycombinator.com/item?id=44080640

apsurd

News to me, buddy. This is perhaps a useless comment, but then I think: articles resurface every now and again, and that's intentional and welcome for those who missed them. This isn't exactly that, of course, but it makes me think it's worth a comment: news is relative. Discussion ensues; it's all good.

macintux

Except that it's contrary to the site FAQ.

> If a story has not had significant attention in the last year or so, a small number of reposts is ok. Otherwise we bury reposts as duplicates.

ricardobeat

So... where did the $450M go? A team of 700 developers in India, employed over eight years, would have cost a fraction of that.

rokob

Why do you think it would be to pay for actual costs? The whole point of running a scam is to spend the money.

antithesizer

I really wish I'd read this before starting my career as a scammer ten years ago.

CSMastermind

How do you figure? $450M / 8 years / 700 developers = $80k / year per developer.
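Spelling out that back-of-the-envelope calculation (the figures are the rough ones quoted in this thread):

```python
# Rough burn-rate check using the figures quoted in this thread.
total_raised = 450_000_000  # USD, approximate
years = 8
developers = 700

per_dev_per_year = total_raised / years / developers
print(f"${per_dev_per_year:,.0f} per developer per year")  # ~$80,357
```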

cubano

Typically, scams like this are very top-heavy with the vast majority of the pilfered cash going to a few well-placed "bros" at the top of the company pyramid.

My guess? Most of the cash is socked away in BTC or some such wealth sink just waiting for the individuals to clear their bothersome legal issues.

owebmaster

> My guess? Most of the cash is socked away in BTC

Had they done this years ago, they would be so rich it would be worth keeping builder.ai going just to avoid the legal problems.

casion

Average salary for a developer in India is about 1/10th of that.

spamizbad

That hasn't been the case in like 20 years. Engineering salaries are around 40K USD, although they can even stretch into the six figures for major companies with deep pockets wanting to attract elite talent. The band is pretty wide and is largely based on whether you work in a body shop consultancy (low end) or a major tech company like Google (high end).

And, like many things in this world, you'll find you'll pay for what you get.

darth_avocado

The median salary of a reasonable developer is about half of that, and if you're talking about Microsoft, Uber, Google, etc., then that's the salary of a senior dev.

https://www.levels.fyi/t/software-engineer/locations/greater...

But more importantly, we're all pretending the only cost of building anything is salaries. A company that size could blow a million dollars a month just on AWS, and the AI stuff is waaaay more expensive.

aprilthird2021

No, it's not

bigfatkitten

Only if they’re all ex FAANG staff/principal.

paxys

They have been operating since 2016. Companies can and have burned through $450M in funding a hell of a lot faster than that.

OpenAI is on track to spend $14 billion this year.


monksy

The chai budget is a completely justifiable expense. (Probably more so than the difference that was run away with.)


TrackerFF

These kinds of scumbags pocket 90% of the cash.

Wouldn't surprise me if the developers were hired from sweatshop staffing agencies, or just working directly for minimum wage - if even that.

more_corn

[flagged]

dang

Please don't do this here.

pryelluw

$400M!

I get $100M. Maybe even $200M.

But $400M?

Unforgivable.

nadermx

You figure 700 employees. $400M. Avg cost per hooker can't be more than a few hundred.

So by this math each employee got 1,900-ish hookers. Since I figure male hookers for the female employees were cheaper, we'll round up to 2,000.

That is in fact unforgivable. 1,000 would have been acceptable. 2,000... just excess.

pyman

Elon Musk spent $6 billion training his model. Sam Altman spent $40 billion. Where did Builder AI's $500 million go? Probably into building a foundation model, not even a full LLM.

1oooqooq

Shhhh. We don't talk about the ongoing scams. Those you keep hyping while you try to sell your SaaS around them.

dang

Recent and related:

Microsoft-backed UK tech unicorn Builder.ai collapses into insolvency - https://news.ycombinator.com/item?id=44080640 - May 2025 (136 comments)

Ancalagon

Really weird considering how much AI is actually available now

wongarsu

If you have an idea for a cool AI startup, it's faster to build your first prototype without the actual AI, just faking that part. But if your Actual Indians had 95% accuracy and you can't get an AI to do better than 85%, then you're kind of stuck, having raised money and got customers by pretending that your Actual Indians are Artificial Intelligence.

TYPE_FASTER

This is the way. Funny how AI could also stand for Actual Intelligence. Or, Artisanal Intelligence? "Now 100% organic handcrafted thoughts, unique for your business problem."

cubano

[flagged]

more_corn

Not true, it's super easy to fine-tune and deploy one of the open models. I should teach a course.
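For what it's worth, here's a minimal sketch of what "fine-tune one of the open models" typically looks like with Hugging Face transformers + peft (LoRA); the model name, dataset and hyperparameters are illustrative placeholders, not anything Builder.ai is known to have used:

```python
# Minimal LoRA fine-tuning sketch with Hugging Face transformers + peft.
# Model, dataset, and hyperparameters are placeholders for illustration only.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # any small open model
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Wrap the base model with low-rank adapters so only a few million
# parameters are actually trained.
model = get_peft_model(model, LoraConfig(
    r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM"))

# Toy dataset: any text column works for a causal-LM objective.
data = load_dataset("Abirate/english_quotes", split="train[:1000]")
data = data.map(lambda batch: tokenizer(batch["quote"], truncation=True,
                                        max_length=128), batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", per_device_train_batch_size=4,
                           num_train_epochs=1, learning_rate=2e-4),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("out/lora-adapter")  # saves only the adapter weights
```

Whether the resulting model is good enough for production is a separate question, which is the point being made in the reply below.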

msgodel

The technical aspects of training and tuning are trivial. GP is pointing out that you might not be able to get the model to succeed at the task as often for any number of reasons that you won't know before you actually train one.

Although I guess your point is that it's also cheap to train them, probably cheaper than doing this. But startups are started by social people, not technical people. Stuff like this will always be expensive for social people, since they have to pay one of us to do it. YC interviews their CEOs from time to time; it's really clear that's how that works.

mrweasel

Also, it can't have been fast. Didn't customers and investors find it weird that Copilot spits out code as fast as you can type, but Builder.ai needed days or weeks to generate your app? Or were these Indian developers just really, really fast?

givemeethekeys

Maybe they use GPT :)

helloplanets

There's this one super secret agentic framework that beats all the benchmarks...

hyperadvanced

Available, sure, but cost-effective? My guess is that they tried a lot of things to get ChatGPT to work and ran out of money before it got cheap enough to fit a reliable business model. Early but not wrong, I guess.

immibis

Almost like it doesn't work as well as they market it as working?

klipt

Not all companies are equal.

At the same time that Tesla was making actual electric cars, Nikola was rolling fake "electric trucks" downhill.

Grifters exist, but not everyone is a grifter.

mjmsmith

Not sure Tesla is the poster child for non-grifters in the context of AI.

jampekka

Coast-to-coast self-driving Teslas were promised by 2017, and have been promised for "next year" almost every year since.

Tesla can make electric vehicles, but the company valuation is based on grift.

Supermancho

Capitalism rewards dishonesty. Every company is a grifter to some degree. This is more widespread in technical service companies.

a_void_sky

Nobody has mentioned that they were reselling the AWS credits they had. We had them as our billing partner, with very good discounts. The day it happened, AWS sent us an email asking us to remove them as our billing partner.

moonikakiss

I did due diligence on Builder.AI for a venture firm I was interning at (circa 2019). It was extremely apparent (Glassdoor, talking to any employee) that it was complete BS.

When I say apparent, it took less than 15 minutes and a couple of Google searches to get a sniff of it.

Somehow, you can still raise $500MM+.

I think about that a lot.

aprilthird2021

You have to elaborate! What were the signs? When you did due diligence, what were you told about the company? Was the marketing or premise itself fishy, or did you only realize it was fraudulent after starting the due diligence?
