
Google AI Ultra

209 comments

·May 20, 2025

charles_f

This is the kind of pricing that I expect most AI companies are gonna try to push for, and it might get even more expensive with time. When you see the delta between what's currently being burnt by OpenAI and what they bring home, the sweet spot is going to be hard to find.

Whether you find that you get $250 worth out of that subscription is going to be the big question

Ancapistani

I agree, and the problem is that "value" != "utilization".

It costs the provider the same whether the user is asking for advice on changing a recipe or building a comprehensive project plan for a major software product - but the latter provides much more value than the former.

How can you extract an optimal price from the high-value use cases without making it prohibitively expensive for the low-value ones?

Worse, the "low-value" use cases likely influence public perception a great deal. If you drive the general public off your platform in an attempt to extract value from the professionals, your platform may never grow to the point that the professionals hear about it in the first place.

garrickvanburen

This is the problem Google search originally had.

They successfully solved it with advertising, and they also had the ability to cache results.

jsheard

I wonder who will be the first to bite the bullet and try charging different rates for LLM inference depending on whether it's for commercial purposes. Enforcement would be a nightmare but they'd probably try to throw AI at that as well, successfully or not.

chis

I think there are always creative ways to differentiate the two tiers for those who care.

“Free tier users relinquish all rights to their (anonymized) queries, which may be used for training purposes. Enterprise tier, for $200/mo, guarantees queries can only be seen by the user”

EasyMark

I feel prices will come down a lot for "viable" AI, not everyone needs the latest and greatest at rock-bottom prices. Assuming AGI is just a pipe-dream with LLMs as I suspect.

Wowfunhappy

> When you see the delta between what's currently being burnt by OpenAI and what they bring home, the sweet spot is going to be hard to find.

Moore's law should help as well, shouldn't it? GPUs will keep getting cheaper.

Unless the models also get more GPU hungry, but 2025-level performance, at least, shouldn't get more expensive.

godelski

Not necessarily. The prevailing paradigm is that performance scales with size (of data and compute power).

Of course, this is observably false as we have a long list of smaller models that require fewer resources to train and/or deploy with equal or better performance than larger ones. That's without using distillation, reduced precision/quantization, pruning, or similar techniques[0].

The real thing we need is more investment into reducing the computational resources needed to train and deploy models, and into model optimization (the best example being llama.cpp). I can tell you from personal experience that there is much lower interest in this type of research, and I've seen plenty of works rejected because "why train a small model when you can just tune a large one?" or "does this scale?"[1] I'd also argue that this is important because there's not infinite data nor compute.

[0] https://arxiv.org/abs/2407.05694

[1] Those works will outperform the larger models. The question is good, but this creates a barrier to funding. It costs a lot to test at scale, you can't get funding if you don't have good evidence, and it often won't be considered evidence if it isn't published. There are always more questions and every work is limited, but smaller-compute works face higher bars than big-compute works.

jorvi

Small models will get really hot once they start hitting good accuracy & speed on 16GB phones and laptops.

dvt

> Moore's law should help as well, shouldn't it? GPUs will keep getting cheaper.

Maybe I'm misremembering, but I thought Moore's law doesn't apply to GPUs?

morkalork

Costs more than seats for Office 365, Salesforce and many productivity tools. I don't see management gleefully running to give access to whole departments. But then again, if you could drop headcount by just 1 on a team by giving it to the rest, you probably come out ahead.

Papazsazsa

I just tried it.

8 out of 10 attempts failed to produce audio, and of the two that did, only one didn't suck.

I suppose that's normal(?) but I won't be paying this much monthly if the results aren't better, or at least I'd expect some sort of refund mechanism.

CSMastermind

I pay for OpenAI Pro but this is a clear no for me. I just don't get enough value out of Gemini to justify a bump from $20 / month to $250.

If they really want to win they should undercut OpenAI and convince people to switch. For $100 / month I'd downgrade my OpenAI Pro subscription and switch to Gemini Ultra.

radicality

It does look like it comes with a few other perks that would normally cost a bunch too, specifically, 30TB of Google drive storage

J_Shelby_J

If they really want to win, they should make a competitor for O1-pro. It’s worth $200 to reduce LLM babysitting needs by 10%.

mvdtnz

Perhaps they're not interested in beating OpenAI in the business of selling $1 for $0.50.

OtherShrezzing

The global average salary is somewhere in the region of $1500.

There’s lots of people and companies out there with $250 to spend on these subscriptions per seat, but on a global scale (where Google operates), these are pretty niche markets being targeted. That doesn’t align well with the multiple trillions of dollars in increased market cap we’ve seen over the last few years at Google, Nvda, MS etc.

paxys

New technology always starts off available to the elite and then slowly makes its way down to everyone. AI is no different.

dimitrios1

This is one of those assumed truisms that turns out to be false upon close scrutiny, and there's a bit of survivorship bias in the sense that we tend to look at the technologies that had mass appeal and market forces to make them cheaper and available to all. But there's tons of new tech that's effectively unobtainable to the vast majority of populations, heck, even nation states. With the current prohibitive costs (in terms of processing power, energy, and data centers) to train these next-generation models, and the walled gardens that have been erected, there's no reason to believe the good stuff is going to get cheaper anytime soon, in my opinion.

paxys

> turns out to be false upon close scrutiny

Care to share that scrutiny?

Computers, internet, cell phones, smartphones, cameras, long-distance communication, GPS, televisions, radios, refrigerators, cars, air travel, light bulbs, guns, books. Go back as far as you want and this still holds true. You think the majority of the planet could afford any of these on day 1?

sxg

I disagree. There are massive fixed costs to developing LLMs that are best amortized over a massive number of users. So there's an incentive to make the cost as cheap as possible and LLMs more accessible to recoup those fixed costs.

Yes, there are also high variable costs involved, so there’s also a floor to how cheap they can get today. However, hardware will continue to get cheaper and more powerful while users can still massively benefit from the current generation of LLMs. So it is possible for these products to become overall cheaper and more accessible using low-end future hardware with current generation LLMs. I think Llama 4 running on a future RTX 7060 in 2029 could be served at a pretty low cost while still providing a ton of value for most users.

TulliusCicero

Yeah, GP is overextending by saying it's always true.

The more basic assertion would be: something being expensive doesn't mean it can't be cheap later, as many popular and affordable consumer products today started out very expensive.

pier25

Do you have a source for the $1500 number? Seems pretty high.

ComplexSystems

The problem with all of these is that SOTA models keep changing. I thought about getting OpenAI's Pro subscription, and then Gemini flew ahead and was free. If I get this then sooner or later OpenAI or Anthropic will be back on top.

Ancapistani

I wonder if there's an opportunity here to abstract away these subscription costs and offer a consistent interface and experience?

For example - what if someone were to start a company around a fork of LiteLLM? https://litellm.ai/

LiteLLM, out of the box, lets you create a number of virtual API keys. Each key can be assigned to a user or a team, and can be granted access to one or more models (and their associated keys). Models are configured globally, but can have an arbitrary number of "real" and "virtual" keys.
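
For concreteness, this is roughly what a LiteLLM proxy config looks like (model names and env-var names here are illustrative, not a recommendation):

```yaml
# litellm proxy config: each model_name is the alias users see;
# litellm_params point at the real provider model and credentials.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
```

Virtual keys and team budgets are then issued against these aliases, so end users never see the underlying provider keys.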

Then you could sell access to a host of primary providers - OpenAI, Google, Anthropic, Groq, Grok, etc. - through a single API endpoint and key. Users could switch between them by changing a line in a config file or choosing a model from a dropdown, depending on their interface.

Assuming you're able to build a reasonable userbase, presumably you could then contract directly with providers for wholesale API usage. Pricing would be tricky, as part of your value prop would be abstracting away marginal costs, but I strongly suspect that very few people are actually consuming the full API quotas on these $200+ plans. Those that do are likely working directly with the providers to reduce both cost and latency, too.

The other value you could offer is consistency. Your engineering team's core mission would be providing a consistent wrapper for all of these models - translating between OpenAI-compatible, Llama-style, and Claude-style APIs on the fly.
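
The core of that translation work is reshaping payloads between API dialects. As a minimal sketch (the function name is made up, and real conversions also have to handle tool calls, images, and content blocks), OpenAI-style chat messages map to Anthropic's format like this:

```python
def openai_to_anthropic(messages):
    """Reshape an OpenAI-style `messages` list into Anthropic's shape,
    where the system prompt is a top-level field, not a message."""
    system = ""
    converted = []
    for m in messages:
        if m["role"] == "system":
            # Anthropic takes the system prompt as a separate field
            system = m["content"]
        else:
            converted.append({"role": m["role"], "content": m["content"]})
    return {"system": system, "messages": converted}
```

The reverse direction, plus streaming-event translation, is where most of the ongoing engineering effort would go.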

Is there already a company doing this? If not, do you think this is a good or bad idea?

wild_egg

Isn't that https://openrouter.ai? Or do you have something different in mind?

Ancapistani

I haven't seen this, but it looks like it solves at least half of what I was thinking.

I'll investigate. Thanks!

planetpluta

I think the biggest hurdle would be complying with the TOS. I imagine that OpenAI etc. would not be fans of sharing quotas across individuals in this way.

Ancapistani

How does it differ from pretty much every SaaS app that's using OpenAI today?

mrnzc

I think what Langdock (YC-backed, https://www.langdock.com) offers might match your proposal?!

Ancapistani

Looks like this is at least the unified provider. I'll dig in - thanks :)

SirensOfTitan

This is even the case with Gemini:

The Gemini 2.5 Pro 05/06 release was, by Google’s own reported benchmarks, worse in 10 of 12 cases than the 3/25 version. Google rerouted all API traffic for the 3/25 checkpoint to the 05/06 version.

I’m also unsure who needs all of these expanded quotas because the old Gemini subscription had higher quotas than I could ever anticipate using.

magicalist

> I’m also unsure who needs all of these expanded quotas because the old Gemini subscription had higher quotas than I could ever anticipate using.

"Google AI Ultra" is a consumer offering though, there's no API to have quotas for?

UncleOxidant

You can just surf between Gemini, DeepSeek, Qwen, etc. using them for free. I can't see paying for any AI subscription at this point as the free models out there are quite good and are updated every few months (at least).

pc86

I am willing to pay for up to 2 models at a time but I am constantly swapping subscriptions around. I think I'd started and cancelled GPT and Claude subscriptions at least 3-4 times each.

xnx

> If I get this then sooner or later OpenAI or Anthropic will be back on top.

The Gemini subscription is monthly, so not too much lock-in if you want to change later.

airstrike

This 100%. Unless you are building a product around the latest models and absolutely must squeeze the latest available oomph, it's more advantageous to just wait a little bit.

julianpye

Why do people keep on saying that corporations will pay these price-tags? Most corporations really keep a very tight lid on their software license costs. A $250 license will be only provided for individuals with very high justification barriers and the resulting envy effects will be a horror for HR. I think it will be rather individuals who will be paying out of their pocket and boosting their internal results. And outside of those areas in California where apples cost $5 in the supermarket I don't see many individuals capable of paying these rates.

verdverm

We just signed up to spend $60+/month for every dev to have access to Copilot because the ROI is there. If $250/month saves several hours per month for a person, it makes financial sense.

tacker2000

How are you measuring this? How do you know it is paying off?

afroboy

And why hasn't the AI hype train worked on the gaming industry? Why hasn't it saved game devs hundreds of hours so we could get the latest GTA any sooner?

I'm not sure we should measure the benefits of AI by the lines of code we write rather than by how much faster we ship quality features.

julianpye

Okay, but you're on a software team in a corp, where everyone's main task is to code. A coding agent has clear benefits here.

This is not the use case of AI Ultra.

delusional

We signed up for that too. Two quarters later, the will to pay is significantly lower.

troupo

Corps will likely negotiate bulk pricing and discounts, with extra layers of guarantees like "don't use or share our data" on top.

bryanlarsen

"AI will make us X% more productive. 100%-X% of you are fired, the rest get a $250/month license".

kulahan

I don’t see any benefit to removing humans in order to achieve the exact same level of efficiency… wouldn’t that just straight-up guarantee a worse product unless your employees were absolutely all horrendous to begin with?

johnisgood

They are running out of ideas for names. What next, Google AI Ultra Max Pro?

Keyframe

Hmm, interesting. There's basically no information on what makes Ultra worth that much money in concrete terms, except "more quota". One interesting tidbit I've noticed is that it seems Google One (or whatever it's called now) also carries a sub for YouTube. So far, I'm still on the "old" Google One for my family's and my own storage, and have a separate YouTube subscription for the same. I still haven't seen a clear upgrade path, or even a discount based on how much I have left from the old subscription, if I ever choose to do so (why?).

edit: also the Google AI Ultra link leads to AI Pro and there's no Ultra to choose from. GG Google, as always with their "launches".

flakiness

I believe Imagen 4 and Veo 3 (the newest image/video models) and the "deep think" variant are for Ultra only. (Is it worth it? It's a different question.)

skybrian

I just tried it and Whisk seems to be using Imagen 4 and Veo 2 when used without a subscription.

adverbly

Price point here is a bit too high... They have bundled so many things together into this that the sticker shock on the price is too much.

I get what they're trying to do but if they were serious about this they would include some other small subscriptions as well... I should get some number of free movies on YouTube per month, I should be able to cancel a bunch of my other subscriptions... I should get free data with this or a free phone or something... I could see some value if I could actually just have one subscription but I'm not going to spend $250 a month on just another subscription to add to the pile...

ehsankia

They put in everything that makes sense; I don't know if including random movies would.

They've got YouTube Premium, which is like $15. 30TB of storage is a bit excessive and has no direct equivalent, but 20TB is around $100 a month.

highwaylights

I’m not seeing the relevance of YouTube and the One services to this at all.

I get that Big Tech loves to try to pull you into their orbit whenever you use one of their services, but this risks alienating customers who won’t use those unrelated services and may begrudge Google making them pay for them.

j_maffe

Idk if anyone will see these offerings as more than just an added bonus, especially when you compare to OAI, which asks for more for only the AI models.

bezier-curve

For $250/mo I would hope it includes API access to Gemini 2.5 pro, but it's nice to want things.

pc86

As a consumer it seems to me the low hanging fruit for these super-premium offerings is some substantial amount of API credits included every month. Back when API credits were a relatively new thing I used LLMs infrequently enough I just paid $5-10/mo for API credits and used a desktop UI to talk to ChatGPT.

Now they want $200, $250/mo which is borderline offensive, and you have to pay for any API use on top of that?

highwaylights

I can’t see a way that anyone would be able to give uncapped access to these models for a fixed price (unless you mean it’s scoped to your own use and not for company use? Even then, that’s still a risk to the provider.)

bezier-curve

I use Msty a lot for personal use. I like its ability to fork conversations. Seems like a simple feature but even ChatGPT's UI, which everyone has tried to copy, is fairly limited by comparison.

Aurornis

Putting API use into the monthly plans doesn't make a lot of business sense. The only people who would sign up specifically to use API requests on a monthly plan would be looking to have a lower overall bill, then they'd pay-per-request after that. It would be a net loss.

ir77

People here keep saying that this is targeted at big companies/corporations. The big company that I work for explicitly blocks uploads of data to these services, and we're forbidden to put anything company-related in there for many reasons, even if you use your own account; we don't have 'company accounts'.

So no, I can't see companies getting all excited about buying $250/mo per-user licenses for their employees for Google or ChatGPT to suck in their proprietary data.

verdverm

These subscriptions explicitly do not suck in your proprietary data, it's all laid out in their ToS.

quantumHazer

Yeah, and who will hold them accountable? How can you verify that they're not stealing your data anyway? These companies don't give a shit about copyright or privacy.

spaceman_2020

Honestly, at this point, nation states will have to figure out an AI strategy. Poorer countries where the locals can't afford cutting edge AI tools will find themselves outproduced by wealthier workers.