
How AGI became the most consequential conspiracy theory of our time

7734128

Who tolerates a website that immediately pushes three overlapping pop-ups in a free user's face?

Why would anyone subject themselves to so much hatred? Have some standards.

ceejayoz

Who raw-dogs the internet without an adblocker? Have some standards.

mort96

I use uBlock Origin (the full fat version in Firefox, not the lite version). It doesn't help, because the pop-ups aren't ads. There's one asking me if I wanna be spied on, one asking me to subscribe or sign in, and one huge one telling me that there's currently a discount on subscriptions.

ceejayoz

I've got uBlock Origin on Firefox desktop too, and none of those show. Turn on more of the filter lists in the settings - especially the stuff in the "Cookie notices" and "Annoyances" sections.

erikerikson

Rumor has it that some people saw the "ads pay for it all" business model and accepted the deal because they wanted the Internet to be sustainable.

ceejayoz

I mean, that's a two-sided deal: "You watch ads, you read content." But that deal has been increasingly broken by the ad networks and websites; a lot of sites are unnavigable without an adblocker now.

The days of plain text Google AdWords are long, long gone.

Workaccount2

The people making it still worthwhile to post content online.

krupan

You should probably ask an AI to read it and summarize it for you.

hagbard_c

Only those who made the mistake of not using a content filter like uBlock Origin or something equally effective. I just visited the site and got neither pop-ups nor ads.

TheAceOfHearts

> At the core of most definitions you’ll find the idea of a machine that can match humans on a wide range of cognitive tasks.

I expect this definition will eventually be proven incorrect. It would be better described as "human-level AGI" rather than AGI in general. AGI is a system that matches a core set of properties, but it isn't necessarily tied to any particular level of capability. Theoretically, one could create a very small, resource-limited AGI. The amount of computational resources available to the AGI will probably be one of the factors that determines whether it is, e.g., cat level vs. human level.

null

[deleted]

everdrive

People are interested in consciousness much the same way that we see faces in the clouds. We just think we're going to find it everywhere: weather patterns, mountains, computers, robots, in outer space, etc.

If we were dogs, we'd invent a basic computer and start writing scifi films about whether the computers could secretly smell things. We'd ask "what does the sun smell like?"

Terr_

Very little I disagree with there, so just nibbling at the edges.

> a scheme that’s flexible enough to sustain belief even when things don’t work out as planned; the promise of a better future that can be realized only if believers uncover hidden truths; and a hope for salvation from the horrors of this world.

Sometimes 90% of the "hidden truths" are things already "known" by the believers, an elite knowledge that sets them apart from the sheeple. The remaining 10% is acquiring some McGuffin that finally proves they were Right-All-Along so that they can take a victory lap.

> Superintelligence is the hot new flavor—AGI but better!—introduced as talk of AGI becomes commonplace.

In turn, AGI was the hot new flavor—AI but better!—introduced as consumers started getting disappointed/jaded with the limits of what was available.

> When those people are not shilling for utopia, they’re saving us from hell.

Yeah, much like how hatred is not really the opposite of love, the "AI doom" folks are really just a side-sect of the "AI awesome" folks.

> But what if there are, in fact, shadowy puppet masters here—and they’re the very people who have pushed the AGI conspiracy hardest all along? The kings of Silicon Valley are throwing everything they can get at building AGI for profit. The myth of AGI serves their interests more than anybody else’s.

Yes, the economic engine behind all this, the potential to make money, is what really supercharges everything and lifts it out of niche communities.

foxfired

I remember when ChatGPT 3.5 was going to be AGI, then 4, then 4o, etc. It's kinda like doomsday predictions: even if they fail, it's OK, because the next one, oh, that's the real doomsday. I, for one, am waiting for a true AI Scotsman [0].

[0]: https://idiallo.com/byte-size/ai-scotsman

retube

I had to click through 5 (yes, 5, I counted) pop-up overlays to get to the article (including 2 cookie ones, cos I guess the usual one is not infuriating enough).

hgomersall

I just clicked on reader mode in Firefox. Worked perfectly.

Krasnol

> Ilya Sutskever, cofounder and former chief scientist at OpenAI, is said to have led chants of “Feel the AGI!” at team meetings.

There is...chanting in team meetings in the US?

Has this been going on for long, or is this some new trend picked up from Asia or something like that?

teeray

[delayed]

fabian2k

I don't think that is new. Back when Walmart tried to expand to Germany, it was reported that they had employees do some Walmart chant. As you can guess, this didn't go over well with German employees.

aomix

I was going to bring up the Walmart example too. In the onboarding meeting we were told that Walmart's founder overheard it at a Korean manufacturer and liked it.

Krasnol

Yeah, I heard that too, but I assumed it was just a thing in that sector, not something highly paid employees actually have to participate in.

uvaursi

FEEL THE AGI.

This is a meme that will keep on giving.

Copenjin

Like conspiracies, it works only on the most fragile and on people who already hold an adjacent set of beliefs. AGI/ASI is all bullshit narrative, but yeah, we have useful artifacts that will keep getting better even if they never become A*I.

cocomutator

Today is the day I stopped reading opinion pieces from MIT Technology Review. Not for presenting an opinion I don't agree with, but for mistaking word soup for an argument.

boothby

There are two competing figures in motion: human intelligence and computer intelligence. Either can win by sufficient reduction of the other.

jahewson

I very much dislike the way this article blurs religious and doomsday thinking with conspiracy theory thinking. There’s nobody conspiring on the other side of AGI. Other than that it makes many good observations.

FridayoLeary

> And there it is: You can’t prove it’s not true. “The idea that AGI is coming and that it’s right around the corner and that it’s inevitable has licensed a great many departures from reality,” says the University of Edinburgh’s Vallor. “But we really don’t have any evidence for it.”

That's the most important paragraph in the article. All of the self-serving exaggerations of Sam Altman and his ilk, predicting things and throwing out figures they cannot possibly know: "AI will cure cancer, and dementia! And reverse global warming! Just give more money to my company, which is a non-profit and is working for the good of humanity. What's that? Do you mean to say you don't care about the good of humanity?" What is the word for such behaviour? It's not hubris; it's a combination of wild prophecy and severe main-character syndrome.

I heard once, though I have no idea if it's true, that he claims he carries a remote control around with him to nuke his data centres if they ever start trying to kill everyone. Which is obviously nonsense, but is exactly the kind of thing he might say.

In the meantime they're making loads of money by claiming expertise in a field which doesn't even exist and, in my opinion, never will, and that's the main thing, I suppose.

Krasnol

> I heard once, though i have no idea if it's true that he claims he carries a remote control around with him to nuke his data centres if they ever start trying to kill everyone.

That would be quite useless even if it exists, since now that you've said it, the AGI/ASI/whatever will surely know about it and take appropriate measures!

FridayoLeary

Oh no! Someone better phone up Sam Altman and warn him of my terrible blunder. I would hate to be the one responsible for the destruction of the entire universe.