
World's first petahertz transistor at ambient conditions

kragen

https://www.nature.com/articles/s41467-025-59675-5 is the paper, claiming "~1.6 petahertz speed." That would be 190-nanometer wavelength, which is into the so-called "far ultraviolet" band of germicidal UVC, 6.6eV photon energy, on the edge of vacuum ultraviolet. And they're switching it with light. So, I wonder how long these devices will last if you keep using them?
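For reference, the arithmetic behind those numbers (constants rounded, so treat the outputs as approximate):

```python
# Sanity check on the figures above: a 1.6 PHz oscillation corresponds to
# roughly 190 nm light at about 6.6 eV per photon.
c = 2.998e8        # speed of light, m/s
h = 6.626e-34      # Planck constant, J*s
eV = 1.602e-19     # joules per electronvolt

f = 1.6e15                      # "~1.6 petahertz"
wavelength_nm = c / f * 1e9     # wavelength = c / f
photon_eV = h * f / eV          # photon energy = h * f

print(f"{wavelength_nm:.0f} nm, {photon_eV:.2f} eV")
```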

They say the light pulse is 6.5fs FWHM, so they weren't able to switch it on and off 1.6 quadrillion times per second; it's just that the transition from on (29nA) to off (<1nA) was only 630 attoseconds long, which is what they're describing as "petahertz". But "petahertz" implies a whole cycle time under a femtosecond, a cycle which would involve two transitions, which would presumably be 1.26 femtoseconds at this speed. (If they measured the speed of the off-to-on transition, I missed it skimming the paper.) And the actual light they're making the 6.5fs pulse out of is a "supercontinuum laser beam that spans over 400–1000 nm". That's still blue enough to raise some concerns about device longevity (though maybe graphene will prove tougher than certain other semiconductors which shall not be named here), but not to the same degree as if they were using actual petahertz light.

I think the 2× exaggeration is sort of forgivable, and nothing else seems to be intentionally misleading, but it would still be easy to draw incorrect conclusions from the headline.

Aurornis

I’m also confused about this. In EE it’s normal to use rise times to calculate bandwidth, but unless I’m missing something they didn’t do that correctly either.

It would be such a strange mistake to occur on a paper about a topic of this caliber that I feel like I must be missing something.

kragen

I suspect that maybe the rise time was much slower than the fall time, so it was the fall time they were excited about. But yeah, I'd think a 630-attosecond fall time represents 500–800 terahertz of bandwidth, not a petahertz or more.
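For what it's worth, the usual EE rule of thumb (bandwidth ≈ 0.35 / transition time for a first-order response, with coefficients up to about 0.45 for steeper rolloffs) lands in exactly that range:

```python
# Rule-of-thumb bandwidth estimate from a 630-attosecond transition time.
t_transition = 630e-18           # seconds
bw_low = 0.35 / t_transition     # ~5.6e14 Hz (first-order response)
bw_high = 0.45 / t_transition    # ~7.1e14 Hz (steeper-rolloff assumption)
print(f"{bw_low / 1e12:.0f}-{bw_high / 1e12:.0f} THz")
```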

kerblang

For those playing subsecond bingo at home, the Wikipedia reference table:

https://en.wikipedia.org/wiki/Orders_of_magnitude_(time)#Les...

gsf_emergency

My read on the social context: they want to make a VUV/EUV solid state laser (for you-know-what). But they can't make any publishable headway in that direction, so they hype their results with the impressive sounding "petahertz" (terahertz desktop lasers at ambient conditions are mostly already in the bag)

So, ask a trusted peer whether this is a something-burger or an attempted academic pivot. Since life is too short to spend on science you can't call your own: look for invites from ASML/TSMC/SMIC. That's what happened with Sn vapor lasers, famously fickle things. The company that first got it to work properly (Cymer) isn't even on the relevant Wikipedia page? And on the ASML page they are associated with DUV? Really pushing the good faith here, I think. I first paid attention thanks to interest from ASML.

(Disclaimer: don't know any experimentalists working in the exact domain)

E: sn vapor (laser-pumped) light source

kragen

Hmm, why do you think that's what they were trying to do?

ChuckMcM

This is some great research. The paper is here: https://www.nature.com/articles/s41467-025-59675-5.pdf and two things stand out in it: first, that they used a "commercial graphene transistor", and second, that their apparatus didn't need to be super-cooled, held under tens of atmospheres of pressure, kept in vacuum, etc. For me, that means the risks of bringing this into an actual thing are much less than they have been for other technologies (like Josephson junctions).

It's also kind of funny that you could mine the shit out of Bitcoin with something like this, which would either pay for itself or crash Bitcoin, hard to predict.

dgfl

This is an optical transistor, meaning that a current is controlled with an optical pulse. That means that you can't pipe these things into each other, unless you can build an equivalently fast and efficient light->charge transducer (i.e. a photodetector). Moreover, this physically can't be scaled below approximately the wavelength of the laser (meaning at least 10x larger than CMOS transistors).

It might turn out to be great for the applications that they point out in the paper itself, not so much for logic. I would say bitcoin mining discussions are a bit premature, and potentially not relevant.

programjames

> That means that you can't pipe these things into each other, unless you can build an equivalently fast and efficient light->charge transducer (i.e. a photodetector).

These exist:

https://ultrafast.mit.edu/

knome

>(meaning at least 10x larger than CMOS transistors)

at petahertz (10^15) speeds, you could sacrifice a lot of space for larger components, and still come out on top vs gigahertz speeds (10^9) by doing more work but a hell of a lot faster, no?

if you can build a chip that's a million times faster, you can sacrifice 3/4 of that speed to doing more work with fewer components and still be 250,000x faster.

formerly_proven

No, because propagation delay is the same.

hulitu

> This is an optical transistor, meaning that a current is controlled with an optical pulse.

So more like an optical triode (the transistor amplifies).

thephyber

Regarding BTC, just because a single transistor could run faster doesn’t mean it’s necessarily a good candidate for BTC mining (e.g. most mass miners use ASICs, not the fastest CPUs, because they care about the fastest way to check LOTS of candidate hashes in parallel, not a single one at an extremely high rate). A light-based transistor would either require the rest of the system to run on light, or it would have to be mated to silicon/electrical hardware, which would slow down the clock rate of the system.

BTC Mining adjusts based on difficulty. IIRC (I haven’t looked into the specifics in a few years) the protocol will adjust so a block should be mined every 10 mins or so. If there is a massive leap in hardware capabilities, the protocol automatically throws A LOT more difficulty at the mining problem.

One way to crash BTC using a massive hardware advance: dominate the mining for ~2 weeks, long enough until the mining difficulty is adjusted upwards, then stop mining. Much higher difficulty + only older hardware doing the work = very slowly processed blocks and a growing backlog of transactions. Eventually the protocol would reduce the difficulty again to restore the block rate, but the attack could be maintained by thrashing the miner on/off just after the difficulty is adjusted up/down.
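A toy model of that retarget rule (the real protocol recalculates every 2,016 blocks and clamps each adjustment to a factor of 4; the function below is an illustrative simplification, not consensus code):

```python
# Simplified Bitcoin difficulty retarget: scale by expected/actual window
# time, clamped to a factor of 4 in either direction.
EXPECTED = 2016 * 600  # seconds per 2016-block window at 10 min/block

def retarget(difficulty, actual_seconds):
    ratio = EXPECTED / actual_seconds
    return difficulty * max(0.25, min(4.0, ratio))

d = 1.0
d = retarget(d, EXPECTED / 10)  # attacker mines the window 10x too fast
print(d)                        # clamped: difficulty only quadruples, to 4.0
d = retarget(d, EXPECTED * 4)   # attacker leaves; honest miners take 4x as long
print(d)                        # back to 1.0, but only after ~8 weeks of slow blocks
```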

HPsquared

~95% of bitcoins have already been mined, market cap is $2T, so you'd expect another $100B to go (assuming no price change).

wslh

Mined bitcoins are distinct from the mining process itself: even after all bitcoins have been mined, miners will continue to operate and earn transaction fees for validating and securing the network.

bee_rider

I guess we’ll see… I mean, bitcoin transactions will have to be valuable enough that we’re willing to pay extra (vs credit or debit card transactions) to maintain that network, right?

Someone

> It's also kind of funny that you could mine the shit out of Bitcoin with something like this, which would either pay for itself or crash Bitcoin, hard to predict.

Bitcoin has a built-in mechanism to counteract improvements in hashing speed (either because of hardware getting faster, algorithmic improvements, or more hardware getting devoted to hashing).

See https://en.wikipedia.org/wiki/Bitcoin#Mining: “The difficulty of generating a block is deterministically adjusted based on the mining power on the network by changing the difficulty target, which is recalibrated every 2,016 blocks (approximately two weeks) to maintain an average time of ten minutes between new blocks“

I think there’s more than enough range available here to handle a million-fold increase in hashing power.
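One way to put a number on that: because each retarget is clamped to a factor of 4, absorbing even a million-fold jump in hashpower would take only about ten adjustment windows (which a million-fold-faster miner would burn through very quickly):

```python
import math

# Number of 2016-block retarget windows needed for difficulty to catch up
# with a sudden 1,000,000x increase in hashing power, given the 4x clamp.
windows = math.ceil(math.log(1_000_000) / math.log(4))
print(windows)
```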

inhumantsar

2016 blocks is a lot though. that's nearly $700 million in mining fees.

if someone had a monopoly on chips like these, they could dominate the network and freeze out other miners. which would likely tank the network and make those BTC worthless

Someone

That is a risk but not a new risk. It existed with the transition to ASIC miners, too, and didn’t happen.

I guess that didn’t happen because making ASICs is relatively easy, but even if it isn’t, what would be in it for a potential monopolist to tank the value of bitcoin in that way? They’d do better to take a large but not overly large part of the market and keep mining money for a long time, only speeding up when a competitor steps in.

Devasta

So if you had a machine that could solve blocks at a tremendous pace, could you rapidly go through 2016 blocks with minimal transactions processed, then just switch your machine off?

The difficulty will have gone through the roof and transaction processing time by the rest of the network will then slow to a crawl.

TZubiri

Even with difficulty adjustments you could mine a shit ton, you would basically be mining all of the blocks every 10 minutes for yourself. And if the advantage is significant you could do a 51% attack.

stretchwithme

Or maybe just take ownership of Bitcoin nobody can access, potentially much more profitable.

Just don't sell it all at once.

gosub100

Isn't that assuming you could pack enough of these PHz transistors to make an asic capable of calculating SHA-256? That's quite an endeavor if they have just created one.

Has anyone even made a flip-flop or latch with any optical transistor yet?

jlokier

Since you asked, yes, optical flip-flops have been around for decades.

That said, you don't need flip-flops or latches to calculate SHA-256 for mining Bitcoins. You only need them at the edges of the circuit, to use the results. But you can do that with electronics at the edge, if you want to avoid stateful logic in the all-optical part.

ziofill

In 630as light travels about 0.2 microns. If that's the clock cycle, a chip would need to do some amazing coordination for bits to reach gates at the same time, and there would be many many cycles before a signal reaches the other side of the chip. Bonkers.
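A quick check on that distance (constants rounded):

```python
# Distance light travels in one 630-attosecond switching transition.
c = 2.998e8          # speed of light, m/s
t = 630e-18          # seconds
distance_um = c * t * 1e6
print(f"{distance_um:.2f} micrometers")  # ~0.19 um
```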

mjrpes

> A study published in Nature Communications highlights how the technique could lead to processing speeds in the petahertz range – over 1,000 times faster than modern computer chips.

A 1 petahertz chip would be 200,000× faster than a 5 gigahertz chip. You've skipped past the terahertz range.

booli

This seems very huge, or am I missing something fundamental that's not included in the paper?

misja111

Yes: contrary to what the title claims, at this point there is no transistor working at petahertz frequency at all. All there is is a promising new technology.

amelius

Maybe they should have mentioned that interconnect on a chip cannot handle these speeds.

dgfl

This is a laser controlled device. Even the terminology of "interconnect" is not really applicable. Your best hope is an optical waveguide coupled to the device, definitely not a metal line. It's not even a transistor in the traditional sense really.

parsimo2010

This has limited applications. It doesn't have a viable path to being used in a CPU or GPU. So we're not going to see a zillion-fold increase in compute speeds from this. Maybe some physicists find it useful for an experiment, but the average joe won't notice anything different about the world.

manmal

I thought that the speed of light limits the max possible frequency to the sub THz range, at current chip sizes.

bjourne

So rather than electrons flowing through regular transistors, you would have photons flowing through phototransistors? Wouldn't one problem be casting light beams with widths in the nanometer or picometer range?

IAmBroom

It's not clear at all what path the photons are taking. I read it at first as them travelling as a standing wave, blocking the electrons until the transistor "flips".

If the path of the photons is indeed transverse to the flow of charge, millions of transistors could share a single wavefront.

AlienRobot

I used to think electrons flow through circuits, then I learned they actually move extremely slowly, so when you flip a switch there is no way an electron made it all the way to the lamp and back. So now I assume the energy of an electron is transferred to the rest through the electromagnetic field in the circuit or something like that? Honestly, I don't have the slightest idea how any of this works and what is true and what is false anymore. It's not really relevant to computers and yet it's all very fascinating.

GuB-42

Petahertz?

It makes me raise so many questions. 1 PHz corresponds to a wavelength of 300nm, UV light. How does it make sense? It can't be the transistors we are used to, that's all quantum weirdness at this point. How do you even use them? Things like copper wires feel meaningless at these scales.

amelius

They're aiming a bit high. I'm ok with a terahertz CPU for the coming years.

AnimalMuppet

At that clock rate, propagation delays are going to be a severe issue.

manmal

The speed of light is a hard limit; the only way to make use of this switching speed is to make the chip vanishingly small.

actinium226

I can't wait to watch cat videos at petahertz speed.

mkoryak

Just don't watch too many or you will experience the catahurtz speed.

stretchwithme

Moore's Law ain't over til it's over.