AV1 – Now Powering 30% of Netflix Streaming

crazygringo

Wow. To me, the big news here is that ~30% of devices now support AV1 hardware decoding. The article lists a bunch of examples of devices that have gained it in the past few years. I had no idea it was getting that popular -- fantastic news!

So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support, I wonder what will be the next one?

JoshTriplett

> So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support, I wonder what will be the next one?

Hopefully AV2.

jsheard

H266/VVC has a five-year head start over AV2, so probably that first, unless hardware vendors decide to skip it entirely. The final AV2 spec is due this year, so any day now, but it'll take a while to make its way into hardware.

adgjlsfhk1

H266 is getting fully skipped (except possibly by Apple). The licensing is even worse than H265, the gains are smaller, and Google+Netflix have basically guaranteed that they won't use it (in favor of AV1 and AV2 when ready).

adzm

VVC is pretty much a dead end at this point. Hardly anyone is using it; its benefits over AV1 are extremely minimal, and no one wants the royalty headache. Basically everyone learned their lesson with HEVC.

kevincox

If it has a five-year head start and we've seen almost zero hardware shipping, that is a pretty bad sign.

IIRC AV1 decoding hardware started shipping within a year of the bitstream being finalized. (Encoding took quite a bit longer, but that is pretty reasonable.)

dehrmann

Not trolling, but I'd bet something that's augmented with generative AI. Not to the level of describing scenes with words, but context-aware interpolation.

km3r

https://blogs.nvidia.com/blog/rtx-video-super-resolution/

We already have some of the stepping stones for this. But honestly it's much better for upscaling poor-quality streams; on an already good stream it just gives things a weird feeling.

randall

for sure. macroblock hinting seems like a good place for research.

dylan604

how does that mean "~30% of devices now support AV1 hardware encoding"? I'm guessing you meant hardware decoding???

crazygringo

Whoops, thanks. Fixed.

snvzz

>So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support

That'd be h264 (associated patents expired in most of the world), vp9 and av1.

h265 aka HEVC is less common due to dodgy, abusive licensing. Some vendors even disable it in drivers despite hardware support, because it is nothing but legal trouble.

IgorPartola

Amazing. Proprietary video codecs need to not be the default and this is huge validation for AV1 as a production-ready codec.

bofaGuy

Netflix has been the worst performing and lowest quality video stream of any of the streaming services. Fuzzy video, lots of visual noise and artifacts. Just plain bad, and this is on the 4K plan, on gigabit fiber, on a 4K Apple TV. I can literally tell when someone is watching Netflix without being told, because it looks like shit.

Eduard

I'm surprised AV1 usage is only at 30%. Is AV1 so demanding that Netflix clients without AV1 hardware acceleration capabilities would be overwhelmed by it?

FrostKiwi

Thanks to libdav1d's [1] lovingly hand-crafted SIMD assembly it's actually possible to play back AV1 reasonably without hardware acceleration, but basically yes: hardware decode starts from Snapdragon 8 onwards, Google Tensor G3 onwards, NVIDIA RTX 3000 series onwards. All relatively new.

[1] https://code.videolan.org/videolan/dav1d
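For a rough sense of what software decode costs, here's a minimal sketch (mine, not from the article) that times dav1d-backed AV1 decoding through PyAV; the package, the build assumption, and the file name are all assumptions:

    # Minimal sketch: time software AV1 decoding with PyAV.
    # Assumes PyAV ("pip install av") is built against an FFmpeg
    # that includes libdav1d; "sample_av1.mkv" is a placeholder.
    import time
    import av

    container = av.open("sample_av1.mkv")
    stream = container.streams.video[0]

    frames = 0
    start = time.perf_counter()
    for _frame in container.decode(stream):
        frames += 1
    elapsed = time.perf_counter() - start

    print(f"decoded {frames} frames in {elapsed:.1f}s "
          f"({frames / elapsed:.1f} fps)")

If the reported fps comfortably exceeds the video's frame rate, software playback is viable on that machine (battery cost aside).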

adgjlsfhk1

There are a lot of 10-year-old TVs/fire sticks still in use that have a CPU that maxes out running the UI and rely exclusively on hardware decoding for all codecs (e.g. they couldn't software-decode h264 either). Imagine a super-budget phone from ~2012 and you'll have some idea of the hardware capability we're dealing with.

eru

If you are on a mobile device, decoding without hardware assistance might not overwhelm the processors directly, but it might drain your battery unnecessarily fast?

boterock

tv manufacturers don't want high end chips for their tv sets... hardware decoding is just a way to make cheaper chips for tvs.

resolutefunctor

This is really cool. Props to the team that created AV1. Very impressive

tr45872267

>AV1 sessions use one-third less bandwidth than both AVC and HEVC

Sounds like they set HEVC to higher quality then? Otherwise how could it be the same as AVC?

pornel

There are other possible explanations, e.g. AVC and HEVC are set to the same bitrate, so AVC streams lose quality, while AV1 targets HEVC's quality. Or they compare AV1 traffic to the sum of all mixed H.26x traffic. Or the rates vary in more complex ways and that's an (over)simplified summary for the purpose of the post.

Netflix developed VMAF, so they're definitely aware of the complexity of matching quality across codecs and bitrates.
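To make one of those scenarios concrete, here's a toy calculation with invented numbers (not Netflix's):

    # Toy numbers, not Netflix's: one way "one-third less than both"
    # can hold is if the AVC and HEVC ladders top out at the same
    # bitrate (HEVC buying quality rather than savings), while AV1
    # targets HEVC's quality at a lower rate.
    avc_kbps = 6000                   # hypothetical top ladder rung
    hevc_kbps = 6000                  # same rate, higher quality per bit
    av1_kbps = hevc_kbps * (1 - 1/3)  # 4000 kbps
    assert av1_kbps == avc_kbps * (1 - 1/3)  # one-third less than both
    print(f"AV1 rung: {av1_kbps:.0f} kbps")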

tr45872267

I have no doubt they know what they are doing. But it's a strange metric no matter how you slice it. Why compare AV1's bandwidth to the average of h.264 and h.265, and without any more details about resolution or compression ratio? Reading between the lines, it sounds like they use AV1 for low bandwidth, h.265 for high bandwidth, and h.264 as a fallback. If that is the case, why bring up this strange average bandwidth comparison?

dylan604

definitely reads like "you're holding it wrong" to me as well

pbw

There's an HDR war brewing on TikTok and other social apps. A fraction of posts that use HDR are just massively brighter than the rest; the whole video shines like a flashlight. The apps are eventually going to have to detect HDR abuse.

recursive

My phone has this cool feature where it doesn't support HDR.

munificent

Just what we need, a new loudness war, but for our eyeballs.

https://en.wikipedia.org/wiki/Loudness_war

eru

Interestingly, the loudness war was essentially fixed by the streaming services. They were in a similar situation to the one TikTok is in now.

crazygringo

This is one of the reasons I don't like HDR support "by default".

HDR is meant to be so much more intense that it should really be limited to things like immersive, full-screen, long-form-ish content. It's for movies, TV shows, etc.

It's not what I want for non-immersive videos you scroll through, ads, etc. I'd be happy if it were disabled by the OS whenever not in full screen mode. Unless you're building a video editor or something.

JoshTriplett

Or a photo viewer, which isn't necessarily running in fullscreen.

jsheard

Sounds like they need something akin to audio volume normalization but for video. You can go bright, but only in moderation, otherwise your whole video gets dimmed down until the average is reasonable.
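As a minimal sketch of that idea (my interpretation, not any platform's actual behavior; the nit cap is a made-up number):

    # Sketch: cap a video's *average* luminance rather than its peaks,
    # so brief highlights survive but wall-to-wall brightness is dimmed.
    import numpy as np

    MAX_MEAN_NITS = 203.0  # hypothetical cap, roughly SDR reference white

    def normalize_brightness(frames_nits: np.ndarray) -> np.ndarray:
        """frames_nits: (num_frames, height, width) luminance in nits."""
        mean_nits = frames_nits.mean()
        if mean_nits <= MAX_MEAN_NITS:
            return frames_nits  # bright moments in moderation are fine
        # Whole video is too bright on average: dim it uniformly.
        return frames_nits * (MAX_MEAN_NITS / mean_nits)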

JoshTriplett

That's true on the web, as well; HDR images on web pages have this problem.

It's not obvious whether there's any automated way to reliably detect the difference between "use of HDR" and "abuse of HDR". But you could probably catch the most egregious cases, like "every single pixel in the video has brightness above 80%".
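A crude version of that check might look like this (illustrative only; the 80% threshold is from the sentence above, everything else is assumed):

    # Flag a clip where nearly every pixel, in every frame, sits
    # above 80% of the display's peak brightness.
    import numpy as np

    def looks_like_hdr_abuse(frames_nits: np.ndarray,
                             peak_nits: float = 1000.0,  # assumed display peak
                             threshold: float = 0.8,     # "above 80%"
                             min_fraction: float = 0.99) -> bool:
        """frames_nits: (num_frames, height, width) luminance in nits."""
        bright_fraction = (frames_nits > threshold * peak_nits).mean()
        return bright_fraction >= min_fraction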

eru

> It's not obvious whether there's any automated way to reliably detect the difference between "use of HDR" and "abuse of HDR".

That sounds like a job our new AI overlords could probably handle. (But that might be overkill.)

ElasticBottle

Can someone explain what the war is about?

Like HDR abuse makes it sound bad, because the video is bright? Wouldn't that just hurt the person posting it since I'd skip over a bright video?

Sorry if I'm phrasing this all wrong, don't really use TikTok

JoshTriplett

> Wouldn't that just hurt the person posting it since I'd skip over a bright video?

Sure, in the same way that advertising should never work since people would just skip over a banner ad. In an ideal world, everyone would uniformly go "nope"; in our world, it's very much analogous to the https://en.wikipedia.org/wiki/Loudness_war .

dylan604

sounds like every fad that came before it, overused by all of the people copying it with no understanding of what it is or why. remember all of the HDR still images that pushed everything to look post-apocalyptic? remember all of the people pushing washed-out videos because they didn't know how to grade images recorded in log, and it became a "thing"?

eventually, it'll wear itself out just like every other overuse of the new

hbn

HDR videos on social media look terrible because the UI isn't in HDR while the video is. So you have this insanely bright video that more or less ignores your brightness settings, and then dim icons on top of it that almost look incomplete or fuzzy because of their surroundings. It looks bizarre and terrible.

NathanielK

It's good if you have black text on a white background, since your app can have good contrast without searing your eyes. People started switching to dark themes to avoid having their eyeballs seared by monitors with the brightness turned up high.

For things filmed with HDR in mind it's a benefit. Bummer things always get taken to the extreme.

nine_k

But isn't that the point? Try looking at a light bulb; everything around it is so much less bright.

OTOH pointing a flashlight at your face is at least impolite. I would put a dark filter on top of HDR videos until a video is clicked for watching.

ls612

On a related note, why are release groups not putting out AV1 WEB-DLs? Most 4K stuff is h265 now but if AV1 is supplied without re-encoding surely that would be better?

avidiax

I looked into this before, and the short answer is that release groups would be allowed to release in AV1, but the market seems to prefer H264 and H265 because of compatibility and release speed. Encoding AV1 to an archival quality takes too long, reduces playback compatibility, and doesn't save that much space.

There are also no scene rules for AV1, only for H265. [1]

[1] https://scenerules.org/html/2020_X265.html

ls612

Yeah, I'm talking about WEB-DL though, not a rip, so there is no encoding necessary.

Dwedit

Because pirates are unaffected by the patent situation with H.265.

ls612

But isn’t AV1 just better than h.265 now regardless of the patents? The only downside is limited compatibility.

kvirani

Top post without a single comment and only 29 points. Clearly my mental model of how posts bubble to the top is broken.

yjftsjthsd-h

IIRC, there's a time/recency factor. If we assume that most people don't browse /newest (without commenting on whether they should, I suspect this is true), then that seems like a reasonable way to help surface things; enough upvotes to indicate interest means a story gets a chance at the front page.