
Of course I remember HDR10+, along with HDR10 and HLG, all of which are quite common and broadly used.

Hollywood movies primarily standardized on Dolby Vision, but the entire HDR ecosystem very much did not. Sony cameras, for example, primarily shoot only in HLG, even their cinema cameras.

Similarly games regularly opt for HDR10/HDR10+ for their HDR output instead of Dolby Vision. Why? Because it's cheaper, and dynamic metadata is largely pointless in an environment where the lighting content of the next dozen frames isn't known.
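
For context: the static metadata HDR10 carries is just a handful of numbers declared once for the whole stream (the SMPTE ST 2086 mastering display volume plus MaxCLL/MaxFALL). A rough sketch of what that amounts to, with illustrative field names and values rather than any real graphics or encoder API:

    # Rough sketch of HDR10 static metadata (SMPTE ST 2086 + content light levels).
    # Field names and example values are illustrative, not any particular API.
    from dataclasses import dataclass

    @dataclass
    class HDR10StaticMetadata:
        red_primary: tuple = (0.708, 0.292)      # BT.2020 red (CIE 1931 x, y)
        green_primary: tuple = (0.170, 0.797)    # BT.2020 green
        blue_primary: tuple = (0.131, 0.046)     # BT.2020 blue
        white_point: tuple = (0.3127, 0.3290)    # D65
        max_mastering_luminance: float = 1000.0  # nits
        min_mastering_luminance: float = 0.005   # nits
        max_cll: int = 1000   # brightest pixel anywhere in the content, nits
        max_fall: int = 400   # highest frame-average light level, nits

    # A game declares something like this once per session. Dynamic metadata
    # (HDR10+/Dolby Vision) instead describes luminance per scene or per frame,
    # which a game rendering in real time can't know ahead of time anyway.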



> Hollywood movies primarily standardized on Dolby Vision

No, pretty much the entire video/streaming industry did. Apple, Netflix, Disney, HBO, etc., all stream in either DV or HDR10 (non-plus).

Physical Blu-ray is slowly dying (I own a bunch of those), so streaming is really where most of the HDR video content lives.

> Similarly games regularly opt for HDR10/HDR10+ for their HDR output instead of Dolby Vision

Fair point, but consumers keep complaining the PS5 doesn't have DV which is an indicator of what people want. DV is actually a big selling point for the Xbox Series X.

On PC, I don't know. I've been playing HDR on consoles for years, but support on Windows has been pretty bad until recently. My impression is that HDR is much more popular on consoles than on PC. Same with Atmos and surround.


That's funny, since PS5 doesn't support Atmos at all and on Windows you need to buy a paid plugin to make it work for anything that's not a Home Theatre system.

(And even if you have a home theatre system, Windows games will still prefer outputting 5.1 / 7.1 PCM and mixing 3D effects by themselves).

I'd also be interested to hear where those Dolby Vision complaints for PS5 are coming from; I haven't heard anyone really say that, despite HDR being debated quite a lot :)


> you need to buy a paid plugin to make it work for anything that's not a Home Theatre system.

You need to buy a license if you want things Atmos-ified (so HRTF) for your stereo headphones. It's basically worthless.

You don't need to buy anything if your media player can decode and downmix Atmos to surround (like Windows Movies & TV).

Home Theatre systems just get Atmos passed through to them if compatible, so they can then decode and downmix the positional audio according to your configuration.

> Windows games will still prefer outputting 5.1 / 7.1 PCM and mixing 3D effects by themselves

I wish. If games have surround at all, it's usually only analog 5.1/7.1. You need Dolby Digital Live for a digital surround output in most cases (and that can be a PITA to arrange, e.g. patched Realtek drivers). DDL basically provides a fake analog output for software and then sends a compressed digital signal to your decoder.


HDR on PCs has more important issues than the lack of Dolby Vision. Well, I guess you could say Dolby Vision would indirectly benefit, because it requires a minimum of 1000 nits of brightness for certified displays, while VESA will gladly certify a 400-nit peak brightness display as HDR capable.

> Fair point, but consumers keep complaining the PS5 doesn't have DV which is an indicator of what people want. DV is actually a big selling point for the Xbox Series X.

They want it because there are realistically two options right now: HDR10 (not HDR10+) and Dolby Vision, with DV being superior in every aspect from the viewer's perspective. I didn't even know about HDR10+ until today. In other words, what people actually want is HDR dynamic metadata, because it looks a lot better than static metadata.

Since, like you said, every streaming service is either DV or static HDR10, people end up saying they want DV.


> Well, I guess you could say Dolby Vision would indirectly benefit, because it requires a minimum of 1000 nits of brightness for certified displays

It unfortunately does not. That's what a Dolby Vision mastering display requires, but to get the DV logo on your display all you really have to do is pay Dolby money and use their tonemapper. Unlike VESA, they don't actually have a display certification system at all.


Oh, I misunderstood the requirements then. I thought it was required for certification as well.

Well, then PC HDR is doomed.

There is also an issue with HDR calibration on PC for some reason. I have no issues on a console connected to the TV, but the same game on the same TV, running on PC, ends up looking all weird.


For games, another reason they don't need dynamic metadata is that they produce their content on the fly: they're already doing tone mapping themselves and can tailor it to the display's characteristics.
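
To illustrate: the engine knows the display's peak brightness (from the console/OS HDR calibration or the EDID) at render time, so it can compress scene luminance to fit the panel directly, and there's nothing left for downstream metadata to fix. The curve and numbers below are just a stand-in for whatever a real engine uses:

    import numpy as np

    def tonemap_to_display(scene_nits: np.ndarray, display_peak_nits: float = 800.0) -> np.ndarray:
        """Map scene-referred luminance onto a known display peak.

        Extended-Reinhard curve as an illustrative stand-in; the point is only
        that the target peak is known at render time, so no per-scene metadata
        is needed downstream.
        """
        l = scene_nits / display_peak_nits
        l_white = 4.0  # scene level that maps exactly to display peak (illustrative)
        mapped = l * (1.0 + l / (l_white ** 2)) / (1.0 + l)
        return np.clip(mapped, 0.0, 1.0) * display_peak_nits

    # A 4000-nit highlight rendered for an 800-nit panel gets rolled off in-engine:
    print(tonemap_to_display(np.array([100.0, 1000.0, 4000.0])))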


HDR (HDR10) is the high dynamic range data. HDR10+ and Dolby Vision are additional metadata on top of that.


Not quite. HDR10 is static metadata on top of BT.2020 PQ data, with PQ being what gives it high dynamic range; HLG is another way to encode high dynamic range data. The Dolby Vision content captured by an iPhone is actually HLG, for example, not PQ. Other Dolby Vision profiles, like profile 5, are similarly not BT.2020 PQ but instead IPTPQc2.

So HDR10+ is dynamic metadata, on top of HDR10's static metadata, on top of the BT.2020 PQ data that makes it "HDR" in the first place. That's easy.

Dolby Vision is then a profile clusterfuck. Sometimes it's just dynamic metadata on top of HDR10. Sometimes it's dynamic metadata on top of HLG. Sometimes it is its own thing entirely.
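
In case it helps untangle the layers: PQ (SMPTE ST 2084) is just the transfer function that maps absolute luminance up to 10,000 nits into the signal range; HDR10, HDR10+ and the various DV profiles are colour spaces and metadata layered on top of a curve like this. A sketch of the PQ inverse EOTF using the standard's published constants:

    # PQ (SMPTE ST 2084) inverse EOTF: absolute luminance -> encoded signal value.
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875

    def pq_encode(nits: float) -> float:
        """Encode absolute luminance (0..10000 nits) into a 0..1 PQ signal."""
        y = max(0.0, min(nits / 10000.0, 1.0))
        p = y ** m1
        return ((c1 + c2 * p) / (1.0 + c3 * p)) ** m2

    # SDR reference white (100 nits) lands around 0.51 of the signal range,
    # leaving the rest of the code values for highlights up to 10,000 nits.
    print(pq_encode(100), pq_encode(1000), pq_encode(10000))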


If I remember correctly, broadcasting is on non-Dolby standards as well. The UK uses HLG, right?


Broadcast TV is HLG because it's backwards compatible with non-HDR TVs. And yes, it's used by the UK (the BBC is even the one that came up with HLG).
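
The backwards compatibility comes from the curve's shape: the lower half of the HLG OETF is a plain square-root (gamma-like) segment, so an SDR set applying its usual gamma still shows a reasonable picture, and only the highlights get the logarithmic treatment. A sketch using the BT.2100 constants:

    import math

    # HLG OETF (ITU-R BT.2100): scene-linear light (0..1) -> signal value (0..1).
    a = 0.17883277
    b = 1 - 4 * a                  # 0.28466892
    c = 0.5 - a * math.log(4 * a)  # 0.55991073

    def hlg_oetf(e: float) -> float:
        if e <= 1 / 12:
            return math.sqrt(3 * e)          # gamma-like segment an SDR display copes with
        return a * math.log(12 * e - b) + c  # log segment for highlights

    # Signal hits 0.5 at 1/12 of peak scene light and 1.0 at full scene light:
    print(hlg_oetf(1 / 12), hlg_oetf(0.5), hlg_oetf(1.0))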



