
Yes, again, I'm talking about PCs here, where it's usually implemented in shaders.


No it isn't. "NVDEC" is an actual ASIC block in the GPU silicon. It's not "shaders". Same with AMD's VCN. And Intel's QuickSync.

If it were just shaders, then there'd be basically no concerns with driver quality or hardware support, just like there aren't with CPU decoders.
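One rough way to see this distinction on a Linux box (a sketch, assuming ffmpeg and libva-utils are installed; the exact profile strings vary by driver, and `input.vp9.webm` is a hypothetical file name):

```shell
# List the hardware decode backends this ffmpeg build knows about
# (e.g. cuda/nvdec for NVDEC, vaapi, qsv for QuickSync)
ffmpeg -hide_banner -hwaccels

# On Intel/AMD, vainfo lists the codec profiles the driver exposes
# via VA-API; a fixed-function VP9 decoder shows up as something like
#   VAProfileVP9Profile0 : VAEntrypointVLD
vainfo

# Force VA-API decode; if the driver has no VP9 entrypoint this
# errors out rather than silently falling back to the CPU
ffmpeg -hwaccel vaapi -i input.vp9.webm -f null -
```

If `vainfo` lists no VP9 entrypoint, there is simply no decoder to use on that driver/hardware combination, which is exactly the support concern that a pure shader or CPU implementation wouldn't have.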


To be exact, it depends on the hardware generation. At least for Intel and AMD, the first versions tended to rely on shaders, and later generations switched to ASICs. Intel actually open-sourced the shaders that they used.


So was I. Which phone can even achieve a 20 W power draw...

The only hybrid VP9 decoders were AMD's, which only supported Windows and which they stopped shipping years ago (any current/Linux AMD drivers that support VP9 decoding only do so via an ASIC), and Intel's, which was only supported on three generations of GPUs (Gen7.5, Gen8, and Gen9) and was obsoleted by an ASIC in Gen9.5.


But the email is about ARM SoCs with dedicated VPU IP blocks.



