SGX was used for video DRM on Intel platforms. Since SGX no longer exists in modern Intel consumer processors, it's not really doable anymore. Netflix DRM and the like are probably handled on the GPU rather than the CPU (but I could be wrong).
It's actually impossible to have a "legal"/commercial 4K Blu-ray setup on modern PCs/CPUs today, as they will only license it to players that can use SGX, and as noted SGX no longer exists. (Of course this doesn't prevent one from using VLC / libaacs and the like.)
Any idea why Xeons would still contain this feature? Is it for backwards compatibility for their corporate customers, or are there reasons someone would still use it in modern applications in 2024?
My guess is that it's not used by the Arc dGPUs, which have their own equivalent for it? But I guess it makes sense to use it for iGPUs.
With that said, it seems sketchy to send untrusted data to the ME, which is essentially an independent computer running its own OS with the ability to keep persistent state. Seems like a security failure waiting to happen.