Instead of being the "guide" its title promises, this mostly just compares AV1 to other codecs and ends by promoting their own thing. I don't think AV1 needs any "convincing"; it's theoretically better in just about every respect. It's the tooling and hardware support that need work.
One of my takeaways after going through it is the crossover point between H.264 and AV1: it depends on the number of expected views. AV1 encoding is computationally more intensive, and the guide puts a dollar figure on that, so the extra encode cost only pays for itself once the bandwidth savings are spread over enough views (rough sketch below).
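A back-of-the-envelope version of that crossover, just as a sketch: every number here (extra encode cost, GB per view, egress price) is a made-up placeholder, not a figure from the guide.

```python
# Rough break-even sketch for H.264 vs. AV1 on a single title.
# All numbers below are placeholder assumptions, not from the guide.

def break_even_views(
    extra_encode_cost,    # extra $ to encode the title in AV1 vs. H.264
    h264_gb_per_view,     # GB delivered per view with H.264
    av1_gb_per_view,      # GB delivered per view with AV1 (smaller)
    egress_cost_per_gb,   # $ per GB of CDN egress
):
    """Views needed before AV1's bandwidth savings cover its extra encode cost."""
    saving_per_view = (h264_gb_per_view - av1_gb_per_view) * egress_cost_per_gb
    return extra_encode_cost / saving_per_view

# Example: $5 extra encode cost, 1.0 GB vs. 0.7 GB per view, $0.02/GB egress
views = break_even_views(5.0, 1.0, 0.7, 0.02)
print(f"AV1 pays off after about {views:.0f} views")  # ~833 views
```

Below that view count, the cheaper H.264 encode wins; above it, AV1's smaller files do.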
Servers don't get hardware accelerators. I'm forced to render everything with llvmpipe and then software-encode it with VP8, and I essentially sit at 80% to 90% CPU utilization 24/7.
Yes, the 1080p@30fps stream takes 2 Mbit/s in the worst case; no, I don't give a damn about making it smaller. That is literally pointless. Yes, in theory I could add eight times the CPU power to make it stream 4K@60fps (roughly 8x the pixel rate of 1080p@30fps) and still end up with a lower bitrate, but I don't care, because the CPU is the bottleneck and that many cores is incredibly expensive.
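For what it's worth, a minimal sketch of that kind of software-only VP8 encode, assuming ffmpeg with libvpx (the comment doesn't say which encoder or capture path is actually used); the input file and rate settings are placeholders.

```python
import subprocess

# Software-only VP8 encode at ~2 Mbit/s, 1080p30.
# Assumes ffmpeg built with libvpx; "input.y4m" is a placeholder for
# wherever the llvmpipe-rendered frames end up.
subprocess.run([
    "ffmpeg",
    "-i", "input.y4m",
    "-c:v", "libvpx",          # VP8 software encoder
    "-b:v", "2M",              # target bitrate
    "-maxrate", "2M",          # cap the worst case
    "-bufsize", "4M",
    "-deadline", "realtime",   # favor encode speed over compression efficiency
    "-cpu-used", "8",          # trade quality for lower CPU usage
    "-threads", "4",
    "output.webm",
], check=True)
```

The realtime deadline and high cpu-used setting are what keep the CPU cost bounded; dropping them would shrink the bitrate but push utilization even higher.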