It's crazy to me that we exist in a world where we can render hyper-realistic 3D scenes and play games that once felt impossible to run on consumer-grade hardware, AND in a world where we are still trying to perfect putting text on a screen in a terminal LOL :)
Isn't some of it that we're optimizing more for graphics, and there are tradeoffs between the two (so getting better at graphics tends to make text rendering worse)? That's partially offset by terminals using GPU acceleration, but you're still paying for that pipeline.
How much of this comes down to the fact that a) it didn't matter too much previously, i.e. it mostly works, and b) until recently there was a lot of network latency to solve for in many terminal use cases?