
In some ways though, visual fidelity has improved only _marginally_ per year since the PS4/Xbone era. My GPUs have had much, much longer useful lives than in the 90s/early 2000s.




AMD just tried to get away with stopping support for cards that were still being sold new in stores. Nvidia cards are just getting worse and more expensive over time (https://www.xda-developers.com/shrinkflation-is-making-nvidi...).

Part of what made PC gaming in the late 90s/early 2000s so exciting was that the improvements were real and substantial. Today we're stuck with minor improvements, including bullshit like inserting fake frames generated by AI. Cards back then were also usually easy to get at a normal price: you might have had to occasionally beat your neighbors to a Best Buy, but you didn't have to compete with armies of bot scalpers.


Exactly, plus upscalers are pretty amazing. Upscaling from 1080p to 4K gets you 80-100% of the quality of native rendering at a far lower cost (rough pixel math below).

Now if only major studios would budget for optimization...
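
A back-of-the-envelope sketch of why the internal render cost drops so much: shading work scales roughly with pixel count, and native 4K is 4x the pixels of 1080p. This ignores the upscaler's own overhead (the DLSS/FSR pass itself), so treat the numbers as illustrative only:

    # Rough pixel math: native 4K vs. rendering internally at 1080p and upscaling.
    # Ignores the upscaler's own per-frame cost, so the real speedup is below 4x.
    native_4k = 3840 * 2160        # 8,294,400 pixels shaded per frame
    internal_1080p = 1920 * 1080   # 2,073,600 pixels shaded per frame
    print(f"4K pixels:    {native_4k:,}")
    print(f"1080p pixels: {internal_1080p:,}")
    print(f"Shading work ratio: {native_4k / internal_1080p:.1f}x")  # ~4.0x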



