My experience with HDR has been pretty abysmal on a $500 4K TV. Badly tuned HDR is way worse than no HDR at all.
I have 20/20 vision, and I really can't tell the difference between 1080p and 4K for video games and movies. I will never go below 4K again on a desktop, but 1080p is more than fine for a TV. Higher framerate makes a far bigger difference than higher resolution for video games too.
You can see the difference in 4K when the bitrate is there, but most streaming platforms compress their videos too much for it to be worth it. It's definitely not the jump that 720p to 1080p is, though. I agree with everything else you said.
Whether 4K is worth it depends a lot on the size of the TV versus how far away you sit. For a 65" TV, I don't see much difference between 4K and 1080p from more than ~8 ft away.
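For anyone curious where a figure like ~8 ft comes from, here's a rough back-of-the-envelope sketch. The 1-arcminute acuity figure for 20/20 vision and the little helper function are my own assumptions for illustration, not anything rigorous:

```python
import math

# Rough rule of thumb: 20/20 vision resolves detail down to about 1 arcminute.
# Past the distance where one pixel subtends less than that, extra pixels are
# hard to see.

def max_useful_distance_ft(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Distance (feet) beyond which individual pixels subtend under 1 arcminute."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width in inches
    pixel_pitch_in = width_in / horizontal_px       # size of one pixel
    distance_in = pixel_pitch_in / math.tan(math.radians(1 / 60))
    return distance_in / 12

print(max_useful_distance_ft(65, 1920))  # ~8.4 ft: past this, 1080p pixels blur together
print(max_useful_distance_ft(65, 3840))  # ~4.2 ft: you'd need to sit about this close to exhaust 4K
```

Which lines up with the ~8 ft observation: on a 65" panel from a typical couch distance, 1080p and 4K are close to indistinguishable for normal vision.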
HDR is indeed effectively a marketing gimmick on many cheap TVs. They are getting better, though.
For gaming, a console's internal render resolution might not go over 1080p anyway. If it's a Switch 1, it might even be only 720p. So if that is the main use, you are right: a 4K screen is a waste of money.
I have a 1080p TV and it's fine for watching TV and movies, but I am going to upgrade to a 4K display because I can definitely tell a difference in games, and I do almost all my gaming on my TV.
Yes, but I think you are missing out on the part where it is close to free. I have a nice monitor for photos and other crap, but most of the shit I do is text. I do not need 4k.
If I were going to use a TV as a monitor at a desk, the way my wife does, I would definitely want a 4K monitor. Up close, that is a video wall, with no need for window scaling.
As it is, I use 3x 1080p displays. It's fine for me, and it approximates a larger curved super-wide display (while also being cheap). She does just fine with 1080p, however; she rarely has more than 2-3 windows on screen at a time.
When those are needed, digital signage displays are the answer. They're more costly, unfortunately, but they can be bought used and are rated to run 12, 18, or even 24 hours a day in much less friendly environments than a living room. They increasingly run Android or WebOS, unfortunately, but being aimed at the professional world they lack all those annoyances and the general crap the industry crams into home TVs.
You're missing out on resolution (4K) and picture quality (HDR, contrast ratios, color gamut) improvements by doing this.
Not everyone suffers from FOMO.
I've only seen one movie that was worth the bother and expense of seeing it in 4K (Rear Window).
The rest of the things you mention are mostly for a very small slice of theoretical people with perfect vision in perfectly lit rooms at the perfect height and viewing angle.
Beyond icons on a sticker checklist, they mean nothing to the 99% of people who just want to watch sportsball or eat popcorn while watching Disney films with their kids.
You can put lipstick on a pig, but most people are still watching pigs.
The OP is not asking for a TV to watch TV on; he's asking for a TV to use as a second monitor for his laptop. When it comes to computer interfaces, the difference between 4K and HD is enormous, especially for text.
The footage is analog (on film). It was shot with zero pixels, so 4K pixels along an edge doesn't matter. Side note: "footage" itself is a term derived from film (how many feet of film).
You can scan film into whatever digital resolution you want. You could do an 8K scan if you felt like it. You might run into issues where the resolving power of the film is less than the scan, but 4K is not an unreasonable resolution to pull out of well-lit, studio-shot movie stock.
Plus it's a black & white movie, and B&W film has a higher "resolution" than color too, right? Because you're dealing with silver particles instead of physically larger color grains.
Or something like that. Someone more in the know please check my math.
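A very rough sanity check of the numbers, using ballpark figures (the frame width and resolving power below are assumptions for illustration, not measurements of any particular stock):

```python
# Back-of-the-envelope check of "4K is reasonable for 35mm film".
# Assumptions: a fine-grain B&W negative resolves very roughly 100 line pairs
# per mm under good studio lighting, and a 35mm Academy frame is about 22 mm
# wide. Nyquist needs ~2 pixels per line pair to capture that detail.

frame_width_mm = 22
line_pairs_per_mm = 100
pixels_needed = 2 * line_pairs_per_mm * frame_width_mm

print(pixels_needed)  # ~4400 pixels across, so a 3840-pixel-wide 4K scan is in the right ballpark
```

So the gist above seems right: good B&W stock holds enough detail that a 4K scan isn't wasted, even if an 8K scan would start to out-resolve the grain.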