I think some boxes advertise that, but how much does it help? On an analog TV you could switch channels at a rate of, say, 3-4 per second while still registering the channel logos and stopping on the right one (by immediately going one back). One extra receiver in either direction isn't going to keep up the charade. Some cleverness might help, like decoding adjacent channels, and, if the user switches a channel down, having the "up" decoder jump to the second channel down in anticipation of the user going another channel down, but still. You can't feasibly emulate the channel agility (across ~800 channels on satellite) of an analog RF receiver in a digital system whose modulation has a self-synchronisation interval of up to a few seconds.
> you could send keyframes more often, or you could even use a totally different compression scheme that didn't use keyframes.
GOP length / I-frame interval directly relates to bitrate. Longer GOPs generally result in a lower bitrate at similar quality; in DVDs or Blu-rays I believe the GOPs can be quite long (10+ seconds) to achieve optimum quality for the given storage space.
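The usability cost of long GOPs is easy to put numbers on. A minimal back-of-envelope sketch (my own illustration, not from any broadcast spec; the decode overhead figure is a made-up assumption): if a viewer tunes in at a uniformly random point in the stream, the decoder waits on average half a GOP for the next I-frame before it can show a clean picture.

```python
def switch_delay(gop_s: float, decode_s: float = 0.2) -> dict:
    """Time until a clean picture appears after a channel switch.

    gop_s:    I-frame (GOP) interval in seconds
    decode_s: assumed fixed demux/decode overhead (hypothetical figure)
    """
    return {
        "average": gop_s / 2 + decode_s,  # mean wait to the next I-frame
        "worst": gop_s + decode_s,        # tuned in just after an I-frame
    }

# Short broadcast-style GOPs vs. a long DVD/Blu-ray-style GOP
for gop in (0.5, 2.0, 10.0):
    d = switch_delay(gop)
    print(f"GOP {gop:>4}s -> avg {d['average']:.2f}s, worst {d['worst']:.2f}s")
```

This is why a 10+ second GOP is fine on a disc you seek within (the player can decode from the preceding I-frame) but would be unbearable for live channel zapping.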
Non-predictive video codecs are usually either pretty poor quality at a reasonable bitrate (like a bunch of weird 90s / early 00s Internet video codecs), or near-lossless quality but poor compression (because they're meant as intermediate codecs).
There are technical reasons for this, I guess (as you pointed out). But what GP is describing is really a ubiquitous phenomenon of not even trying to do anything about it.
Users want to change channels quickly? Okay, let's decompose the problem into what they really want to do, then make N "preview of i-j" channels in 720p, with channel pictures in e.g. an 8x5 grid and numbers shown, so that a user could just dial them on a remote. After finding an interesting preview, it's not an issue to push a few buttons and wait a couple of seconds to get 2160p or whatever quality there is. Didn't like it? "99<n>", "ok", repeat. This solution would be the jQuery of the TV world: dumb, straightforward, non-automated, but it would be a thousand (forty, to be honest) times better than nothing.
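The mosaic scheme above is just modular arithmetic. A hypothetical sketch (the grid size comes from the 8x5 example above; the function name and page numbering are my assumptions, not a real receiver API): given a channel number, find which preview page and which grid cell would show it.

```python
COLS, ROWS = 8, 5
PER_PAGE = COLS * ROWS  # 40 channels per mosaic page, hence "forty times better"

def mosaic_cell(channel: int) -> tuple[int, int, int]:
    """Return (preview_page, row, col) for a 1-based channel number."""
    idx = channel - 1
    page = idx // PER_PAGE                   # which "preview of i-j" channel
    row, col = divmod(idx % PER_PAGE, COLS)  # position within the 8x5 grid
    return page, row, col

print(mosaic_cell(99))  # channel 99 lands on the third mosaic page: (2, 2, 2)
```

The head-end would only need to downscale and tile the streams it is already carrying, so the previews cost a fraction of one full-quality channel's bandwidth per forty channels covered.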
When I had an HD TV set, I just checked the 3-5 channel numbers I remembered and turned the TV off if there was nothing of interest, because you could easily spend half an hour just peeking at channels one by one.
Right but your assumption is that bandwidth efficiency is the only important thing. Why don't they use 10 second intervals if they're more efficient?
My point is that there are many ways you could apply engineering, money, bandwidth, etc to reduce or eliminate the problem but they don't because they see it as good enough to not lose customers.
Honestly it seems pretty obvious to me that the I-frame intervals that are typically used are the outcome of a balance between usability (having to wait until a clear picture can be viewed) and bandwidth as well as processing requirements.
Bandwidth matters because spectrum is a finite commons. Digital TV already required giving up bands that were in use for other things (e.g. wireless microphone systems and some ham bands); using even more bandwidth would have required even more spectrum. Other people have stakes in the spectrum for very good reasons, often much more important ones than "I wanna watch TV". Even satellite TV, which uses frequencies high enough that there is lots of bandwidth available, is limited in practice, because using more of it would require new LNBs and multiswitches for all customers.
> My point is that there are many ways you could apply engineering, money, bandwidth, etc to reduce or eliminate the problem but they don't because they see it as good enough to not lose customers.
Can you name one for each category you bring up? Especially the "engineering" one would be interesting. Obviously dedicating an even huger chunk of spectrum to "I wanna watch TV" would make things easier, so that's not really interesting. And "let's just give each customer a TVoIP stream that can start immediately" is also a pretty obvious "money is no concern" (and also "we don't care about customer adoption costs") option.
I think we're really saying the same thing but from a slightly different perspective.
We agree that there is a whole series of parameters that can be traded off against each other:
1. Bandwidth required
2. Number of channels
3. CODEC complexity/engineering effort/cost for encoders and decoders
4. I-frame interval
5. Number of tuners/decoders in receiving equipment
6. Resolution
7. Frame rate
8. Video quality
So the standardization people decide how to set those, and basically channel switch time gets set to the maximum value that doesn't cause people to cancel their service.
If you want to see something pretty astonishing that can happen if you set the tradeoffs differently, check this out: https://puffer.stanford.edu/player/