"A terrestrial (over-the-air) transmission carries 19.39 megabits of data per second (a fluctuating bandwidth of about 18.3 Mbit/s left after overhead such as error correction, program guide, closed captioning, etc.),"
Circa 2009, when analog TV was first shut off in the US, each DTV station usually carried only one channel, or perhaps a second basic one such as a static on-screen weather radar. Some did have 3 or 4 sub-channels early on, but it was uncommon.
Circa 2019, after the FCC "repack" / "incentive auction" (to free up TV channels for cellular LTE use), it became very common for each RF channel to carry 4+ sub-channels. But to be fair, many broadcasters did purchase new, improved MPEG-2 encoders at that time, which perform better at lower bit-rates, so quality didn't degrade by much.
You could technically fit about 10 SD channels together in the US, except I recall that the FCC specifically REQUIRED (at least during the initial rollout) at least one of them to be in high-def, unless you applied for a special waiver.
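A quick back-of-the-envelope sketch of that bit budget (the 19.39 and ~18.3 Mbit/s figures are from the quote above; the per-channel SD/HD rates are my own rough assumptions, since real muxes use variable bit-rates):

    # Rough ATSC 1.0 multiplex bit-budget sketch. Only the 19.39 Mbit/s total
    # and ~18.3 Mbit/s usable payload come from the quoted text; the per-channel
    # MPEG-2 rates below are illustrative assumptions.

    TOTAL_MBPS = 19.39     # ATSC 1.0 transport stream rate
    USABLE_MBPS = 18.3     # roughly what's left after PSIP, captions, etc.

    SD_MPEG2_MBPS = 1.8    # assumed "watchable" SD MPEG-2 rate
    HD_MPEG2_MBPS = 10.0   # assumed 720p/1080i MPEG-2 rate

    # All-SD multiplex: how many sub-channels fit?
    print(int(USABLE_MBPS // SD_MPEG2_MBPS))          # -> 10

    # Typical post-repack layout: one HD main channel plus SD sub-channels.
    leftover = USABLE_MBPS - HD_MPEG2_MBPS
    print(int(leftover // SD_MPEG2_MBPS))             # -> 4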
These days the FCC is unlikely to care, but I suspect viewership would suffer if any of the major broadcast networks downgraded their "main channel" signal to SD.
This explains the typical arrangement and lists some of the extreme cases:
That I find super hard to believe!