Most people vastly overstate the effect that CRT displays had on the appearance of legacy software.
Yes, very early on, when people used TVs or cheap composite monitors as the display devices for their computers, there were blurry pixel edges, bloom effects, dot crawl, color artifacting, and all the rest.
But by the '90s, we had high-quality monitors designed for high-resolution graphics, with fast refresh rates, crisp pixel boundaries, and minimal artifacting. CRT filters overcompensate for this, and end up anachronistically making SVGA-era graphics look like they're being displayed on composite monitors.
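(A rough sketch of what I mean, assuming numpy -- this is the kind of effect stack a typical CRT filter applies, blur plus scanlines; the function name and parameters are mine, and the blur radius is the knob that filters tend to crank to composite-TV levels even for SVGA content:)

```python
import numpy as np

def crt_filter(frame, blur_radius=1.5, scanline_strength=0.3):
    """Crude CRT-style filter: horizontal Gaussian blur + scanlines.

    frame: 2-D float array (grayscale, 0..1) for simplicity.
    blur_radius: std-dev of the horizontal blur in output pixels.
        ~0.3 roughly approximates a sharp '90s SVGA monitor; values
        above 1 look more like a composite TV and over-blur SVGA-era
        content. Illustrative values, not measurements.
    """
    # Build a 1-D Gaussian kernel to model horizontal beam spread.
    half = max(1, int(3 * blur_radius))
    x = np.arange(-half, half + 1)
    kernel = np.exp(-x**2 / (2 * blur_radius**2))
    kernel /= kernel.sum()

    # Convolve each row (horizontal blur only: the beam scans rows).
    blurred = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, frame)

    # Darken every other line to fake the scanline gaps.
    blurred[1::2] *= 1.0 - scanline_strength
    return blurred

frame = np.tile(np.array([1.0, 0.0] * 4), (8, 1))  # alternating columns
print(crt_filter(frame, blur_radius=0.3)[0])  # near-sharp, "SVGA monitor"
print(crt_filter(frame, blur_radius=1.5)[0])  # heavy blur, "composite TV"
```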
CRT monitors did not have "crisp pixel boundaries". A CRT pixel is a Gaussian-blurred dot, not a "crisp" square as it is on modern displays. What "high-quality" CRT monitors did have was higher resolutions, even as high as 1600x1200, where individual pixels are basically not distinguishable.
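(To make the distinction concrete, here's a little numpy sketch of the two intensity profiles: a modern square pixel as a box function versus a CRT pixel as a Gaussian spot. The spot width relative to the pixel pitch is an illustrative guess, not a measured value:)

```python
import numpy as np

SAMPLES_PER_PIXEL = 50     # spatial oversampling for the profiles

def lcd_profile(values):
    """Crisp square pixels: constant intensity across each pixel cell."""
    return np.repeat(values, SAMPLES_PER_PIXEL)

def crt_profile(values, spot_sigma=0.45):
    """Gaussian spots centered on each pixel; neighboring spots overlap.

    spot_sigma is the beam spot's std-dev in pixel units -- an
    illustrative guess; real spot sizes varied with tube and focus.
    """
    n = len(values)
    x = np.linspace(0.0, n, n * SAMPLES_PER_PIXEL)
    out = np.zeros_like(x)
    for i, v in enumerate(values):
        out += v * np.exp(-(x - (i + 0.5))**2 / (2 * spot_sigma**2))
    return out

# Alternating white/black pixels: the square-pixel profile is a square
# wave; the CRT profile is a smooth ripple with no hard edges, and the
# "black" cells pick up bleed from their lit neighbors.
vals = np.array([1.0, 0.0, 1.0, 0.0, 1.0])
lcd, crt = lcd_profile(vals), crt_profile(vals)
print("square pixel, inside a black cell:", lcd[60], lcd[90])
print("CRT spot, same positions:        ", crt[60], crt[90])
```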
By the early '90s, high-quality CRT displays had low dot pitches or very precise aperture grilles in addition to supporting a wider range of refresh rates, and image clarity was a major selling point.
People were typically using 640x480 or 800x600 in GUI environments, and most DOS games ran at 320x200. 1600x1200 was incredibly uncommon even where the video hardware and monitors supported it -- people were usually on 14" or 15" 4:3 displays, where that resolution was far too fine to be usable, and the necessarily lower refresh rates made flicker unbearable.
At the common resolutions and with purpose-built CRT monitors, pixel boundaries were quite clear and distinguishable.
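(The geometry is easy to check. A back-of-the-envelope sketch, assuming a 0.28mm dot pitch -- a common spec for mid-'90s 14" monitors -- and roughly a 13" viewable diagonal; the numbers are illustrative, not from any particular monitor:)

```python
# Rough check: logical pixel pitch vs. shadow-mask dot pitch.
VIEWABLE_DIAG_IN = 13.0    # ~13" viewable on a "14-inch" monitor
DOT_PITCH_MM = 0.28        # common mid-'90s spec
MM_PER_INCH = 25.4

# 4:3 aspect: width = diagonal * 4/5.
width_mm = VIEWABLE_DIAG_IN * 0.8 * MM_PER_INCH   # ~264 mm

for h_res in (640, 800, 1024, 1600):
    pixel_pitch_mm = width_mm / h_res
    # Roughly: a pixel wider than the dot pitch covers at least one
    # full phosphor triad, so the tube can resolve it as distinct.
    ok = pixel_pitch_mm > DOT_PITCH_MM
    print(f"{h_res:>5} px wide: {pixel_pitch_mm:.3f} mm/pixel "
          f"-> {'resolvable' if ok else 'below dot pitch'}")
```

On those assumptions, 640 and 800 pixels across come out well above the dot pitch, 1024 falls just below it, and 1600 is far below -- which matches what people actually saw.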
> At the common resolutions and with purpose-built CRT monitors, pixel boundaries were quite clear and distinguishable.
Being able to clearly resolve individual pixels (which I agree was the case at resolutions like 640x480 or 800x600, though 1024x768 is already pushing it) is not the same as seeing "crisp" boundaries between them. The latter is what I was objecting to. 320x200 (sometimes also 320x240 or the like) is a special case, since it was pixel-doubled on more modern VGA/SVGA display hardware; that's the one case where a single pixel was genuinely seen as a small square with rather crisp boundaries, as opposed to a blurry dot.