It's always occurred to me that you could use evolving version data from an aggregator like Shodan to build a picture of how up to date people keep their software. That way, when a new vulnerability hits, you have a prioritised list of IPs that haven't updated in a timely manner in the past, rather than wasting cycles trying to exploit auto-updating hosts.
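A minimal sketch of that prioritisation idea, assuming you already have periodic banner/version snapshots per host (the version strings, release dates, and IPs below are all made up for illustration):

```python
from datetime import date

# Hypothetical release dates for versions of some service (made-up data).
RELEASES = {"2.0": date(2023, 1, 1), "2.1": date(2023, 6, 1), "2.2": date(2024, 1, 1)}

# Per-host scan observations: (scan_date, version_seen), e.g. built up from
# periodic Shodan/banner snapshots. This layout is an assumption, not any
# particular aggregator's schema.
observations = {
    "192.0.2.10": [(date(2023, 2, 1), "2.0"), (date(2024, 6, 1), "2.0")],  # never updated
    "192.0.2.20": [(date(2023, 2, 1), "2.0"), (date(2023, 6, 5), "2.1")],  # fast updater
}

def staleness_days(history):
    """Worst observed lag (days) between a version's release and still seeing it."""
    return max((seen - RELEASES[ver]).days for seen, ver in history)

# Rank hosts most-stale first: the top of this list is where a fresh exploit
# is most likely to still land months from now.
ranked = sorted(observations, key=lambda ip: staleness_days(observations[ip]), reverse=True)
```

The point is just that two or three historical snapshots per host are enough to separate "patches within days" from "hasn't touched it in a year".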
The cost of any additional untargeted attack attempt is essentially zero in most cases. It doesn't matter whether you are trying your exploit on 100 hosts or 1 million. An attacker willing to spray exploits across the internet has basically zero incentive to only use those exploits on hosts they know to be running a specific version, and every incentive to just try it out on all hosts running the software that they can possibly identify.
I suppose that's true. It's hard to think in terms of an attacker essentially having unlimited resources, but of course all the resources they're using are already hacked/stolen.
We don't have to consider anything near unlimited resources here - you can do a masscan of the internet on commodity hardware in an hour, or you can use a Shodan sub (they've sold lifetime basic subscriptions before for $5). Actually running the exploit against every target probably takes under an hour with a couple of cheap droplets. The only thing that requires any real effort is setting up reliable C&C infra.