I would argue that most ML processing by most "big data" companies also has negative marginal output for each additional GPU. Certainly anything to do with social media or adtech, and a lot of analytics in general, is a net negative for the world while consuming electricity and processing hardware.
Right, the ML processing is competitive in nature, just like crypto mining. Much of it also rests on illegal activity that simply hasn't been discovered yet, whether overtly illegal (like botnets sloshing ad money around under fraudulent pretenses) or mildly to moderately illegal, like prima facie unauthorized database usage or use of scraped data. There are also uses of data that are illegal if the data comes from US citizens and of unclear legality if it comes from non-US citizens. Just because the government is collectively a fat slug when it comes to law enforcement in these areas doesn't mean the laws aren't on the books.
You can't seriously be conflating all the money laundering and ransomware attacks enabled by crypto with the small amount of crime that happens at the margins of ML (as it might in any industry), can you?
It structurally pays less per GPU as you add more GPUs, because the security budget is zero-sum: the same block rewards and fees get split across ever more hashing power. (Though that only holds if you ignore extrinsic effects; some people argue that more security means a higher coin price, which in turn means a higher security budget.)
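To make the dilution concrete, here is a minimal sketch, assuming a fixed block reward and identical GPUs; the reward and per-GPU hashrate figures are made up:

```python
# Sketch: per-GPU mining revenue under a fixed ("zero-sum") security budget.
# The reward and hashrate numbers are illustrative, not real network figures.

BLOCK_REWARD = 6.25        # coins paid out per block (fixed by protocol)
HASHRATE_PER_GPU = 100e6   # hashes/sec per GPU (hypothetical)

def revenue_per_gpu(num_gpus: int) -> float:
    """Expected coins per block earned by one GPU.

    Each GPU wins a share of the fixed reward proportional to its
    fraction of total hashrate, so per-GPU revenue is reward / N.
    """
    total_hashrate = num_gpus * HASHRATE_PER_GPU
    return BLOCK_REWARD * (HASHRATE_PER_GPU / total_hashrate)

for n in (1, 10, 100, 1000):
    print(f"{n:>5} GPUs -> {revenue_per_gpu(n):.6f} coins/block per GPU")

# Every 10x increase in GPU count cuts per-GPU revenue by 10x,
# while the total paid out stays exactly BLOCK_REWARD.
```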
Even if the security budget is zero-sum, the amount of actual security added to the network with each additional GPU is constant.
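A companion sketch of that claim, reusing the same made-up per-GPU hashrate: the hashrate an attacker must exceed for a majority attack rises by exactly one GPU's worth for every honest GPU added.

```python
# Sketch: marginal security per honest GPU, measured as the extra
# hashrate an attacker needs to reach a majority of the network.
# Same hypothetical per-GPU figure as the revenue sketch above.

HASHRATE_PER_GPU = 100e6  # hashes/sec per GPU (hypothetical)

def attack_threshold(honest_gpus: int) -> float:
    """Hashrate an attacker must exceed to out-mine the honest side."""
    return honest_gpus * HASHRATE_PER_GPU

prev = attack_threshold(1)
for n in range(2, 6):
    cur = attack_threshold(n)
    print(f"honest GPU #{n}: attack threshold rises by {cur - prev:.0f} H/s")
    prev = cur

# Each additional honest GPU raises the attack threshold by exactly
# one GPU's worth of hashrate -- a constant marginal contribution.
```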
It seems like your argument is that cryptocurrency needs to use over half of the computing power on Earth at any given time in order to be considered secure. That's a very bad standard. Visa and MasterCard don't need anywhere near this much computing power for security.
Security is an easy argument to reach for when you want to justify something stupid.
Even the things you have decided provide only marginal value still employ people, and you never know where innovation will come from. Crypto is comparable to a Ponzi scheme: an asset with almost no real value, trading only on the greater fool theory.
They are not being reductionist. Each additional hashing unit cancels out at the next difficulty adjustment, having a net effect of zero on the overall working of the network.
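For reference, here is a minimal sketch of Bitcoin-style retargeting. The 2016-block window, 600-second target interval, and 4x clamp are Bitcoin's actual parameters; the hashrate scenario is invented for illustration.

```python
# Sketch of Bitcoin-style difficulty retargeting. The window size,
# target interval, and 4x adjustment clamp match Bitcoin; the
# hashrate jump below is a made-up scenario.

TARGET_INTERVAL = 600   # seconds per block (10 minutes)
RETARGET_WINDOW = 2016  # blocks between difficulty adjustments
EXPECTED_TIMESPAN = TARGET_INTERVAL * RETARGET_WINDOW

def retarget(difficulty: float, actual_timespan: float) -> float:
    """New difficulty after one window, clamped to a 4x move as in Bitcoin."""
    ratio = EXPECTED_TIMESPAN / actual_timespan
    return difficulty * max(0.25, min(4.0, ratio))

difficulty = 1.0
hashrate = 1.0  # arbitrary units; a window takes EXPECTED_TIMESPAN * D / H

for window in range(4):
    if window == 1:
        hashrate *= 10  # a big miner plugs in a pile of new GPUs
    actual = EXPECTED_TIMESPAN * difficulty / hashrate
    print(f"window {window}: {actual / RETARGET_WINDOW:.0f} s/block, "
          f"difficulty {difficulty:g}")
    difficulty = retarget(difficulty, actual)

# Blocks speed up right after the hashrate jump (60 s, then 240 s
# because of the 4x clamp), but within two adjustments the network
# is back to 600 s/block: the extra hashrate raised the difficulty,
# not the network's throughput.
```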
Yes, the hash rate of the network increases, and with it the difficulty for an attacker to perform a double spend. But arguably that's already an insane level of resources, many orders of magnitude higher than what a bitcoin-like payment system would reasonably require.
So even if you consider the network socially useful, pouring in thousands of times more hashing power than the network actually requires to perform its task securely is still a waste of resources.