I think I would prefer it if progress in hardware efficiency just stopped when Moore's Law finally runs out. Then hardware wouldn't become obsolete so quickly. That would surely be good for the environment, and for the poor.
I'm having trouble counting the ways this comment is wrong. It seems to fail at every claim:
1) "Hardware becomes obsolete too quickly": It is good that hardware becomes obsolete. I can't for the life of me imagine that the world would be a better place if the state-of-the-art CPU were the 8 MHz Z80 my computing life started with.
2) "Slower CPUs would be good for the environment": How? By needing more iron for a given task? More energy? By disallowing hugely detailed climate simulations? How???
3) "Slow CPUs would be good for the less economically privileged": The cheapest CPU you can buy today is orders of magnitude faster than an 8 MHz Z80. How would the less privileged be better off if CPU evolution had stopped in the early '80s?
I don't think the OP was making the assertions you're attributing to them. I think they're arguing that a flatter improvement curve for computers would let comparable levels of computing power diffuse more widely, reaching the poor. And that if computers didn't become obsolete so quickly, we wouldn't be buying and throwing them away every year.
Early on the computation curve this was true: computers had 20 KB to 64 KB of memory for a couple of decades. Remember the 386-to-486 upgrade adapters in the '90s? We no longer have that upgradability.
I never understood people who complain about computers rapidly and continuously getting better and cheaper. That's an incredibly short-sighted point of view. Your old computer would still run Windows 3.1 the same as it always did if you hadn't traded it up for something better. The creation of better machines didn't make your old machine any worse except through your raised aspirations.

The poor have benefited immensely from Moore's Law and related effects. What used to require a corporate datacenter can now be had in a cheapo android tablet. The environment has benefited similarly through greatly improved computer-aided design and logistical efficiency. Wishing for that process to stop is a highly self-destructive line of thought.
If the problems of e-waste and computational access for the poor are issues you care about then go work for a computer recycle/reuse center. While you're there, cheer on Moore's Law for bringing better and better tech to the center's inbox by making it so cheap that people are happy to give it away for free!
Yeah, in the desktop era you had huge amounts of bloat. Programmers assumed there was one user sitting behind the machine, doing only one thing at a time, which was essentially true.
Performance matters more with mobile and cloud (sorry for the buzzwords) -- in mobile because of battery life, and in cloud because a single machine supports so many users. And because we got stuck at 3 GHz or so, we now actually have to write concurrent and parallel software.
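To make "concurrent and parallel" concrete, here's a minimal Python sketch (Python just as a neutral choice; the workload and the numbers are made up for illustration) of spreading a CPU-bound job across cores instead of waiting for a faster clock:

    # A minimal sketch, not a benchmark: spread a CPU-bound job across
    # cores with the standard library. count_primes and the ranges are
    # hypothetical stand-ins for real work.
    from concurrent.futures import ProcessPoolExecutor

    def is_prime(n):
        if n < 2:
            return False
        i = 2
        while i * i <= n:
            if n % i == 0:
                return False
            i += 1
        return True

    def count_primes(bounds):
        lo, hi = bounds
        return sum(1 for n in range(lo, hi) if is_prime(n))

    if __name__ == "__main__":
        # One chunk per worker; each process runs on its own core.
        chunks = [(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
        with ProcessPoolExecutor() as pool:
            print(sum(pool.map(count_primes, chunks)))

On a four-core box this roughly quarters the wall-clock time for the same total work, which is the whole point once per-core clocks stall.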
Although I suppose what appears to be happening isn't that application programmers are writing more efficient code. Instead, the stack is getting taller, and systems programmers are applying JIT techniques to high-level code -- e.g. V8, PyPy, HHVM, Julia, Swift, and other LLVM users.
So certainly some programmers are writing very efficient code -- there seems to be a resurgence in that. But it seems to be systems programmers and not application programmers. For applications, productivity and high-level abstractions like DSLs seem to matter more than performance (e.g. all these new and bloated JS frameworks).
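As a rough illustration of that split: a hypothetical micro-benchmark like the one below crawls under stock CPython but gets compiled to tight machine code by PyPy's tracing JIT, with zero changes to the source. The application programmer did nothing; the systems programmers did everything.

    # Hypothetical micro-benchmark: a tight numeric loop that a tracing
    # JIT such as PyPy compiles to machine code, while CPython interprets
    # it. Run the same file under both interpreters to see the gap.
    import time

    def dot(xs, ys):
        total = 0.0
        for i in range(len(xs)):
            total += xs[i] * ys[i]
        return total

    xs = [float(i) for i in range(1_000_000)]
    ys = [float(i % 7) for i in range(1_000_000)]

    start = time.perf_counter()
    for _ in range(20):
        dot(xs, ys)
    print(f"{time.perf_counter() - start:.2f}s")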
You have no idea how good that sounds. "Enterprise" programming is soul-destroying, and the current vogue for mile-high node-based JS stacks looks even worse.
I miss the good old days of futzing around in 68k assembler for fun and... well, just fun, really.
You see this somewhat in mobile computers (phones): power and thermal constraints limit performance, so developers limit features compared with their desktop counterparts.
I don't think anything has really changed. iOS 7 ran sluggishly on an iPhone 4, just as Windows Vista did on machines that came out when Windows XP was new.
You'd happily pay for speed-ups that come from reworking algorithms to make them more efficient, but how do you stop someone from adding wait loops in strategic places and then selling you the software again after taking them out? IBM already sells you an increase in processing power for your mainframe by enabling processors that were there the whole time but disabled by configuration. I don't think I like that business model.
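For the skeptical, here's a contrived Python sketch of that wait-loop scheme -- nothing below is from a real product; the flag and the sleep are invented purely to illustrate:

    # Contrived sketch of the objectionable business model: the "slow"
    # build throttles itself on purpose; the paid "upgrade" just flips
    # the flag. No real product is being quoted here.
    import time

    PAID_UPGRADE = False  # hypothetical config switch

    def process_record(record):
        if not PAID_UPGRADE:
            time.sleep(0.01)  # strategic wait loop: artificial throttling
        return record.upper()  # the actual work is trivial

    # "Version 2" ships with PAID_UPGRADE = True and is sold as a speed-up.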
> What about the even poorer people who only receive new hardware when richer people buy upgrades?
Good point. I suppose a better solution for the problem of hardware obsolescence caused by software inefficiency would be for conscientious software developers to deliberately use underpowered computers. I'm ashamed that I bought a new computer with a Core i7-4770 processor and 32 GB of RAM about a year and a half ago.
I'm feeling this pretty strongly as a barely-part-time freelance developer; I can't afford new hardware, and I'm coding on a machine that was at best a mediocre performer seven years ago. Startup time for fairly trivial applications is significant even running Lubuntu, and starting a Clojure app is painful; I just timed Clooj at 1:48.
I find that it affects my decision whether or not to optimize code; I feel slowness sooner than I remember doing on a more powerful machine, and that makes me decide, "That inner loop has to go." It also affects my choice of language; I'm more likely to reach for Pascal or Nim than Python, because the result feels better when I run it.
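A hypothetical example of the sort of inner-loop fix a slow machine nags you into (the function names are invented for illustration):

    # Hypothetical before/after of "that inner loop has to go":
    # hoist invariant work out of the loop instead of redoing it
    # once per element.

    def normalize_slow(values):
        # Recomputes max(values) on every iteration: O(n^2) overall.
        return [v / max(values) for v in values]

    def normalize_fast(values):
        # Compute the invariant once: O(n) overall.
        peak = max(values)
        return [v / peak for v in values]

On a fast machine you might never notice the difference on small inputs; on an old one you feel it immediately.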
Physical hardware has a lifetime of only a few years (motherboard components, heat and moisture). But yeah, if the designs could last longer than a prime-time sitcom, that would be nice.