Imagine one day we get a computer able to perform operations unimaginably faster than what we have now. It would process Snapchat's dog filters in femtoseconds, firing up and tearing down millions of Kubernetes clusters every frame (because it would be easier to write it that way). Would it still make sense to try to optimize software? Wouldn't it be more constructive to solve real-world tasks instead?
Hardware is fast and cheap, and it's getting even faster and cheaper. It's perfectly fine to utilize that power if it makes developing products faster, easier, or cheaper.
Now, there are still cases where you need to send a machine to roam the mountains of another planet. That may justify writing some assembly.
> It would process Snapchat's dog filters in femtoseconds, firing up and tearing down millions of Kubernetes clusters every frame (because it would be easier to write it that way).
No, it wouldn't. Those tasks would simply grow less efficient over time as developers stopped bothering to optimize them, as has happened with the overwhelming majority of consumer software over the past several decades.
They would work well enough on the computers of that generation and painfully slowly on today's supercomputers. Yes, just like the software we have today.
I was trying to express that Electron (and the like) is not an inherently bad thing. It lets you trade hardware capacity for an easier development experience. The developers who use it create useful software that works. And software that works in a given environment is exactly the point of the industry, is it not?