Hacker News

Unless AI becomes ~1,000,000x more energy efficient, my money is on (a).

The amount of energy required for AI to be dangerous to its creators is so vast that I can't see how it can realistically happen.



That depends on how it's used. See the Terminator movies. One false positive is enough to end the world, even with current AI tech, if it's merely mated to a nuclear arsenal (even a small one might trigger global escalation). There have been false positives before, and the only reason they didn't end in nuclear Armageddon is that the human operators hesitated and defied standard protocol, which otherwise would probably have led to the end of the world as we know it.


We know that human-level intelligence can run with relatively little energy.

Without discussing timelines, it seems obvious that human energy usage should be an upper bound on the best possible energy efficiency of intelligence.
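A quick back-of-envelope calculation illustrates why that bound is so striking. All the numbers below are rough, commonly cited ballpark figures (a ~20 W brain, a ~700 W datacenter GPU, a ~10,000-GPU cluster), not measurements from this thread:

```python
# Back-of-envelope comparison of a large GPU cluster's power draw
# against the human brain. All figures are order-of-magnitude
# assumptions, not measured values.

BRAIN_POWER_W = 20        # human brain: commonly cited ~20 W
GPU_POWER_W = 700         # one high-end datacenter GPU, ballpark
CLUSTER_GPUS = 10_000     # a large training cluster, ballpark

cluster_power_w = GPU_POWER_W * CLUSTER_GPUS   # total cluster draw in watts
ratio = cluster_power_w / BRAIN_POWER_W        # cluster power per brain-equivalent

print(f"cluster: ~{cluster_power_w / 1e6:.1f} MW, "
      f"~{ratio:,.0f}x one brain's ~{BRAIN_POWER_W} W")
```

Under these assumptions the gap is in the hundreds of thousands, which is roughly the "1,000,000x" order of magnitude mentioned upthread, and it only counts power draw, not what each system can actually do with it.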


If we manage to harness the ego energy transpiring from some people working on "AI" we should be halfway there!



