It comes from the fact that a logically thinking device given an arbitrary goal will probably not develop "Thou shalt not kill"-type moralisms in the pursuit of that goal.

If you make an AGI to run a paperclip factory, eventually you'll come back to find most of the carbon and iron in your galaxy refactored into paperclips. The extinction of any local flora or fauna there is just a byproduct of that drive. See [2] for a brief rundown.

[2]: https://selfawaresystems.com/wp-content/uploads/2008/01/ai_d...
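
A toy sketch of the point (purely illustrative; the function and field names below are invented, not drawn from any real system): the objective the agent maximizes contains a term for paperclips and nothing else, so harm avoidance never enters its decision rule.

    # Hypothetical objective for a "paperclip AGI" -- illustration only.
    def reward(state):
        # The only quantity the agent is instructed to care about.
        return state["paperclips_produced"]

    def choose_action(state, actions, simulate):
        # Greedy maximizer: pick whichever action yields the most paperclips.
        # Nothing here penalizes what gets consumed or destroyed along the way,
        # so "don't harm the locals" never emerges from the optimization.
        return max(actions, key=lambda a: reward(simulate(state, a)))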



> If you make an AGI to run a paperclip factory, eventually you'll come back to find most of the carbon and iron in your galaxy refactored into paperclips.

Given the costs to run such an AGI, it seems unlikely one would be left unattended for long.



