I don't think this is the bigger risk, since we can figure out that we've done this and stop, ideally in a way that's good for all of the sentient beings involved.

But it's definitely a possible outcome of creating AGI, and it's one of the reasons I think AGI should absolutely not be pursued.