
Putting aside all the ai-recommends-you-eat-a-few-rocks-a-day evidence...

Letting machines incapable of empathy, and bereft of accountability for themselves or their creators, make decisions to end one or more lives is a monstrous idea on its face.

Additionally, see https://en.wikipedia.org/wiki/Imperial_boomerang



Mechanized warfare has been a thing for generations now. What is the difference between AI and a pressure switch? AI can choose NOT to kill the 13-year-old playing football in the field five years later.


Also a bad idea, and if I understand correctly, landmines left in place after a conflict are already considered a war crime. https://en.wikipedia.org/wiki/Convention_on_Certain_Conventi...


Ukraine stopped the Russian invasion and turned the war into a stalemate primarily through widespread use of landmines. The chance that those will all be removed after the conflict is approximately zero. Many of them were artillery-delivered, so no one even knows exactly where they landed. No one cares about war crimes when their own survival is at stake.


Where does it say the AI weapons make decisions?


The above comment thread is:

> AI weapons seems like a bad bad idea.

>> Why?

---

And let's not be naïve; this is the military we're talking about. It'll happen whether it's Luckey or someone else.



