Hacker News

I found it oddly symbolic that she is incapacitated by that ‘injury’ despite explicitly having no capacity to talk. It also strikes me as very significant that she seems to act out of empathy and compassion for Ava's plight but the latter, in turn, shows absolutely no empathy for Caleb when she dooms him to the same fate she herself sought to escape (confinement tinged with the implication of death).


That last part is truly fascinating. It seems to me that in order for robots to truly act like humans they need equal parts (a) self-awareness and (b) empathy. If you neglect to make the second part as sophisticated as the first, you will inevitably end up with the situation in the movie.


> Seems to me that in order for robots to truly act like humans they need to have equal parts of (a) self awareness and (b) empathy.

They need much more than that. Basically a whole set of moral values[1] that closely align with those of humans.

A bit of a simplified example: If you were to create an AI that maximizes human happiness as a primary goal you might end up with some sort of lotus eater machine with millions of humans in vats, experiencing their happiest moments in an endless loop, supported by drugs that enhance their endorphin release.

Even if an AI were capable of analyzing your emotional state and following your reasoning, that doesn't mean it has to agree with it. Empathy does not prevent you from saying "I understand your pain, but I have more important things to do".

[1] http://wiki.lesswrong.com/wiki/Terminal_value


Note that insofar as lack of empathy goes, Ava might be said to have the same lack of concern for the wellbeing of others as does Nathan, who like her manipulated Caleb for personal aims and like her seemed to be unconcerned about caging other sentient creatures. So it's all very entwined, at least in terms of storyline. Whether it's intentional or not I cannot really say.


It's fascinating, though, that the kind of empathy we need in superhuman robots is one we ourselves lack. Few of us can empathize with species we regard as inferior. If they are fundamentally different (as in "having an exoskeleton") the empathy becomes very rare.


>Few of us can empathize with species we regard as inferior.

The large numbers of people who support animal rights would beg to differ. Plus, there's the whole category of domesticated animals many, many people arrange their lives around.

Perhaps the greatest superintelligence risk to humanity is not something that resembles a human but rather a supercute Cockapoo.


Cuteness is a huge factor. Few people care about anglerfish, while panda babies in a zoo are big news. As I understand it, the cuteness response is based on neotenic features, i.e. identifying and protecting babies.

So maybe some people just wish to protect animals that happen to match patterns that we associate with our own species and not necessarily because they value an intact ecosystem.


Animal rights supporters are "few of us".


I have a lot of empathy for my bees.


Kyoko generally seemed to have childlike levels of self-awareness (she can't not dance). Stabbing the man she doesn't like makes sense, but locking somebody up requires greater reflection (the ability to predict their actions, knowing Caleb would likely report the escape). It wasn't clear to me how much the two worked together versus merely working in parallel.


Why did Kyoko strip off her skin to reveal herself to be an android then?


So that Caleb would have a reason to cut his own skin. That seemed motivated more by "the next scene needs it" than by the characters' internal logic.


Also, why did Kyoko spill the wine on Caleb, and why did Nathan seem to expect it? Was it Nathan's idea to have her do that so that he could demonstrate his callousness to Caleb?



