
I forget where I heard it, but one of the issues brought up was that EA never really formalized the 'value' of a life relative to the 'value' of other stuff.

Like, there exists some value to the entirety of the Amazon that is higher than that of a human life. Otherwise the logic leads somewhere terrible: you should devastate the Amazon just to build slum housing and take away birth control. I'm not arguing for any of this, just stating the premises.

I think we can all agree that the 'value' of the whole Amazon is too high to bulldoze it for slum housing.

So the problem is where you put these fuzzy lines. You've got extremes that 99.9% of people agree on, but where is the middle? Where do you draw the line?

From the VERY little I've read of the EA debates, there seems to be no real work on this? If someone else could synthesize this as a reply, I'd be quite grateful.



At some point you've just got to decide for yourself. There's a lot of focus on human lives, and especially on QALYs (quality-adjusted life years), because they can be treated as interchangeable, but that hasn't stopped some EAs from focusing on animal welfare and other non-human causes. There's no objective way to value rainforest or animal suffering in terms of QALYs; all you can do is run thought experiments and read studies on the effectiveness of charities focused on each cause, so you can decide which is more effective given your ethical framework.
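
To make that concrete, here's a rough sketch of the QALYs-per-dollar comparison EAs tend to run, and why causes without an agreed QALY conversion fall off the scale. All figures are hypothetical placeholders, not real charity data:

    # Illustrative sketch only; cost figures are made up, not real charity data.
    charities = {
        "bednet_distribution": 100,    # hypothetical cost (USD) per QALY
        "deworming_program": 60,       # hypothetical cost (USD) per QALY
        "rainforest_protection": None, # no agreed-upon QALY conversion exists
    }

    budget_usd = 10_000
    for name, cost_per_qaly in charities.items():
        if cost_per_qaly is None:
            print(f"{name}: can't be ranked on this scale (no QALY conversion)")
        else:
            print(f"{name}: ~{budget_usd / cost_per_qaly:.0f} QALYs per ${budget_usd:,}")

The point being: once something (rainforest, animal suffering) has no defensible conversion into QALYs, the ranking machinery simply can't compare it to the human-focused interventions, and you're back to your own ethical framework.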



