Hacker News

Because we do that with mosquito nets.

EA seems like a way to achieve nothing while looking like you're doing everything. No one expects you to fly to Mars tomorrow. And that's true every single day. It's true today. It'll be true tomorrow. It was true yesterday. It was true 10 years ago. It will be true 10 years from now.

So if no one really expects you to fully achieve your goal, all you have to do is kinda look like you're trying and that will be good enough for most people.

EA takes a good, hard look at all these good intentions and says, "Fuck, this would make a baller ass road".

However, if we solve malaria, that's another thing not killing us. Another problem checked off. Like polio. Or smallpox. Colonize Mars? Fucking how? We can't even get the environment on Earth under control. How the living fuck are we going to create an environment on another fucking planet, much less even get there?

So how about we figure out a way to get the garbage out of the ocean. Or how to scrub the air of CO2. How to manufacture and produce without so many polluting side effects. We keep doing all these smaller things. Put in the work, and one day we will save all those trillions of potential lives. But it requires putting in the work.

Edit: Not saying you believe it. But presenting the counter-argument to EA.



Where does this view of EA come from? Hatchet jobs written by TIMES and its ilk? Twitter personalities like Elon Musk who have so much social gravity that people perceive them as the spokesmen of anything they mention that you hadn't heard of before?

Google "effective altruism" and the first two results are EA/Giving What We Can and GiveWell. Both of these organizations are meta-charities that help forward money or encourage the forwarding of money to other charities, but most of all... Mosquito nets! The first charitable fund mentioned by EA/Giving What We Can is GiveWell's, and the top recipient of that fund is the Malaria Consortium.

I heartily encourage you to read about GiveWell. It's still the heart of EA from the perspective of the less-vocal majority of self-described EAs.


I think “where does this view come from?!” outrage comes off as disingenuous. I think we both know that over the past couple of years the most prominent public “face” of the EA community has been William MacAskill, who went on a major donor-funded press tour to promote his ideas on longtermism through his book “What We Owe the Future.” For most of the general public, this was probably their first encounter with the entire concept of EA.

It is perfectly fine if you don’t support MacAskill’s vision for EA’s future. I would love to hear a critique of this schism from someone within the EA community! But when you imply that critics are getting their (accurate) impression of EA from “newspaper hatchet jobs”, it feels like you’re either unaware of the way some prominent EAs are presenting the movement, or else you’re not arguing in good faith.


So you feel as though William MacAskill has been the "public face" of EA for the past couple years. That's possible, though it would make a little more sense if you had said one quarter of that time period, since his book was released in August.

I'd normally not want to get into personal accusations, but since you've already started your reply with one sentence ending in "disingenuous" and the next starting with "I think we both know" (which is infuriating), and to round it all out ended your comment with "...or else you're not arguing in good faith", I'll say it: I think you're projecting your personal Internet experience on others, and I think your personal Internet experience does not reflect that of the median person. MacAskill is not the face of EA. I think if you look at search data, you'll find that Peter Singer's popularity merely went from being ~100x MacAskill's to more like 10x during the book tour.

EA predates the notions in What We Owe the Future by many years. Present-focused charities like GiveWell were perhaps overshadowed in popularity by that book for a news cycle or two in late 2022. It happens. But the notion that that book or its author have been in any way the "most prominent" aspect of EA for the last couple of years is completely false. It's projection. In your mind, that's all EA is lately so it must be all it has ever been (hence the exaggerated timeline) and everybody else is just like you.


Here is what you said to the other poster: “Where does this view of EA come from? Hatchet jobs written by TIMES and its ilk? Twitter personalities like Elon Musk who have so much social gravity that people perceive them as the spokesmen of anything they mention that you hadn't heard of before?”

Having re-read this, it just strikes me as extremely disingenuous and uncharitable (not to mention aggressive), particularly since you seem to know that there has been a huge amount of press around EA recently due to the MacAskill longtermism book, not to mention all the press around SBF and his longtermist fund.


No! Gah! You're doing it again! I'm not being aggressive, I'm being harassed by somebody who keeps telling me what I "know"! Why do you insist on talking to me like this instead of just taking me at my word?

> ...you seem to know that there has been a huge amount of press around EA recently due to the MacAskill longtermism book...

I do not know this! I think I saw one article about it posted here on HN. I also might have read a post on somebody's substack about it a few months ago. I am not aware of any "huge amount of press", certainly not "recently". I looked into the search stats on it because of your comment. I didn't even remember the name MacAskill or much about his book before you brought it up. EA to me is still basically just malaria nets and other present-focused causes, and I can only assume that's what it is to most of the many, many people who have read Singer and not MacAskill and who donate to GiveWell year after year.


> Where does this view of EA come from?

I'll answer. I think the view comes from a different, but related, group known as "the rationalists".

"The rationalists", or less wrongers, fit the bill to a Tee, of all these common criticisms of EA that people are bringing up.

And the reason this criticism may be misattributed to EA specifically is that there is a large overlap between the rationalists and EA.

The rationalists are the ones talking about AI existential risk, and colonizing Mars, and all that nonsense.


Although the thought leaders are focused on longtermism, most EA money still flows through GiveWell, mostly to global health initiatives.



