MikeJ

53 karma

Comments: 4
Yes, I’m always unsure of what “bad faith” really means. I often see it cited as a main reason to engage or not engage with an argument, but I don’t know why it should matter to me what a writer or journalist intends deep down. I would hope that “good faith” doesn’t just mean already being aligned on overall goals.

To be more specific, I keep seeing references to hidden context behind Phil Torres’s pieces. For someone who doesn’t have the time to read through many cryptic old threads, this just makes me skeptical that the “bad faith” criticism is useful for deciding whether or not to discount an argument.

Maintaining that healthy level of debate, disagreement, and skepticism is critical, but harder to do when an idea becomes more popular. I believe most of the early "converts" to AI Safety have carefully weighed the arguments and made a decision based on analysis of the evidence. But as AI Safety becomes a larger portion of EA, the idea will begin to spread for other, more "religious" reasons (e.g., social conformity, $'s, institutionalized recruiting/evangelization, leadership authority). 

As an example, I'd point to belief in prediction markets as an EA idea that tends towards the religious. Prediction markets may well be a beneficial innovation, but I personally don't think we have good evidence one way or the other yet. Due to the idea's connection to rationality and EA community leaders, though, it has gained many adherents who probably haven't closely evaluated the supporting data. Again, maybe the idea is correct and this is a good thing. But I think it would be better if EA had fewer of these canonized, insider signals, because they make reevaluation of the ideas difficult.

Are there any amateur EA historians who can help explain how longtermism grew in importance/status? I’d say 80k, for instance, is now much more likely to encourage folks to start a longtermist org than a global health org. There is still a lot of funding moving towards the traditional neartermist causes like malaria and deworming, but not much funding encouraging people to innovate there (or start another AMF).

Ultimately, I’m curious which people or orgs got convinced of longtermism first! It feels much more driven by top-down propagation than by the natural evolution of an EA idea.

This post asked for data, and I’m looking forward to that … but here is another anecdotal perspective.

I was introduced to EA several years ago via The Life You Can Save. I learned a lot about effective, evidence-based giving and “GiveWell-approved” global health orgs. I felt that EA shared the same values as the traditional “do good” community, just even more obsessed with evidence-based, rigorous measurement. I changed my donation strategy accordingly and didn’t pay much more attention to the EA community for a few years.

But in 2020, I checked back in on EA and attended an online conference. I was honestly quite surprised that very little of the conversation was about how to measurably help the world’s poor. Everyone I talked to was now focusing on things like AI Safety and Wild Animal Welfare. Even folks I met for 1:1s whose bios included global health work often told me that they were “now switching to AI, since that is the way to have real impact.” Even more surprising was that the most popular arguments weren’t based on measurable evidence, as GiveWell’s are, but on philosophical arguments and thought experiments. The “weirdness” of the philosophical arguments was a way to signal EA-ness; lack of empirical grounding wasn’t a dealbreaker anymore.

Ultimately, I misjudged what the “core” principles of EA were: rationalism and logic were a bigger deal than empiricism. In my opinion, the old EA was defined by citing mainstream RCT studies to defend an intervention that was currently saving X lives. The current EA is defined by citing esoteric debates between Paul Christiano and Eliezer, which themselves cite EA-produced papers… all to decide which AI Safety org is actually going to save the world in 15 years. I’m hoping for a rebalance towards the old EA, at least until malaria is actually eradicated!