Comment author: Telofy 13 January 2017 09:23:58AM 1 point

Thanks. May I ask what your geographic locus is? This is indeed something that I haven’t encountered here in Berlin or online. (The only more recent example that comes to mind was something like “I considered donating to Sci-Hub but then didn’t,” which seems quite innocent to me.) Back when I was young and naive, I asked about such (illegal or uncooperative) options and was promptly informed of their short-sightedness by other EAs. Endorsing Kantian considerations is also something I can do without incurring a social cost.

Comment author: Fluttershy 13 January 2017 05:43:05PM 3 points

Thank you! I really admired how compassionate your tone was throughout all of your comments on Sarah's original post, even when I felt that you were under attack. That was really cool. <3

I'm from Berkeley, so the community here is big enough that different people have definitely had different experiences from mine. :)

Comment author: Fluttershy 12 January 2017 04:15:48AM 10 points

This issue is very important to me, and I stopped identifying as an EA after having too many interactions with dishonest and non-cooperative individuals who claimed to be EAs. I still act in a way that's indistinguishable from how a dedicated EA might act—but it's not a part of my identity anymore.

I've also met plenty of great EAs, and it's a shame that the poor interactions I've had overshadow the many good ones.

Part of what disturbs me about Sarah's post, though, is that I see this sort of (ostensibly but not actually utilitarian) willingness to compromise on honesty and act non-cooperatively more in person than online. I'm sure that others have had better experiences, so if this isn't as prevalent in your experience, I'm glad! It's just that I could have used stronger examples if I had written the post, instead of Sarah.

I'm not comfortable sharing examples that might make people identifiable. I'm too scared of social backlash to even think about whether outing specific people and organizations would even be a utilitarian thing for me to do right now. But being laughed at for being an "Effective Kantian" because you're the only one in your friend group who wasn't willing to do something illegal? That isn't fun. Listening to hardcore EAs approvingly talk about how other EAs have manipulated non-EAs for their own gain, because doing so might conceivably lead them to donate more if they had more resources at their disposal? That isn't inspiring.

Comment author: Fluttershy 12 January 2017 04:24:29AM 8 points

I should add that I'm grateful for the many EAs who don't engage in dishonest behavior, and that I'm equally grateful for the EAs who used to be more dishonest, and later decided that honesty was more important (either instrumentally, or for its own sake) to their system of ethics than they'd previously thought. My insecurity seems to have sadly dulled my warmth in my above comment, and I want to be better than that.

Comment author: Fluttershy 12 January 2017 01:37:36AM 5 points

Since there are so many separate discussions surrounding this blog post, I'll copy my response from the original discussion:

I’m grateful for this post. Honesty seems undervalued in EA.

An act-utilitarian justification for honesty in EA could run along the lines of most answers to the question, “how likely is it that strategic dishonesty by EAs would dissuade Good Ventures-sized individuals from becoming EAs in the future, and how much utility would strategic dishonesty generate directly, in comparison?” It’s easy to be biased towards dishonesty, since it’s easier to think about (and quantify!), say, the utility the movement might get from having more peripheral-to-EA donors, than it is to think about the utility the movement would get from not pushing away would-be EAs who care about honesty.

I’ve [rarely] been confident enough to publicly say anything when I’ve seen EAs and ostensibly-EA-related organizations acting in a way that I suspect is dishonest enough to cause significant net harm. I think that I’d be happy if you linked to this post from LW and the EA forum, since I’d like for it to be more socially acceptable to kindly nudge EAs to be more honest.

Comment author: Fluttershy 03 January 2017 12:53:57AM 1 point

Good Ventures recently announced that it plans to increase its grantmaking budget substantially (yay!). Does this affect anyone's view on how valuable it is to encourage people to take the GWWC pledge on the margin?

Comment author: Fluttershy 25 December 2016 03:17:42AM 3 points

It's worth pointing out past discussions of similar concerns with similar individuals.

I'd definitely be happy for you to expand on how any of your points apply to AMF in particular, rather than aid more generally; constructive criticism is good. However, as someone who's been around since the last time we had this discussion, I'm failing to find any new evidence in your writing—even qualitative evidence—that what AMF is doing is any less effective than I'd previously believed. Maybe you can show me more, though?

Thanks for the post.

Comment author: Fluttershy 15 December 2016 07:58:56AM 7 points

This post was incredibly well done. The fact that no similarly detailed comparison of AI risk charities had been done before you published this makes your work many times more valuable. Good job!

At the risk of distracting from the main point of this article, I'd like to notice the quote:

Xrisk organisations should consider having policies in place to prevent senior employees from espousing controversial political opinions on facebook or otherwise publishing materials that might bring their organisation into disrepute.

This seems entirely right, considering society's take on these sorts of things. I'd suggest that this should be the case for EA-aligned organizations more widely, since PR incidents caused by one EA-related organization can generate fallout which affects both other EA-related organizations, and the EA brand in general.

Comment author: Kathy 27 October 2016 07:35:15AM 5 points

I think liberating altruists to talk about their accomplishments has potential to be really high value, but I don't think the world is ready for it yet. I think promoting discussions about accomplishments among effective altruists is a great idea. I think if we do that enough, then effective altruists will eventually manage to present that to friends and family members effectively. This is a slow process but I really think word of mouth is the best promotional method for spreading this cultural change outside of EA, at least for now.

I totally agree with you that the world should not shut altruists down for talking about accomplishments, however we have to make a distinction between what we think people should do and what they are actually going to do.

Also, we cannot simply tell people, "You shouldn't shut down altruists for talking about accomplishments," because it takes around 11 repetitions for them to even remember that. One cannot just post a single article and expect everyone to update. Even the most popular authors in our network don't get that level of attention. At best, only a significant minority reads all of what is written by a given author. Only some of those readers remember all the points. Fewer choose to apply them. And only some of the people who apply a thing succeed in making it a habit.

Additionally, we currently have no idea how to present this idea to the outside world persuasively. That part requires a bunch of testing. So, we could repeat the idea 11 times and succeed at no change whatsoever. Or we could repeat it 11 times and be ridiculed, succeeding only at causing people to remember that we did something which, to them, made us look ridiculous.

Then, there's the fact that the friends of the people who receive our message won't necessarily receive it themselves, so they won't understand this cultural element. That makes it very hard for the people in our audience to practice. If audience members can't consistently practice a social habit like sharing altruistic accomplishments with others, they either won't develop the habit in the first place, or the habit will be lost to disuse.

Another thing is that there could be some unexpected obstacle or Chesterton's fence we don't know about yet. Sometimes when you try to change things, you run face first into something really difficult and confusing. It can take a while to figure out what the heck happened. If we ask others to do something different, we can't be sure we aren't causing those others to run face first into some weird obstacle... at which point they may just wonder if we have any sense at all, lol. So, this is something that takes a lot of time, and care. It takes a lot of paying close attention to look for weird, awkward details that could be a sign of some sort of obstacle. This is another great reason to keep our efforts limited to a small group for now. The small group is a lot more likely to report weird obstacles to us, giving us a chance to do something sensible about it.

Changing a culture is really, really hard. To implement such a cultural change just within a chunk of the EA movement would take a significant amount of time. To get it to spread to all of EA would take a lot of time, and to get it spreading further would take many years.

Unless we one day see good evidence that a lot of people have adopted this cultural change, it's really best to speak for the audience that is actually present, whatever their culture happens to be. Even if we have to bend over backwards while playing contortionist to express our point of view to people, we just have to start by showing them respect no matter what they believe, and do whatever it takes to reach out across inferential distances and get through to them properly. It takes work.

Comment author: Fluttershy 27 October 2016 02:14:32PM 4 points

I think liberating altruists to talk about their accomplishments has potential to be really high value, but I don't think the world is ready for it yet... Another thing is that there could be some unexpected obstacle or Chesterton's fence we don't know about yet.

Both of these statements sound right! Most of my theater friends from university (who tended to have very good social instincts) recommend that, to understand why social conventions like this exist, people like us read the "Status" chapter of Keith Johnstone's Impro, which contains this quote:

We soon discovered the 'see-saw' principle: 'I go up and you go down'. Walk into a dressing-room and say 'I got the part' and everyone will congratulate you, but will feel lowered [in status]. Say 'They said I was old' and people commiserate, but cheer up perceptibly... The exception to this see-saw principle comes when you identify with the person being raised or lowered, when you sit on his end of the see-saw, so to speak. If you claim status because you know some famous person, then you'll feel raised when they are: similarly, an ardent royalist won't want to see the Queen fall off her horse. When we tell people nice things about ourselves this is usually a little like kicking them. People really want to be told things to our discredit in such a way that they don't have to feel sympathy. Low-status players save up little tit-bits involving their own discomfiture with which to amuse and placate other people.

Emphasis mine. Of course, a large fraction of EA folks and rationalists I've met claim to not be bothered by others bragging about their accomplishments, so I think you're right that promoting these sorts of discussions about accomplishments among other EAs can be a good idea.

Comment author: Fluttershy 27 October 2016 07:04:30AM 6 points

Creating a community panel that assesses potential egregious violations of those principles, and makes recommendations to the community on the basis of that assessment.

This is an exceptionally good idea! I suspect that such a panel would be taken the most seriously if you (or other notable EAs) were involved in its creation and/or maintenance, or at least endorsed it publicly.

I agree that the potential for people to harm EA by behaving badly under the EA brand will increase as the movement continues to grow. I also think that the damage caused by such behavior is easy to underestimate, because it is hard to keep track of all of the different ways in which it causes harm.

Comment author: Fluttershy 19 September 2016 11:12:46AM 4 points

Thank you for posting this, Ian; I very much approve of what you've written here.

In general, people's ape-y human needs are important, and the EA movement could become more pleasant (and more effective!) by recognizing this. Your involvement with EA is commendable, and your involvement with the arts doesn't diminish this.

Ideally, I wouldn't have to justify the statement that people's human needs are important on utilitarian grounds, but maybe I should: I'd estimate that I've lost a minimum of $1k worth of productivity over the last 6 months that could have trivially been recouped if several less-nice-than-average EAs had shown an average level of kindness to me.

I would be more comfortable with you calling yourself an effective altruist than I would be with you not doing so; if you're interested in calling yourself an EA, but hesitate because of your interests and past work, that means that we're the ones doing something wrong.