
[Content warning: this post contains discussions of sexual misconduct, including assault]

In response to the recent articles about sexual misconduct in EA and Rationalism, a lot of discussion has ended up being about whether the level of misconduct is “worse than average”. I think this is focusing on the wrong thing. EA is a movement that should be striving for excellence. Merely being “average” is not good enough. What matters most is whether EA is the best it could reasonably be, and if not, what changes can be made to fix that.

One thing that might help with this is a discussion of success stories. How have other communities and workplaces managed to “beat the average” on this issue? Or substantially improved from a bad place? For this reason I’m going to relay an anecdotal success story below. If you have your own or know of others, I highly encourage you to share them as well.

Many, many years ago, I joined a society for a particular hobby (unrelated to EA), and was active in it for many, many years. For the sake of anonymity, I’m going to pretend it was the “boardgame club”. It was a large club, with dozens of people showing up each week. The demographics were fairly similar to EA, with a lot of STEM people, a male majority (although it wasn’t that overwhelming), and an openness to unconventional lifestyles such as kink and polyamory.

Now, the activity in question wasn’t sexual in nature, but there were a lot of members who were meeting up at the activity meetups for casual and group sex. Over time, this meant that the society gained a reputation as “the club you go to if you want to get laid easily”. Most members, like me, were just there for the boardgames and the friends, but a reasonable number of people came there for the sex.

As it turns out, along with the sex came an acute problem with sexual misconduct, ranging from pushing boundaries on newcomers, to harassment, to sexual assault. I was in the club for several years before I realised this, when one of my friends relayed to me that another one of my friends had sexually assaulted a different friend.  

One lesson I took from this is that it’s very hard to know the level of sexual misconduct in a place if you aren’t a target. If I had been asked to estimate the “base rate” of assault in my community before these revelations, I would have wrongly assumed it was low. These encounters can be traumatic to recount, and the victims can never be sure who to trust or what the consequences will be for speaking out. I’d like to think I was trustworthy, but how was the victim meant to know that?

Eventually enough reports came out that the club leaders were forced to respond. Several policies were implemented, both officially and unofficially. 

1. Kick people out

 Nobody has a democratic right to be in boardgame club. 

I think I once saw someone mention “beyond reasonable doubt” when it comes to misconduct allegations. That standard of evidence is extremely high because the accused will be thrown into jail and deprived of their rights. The punishment of “no longer being in boardgame club” does not warrant the same level of evidence. And the costs of keeping a missing stair around are very, very high. 

Everyone that was accused of assault was banned from the club. Members that engaged in more minor offenses were warned, and kicked out if they didn’t change. To my knowledge, no innocent people were kicked out by mistake (false accusations are rare). I think this made the community a much more pleasant place.  

2. Protect the newcomers

When you attend a society for the first time, you do not know what the community norms are. You don’t know if there are avenues to report misconduct. You don’t have any social capital or reason to think you would be believed over an experienced member. And your likelihood of staying in the club is highly dependent on your initial experiences.

For these reasons, newcomers are easy targets. It came to light that some experienced members were essentially using the club as a way to pick up young and inexperienced university students. So new people would show up, and their first impression of the society would be multiple older men trying to sleep with them. A lot of them did not show up again. 

We established a policy that established members, especially members of the executive, were to refrain from hitting on or sleeping with people in their first year at the society. This means that people get a chance to settle in and form friendships. And if an incident does occur, it’s no longer a case of the word of an experienced member vs someone nobody knows, it’s now your old friend Bob vs your new friend Alice. Alice is more likely to be believed, and more likely to actually tell people about the incident: the newcomer will often just leave, assuming that misconduct is the norm. 

This policy is similar to the one proposed in this post and implemented by a few groups, which I endorse. 

3. Change the leadership

Some of the people on the leadership team were the same ones that were accused of sexual misconduct. These people were removed from leadership roles, and over time the leadership became filled with people that were more trustworthy on this issue. I was one of the people that joined the leadership team around this time. 

It’s somewhat hard to judge who actually can be trusted, of course. Women tend to be more trustworthy on this particular issue, but this is far from universal. And there are many stories of men publicly claiming to be devout feminists but secretly abusing people on the side. But I’d still take someone who has a reputation of taking misconduct seriously over someone who doesn’t. 

4. Change the norms

We took a number of steps to shed the reputation as the place you go to for casual sex. We discussed orgies way less, had parties in pubs rather than in houses, and in general there was less sleeping around. It took years for the reputation to die down, but it did eventually. I think this disincentivized predators from joining, and encouraged more people who were there for the boardgames.

This is not to say that members became celibate. Existing relationships continued, and new members got into relationships with each other. Age gap relationships also occurred, with the difference being that the younger person initiated and was part of the executive committee at the time.

Did it work?

I think it did. In the years following the changes, the number of reports of misconduct dropped drastically. There were no more reports of serious assault. I can think of two more cases of misconduct in the following years, and in both cases the offenders were non-core members that were swiftly ejected from the group. I can’t rule out that more misconduct was happening in secret that never got to us, but I’m fairly certain that it was significantly less than was occurring before. 

I don’t think the new rules were overly burdensome on the community. The troublesome members were not crucial to its operation, and personally I found the society more pleasant with them gone. As I mentioned, relationships still occurred, just on a more even playing field. The one downside was that membership dropped a bit as it went from the “sex and boardgame” club to just the “boardgame” club, but I consider this to be well worth it.

I don’t think people should overly focus on my one anecdote. There are some similarities between my old community and EA, but also plenty of differences, such as EA being much larger, much more decentralized, and having access to resources such as the community health team. And while I have tried to be as honest and accurate as I can in my recollection, this all occurred over many years, so it’s quite possible that parts have been misremembered.

I do think some of the lessons are useful. For example, I'm curious how long it takes for newcomers to find out that the community health team exists. It’s not mentioned in any of the introductory resources on the forum. It’s plausible that people are being harassed and are just leaving without reporting it. Whenever I do dance classes they make sure to mention the code of conduct any time a new member is present, and to have it printed out and visible at the reception area. If something similar isn't happening at your in-person events, I would recommend it; it really is not a hassle. (This was not one of the things we did in boardgame club, but I think it would have helped as well.)

What I am most interested in is hearing other success stories. Have you been in a community that defeated misconduct? Or that had such excellence in their policies that misconduct could never find a foothold? What norms and policies were in place?

Comments

I love the concrete advice in this piece. Thanks for taking the time to write it.

"We established a policy that established members, especially members of the executive, were to refrain from hitting on or sleeping with people in their first year at the society."

This sounds super reasonable for EA, too. How would you enforce/communicate this?

In my club this was done informally, by just telling people the rule, and telling them to knock it off if we saw them violate it, which was sufficient for us.

EA is larger, so you'd have to think harder about enforcement/communication, and the various edge cases. It would certainly depend on the different contexts of different places. The goal of such a policy would be to:

  1. Reduce the number of extreme power imbalance relationships.
  2. Avoid turning off new members. 
  3. Reduce the number of people treating EA primarily as a dating service. 

You have to balance this against the risks of being overly burdensome. Relationships and sex are good, and I don't think two compatible people should be kept apart forever just because they entered a community at different times. If you ban relations forever, then people will just ignore the rule entirely.  (I find suggestions such as "stop being polyamorous" to be unhelpful for these reasons.) 

So if I were to craft a policy, or norm, I'd be thinking about how to strike the balance above. For example, I think a 1 year ban on relations might be appropriate for a university club where new members are also very young and inexperienced at dealing with bad behaviour, but is probably excessive for a place where people are older and the power dynamics are more balanced. 

One hedge I'd add: "...unless these people know each other from outside the boardgame club".

Everyone that was accused of assault was banned from the club. Members that engaged in more minor offenses were warned, and kicked out if they didn’t change. To my knowledge, no innocent people were kicked out by mistake (false accusations are rare). I think this made the community a much more pleasant place.  

I often see suggestions like this, so I think it's worth taking a minute to explain why this is a terrible idea. There is a reason both political parties abandoned this policy.

The idea that false accusations are rare is somewhat dubious. It's a commonly quoted idea, but when you dig down into the citations, the claim often relies on shaky statistics (like simply assuming all accusations not proven false are true). For more details you can see here.

Even if those statistics were correct, they are based on data from a previous time period, one where defendants were treated with considerably more due process. As such, there was much less incentive to create a false report. As society regresses back towards a witch-hunt/lynching model, where an accusation is taken as sufficient proof of guilt, the incentive to make false accusations significantly increases. (I am curious how you can be so confident that no innocent people were kicked out by mistake if you really were following a shoot-first-ask-questions-later policy!)

So this policy is self-undermining. If you accept accusations as sufficient proof, strategic accusations will be made more often. We know many people, and the EA movement as a whole, have enemies; literally adopting such a policy would make us trivial to destroy. We often ask people to accept huge personal sacrifices and dedicate large fractions of their life to EA; no reasonable person would be willing to invest in the movement if they knew they were a hair-trigger away from exile at all times.

(I am curious how you can be so confident that no innocent people were kicked out by mistake if you really were following a shoot-first-ask-questions-later policy!)

 

Where did you get the impression that we didn't ask questions? Everyone that was accused was kicked out because we talked to both sides and to relevant witnesses and found the accusers to be credible.  No further information has come out since to contradict this assessment. 

As Jason points out below, this was a boardgame club. Would you really insist we keep a credibly accused abuser around, just in case there was an elaborate conspiracy to destroy the boardgame society? At some point, self-preservation has to kick in.

Indeed, I would support a higher standard of evidence when the "punishment" is more damaging, such as affecting careers and so on. Fortunately, EA has significantly more resources at hand to investigate with than my little club. 

Every system has to balance the risk of punishing innocents against the risk of letting guilty people run free. "Beyond reasonable doubt" is built on the principle that it's better for ten guilty people to go free than to lock up one innocent person. But if you let ten abusers into your local community meetup, the community will die, or worse, become a power base for abusers.

And -- at least in a low-resource environment where the cost of a false positive isn't high (like a boardgame club) -- it's not inappropriate to consider base rates. Even if you think the base rate of truthful accusations made to boardgame clubs is only (say) 70 percent, that's still a thumb on the scale.

Think about the decisions of parking-enforcement hearing officers as an analogy. They start with the assumption that the base rate of parking enforcers messing up is lower than the base rate of people lying to get out of parking tickets. It'd be very difficult to run a parking enforcement program otherwise.

The other uncomfortable truth is that kicking probably innocent people out is sometimes unavoidable. For instance, in the context of classified information, you would absolutely revoke someone's clearance (which is career ending) if you concluded there was a 1 percent chance that they were passing major secrets to the enemy and could not improve that estimate. Better to end 100 CIA agents' careers than to allow an Ames or Hanssen type traitor to operate. Kicking someone out of a boardgame club on less than 50 percent likelihood of sexual assault is sensible, because the costs of a false negative are worse than the costs of a false positive.
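To put some toy numbers on that cost asymmetry (these are my own illustrative figures, not anything from the comments above), the expected-cost framing looks roughly like this:

```python
# Toy illustration of the cost-asymmetry argument: with made-up costs,
# removal minimizes expected harm well below a 50% likelihood of guilt.
def should_remove(p_guilty: float, cost_false_negative: float,
                  cost_false_positive: float) -> bool:
    """Remove when the expected cost of keeping exceeds the expected cost of removing."""
    return p_guilty * cost_false_negative > (1 - p_guilty) * cost_false_positive

# If keeping an actual abuser around (false negative) is judged ten times as
# costly as wrongly ejecting an innocent member (false positive), removal is
# already justified at a 30% likelihood of guilt:
print(should_remove(0.3, cost_false_negative=10, cost_false_positive=1))    # True

# A criminal court, where a false positive means prison, weights the costs
# very differently, and so demands a much higher probability:
print(should_remove(0.3, cost_false_negative=10, cost_false_positive=100))  # False
```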

Since temporalis hasn't answered the literal question, I think I can.

The literal text of the OP, as cited, says "Everyone that was accused of assault was banned from the club". That, to me, does not sound like the qualifiers you offer here, where "we talked to both sides and to relevant witnesses and found the accusers to be credible". That would be better summarized as "Everybody that was credibly accused of assault was banned", and even then, the full explanation of how an accusation was found credible should follow, and not long after, if we are to believe that it was more or less a trial-like process and not more or less a witch-hunt.

The original text was ambiguous between a description of policy and a description of outcomes. My reading now is that it was intended as the latter, though people are likely to interpret it as the former and think it's advocating not looking into the credibility of accusations?

Titotal described the policy for a boardgame club -- that's a considerably different context. Kicking someone out wrongfully is unfortunate . . . but not much more than that. The harm of erroneously not kicking someone who has committed assault is much higher. And a boardgame club practically lacks the resources to offer much in the way of an adjudicative process. So a minimal standard of proof + minimal process makes more sense in that context.

Thanks for writing this post.

In response to the recent articles about sexual misconduct in EA and Rationalism, a lot of discussion has ended up being about whether the level of misconduct is “worse than average”. I think this is focusing on the wrong thing. EA is a movement that should be striving for excellence. Merely being “average” is not good enough.

I agree "average" is not a good benchmark. However, I would say EA should be striving for excellence in terms of impact. This does not mean solving 100 % of each problem. For example, reducing the number of drownings to 0 is excellent in the sense of minimising deaths from drownings, but would require spending resourses quite wastefully (the amount of resources required to eliminate the last death could do way more good spent elsewhere). 

So I think the question is how many resources should be spent addressing sexual misconduct and related issues. If the problem were worse than the "average" in the general population, then there would be a strong case for the current level of resources not being sufficient. The better the situation is relative to the "average", the weaker the case, and there is a point beyond which it would make sense to spend fewer resources on the margin.

Indeed, the only way to 100% ensure no misconduct ever would be to shut down the society entirely. But I'll note that none of the actions we took in our club cost any money; it's mostly a culture and norms thing. EA does pay the community health team, but I would guess it gets back far more than it spends, in terms of recruitment, reducing PR disasters, etc.

I'll note that high standards are important in general as EA becomes more powerful. EA may have a strong voice in writing the value systems of AI, for example, so it's important that the people doing so are not ethically compromised. 

I see your formulation (which I just saw, after publishing my own) is both more succinct and probably less confrontational than my own. Support.

I liked this post so much. Thanks. I think I'd like to see future discussion on 3 and 4 (which will probably just happen naturally in the coming weeks, but I'll note some curiosities and differences here for now).

On 3. Changing Leadership: I wonder what to do with leaders who have not committed misconduct themselves but have mismanaged reports of it. Do we hope to get rid of them, and is this permanent, or temporary while they learn their lesson, so they can be revisited as leadership candidates later? Also, is there a set of actions we expect them to take whereby we can then trust them more, like taking a course on sexual misconduct and a sort of HR training on handling victim reports?

A similar question for leaders who have committed the misconduct themselves. Obviously it depends on the badness of the act(s), but I wonder if they might ever be allowed back into leadership positions, and in what cases? Here it is important that we are talking about people's careers, not a hobby. Someone's career and career trajectory is (1) one of the most precious things in the world to most people (especially leaders) and (2) a role they may be especially well-suited to; for example, given they are leaders, they may even have designed the role themselves from the ground up. I hope we do view this with more nuance because it is careers and not a hobby. It might make sense to have victims' opinions involved heavily here from the start (e.g., do you think this person should be in a leadership role of X kind? What about Y? Is there anything which would make you think they were well-fitted for it, or even worse fitted than you currently expect?). Like, why not ask complainants/victims how they feel about it?[1]

On 4. Changing Norms: (Trying not to say so much like I did accidentally above.) I wonder how the social reception was. I'd be interested to hear how conversations went and whether any resentment from overtly sex-positive people was stewing or successfully avoided, and how. Maybe this didn't come up because your leadership did not open things up for discussion in the same way as occurs on the EA Forum? But if so I'd be interested to hear about it, or similar stories from others.

[Final note: It's amazing to hear a case report like this, and while I appreciate you summing it up so well, I expect you know a lot more little bits that might come in handy, enough that I wonder if even the CH team might interview you. You might consider whether you have more to say, and if so, requesting a call with them, if you have not already.]

  1. ^

    of course you have to be sure they have the whole story, e.g. if the leader had 3 incidents but each complainant thinks the leader only did something to them, they will be more permissive

No one has a right to be a leader. If leaders mismanaged abuse situations they should be removed from positions of leadership. The point of leadership is supposed to be service. 

Okay I expect that is the default consensus, and is my default general desire too from a point of ignorance about any given case. I was just surprised that actors such as that weren't listed in this writeup.

I would also like to say, though, that depending on how many cases you take, a case will eventually be handled in a way that you could call mismanagement. Extreme mismanagement, or generally having poor policies, is one thing, but slight mismanagement now and again is a bug of the world. I don't expect 1000/1000 cases to be handled perfectly. Handling sexual misconduct cases is insanely hard. I mean, look, now we are even looking at cases where women at the time of reporting said they were satisfied with the CH Team's handling but went elsewhere anyway after saying that? So for example I'm not comfortable saying we should remove Julia Wise if an independent investigation says she mismanaged one case. I'm pretty sure hardly any EA wants that. It's cases like that which make me very uncomfortable to speak in definite terms about anything.

On 3 (as applied to those who have committed misconduct), I think people's views on the relative importance of the various reasons why we should take action against people who committed misconduct will significantly influence their opinion of how the opinion of the wronged person should factor into what happens to the wrongdoer. It seems that opinion would have great weight on some rationales, but significantly less on others.

As applied to those who mismanaged misconduct reports, I suspect the appropriate response will often depend on why the mismanagement happened (and the nature of the mismanagement). If someone failed to take appropriate action due to ignorance of what proper management would have been, then HR training (with good policy development and organizational structure) should redress that. If someone actively attempted to conceal misconduct to protect a friend, was knowingly complicit in retaliation against a survivor, etc., those are fundamental fitness-for-role problems, not training-deficiency problems.

Upvoted.

I'm in strong agreement with point two and in agreement with point four. I think these are things that more people should keep in mind while putting together microcultures and they are things I worry about frequently.

I'm also in favor of point one for... basically all social groups and microcultures which aren't EA. But it wouldn't work for EA. EA is more public than a boardgame club, and many load-bearing people in EA are also public figures. Public figures are falsely accused of assault constantly.

I'm 100 percent in favor of kicking people out to the extent that we can, but we should also recognize that it's not really possible for a community as decentralized as EA. So much EA activity goes on at events hosted by someone other than Effective Ventures (and this goes for parties, events, conferences, etc.), so I don't really understand the mechanics of what it would mean to kick someone out of EA.

Large decentralized communities can kick people out, but it's hard. In the successful cases I've seen it has required:

  1. The behavior is clearly unacceptable, in a way that nearly everyone would agree if they looked into it.

  2. Detailed public accusations, so that people can look into it if they doubt the consensus.

The combination of these means that you can have an initial burst of 'drama' in which lots of people learn about the situation and agree that the person should be kicked out, and then this can be maintained whenever they show up again.

Some successful examples:

  • 2016: The EA community kicking out Gleb Tsipursky for a range of shady things and a pattern of apologizing and then continuing (details).

  • 2017: The contra dance community kicking out Jordy Williams after accusations of grooming and rape (details).

  • 2018: The rationality community kicking out Brent Dill after accusations of sexual abuse, gaslighting, and more (details).

The main cases I've seen where this has not been successful are ones where the community didn't have (1) or (2). For example, I've seen people try to apply this playbook to Jacy Reese, but because exactly what he apologized for is vague it doesn't entirely stick.

Unfortunately this approach depends on people making public accusations, which is really hard. We should support people when they do and recognize their bravery, but people will often have valid reasons why they won't: fear of retaliation, unwilling to have that level of public scrutiny, risk of legal action. In those cases it's still possible to make some progress privately, especially in keeping the person out of official events, but the issues of decentralized communities and defamation law make it hard to go further.

I'll probably expand this into a blog post. I'd like to include a bit about Michael Vassar, and am thinking about how to fit it in. I think his case is somewhere between Tsipursky/Williams/Dill and Reese's? There's Jax's 2018 thread, which is tricky for the purpose of (2) because it (not faulting her) mixes several categories of things (very serious, distasteful, annoyances) and brings in a lot of different people. There's also this LW thread where in 2019 people were explaining why he needed to be kicked out, but then in 2020/2021 influential people changed their minds and regretted their decision. I'm not sure how much of this is disagreement about whether (1) was met vs not having a clear (2), though it seems like both contributed?

(not legal advice, not researched)

It seems that there would be partial workarounds here, at least in theory. Suppose that CEA or another organization offered a one-hour class called Sexual Misconduct Training for EAs that generated a green, digitally signed certificate of attendance "valid" for a year. The organization does not allow individuals who it has determined to have committed moderate-severity misconduct within the past few years to attend the one-hour class. They may, however, attend a four-hour Intensive Training class which generates a yellow digitally signed certificate with a validity of six months. Those known to have committed serious misconduct may only attend a class that does not generate a certificate at all.

A community organizer, party host, etc. could ask people for their certificates and take whatever action they deem appropriate if a person submits a yellow certificate or does not submit one at all. At a minimum, they would know to keep a close eye on the person, ask for references from prior EA involvement, etc. In this scenario, Organization hasn't spoken about anyone to a third party at all! (Classically, defamation at least in the US requires a false statement purporting to be fact that is published or communicated to a third person.) It has, at most, exercised its right not to speak about the person, which is generally rather protected in the US. And if the person voluntarily shows a third party the certificate, that's consent on their part.
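(A minimal sketch of how such a certificate scheme could work in practice, assuming a single issuing organization that holds a signing key and shares only the public key with event organizers. The tier names, validity periods, and payload fields here are illustrative assumptions, not a description of any existing system.)

```python
# Hypothetical certificate issuing/checking sketch using Ed25519 signatures
# from the "cryptography" package (pip install cryptography).
import json
from datetime import datetime, timedelta, timezone

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Assumed validity periods: one year for "green", six months for "yellow".
VALIDITY = {"green": timedelta(days=365), "yellow": timedelta(days=183)}

# The issuing organization holds the private key; organizers only need the public key.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()


def issue_certificate(holder: str, tier: str) -> tuple[bytes, bytes]:
    """Issue a (payload, signature) pair once the holder completes the relevant class."""
    expires = datetime.now(timezone.utc) + VALIDITY[tier]
    payload = json.dumps(
        {"holder": holder, "tier": tier, "expires": expires.isoformat()}
    ).encode()
    return payload, issuer_key.sign(payload)


def check_certificate(payload: bytes, signature: bytes) -> str | None:
    """Return the tier if the certificate is genuine and unexpired, otherwise None."""
    try:
        issuer_public_key.verify(signature, payload)
    except InvalidSignature:
        return None
    data = json.loads(payload)
    if datetime.fromisoformat(data["expires"]) < datetime.now(timezone.utc):
        return None
    return data["tier"]


# Example: an event organizer checks a certificate presented at the door.
payload, sig = issue_certificate("Alice", "green")
print(check_certificate(payload, sig))  # "green"
```

One design note on this sketch: because organizers verify against the public key alone, checking a certificate requires nothing further from the issuing organization, which is what keeps the scheme free of any communication about the person.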

The greater legal risk might be someone suing if a green-certificate holder commits misconduct . . . but I think that would be a tough sell. First, no one could plausibly claim reliance on the certificate for more than the proposition that Organization had not determined the individual ineligible to take the relevant class at the time the decision was made. To have a case, a plaintiff would have to show that Organization had received a report about the certificate holder, was at least negligent in issuing the certificate in light of that report, and owed them a legal duty not to issue a certificate under those circumstances. As long as Organization is clear about the limits of the certificate process, I think most courts and juries would be hesitant to issue a decision that strongly disincentivizes risk-reduction techniques deployed in good faith and at least moderate effort.

That's a neat approach! I think it only works for longer events, generally with pre-registration? You don't want to be requiring a class before you can attend, say, your first EA meetup.

(And within EA I think longer events maybe mostly already check in with the community health team?)

Agree that it wouldn't work for every event. I could see it working for someone with a pattern of coming to shorter events -- asking someone who has become a regular attender at events for a certificate would be appropriate. Although I suggested an hour-long class because I like the idea of everyone regularly in the community receiving training, the less-involved person training could be 10-15 minutes.

I think the increased visibility of the process (compared to CH-event organizer checks) could be a feature. If you hand over a green cert, you are subtly reminded of the advantages of being able to produce one. If you hand over a yellow one, you are made aware that the organizers are aware of your yellow status and will likely be keeping a closer eye on you  . . . which is  a good thing, I think. Asking to see a certificate before dating or having sex with another EA shouldn't be an affirmatively encouraged use case, but some people might choose to ask -- and that would be 100% up to the person. But that might be an additional incentive for some people to keep to green-cert behavior.

Although no one should take this as legal advice, one of the possible merits of a certificate-based approach is that the lack of merit in a defamation suit should be clear very early in the litigation. The plaintiff will realize quickly that they aren't going to be able to come up with any evidence on a foundational element of the claim (a communication from defendant to a third party about the plaintiff). With a more active check-in, you're going to have to concede that element and go into discovery on whether there was communication that included (or implied) a false statement of fact. Discovery is generally the most expensive and painful part of litigation -- and even better, a would-be plaintiff who can figure out that there was no communication will probably decide never to sue at all.

Yeah, I basically agree with this, although it seems difficult to make an intentional effort to expand the circumstances where individuals are banned, because it is only possible when there is widespread knowledge and agreement about the accusation. However, I'm all for making accusations public after some threshold of evidence (although I am not sure exactly what that threshold should be, and there would need to be some care with the phrasing to avoid libel lawsuits).

We established a policy that established members, especially members of the executive, were to refrain from hitting on or sleeping with people in their first year at the society. This means that people get a chance to settle in and form friendships. And if an incident does occur, it’s no longer a case of the word of an experienced member vs someone nobody knows, it’s now your old friend Bob vs your new friend Alice. Alice is more likely to be believed, and more likely to actually tell people about the incident: the newcomer will often just leave, assuming that misconduct is the norm. 

 

Can you clarify how this worked in a little more detail? I understand the spirit of the policy, and it seems good. What if a newcomer hits on an old timer, and what were the consequences if an old timer hit on a newcomer? Or was this more of an honor code and cultural norm than a formal rules-and-consequences approach?

Or was this more of an honor code and cultural norm than a formal rules-and-consequences approach?

It was more like this. We just told people the norm and kept an eye out, and told people to knock it off if they seemed to be violating it. That was pretty much sufficient to prevent the behavior, at least in our group. 

I don't recall any cases of a completely new member hitting on an older member. Probably we would have expected people to just ignore it, or mention the norm explicitly. There was one case where a new member settled into the society over a year or so, became an exec member, and then initiated a relationship with a much older exec member. This seems perfectly fine to me.

EA is a movement that should be striving for excellence. Merely being “average” is not good enough. What matters most is whether EA is the best it could reasonably be, and if not, what changes can be made to fix that.  

There is a lot of content packed within that "the best". An org, movement, or even a person, can only be "the best" on a certain number of measures before you're asking for a solution to an optimization problem with too many constraints, or to put it more simply, superpowers.

Should EA be ethical? Sure. Wholesome? Maybe, why not. A place for intellectual and epistemic humbleness? Very important. Freedom for intellectual debate? Very useful for arriving at the truth. A safe space? Depending on what you mean by that. A place completely free from interpersonal drama? That would be nice, certainly. Can all of this be achieved at the same time? No, I don't think so. Some values do funge against others to some extent. I hope I don't need to offer specific examples.

I'm worried about this (and other recent developments) in EA. Calling for a more perfect world is, by itself, good. But asking for optimizations on one front frequently means, implicitly, de-prioritizing other causes, if only because the proposed optimizations take a good chunk of the limited collective time, attention span, and ability to intelligently communicate and discuss.

Do I think that changes can be made to EA to make it more ethical, with less misconduct (including sexual misconduct), etc.? Yes, certainly. Do I think this will have a cost? Yes, there is no such thing as a free lunch. Do I think this will cause, all things considered, more or less suffering in the world? I'm not sure. But what EA is unquestionably "the best" at is identifying opportunities to do the most good, on the margin. And while all improvements are changes, not all changes are improvements. I think that any changes (the more sweeping, the worse in expectation) to community composition, governance, etc. will be, in the best of possible worlds, neutral to distracting for the main goal of doing the most good with available resources, and in the worst, actively harmful. Thus, my proposal is that any proposed changes should pass the bar, not only of improving the situation they purport to improve (and articles like this, with practical examples of what to do, how to do it, and how it all turned out, are certainly useful for that), but also of making a reasonable case that they are at least neutral to the main mission (doing good better, if I may quote).

So, am I advocating for an "abandon all hope, everyone for themselves" policy? Not at all. I'm merely stating, if in a roundabout way, that "average" ability, as an organization, to keep your members safe, sane, and wholesome is good. Quite probably, good enough. And this is key. Since you cannot optimize for everything at the same time, one must find a compromise for most things that are not the main mission. I think sexual misconduct is one of those many, many things.

(Edit: Vasco Grilo's comment says it better, and in fewer words.)

I appreciate you writing this up. It's valuable input into the ongoing discussion. 

To add some more context, may I ask in what country this "boardgame club" was? (Taking into account the effect of different norms in different countries/cultures is a somewhat important piece of information for me personally.)

Also, if it does not de-anonymise it - was it mostly a university/youth group or not?

This was in Australia. Most of the incoming members were university aged, but it was open to people of all ages, so there were people in their 30s alongside 18-year-olds fresh out of high school (a dynamic that probably applies to EA spaces as well). I think this kind of dynamic warrants significant extra caution, as you don't want older men coming along in order to try and "pick up" college girls.
