Comment author: byanyothername 12 June 2018 10:39:10AM *  2 points [-]

I don't want to spend too long on this, so to take the most available example (i.e. treat this more as representative than an extreme example): Your summary at the top of this post.

  • General point: I get it now but I had to re-read a few times.
  • I think the old "you're using long words" is a part of this, which is common in EA and non-colloquial terms are often worth the price of reduced accessibility, but you seem to do this more than most (e.g. "posit how" could be "suggest that"/"explore how", "heretofore" could be "thus far", "delineate" could be "identify"/"trace" etc....it's not that I don't recognise these words, they're just less familiar and so make reading more effort).
  • Perhaps long sentences with frequent subordinate clauses - and I note the irony of my using that term - and, indeed, the irony of adding a couple here - add to the density.
  • More paragraphs, subheadings, italics, proofing etc. might help a bit.

I also have the general sense that you use too many words - your comments and posts are usually long but don't seem to be saying enough to justify the length. I am reminded of Orwell:

It is easier — even quicker, once you have the habit — to say "In my opinion it is not an unjustifiable assumption that" than to say "I think".

And yes - mostly on social media. But starting to read this post prompted the comment (I feel like you have useful stuff to say so was surprised to not see many upvotes and wondered if it's because others find you hard to follow too).

Comment author: Evan_Gaensbauer 17 June 2018 06:17:13PM 0 points [-]

One heuristic I use for writing is to try to Write Like You Talk, as Paul Graham suggests. Of course, I already tend to speak differently than most people. Keeping my head in books changes how I think internally, and thus how I speak. It comes full circle when I write like I talk, which is different from how most people talk or write. The perfect is the enemy of the good, and there are trade-offs in time taken to write. Another heuristic is to know your audience. The post in question was meant to be read by suffering reducers and those familiar with the work of the Foundational Research Institute, from whom I've already received good feedback, so I largely achieved my goal with my writing. Also, posts on my personal blog are rougher, but I would edit them before putting them up on the EA Forum.

As long as it takes to read my stuff, I use a lot of words because they provide full context. For example, I'd hope someone familiar with academic jargon but relatively new to EA might come to fully understand the case for potential s-risks from terraforming, having come in knowing little to nothing about the subject. I'm aware I often use too many words, but when the time comes to make posts more accessible, I can and will do so. I appreciate this feedback though. Please feel free to provide feedback anytime. I update on it quite quickly, even from a single person. I wish more people felt comfortable doing so.

I wrote this post up because it will tie into a series of blog posts I'll be rolling out. When it's done, in context, I hope this post will make more sense. I'm going to be working with various EA organizations to bring remote volunteering opportunities to local EA groups to do direct work. I'm going to consult with Rethink Charity's research team to tighten up a model I have for coordinating teams together numbering in potentially hundreds of individuals. Soon time too may be a unit of caring.

Comment author: byanyothername 12 June 2018 10:42:08AM 1 point [-]

Thanks. Data point: the summary at the top of "Crucial Considerations for Terraforming as an S-Risk" seems like a normal level of hard-to-read-ness for EA.

Comment author: Evan_Gaensbauer 13 June 2018 06:47:36PM 0 points [-]

The summary wasn't supposed to be easier to read. It was a condensed version so those already familiar with the concept would know where the post was going. It was primarily intended for effective altruists who are already (quite) familiar with risks of astronomical suffering and the research of Brian Tomasik and the Foundational Research Institute.

Comment author: byanyothername 11 June 2018 06:26:58PM 3 points [-]

Evan, just a data point: I don't understand a lot of what you're saying in most of your posts/comments, and I can only think of one person I find more difficult to understand out of everyone I've come across in the EA community who I've really wanted to understand. (By which I mean "I find the way you speak confusing and I often don't know what you mean", not "Boi, you crazy".)

Comment author: Evan_Gaensbauer 12 June 2018 12:51:30AM 0 points [-]

I also have some posts I've taken more time to edit for clarity on my personal blog about effective altruism.

Comment author: Evan_Gaensbauer 12 June 2018 12:03:11AM *  1 point [-]

Thanks. Are you referring to my posts and comments on social media? That's more transient, so I make less of an effort there to be legible to everyone. Do you have examples of the posts or comments of mine you mean? I don't get much feedback on this. Of course people tell me I'm often confusing, but that feedback isn't actionable. I can unpack any posts you send me. For example, here is a post of mine where I haven't gotten any negative feedback on the content or writing style. That post was like a cross between a personal essay and a dense cause prioritization discussion, so it's something I wouldn't usually post to the EA Forum. It's gotten some downvotes, but clearly more upvotes than downvotes, so somebody is finding it useful. Either way, downvotes are ultimately feedback on what does or doesn't work on the EA Forum. This is the kind of clear feedback that specifies something.

Comment author: toonalfrink 08 June 2018 01:34:38PM 1 point [-]

Thank you for the mention!

Comment author: Evan_Gaensbauer 09 June 2018 06:54:59AM 0 points [-]

No problem!

Comment author: RandomEA 13 May 2018 06:42:10PM 2 points [-]

This is somewhat off-topic but it's relevant enough that I thought I'd raise it here.

What is the most impactful volunteering opportunity for a non-EA who prioritizes more conventional causes (including global poverty) and who lacks specialized skills? Basically, I'm seeking a general recommendation for non-EAs who ask how they can most effectively volunteer. I recognize that the recommended volunteering for a non-EA will be much less impactful than the recommended volunteering for an EA, but I think it can sometimes be worthwhile to spread a less impactful idea to a larger number of people (e.g. The Life You Can Save).

The standard view seems to be that volunteering in a low-skill position produces as much value for an organization as donating the amount necessary for them to hire a minimum wage worker as a replacement. While this may be correct as a general matter, I think there are likely exceptions:

  1. An organization may feel that volunteer morale will greatly decrease if there are some people doing the same work as the volunteers for the same number of hours who are paid.

  2. An organization may be unwilling to hire people to do the work for ideological reasons.

  3. An organization may be unwilling to hire people to do the work because doing so would look bad to the public.

  4. An organization may feel that passion about the cause is extremely important and that the best way to select for passion is to only accept people who will work for free.

  5. An all-volunteer organization may lack the infrastructure to pay employees, meaning that it would have to pay a high initial cost before hiring its first employee.

Thus, it seems plausible to me that there is some relatively high impact organization with appeal to non-EAs where a person without specialized skills can have a significant impact. Does anyone know of a volunteering opportunity like this?

Comment author: Evan_Gaensbauer 17 May 2018 12:25:46AM *  1 point [-]

These are good questions. I was surprised and disappointed I couldn't find more volunteer opportunities for more conventional causes like global poverty alleviation. I'm spitballing some ideas here.

  • I made contact with a bunch of people running different mental health, well-being and happiness projects from within effective altruism. Some of them might have some kind of volunteer opportunities. I can find out.

  • The closest things there currently are to organizations rated as effective by EA working in the developed world are some policy organizations. My friend Finan from Seattle hosted a meetup where a representative from an urban housing reform organization, also in Seattle, which had received a grant from the Open Philanthropy Project (Open Phil), came to talk about their work. If someone lives in or near a city where one of Open Phil's grantees is based, whether in North America or elsewhere in the world, they could host an event or visit the organization. There could be a 'What Can You Do?' section at the end of talks. Alternatively, you could just email such organizations and ask if there is anything individuals can do to support them in addition to donating. Hosting an event with the organization and non-EAs looking for effective volunteer opportunities could help them build a personal connection to the cause.

That stated, among the options in the post, hosting a Giving Game through The Life You Can Save (TLYCS) is a well-structured volunteering opportunity. TLYCS features all of GiveWell's recommended and standout charities, plus some charities well-rated by other charity evaluators. There are over 18 charities to fundraise for through Giving Games, all of them focused on conventional causes, mostly in global development. Whoever runs a Giving Game can choose a few of those charities to feature. After hosting a successful Giving Game, people without specialized skills can credibly claim the following skills and put them on their resume:

  • Social Media Engagement/Management
  • Public Speaking & Presentation
  • Event Planning & Organization
  • Fundraising

That's because that's all the stuff someone technically has to do to host a Giving Game. So after hosting one, if someone didn't want to get involved with EA, they could go to an organization and say "look at all this stuff I learned how to do!", which I think will look pretty good to a lot of different kinds of NPOs. Also, if someone wants to help with farm animal welfare/research, Faunalytics' online Library Assistant volunteer position is recommended as a good fit for high school or university students.

Comment author: RandomEA 13 May 2018 02:48:20PM 3 points [-]

The Humane League (THL) is an ACE-recommended charity. THL runs the Fast Action Network, an online group which sends out easy, one-minute actions two or three times per week, including signing petitions, posting on social media, or emailing decision makers, as part of campaigns to mitigate factory farming. You can sign up to join the Fast Action Network in the United States here, in the United Kingdom here and for a Spanish version of the Fast Action Network here.

Mercy for Animals (which was ACE-recommended for 2014, 2015, and 2016) runs a similar program called Hen Heroes.

Comment author: Evan_Gaensbauer 14 May 2018 12:54:38AM 0 points [-]

Thank you. I will add this to the list.

Comment author: Alex_Barry 08 May 2018 09:10:38AM 1 point [-]

But should we not expect coordinator organizations to be the ones best placed to have considered the issue?

My impression is that they have developed their view over a fairly long time period after a lot of thought and experience.

Comment author: Evan_Gaensbauer 08 May 2018 10:28:54AM 2 points [-]

Yes, but I think the current process isn't inclusive of input from as many EA organizations as it could or should be. It may be as simple as the CEA having offices in Berkeley and Oxford, meaning they receive a disproportionate amount of input on EA from organizations near those offices, as opposed to EA organizations whose staff are geographically distributed and/or don't have an office presence near the CEA. I think the CEA should still be at the centre of making these decisions, and after recent feedback from Max Dalton of the CEA on the EA Handbook 2.0, I expect they will run a more inclusive feedback process for outreach materials.

Comment author: Maxdalton 07 May 2018 01:39:06PM 7 points [-]

Thanks for the comments Evan. First, I want to apologize for not seeking broader consultation earlier. This was clearly a mistake.

My plan now is to do as you suggest: talk to other actors in EA and get their feedback on what to include etc. Obviously any compromise is going to leave some unhappy - different groups do just favour different presentations of EA, so it seems unlikely to me that we will get a fully independent presentation that will please everyone. I also worry that democracy is not well suited to editorial decisions, and that the "electorate" of EA is ill-defined. If the full compromise approach fails, I think it would be best to release a CEA-branded resource which incorporates most of the feedback above. This option also seems to me to be cooperative, and to avoid harm to the fidelity of EA's message, but I might be missing something.

Comment author: Evan_Gaensbauer 07 May 2018 01:45:27PM 5 points [-]

Thanks for responding, Max. I agree that consulting some key actors without going through a democratic process makes sense. I appreciate you being able to respond to and incorporate all the feedback you're receiving so quickly.

Comment author: Evan_Gaensbauer 06 May 2018 09:12:31PM *  6 points [-]

The tension between over- and under-dedication on the part of individuals, and finding the best balance for the needs of EA, has always been at the heart of the movement: it's called Giving What We Can for a reason, and Singer's book was called The Most Good You Can Do, not The Most Good You Should Do. I think it's the experience of some that the message of EA, combined with a call for dedication, can be so overwhelming that many people feel compelled to do more than they can; burn out; and feel dejected enough by their self-perceived failure that they can't summon as much dedication as they did before. Unfortunately, as you point out, de-emphasizing dedication doesn't produce effective altruists who are dedicated as much as they sustainably can be, but effective altruists who aren't dedicated once the clock at the office runs out.

Now that EA has been around for a few years, even if it were slowing down (which it isn't), it'll be around for a while longer. So we can take a longer-term view of investing resources in individual effective altruists. I don't think dedication is fixed: newcomers to EA don't come in either with a level of dedication we can work with, or not, and that's the end of it. Given that we can expect some individuals to be dedicated to community projects indefinitely, we can foster a growth of dedication in effective altruists. To prevent outcomes like burnout while fostering increased dedication, I think it's the responsibility of existing community members to create evidence-based tool-kits for how to do so: one for fostering dedication, and a separate one for sustainable self-care. Currently we have neither, but that might be because not enough effective altruists were interested in creating them. Pushing 'excited altruism' over 'dedicated altruism' often seems motivated by PR concerns about not being off-putting to newcomers, but that's based on assumptions about what kinds of people we should be reaching out to and bringing into the movement in the first place.
Of all the pieces written cautioning against both the appearance and reality of over-exertion in EA, I think the best is A Defense of Normality by Eric Herboso. A difference between Eric's post and others is that his comes from a motivation of both fostering dedication and caring about the sustainable, long-term well-being of individual community members, so that dedication, and the work those effective altruists do, is itself sustainable. This is unlike other motivations for de-emphasizing dedication in EA, as you mentioned above:

  • What topics are researched in an organization. Does it lean more towards what the researchers find fun or towards what will help the most people?

  • Management choices. Does one hire/fund someone they are friends with or do they hire/fund the strongest applicant chosen by more objective criteria?

  • Strategic direction. Does the person point their organization in a direction that might be higher impact but less personally or organizationally prestigious?

At the time Eric wrote his post, Peter Hurford and others commented that the level of dedication Eric was observing was past the point of diminishing marginal returns to productivity, and would also harm health and well-being, if (some) effective altruists were to dedicate themselves as much as the picture Eric was painting. It might be that Eric was observing other effective altruists personally over-correcting for what they saw as a decline in the average level of dedication in the community. De-emphasizing dedication in EA on the shortsighted grounds of doing what's more comfortable, fun or prestigious, even if it's not the best plan for doing the most good from whatever perspective, is the problem to solve. Making EA appear designed to draw in people who want to feel comfortable, have fun or gain prestige, without too much dedication, may be drawing in people who aren't willing to up their dedication when EA stops being fun, comfortable or prestigious. They may not want to become more dedicated, because when they joined the movement they were told they wouldn't have to be.

Having been a past moderator of the EA Peer Support group on Facebook, and a long-time community organizer, I've seen the toll on hundreds of effective altruists of the mixed message to dedicate yourself while doubting everything you're doing, and to do so in a self-effacing way so it's not off-putting to anyone else. I'd like to construct effective, robust, evidence-based tool-kits both for fostering a growth in dedication and for long-term sustainable self-care suited to the unique demands of effective altruism. If anyone is interested in such a project, I welcome contributions; don't hesitate to contact me.
