
Effective Altruism as a Market in Moral Goods – Introduction

This is post 1 of a 5-part series, in which I tentatively apply the market and network concepts in the table below (links to definitions/examples) in the hope of gaining a better applied understanding of how the EA community operates. I welcome your feedback in the comments section throughout.   1) Introduction 2)...
Comment author: DardanBastiaan 04 July 2017 11:33:15AM 1 point

Looking at the list of friends of the Leiden chapter, I am impressed both with the number of people on it and with the talent I know some of these folk to possess. On the other hand, the activities planned and put out there over these past few months seem to have been largely unsuccessful, judging for example by the 2 people who attended the last event.

There is a middle road, I think, combining the best of both our views. Rather than having a core group hosting activities which very few would attend, I envision a core group that first raises awareness throughout Leiden and otherwise works pragmatically, which could include hosting events, to further the EA agenda. If some of these people are anything like I know them to be, i.e. highly talented, motivated individuals, then we'd be able to stick together based on our shared passion and desire for a stimulating environment alone. The risk of such a group falling apart would in that sense be far smaller than I estimated it to be in my previous post.

Anyway, again, I very much look forward to working more closely with you and those already active in Leiden.

Comment author: remmelt  (EA Profile) 04 July 2017 02:03:03PM 0 points

Yes, I think connecting with potentially interested people on existing platforms makes sense for local groups, as an example. The subtle difference for me is that you wouldn't try to 'convert' the entire existing network but instead have targeted conversations with participants (e.g. talking with altruistic, analytical people at an Amnesty event or inviting people to schedule a cup of coffee at the end of your own event).

I'll connect you with the current organiser. Looking forward to exploring this idea further with you!

Comment author: remmelt  (EA Profile) 03 July 2017 10:49:24PM 2 points

Thanks for the points!

First off, you might be interested in helping continue the EA Leiden group (the current organiser has just finished her Masters and is going back to Germany): https://www.facebook.com/?_rdr#~!/profile.php?id=100015874785676 Please let me know if you want me to connect you with her.

Comment author: remmelt  (EA Profile) 03 July 2017 11:19:34PM 3 points

Second, I lean towards focusing on enabling a small number of highly committed and capable people in a network instead of trying to shift hundreds of people towards EA.

Besides the outsized impact that these few individuals can have, the time cost of coordinating a large group of slightly motivated people (as you alluded to), and the difficulty of fostering a rigorous EA culture and network effects within such a group for those for whom the concept 'clicks', mean that I personally have a strong preference for quality over quantity (similar to Kevin Kelly's 1,000 True Fans concept or Y Combinator's advice to focus on making initial customers love the product).

To some extent, EAN's strategy leans this way because we focus on building EA Networks instead of influencing existing networks.

The broad reasoning Sjir and I have done (based in part on useful advice from others in the community) is built on layers of unproven assumptions. I can imagine counterexamples for local groups, such as using low-cost, low-bar ways of getting people acquainted with EA – like pub socials – to help build up a core circle of people.

In general, I want to be wary though of aiming for short-term effects by collecting many people instead of building up our collective capacity to solve big problems.

Comment author: DardanBastiaan 03 July 2017 06:06:02PM 3 points

First off: sign me up. There's a bunch of (potentially) relevant networks I'm connected to in Leiden, e.g. Amnesty, ISN (International Student Network), LDU (Leiden Debating Union), EUSA (European Student Association), but in which I have not yet been able to push through the change I should have pushed harder for. For example, I once had the idea to set up a debating tournament so as not only to raise awareness, but to do so amongst those who would be most open to its message and most able to then do something about it, namely debaters. This particular idea might not be as feasible as I once thought, but there are always other opportunities to be thought of.

Here are potential network collaborations that we're exploring right now:

Local/student groups – These would clearly define their target groups and offer newcomers a path to learn about EA principles and build up their capacity to do good (established groups like EA London and EA Berkeley are inspirations to us here).

Based on having been in a book club with a community of ~300 individuals, I know it's really difficult to get a large group of people to be consistently involved. It follows that this strategy has a high risk of failing, which would demotivate those involved from staying involved in EA. If you want to go ahead with this, it needs to be thought out well.



Testing an EA network-building strategy in the Netherlands

Last January, Effective Altruism Netherlands (EAN) became a registered charity in the Netherlands. The organisation consists of a three-person board and two full-time employees, Sjir Hoeijmakers and yours truly (Remmelt Ellen). Note: as of writing, we are still fundraising to cover our salaries. On 28 May, we publicly launched with...
Comment author: remmelt  (EA Profile) 21 June 2017 04:44:48PM 3 points

To keep it short: your articles on community-building (especially the social dynamics that come into play) have been highly valuable for me.

Comment author: remmelt  (EA Profile) 06 May 2017 04:21:42PM 1 point

I appreciate this article because it makes these emotional problems – and ways to prevent and deal with them – visible, and dispels the impression that we're all rational, calculating evaluators all of the time. I recall 2 cases in the past year of people in the EA community whom I chatted with online who seemed (disclaimer: this is my amateurish reading of their extreme behaviour) to experience mania and/or psychosis at some point.

Comment author: remmelt  (EA Profile) 27 April 2017 10:07:51AM 4 points

I thought this was a really useful framework for looking at the system level. Thank you for posting this!

Quick points after just reading through it:

1) Your phrasing seems to convey too much certainty to me / flows too readily into a coherent story. I'm not sure if you did this to bring across your points more strongly or because that's the confidence level you have in your arguments.

2)

If you want to acquire control over something, that implies that you think you can manage it more sensibly than whoever is in control already.

To me, it appears that you view Holden's position of influence at OpenAI as something like a zero-sum alpha investment decision (where his amount of control replaces someone else's commensurate control). I don't see why Holden couldn't also have a supportive role, where his feedback and different perspectives help OpenAI correct for aspects they've overlooked.

3) Overall principle I got from this: correct for model error through external data and outside views.

Comment author: Kerry_Vaughan 21 April 2017 04:53:04PM 8 points

As much as I admire the care that has been put into EA Funds (e.g. the 'Why might you choose not to donate to this fund?' heading for each fund), this sentence came across as 'too easy' to me. To be honest, it made me wonder whether the analysis was self-critical enough (I admit to having only scanned it), as I'd be surprised if the trusted people you spoke with couldn't think of any significant risks. I also don't think a 'largely positive' reception is a good indicator.

I agree. This was a mistake on my part. I was implicitly thinking about some of the recent feedback I'd read on Facebook and was not thinking about responses to the initial launch post.

I agree that it's not fair to say that the criticisms have been predominantly about website copy. I've changed the relevant section in the post to include links to some of the concerns we received in the launch post.

I'd like to develop some content for the EA Funds website that goes into potential harms of EA Funds that are separate from the question of whether EA Funds is the best option right now for individual donors. Do you have a sense of what concerns seem most compelling or that you'd particularly like to see covered?

Comment author: remmelt  (EA Profile) 27 April 2017 07:46:31AM 1 point

I forgot to do a disclosure here (to reveal potential bias):

I'm working on the EA Community Safety Net project with other committed people, which just started on 31 March. We're now shifting direction from focusing on peer insurance against income loss to building a broader peer-funding platform in a Slack team that also includes project funding and loans.

It will likely fail to become a thriving platform that hosts multiple financial instruments given the complexities involved and the past project failures I've seen on .impact. Having said that, we're aiming high and I'm guessing there's a 20% chance that it will succeed.

I'd especially be interested in hearing people's thoughts on structuring the application form (i.e. the criteria in the project framework) so as to reduce Unilateralist's Curse scenarios as much as possible (and other stupid things we could cause as entrepreneurial creators moving away from the status quo).

Is there actually a list of 'bad strategies naive EAs could think of' where there's a consensus amongst researchers that one party's decision to pursue one of them would create systemic damage in expectation? A short checklist (that I could go through before making an important decision) based on surveys would be really useful to me.

Come to think of it: I'll start with a quick Facebook poll in the general EA group. That sounds useful for compiling an initial list.

Any other opinions on preventing risks are very welcome. I'm painfully aware of my ignorance here.


Comment author: remmelt  (EA Profile) 27 April 2017 12:52:21AM 1 point

I haven't looked much into this, but basically I'm wondering if simple, uniform promotion of EA Funds would undermine the capacity of community members in, say, the upper quartile of rationality/commitment to build robust idea-sharing and collaboration networks.

In other words, whether it would decrease their collective intelligence when it comes to solving cause-selection problems. I'm really interested in getting practical insights on improving the collective intelligence of a community (please send me links: remmeltellenis[at]gmail.dot.com).

My earlier comment seems related to this:

Put simply, I wonder if going for a) centralisation would make the 'system' fragile, because EA donors would be less inclined to build up their awareness of big risks. For those individual donors who'd approach cause-selection with rigour and epistemic humility, I can see b) being antifragile. But for those approaching it amateurishly/sloppily, it makes sense to me that they're much better off handing over their money and employing their skills elsewhere.

(Btw, I admire your openness to improving analysis here.)
