Comment author: RobinHanson 12 May 2018 01:24:19PM *  8 points [-]

You seem to be comparing prediction markets to perfection, not to the real mechanisms that we now use today instead. People proposing prediction markets are suggesting they'd work better than the status quo. They are usually not comparing them to something like GJP.

Comment author: Denise_Melchin 12 May 2018 05:02:46PM *  8 points [-]

I agree with you that prediction markets are in many cases better than the status quo. I'm not comparing prediction markets to perfection but to their alternatives (like extremizing team forecasts). I'm also only arguing that prediction markets are overrated within EA, not in the wider world. I'd assume they're underrated outside of libertarian-friendly circles.

All in all, for which problems prediction markets do better than which alternatives is an empirical question, which I state in the post:

How stringently the conditions for market efficiency need to be met for a market to actually be efficient is an empirical question. How efficient a prediction market needs to be to give better forecasts than the alternatives is another one.

Do you disagree that in the specific examples I have given (an office prediction market about the timeline of a project, an election prediction market) having a prediction market is worse than the alternatives?

It would be good if you could give concrete examples where you expect prediction markets to be the best alternative.

Prediction markets are a neat concept, and are often regarded highly in the EA sphere. I think they are often not the best alternative for a given problem and are insufficiently compared to those alternatives within EA. Perhaps this is because they are such a neat concept - "let's just do a prediction market!" sounds a lot more exciting than discussing a problem in a team and extremizing the team's forecast, even though a prediction market would be a lot more work.
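For readers unfamiliar with the alternative mentioned above, here is a minimal sketch of one common way to extremize an aggregated team forecast; the exponent and the example probabilities are illustrative assumptions, not a recommendation:

```python
def extremize(p, a=2.5):
    """Push an aggregated probability away from 0.5.

    One common formulation: raise the odds to the power `a` and convert
    back to a probability. a > 1 extremizes, a = 1 leaves p unchanged.
    """
    return p**a / (p**a + (1 - p)**a)

# Example: average several team members' probability estimates, then
# extremize the mean to counteract individual underconfidence.
team_estimates = [0.7, 0.6, 0.8, 0.65]  # illustrative numbers
mean_estimate = sum(team_estimates) / len(team_estimates)
print(round(extremize(mean_estimate), 3))  # ~0.878, vs. a plain mean of ~0.688
```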


Against prediction markets

Within the EA sphere, prediction markets have often been championed as a good solution for forecasting the future. Improved forecasting has been discussed many times as a cause area for humanity to make better judgements and generally improve institutional decision making. In this post, I will argue that prediction...
Comment author: KarolinaSarek 06 May 2018 07:03:47PM 4 points [-]

Thank you, Joey, for gathering this data. And thank you, Darius, for providing us with suggestions for reducing this risk. I agree that further research on the causes of value drift and how to avoid it is needed. If the phenomenon is explained correctly, that could be a great asset to EA community building. But regardless of this explanation, your suggestions are valuable.

It seems to be a generally complex problem because retention encapsulates the phenomenon in which a person develops an identity, skill set, and consistent motivation or dedication to significantly change the course of their life. CEA in their recent model of community building framed it as resources, dedication, and realization.

Decreasing retention is also observed in many other social movements. Some insights about how it happens can be gleaned from the sociological literature. Although it is still underexplored and the sociological analysis might be of mediocre quality, it might still be useful to have a look at it. For example, this analysis implies that a “movement’s ability to sustain itself is a deeply interactive question predicted by its relationship to its participants: their availability, their relationships to others, and the organization’s capacity to make them feel empowered, obligated, and invested.”

Additional aspects of value drift to consider on an individual level that might not be relevant to other social movements: mental health and well-being, pathological altruism, purchasing fuzzies and utilons separately.

The reasons for value drift away from EA seem as important for understanding the process as the value drift that led people to EA in the first place. For example, in Joey's post he gave the illustrative story of Alice. What could explain her value drift is that people during their first year of college are more prone to social pressure and a need for belonging. That could be what made her become an EA, and what made her drift when she left college and her EA peers. So "surround yourself with value-aligned people" for the whole course of your life. That also stresses the untapped potential of local groups outside the main EA hubs. For this reason, local groups are worth considering even if, when it comes to outreach, we shouldn't rush to translate effective altruism.

About the data itself: we might be drawing the wrong inferences in trying to explain it, because it shows only a fraction of the process. Maybe if we observed the curve of engagement over a longer period it would fluctuate, e.g. 50% in the first 2-5 years, 10% in the 6th year, 1% for the next 2-3, and then coming back up to 10%, 50%, etc. We might hypothesize that someone's life situation influences their baseline engagement for a short period (1 month to 3 years). Analogous to the changes in a person's baseline of happiness after life events, as explained by hedonic adaptation, maybe we have something like altruistic adaptation, which shifts after a significant life event (changing cities, marriage, etc.) and then comes back to baseline.

Additionally, since the level of engagement in EA and the other significant variables do not correlate perfectly, the data could also be partly explained by regression to the mean. If some EAs were hardcore at the beginning, they will tend to be closer to the average at a second measurement, so from 50% to 10%, and those at 10% to 1%. Even so, it seems more likely than not that value drift is real.
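To make the regression-to-the-mean point concrete, here is a minimal toy simulation (all numbers and distributions are illustrative assumptions): if measured engagement is a stable disposition plus noise, the people who look most engaged at the first measurement will, on average, look less engaged at the second one even though nothing about them has changed.

```python
import random

random.seed(0)

# Toy model: measured engagement = stable disposition + independent noise.
n = 10_000
disposition = [random.gauss(0.5, 0.15) for _ in range(n)]
measure = lambda d: d + random.gauss(0, 0.15)

t1 = [measure(d) for d in disposition]  # first measurement
t2 = [measure(d) for d in disposition]  # second measurement

# Look only at the people who appeared most engaged at time 1.
top = sorted(range(n), key=lambda i: t1[i], reverse=True)[: n // 10]
avg = lambda xs: sum(xs) / len(xs)

print(round(avg([t1[i] for i in top]), 2))  # high, partly due to lucky noise
print(round(avg([t2[i] for i in top]), 2))  # noticeably lower: regression to the mean
```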

More could be done about value drift on the structural level, e.g. it might also be explained by the main bottlenecks in the community itself, like the mid-tier trap (e.g. too good for running a local group, but not good enough to be hired by the main EA organizations -> multiple unsuccessful job applications -> frustration -> drop out).

Because the mechanism of value drift would determine the strategies to minimize its risk or harm, and because the EA community might not be representative of other social movements, we should systematically and empirically explore these and other factors in order to find the 80/20 of long-lasting commitment.

Comment author: Denise_Melchin 11 May 2018 06:29:53PM *  6 points [-]

More could be done about value drift on the structural level, e.g. it might also be explained by the main bottlenecks in the community itself, like the mid-tier trap (e.g. too good for running a local group, but not good enough to be hired by the main EA organizations -> multiple unsuccessful job applications -> frustration -> drop out).

Doing effective altruistic things ≠ Doing Effective Altruism™ things

All the main Effective Altruism orgs together employ only a few dozen people. There are two orders of magnitude more people interested in Effective Altruism. They can't all work at the main EA orgs.

There are lots of highly impactful opportunities out there that aren't branded as EA - check out the career profiles on 80,000hours for reference. Academia, politics, tech startups, doing EtG in random places, etc.

We should be interested in having as high an impact as possible and not in 'performing EA-ness'.

I do think that EA orgs dominate the conversations within the EA sphere, which can lead to this unfortunate effect where people quite understandably feel that the best thing they can do is work there (or at an 'EA-approved' workplace like DeepMind or Jane Street) - or nothing. That's counterproductive and sad.

A potential explanation: it's difficult for people to evaluate the highly impactful positions in other fields. Therefore the few organisations and firms we can all agree are Effectively Altruistic get a disproportionate amount of attention and 'status'.

As a community, we should try to encourage people to find the highest-impact opportunity for them out of the many possible options, of which working at EA orgs is only a tiny fraction.

Comment author: Denise_Melchin 09 May 2018 09:18:25PM *  9 points [-]

Not sure I agree with this. Certainly there is less focus on donating huge sums of money, but that may also be explained by the shift to EA orgs now often recommending direct work. But I think the EA community as a whole now focusses less on attracting huge amounts of people and more on keeping the existing members engaged and dedicated and influencing their career choice (if I remember correctly, the strategy write-ups from both CEA and EAF seem to reflect this).

For instance, the recent strategy write-up by CEA mentions dedication as an important factor:

We can think of the amount of good someone can be expected to do as being the product of three factors (in a mathematical sense):

1. Resources: The extent of the resources (money, useful labor, etc.) they have to offer;
2. Dedication: The proportion of these resources that are devoted to helping;
3. Realization: How efficiently the resources devoted to helping are used
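As a toy illustration of that multiplicative framing (the scores below are made up for the example, not CEA's): because the factors multiply, a low value on any one of them caps expected impact no matter how high the others are.

```python
def expected_good(resources, dedication, realization):
    # CEA's framing above: expected good is the product of the three factors.
    return resources * dedication * realization

# Made-up scores on a 0-1 scale, purely for illustration.
print(round(expected_good(resources=0.9, dedication=0.9, realization=0.9), 3))  # 0.729
print(round(expected_good(resources=0.9, dedication=0.1, realization=0.9), 3))  # 0.081 - low dedication dominates
```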

(top level comment to not make the thread even more messy)

When we talk about dedication and what that looks like in people, I think we can have very different images in mind. We could think of a 'dedicated EA' and think of two different archetypes (of course, reality is more messy than that and people might actually be both):

Person A talks about dedicating their life to having a high impact, about the willingness for self-sacrifice, about optimising everything for this one goal. They're very enthusiastic, think about all their options to do good and talk about nothing but EA.

Person B is careful and measured. They think about how they can use their career and other resources to have a very high impact, and about the long road to being in a highly impactful position at a later point in their career. They want to make sure they get there by having a proper work-life balance in the process.

When I say (and I think this is true for Joey as well) that EA emphasises dedication much less, I think about dedication in the way that person A embodies. I think CEA in their material think about dedication more in the way of Person B.

EA was much smaller and less professional in the past. That also meant that the 'highest status' positions were much more easily accessible. When I met Joey in 2013, he was interning at 80,000 Hours and then started his own project with Charity Science, and people thought highly of him for that. Now it is no longer possible to easily intern or volunteer at high-profile EA orgs ('management capacity constraints'). Easily accessible positions still exist, but due to the professionalisation and growth of the EA movement they're less 'high status' and therefore less appealing.

The type of people like Joey who just went out and started their own projects they were enthusiastic about are also, relatively speaking (compared to the now 'high status' EA endeavours), less likely to get funding today. I think this might be where some part of the conflict about funding constraints, and about whether small student-y projects are worth funding or not, is actually coming from - do we want to support an EA culture where we encourage young people to do random EA projects? Or do we want to foster a professional environment?

I think the move towards professionalising EA has been correct, but we should be aware of the costs it has imposed on people who liked the young, dedicated, person A vibe EA had in the past. One alternative name proposal for EA was 'super hardcore do-gooder' - unthinkable today.

Comment author: Denise_Melchin 06 May 2018 09:15:30AM *  16 points [-]

I’m curious what kind of experiences people in the dedicated group actually had that put them off, if you could elaborate on that.

I share the impression that dedication is less encouraged in EA these days than five years ago. I’m also personally very disappointed by that, since high dedication felt like a major asset I could bring to EA. Now I feel more like it doesn’t matter, which is discouraging.

My guess is that this is because high dedication is a trait of youth movements, and the age of the median - and perhaps more importantly the most influential - EAs has gone up in the meantime. EA has lost its youth-movement-y vibe.

I’m also interested whether the other movements you’re comparing EA to are youth movements?

Comment author: Denise_Melchin 06 May 2018 09:26:06AM *  9 points [-]

Another factor leading to dedication being emphasized less might be that people are less motivated to be dedicated these days. The growth of the movement and the funding available have resulted in an individual’s EA contributions mattering far less than they used to.

The increased concern about downside risk has also made it much harder to ‘use up’ your dedication. A few years ago you could at least always do some outreach - now it’s commonly considered far less clear that the sign on that is positive.

Comment author: [deleted] 03 May 2018 07:51:06PM *  3 points [-]

Thanks for the post :)

If we make any kind of reasonable assumptions about renting, house price increases and mortgage repayments, it makes a lot of sense for people to save to purchase their own home as soon as possible.

Could you provide a source for this claim? If this were true, we would expect that it's possible to make a lot of money by buying property and renting it out. This would imply that the market for housing is hugely inefficient.

In response to comment by [deleted] on Giving Later in Life: Giving More
Comment author: Denise_Melchin 03 May 2018 08:11:09PM 4 points [-]

I think this claim is often true if someone wants to stay in the same location - however, committing to one location can be very costly for someone’s career.

Considering EA’s focus on ‘having a good career’, for which the willingness to move is important, buying a property seems much less likely to be a good call for EAs than for the average person. Unless moving whenever a better opportunity arises is not something you’re willing to do anyway, of course.

Comment author: Peter_Hurford  (EA Profile) 02 May 2018 05:34:51PM 4 points [-]

Maybe my view of the landscape is naive, but it appears to me that a lot of spaces these days have effectively just one or two funders that can actually fund a project (e.g., Elie for poverty interventions, Lewis + ACE for nonhuman animal interventions, Nick for AI interventions, and Nick + CEA for community projects and I imagine these two groups confer significantly). I don't think we need dozens of funders, but I think the optimal number would be closer to three or four people that think somewhat differently and confer only loosely, rather than one or two people.

Comment author: Denise_Melchin 02 May 2018 10:58:34PM 1 point [-]

We do not disagree much then! The difference seems to come down to what the funding situation actually is and not how it should be.

I see a lot more than a couple of funders per cause area - why are you not counting all the EtGers? Most projects don’t need access to large funders.

Comment author: Denise_Melchin 02 May 2018 10:24:25AM 1 point [-]

I don’t see having a (very) limited pool of funders judging your project as such a negative thing. As has been pointed out before, evaluating projects is very time intensive.

You’re also implicitly assuming that there’s little information in funders’ rejections. I think if you have been rejected by 3+ funders - and hopefully got a good sense of why - you should seriously reconsider your project.

Otherwise you might fall prey to the unilateralist’s curse - most people think your project is not worth funding, possibly because it has some risk of causing harm (either directly, or indirectly by stopping others from taking up a similar space), but you only need one person who is not dissuaded by that.
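To put a rough number on that dynamic, here is a minimal sketch with made-up probabilities: if each funder independently has some small chance of (mistakenly) approving a net-negative project, the chance that at least one of them funds it grows quickly with the number of funders evaluating it independently.

```python
def p_funded_by_someone(p_mistaken_approval, n_independent_funders):
    # Probability that at least one of n independent funders approves,
    # assuming each approves (mistakenly) with the same probability.
    return 1 - (1 - p_mistaken_approval) ** n_independent_funders

# Illustrative: a project most evaluators would reject (10% approval chance each).
for n in [1, 3, 10, 30]:
    print(n, round(p_funded_by_someone(0.1, n), 2))
# -> 1 0.1, 3 0.27, 10 0.65, 30 0.96
```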

Comment author: Joey 24 April 2018 09:59:41PM 5 points [-]

I agree that implementation difficulties, particularly long-term ones (e.g. losing a visa for a place you were living in with a big EA community), can muddy the waters a lot. It's hard to get into the details, but I would generally consider someone not drifted if it was clearly a capacity-affecting thing (e.g. they got carpal tunnel) but outside of that they are working on the same projects they would have wanted to in all cases.

A more nuanced view might be to break it down into: “Value change away from EA” - defined as changing fundamental ethical views, maybe changing to valuing people within your country more than outside of it. “Action change away from EA” - defined as changing one of the fundamental applications of your still similarly held values. Maybe you think being veg is good, but you are no longer veg due to moving to a different, less conducive living situation.

With short- and long-term versions of both, and with it being pretty likely that “value change” would lead to “action change” over time, I used value drift as a catch-all for both of the above. It’s also how I have commonly heard it used, but I am open to changing the term to be more descriptive.

“As the EA community we should treat people sharing goals and values of EA but finding it hard to act towards implementing them very differently to people simply not sharing our goals and values anymore. Those groups require different responses.”

I strongly agree. These seem to be very different groups. I also think you could even break it down further into “EAs who rationalize doing a bad thing as the most ethical thing” and “EAs who accept as humans that they have multiple drives they need to trade off between”. Most of my suggestions in the post are aimed at actions one could take now that reduce both “action change” and “value change”. Once someone has changed I am less sure about what the way forward is, but I think that could warrant more EA thought (e.g. how to re-engage someone who was disconnected for logistical reasons).

On ii)

Sorry to hear you have had trouble with the EA community and children. I think it's one of the life changes that EAs generally update too strongly on, and assuming that a person (of any gender) will definitely value drift upon having children is clearly incorrect. Personally I have found the EAs I have spoken to who have kids to be unusually reflective about its effects on them compared to other similar life changes, perhaps because it has been talked about more in EA than, say, partner choice or moving cities. When a couple who plans to have kids has kids and changes their life around that in standard/expected ways, I do not see that as a value drift from their previous state (of planning to have kids and planning to have life changes around that).

I also think people will run into problems pretty quickly if they assume that every time someone goes through a life change they will change radically and become less EA. I see it intuitively as more of a Bayesian prior. If someone has been involved in EA for a week and is then not involved for 2 weeks, it might be sane to consider the possibility that they're not coming back. On the flip side, if an EA has been involved for years and was not involved for 2 weeks, people would think nothing of it. The same holds true for large life changes. It's more about the person's pattern of long-term behavior and a combined "overall" perspective.
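A toy model of that prior, with all probabilities invented purely for illustration: the same two weeks of silence should move your belief a lot for a week-old member and barely at all for someone who has been engaged for years, simply because the priors differ.

```python
def p_drifted_given_inactive(prior_drift,
                             p_inactive_if_drifted=0.9,
                             p_inactive_if_engaged=0.3):
    # Bayes' rule: P(drifted | 2 weeks of inactivity), under invented likelihoods.
    p_inactive = (prior_drift * p_inactive_if_drifted
                  + (1 - prior_drift) * p_inactive_if_engaged)
    return prior_drift * p_inactive_if_drifted / p_inactive

# Invented priors: a brand-new member vs. someone engaged for years.
print(round(p_drifted_given_inactive(prior_drift=0.5), 2))   # ~0.75
print(round(p_drifted_given_inactive(prior_drift=0.05), 2))  # ~0.14
```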

My list of concerns about a new trend of EA’s “relaying information about opportunities only informally” is so long it will have to be reserved for a whole other blog post.

Comment author: Denise_Melchin 27 April 2018 01:33:40PM 7 points [-]

I still think you're focussing too much on changed values as opposed to implementation difficulties (I consider lack of motivation an example of those).

With short- and long-term versions of both, and with it being pretty likely that “value change” would lead to “action change” over time

I think it's actually usually the other way around - action change comes first, and then value change is a result of that. This also seems to be true for your hypothetical Alice in your comment above. AFAIK it's a known psychology result that people don't really base their actions on their values, but instead derive their values from their actions.

All in all, I consider the ability to have a high impact EA-wise much more related to someone's environment than to someone's 'true self with the right values'. I would therefore frame the focus on how to get people to have a high impact somewhat differently: How can we set up supportive environments so people are able to execute the necessary actions for having a high impact?

And not: how can we lock people in so they don't change their values - though the actual answers to those questions might not be that different.
