Why It’s Important to Know the Risk of Value Drift
The concept of value drift is that, over time, people become less motivated to do altruistic things. This is not to be confused with changing cause areas or methods of doing good. Value drift has strong precedents in related areas, both ethical commitments (such as staying vegetarian) and goals that generally take willpower (such as maintaining a healthy weight).
Value drift seems likely to be a concern for many EAs, and if it is common, it should substantially affect career and donation plans.
For example, if value drift rarely happens, putting money into a savings account with the intent of donating it later might be basically as good as putting it into a donor-advised fund. However, if the risk of value drift is higher, a dollar in a savings account is more likely to later be used for non-altruistic reasons, and thus not nearly as good as a dollar put into a donor-advised fund, where it is very hard for it to go anywhere but a registered charity.
In a career context, a plan such as building career capital for 8 years and then moving into an altruistic job looks much better if value drift is rare than if it is common. The more common value drift is, the stronger near-term-focused impact plans are relative to longer-term-focused ones. For example, you might get an entry-level position at a charity and build capacity through work experience. This can be slower at building your CV than getting a degree or working in a low-impact but high-prestige field (though not always). However, it has impact right away, which matters more if the risk of value drift is high.
The Data
Despite its relevance to these important questions, value drift has rarely been discussed or studied. One reason it is so under-studied is that it takes a long time to gather good data.
I have been in the EA movement for ~5 years. I decided to pool some data on contacts whom I met in my first year of EA. I only included people who would have called themselves EAs for 6 months or longer (I would not include someone who was only into EA for a month and then disappeared), and who took some sort of EA action (working for an EA org, taking the GWWC pledge, running an EA group). I also only included people whom I knew and kept in touch with well enough to know what happened to them (even if they left the EA movement). It is ultimately a convenience sample, but it was drawn from working for 4 current EA orgs and living in 4 different countries over that time, so it's not focused on a single location or organization.
I also broke the group down into ~10% donors and ~50% donors, because I have often heard people express more or less concern about one of these groups vs. the other. These broad groups are not just people earning to give. Someone who is working heavy hours for an EA organization and making most of their life decisions with EA as their number one priority would be counted in the 50% group. Someone running an EA chapter who makes decisions with EA as a factor, but prioritizes other factors above it, would be put in the 10% group. The percentages are rough proxies for how important EA is in these people's lives, not strictly financial donations. I did not count changing cause areas as value drift (e.g. changing from donating 10% to MIRI to donating 10% to AMF) -- only different levels of overall altruistic involvement.
The results over 5 years are as follows:
16 people were ~50% donors → 9/16 stayed around 50%
22 people were ~10% donors → 8/22 stayed around 10%
No one moved from the 10% category to the 50% category, and I only counted fairly noticeable changes (if someone changed their donations from 50% to 40%, I would not have the resolution to notice).
Value drift was high across both groups, with roughly 50% of the sample drifting over 5 years. I talked to many of those people about value drift and their thoughts on long-term altruism, and most of them, like most people I talk to now, had previously been very confident that they would stay altruistic in the long term. The reasons people value drifted were mixed, with no clear consistent source, although life changes were a large factor for many (e.g. moving from university to the workforce, changing cities or workplaces, marrying, or having kids).
Interestingly, this data also sheds a little light on concerns about "pushing yourself too hard" vs. "taking it too easy on yourself": the more involved or dedicated group value drifted noticeably less (7/16 ≈ 44% vs. 14/22 ≈ 64%).
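The retention numbers above can be double-checked with a quick calculation (a small sketch; the group labels and counts are the ones reported in this post):

```python
# Retention counts reported in the post: (stayed, total) over ~5 years.
groups = {
    "~50% donors": (9, 16),
    "~10% donors": (8, 22),
}

for name, (stayed, total) in groups.items():
    drifted = total - stayed
    print(f"{name}: {drifted}/{total} drifted ({drifted / total:.0%})")
# ~50% donors: 7/16 drifted (44%)
# ~10% donors: 14/22 drifted (64%)

# Overall drift across both groups combined.
total_stayed = sum(s for s, _ in groups.values())
total_people = sum(t for _, t in groups.values())
print(f"Overall: {(total_people - total_stayed) / total_people:.0%} drifted")
# Overall: 55% drifted
```

So "roughly 50% drifting over 5 years" is the combined figure across both groups.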
Discussion
Overall, these results seem pretty scary to me, especially since there's a natural selection effect: I tend to make friends with the more dedicated people and drift apart from those who leave the movement, so if anything these numbers may understate the problem. It's also worth noting that there were no particularly major controversies or problems in the EA movement that would cause more value drift over this period than any other. Historically, many EAs have been young and not yet starting families, so we've arguably been seeing a period of artificially low value drift that is not sustainable as the movement ages and goes through standard life changes.
Of course, the data could be much better quality, and I wish it had been measured more rigorously. I would be keen to see any other data people have along these lines. Despite quality concerns, I still think we can draw some conclusions from it, particularly given that people are already effectively drawing conclusions about the likelihood of value drift from no data at all. I also spoke to a few EAs who have been around the movement a while, and this data broadly fit their intuitions, which gives me more confidence that it's not 100% off the mark.
The implication of this data is that people should be cautious about deferring impact to later, and should set up commitment devices to help them stick to what they care about.
One example: be wary of building capacity for very long periods, particularly if the capacity is broad and leaves open appealing non-altruistic paths. Instead, see if you can build capacity in a way that also does good now, such as getting work experience with non-profits.
For instance, if you want to have direct impact, volunteering for an organization and demonstrating the quality of your work is often better than having a degree, especially an unrelated one. It's also substantially faster, and it does good directly. Degrees are largely a way to signal that you're a hard worker with a decent amount of intelligence; if all you are is a CV in somebody's inbox, that signal is very important. However, if you've been working alongside them for months, they'll already know these traits of yours.
Another way to build career capital with value drift in mind is to get experience and credentials that make it harder to work in a non-altruistic area. This could mean getting a degree in development economics instead of economics generally, or working for prestigious nonprofits instead of other prestigious organizations. Option value is great if the risk of value drift is low, but if it's high, option value makes it easier to slip. It's like only keeping healthy food in the house: if the only easy options are also altruistic, you're much more likely to stick it out over the long haul.
If your primary path to impact is donations and you want to keep value drift in mind, but you don't know where you want to give yet, don't save those donations. Put them into a donor-advised fund. That way, even if you become less altruistic in the future, you can't back out of the pledged donations and spend the money on a fancier wedding or a bigger house. You can also set up monthly donations, or ask your employer to automatically donate a preset portion of your income to charity before you even see it in your bank account.
Overall, if ~50% of the EAs I met 5 years ago have value drifted, this should factor into your plans. Nobody thinks they'll value drift, just like no teenager with a fast metabolism thinks they'll be the one who gains weight in middle age. By all means, indulge in junk food once in a while and don't constantly stress about calories, but put some time into setting up your life so it's easier to reach for a banana instead of the ice cream -- or, in this case, the altruistic path instead of the less altruistic one.
For a deeper dive into concrete ways to reduce value drift, check out this post.
I agree that implementation difficulties, particularly long-term ones (e.g. losing a visa for a place you were living in with a big EA community), can muddy the waters a lot. It's hard to get into the details, but I would generally consider someone not drifted if the change was clearly capacity-affecting (e.g. they got carpal tunnel) but they are otherwise working on the same projects they would have wanted to in any case.
A more nuanced view might break it down into: "Value change away from EA", defined as changing fundamental ethical views, e.g. coming to value people within your country more than those outside it; and "Action change away from EA", defined as changing one of the fundamental applications of your still similarly held values, e.g. you still think being veg is good, but you are no longer veg after moving to a different, less conducive living situation.
With short- and long-term versions of both, and with it being pretty likely that "value change" leads to "action change" over time, I used value drift as a catch-all for both of the above. It's also how I have commonly heard the term used, but I am open to changing it to something more descriptive.
“As the EA community we should treat people sharing goals and values of EA but finding it hard to act towards implementing them very differently to people simply not sharing our goals and values anymore. Those groups require different responses.”
I strongly agree. These seem to be very different groups. I also think you could break it down even further into "EAs who rationalize doing a bad thing as the most ethical thing" and "EAs who accept as humans that they have multiple drives they need to trade off between". Most of my suggestions in the post are aimed at actions one could take now that reduce both "action change" and "value change". Once someone has changed, I am less sure what the way forward is, but I think that could warrant more EA thought (e.g. how to re-engage someone who was disconnected for logistical reasons).
On ii)
Sorry to hear you have had trouble with the EA community and children. I think it's one of the life changes that EAs generally update too strongly on; assuming that a person (of any gender) will definitely value drift upon having children is clearly incorrect. Personally, I have found the EAs I have spoken to who have kids to be unusually reflective about the effects on them compared to other similar life changes, perhaps because it has been discussed more in EA than, say, partner choice or moving cities. When a couple who plans to have kids has them and changes their life around that in standard/expected ways, I do not see that as value drift from their previous state (of planning to have kids and planning the associated life changes).
I also think people will run into problems pretty quickly if they assume that every time someone goes through a life change, they will change radically and become less EA. I see it intuitively as more of a Bayesian prior. If someone has been involved in EA for a week and then is not involved for 2 weeks, it might be reasonable to consider the possibility that they are not coming back. On the flip side, if an EA has been involved for years and was not involved for 2 weeks, people would think nothing of it. The same holds true for large life changes. It's more about the person's pattern of long-term behavior and a combined "overall" perspective.
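The "Bayesian prior" intuition can be sketched numerically. All probabilities below are made up purely for illustration -- the point is only that the same evidence (2 weeks of absence) should move you much less for someone with a long track record:

```python
def posterior_drifted(prior_drift, p_absent_if_drifted=0.9, p_absent_if_engaged=0.3):
    """P(drifted | 2 weeks of absence) via Bayes' rule.

    The likelihoods are hypothetical: someone who has drifted is assumed
    far more likely to go quiet for 2 weeks than someone still engaged.
    """
    numerator = p_absent_if_drifted * prior_drift
    denominator = numerator + p_absent_if_engaged * (1 - prior_drift)
    return numerator / denominator

# Involved for a week: little track record, so a high prior on drifting.
print(f"New member: {posterior_drifted(prior_drift=0.5):.0%}")   # 75%
# Involved for years: strong track record, so a low prior on drifting.
print(f"Veteran:    {posterior_drifted(prior_drift=0.05):.0%}")  # 14%
```

Under these assumed numbers, 2 weeks of silence from a years-long EA barely moves the needle, while the same silence from a week-old member is substantial evidence.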
My list of concerns about a new trend of EAs "relaying information about opportunities only informally" is so long it will have to be reserved for a whole other blog post.
I still think you're focusing too much on changed values as opposed to implementation difficulties (I consider lack of motivation an example of those).
I think it's actually usually the other way around - action change comes first, and then value change results from it. This also seems to be true for your hypothetical Alice in your comment above. AFAIK it's a known psychology result that people don't really base thei...