Comment author: vollmer (EA Profile) 25 August 2018 11:47:37AM * 9 points

There is this list of essential EA resources: https://www.effectivealtruism.org/resources/

Comment author: baxterb 09 August 2018 01:58:39PM 3 points

Great point, and we should have mentioned more about our intention to track things like career or career-path changes as a result of the program. We don't currently have data on this because our audience is generally too young to show reliable signs of moving toward an effective career, but part of what we hope to accomplish with extended engagement (detailed in the Plans for Autumn 2018 section above) is to follow high-potential participants more closely so we can monitor changes like that.

We have had several participants state their intention to take actions like this to make a bigger impact, but it is uncertain whether they will follow through.

Comment author: vollmer (EA Profile) 10 August 2018 03:37:04PM 1 point

Thanks, makes sense! Would be great to see such data in the future, though I agree it seems hard to track.

Comment author: Peter_Hurford (EA Profile) 09 August 2018 01:58:47PM 3 points

Speaking just for myself here, I think tracking career outcomes for SHIC students is important, but Canadian high schoolers in affluent areas are typically 4-7 years away from being able to start a career, so this may take a while to track well. I also don't expect high schoolers to have meaningful and stable views on their careers, since it would be so early in their lives.

Comment author: vollmer (EA Profile) 10 August 2018 03:33:34PM * 0 points

Right, when I wrote "career plan changes" I mostly meant that they end up studying a subject different from their previous best guess (if they had one) at least partly for EA reasons. (Or at a different university, e.g. a top school.)

Comment author: vollmer (EA Profile) 09 August 2018 09:50:08AM * 4 points

Have you tried / considered tracking career plan changes, and if so, do you have any tentative results you could share? (If not, what's your reasoning for not focusing on this more?)

Comment author: weeatquince (EA Profile) 08 August 2018 11:17:18AM 4 points

Marek, well done on all of your hard work on this.

Separate from the managed funds, I really like the work that CEA is doing to help money be moved around the world to other EA charities. I would love to see more organisations on the list of places to which donations can be made through the EA Funds platform, e.g. REG, Animal Charity Evaluators, or Rethink Charity. Is this in the works?

https://app.effectivealtruism.org/donations/new/organizations

Comment author: vollmer (EA Profile) 09 August 2018 07:50:53AM 3 points

(To support REG, you can select "Stiftung für Effektiven Altruismus" (= Effective Altruism Foundation) from the list. If you'd like to restrict your donation to EAF's philanthropic advice (which includes REG, but also a new crypto fundraising project and an "impact masterclass" for ultra-high-net-worth individuals), just shoot us an email.)

Comment author: Ben_Todd 08 August 2018 06:04:37AM 4 points

I agree that would be ideal, but it doesn't seem like a high-priority feature. The risk-free one-year interest rate is about 2% at the moment (in Treasuries), so even if the money is delayed for a whole year, we're only talking about a gain of 2%, and probably more like 1% after transaction costs.

You could invest in the stock market instead, but the expected return is still probably only 1-5% per year (as I argue here: https://80000hours.org/2015/10/common-investing-mistakes-in-the-effective-altruism-community/). Plus, you then take on a major risk of losing a lot of the money, which would probably be pretty hard to explain to many of the users, the press, etc.

I expect the staff time spent adding and managing this feature could yield much more than a couple of percent of growth in the funds' impact in many other ways (e.g. the features Marek lists above).

Comment author: vollmer (EA Profile) 08 August 2018 03:27:57PM * 4 points

The sources you quote seem to suggest more like 5% in real annual returns (or 7% nominal), and you wrote "2-7% nominal returns". If you're investing $2m, that would be $40k-$140k per year. I'd expect this to cost maybe one week of staff time per year, so it might easily be worth the cost; the arithmetic is sketched below. (Mission hedging and more diversification would push this up further; fees and risk aversion would push it down. Overall, I don't expect these factors to be very strong, though.)
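For concreteness, a minimal sketch of this back-of-the-envelope comparison. Only the $2m balance and the 2-7% return range come from the thread; the staff-time cost is a hypothetical placeholder, not a figure from EA Funds.

```python
# Back-of-the-envelope from the figures above. The staff-time cost is an
# illustrative assumption, not a number from the thread or from EA Funds.
balance = 2_000_000                  # invested balance ($2m, as above)
low_rate, high_rate = 0.02, 0.07     # the "2-7% nominal returns" range

low_gain = balance * low_rate        # $40,000
high_gain = balance * high_rate      # $140,000

# Hypothetical fully-loaded cost of ~one week of staff time per year.
staff_cost = 5_000                   # placeholder; actual cost unknown

print(f"Gross annual gain: ${low_gain:,.0f} - ${high_gain:,.0f}")
print(f"Net of assumed staff cost: ${low_gain - staff_cost:,.0f} - ${high_gain - staff_cost:,.0f}")
```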

To me it seems that the difficulty of explaining this to the users is the stronger reason against implementing it. (Unless the users themselves can choose, but that would cost more staff time again.)

Comment author: MichaelPlant 06 August 2018 04:01:42PM 1 point

I'm not sure I see which direction you're coming from. If you're a symmetric person-affector (i.e. you reject the procreative asymmetry, the view that we're neutral about creating happy lives but against creating unhappy lives), then you don't think there's value in creating future life, good or bad. So neither x-risks nor s-risks are a concern.

Maybe you're thinking 'don't those with person-affecting views care about those who are going to exist anyway?' The answer is yes if you're a necessitarian (no if you're a presentist), but given that what we do changes who comes into existence, necessitarianism (which holds that you value the wellbeing of those who will exist anyway) collapses, in practice, into presentism (which holds that you value the wellbeing of those who exist right now).

Vollmer, the view that would care about the quality of the long-term future, but not about whether it happens at all, seems to be averagism.

Comment author: vollmer (EA Profile) 07 August 2018 08:06:56AM * 0 points

Right, sorry, I misread. I thought you were assuming some form of Epicureanism with concern for all future beings, not Epicureanism plus a person-affecting view.

Comment author: RandomEA 04 August 2018 06:12:11PM * 43 points

Here are ten reasons you might choose to work on near term causes. The first five are reasons you might think near term work is more important, while the latter five are reasons you might work on near term causes even if you think long term future work is more important.

  1. You might think the future is likely to be net negative. Click here for why one person initially thought this and here for why another person would be reluctant to support existential risk work (it makes space colonization more likely, which could increase future suffering).

  2. Your view of population ethics might cause you to think existential risks are relatively unimportant. Of course, if your view were merely a standard person-affecting view, it would be subject to the response that work on existential risk is high value even if only the present generation is considered. However, you might go further and adopt an Epicurean view under which it is not bad for a person to die a premature death (meaning that death is only bad to the extent it inflicts suffering on oneself or others).

  3. You might have a methodological objection to applying expected value to cases where the probability is small. While the author attributes this view to Holden Karnofsky, Karnofsky now puts much more weight on the view that improving the long term future is valuable.

  4. You might think it's hard to predict how the future will unfold and what impact our actions will have. (Note that the post is from five years ago and may no longer reflect the views of the author.)

  5. You might think that AI is unlikely to be a concern for at least 50 years (perhaps based on your conversations with people in the field). Given that ongoing suffering can only be alleviated in the present, you might think it's better to focus on that for now.

  6. You might think that when there is an opportunity to have an unusually large impact in the present, you should take it even if the impact is smaller than the expected impact of spending that money on long term future causes.

  7. You might think that the shorter feedback loops of near term causes allow us to learn lessons that may help with the long term future. For example, Animal Charity Evaluators may help us get a better sense of how to estimate cost-effectiveness with relatively weak empirical evidence, Wild Animal Suffering Research may help us learn how to build a new academic field, and the Good Food Institute may help us gain valuable experience influencing major economic and political actors.

  8. You might feel like you are a bad fit for long term future causes because they require more technical expertise (making it hard to contribute directly) and are less funding constrained (making it hard to contribute financially).

  9. You might feel a spiritual need to work on near term causes. Relatedly, you might feel like you're more likely to do direct work long term if you can feel motivated by videos of animal suffering (similar to how you might donate a smaller portion of your income because you think it's more likely to result in you giving long term).

  10. As you noted, you might think there are public image or recruitment benefits to near term work.

Note: I do not necessarily agree with any of the above.

Comment author: vollmer (EA Profile) 06 August 2018 09:03:38AM 2 points

Why do you think Epicureanism implies a focus on the near term and not a focus on improving the quality of life in the long-term future?

In response to Open Thread #40
Comment author: vollmer (EA Profile) 09 July 2018 06:50:16AM * 3 points

Side note: I'd encourage commenters to put a title at the top of their comments (maybe this can be done in the OP).

Comment author: remmelt (EA Profile) 04 July 2018 03:56:43PM * 2 points

Hmm, I can’t think of a clear alternative to ‘V2ADC’ yet. Perhaps ‘decision chain’?

Comment author: vollmer (EA Profile) 04 July 2018 05:06:10PM 2 points

Yeah, that sounds great. Decision chain (abbreviated "DC").
