Comment author: Peter_Hurford  (EA Profile) 26 April 2018 07:00:14PM 3 points [-]

Thanks. I don't feel guilty about it. I just chose a different life. EA is still very important to me, but not as important as it once was. I think a lot of it is, like Joey said, the slow build up of small path changes over time.

Comment author: Nekoinentr 27 April 2018 07:26:55PM 3 points [-]

If you feel you've become much less EA, I wonder what many others who were very into it must feel. From the outside you seem extremely involved - .impact/Rethink Charity do a huge amount with limited resources, and it seems like you do substantial volunteering with them, which doesn't seem like putting little of yourself into EA. Thanks for what you do.

Comment author: Peter_Hurford  (EA Profile) 25 April 2018 03:27:20AM 1 point [-]

I'd like to see this. I have some data on this from the EA Survey and intend to follow up on something similar later this year.

Comment author: Nekoinentr 26 April 2018 05:35:46PM 0 points [-]

Please do share that data when you get a chance. You guys have a lot of fascinating data in those survey results, and while I understand you have limited time/resources, it would be a shame to see them go untapped.

Comment author: Peter_Hurford  (EA Profile) 25 April 2018 02:43:07AM 7 points [-]

Many aspects of this story sound kinda like things that have happened to me to make me less hardcore. I definitely still strongly affiliate with EA, donate ~15% / $30K, and spend about 20hrs/week on EA projects, but my idealistic college EA self expected me to donate ~$100K/yr by now or work full-time 60hrs/week on EA projects. I'm unsure how "bad" of a "value drift" this is, but it's definitely short of my full potential.

Comment author: Nekoinentr 26 April 2018 05:34:38PM 2 points [-]

Maybe your idealistic college EA self's expectations were never that likely, so you shouldn't beat yourself up about them.

Comment author: Nekoinentr 12 March 2018 05:31:45PM 0 points [-]

It's no big deal, but your formatting is a little different from the normal forum formatting - it might be worth requesting that .impact provide a button to clear extraneous formatting via the issues link at

Comment author: Nekoinentr 04 January 2018 07:34:04AM 2 points [-]

For example, suppose you see an idea for an effective charity on Charity Science. You contact them and they provide you with advice and link you up with potential cofounders.

Have they done this for anyone?

Comment author: Richenda  (EA Profile) 04 January 2018 04:41:57AM 2 points [-]

We agree about the EA Hub. However, we were overstretched across too many projects, and have been in the process of identifying which things to prioritise and which cost-effective things we can deliver to a high standard. This assessment, and the decisions made in the next few months, will be critical for the direction of the site.

Comment author: Nekoinentr 04 January 2018 07:25:35AM 4 points [-]

Surely if someone gave you a few hundred dollars to sustain a staff member such as yourself to spend a few days leveraging volunteer tech & design effort, you'd do it? So it's less a matter of prioritizing things and more a matter of the EA Community Fund covering low-hanging fruit like this, so that you don't have to take time you presumably don't have laboriously convincing someone that this is worth those few hundred dollars.

Comment author: Nekoinentr 04 January 2018 07:22:12AM 1 point [-]

For more speculative things, we want to put part of the money towards a project that a friend we know through the Effective Altruism movement is starting. In general I think this is a good way for people to get funding for early stage projects, presenting their case to people who know them and have a good sense of how to evaluate their plans.

Agreed. Thanks for the work you do supporting things that'd otherwise not happen!

Comment author: Tobias_Baumann 20 July 2017 08:40:43AM *  11 points [-]

Thanks for writing this up! I agree that this is a relevant argument, even though many steps of the argument are (as you say yourself) not airtight. For example, consciousness or suffering may be related to learning, in which case point 3) is much less clear.

Also, the future may contain vastly larger populations (e.g. because of space colonization), which, all else being equal, may imply (vastly) more suffering. Even if your argument is valid and the fraction of suffering decreases, it's not clear whether the absolute amount will be higher or lower (as you claim in 7.).

Finally, I would argue we should focus on the bad scenarios anyway – given sufficient uncertainty – because there's not much to do if the future will "automatically" be good. If s-risks are likely, my actions matter much more.

(This is from a suffering-focused perspective. Other value systems may arrive at different conclusions.)

Comment author: Nekoinentr 21 July 2017 12:53:49AM 1 point [-]

The Foundational Research Institute site in the links above seems to have a wealth of writing about the far future!

Comment author: Nekoinentr 20 July 2017 02:47:31PM 2 points [-]

On premise 1, a related but stronger claim is that humans tend to shape the universe to their values much more strongly than do blind natural forces. This allows for a simpler but weaker argument than yours: it follows that, should humans survive, the universe is likely to be better (according to those values) than it otherwise would be.

Comment author: Brian_Tomasik 11 July 2017 10:23:20PM 1 point [-]

IMO, the philosophers who accept this understanding are the so-called "type-A physicalists" in Chalmers's taxonomy. Here's a list of some such people, but they're in the minority. Chalmers, Block, Searle, and most other philosophers of mind aren't type-A physicalists.

Comment author: Nekoinentr 20 July 2017 02:43:23PM 0 points [-]

IMO, the philosophers who accept this understanding are the so-called "type-A physicalists" in Chalmers's taxonomy.

I'm not wholly sure I understand the connection between this and denying that consciousness is a natural kind. The best I can do (and perhaps you or thebestwecan can do better? ;-) ) is:

"If consciousness is a natural kind, then the existence of that natural kind is a separate fact from the existence of such-and-such a physical brain state (and vice versa)"
