Comment author: casebash 26 May 2017 10:38:11AM 1 point [-]

Very happy to see such an ambitious plan! I will be surprised if you manage to pull it off, but then again the vast majority of good that is achieved will come from a few projects that work amazingly well.

Comment author: casebash 16 April 2017 04:57:33PM 0 points [-]

We should definitely consider creating some awards for the EA community. Not only do they look good on your CV, but they also encourage people. However, they need to be limited in number in order to a) maintain their value, b) maintain their integrity/credibility.

Comment author: Richard_Batty 27 March 2017 01:37:27PM 3 points [-]

A lot of these would be good for a small founding team, rather than individuals. What do you mean by 'good for an EA group?'

Comment author: casebash 27 March 2017 01:56:51PM 1 point [-]

Like a local university group or local city meetup.

Comment author: casebash 27 March 2017 04:44:12AM 1 point [-]

Many of those seem like individual projects. Does anyone have any suggestions for projects that would be particularly good for EA groups?

In response to EA Funds Beta Launch
Comment author: casebash 28 February 2017 07:57:10PM 6 points [-]

I'm especially excited about Effective Altruism community building. There are too many EA orgs to keep track of, all running their own fundraisers.

In response to Why I left EA
Comment author: casebash 20 February 2017 12:59:55AM -1 points [-]

If morality isn't real, then perhaps we should just care about ourselves.

But suppose we do decide to care about other people's interests - maybe not completely, but at least to some degree. To the extent that we decide to devote resources to helping other people, it makes sense to do so as effectively as possible, and this is what utilitarianism prescribes.

Comment author: Peter_Hurford  (EA Profile) 14 February 2017 04:23:02AM 3 points [-]

I don't think that's true for two reasons:

(1) A 10% chance of donating $100K should be roughly as motivating to a risk-neutral EA as a 100% chance of donating $10K (not taking into account arguments that the risk-neutral utility of money may be nonlinear).

(2) Research around whether to donate $100K or $10K (or how to donate $100K conditional on winning the lottery) would be useful.

Comment author: casebash 16 February 2017 11:24:07PM 1 point [-]

"A 10% chance of donating $100K should be roughly as motivating to a risk-neutral EA as a 100% chance of donating $10K (not taking into account arguments that the risk-neutral utility of money may be nonlinear)." - that's not how human psychology works.

Comment author: Peter_Hurford  (EA Profile) 14 February 2017 12:01:20AM 4 points [-]

I think others have suggested this, but have you thought about putting your 10K GBP into a donor lottery or otherwise saving up to getting a larger donation size? I'd like to see research address that question (e.g., is a 100K donation >10x better than a 10K donation?).

Comment author: casebash 14 February 2017 03:50:13AM 0 points [-]

That would defeat the purpose of the project. I think that the purpose is to spur research and the money is there for extra encouragement.

Comment author: casebash 13 January 2017 11:21:10PM 3 points [-]

I was definitely disappointed to see that post by Sarah. It seemed to defect from good community norms, such as interpreting people generously, in favour of quoting people out of context. She seems to be applying such rigorous standards to other people, yet applying rather loose standards to herself.

Comment author: casebash 07 December 2016 10:42:27PM 1 point [-]

So I'm guessing the idea is that donors will do something a bit more complex than just throwing the money over to AMF?
