Comment author: jimrandomh 06 June 2018 03:45:58AM 3 points [-]

And if it's the latter, it's unclear to me why this idea would be better than just funding poor EAs directly and letting them decide where to live

That would cost much more per person. With that cost would come an expectation of filtering and grant proposals, which would keep out a lot of people who might otherwise use this to do good things.

Comment author: WilliamKiely 06 June 2018 03:59:19AM 0 points [-]

Good point. Do you think EAs with more money ought to consider living in group houses for the sake of reducing the cost of living to enable them to donate more?

Comment author: WilliamKiely 06 June 2018 03:35:32AM 1 point [-]

Alex K Chen says: "You should like talk to people who do summer camp housing too, like SPARC" https://sparc-camp.org/

Comment author: WilliamKiely 06 June 2018 03:16:01AM 2 points [-]

Is the main value of this coordination to cause EAs to live together in a group? Or is it causing poor EAs to be able to do direct work without having to build up savings first?

If the former, it's unclear to me why there would only be value in grouping together EAs who don't have much money/income (would getting other EAs with money to live together not be equally valuable?).

And if it's the latter, it's unclear to me why this idea would be better than just funding poor EAs directly and letting them decide where to live -- e.g. Alex K. Chen has proposed that paying for talented young people with high potential to live in the Harvard/MIT area, so they could unschool themselves there, is potentially of very high value.

Comment author: AviN 04 March 2018 01:35:29AM *  1 point [-]

We’ve received information from nonprofits on donation and match amounts for ~81% of the estimated amount donated, and I've updated the Follow-up with nonprofits section with some information on this. In general, we’ve found that the amounts received by the nonprofits have been similar to or greater than the amounts we had estimated.

Comment author: WilliamKiely 04 March 2018 04:18:40PM 0 points [-]

Thanks Avi.

Comment author: Denkenberger 26 November 2017 07:15:05PM 4 points [-]

Thanks for letting us know about this great opportunity! While I'm waiting to be approved for the Facebook group, is there any way to find out how much money is going to be chasing the $2 million match? Since this is not just EAs, and there appear to be hundreds of charities listed, it could easily be $100 million. In that case, do you think we will have something like 3 minutes, or 3 seconds, to make the donation before the $2 million match limit is reached?

Comment author: WilliamKiely 27 November 2017 12:37:31AM *  4 points [-]

Facebook saw over 100,000 people donate to thousands of fundraisers that raised $6.79 million on Giving Tuesday across the United States. (Source)

This year I expect it to be more, though I'm not well-informed on how much more. Perhaps $10-$20MM is a reasonable expectation. https://en.wikipedia.org/wiki/Giving_Tuesday

Also, the match last year was for $500K instead of $2MM. From the same source:

After the initial match of $500,000 was reached within hours, The Bill & Melinda Gates Foundation increased their pledge to $900,000 total to match more Giving Tuesday Fundraisers on Facebook.

Note that last year's matching campaign was also announced in advance.

So I think 3 minutes is overkill. While a priori I would expect people to take advantage of this such that $2MM in donations are made in the first ~3 minutes, I think last year shows that this is unlikely to happen. I would be surprised if the $2MM match is reached in less than 30 minutes. I'll (somewhat arbitrarily) assign a 20% probability to that happening, and maybe a 5% chance to it being reached in less than 10 minutes. My median estimate would be around 9:30 EST (1.5 hours), and maybe a 20% chance that it takes more than 3 hours. Although I don't really know, so my suggestion is to donate ASAP. If you're donating more than just a small amount, it's worth it even if it's inconvenient.
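
Encoding those rough numbers as a step-function distribution (a toy sketch of my guesses above, not a careful model -- the interpolation is arbitrary):

```python
# Toy encoding of the subjective estimates above: how long after the 8:00 AM EST
# start the $2MM match cap is exhausted. The numbers are guesses, not data.
checkpoints = [
    (10,  0.05),   # ~5% chance the cap is hit within 10 minutes
    (30,  0.20),   # ~20% chance within 30 minutes
    (90,  0.50),   # median estimate: ~1.5 hours (9:30 EST)
    (180, 0.80),   # ~20% chance it takes more than 3 hours
]

def prob_cap_hit_by(minutes):
    """Lower bound on P(cap already hit by `minutes`), read off the checkpoints above."""
    return max((p for m, p in checkpoints if m <= minutes), default=0.0)

for m in (10, 30, 90, 180, 240):
    print(f"P(cap hit within {m:>3} min) >= {prob_cap_hit_by(m):.2f}")
```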

I intend to make all of my donations ASAP after 8:00 AM EST. (I am going to try to make 10 separate $1,000 donations before 8:10 AM EST).


#GivingTuesday: Counter-Factual Donation Matching is the Lowest-Hanging Fruit in Effective Giving

Self-described effective altruists in the latest EA Survey reported $9.8 million in donations in 2016. However, most of these donations were not matched counter-factually. That is, most of the donations did not generate matching funds representing new money towards effective nonprofits as a whole. Given the existence of counter-factual...
Comment author: Ben_West 21 July 2017 11:08:32PM 3 points [-]

Yeah, it would change the meaning.

My assumption was that, if things monotonically improve, then in the long run (perhaps the very, very long run) we will get to net positive. You are proposing that we might instead asymptote at some negative value, even though we are still always improving?

Comment author: WilliamKiely 22 July 2017 07:57:08PM 2 points [-]

I wasn't proposing that (I in fact think the present is already good), but rather was just trying to better understand what you meant.

Your comment clarified my understanding.

Comment author: WilliamKiely 21 July 2017 12:58:27AM *  2 points [-]

7 - Therefore, the future will contain less net suffering

8 - Therefore, the future will be good

Could this be rewritten as "8. Therefore, the future will be better than the present" or would that change its meaning?

If it would change the meaning, then what do you mean by "good"? (Note: if you're confused about why I'm confused about this, it's because 8 does not seem to follow from 7 under the meaning of "good" I usually hear from EAs (something like "net positive utility").)

Comment author: MichaelDickens 27 August 2016 03:49:40AM 3 points [-]

Even if you discount insects that heavily (which I believe is wrong), there's still a strong case to be made for trying to prevent wild vertebrates from suffering.

Comment author: WilliamKiely 27 August 2016 04:35:37AM 0 points [-]

Hmm. I do believe I discount vertebrates much less than I discount insects; however, I also think there's a huge difference between, say, chickens and chimpanzees, or chimpanzees and humans. Even among humans (whose brains are quite similar to one another compared to cross-species comparisons), I think that the top 10% of Americans probably live lives that I value inherently (by which I mean ignoring the effects that they have on other things and only counting the quality of their conscious life experience) at least one order of magnitude (if not several) more than the bottom 10% of Americans. I believe this is also an unpopular view, but one consideration in support of it: if you reflect on how much you value your own conscious experience during some parts of your life compared to others, you may find, as I do, that some moments or short periods seem to be of much greater value than others of equal duration.

An exercise I tried recently was making a plot of "value realized / time" vs "time" for my own conscious life experience (so again: not including the effects of my actions, which is the vast majority of what I value). I found that there were some years I valued multiple times more than other years, and some moments I valued many times more than entire years on net. The curve was all positive value and trending upwards, with sleeping hours valued much less than waking ones. (I don't think I have very vivid dreams relative to others, but even if I did, I would probably still tend to value waking moments much more than sleeping ones.) Also, remembering or reflecting on great moments fondly can be of high value too in my evaluation. There's also the problem that I don't know now what certain past experiences were actually like to live through, since I'm relying on my memory of them, which for all I know could be faulty. I think in general I choose to value experiences based on how I remember them rather than how I think they were when I lived them (if there is a discrepancy between the two).
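
A toy version of that exercise (with invented placeholder numbers, not my actual estimates) might look something like this:

```python
# Toy illustration of the "value realized per unit time" exercise described above.
# The yearly weights here are invented placeholders, not anyone's actual estimates.
import matplotlib.pyplot as plt

ages = list(range(0, 31))
# Hypothetical relative value of conscious experience per year: all positive,
# trending upwards, with a few years valued several times more than others.
value_per_year = [0.2 + 0.05 * a for a in ages]
value_per_year[17] *= 3   # e.g. an unusually good year
value_per_year[24] *= 4   # another one

plt.plot(ages, value_per_year)
plt.xlabel("age (years)")
plt.ylabel("value realized per year (relative units)")
plt.title("Toy 'value realized / time' curve (placeholder numbers)")
plt.show()
```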

Also note that I'm a moral anti-realist, so I don't think there are correct answers here; to a certain extent, how much I value some periods of my conscious life experience relative to others is a choice, since I don't believe there is a fully specified set of values that are mine which I could discover either.

A general thing I'd be really interested in seeing is people's estimates of how much they value (whether positively or negatively) the total life experiences of, say, mosquitoes, X, Y, Z, chickens, cows, humans (and what that distribution looks like), oneself over time, a typical human over time, etc. I'd also like to see answers to "What would a graph of (value realized per unit time) vs (time) look like for Earth's history?", which would answer the question "How much value has been realized since life began on Earth?" (note: I'd ignore estimates of value realized elsewhere in the universe, which may actually be quite significant, for the sake of the question). If you'd like to indulge me with your own views on any of this I would be very interested, but of course no need if you don't want to. I'll estimate and write my own answers up sometime.

Comment author: WilliamKiely 27 August 2016 03:35:48AM *  0 points [-]

How many painful mosquito deaths would you have to be offered to prevent to choose that over causing one new human life (of quality equal to that of a typical person today) to be lived (all instrumental effects / consequences aside)?[1][2][3] (For my answer see [2].)

What would the distribution of EAs' answers look like? College graduates' answers? Everyone's answers?

What range of answers does the OP assume?

Or more broadly, for what range of moral theories can a case be made that WAS should be prioritized?

I ask these questions because, while I find the OP argument intriguing, my current values (or my current beliefs about my values, depending on how you want to think about it) are such that preventing mosquito suffering is very insignificant relative to many other things (e.g. there being more humans that live good lives, or humans living better lives) and is therefore far from being a high priority for me.

While I haven't dived deeply into arguments for negative utilitarianism or other arguments that could conceivably change my view significantly, I think it's unlikely (~10%, reported in [2]) that doing so would lead me to change my view significantly.[4]

It seems to me that the most probable way that my view could be changed to believe that (e.g.) OPP ought to prioritize WAS would be to persuade me that I should adopt a certain view on how to deal with moral uncertainty that would, if adopted, imply that OPP ought to prioritize WAS even given my current beliefs about how much I value the suffering of mosquitoes relative to other things (e.g. the lives of humans).

Is there a case to be made for prioritizing WAS if one assigns even a small probability (e.g. 1%) to a negative utilitarian-like view being correct given that they also subscribe to certain plausible views on moral uncertainty?

My views on how to deal with moral uncertainty are very underdeveloped. I think I currently have a tendency to evaluate situations or decide on actions on the basis of the moral view I deem most probable; however, as the linked LessWrong wiki article points out, this has potential problems. (I'm also not aware of a less problematic approach, so I will probably continue to do this until I encounter something else that appeals to me more. Bostrom's parliamentary model seems like a reasonable candidate, although I'm unsure how its negotiation process works exactly or would play out. I would have to think about it more.)
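
To illustrate why the choice of approach might matter, here's a toy sketch (numbers entirely made up) contrasting "act on the most probable view" with a probability-weighted expected-value approach, which is roughly how even a 1% credence in a negative-utilitarian-like view could end up driving the decision:

```python
# Toy contrast (made-up numbers) between two ways of handling moral uncertainty when
# deciding whether to prioritize wild animal suffering (WAS) interventions.

# Credence in each moral view, and the value that view assigns to "prioritize WAS"
# relative to the best alternative (positive = WAS is better under that view).
views = {
    "my most probable view":     (0.99, -1.0),    # WAS slightly worse than alternatives
    "negative-utilitarian-like": (0.01, 1000.0),  # WAS vastly better under this view
}

# Approach 1: act on the single most probable view.
credence, value = max(views.values(), key=lambda cv: cv[0])
print("Most-probable-view verdict:", "prioritize WAS" if value > 0 else "don't prioritize WAS")

# Approach 2: maximize probability-weighted value across views. Note this only works
# if values are comparable across theories, which is itself contested.
expected = sum(c * v for c, v in views.values())
print(f"Expected-value verdict: {'prioritize WAS' if expected > 0 else 'do not prioritize WAS'} (EV = {expected:.2f})")
```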

Lastly, let me just note that I don't challenge the non-normative factual claims of the OP. Rather, I'm simply stating that my hesitation to take the view that OPP should prioritize WAS comes from my belief that I value things significantly differently than I would have to in order for WAS to be something that OPP should prioritize.


[1] A similar question was asked in the Effective Altruism Facebook group. My version gets at how much one values the life of a typical person today relative to the life of a typical mosquito, rather than how much one values extreme pleasure relative to extreme suffering.

[2] Since I'm asking for others' answers, I should estimate my own answer. Hmm. If I had to make the decision right now I would choose to create the new human life, even if the number of painful mosquito deaths I was offered to prevent was infinite. Note, though, that I am not completely confident in this view, perhaps only ~60%. Then maybe ~30% to the answer being somewhere between 10^10 and infinity, and ~10% to it being less than 10^10 mosquitoes, where practically all of that 10% uncertainty comes from the possibility that a more enlightened version of myself would undergo a paradigm shift or significant change in my fundamental values / moral views. In other words, I'm pretty uncertain (~40/60) about whether mosquitoes' lives are net negative or not, but I'm pretty certain (~75% = 30%/40%) that if I do value them negatively, the magnitude of their negative value is quite small (e.g. relative to the positive value I place on (the conscious experience of) human life).
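
As a quick arithmetic check on the figures in this footnote (just re-deriving the stated percentages from the three estimates above):

```python
# Re-deriving the "~40/60" and "~75% = 30%/40%" figures in footnote [2].
from fractions import Fraction

p_no_finite_number = Fraction(60, 100)      # no finite number of mosquito deaths outweighs one human life
p_threshold_1e10_or_more = Fraction(30, 100)  # threshold somewhere between 10^10 and infinity
p_threshold_below_1e10 = Fraction(10, 100)    # threshold below 10^10

p_mosquitoes_net_negative = p_threshold_1e10_or_more + p_threshold_below_1e10
p_small_magnitude_given_negative = p_threshold_1e10_or_more / p_mosquitoes_net_negative

print(p_mosquitoes_net_negative)             # 2/5 -> the "~40/60" split
print(p_small_magnitude_given_negative)      # 3/4 -> the "~75% = 30%/40%"
```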

[3] Knowing that my view is controversial among EAs (see the link at [1]), perhaps I should meta-update significantly towards the consensus view that not only is the existence of suffering inherently bad, but that its badness is of much greater magnitude than I assign it in the ~30% scenario. I'll refrain from doing this for now, and from figuring out how much I should update if I only think there's an X% chance that it's proper to update. (I'm also not sure to what extent my intuitions / current reported estimates already take others' estimates into account.)

[4] The basis of my view that the goodness of a human life is much greater than the possible (~40%, in my view) badness of a mosquito's suffering or painful death (and the basis of more general versions of this view) is my intuition. Thinking about the question from different angles, I have been unable to shift my view significantly towards placing substantially more weight on mosquitoes or on preventing mosquito suffering.
