Comment author: MichaelDickens  (EA Profile) 03 April 2016 04:15:18PM 2 points [-]

The link is broken; it redirects to a sub-directory of effective-altruism.com.

Comment author: saulius  (EA Profile) 03 April 2016 07:49:43PM 0 points [-]
In response to The great calculator
Comment author: saulius  (EA Profile) 28 March 2016 01:31:37PM 0 points [-]

The question "How much would you be willing to donate to save a human's life?" makes no sense to me. It all depends on how old the human is, how happy or miserable their future life is going to be, and... how much meat they eat.

By saving a non-vegan's life, you kill all the animals they are going to eat (well, in reality it's more complicated, with market elasticities, etc.). If we are going to be perfectly rational, we should first calculate whether we even want to save the life.
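The "first calculate" step above can be sketched as simple arithmetic. All numbers below are hypothetical placeholders for illustration, not empirical estimates from this thread:

```python
# Illustrative sketch of the "meat-eater problem" arithmetic described above.
# Every constant here is a hypothetical placeholder, not a real estimate.

YEARS_REMAINING = 40          # assumed remaining lifespan of the saved person
ANIMALS_EATEN_PER_YEAR = 30   # assumed animals consumed annually by a non-vegan
CUMULATIVE_ELASTICITY = 0.5   # assumed fraction of extra demand that actually
                              # translates into additional animals farmed

def extra_animals_farmed(years, animals_per_year, elasticity):
    """Rough count of additional farmed animals attributable to one saved life."""
    return years * animals_per_year * elasticity

print(extra_animals_farmed(YEARS_REMAINING, ANIMALS_EATEN_PER_YEAR,
                           CUMULATIVE_ELASTICITY))  # 600.0 with these placeholders
```

The elasticity factor is the "more complicated, with market elasticities" caveat from the comment: only some fraction of forgone demand maps onto animals actually farmed.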

In response to The great calculator
Comment author: Habryka 27 March 2016 12:01:38AM 3 points [-]

Interesting idea. The main thing that bothers me is the conflation of "animal lives" with "animal life-years". It's standard in development economics to use life-years as the relevant measure, not the total number of deaths averted. So I think using life-years as the measure would give more accurate results (e.g. ask how much a person is willing to pay to save 15 years of pig-life).

Comment author: saulius  (EA Profile) 28 March 2016 01:00:00PM 0 points [-]

I think you meant "avert 15 years of pig-life in industrial agriculture"
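The lives-versus-life-years distinction in the exchange above amounts to a simple normalization. The dollar figure below is a hypothetical placeholder, not a number from the thread:

```python
# Hedged sketch: converting a per-life willingness-to-pay into a per-life-year
# figure, as the life-years framing above suggests. Numbers are placeholders.

def cost_per_life_year(cost_per_life: float, life_years_per_life: float) -> float:
    """Normalize a willingness-to-pay per life into a per-life-year figure."""
    return cost_per_life / life_years_per_life

# E.g. a hypothetical $3000 per pig-life averted, at 15 years per pig-life:
print(cost_per_life_year(3000, 15))  # 200.0
```

Comparing charities on this normalized figure avoids the conflation the comment points out: two interventions averting the same number of deaths can avert very different numbers of life-years.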

Comment author: saulius  (EA Profile) 19 February 2016 12:45:13AM *  2 points [-]

If you are writing a summary of existing arguments, then yes. But if you have a new argument, then there is no reason to drown it in old arguments.

Comment author: saulius  (EA Profile) 28 December 2015 05:55:26PM 0 points [-]

1A) In my experience, typical people don't have strong desires to help people far away. They just don't care about them nearly as much as about themselves and their relatives, especially children. It never seems to me, when talking with such people, that they are confused. It always seems that they just have different values. Actually, their values make more sense from an evolutionary psychology POV. So if you ask a person "Being effective at altruism (towards people/animals that might be far away and you won't necessarily meet) is one of the goals in your life, right?" and they disagree (or agree to seem good but then don't act on it), IMO most likely that person has different core values, which are usually very hard to change. If I am right, little will be donated by the audience you gain by omitting that altruism is your goal. By omitting that, you may also fail to attract some people who are interested in altruism and could be targeted more productively.

I'm not sure that people who, e.g., donate to cancer charities because they recently lost a relative to cancer are usually confused. It could also be different values to some degree. IMO that could be a more productive target audience.

1B) If I were a non-EA fan of InIn and, after a Google search, found a sentence like "it won't be very beneficial to tell our non-EA audiences that we are trying to promote EA-themed effective giving idea through using emotional engagement and persuasion tactics on them", I would probably feel angry, manipulated, and treated as someone of a lower intellectual class. I'm not sure what percentage of people would feel the same way. If a journalist found such a sentence while writing about InIn, they might see it as an opportunity to start a scandal. Things like that can never happen when you always say/write everything you are thinking that is important enough to be said: no lies, no "Not Technically Lying", no omissions. Just always trying to make the maps in other brains closer to what you think is reality. This is what I call honesty. Spreading EA ideas seems like an admirable goal to many people, so to me it's strange that you chose to hide it.

Comment author: saulius  (EA Profile) 28 December 2015 08:47:08PM 1 point [-]

Just noticed that almost the same thoughts regarding 1A) were expressed in http://effective-altruism.com/ea/rr/the_big_problem_with_how_we_do_outreach/. You don't have to answer any of this if it's not new.

Comment author: Gleb_T  (EA Profile) 21 December 2015 10:17:38PM *  3 points [-]

Thanks for raising these points, and no worries about sounding critical! If you have these concerns, other people do too, and it's important to have a transparent dialogue about them :-)

1) First, let's be very clear and specific about our terminology. I think the word "dishonest" does not serve us well here. Let's taboo that and talk about what we actually do. As I stated above, what we do is help people realize their actual goals, the goals they would pursue if they knew about the best methods of reaching them. Namely, typical people have a desire to help others, but they don't necessarily know the best way to do that. They fall into attention bias, they don't realize the salience of the drowning child problem, and they give to whatever charity has the best marketing. That's why our mission states "We empower people to refine and reach their goals," and the refining part addresses helping people figure out what their goals actually are. We help them achieve their more long-term goals, in other words.

2A) Yup, Intentional Insights promotes both effective giving and effective decision-making. That's mentioned briefly above and described here in more depth. Doing so helps improve the capacity of EAs who engage with our content, and contributes to the flourishing of non-EAs.

2B) Ugh, the wiki thing was pretty ugly. The full story is here - we had a hater try to wipe the InIn wiki entry. It was settled but the part about promoting EA was deleted in the settlement. I'm not happy with the outcome, but it's the best we could get.

2C) You can see my goals through my actions - I have invested a lot of time, effort, and money into promoting effective giving, freely and of my own volition. There was nothing forcing me to do so, and no specific benefit I was getting from it. Both from a social status perspective and from a financial perspective, I'm fine with my situation as a professor at Ohio State - I get social respect and good job benefits. So my only gain from promoting effective giving is other people giving effectively, and my only reason for engaging with the EA movement and taking the GWWC pledge and TLYCS pledge is my passion for helping people flourish :-)

3) Actually, not promoting effective giving leaves a lot of money on the table, as I argue here.

Comment author: saulius  (EA Profile) 28 December 2015 04:51:00PM *  0 points [-]

3) I'm not arguing against promoting effective giving. I'm all for that. I'm just thinking about which ways of doing it are the most effective. To your knowledge, how many people have you already convinced to donate to EA charities? How much do you think was donated because of InIn?

Comment author: Gleb_T  (EA Profile) 21 December 2015 06:03:38PM -1 points [-]

Thanks for bringing this up! We only mention altruism briefly in our vision, among other things. I can see how this might be confusing :-)

Here is the reason. Since we're promoting effective giving to non-EAs, our organization is outward-facing to the broad audience, unlike the majority of EA meta-organizations, which are mainly inward-facing to the EA movement. Since we are outward-facing, we need to be careful about stating explicitly the goals we are pursuing - it won't be very beneficial to tell our non-EA audiences that we are trying to promote EA-themed effective giving idea through using emotional engagement and persuasion tactics on them :-) Instead, we tell our non-EA audiences that we are trying to help them reach their goals, which is the case - we are helping them reach their actual giving goals by giving in the most impactful manner.

This is why we have a separate EA webpage that outlines our EA orientation, which is not linked to from our outward, public-facing website.

Comment author: saulius  (EA Profile) 21 December 2015 09:24:38PM *  5 points [-]

Not sure it's a good approach because:

  1. Being dishonest about your goals and half-secretly manipulating people spreads the wrong values and can backfire easily, giving EA a bad reputation.

  2. Topics like http://intentionalinsights.org/how-sure-are-you-about-your-memories don't seem to promote EA in any way, and yet your resources are spent on them. Because you admitted to being dishonest about your goals with your readers, it makes me doubt whether you are honest about them with us :) In fact, your changes in https://wiki.lesswrong.com/index.php?title=Intentional_Insights&diff=15260&oldid=152533, targeted at a rational crowd, also don't mention charity or altruism. And defending your own material on a wiki in the third person is also not very nice.

  3. I see no reason to promote EA indirectly, because EA is easy to sell to many rational people. People who readily agree with EA are most likely the ones who can be most cost-effectively targeted for advertising while the movement is small.

Sorry if any of that came off as harsh. I still think it's great that you are actually trying something while I just sit at home and criticise everyone :)

Comment author: Ben_Todd 21 December 2015 06:37:45PM *  6 points [-]

I broadly agree with this, but I'd put it a little differently.

If you think what most matters about your actions is their effect on the long-run future (due to Bostrom-style arguments), then GiveWell recommended charities aren't especially "proven", because we have little idea what their long-run effects are. And they weren't even selected for having good long-run effects in the first place.

One response to this is to argue that the best proxy for having a good long-run impact is having a good short-run impact (e.g. via boosting economic growth).

Another response is to argue that we never have good information about long-run effects, so the best we can do is to focus on the things with the best short-run effects.

I also still think it's fair to say GiveWell recommended charities are a "safe bet" in the sense that donating to them is very likely to do much more good than spending the money on your own consumption.

Comment author: saulius  (EA Profile) 21 December 2015 07:30:38PM 0 points [-]

I've heard this "the best proxy for having a good long-run impact is having a good short-run impact" claim a couple of times now, but I haven't seen anyone make an argument for it. Could someone provide a link or something? To me it's not even clear why the long-run economic impact of charities like GiveDirectly and AMF should be proportional to their short-term impact.

Comment author: Tom_Davidson 21 December 2015 05:21:35PM 2 points [-]

Why? What are the very long term effects of a murder?

Comment author: saulius  (EA Profile) 21 December 2015 06:00:29PM 0 points [-]

Murdering also decreases world population and consumption, which reduces problems like global warming, overfishing, etc., and probably reduces some existential risks.
