Comment author: kbog  (EA Profile) 17 March 2017 05:21:55PM *  3 points [-]

What's ill-founded is that if you want to point out a problem where people affiliate with NU orgs that promote values which increase risk of terror,

But they do not increase the risk of terror. Have you studied terrorism? Do you know where it comes from and how to combat it? As someone who actually has (US military, international relations), I can tell you that this whole thing is beyond silly. Radicalization is a process, not a mere matter of reading philosophical papers, and it involves structural factors among disenfranchised people and communities as well as the use of explicitly radicalizing media. And it is used primarily as a tool for a broad variety of political ends, which could easily include the ends which all kinds of EAs espouse. Very rarely is destruction itself the objective of terrorism. Also, terrorism generally happens when actors feel that they lack access to legitimate channels of influencing policy. The way that people have leapt to discussing this topic without considering these basic facts shows that they don't have the relevant expertise to draw conclusions on it.

Calling it "unnecessary" to treat that org is then a blatant non-sequitur, whether you call it an argument or an assertion is up to you.

But Austen did not say "Not supporting terrorism should be an EA value." He said that not causing harm should be an EA value.

Our ability to discern good arguments even when we don't like them is what sets us apart from the post-fact age we're increasingly surrounded by.

There are many distinctions between EA and whatever you mean by the (new?) "post-fact age", but responding seriously to what essentially amounts to trolling doesn't seem like a necessary one.

It's important to focus on these things when people are being tribal, because that's when it's hard.

That doesn't make any sense. Why should we focus more on things just because they're hard? Doesn't it make more sense to put effort somewhere where things are easier, so that we get more return on our efforts?

If you only engage with facts when it's easy, then you're going to end up mistaken about many of the most important issues.

But that seems wrong: one person's complaints about NU, for instance, are not among the most important issues. At the same time, we have perfectly good discussions of very important facts about cause prioritization on this forum where people are much more mature and reasonable than, say, Austen here is. So there seems to be no general relationship between how important a fact is and how disruptive commentators are when discussing it. At the very minimum, one might start from a faux clean slate where a new discussion is started separate from the original instigator - something which takes no time at all and enables a bit of a psychological restart. That seems strictly better than encouraging trolling.

Comment author: Austen_Forrester 19 June 2017 10:39:56PM 0 points [-]

Those radicalization factors you mentioned increase the likelihood of terrorism but are not necessary for it. Saying that people don't commit terror from reading philosophical papers, and thus that those papers are innocent and shouldn't be criticized, is a pretty weak argument. Of course such papers can influence people. The radicalization process starts with philosophy, so to say that the first step doesn't matter because the subsequent steps aren't yet publicly apparent shows that you are knowingly trying to allow this form of radicalization to flourish. In any case, NUEs do in fact meet the other criteria you mentioned. For instance, I doubt that they have confidence in legitimately influencing policy (i.e., convincing the government to burn down all the forests).

FRI and its parent, the EA Foundation, state that they are not philosophy organizations and exist solely to incite action. I agree that terrorism has not in the past been motivated purely by destruction. That is something that atheist extremists who call themselves effective altruists are founding.

I am not a troll. I am concerned about public safety. My city almost burned to ashes last year due to a forest fire, and I don't want others to have to go through that. Did anybody read about all the people in Portugal dying in a forest fire recently? That's the kind of thing that NUEs are promoting and that I'm trying to prevent. If you're wondering why I don't elaborate on my position that “EAs” are promoting terrorism/genocide, it is for two reasons. One, it is self-evident if you read Tomasik and FRI materials (not all of them, but some articles). And two, I could easily cause a negative effect by connecting the dots for those susceptible to the message or giving them destructive ideas they may not have thought of.

Comment author: Austen_Forrester 19 June 2017 10:06:38PM 0 points [-]

Have you considered combining the "GiveWell for impact investing" idea with the Effective Altruism Funds idea and creating an EA impact investing business within your charity? You could hire staff to find the best impact investing opportunities and create a few funds for different risk tolerances. Theoretically, it could pay for itself (or make serious money for CEA if successful enough) with a modest management fee. I'm not sure if charities are allowed to grant to businesses, but I know they can operate their own businesses as long as they're related to their mission.

Comment author: Austen_Forrester 14 June 2017 02:34:36AM 3 points [-]

Entering China would be awesome. So many people with money and no one's donating it. It ranks dead freaking last on the World Giving Index. Which in a way is a good thing... it means lots of room to grow!

China's domestic charities are usually operated and funded by the government (they are basically part of the government). And starting this year, the government has taken control of foreign NGOs in China as well.

Often, rich Chinese elect to donate to foreign NGOs because they are more credible. Being government-controlled, charities in China are not known for being reputable, prompting billionaire Jack Ma to famously quip, "It's harder to donate money to Chinese charities than to earn it." The China Foundation Center was created a few years ago to promote transparency in the nonprofit sector.

India is also a good target. As in China, no one there trusts charities. Probably because they're all scams? But there is an organization called Credibility Alliance that accredits the more transparent ones. I'm a big fan of Transparency International India. They do so much on a shoestring on the single most important issue in the country (corruption), and are the most credible/transparent.

Comment author: Telofy  (EA Profile) 17 May 2017 02:26:43PM *  2 points [-]

Here’s what I usually found most unfortunate about the comparison, but I don’t mean to compete with anyone who thinks that the math is more unfortunate or anything else.

  1. The decision to sacrifice the well-being of one person for that of others (even many others) should be hard. If we want to be trusted (and the whole point of GiveWell is that people don’t have the time to double-check all research no matter how accessible it is – plus, even just following a link to GiveWell after watching a TED Talk requires that someone trusts us with their time), we need to signal clearly that we don’t make such decisions lightly. It is honest signaling too, since the whole point of EA is to put a whole lot more effort into the decision than usual. Many people I talk to are so “conscientious” about such decisions that they shy away from them completely (implicitly making very bad decisions). It’s probably impossible to show just how much effort and diligence has gone into such a difficult decision in just a short talk, so I rather focus on cases where I am, or each listener is, the one at whose detriment we make the prioritization decision, just like in the Child in the Pond case. Few people would no-platform me because they think it’s evil of me to ruin my own suit.
  2. Sacrificing oneself, or rather some trivial luxury of oneself, also avoids the common objection why a discriminated against minority should have to pay when there are [insert all the commonly cited bad things like tax cuts for the most wealthy, military spending, inefficient health system, etc.]. It streamlines the communication a lot more.
  3. The group at whose detriment we need to decide should never be a known, discriminated against minority in such examples, because these people are used to being discriminated against and their allies are used to seeing them being discriminated against, so when someone seems to be saying that they shouldn’t receive some form of assistance, they have just a huge prior for assuming that it’s just another discriminatory attack. I think their heuristic more or less fails in this case, but that is not to say that it’s not a very valid heuristic. I’ve been abroad in a country where pedestrian crosswalks are generally ignored by car drivers. I’m not going to just blindly walk onto the street there even if the driver of the only car coming toward me is actually one who would’ve stopped for me if I did. My heuristic fails in that case, but it generally keeps me safe.
  4. Discriminated against minority groups are super few, especially the ones the audience will be aware of. Some people may be able to come up with a dozen or so, some with several dozens. But in my actual prioritization decisions for the Your Siblings charity, I had to decide between groups with such fuzzy reference classes that there must be basically arbitrarily many of them. Street children vs. people at risk of malaria vs. farmed animals? Or street children in Kampala vs. people at risk of malaria in the southern DRC vs. chickens farmed for eggs in Spain? Or street children of the lost generation in the suburbs of Kampala who were abducted for child sacrifice but freed by the police and delivered to the orphanage we’re cooperating with vs. …. You get the idea. If we’re unbiased, then what are the odds that we’ll draw a discriminated against group from the countless potential examples in this urn? This should heavily update a listener toward thinking that there’s some bias against the minority group at work here. Surely, the real explanation is something about salience on our minds or ease of communication and not about discrimination, but they’d have to know us very well to have so much trust in our intentions.
  5. People with disabilities probably have distance “bias” at the same rates as anyone else, so they’ll perceive the blind person with the guide dog as in-group, the blind people suffering from cataracts in developing countries as a completely neutral foreign group, and us as attacking them, making us the out-group. Such controversy is completely avoidable and highly dangerous, as Owen Cotton-Barratt describes in more detail in his paper on movement growth. Controversy breeds an opposition (and one that is not willing to engage in moral trade with us) that destroys option value, particularly by depriving us of the highly promising option to draw on the democratic process to push for the most uncontroversial implications of effective altruism that we can find. Scott Alexander has written about it under the title “Toxoplasma of Rage.” I don’t think publicity is worth sacrificing the political power of EA for, but that is just a great simplification of Owen Cotton-Barratt’s differentiated points on the topic.
  6. Communication is by necessity cooperative. If we say something, however true it may be, and important members of the audience understand it as something false or something else entirely (that may not have propositional nature), then we failed to communicate. When this happens, we can’t just stamp our collective foot on the ground and be like, “But it’s true! Look at the numbers!” or “It’s your fault you didn’t understand me because you don’t know where I’m coming from!” That’s not the point of communication. We need to adapt our messaging or make sure that people at least don’t misunderstand us in dangerous ways.

(I feel like you may disagree on some of these points for similar reasons that The Point of View of the Universe seemed to me to argue for a non-naturalist type of moral realism while I “only” try to assume some form of non-cognitivist moral antirealism, maybe emotivism, which seems more parsimonious to me. Maybe you feel like or have good reasons to think that there is a true language (albeit in a non-naturalist sense) so that it makes sense to say “Yes, you misunderstood me, but what I said is true, because …,” while I’m unsure. I might say, “Yes, you misunderstood me, but what I meant was something you’d probably agree with. Let me try again.”)

Comment author: Austen_Forrester 06 June 2017 11:08:45PM 1 point [-]

Blind people are not a discriminated group, at least not in the first world. The extreme poor, on the other hand, often face severe discrimination -- they are mistreated and have their rights violated by those with power, especially if they are Indians of low caste.

Comparative intervention effectiveness is a pillar of EA, distinct from personal sacrifice, so the two are not interchangeable. I reject the idea that there is some sort of prejudice in choosing to help one group over another, whether the groups are defined by physical condition, location, etc. One always has to choose; no one can help every group. Taking the example of preventing blindness vs. assisting the blind, the former is clearly the wildly superior intervention for blindness, so it is absurd to call it prejudiced against the blind.

Comment author: PeterSinger 12 May 2017 11:31:18PM 6 points [-]

I don't understand the objection that it is "ableist" to say funding should go toward preventing people from becoming blind rather than training guide dogs.

If "ableism" is really supposed to be like racism or sexism, then we should not regard it as better to be able to see than to have the disability of not being able to see. But if people who cannot see are no worse off than people who can see, why should we even provide guide dogs for them? On the other hand, if -- more sensibly -- disability activists think that people who are unable to see are at a disadvantage and need our help, wouldn't they agree that it is better to prevent many people -- say, 400 -- experiencing this disadvantage than to help one person cope a little better with the disadvantage? Especially if the 400 are living in a developing country and have far less social support than the one person who lives in a developed country?

Can someone explain to me what is wrong with this argument? If not, I plan to keep using the example.

Comment author: Austen_Forrester 06 June 2017 10:10:58PM 0 points [-]

Peter, even if a trachoma operation cost the same as training a guide dog, and didn't always prevent blindness, it would still be an excellent cost comparison because vision correction is vastly superior to having a dog.

Comment author: concerned_ 12 March 2017 02:02:28AM *  4 points [-]

I mostly agree with you. It honestly does worry me that the mainstream EA movement has no qualms about associating with FRI, whose values, I would wager, conflict with those of the majority of humankind. This is one of the reasons I have drifted away from identifying with EA lately.

Self-styled “effective altruists” try to pass themselves off as benevolent, but the reality is that they themselves are one of the biggest threats to the world by promoting terrorism and anti-spirituality under the cloak of altruism.

It's a stretch to say FRI directly promotes terrorism; they make it clear on their website that they oppose violence and encourage cooperation with other (non-NU) value systems. The end result of their advocacy, however, may be less idealistic than they anticipate. (It's not too hard to imagine a negative utilitarian Kaczynski, if their movement gains traction. I think there's even a page on the FRI website where they mention that as a possible risk of advocating for suffering-focused ethics.)

I don't know what you mean by "anti-spirituality".

Comment author: Austen_Forrester 15 March 2017 02:24:17AM -2 points [-]

They encourage cooperation with other value systems to further their apocalyptic goals, but mostly to prevent others from opposing them. That is different from tempering "strong NU" with other value systems to arrive at more moderate conclusions.

LOOOOL at your optimism that people won't follow FRI's advocacy as purely as they want! Let's hope so, eh?

Comment author: inconvenient 11 March 2017 10:43:22PM 1 point [-]

If FRI were accurately characterized here, then do we know of other EA orgs that would promote mass termination of life? If not, then it is a necessary example, plain and simple.

Comment author: Austen_Forrester 15 March 2017 02:17:01AM 2 points [-]

It's the only negative utilitarianism promoting group I know of. Does anyone know of others (affiliated with EA or not)?

Comment author: Austen_Forrester 15 March 2017 02:03:18AM -1 points [-]

I know they don't actually come out and recommend terrorism publicly... but they sure go as far as they can to incite terrorism without being prosecuted by the government as a terrorist organization. Of course, if they were explicit, they'd immediately be shut down and jailed by the authorities.

I promise you this – all those who endorse this mass-termination-of-life ideology are going to pay a price. Whether through police action or public scrutiny, they will be forced to publicly abandon their position at some point. I implore them to do it now, of their own volition. No one will believe them if they conveniently change their minds about no-rules negative utilitarianism after facing public scrutiny or the law. Now is the time. I warned CEA about this years ago, yet they still promote FRI.

I actually respect austere population control to protect quality of life, even through seemingly drastic means such as forced sterilization (in extreme scenarios only, of course). However, atheists don't believe in any divine laws, such as the sin of killing, and are thus not bound by any rules. The type of negative utilitarianism popular in EA is definitely a brutal, no-rules, mass-killing-is-okay type. It is important to remember, also, that not everyone has good mental health. Some people have severe schizophrenia and could start a forest fire or kill many people to “prevent suffering” without thinking through all of the negative aspects of doing so. I think that the Future of Humanity Institute should add negative utilitarian atheism to its list of existential risks.

Anti-spirituality: it doesn't have anything to do with NU or FRI, and I probably should have left it out of my comment. It just means that many EAs use EA as a means to promote atheism/atheists. Considering that about 95% of the world's population are believers, they may take issue with this aspect of the movement.

Comment author: the_jaded_one 11 March 2017 03:24:00PM *  5 points [-]

Also, I am somewhat concerned that this comment has been downvoted so much. It's the only really substantive criticism of the article (admittedly it isn't great), and it is at -3, right at the bottom.

Near the top are several comments at +5 or something that are effectively just applause.

Comment author: Austen_Forrester 15 March 2017 01:54:55AM 0 points [-]

LOL. Typical of my comments: almost no upvotes, but I never receive any sensible counterarguments! People use the forum's vote system to persuade (by social proof) without having a valid argument. I have yet to vote on a comment (up or down) because I think people should think for themselves.

Comment author: Austen_Forrester 10 March 2017 05:44:00AM -1 points [-]

Those guiding principles are good. However, I wished you would include one that was against doing massive harm to the world. CEA endorses the “Foundational Research Institute,” a pseudo-think tank that promotes dangerous ideas of mass-termination of human and non-human life, not excluding extinction. By promoting this organization, CEA is promoting human, animal, and environmental terrorism on the grandest scale. Self-styled “effective altruists” try to pass themselves off as benevolent, but the reality is that they themselves are one of the biggest threats to the world by promoting terrorism and anti-spirituality under the cloak of altruism.