Comment author: DavidNash 10 July 2017 09:04:47PM 0 points [-]

Might it be that 80k recommend X-risk because it's neglected (even within EA), and that if more than 50% of EAs had X-risk as their highest priority it would no longer be as neglected?

Comment author: Evan_Gaensbauer 10 July 2017 09:47:05PM -1 points [-]

I don't think that'd be the case. From the inside perspective of someone already prioritizing x-risk reduction, that cause can appear at least thousands of times more important than literally anything else. This is based on an idea formulated by philosopher Nick Bostrom: astronomical stakes (the speaker in the linked video is Niel Bowerman, not Nick Bostrom). The ratio of resources x-risk reducers think ought to be dedicated to x-risk relative to other causes is arbitrarily high. Lots of people think the argument is missing important details or ignoring major questions, but I think x-risk reducers probably won't be convinced by that from their own inside view. More effective altruists could try playing the double crux game to find the source of disagreement about typical arguments for far-future causes. Otherwise, x-risk reducers would probably maintain that, ideally, as many resources as possible ought to be dedicated to x-risk reduction, though in practice they may endorse other viewpoints receiving support as well.

Comment author: lukeprog 24 June 2017 11:04:21PM 1 point [-]

Sure. In that case, I won't reply to them (if they aren't posted directly to the AMA) until the AMA is "winding down," or something.

Comment author: Evan_Gaensbauer 25 June 2017 07:40:09AM 0 points [-]

Hey, that sounds great to me. Thanks. Here's my question.

Do you think science or philosophy can meaningfully separate the capacity to experience suffering or pain from however else consciousness is posited to be distributed across species? What would be fruitful avenues of research for effective altruists to pursue if it's possible to solve the problem in the first question, without necessarily addressing whatever remains of consciousness?

Comment author: Evan_Gaensbauer 24 June 2017 10:13:07AM 0 points [-]

May we submit questions here to be asked on our behalf if we don't think we'll be free on Wednesday to ask during the AMA live?

Comment author: Evan_Gaensbauer 15 June 2017 07:15:37AM 1 point [-]

There are ongoing controversies in EA, even if they're not obvious. That is, there are lots of ongoing debates in EA that flare up occasionally but remain unresolved in terms of what concessions different effective altruists think our community ought to be willing to make. I might cover some of those in the near future, and I'll cite this blog post. Having the outline of an argument established here on the forum in a neutral fashion beforehand will be helpful when I cover those object-level controversies. Thanks for writing this.

In response to Political Ideology
Comment author: Evan_Gaensbauer 28 May 2017 06:11:32AM *  0 points [-]

It seems that if the utility of a political ideology rests on a mindset that generates the outcomes producing the greatest average well-being, but working through political party machines to change society via policy isn't an efficient enough process, then the best thing to do is to get upstream of politics: to work on the cultural changes that allow these attitudes to permeate democratic societies and that engender and incentivize, all other things being equal, optimal lifestyle outcomes.

Societies throughout Africa and most of Asia still seem largely religious or traditional, while societies in the Americas, Europe, East Asia and Australia are more secular and have a stronger civic culture generating their values. I think there's a state of heightened political tension across the world today, and the small number of extremists who typically blend in with broader partisan coalitions are exploiting the opportunity to push an anti-humanistic agenda. Part of what paves the way for violence is dehumanizing one's political opponents with propaganda. That's happening on the political fringes in North America today. There are lots of allegations of this reported in the news from Europe as well, but honestly I don't know what counts as reliable from this far overseas. News might travel far, but accurate reporting doesn't. The accusations of political violence in North America I know of are based on multiple local news sources, eye-witness accounts, video footage, and things one can plainly see with one's own eyes by visiting these places. This has been my experience living in Vancouver, Canada, having lived on the west coast my whole life. I guess it's like that regionally all over the map, but virtually any blog or news source beyond the local level distorts real events to the point it's practically impossible to substantiate any allegations.

However, it's also been my experience that most people repudiate this type of behaviour, and it appears to be staying out of the party machine. I can't find the link now, but I read a blog post by economist Bryan Caplan arguing that, for all the hype in the news, what boils down to apparently no more than a few thousand extremists on either side of the political divide in North America doesn't pose much of a real threat to a society of hundreds of millions who'd sooner turn such people out than condone their behaviour. That's the gist of the argument, and I found it more or less convincing. Certainly, everyone ought to condemn the instigation of political violence, and not tolerate propaganda promoting it as a legitimate means of activism either. All I'm saying is that it doesn't hold up as an argument against getting involved in politics as a form of effective altruism.

I think this is a major consideration against political involvement in the world today. This speaks to what I know to be true of North America. For the other continents I mentioned, where political action may be a viable way to improve human well-being at this point in history, one will have to defer to someone who knows better. I definitely encourage people with that sort of knowledge to speak up; I only know enough to speak about politics in North America. However, I think the inefficiency of political involvement or activism is so great that it alone demonstrates that, for most people and movements, the opportunity costs of diverting resources to politics will be too high.

In the Americas, there's still a substantial percentage of deeply religious people among the populations of democratic societies. It's a commonly accepted belief that the political base most inclined to pursue what it sees as superior well-being outcomes through policy action, and thus through electoral politics and political activism, is the socially conservative religious right. However, given that effective altruism takes a stance of moral pluralism, there are certain shibboleths of religious/social conservatism that much of the religious public wouldn't be willing to compromise on, which would have to remain tolerated in the EA community. So, getting involved in that sort of political action is impracticable from a realistic EA perspective at present.

However, there are plenty of organizations and communities in and around the effective altruism movement taking an approach of engendering changes in cultural attitudes to set the stage for later policy reform and political activism. Sentience Politics is currently doing this work in German-speaking Europe and is trying to expand into other countries. The Life You Can Save and Giving What We Can function as projects that do so, and efforts to fund EA movement growth, promoting and spreading values which in practice lead to significant lifestyle change, are abundant. One cause which stands out as not being correlated with favouring cultural change is x-risk reduction. However, the base of support for that cause also largely comes out of the rationality movement, and there are lots of people in the rationality community already supporting, promoting and creating projects aimed at changing society in a manner upstream or outside of politics. I haven't talked to anyone enough to figure out which projects would appeal best or most to effective altruism in this regard.

Comment author: Evan_Gaensbauer 24 March 2017 09:22:31PM 4 points [-]

Upvoted. Maybe this is just what's typical for academic style, but when you address Gabriel it seems you're attributing the points of view presented to him, while when I read the original paper I got the impression he was collating and clarifying some criticisms of effective altruism so they'd be coherent enough to effect change. One thing about mainstream journalism is that it's usually written for a specific type of audience the editor has in mind, even if the source claims to be seeking a general audience, and so they spin things a certain way. While criticisms of EA definitely aren't what I'd call sensationalistic, they're written in the style of a rhetorical list of likes and dislikes about the EA movement. It's taken for granted that the implied position of the author is somehow a better alternative than what EA is currently doing, as if no explanation is needed.

Gabriel fixes this by writing up the criticisms of EA in a way that lets us understand what about the movement would need to change to satisfy critics, if we were indeed to agree with them. Really, except for the pieces published in the Boston Review, I feel like other criticisms of EA were written not for EA at all, but rather as a review of EA for other do-gooders, a warning to stay away from the movement. It's not the job of critics to solve all our problems for us, but for a movement that is at least willing to try to change in the face of criticism, it's frustrating that nobody takes us up on the opportunity, points out what blindspots we may have, and tries to be constructive.

In response to Open Thread #36
Comment author: Evan_Gaensbauer 17 March 2017 03:26:28AM 3 points [-]

As people age their lives become more difficult. Physically and mentally, they just aren't where they previously were. Most effective altruists are younger people, and they may not take into consideration how risky it can be to not have any savings cushion in the case things change. We can't necessarily count on pension plans to cover us in our old age. We can't assume our health will always be what it is now. A lot of people will face harder times in the future, and being put in the mindset of assuming one won't face personal hardship, so one need not save money, is reckless.

It's one thing if someone aspires to be wealthy, retire at age 30 like Mr. Money Mustache, or live a luxurious retirement. But it's dangerous to create a culture in EA where people might be accused of hypocrisy for saving even enough for retirement to cover their own basic living expenses. It's also dangerous for us to presume that each of our lives will go so easily that we can work until we die, or that we won't get sick. While talking about these things in the abstract may be well and good, I want to register my conviction that using social influence, i.e., peer pressure, alone to normalize "don't/no need to save for retirement" as practical advice among effective altruists is potentially dangerous.

Comment author: RobBensinger 07 February 2017 10:36:23PM -1 points [-]

Anonymous #2:

I'd prefer it if more people in EA were paid on a contract basis, if more people were paid lower salaries, if there were more mechanisms for the transfer of power in organizations (e.g., a 2- or 3-year term limit for CEOs and a maximum age at entry), and if there were more direct donations. Also: better systems to attract young people. More people in biology. More optimism. More willingness to broadcast arguments against working on animal welfare that have not been refuted.

Comment author: Evan_Gaensbauer 22 February 2017 05:48:12AM 1 point [-]

I originally downvoted this comment, because some of the suggestions obviously suck, but some of the points here could be improved.

I'd prefer it if more people in EA were paid on a contract basis.

There are a lot of effective altruists who have just as good ideas as anyone working at an EA non-profit, or a university, but due to a variety of circumstances, they're not able to land those jobs. Some effective altruists already run Patreons for their blogs, and I think the material coming out of them is decent, especially as they can lend voices independent of institutions on some EA subjects. Also, they have the time to cover or criticize certain topics other effective altruists aren't since their effort is taken up by a single research focus.

I'd prefer it if more people were paid lower salaries.

Nothing can be done about this criticism if some numbers aren't given. Criticizing certain individuals for getting paid too much, or certain organizations for paying their staff too much, isn't actionable unless one gets specific. I know EA organizations whose staff, including the founders who decide the budget, essentially get paid minimum wage. On the other hand, GiveWell's cofounders Holden and Elie get paid well into the six figures each year. While I don't myself much care, I've privately chatted with people who perceive this as problematic. Then, there may be some staff at some EA organizations who appear to others to get paid more than they deserve, especially when their salaries could pay for one or more full-time salaries for other individuals perceived to be just as competent. That last statement was full of conditionals, I know, but it's something I'm guessing the anonymous commenter was concerned about.

if there were more mechanisms for the transfer of power in organizations (e.g., a 2- or 3-year term limit for CEOs and a maximum age at entry),

Again, they'd need to be specific about what organization they're talking about. The biggest problem with this comment is that the commenter made broad, vague generalizations which aren't actionable. It's uncomfortable to make specific criticisms of individuals or organizations, yes, but the point of an anonymous criticism is to be able to do that, if it's really necessary, with virtual impunity, while bad commentary that amounts to character assassination can easily be written off without a flamewar ensuing or feelings getting as hurt.

Anyway, I too can sympathize with demands for more accountability, governance and oversight at EA organizations. For example, many effective altruists have been concerned time and again with the influence of major organizations like the Centre for Effective Altruism which, even if it's not their intent, may be perceived to represent and speak for the movement as a whole. This could be a problem. However, while EA need not be a social movement predicated on and mediated through registered NPOs, it by and large is and will continue to be in practice, as are most social movements that are at all centralized. Making special asks for changes in governance at these organizations to become more democratic, without posting the suggestions directly to the EA Forum and making them consistent with how NPOs actually operate in a given jurisdiction, will just not result in change. These suggestions really stand out considering they're more specific than anything I've seen anyone else call for, as if this were a desperate problem in EA, when I've seen similar sentiments expressed at most as vague concerns on the EA Forum.

and if there were more direct donations.

The EA Forum and other channels like the 'Effective Altruism' Facebook group appear dominated by fundraisers and commentary on and from metacharities, because those are literally some of the only appropriate outlets for metacharities to fundraise or publish transparency reports. Indeed, that they're posting material besides fundraisers beyond their own websites is a good sign, as it's the sort of transparency and peer review the movement at large would demand of metacharities. Nonetheless, between this and constant chatter about metacharities on social media, I can see how the perception arises that most donations are indirect and go to metacharities. However, this may be illusory. The 2015 EA Survey, the latest for which results are available, shows effective altruists overwhelmingly donate to GiveWell's recommended charities. Data isn't available on the amounts of money self-identified effective altruists are moving to each of these charities. So, it's possible lots of effective altruists earning to give are making primarily indirect donations. Anecdotally, though, this doesn't seem to be the case. If one wants to make that case, and then mount a criticism based on it, one must substantiate it with evidence.

Comment author: Evan_Gaensbauer 30 January 2017 09:23:42PM 1 point [-]

If effective altruists aren't perfect utilitarians because they're human, and humans can't be perfect utilitarians because they're human, maybe the problem is effective altruists trying to be perfect utilitarians despite their inability to do so, and that's why they make mistakes. What do you think of that?

Comment author: RyanCarey 12 January 2017 09:54:57PM 8 points [-]

I strongly agree with the points Ben Hoffman has been making (mostly in the other threads) about the epistemic problems caused by holding criticism to a higher standard than praise. I also think that we should be fairly mindful that providing public criticism can have a high social cost to the person making the criticism, even though they are providing a public service.

This is completely true.

I personally have a number of criticisms of EA (despite overall being a strong proponent of the movement) that I am fairly unlikely to share publicly, due to the following dynamic: anything I write that wouldn't incur unacceptably high social costs would have to be a highly watered-down version of the original point, and/or involve so much of my time to write carefully that it wouldn't be worthwhile.

There are at least a dozen people for whom this is true.

Comment author: Evan_Gaensbauer 13 January 2017 01:47:15PM 3 points [-]

I feel like this is true for me too, though I'd guess I've got more spare time on my hands than you guys, and I also don't currently work for any EA charities. It's really hard to make your beliefs pay rent when you're in near mode, constantly worried that if you screw up a criticism you'll lose connections and get ostracized, or that you'll hurt the trajectory of a cause or charity you like by association, because as much as we like to say we're debiased, a lot of the time affective rationalizations sneak into our motivations. We all come from different walks of life, and a lot of us haven't been in communities trying to be as intellectually honest and epistemically virtuous as EA tries to be. It's hard to let our guard down, because everywhere else we go in life our new ideas are treated utterly uncharitably, worse than anything in EA on a regular day. It's hard to unlearn those patterns. We as a community need to find ways to trust each other more. But that takes a lot of work, and will take a while.

In the meantime, I don't have a lot to lose by criticizing EA, or at least I can take a hit pretty well. Maybe there are social opportunity costs, things I won't be able to do in the future if I become low-status, but I'm confident I'm the sort of person who can create new opportunities for himself. So I'm not worried about me, and I don't think anyone else should be either. I've never had a cause selection. Honestly, it felt weird to talk about, but this whole model-uncertainty-between-causes thing people are going for now is something I've implicitly grasped the whole time. I never understood why everyone was so confident in their views on causes when a bunch of this stuff requires figuring out things about consciousness, or the value of future lives, which seem like some philosophically and historically mind-boggling puzzles to me.

If you go to my EA Hub profile, you'll notice the biggest donation I made was in 2014: $1,000 to GiveWell for unrestricted funds. That was because I knew those funds would increase the pool of money for starting the Open Philanthropy Project, and it was matched. You'll also notice I select pretty much every cause as something to consider, as I'm paranoid about myself or EA in general missing out on important information. All I can say about my politics is that I'm a civil libertarian, and otherwise I don't get offended by reading things written by people who want to improve EA in earnest. I hope you'll take my word that I didn't just edit my EA Hub profile now. That's what I've got for a badge to show I really try to stay neutral.

If anyone wants to privately and/or anonymously send me their thoughts on an EA organization and what it's doing wrong, no matter what it is, I'll give my honest feedback, and we can have a back and forth and hopefully hammer something out to be published. I also don't particularly favour any EA org right now, as I feel like a lot of these organizations are staffed by people who've only been in academia or the software industry, or who are sometimes starting non-profits right out of college, and who might just not have the type or diversity of experience to make good plans and models on their own, or the skills for dealing with different types of people and getting things done. I've thought for a while that all these organizations at different points have made little or big mistakes, which are really hard to talk about in public, and it feels a bit absurd to me that they're never talked about.

Feel free to send me stuff. Please don't send me stuff about interpersonal drama. Treat what you send me like filing a bug report.
