Comment author: Milan_Griffes 03 October 2017 01:22:58AM 2 points [-]

Minor thing: it'd be helpful if people who downvoted commented with their reason why.

Comment author: Evan_Gaensbauer 03 October 2017 03:10:49AM *  3 points [-]

Presumably it's either because they don't think this sort of drug policy reform is promising or, more likely, because they don't think an announcement of conferences exclusive to what is still only a minor cause in the effective altruism community justifies its own post on the EA Forum.

Based on our investigation so far, US drug policy reform appears to be an impactful and tractable cause area.

Some users might just not visit the Forum often enough to have heard of Enthea's work before, so you could edit the post and add some hyperlinks to your other posts on the EA Forum so everyone will know the context of this post.

Comment author: Evan_Gaensbauer 26 August 2017 09:24:51PM 5 points [-]

Another specific part of life that isn't replicable for lots of effective altruists as compared to others is being fully able-bodied, or being in good health. One common but largely unspoken facet of life is that lots of people have problems with physical or mental illness which either cost money or hinder their ability to earn money as they otherwise would have been able to. So, including opportunity costs, the costs of health problems can be quite steep. This is the number one thing I think would affect all kinds of people, and so is a primary consideration when figuring out what necessary, fixed costs a budget would include in addition to the template provided above.

In response to EAGx Relaunch
Comment author: Evan_Gaensbauer 04 August 2017 07:17:55AM 1 point [-]

What's the timeframe in which CEA will be accepting applications to host/organize EAGx events?

Comment author: Linch 13 July 2017 05:56:57AM 0 points [-]

This seems like a perfectly reasonable comment to me. Not sure why it was heavily downvoted.

Comment author: Evan_Gaensbauer 14 July 2017 08:09:15AM 0 points [-]

Talking about people in the abstract, or in a tone that casts them as some kind of "other", is to generalize and stereotype. Or maybe generalizing and stereotyping people others them, and makes them too abstract to empathize with. Whatever the direction of causality, there are good reasons people might take my comment poorly. There are lots of skirmishes online in effective altruism between causes, and I expect most of us don't like all being lumped together in a big bundle, because under those circumstances at least a bunch of people in your ingroup will feel strawmanned. That's what my comment reads like. That's not my intention.

I'm just trying to be frank. On the Effective Altruism Forum, I try to follow Grice's maxims, because I think writing in that style heuristically optimizes the fidelity of our words to the sort of epistemic communication standards the EA community aspires to, especially as inspired by the rationality community. I could do better on the maxims of quantity and manner/clarity sometimes, but I think I do a decent job on here. I know this isn't the only thing people value in discourse. However, there are lots of competing standards for what the most appropriate discourse norms are, and nobody is showing others how their preferred norms would not just maximize the satisfaction of their own preferences, but maximize the total or average satisfaction of what everyone values out of discourse. That would seem to be the utilitarian thing to do.

The effects of ingroup favouritism in terms of competing cause selections in the community don't seem healthy to the EA ecosystem. If we want to get very specific, here's how finely the EA community can be sliced up by cause-selection-as-group-identity.

  • vegan, vegetarian, reducetarian, omnivore/carnist
  • animal welfarist, animal liberationist, anti-speciesist, speciesist
  • AI safety, x-risk reducer (in general), s-risk reducer
  • classical utilitarian, negative utilitarian, hedonic utilitarian, preference utilitarian, virtue ethicist, deontologist, moral intuitionist/none-of-the-above
  • global poverty EAs; climate change EAs?; social justice EAs...?

The list could go on forever. Everyone feels like they're representing not only their own preferences in discourse, but sometimes even those of future generations, all life on Earth, tortured animals, or fellow humans living in agony. Unless as a community we make a conscientious effort to reach towards some shared discourse norms which are mutually satisfactory to multiple parties or individual effective altruists, however they see themselves, communication failure modes will keep happening. There's strawmanning and steelmanning, and then there are representations of concepts in EA which fall in between.

I think if we as a community expect everyone to impeccably steelman everyone all the time, we're being unrealistic. Rapid growth of the EA movement is what organizations from various causes seem to be rooting for. That means lots of newcomers who aren't going to read all the LessWrong Sequences or Doing Good Better before they start asking questions and contributing to the conversation. When they get downvoted for not knowing the archaic codex that is evolved EA discourse norms, which aren't written down anywhere, they're going to exit fast. I'm not going anywhere, but if we aren't willing to be more charitable to people we at first disagree with than they are to us, this movement won't grow. People might be belligerent towards, or alarmed by, the challenges EA presents to their moral worldview, but they're still curious. Spurning doesn't lead to learning.

All of the above refers only to specialized discourse norms within effective altruism. This would be on top of the complexity of effective altruists' private lives, all the usual identity politics, and otherwise the common decency and common sense we would expect of posters on the forum. All of that can already be difficult for diverse groups of people as is. But for all of us to go around assuming the illusion of transparency makes things fine and dandy with regard to how a cause is represented, without openly discussing it, is to expect too much of each and every effective altruist.

Also, as of this comment, my parent comment above has net positive 1 upvote, so it's all good.

Comment author: DavidNash 10 July 2017 09:04:47PM 0 points [-]

Might it be that 80k recommend x-risk because it's neglected (even within EA), and that if more than 50% of EAs had x-risk as their highest priority it would no longer be as neglected?

Comment author: Evan_Gaensbauer 10 July 2017 09:47:05PM -1 points [-]

I don't think that'd be the case, as from inside the perspective of someone already prioritizing x-risk reduction, that cause can appear at least thousands of times more important than literally anything else. This is based on an idea formulated by philosopher Nick Bostrom: astronomical stakes (this is Niel Bowerman in the linked video, not Nick Bostrom). The ratio of resources x-risk reducers think ought to be dedicated to x-risk relative to other causes is arbitrarily high. Lots of people think the argument is missing some important details, or ignoring major questions, but I think from their own inside view x-risk reducers probably won't be convinced by that. More effective altruists could try playing the double crux game to find the source of disagreement about typical arguments for far-future causes. Otherwise, x-risk reducers would probably maintain that, in the ideal, as many resources as possible ought to be dedicated to x-risk reduction, but in practice they may endorse other viewpoints receiving support as well.

Comment author: lukeprog 24 June 2017 11:04:21PM 1 point [-]

Sure. In that case, I won't reply to them (if they aren't posted directly to the AMA) until the AMA is "winding down," or something.

Comment author: Evan_Gaensbauer 25 June 2017 07:40:09AM 0 points [-]

Hey, that sounds great to me. Thanks. Here's my question.

Do you think science or philosophy can meaningfully separate the capacity to experience suffering or pain from however else consciousness is posited to be distributed across species? What would be fruitful avenues of research for effective altruists to pursue if it's possible to solve the problem in the first question, without necessarily addressing whatever remains of consciousness?

Comment author: Evan_Gaensbauer 24 June 2017 10:13:07AM 0 points [-]

May we submit questions here to be asked on our behalf if we don't think we'll be free on Wednesday to ask during the AMA live?

Comment author: Evan_Gaensbauer 15 June 2017 07:15:37AM 1 point [-]

There are ongoing controversies in EA, even if they're not obvious. That is, there are lots of ongoing debates in EA that flare up occasionally, but are unresolved in terms of what concessions different effective altruists think our community ought to be willing to make. I might cover some of those in the near future, and I'll cite this blog post. This is valuable in that I'd be covering object-level controversies, and having the outline of an argument established on the forum here in a neutral fashion beforehand will be helpful. Thanks for writing this.

In response to Political Ideology
Comment author: Evan_Gaensbauer 28 May 2017 06:11:32AM *  0 points [-]

It seems that if the utility of a political ideology rests on a sort of mindset that generates outcomes producing the greatest average well-being, but getting involved in political party machines to change society through policy isn't an efficient enough process, then the best thing to do is to get upstream of politics, to whatever produces the cultural changes that allow these attitudes to permeate democratic societies and that engender and incentivize, all other things being equal, optimal lifestyle outcomes.

Societies throughout Africa and most of Asia still seem largely religious or traditional, while societies in the Americas, Europe, East Asia and Australia are more secular and have a stronger civic culture generating their values. I think there's a state of heightened tension in politics across the world today, and the small number of extremists who typically blend in with broader partisan coalitions are exploiting the opportunity to push an anti-humanistic agenda. Part of what sits upstream of violence, paving the way for it, is dehumanizing one's political opponents with propaganda. That's something happening on the political fringes in North America today. There are lots of allegations of this reported in the news in Europe as well, but honestly I don't know what counts as reliable from this far overseas. News might travel far, but accurate reporting doesn't. The accusations of political violence in North America I know of are based on multiple local news sources, eye-witness accounts, video footage, and things one can plainly see with one's own eyes by visiting these places. This has been my experience living in Vancouver, Canada, having lived on the west coast my whole life. I guess it's like that regionally all over the map, but virtually any blog or news source beyond the local level distorts real events to the point it's practically impossible to substantiate any allegations.

However, it's also been my experience that most people repudiate all this type of behaviour, and it appears to be staying out of the party machine. I can't find the link now, but I read a blog post by economist Bryan Caplan arguing that, for all the hype in the news, what amounts to apparently no more than a few thousand extremists on either side of the political divide in North America doesn't pose much of a real threat to a society of hundreds of millions who'd sooner turn such people out than condone their behaviour. That's the gist of the argument, and I found it more or less convincing. Certainly, everyone ought to condemn the instigation of political violence, and not tolerate propaganda that would promote it as a legitimate means of activism either. However, all I'm saying is that it doesn't hold up as an argument against getting involved in politics as a form of effective altruism.

I think this is a major consideration against political involvement in the world today. This speaks to what I know to be true of North America. For the other continents I mentioned, where political action may be a viable way to improve human well-being at this point in history, one will have to defer to someone who knows better. I definitely encourage people with that sort of knowledge to speak up. I only know enough to speak about politics in North America. However, I think the inefficiency of political involvement or activism is so great that this alone suggests that, for most people and movements, the opportunity costs of diverting resources to politics will be too high.

In the Americas, there's still a sufficient percentage of deeply religious people in democratic societies. It's a commonly accepted belief that the political base inculcating attitudes towards superior well-being outcomes through policy action, and thus through electoral politics and political activism, is the socially conservative religious right. However, given that effective altruism takes a stance of moral pluralism, there are certain shibboleths of religious/social conservatism which much of the religious public wouldn't be willing to compromise on, and which would necessarily have to remain tolerated in the EA community. So, getting involved in that sort of political action is impracticable from a realistic EA perspective at present.

However, there are plenty of organizations and communities in and around the effective altruism movement which are taking an approach of engendering changes in cultural attitudes to set the stage for later policy reform and activism in politics. Sentience Politics is doing this work in German-speaking Europe, and is currently trying to expand into other countries. The Life You Can Save and Giving What We Can function as such projects, and efforts funding the growth of the EA movement to promote and spread values which in practice lead to significant lifestyle change are abundant. One cause which stands out as uncorrelated with favouring cultural change is x-risk reduction. However, the base of support for that cause also largely comes out of the rationality movement, and there are lots of people in the rationality community already supporting, promoting and creating projects aimed at changing society in a manner upstream or outside of politics. I haven't talked to anyone enough to figure out which projects would appeal best to effective altruism in this regard.

Comment author: Evan_Gaensbauer 24 March 2017 09:22:31PM 4 points [-]

Upvoted. Maybe this is just typical of academic style, but when you address Gabriel it seems you're attributing the points of view presented to him, while when I read the original paper I got the impression he was collating and clarifying some criticisms of effective altruism so they'd be coherent enough to effect change. One thing about mainstream journalism is that it's usually written for a specific type of audience the editor has in mind, even if the source claims to be seeking a general audience, and so it spins things a certain way. While criticisms of EA definitely aren't what I'd call sensationalistic, they're written in the style of a rhetorical list of likes and dislikes about the EA movement. It's taken for granted that the implied position of the author is somehow a better alternative than what EA is currently doing, as if no explanation is needed.

Gabriel fixes this by writing the criticisms of EA up in a way that lets us understand what about the movement would need to change to satisfy critics, if we were indeed to agree with them. Really, except for the pieces published in the Boston Review, I feel like other criticisms of EA were written not for EA at all, but rather as a review of EA for other do-gooders, a warning to stay away from the movement. It's not the job of critics to solve all our problems for us, but as a movement that is at least willing to try to change in the face of criticism, it's frustrating that nobody takes us up on the opportunity, given whatever blindspots we may have, and tries to be constructive.
