Comment author: Denkenberger 15 September 2018 06:29:42PM 8 points

Interesting idea, but I think GCR / X risk is further along: CSER has identified thousands of relevant papers, and there have been 3 special issues in the last 4 years. So I think GCR is ready for a journal (perhaps 2-4 issues per year). For cause prioritization, I would recommend doing a few special issues and seeing how they turn out; even that is a significant time commitment. One way to do it would be to have a special issue associated with an EA Global.

Comment author: Denkenberger 25 August 2018 05:54:15AM 1 point

Other long-term-focused EA-aligned organizations include the Future of Life Institute, the Global Catastrophic Risk Institute, the Alliance to Feed the Earth in Disasters, AI Impacts, the Berkeley Existential Risk Initiative, etc.

Comment author: Peter_Hurford 04 August 2018 10:14:37PM 18 points

I agree the far future is overwhelmingly important. However, I don't think it's been shown that focusing on the far future really is more cost-effective, even from a far-future point of view. I have a degree of epistemic uncertainty with wide error bars, such that I wouldn't be too surprised if MIRI turned out to be the most cost-effective, but I also wouldn't be too surprised if AMF turned out to be the most cost-effective. Right now, in my view, the case for the far future seems to rest on arguing that if you take a very large number and multiply it by some unknown probability of success, you must still get a large number -- and that isn't true. I'd like organizations like MIRI to back up the claim that they have a "medium probability" of success.

I personally tend to value being able to learn about causes and being empirical about how to do good. This makes it more difficult to work on far-future causes, due to the lack of feedback loops, but I don't think it's impossible (e.g., I like the approach being taken by AI Impacts, and through Rethink Priorities I'm now working to refine my own views on this).

I think this update to my skepticism post still represents my current position fairly well, though it is definitely due for another revision.

Overall, I definitely favor spending resources on x-risk reduction efforts. I'm even comfortable with roughly 50% of the EA movement's resources being spent on it, given that I sure wouldn't want to be wrong on this issue -- extinction seems like a tremendous downside! However, I'd prefer more effort be spent on learning what we can about the value of these efforts, and I also think it's not yet clear that poverty- or animal-focused interventions are not equally or more valuable.

Lastly, as a movement, we certainly can and should do more than one thing. We can fight x-risk while also fighting malaria. I think we'd have a stronger and more robust movement this way.

I hope to write more on this in the future, eventually.

Comment author: Denkenberger 08 August 2018 05:17:15AM 1 point

I agree that a lot of the value of work on X risk / the far future is value of information. But I argued here that the cost-effectiveness distribution of alternative foods for agricultural catastrophes, considering only the present generation, did not overlap with that of AMF. There very well could be flow-through effects from AMF to the far future, but I think it is hard to argue that they would be greater than those of actually addressing X risk. So if you do value the far future, I think it would be even harder to argue that the distributions of alternative foods and AMF overlap. There would be a similar result for AI vs AMF if you believe the model referred to here.

In response to Open Thread #40
Comment author: remmelt 08 July 2018 08:24:24PM 18 points

The EA Forum Needs More Sub-Forums

EDIT: Please go to the recent announcement post on the new EA Forum to comment.

The traditional discussion forum has sub-forums and sub-sub-forums where people in communities can discuss areas that they’re particularly interested in. The EA Forum doesn’t have these, and this makes it hard to filter for what you’re looking for.

On Facebook, on the other hand, there are hundreds of groups based around different cause areas, local groups and organisations, and subpopulations. There it’s also hard to start rigorous discussions around certain topics, because many groups are inactive and poorly moderated.

Then there are lots of other small communication platforms launched by organisations, which range in accessibility, quality standards, and moderation. It all kind of works, but it’s messy and hard to sort through.

It’s hard to start productive conversations on specialised niche topics with people internationally because

  • 1) Relevant people won’t find you easily within the mass of posts

  • 2) You’ll contribute to that mass and thus distract everyone else.

Perhaps this is a reason why some posts on specific topics only get a few comments even though the quality of the insights and writing seems high.

Examples of posts that we’re missing out on now:

  • Local group organiser Kate tried X career workshop format X times and found that it underperformed other formats

  • Private donor Bob dug into the documents of start-up vaccination charity X and wants to share preliminary findings with other donors in the global poverty space

  • Machine learning student Jenna would like to ask some specific questions on how the deep reinforcement learning algorithm of AlphaGo functions

  • The leader of animal welfare advocacy org X would like to share some local engagement statistics on vegan flyering and 3D headset demos before sending them off in a more polished form to ACE.

I'd be interested in any other examples you have. :-)

What to do about it?

I don’t have any clear solutions in mind for this (perhaps it could be made a key focus in the transition to the forum architecture of LessWrong 2.0). I just want to plant a flag here: given how much the community has grown compared to 3 years ago, people should start specialising more in the work they do, and our current platforms are woefully behind in facilitating discussions around that.

It would be impossible for one forum to handle all this adequately, and it seems useful for people to experiment with different interfaces, communication processes, and guidelines. Nevertheless, our current state seems far from optimal. I think some people should consider tracking down and paying for additional thoughtful, capable web developers to adjust the forum to our changing needs.

UPDATE: After reading @John Maxwell IV's comments below, I've changed my mind from a naive 'we should overhaul the entire system' view to a 'we should tinker with it in ways we expect would facilitate better interactions, and then see if they actually do' view.

In response to comment by remmelt on Open Thread #40
Comment author: Denkenberger 11 July 2018 01:46:23PM 4 points

I like that the forum is not split into sub-forums, so one can keep abreast of the major developments and debates in all of EA. I don't think there is so much content as to be overwhelming.

Comment author: Greg_Colbourn 18 June 2018 02:14:29PM 8 points

I've not yet had anyone say it's a dealbreaker (and of course people are allowed to buy meat from takeaways - or microwaveable burgers etc. - with their spending money if they are really craving it). Whilst frugality comes into it, the main reason for the all-vegan catering is ethics.

Also, I'd put money on the 2018 survey coming out with higher numbers for veg*anism :)

Comment author: Denkenberger 24 June 2018 04:13:29PM 0 points

Good backup plan. That's great that it has not been a dealbreaker for anyone.

Comment author: Denkenberger 18 June 2018 04:15:13AM 0 points

This says 20% of EAs are vegan or vegetarian, so I would guess less than 10% are vegan. Granted, the hardcore EAs you are attracting may be more likely to be vegan, and you are lowering the barrier if someone else is reading labels and is hopefully a good cook. But I still think you are really limiting your pool by having all meals vegan. I understand you want to be frugal, and vegan from scratch is cheaper, but animal-product substitutes are generally more expensive than animal products.

Comment author: Denkenberger 18 June 2018 03:58:00AM 0 points

Nice idea! The free health care in the UK helps make it low cost, though is there a probationary period for immigrants?

Comment author: Denkenberger 01 June 2018 11:33:17PM 5 points

This is a great project! I agree unfunded theses are a huge untapped resource. We were trying to do something similar with our essay contest on global agricultural catastrophes, but it was not very successful. Joshua Pearce and I have dozens of ideas for effective theses, so we will reach out. Minor comment: it is good to spell out the month, because date conventions differ between the US and Europe.

Comment author: Denkenberger 23 May 2018 04:14:09PM 1 point

This is very helpful. I would note that the Global Catastrophic Risk Institute works on AI and is funding-constrained. Of course it also does other X risk work, but I think it would be good to broaden your category to include it, or to have a separate category.

Comment author: Denkenberger 20 May 2018 07:59:23PM 1 point

Interesting - so interventions that do well both for the long-term future and for humans today, like AI and alternate foods, would do very well by your numbers.
