
Peter_Hurford comments on My current thoughts on MIRI's "highly reliable agent design" work - Effective Altruism Forum


Comment author: Peter_Hurford (EA Profile) 09 July 2017 12:48:26AM 5 points

If one disagreed with an HRAD-style approach for whatever reason but still wanted to donate money to maximize AI safety, where should one donate? I assume the Far Future EA Fund?

Comment author: Daniel_Dewey 10 July 2017 07:27:24PM 3 points

I am very bullish on the Far Future EA Fund, and donate there myself. There's one other possible nonprofit that I'll publicize in the future if it gets to the stage where it can use donations (I don't want to hype this up as an uber-solution, just a nonprofit that I think could be promising).

I unfortunately don't spend a lot of time thinking about individual donation opportunities, and the things I think are most promising often get partly funded through Open Phil (e.g. CHAI and FHI). Still, I think diversifying the funding sources for orgs like CHAI and FHI is valuable, so I'd consider them as well.

Comment author: LawrenceC 23 July 2017 05:24:51AM 3 points

Not super relevant to Peter's question, but I would be interested in hearing why you're bullish on the Far Future EA Fund.

Comment author: WillPearson 09 July 2017 10:59:40AM 2 points

On the meta side of things:

I found AI Impacts recently. There is a group I am loosely affiliated with that is trying to make a MOOC about AI safety.

If you care about doing something about risks of immense suffering (s-risks), you might like the Foundational Research Institute.

There is an overview of other charities, but it is more favourable towards HRAD-style work.

I would like to set up an organisation that studies autonomy and our response to making more autonomous things (especially with regard to administrative autonomy). I have a book slowly brewing, so if you are interested in that, get in contact.