Opportunities for individual donors in AI safety

Tl;dr: Over the past few years, the availability of funding for AI safety projects has increased significantly. Yet high-impact giving opportunities for individual donors remain. In this post I review the recent history of AI safety organizations, speculate on the ways in which funding early AI...
Comment author: alexflint 07 September 2015 12:44:35AM 1 point

So we get astronomical stakes by multiplying a large amount of time by a large amount of space to get a large light cone of potential future value. Interventions that work along only one of those dimensions -- say, burying a single computer deep underground that generates one utilon per year for the life of the universe, or somehow granting a one-time utilon to every human alive in the year 1 billion -- are dominated by interventions that affect the product of space and time (e.g. the interventions you listed here). But if there were just one more dimension to multiply, then interventions that addressed the product of all three might dominate all considerations that we currently think about.
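To make this concrete, here is a rough formalization (the notation is mine, not from the original post): write total future value as a value density integrated over the reachable region of space and over time, and compare how much each kind of intervention can add.

```latex
% Rough formalization (notation mine): total value V is a value
% density u integrated over the reachable region of space X(t)
% and over the remaining time horizon T.
\[
  V \;=\; \int_{0}^{T} \int_{X(t)} u(x, t) \, dx \, dt
\]
% An intervention along one dimension adds roughly O(T) value
% (the buried computer) or O(|X|) value (the one-off grant),
% while an intervention that raises u everywhere adds value
% O(|X| \cdot T), which dominates as both factors grow large.
```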

Comment author: alexflint 07 September 2015 12:33:39AM 3 points

Or maybe we could invest in server capacity in readiness for an EM future.

This one seemed out of place to me. Conditional on the time we start expanding and the rate at which we expand, we will have access to some fixed set of resources at any given point in the future, so I don't see how investing in server capacity now affects our server capacity in the far future. (Though I do agree that moving the start time earlier or increasing the rate of expansion could be permanent improvements.)
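As a back-of-the-envelope sketch of why (the model and its assumptions are mine, not from the thread): if expansion starts at time t_0 and proceeds at speed v, the resources reachable at a later time t are determined by those two parameters alone, independent of any servers purchased today.

```latex
% Back-of-the-envelope model (assumptions mine): resources
% reachable by time t, if expansion starts at t_0 and proceeds
% at speed v, scale with the volume swept out so far, times a
% mean resource density rho.
\[
  R(t) \;\propto\; \rho \cdot \tfrac{4}{3}\pi \, \bigl( v \, (t - t_0) \bigr)^{3}
\]
% R(t) depends only on t_0, v, and rho. Buying servers now
% changes none of these; only an earlier t_0 or a larger v
% shifts far-future capacity.
```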

Establishing norms that will protect biological humans and EMs from Hansonian competition, like a right to retire. If uploads are not conscious, it might be important to agree on this before EMs massively outnumber biological humans; after that point it would become much harder.

These seem to be about simply picking the right policies now and locking them in. It might also be important to lock in the right policies vis-a-vis privacy, the death penalty, property rights, and so on, but why should we think that we can lock such policies in now? This reduces to either "minimize value drift" or "create a singleton", both of which I agree with, but which you already listed.

Comment author: alexflint 21 January 2015 04:49:25PM 0 points

I'm going to be in Ottawa January 27 to February 2. Would anyone be interested in meeting up?

Comment author: vollmer 21 January 2015 08:40:22AM 3 points

Personal experience: Most considerate housemates ever, and not at all cultish ... though it depends a lot on the people :)

I personally think it's not that helpful to have an additional system for EA housing, since it's not so hard to get in touch via a local group or chapter and look for other interested people there.

Comment author: alexflint 21 January 2015 04:47:45PM 2 points

My experience over the last few days has been that finding potential roommates in the Bay Area is much harder than it needs to be. There is a LessWrong housing spreadsheet, but many EAs are not aware of it. There are many different EA groups and websites where Bay Area EAs hang out (I have a list of 11). I've reached out to dozens of people directly and been introduced to dozens more, which is super awesome, but it's by no means efficient.

Comment author: alexflint 20 January 2015 11:49:21PM 2 points

Thanks, Tom. I've posted at skillshare.

I'd also be interested in whether folks think that a dedicated housing/roommate spreadsheet for EAs would be helpful.

In response to January Open Thread
Comment author: alexflint 20 January 2015 09:15:59PM 3 points

Is there any way for EAs who are looking for housemates to find each other?

Living with other EAs is a really powerful way to strengthen the community, and finding like-minded housemates can also have a real financial benefit for many of us.

If there's nothing already out there, then I'm going to make something.