Comment author: Richard_Batty 03 February 2017 04:17:41PM *  6 points [-]

Is there an equivalent to 'concrete problems in AI' for strategic research? If I was a researcher interested in strategy I'd have three questions: 'What even is AI strategy research?', 'What sort of skills are relevant?', 'What are some specific problems that I could work on?' A 'concrete problems'-like paper would help with all three.

Comment author: Sebastian_Farquhar 04 February 2017 10:59:06AM 3 points [-]

This is a really good point, and I'm not sure that something exists which was written with that in mind. Daniel Dewey wrote something which was maybe a first step on a short form of this in 2015. A 'concrete-problems' in strategy might be a really useful output from SAIRC.

http://globalprioritiesproject.org/2015/10/three-areas-of-research-on-the-superintelligence-control-problem/

Comment author: kbog 03 February 2017 04:19:57PM 1 point [-]

Which organizations exactly were included? Can you give a list of the 50?

Comment author: Sebastian_Farquhar 04 February 2017 10:49:42AM 3 points [-]

Changes in funding in the AI safety field

This article is cross-posted from the CEA blog. The field of AI Safety has been growing quickly over the last three years, since the publication of “Superintelligence”. One of the things that shapes what the community invests in is an impression of what the composition of the field currently is, and... Read More
Comment author: Sebastian_Farquhar 25 February 2016 10:19:29AM 9 points [-]

Often (in EA in particular) the largest cost of a failed project isn't borne by you, but is a hard-to-see counterfactual impact.

Imagine I believe that building a synth bio safety field is incredibly important. Without a real background in synth bio, I go about building the field, but because I lack context and subtle field knowledge, I screw it up after having reached out to almost all the key players. They would now be conditioned to think that synth bio safety is something pursued by naive outsiders who don't understand synth bio. This makes it harder for future efforts to proceed. It makes it harder for them to raise funds. It makes it harder for them to build a team.

The worst case is that you start a project, fail, but don't quit. This can block the space, and stop better projects from entering it.

These risks can be worked around, but many of your assumptions seem conditional on not having these sorts of large negative counterfactual impacts. While that may work out, it seems overconfident to assume a 0% chance of this, especially if the career-capital-building steps actually involve building relevant domain knowledge.

Comment author: RyanCarey 29 January 2016 07:10:01AM *  4 points [-]

To get a good prediction market, we need more participation than the EA community would provide at its current size.

This is going to be a problem where the superset - creating a prediction market accessible by everyone - is easier to solve than the specific case - making a prediction market for the EA community.

Prediction Book is already taking on this role - a prediction market by EAs - and would welcome help.

edit: fixed spelling

Comment author: Sebastian_Farquhar 02 February 2016 11:04:58PM 0 points [-]

Prediction markets benefit a lot from liquidity. Making it EA specific doesn't seem to gain all that much. But EAs should definitely practice forecasting formally and getting rewarded for reliable predictions.
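The liquidity point can be made concrete with a toy example. The comments don't specify a market mechanism, so this sketch uses Hanson's logarithmic market scoring rule (LMSR), a standard automated market maker for prediction markets, purely for illustration: the liquidity parameter `b` controls how much a fixed-size trade moves the price.

```python
import math

def lmsr_cost(q, b):
    # LMSR cost function: b * ln(sum_i exp(q_i / b)),
    # where q is the vector of outstanding shares per outcome.
    return b * math.log(sum(math.exp(x / b) for x in q))

def lmsr_price(q, b, i):
    # Instantaneous price (implied probability) of outcome i.
    exps = [math.exp(x / b) for x in q]
    return exps[i] / sum(exps)

def cost_to_buy(q, b, i, shares):
    # Cost of buying `shares` of outcome i is the change in the cost function.
    q_after = list(q)
    q_after[i] += shares
    return lmsr_cost(q_after, b) - lmsr_cost(q, b)

# The same 10-share purchase in a thin (b=5) vs deep (b=100) market:
thin_cost = cost_to_buy([0.0, 0.0], 5, 0, 10)
deep_cost = cost_to_buy([0.0, 0.0], 100, 0, 10)
thin_price = lmsr_price([10.0, 0.0], 5, 0)    # price jumps to ~0.88
deep_price = lmsr_price([10.0, 0.0], 100, 0)  # price barely moves, ~0.53
```

In the deep market the same trade barely moves the implied probability, which is why liquidity matters so much for getting informative prices out of a small community.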

Comment author: Linch 19 January 2016 01:46:22AM *  -1 points [-]

"you rather than the very many other groups" The post above said "several." I think the number of players here is incredibly important.

Can we get ballpark estimates on how many is the several/very many other players of equivalent or higher weight in this field? 5? 50? 500?

EDIT: Why was my question downvoted? I feel like it's an important question, and asked in good faith.

Comment author: Sebastian_Farquhar 19 January 2016 02:37:50PM 0 points [-]

I'm not very confident in this estimate, but I'd hazard that between 5 and 50 causally connected groups will have made a recommendation related to the balance of research vs direct work in global health as part of the DfID budget (in either direction).

That's maybe a 75% confidence interval.

Comment author: jayd 19 January 2016 12:25:05AM 3 points [-]

This is not especially egregious in a fundraising post, and I understand that in these you have to adopt the persona of a marketer and can't add too many qualifications and doubts. So I don't think it's necessarily bad that you said this. But, as an intellectual matter, I don't think it's quite fair to count "[DFID] reallocating £2.5bn to fund research into treating and responding to the diseases that cause the most suffering rather than direct work" as one of your "tangible results so far". This was discussed plenty on the Facebook group, and as several people pointed out there was no clear evidence that you rather than the very many other groups that commented on DFID's proposals were responsible for this particular spending decision.

Comment author: Sebastian_Farquhar 19 January 2016 02:29:52PM *  2 points [-]

Yes, this is absolutely not a thing that just GPP did, which is why I tried to call out in this post that several other groups were important in recommending it! (It's also something I emphasised in the Facebook post you link to.)

I don't know how many groups fed into the overall process and I'm sure there were big parts of the process I have no knowledge about. I know of two other quite significant entities that have publicly made very similar recommendations (Angus Deaton and the Centre for Global Development) as well as about half a dozen other entities that made similar but slightly narrower suggestions (many of which we cited). The general development aid sector is clearly enormous, but the field of people proposing this sort of thing is smaller.

Assigning causal credit for policy outcomes is very complicated. It obviously matters to us to assess it, so that we can tell if it's worth doing more work in an area. What we do is just talk to the people we made recommendations to and ask them how significant a role our recommendation played. Usually people prefer we don't share their reflections further, which is unfortunate but inevitable.


Donations to Global Priorities Project matched for just two more weeks

Government decision-making underpins almost half the developed world's spending, and is often involved in the areas effective altruists care about the most. So learning how effective altruists should work with policy seems incredibly important. That's what we're trying to achieve at the Global Priorities Project. One of our donors has... Read More
Comment author: AGB 31 December 2015 09:03:00AM 1 point [-]

I'm not following the second paragraph. CEA has an overall budget and the general theme I've heard w.r.t. reserves has been 'we want to have about 12 months' reserves'. 12 months reserves is the same number whether the org budgets are consolidated or split out.

Is the thinking that with a large pool of unrestricted funding CEA as a whole would be comfortable running with significantly less, say 6 months reserves (which would still be more than 12 months for any individual org)?

Comment author: Sebastian_Farquhar 01 January 2016 06:54:35PM 1 point [-]

At the moment most of the orgs within CEA target 12 months of reserves (though some have less and, in particular, reserves sometimes fall quite low at some point in the course of the year because we avoid ongoing fundraising).

If we had something like 3 months of unrestricted reserves covering all costs, it would give us either greater financial security or the ability to cut overall restricted reserves to, say, 7 months while keeping similar stability. This would free up EA capital for other projects.

It's a little unclear what the right level of reserves ought to be. In the US it's common for charities to have very large endowments (say 20 years). I think the 12 months at all times target we have right now is about appropriate, given the value of capital to EA projects, but would expect that number to drift upwards as the EA community matures.
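The arithmetic behind pooling can be sketched with made-up figures (the org names and monthly costs below are hypothetical, chosen only to illustrate the 12-month vs 3-plus-7-month comparison described above):

```python
# Hypothetical monthly costs for three orgs, in GBP thousands.
monthly_costs = {"org_a": 50, "org_b": 30, "org_c": 20}

# Independent: each org holds 12 months of its own costs as restricted reserves.
independent_total = sum(12 * c for c in monthly_costs.values())

# Pooled: a 3-month unrestricted buffer covering all orgs' costs lets each org
# hold only 7 months restricted, for similar overall stability.
total_monthly = sum(monthly_costs.values())
pooled_total = 3 * total_monthly + sum(7 * c for c in monthly_costs.values())

# Capital freed up for other projects by pooling.
freed_up = independent_total - pooled_total
```

With these numbers, independent reserves total 1,200 while the pooled scheme totals 1,000, freeing 200 (i.e. two months of combined costs) without reducing any single org's effective runway below what the shared buffer provides.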

Comment author: AGB 25 December 2015 11:46:00AM 0 points [-]

Ping. I also have ongoing confusion here because in some places this is referred to as 'shared services' and in some places as 'CEA unrestricted'. These sound like very different things! Is this something that existed in previous years (and wasn't fundraising for what reason exactly?), and if not what's changed?

Comment author: Sebastian_Farquhar 30 December 2015 07:43:52AM 3 points [-]

You're quite right, they are different. At the moment, we are planning to use marginal unrestricted funds to invest in shared services. Partly this aims to increase the autonomy of the shared services function and reduce the extent to which they feel they need to ask all the orgs for permission to do useful things.

Past that level though, unrestricted funding would help us build a small reserve of unrestricted money that would provide us with financial stability. Right now, each organisation needs to keep a pretty significant independent runway because virtually all our reserves are restricted. If we had a bigger pool of funds that could go to any org, we could get the same level of financial security with smaller total reserves.
