Robert_Wiblin comments on EA Survey 2017 Series: Cause Area Preferences - Effective Altruism Forum




Comment author: Robert_Wiblin 01 September 2017 06:47:50PM 6 points

For next year's survey it would be good if you could change 'far future' to 'long-term future' which is quickly becoming the preferred terminology.

'Far future' makes the perspective sound weirder than it actually is, and creates the impression you're saying you only care about events very far into the future, and not all the intervening times as well.

Comment author: Peter_Hurford 01 September 2017 07:33:27PM 3 points

I've added it to our list of things to consider for the 2018 survey.

Comment author: Austen_Forrester 04 September 2017 02:25:08PM 0 points

By "far future"/"long-term future," you're referring to existential risks, right? If so, I would think calling them existential risks or x-risks would be the clearest and most honest term to use. Any systemic change affects the long term: factory farm reforms, policy change, shifts in societal attitudes, medical advances, environmental protection, etc. I therefore don't feel it's honest to refer to x-risks as "long-term future."

Comment author: Robert_Wiblin 05 September 2017 11:21:30PM 1 point

The term "existential risk" has serious problems: it has no obvious meaning unless you've studied it (is this about existentialism?!), and it is very often misused even by people familiar with it (to mean extinction only, neglecting other persistent "trajectory changes").

Comment author: RobBensinger 06 September 2017 08:12:51AM 1 point

"Existential risk" has the advantage over "long-term future" and "far future" that it sounds like a technical term, so people are more likely to Google it if they haven't encountered it (though admittedly this won't fully address people who think they know what it means without actually knowing). In contrast, someone might just assume they know what "long-term future" and "far future" mean, and if they do Google those terms they'll have a harder time finding a relevant or consistent definition. Plus, "long-term future" still has the problem that it suggests existential risk can't be a near-term issue, even though some people working on existential risk are focusing on nearer-term scenarios than, e.g., some people working on factory farming abolition.

I think "global catastrophic risk" or "technological risk" would work fine for this purpose, though, and avoids the main concerns raised for both categories. ("Technological risk" also strikes me as a more informative / relevant / joint-carving category than the others considered, since x-risk and far future can overlap more with environmentalism, animal welfare, etc.)

Comment author: WillPearson 09 September 2017 06:59:32PM 0 points

Just a heads up: "technological risks" ignores all the non-anthropogenic catastrophic risks. "Global catastrophic risks" seems good.

Comment author: Austen_Forrester 09 September 2017 04:36:08AM -1 points

Of course, I totally forgot about the term "global catastrophic risk"! I really like it, and it doesn't suggest only extinction risks. Even its acronym sounds pretty cool. I also really like your "technological risk" suggestion, Rob. Referring to GCRs as "long-term future" is a pretty obvious branding tactic by those who prioritize GCRs. It is vague, misleading, and dishonest.