Comment author: philosophytorres 14 November 2016 10:25:34PM 0 points

Friends: I recently wrote a few thousand words on the implications of a Trump presidency for global risk. I'm fairly new to this discussion group, so I hope posting the link doesn't contravene any community norms. Really, I would eagerly welcome feedback on it. My prognosis is not good.

https://medium.com/@philosophytorres/what-a-trump-presidency-means-for-human-survival-one-experts-take-ed26bf9f9a21

Comment author: philosophytorres 05 November 2016 03:55:28PM * 0 points

A fantastically interesting article. I wish I'd seen it earlier -- around the time this was published (last February) I was completing an article on "agential risks" that ended up in the Journal of Evolution and Technology. In it, I distinguish between "existential risks" and "stagnation risks," each of which corresponds to one of the disjuncts in Bostrom's original definition. Since these have different implications, I argue, for understanding different kinds of agential risks, I think it would be good to standardize the nomenclature. Perhaps "population risks" and "quality risks" are preferable (although I'm not sure "quality risks" and "stagnation risks" have exactly the same extension). Thoughts?

(Btw, the JET article is here: http://jetpress.org/v26.2/torres.pdf.)

Comment author: Carl_Shulman 02 October 2016 06:11:05PM * 9 points

Note: news publications impose titles on authors without consulting them. Obviously he would never have written that title himself.

Comment author: philosophytorres 05 November 2016 01:01:51AM 0 points

Oh, I see. Did they not ask for his approval? I'm familiar with websites devising their own outrageously hyperbolic headlines for articles authored by others, but I genuinely assumed that a website as reputable as Slate would have asked a figure as prominent as Bostrom for approval. My apologies!

Comment author: philosophytorres 05 November 2016 12:44:46AM 1 point

Very interesting map. Lots of good information.

Comment author: philosophytorres 02 October 2016 02:04:22AM -3 points

How about this for AI publicity, written by Nick Bostrom himself: "You Should Be Terrified of Superintelligent Machines," via Slate!

http://www.slate.com/articles/technology/future_tense/2014/09/will_artificial_intelligence_turn_on_us_robots_are_nothing_like_humans_and.html