Comment author: Risto_Uuk 27 April 2018 11:32:00AM 0 points [-]

I feel that the book contains too much fluff, and even these commandments, though they sound useful, lack the specificity needed to act on them. Does anyone have other book recommendations or guidelines for improving one's forecasting and probabilistic thinking? At the end of the day, what matters is actually practicing forecasting and thinking probabilistically, but concrete guidance on how to do that would help. E.g. how do you actually distinguish between 40/60, 45/55, or even 43/57 probabilities?
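One common way to practice this (my suggestion, not from the book) is to log your probabilistic forecasts and score them once outcomes are known, e.g. with the Brier score. The forecasts and outcomes below are purely hypothetical, and this is only a minimal sketch of the idea:

```python
# Minimal calibration-practice sketch: log probabilistic forecasts,
# then score them with the Brier score (lower is better, 0 is perfect).
# All forecasts and outcomes below are hypothetical examples.

def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities and 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical log: probability you assigned to each event, and whether it happened.
forecasts = [0.40, 0.55, 0.43, 0.90, 0.10]
outcomes = [0, 1, 0, 1, 0]

print(round(brier_score(forecasts, outcomes), 4))
```

Over many logged forecasts, a falling Brier score is evidence that your 40/60 vs 43/57 distinctions are carrying real information rather than noise.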

Comment author: Richenda  (EA Profile) 09 April 2018 08:22:52PM 3 points [-]
Comment author: Risto_Uuk 26 April 2018 05:18:28AM 0 points [-]

Thanks for putting it on the EA Groups Resource Map! I think it'd be better if the link pointed to the Google Docs document rather than to this forum post, because we might edit the document in the future.

Comment author: Risto_Uuk 26 March 2018 05:28:47PM 8 points [-]

If someone can't apply right now due to other commitments, do you expect there to be new roles for generalist research analysts next year as well? What are the best ways one could make oneself a better candidate meanwhile?

Comment author: cassidynelson 16 March 2018 01:05:55AM 0 points [-]

I agree; I found it surprising as well that he has taken this view. It seems like he has read a portion of Bostrom's Global Catastrophic Risks and Superintelligence and become familiar with the general arguments and prominent examples, but then dismisses existential threats for reasons specifically addressed in both books.

He is a bit more concerned about nuclear threats than about other existential threats, but I wonder if this is the availability heuristic at work, given the historical precedent, rather than a well-reasoned line of argument.

Great suggestion about Sam Harris. I think he and Steven Pinker had a live chat just the other day (March 14), so we may have missed this opportunity. I'm still waiting for the audio to be uploaded to Sam's podcast, but given Sam's positions, I wonder if he questions Pinker on this as well.

Comment author: Risto_Uuk 16 March 2018 11:03:43PM 2 points [-]

Sam Harris did ask Steven Pinker about AI safety. If anybody gets around to listening to it, the segment runs from 1:34:30 to 2:04, so that's about 30 minutes on risks from AI. Harris wasn't at his best in that discussion, and Pinker came off as much more nuanced and grounded in evidence and reason.


Reading group guide for EA groups

Hi, I'm Risto Uuk and I run EA Estonia. We started organizing reading groups last semester. We tried to find relevant guides for that, but weren't able to find anything comprehensive in the EA community, so we started creating a guide ourselves. It's a draft and... Read More
Comment author: Risto_Uuk 12 March 2018 11:00:58AM 1 point [-]

Do you offer any recommendations for communicating utilitarian ideas based on Everett's research or someone else's?

For example, in Everett's 2016 paper the following is said:

"When communicating that a consequentialist judgment was made with difficulty, negativity toward agents who made these judgments was reduced. And when a harmful action either did not blatantly violate implicit social contracts, or actually served to honor them, there was no preference for a deontologist over a consequentialist."

Comment author: DavidMoss 11 March 2018 03:01:11PM 2 points [-]

I'm curious how much mass outreach there actually is in EA and/or what people have in mind when they talk about mass outreach.

Aside from Doing Good Better and Will/CEA's other public intellectual work, which they seem to be retreating from, it's not clear to me what mass outreach has actually been done.

Comment author: Risto_Uuk 11 March 2018 03:45:43PM 2 points [-]

I think this depends on how we define mass outreach. I would consider many activities organized in the EA community to be mass outreach: for example, EAG, books, articles in popular media outlets, Facebook posts in EA groups, the 80,000 Hours podcast, etc. They count as mass outreach because they reach a lot of people and very often don't enable in-depth engagement. Exceptions would be a career coaching session at an EAG event or discussing books/articles in discussion groups.

Comment author: Risto_Uuk 11 March 2018 12:37:50PM 3 points [-]

Thank you for the post!

I agree that, from the point of view of translation, Doing Good Better might be too focused on donating to charity and on global health, but this doesn't seem to be an issue at all when it comes to small in-depth discussion groups. I guess this is another argument in favor of focusing on these types of activities rather than large-scale outreach.