Comment author: EliezerYudkowsky 28 September 2016 10:10:28PM *  16 points [-]

The idea of running an event in particular seems misguided. Conventions come after conversations. Real progress toward understanding, or conveying understanding, does not happen through speakers going On Stage at big events. If speakers On Stage ever say anything sensible, it's because an edifice of knowledge was built in the background out of people having real, engaged, and constructive arguments with each other, in private where constructive conversations can actually happen, and the speaker On Stage is quoting from that edifice.

(This is also true of journal publications about anything strategic-ish - most journal publications about AI alignment come from the void and are shouting into the void, neither aware of past work nor feeling obliged to engage with any criticism. Lesser (or greater) versions of this phenomenon occur in many fields; part of where the great replication crisis comes from is that people can go on citing refuted studies and nothing embarrassing happens to them, because god forbid there be a real comments section or an email reply that goes out to the whole mailing list.)

If there's something to be gained from having national-security higher-ups understanding the AGI alignment strategic landscape, or from having alignment people understand the national security landscape, then put Nate Soares in a room with somebody in national security who has a computer science background, and let them have a real conversation. Until that real progress has already been made in in-person conversations happening in the background where people are actually trying to say sensible things and justify their reasoning to one another, having a Big Event with people On Stage is just a giant opportunity for a bunch of people new to the problem to spout out whatever errors they thought up in the first five seconds of thinking, neither aware of past work nor expecting to engage with detailed criticism, words coming from the void and falling into the void. This seems net counterproductive.

Comment author: ZachWeems 26 August 2017 02:46:04AM 0 points [-]

|...having a Big Event with people On Stage is just a giant opportunity for a bunch of people new to the problem to spout out whatever errors they thought up in the first five seconds of thinking, neither aware of past work nor expecting to engage with detailed criticism...

I had to go back and double-check that this comment was written before Asilomar 2017. It describes some of the talks very well.

Comment author: kbog  (EA Profile) 25 January 2016 09:02:05PM 0 points [-]

This really should go in the Crazy EA Investing Ideas group... I'd send an invite, but I don't know this guy's name.

Also, whatever happened to the self help group...?

Comment author: ZachWeems 11 June 2017 03:29:51PM 0 points [-]

I would also like to be added to the crazy EA's investing group. Could you send an invite to me on here?

In response to Open Thread #36
Comment author: DavidNash 15 March 2017 09:33:17AM 3 points [-]

I think there is a goldmine of advice and practical tips on this website.

http://www.mrmoneymustache.com/

But instead of aiming to retire at 30, you'll be able to donate more and still have a healthy retirement fund by not spending all your money and investing sensibly. The site below is useful, with step-by-step guides.

http://monevator.com/how-to-retirement-plan/

At the moment I give 10% and invest any other savings over that, but I probably won't be going into a high-paying job, and I have the benefit of free healthcare.

I may slightly disagree with Linch about retirement money. I think it gives people a lot of power in their careers and job choices if they are able to tell their manager what they actually think and aren't desperate to succeed in a job interview. Being financially independent can make it a lot easier to take ethical decisions and make a stand against a bad policy without having to worry about losing your job.

That depends on how much you think you need to feel secure.
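
For concreteness, the arithmetic behind that kind of "security number" can be sketched in a few lines. This is a minimal sketch assuming the 4% safe-withdrawal rule of thumb common in the early-retirement community (not a figure taken from the sites above), and all numbers in the example calls are hypothetical:

    # Minimal sketch of the "how much do I need to feel secure" arithmetic,
    # assuming the 4% safe-withdrawal rule of thumb (~25x annual spending).
    # All numbers in the example calls are hypothetical.

    def fi_target(annual_spending, withdrawal_rate=0.04):
        """Savings at which withdrawal_rate covers annual spending."""
        return annual_spending / withdrawal_rate

    def years_to_target(annual_savings, annual_spending,
                        real_return=0.05, current_savings=0.0):
        """Years of saving until the target is reached at a fixed real return."""
        target = fi_target(annual_spending)
        savings, years = current_savings, 0
        while savings < target:
            savings = savings * (1 + real_return) + annual_savings
            years += 1
        return years

    print(fi_target(20_000))                 # 500000.0
    print(years_to_target(15_000, 20_000))   # roughly two decades at these inputs

The point of writing it out is just that the target scales directly with how much you think you need to spend per year to feel secure, which is why that judgment dominates everything else.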

In response to comment by DavidNash on Open Thread #36
Comment author: ZachWeems 16 March 2017 02:58:50PM 1 point [-]

The 'Stache is great! He's actually how I heard about Effective Altruism.

In response to Open Thread #36
Comment author: DiverWard 15 March 2017 10:10:36PM 3 points [-]

I am new to EA, but it seems that a true effective altruist would not be interested in retiring. When just $1,000 can avert decades of disability-adjusted life years (years of suffering), I do not think it is fair to sit back and relax (even in your 70s) when you could still be earning to give.

In response to comment by DiverWard on Open Thread #36
Comment author: ZachWeems 16 March 2017 02:57:54PM 1 point [-]

Right, I'm accounting for my own selfish desires here. An optimally moral me-like person would only save enough to maximize his career potential.

In response to Open Thread #36
Comment author: Linch 15 March 2017 04:33:49AM *  4 points [-]

My personal opinion is that individuals should save enough to mitigate emergencies, job transitions, etc. (https://80000hours.org/2015/11/why-everyone-even-our-readers-should-save-enough-to-live-for-6-24-months/), but no more.

It just seems rather implausible, to me, that retirement money is anywhere close to being a cost-effective intervention, relative to other likely EA options.

In response to comment by Linch on Open Thread #36
Comment author: ZachWeems 15 March 2017 02:27:59PM 6 points [-]

| It just seems rather implausible, to me, that retirement money is anywhere close to being a cost-effective intervention, relative to other likely EA options.

I don't think that "Give 70-year-old Zach a passive income stream" is an effective cause area. It is a selfish maneuver. But the majority of EAs seem to form some sort of boundary, where they only feel obligated to donate up to a certain point (whether that is due to partially selfish "utility functions" or a calculated move to prevent burnout). I've considered choosing some arbitrary method of dividing income between short-term expenses, retirement, and donations, but I am searching for a method that someone considers non-arbitrary, because I might feel better about it.
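
For what it's worth, one way to make the boundary arbitrary-but-explicit is simply to fix the fractions up front and apply them mechanically. A minimal sketch, with hypothetical placeholder percentages rather than a recommendation:

    # Minimal sketch of a fixed-fraction income split between donations,
    # retirement, and short-term expenses. The fractions in the example
    # call are hypothetical placeholders, not a recommendation.

    def allocate(monthly_income, fractions):
        """Split income into named buckets; fractions must sum to 1."""
        assert abs(sum(fractions.values()) - 1.0) < 1e-9, "fractions must sum to 1"
        return {bucket: round(monthly_income * share, 2)
                for bucket, share in fractions.items()}

    split = allocate(4000, {"donations": 0.10, "retirement": 0.15, "short_term": 0.75})
    print(split)   # {'donations': 400.0, 'retirement': 600.0, 'short_term': 3000.0}

This doesn't answer the "non-arbitrary" question, but it does make whatever boundary you pick explicit and easy to revisit.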


Open Thread #36

Hello, EA Forum! Here is an open thread. I will kick it off by asking what thoughts people have on saving for retirement while donating more than a set 10% of income. I am likely to have a relatively high-paying job within a few months and don't plan on spending...
Comment author: ZachWeems 12 October 2016 02:25:13PM 3 points [-]

Question 2: Suppose tomorrow MIRI creates a friendly AGI that can learn a value system, make it consistent with minimal alteration, and extrapolate it in an agreeable way. Whose values would it be taught?

I've heard the idea of averaging all humans' values together and working from there. Given that the members of ISIS are human, and that many other humans believe the existence of extreme physical and emotional suffering is good, I find that idea pretty repellent. Are there alternatives that have been considered?

Comment author: ZachWeems 12 October 2016 01:57:22PM 5 points [-]

It seems like people in academia tend to avoid mentioning MIRI. Has this changed in magnitude during the past few years, and do you expect it to change any more? Do you think there is a significant number of public intellectuals who believe in MIRI's cause in private while avoiding mention of it in public?