Comment author: Peter_Hurford  (EA Profile) 20 July 2017 05:51:25PM 14 points [-]

One concern might be not malevolence, but misguided benevolence. For just one example, spreading wild animals to other planets could potentially involve at least some otherwise avoidable suffering (within at least some of the species), but might be done anyway out of misguided versions of "conservationist" or "nature-favoring" views.

Comment author: Ben_West  (EA Profile) 16 August 2017 09:27:41PM 0 points [-]

I'm curious if you think that the "reflective equilibrium" position of the average person is net negative?

E.g. many people who would describe themselves as "conservationists" probably also think that suffering is bad. If they moved into reflective equilibrium, would they give up the conservation or the anti-suffering principles (where these conflict)?

In response to comment by Ben_West  (EA Profile) on EAGx Relaunch
Comment author: Roxanne_Heston  (EA Profile) 02 August 2017 12:13:42AM 1 point [-]

In brief, large speaker events and workshops, depending on the needs of a local group. Perhaps self-evidently, large speaker events are best for nascent chapters trying to attract interest; workshops for augmenting the engagement and/or skill of existing members. There's some information about this in the Organizer FAQ, as well as prompts about this in the EAGx organizer application and on the "Get Involved" tab of effectivealtruism.org.

In response to comment by Roxanne_Heston  (EA Profile) on EAGx Relaunch
Comment author: Ben_West  (EA Profile) 13 August 2017 05:55:46PM 1 point [-]
Comment author: Peter_Hurford  (EA Profile) 20 July 2017 05:51:25PM 14 points [-]

One concern might be not malevolence, but misguided benevolence. For just one example, spreading wild animals to other planets could potentially involve at least some otherwise avoidable suffering (within at least some of the species), but might be done anyway out of misguided versions of "conservationist" or "nature-favoring" views.

Comment author: Ben_West  (EA Profile) 30 July 2017 07:38:53PM 1 point [-]

Yeah, I think the point I'm trying to make is that it would require effort for things to go badly. This is, of course, importantly different from saying that things can't go badly.

In response to comment by LawrenceC on EAGx Relaunch
Comment author: Roxanne_Heston  (EA Profile) 24 July 2017 07:27:35PM *  3 points [-]

I'm curious what prompted this change - did organizers encounter a lot of difficulty converting new conference attendees to more engaged EAs?

They were often stretched so thin from making the main event happen that they didn't have the capacity to ensure that their follow-up events were solid. We think part of the problem will be mitigated if the events themselves are smaller and more targeted towards groups with a specific level of EA understanding.

I'm also curious about what sort of support CEA will be providing to smaller, less-established local groups, given that fewer groups will receive support for EAGx.

Local groups can apply for funding through the EAGx funding application, as well as use the event-organizing resources we generated for EAGx. Depending on the size and nature of the event, they can receive individualized support from different CEA staff working on community development, such as Harri, Amy, Julia, and/or Larissa. If they're running a career or rationality workshop they may be able to get 80,000 Hours' or CFAR's advice or direct support.

Here are the event-organizing resources, if you'd like to check them out: https://goo.gl/zw8AjW

In response to comment by Roxanne_Heston  (EA Profile) on EAGx Relaunch
Comment author: Ben_West  (EA Profile) 26 July 2017 09:38:20PM 0 points [-]

Depending on the size and nature of the event, they can receive individualized support from different CEA staff working on community development, such as Harri, Amy, Julia, and/or Larissa.

Could you say more about what kind of (smaller, local, non-EAGx) events CEA would like to see/would be interested in providing support for?

Comment author: Lila 22 July 2017 09:32:48AM 5 points [-]

Humans are generally not evil, just lazy

?

Human history has many examples of systematic unnecessary sadism, such as torture for religious reasons. Modern Western moral values are an anomaly.

Comment author: Ben_West  (EA Profile) 23 July 2017 01:54:18PM *  3 points [-]

Thanks for the response! But is that true? The examples I can think of seem better explained by a desire for power etc. than suffering as an end goal in itself.

Comment author: WilliamKiely 21 July 2017 12:58:27AM *  2 points [-]

7 - Therefore, the future will contain less net suffering

8 - Therefore, the future will be good

Could this be rewritten as "8. Therefore, the future will be better than the present" or would that change its meaning?

If it would change the meaning, then what do you mean by "good"? (Note: If you're confused about why I'm confused about this, then note that it seems to me that 8 does not follow from 7 for the meaning of "good" I usually hear from EAs (something like "net positive utility").)

Comment author: Ben_West  (EA Profile) 21 July 2017 11:08:32PM 3 points [-]

Yeah, it would change the meaning.

My assumption was that, if things monotonically improve, then in the long run (perhaps the very, very long run) we will get to net positive. You are proposing that we might instead asymptotically approach some negative value, even though we are always improving?

Comment author: Wei_Dai 20 July 2017 07:50:49PM 15 points [-]

What lazy solutions will look like seems unpredictable to me. Suppose someone in the future wants to realistically roleplay a historical or fantasy character. The lazy solution might be to simulate a game world with conscious NPCs. The universe contains so much potential for computing power (which presumably can be turned into conscious experiences), that even if a very small fraction of people do this (or other things whose lazy solutions happen to involve suffering), that could create an astronomical amount of suffering.

Comment author: Ben_West  (EA Profile) 20 July 2017 11:06:43PM 5 points [-]

Yes, I agree. More generally: the more things consciousness (and particularly suffering) is useful for, the less reasonable point (3) above is.

Comment author: Tobias_Baumann 20 July 2017 08:40:43AM *  11 points [-]

Thanks for writing this up! I agree that this is a relevant argument, even though many steps of the argument are (as you say yourself) not airtight. For example, consciousness or suffering may be related to learning, in which case point (3) is much less clear.

Also, the future may contain vastly larger populations (e.g. because of space colonization), which, all else being equal, may imply (vastly) more suffering. Even if your argument is valid and the fraction of suffering decreases, it's not clear whether the absolute amount will be higher or lower (as you claim in 7.).

Finally, I would argue we should focus on the bad scenarios anyway – given sufficient uncertainty – because there's not much to do if the future will "automatically" be good. If s-risks are likely, my actions matter much more.

(This is from a suffering-focused perspective. Other value systems may arrive at different conclusions.)

Comment author: Ben_West  (EA Profile) 20 July 2017 02:49:49PM *  5 points [-]

Thanks for the response!

  1. It would be surprising to me if learning required suffering, but I agree that if it does then point (3) is less clear.
  2. Good point! I rewrote it to clarify that there is less net suffering.
  3. Where I disagree with you the most is your statement "there's not much to do if the future will 'automatically' be good." Most obviously, we have the difficult (and perhaps impossible) task of ensuring the future exists at all (maxipok).

An Argument for Why the Future May Be Good

In late 2014, I ate lunch with an EA who prefers to remain anonymous. I had originally been of the opinion that, should humans survive, the future is likely to be bad. He convinced me to change my mind about this. I haven't seen this argument written up anywhere and...
Comment author: Ben_West  (EA Profile) 24 April 2017 09:59:15PM 0 points [-]
  1. Are there blocks of rooms reserved at some hotel?
  2. Are there "informal" events planned for around the official event? (I.e. should everyone plan to land Thursday night and leave Sunday night or would it make sense to leave earlier/stay later?)

Thanks!
