tyleralterman

581 karma · Joined Nov 2014

Comments (60)
This dialogue with my optimizer-self, showing it evidence that it was undermining its own values by ignoring my other values, was very helpful for me too.

Just want to say that even though this is my endorsed position, it often goes out the window when I encounter tangible cases of extreme suffering. Here in Berlin, there was a woman I saw on the subway the other day who walked around dazedly with an open wound, seemingly not in touch with her surroundings, walking barefoot, and wearing an expression that looked like utter hopelessness. I don't speak German, so I wasn't able to interact with her well.

When I run into a case like this, the "preliminary answer" I wrote above is hard to keep in mind, especially when I think of the millions who might be suffering in similar ways, yet invisibly.

Hi Braxton – I feel moved by what you wrote here and want to respond later in full.

For now, I just want to thank you for your phrase here: "How could I possibly waste my time on leisure activities once I’ve seen the dark world?" I think this might deserve its own essay. I think it poses a challenge even for anti-realists who aren't confused about their values, in the way Spencer Greenberg talks about here (worth a read).

Through the analogy I use in my essay, it might be nonsensical, in the conditions of a functional society, to say something like "It's more important to eat well than to sleep well," since you need both to be alive, and the society has minimized the tradeoffs between these two things. However, in situations of scarcity or emergency, that might change. If supply chains suddenly fail, I might find myself without food, and start sacrificing sleep to go forage.

A similar tradeoff might apply to value threatened by emergency or scarcity. For instance, let's say that a much-loved yet little-seen friend messages me: "Tyler! I missed my layover in Berlin, so now I'm here for the night!" Under this condition of scarcity, I might put off the art project I'd planned to work on this night. This is a low-stakes example, but seems to generalize to high stakes ones.

One might say: The emergency/scarcity conditions of the world bring our EA values under continuous threat. Does the same logic not apply?

I need to think about this more. My preliminary answer is kind of simple. Eventually I do need to sleep well to continue to forage for food, otherwise I'll undermine my foraging. Likewise, I can't raincheck my artmaking every night if my art project is a key source of fulfillment – doing so will undermine my quality of engagement when I spend time with friends. And similarly, allowing EA pursuits to crowd out the rest of the good life seems to undermine those EA pursuits long term (for roughly everyone that I know at least – my opinion). This seems to be true not only of EA pursuits, but any value that I allow to crowd out my other values. So the rational strategy seems to be to find ways for values to reduce conflict or even become symbiotic. An internal geopolitics of co-prosperity.

Agree so much with the antidote of silliness! I’m happy to see that EA Twitter is embracing it.

Excited to read the links you shared, they sound very relevant.

Thank you, Oliver. May your fire burn into the distance.

Thank you, David! I also worry about this:

When we model to the rest of the world that "Effective" "Altruism" looks like brilliant (and especially young) people burning themselves out in despair, we also create a second-order effect where we broadcast an image of Altruism as one filled with suffering for the greater good. This is hardly inviting or promising for long-term engagement on the world's most pressing problems.

Of course, if one believes that AGI is coming within a few years, one might not care about these potential second-order effects. I do not believe AGI is coming within a few years. Though if I did, I might actually believe that a culture of mutual dedication to one another's full set of ends is also instrumentally useful for the level of trust and coordination that could be required to deal with short timelines. Big "might."

Ah, got it. My current theory is that maximizing caused suffering (stress), which caused gut motility problems, which caused bacterial overgrowth, which caused more suffering – or some other crazy feedback phenomenon like that.

Sometimes positive feedback loops are anything but. 😓

Moreover: 

It's not obvious to me that severe sacrifice and tradeoffs are necessary. I think their apparent necessity might be a byproduct of our lack of cultural infrastructure for minimizing tradeoffs. That's why I wrote this analogy:

To say that [my other ends] were lesser seemed to say, “It is more vital and urgent to eat well than to drink or sleep well.” No – I will eat, sleep, and drink well to feel alive; so too will I love and dance as well as help.

Once, the material requirements of life were in competition: If we spent time building shelter it might jeopardize daylight that could have been spent hunting. We built communities to take the material requirements of life out of competition. For many of us, the task remains to do the same for our spirits.

I believe it's possible to find and build synergies that reduce tradeoffs. For instance, as a lone ancient human in the wilderness, time spent building shelter might jeopardize daylight that could have been spent foraging for food. However, if you joined a well-functioning tribe, you'd no longer be forced to choose between [shelter-building] and [foraging]. If you forage, the food you find will power the muscles of your tribesmate to build shelter. Likewise, your tribesmate's shelter will give you the good night's rest you need to go out and forage. Unless there's a pressing emergency, it would be a mistake for the tribe to allocate everyone only to foraging or only to shelter-building.

I think we're in a similar place with our EA ends. They seem to demand the sacrifice of our other ends. But I think that's just because we haven't set up the right cultural infrastructure to create synergies and minimize tradeoffs. In the essay, I suggest one example piece of infrastructure that might help with this: a fractal altruist community. But I'm excited to see what other people come up with. Maybe you'll be one of them.