Comment author: Telofy  (EA Profile) 19 June 2018 03:33:59PM *  5 points [-]

Sweet! I hope it’ll become a great resource! Are you planning to merge it with https://causeprioritization.org/? If there are too many wikis, we’d just run into the same problem with fragmented bits of information again.

Comment author: Telofy  (EA Profile) 08 June 2018 09:59:21AM 1 point [-]

Thank you! I suspect this is going to be very helpful for me.

Comment author: Telofy  (EA Profile) 08 June 2018 08:00:30AM 1 point [-]

Awesome! Do you also have plans to assist EA founders of for-profit social enterprises (such as Wave)?

Comment author: Denkenberger 26 March 2018 12:30:39AM 2 points [-]

From my book: "Synthetic food production refers to using chemical synthesis. Sugar has been synthesized from noncarbohydrates for decades (Hudlicky et al., 1996). Hudlicky, T., Entwistle, D.A., Pitzer, K.K., Thorpe, A.J., 1996. Modern methods of monosaccharide synthesis from non-carbohydrate sources. Chem. Rev. 96, 1195–1220." I didn't write much because I don't know enough about catalysis to say whether it can be ramped up quickly in a catastrophe. But for space colonization, that is not an issue.

Comment author: Telofy  (EA Profile) 30 March 2018 08:30:39AM 0 points [-]

Awesome, thank you!

Comment author: Jeffhe  (EA Profile) 17 March 2018 02:12:42AM *  0 points [-]

Hi Telofy,

Thanks for this lucid reply. It has made me realize that it was a mistake to use the phrase "clear experiential sense" because that misleads people into thinking that I am referring to some singular experience (e.g. some feeling of exhaustion that sets in after the final headache). In light of this issue, I have written a "new" first reply to Michael_S to try to make my position clearer. I think you will find it helpful. Moreover, if you find any part of it unclear, please do let me know.

What I'm about to say overlaps with some of the content in my "new" reply to Michael_S:

You write that you don't see anything morally relevant linking the person moments of a single person. Are you concluding from this that there is not actually a single subject-of-experience who feels, say, 5 pains over time (even though we talk as if there is)? Or, are you concluding from this that even if there is actually just a single subject-of-experience who feels all 5 pains over time, it is morally no different from 5 subjects-of-experience who each feels 1 pain of the same sort?

What matters to me at the end of the day is whether there is a single subject-of-experience who extends through time and thus is the particular subject who feels all 5 pains. If there is, then this subject experiences the what-it's-like of going through 5 pains (since, in fact, this subject has gone through 5 pains, whether he remembers going through them or not). Importantly, the what-it's-like-of-going-through-5-pains is just the collection of the past 5 singular pain episodes, not some singular/continuous experience like a feeling of exhaustion or some super-intense pain synthesized from the intensities of the 5 past pains. It is this what-it's-like that can plausibly be worse than the what-it's-like of going through a major pain. Since there could only be this what-it's-like when there is a single subject who experiences all 5 pains, 5 pains spread across 5 people cannot be worse than a major pain (since, at best, there would only be 5 experientially independent what-it's-likes-of-going-through-1-minor-headache).

My latest reply to Michael_S focuses on the question of whether there could be a single subject-of-experience who extends through time and is thus capable of feeling multiple pains.

Comment author: Telofy  (EA Profile) 25 March 2018 03:05:07PM *  1 point [-]

Hi Jeff!

To just briefly answer your question, “Are you concluding from this that there is not actually a single subject-of-experience”: I don’t have an intuition for what a subject-of-experience is – if it is something defined along the lines of the three characteristics of continuous person moments from my previous message, then I feel that it is meaningful but not morally relevant, but if it is defined along the lines of some sort of person essentialism then I don’t believe it exists on Occam’s razor grounds. (For the same reason, I also think that reincarnation is metaphysically meaningless because I think there is no essence to a person or a person moment besides their physical body* until shown otherwise.)

* This is imprecise but I hope it’s clear what I mean. People are also defined by their environment, culture, and whatnot.

Comment author: Denkenberger 21 March 2018 12:48:01PM 1 point [-]

Impressive work - I especially liked the graphs. For humane space colonization, rather than photosynthesis, it would be far more efficient to use solar or nuclear electricity for direct chemical synthesis of food or for powering electric bacteria. One motivation besides space colonization would be agricultural catastrophes. Outside a catastrophe, it would likely be more fossil-energy-intensive than growing plants, but maybe not than producing animals. And it would be far less fossil-energy-intensive than growing plants under artificial light, which people are working on.

Comment author: Telofy  (EA Profile) 25 March 2018 09:49:59AM 0 points [-]

Cool, thank you! Have you written about direct chemical synthesis of food or can you recommend some resources to me?

Comment author: Jeffhe  (EA Profile) 16 March 2018 03:36:34AM *  0 points [-]

Imagine you have 5 headaches, each 1 minute long, occurring just 10 seconds apart. From imagining this, you will have an imagined sense of what it's like to go through those 5 headaches.

And, of course, you can imagine yourself in the shoes of 5 different friends, who we can suppose each has a single 1-minute long headache of the same kind as above. From imagining this, you will again have an imagined sense of what it's like to go through 5 headaches.

If that's what you mean when you say that "the clear experiential sense is just as clear or unclear to me no matter whether I think about the person moments of the same person or of different people", then I agree.

But when you imagine yourself in the shoes of those 5 friends, what is going on is that one subject-of-experience (i.e. you) takes on the independent what-it's-likes (i.e. experiences) associated with your 5 friends, and IN DOING SO, LINKS THOSE what-it's-likes - which in reality would be experientially independent of each other - TOGETHER IN YOU. So ultimately, when you imagine yourself in the shoes of your 5 friends, you are, in effect, imagining what it's like to go through 5 headaches. But in reality, there would be no such what-it's-like among your 5 friends. The only what-it's-like that would be present would be the what-it's-like-of-going-through-1-headache, which each of your friends would experience. No one would experience the what-it's-like of going through 5 headaches. But that is what is needed for it to be the case that 5 such headaches can be worse than a headache that is worse than any one of them.

Please refer to my conversation with Michael_S for more info.

Comment author: Telofy  (EA Profile) 16 March 2018 10:43:32PM 1 point [-]

Argh, sorry, I haven’t had time to read through the other conversation yet, but to clarify, my prior was the other one – not that there is something linking the experiences of the five people, but that there is very little – and nothing that seems very morally relevant – linking the experiences of the one person. Generally, people talk about continuity, intentions, and memories linking the person moments of a person such that we think of them as the same one even though all the atoms of their bodies may have been exchanged for different ones.

In your first reply to Michael, you indicate that the third one, memories, is important to you, but in themselves I don’t feel that they confer moral importance in this sense. What you mean, though, may be that five repeated headaches are more than five times as bad as one because of some sort of exhaustion or exasperation that sets in. I certainly feel that, in my case especially with itches, and I think I’ve read that some estimates of DALY disability weights also take that into account.

But I model that as some sort of ability of a person to “bear” some suffering, which gets worn down over time by repeated suffering without sufficient recovery in between or by too extreme suffering. That leads to a threshold that makes suffering below and above seem morally very different to me. (But I recognize several such thresholds in my moral intuitions, so I seem to be some sort of multilevel prioritarian.)

So when I imagine what it is like to suffer headaches as bad as five people suffering one headache each, I imagine them far apart with plenty of time to recover, no regularity to them, etc. I’ve had more than five headaches in my life but no connection and nothing pathological, so I don’t even need to rely on my imagination. (Having five attacks of a frequently recurring migraine must be noticeably worse.)

Comment author: Jeffhe  (EA Profile) 13 March 2018 10:43:44PM *  0 points [-]

Hi Telofy,

Thanks for your comment, and quoting oneself is always cool (haha).

In response, if I understand you correctly, you are saying that if I don't prefer saving many similar, though distinct, people each from a certain pain over saving another person from the same pain, then I have no reason to prefer saving myself from many of those pains over saving myself from just one of them.

I certainly wouldn't agree with that. Were I to suffer many pains, I (just me) would suffer all of them in such a way that there is a very clear sense in which they, cumulatively, are worse to endure than just one of them. Thus, I find intra-personal aggregation of pains intelligible. I mean, when an old man reminiscing about his past says to us, "The single worst pain I had was that one time when I got shot in the foot, but if you asked me whether I'd go through that again or all those damned headaches I've had over my life, I would certainly ask for the bullet," we get it. Anyway, I think the clear sense I mentioned supports the intra-personal aggregation of pains, and if pains aggregate intra-personally, then more instances of the same pain will be worse than just one instance, and so I have reason to prefer saving myself from more of them.

However, in the case of many vs. one other (call him "C"), the pains are spread across distinct people rather than aggregating in one person, so they cannot in the same sense be worse than the pain that C goes through. And so even if I show no preference in this case, I still have reason to show a preference in the former case.

Comment author: Telofy  (EA Profile) 15 March 2018 12:32:26PM 0 points [-]

Okay, curious. What is to you a “clear experiential sense” is just as clear or unclear to me no matter whether I think about the person moments of the same person or of different people.

It would be interesting if there were some systematic correlation between cultural factors and someone’s moral intuitions on this issue – say, a more collectivist culture leading to more strongly discounted aggregation and a more individualist culture leading to more linear aggregation… or something of the sort. The other person I know who has this intuition is from an Eastern European country, hence that hypothesis.


Current Thinking on Prioritization 2018

Summary: This article documents my current thoughts on how to make the most out of my experiment with earning to give. It draws together a number of texts by other authors that have influenced my thinking and adds some more ideas of my own for a bundle of heuristics that...
Comment author: Telofy  (EA Profile) 13 March 2018 07:20:08PM 3 points [-]

I think Brian Tomasik has addressed this briefly and Nick Bostrom at greater length.

What I’ve found most convincing (quoting myself in response to a case that hinged on the similarity of the two or many experiences):

If you don’t care much more about several very similar beings suffering than about one of them suffering, then you would also not care more about them when they’re your own person moments, right? You’re extremely similar to the version of you from a month or several months ago, probably more similar than you are to any other person in the whole world. So if you’re suffering for just a moment, it would be no better than suffering for an hour, a day, a month, or any longer multiple of that moment. And if you’ve been happy for just a moment sufficiently recently, then close to nothing more can be done for you for a long time.

I imagine that fundamental things like that come down to the subjectivity of moral feelings – so close to the axioms, it’s hard to argue from even more fundamental axioms. But I, for one, at least have trouble empathizing with a nonaggregative axiology.
