kbog comments on Towards a measure of Autonomy and what it means for EA - Effective Altruism Forum


Comment author: kbog 25 August 2017 09:52:27AM 0 points

Why should we care about someone's desire to have their thoughts not checked for the presence of malicious genius? They may use their thinking to create something equally dangerous that we have not yet thought of.

If you can do that, sure. Some people might have a problem with it, though, because you're probing their personal thoughts.

Why care about freedom at all?

Because people like being free and it keeps society fresh with new ideas.

If I upload and then want to take a spaceship somewhere hard to monitor, will I be allowed to take a supercomputer if I need it to perform science?

Sure. Just don't use it to build a super-AGI that will take over the world.

What is in my pocket was once considered a dangerous supercomputer. The majority of the world is now trusted with it, or at least the benefits of having such devices outweigh the potential costs.

That's because you can't use what is in your pocket to take over the world. Remember that you started this conversation by asking "How would that be allowed if those people might create a competitor AI?" So if you assume that future people can't create a competitor AI, for instance because their computers give them no more power to take over the world than our current computers give us, then of course those people can be allowed to do whatever they want, and your original question doesn't make sense.

Comment author: WillPearson 25 August 2017 02:16:42PM 0 points

Why care about freedom at all?

Because people like being free and it keeps society fresh with new ideas.

If I upload and then want to take a spaceship somewhere hard to monitor, will I be allowed to take a supercomputer if I need it to perform science?

Sure. Just don't use it to build a super-AGI that will take over the world.

What if there is a very small risk that I will do so, let's say 0.0000001%? Using something like the arguments for the cosmic inheritance, this could be seen as likely causing a certain amount of astronomical waste. Judged purely on whether people are alive, this seems like a no-go. But if you take into consideration that a society that stops this kind of activity would be less free, and less free for all people throughout its history, this is a negative. I am trying to get this negative included in our moral calculus; otherwise I fear we will optimize it away.