Under classical utilitarianism, it is permissible to take actions which severely violate moral rules if the action has sufficiently high expected value.

This includes rules like “don’t break the law”, “don’t be dishonest” and “don’t be violent”.

Many classical utilitarians argue that even under classical utilitarianism, in practice, we shouldn’t violate important moral rules. They hold that actions which do so don’t actually have positive expected value, because of considerations such as “you’ll encourage other people to break these moral rules, which will do lots of harm over the long term, and the positives of your action won’t outweigh this”.

However, some strains of EA thinking contain 2 key ideas that will encourage classical total utilitarians to break moral rules despite these considerations:

  1. There is near-infinite value in humanity’s future, especially if a techno-utopian future involving space colonisation and digital minds comes to fruition. Mitigating extinction risks helps make this future more likely, giving actions that mitigate extinction risks near-infinite expected value.

  2. Extinction from AGI disaster is likely in the next 20-30 years.

Idea 1 means that x-risk mitigating actions which violate moral rules will still have positive expected value.

Idea 2 means that there isn’t much time for long-term harmful effects of moral rule-breaking to accumulate.

Crucially, I think if you agree with both ideas 1 and 2, you must think that stealing money to fund AI safety research has positive expected value, and you cannot object on classical utilitarian grounds to alleged fraud at FTX to fund AI safety research.

I think there is strong consensus in EA against severely violating moral rules, even amongst the classical utilitarians in EA. But this is because most self-described classical utilitarians don’t endorse the classical utilitarian action in every situation; they just endorse it in most situations.

However, as movements grow, the chances of someone holding a more extreme version of other people’s views also grows.

If EA becomes a very big movement, I predict that individuals on the fringes of the movement will commit theft with the goal of donating more to charity, and violence against individuals and organisations who pose x-risks.

I think this is an almost inevitable result of becoming a large social movement (think environmentalism and ecoterrorism).

However, I think we can minimise this risk while retaining a difference from common sense morality and while continuing to be highly impactful.

This would involve:

  1. EA organisations and leaders more prominently expressing an explicit belief that pure classical total utilitarianism is wrong (I think Will MacAskill has said stuff along these lines in the past)

  2. EA orgs, leaders and community builders prominently emphasising that EAs should pursue their goals within the confines of legality and non-violence, even when breaking the law or being violent appears to have very high expected value

Comments

Hm. I think I agree with the point you're making, but not the language it's expressed in? I notice that your suggestion is a change in endorsed moral principles, but you make an instrumental argument, not a moral one. To me, the core of the issue is here:

If EA becomes a very big movement, I predict that individuals on the fringes of the movement will commit theft with the goal of donating more to charity, and violence against individuals and organisations who pose x-risks.

This seems to me more of a matter of high-fidelity communication than a matter of which philosophical principles we endorse. The idea of ethical injunctions is extremely important, but it is not currently treated as a central EA pillar idea. I would be very wary of EA self-modifying into a movement that explicitly rejects utilitarianism on the grounds that this will lead to better utilitarian outcomes.

There are large multinational organizations that function at scale, handling billions of dollars on a daily basis, without going down through large-scale fraud. Preventive measures for handling money at that scale, and for tracking and securing it, aren't as complex as the current FTX situation is.

Additional governance measures and internal controls should simply be put in place.
