Note: This post summarizes every mention of EA in Going Infinite. It is NOT a copy-and-pasted list of every sentence containing the words "effective altruism".
 

Pinocchio is narrated by Jiminy Cricket; The Great Gatsby is narrated by Nick; Sam Bankman-Fried’s story is narrated by Michael Lewis. Lewis’s book, Going Infinite, isn’t written from Sam’s perspective, exactly. Michael Lewis is a sympathetic fly on the wall, documenting a tragically flawed underdog being destroyed by his own infinite ambition.

But like The Great Gatsby, where Nick speaks so fondly about Daisy (whom Gatsby loves) and so unkindly about Tom (whom Gatsby hates), you get the sense that Sam’s love for the effective altruism movement colours a lot of what ended up in the book.

Spectacle is another thing that colours what went into the book. Michael Lewis doesn’t take the tabloid road and make this a book about drugs and shrimp rings and “the polycule”, thankfully. But I can’t shake the feeling that most of the characters in the book are basically zoo animals on display for the public’s amusement.

He’s constantly saying, implicitly or explicitly, that effective altruists are high-IQ children who don’t trust emotions or adults. It feels condescending: “Of course you failed. You’re children! Idiot savants, like something out of a Dostoevsky novel.”

But he also seems to really believe EAs are genuine, and that’s not nothing.

The first mention of effective altruism is in the preface. It doesn’t say effective altruism, exactly, but:

He [Sam] needed infinity dollars because he planned to address the biggest existential risks to life on earth: nuclear war, pandemics far more deadly than Covid, artificial intelligence that turned on mankind and wiped us out, and so on.

And that really sets the tone for the rest of the book. At least according to his book, Michael Lewis believes that Sam, Caroline, Gary and Nishad were motivated to earn a lot of money because they wanted to give it to charity.

Chapters 1 and 2 introduce the character of Sam, including the way he thinks about expected value and the way he taught himself to make facial expressions, and Chapter 3 properly introduces effective altruism.

One other oddly big thing happened to Sam at the beginning of his junior year. Completely out of the blue, a twenty-five-year-old lecturer in philosophy at Oxford University named Will Crouch [Will MacAskill’s unmarried name] reached out and asked to meet with him. Sam never learned how the guy had found him—probably from the writing Sam had been doing on various utilitarian message boards.

He devotes several pages to talking about Peter Singer, Toby Ord and Will MacAskill, and the early version of 80,000 Hours Will was promoting on his visit to Harvard. As far as I can tell, Lewis basically gets those things right. And then he included a quote from Will that completely blew my mind:

“The demographics of who this appeals to are the demographics of a physics PhD program,” he said. “The levels of autism ten times the average. Lots of people on the spectrum.”

yeah.

Moving on: the end of Chapter 3 mentions Matt Wage, allegedly the first earning-to-giver on Wall Street (nominative determinism strikes again). Chapter 4 says that Sam became a vegan and an effective altruist because he thought through the arguments and concluded they were correct, not because of feelings of guilt or empathy. And Chapter 5 gets into the Alameda origin story, which is where EA really gets involved.

The business hadn’t even really been Sam’s idea but Tara’s. Tara had been running the Centre for Effective Altruism, in Berkeley, and Sam, while at Jane Street, had become one of her biggest donors. … Her success [trading crypto] led Sam to his secret belief that he might make a billion dollars by creating a hedge fund to trade crypto the way Jane Street traded everything.

But he couldn’t do it by himself. Crypto trading never closed. Just to have two people awake twenty-four hours a day, seven days a week, he’d need to hire at least five other traders. He’d also need programmers to turn the traders’ insights into code, so that their trading could be automated and speeded up. Tara had been making a handful of trades a week on her laptop; what Sam had in mind was an army of bots making a million trades a day. He’d need to hire some lower-IQ people to do the boring stuff, like finding office space and getting food for the traders and paying utility bills and probably lots of other things he hadn’t thought of.

His access to a pool of willing effective altruists was his secret weapon. Sam knew next to nothing about crypto, but he did know how easy it was to steal it. Anyone who started a crypto trading firm would need to trust his employees deeply, as any employee could hit a button and wire the crypto to a personal account without anyone else ever having the first idea what had happened. Wall Street firms were not capable of generating that level of trust, but EA was.

Lewis touches on how Alameda started out just sending money to random EAs in specific countries and getting them to transfer it back as a way of exploiting arbitrage opportunities. It’s absolutely wild.

Caroline, Gary and Nishad join Alameda in this chapter - Lewis briefly touches on each of their EA origin stories - and wealthy EAs including Jaan Tallinn invest in the young hedge fund. (No friends and family discount on the interest rate though - “investors were charging them a rate of interest of 50 percent” according to Lewis.)

Lewis does cover half of the EA founding team leaving Alameda as well, and to his credit I think he did interview some of the people who left, although his account still comes off as very Sam-friendly compared to the ones I’ve heard.

“The prospect of losing a couple of hundred million dollars that would have otherwise gone into solving the world’s problems felt pretty high stakes,” said Ben West. Under the circumstances, they thought it was insane to continue trading, but Sam insisted on trading. Crypto markets would not remain inefficient for long. They needed to make hay while the sun shone.

Lewis says the other managers were trying to figure out how to get rid of Sam; “aimed to bankrupt Sam, almost as a service to humanity”; and “told our investors he was faking being an EA, because it was the meanest thing they could think to say”.

In the end, for Sam to leave he had to want to leave, and Sam did not really want to leave. And so, on April 9, 2018, his entire management team, along with half of his employees, walked out the door, with somewhere between one and two million dollars in severance.

People in the wider EA community started hearing things, but…

“There was no smoking gun.” No one thing Sam had done for which they could easily condemn him. It was, as Tara said, “one hundred small things.”

And after a while, when Alameda and then FTX started doing so well, some of them even started to think that maybe they were wrong - maybe Sam was just rude and stubborn and untrustworthy in a way that makes you good at business, not in a way that means you should cut all ties.

In Chapter 6, Lewis shares Sam’s writing about the situation:

“I did damage to the EA community,” he wrote. “I made people hate each other a little more and trust each other a little less … and I severely curtailed my own future ability to do good. I’m pretty sure my net impact on the world has, so far, been negative and that is why.”

At that point, Sam stopped hiring so many EAs.

Chapter 7 focuses on George Lerner, a psychiatrist who “was treating maybe twenty EAs” when he was invited to move to the Bahamas and join the FTX staff. He says some really mean things, particularly about Caroline. And he comments a lot on the trends in his EA patients:

“A lot of EAs chose not to have kids,” said George. “It’s because of the impact on their own lives. They believe that having kids takes over from their ability to have impact on the world.”

“There are two parts of being EA,” said George. “Part one is the focus on consequences. Part two is the personal sacrifice.”

And trends in the EA staff in the Bahamas:

“Everyone is complaining about the lack of dating opportunities,” said George. “Except the EAs. The EAs didn’t care.”

The non-EAs thought the EAs thought they were smarter than everybody else.

*shrug*

Chapter 8 introduces a new EA character - Igor Kurganov. “Kurganov was a Russian-born former professional poker player to whom Musk had entrusted the task, it was reported, of giving away more than $5 billion worth of his fortune. He was also a self-described effective altruist”.

Lewis describes Sam as spending more of his money now:

- considering buying some of Twitter with Elon Musk, in a bid to convince him to donate more of his money towards EA-style causes
- investing $500 million in Anthropic
- going to EA leaders’ retreats where they discuss how to spend his money
- and trying to buy political power

In a very short time, Sam’s money had bankrolled some of the most spectacular failures in the history of political manipulation. Carrick Flynn, for example. When Sam stumbled upon him, Carrick Flynn was a newcomer to elective politics. He was the quintessential Washington, DC, policy wonk—one of the faceless minions in blue suits who sit along the wall behind the more important people and occasionally rise and whisper something in their ears. Carrick Flynn’s most important trait, in Sam’s view, was his total command of and commitment to pandemic prevention. His second-most important trait was that he was an effective altruist.

[…]

Flynn asked some fellow EAs what they thought about him running for Congress. As a political candidate he had obvious weaknesses: in addition to being a Washington insider and a bit of a carpetbagger, he was terrified of public speaking and sensitive to criticism. He described himself as “very introverted.” And yet none of the EAs could see any good reason for him not to go for it—and so he’d thrown his hat into the ring. Somewhere along the EA trail he’d become known to Sam.

[…]

The people of Oregon not only did not appreciate the effort; a lot of them started to kind of hate Carrick Flynn. And Carrick Flynn was not designed to ignore their feelings. Attacked by other candidates during a debate, he simply walked out in the middle.

[…]

Anyway, to Sam, the money he’d spent on Carrick Flynn had been a drop in his second bucket. Other congressional races had worked out better.

Chapter 8 also introduces the FTX Future Fund:

From the moneymaking division of FTX, it was just her, Sam, Gary, and Nishad; from the money-giving side were the four employees who worked for FTX’s philanthropic wing. They shared their employer’s habit of turning life decisions into expected value calculations, and their inner math yielded similarly surprising results. In 2020, Avital Balwit had won a Rhodes Scholarship and turned it down, first to run Carrick Flynn’s congressional campaign and then to give away FTX’s money. Leopold Aschenbrenner, who had entered Columbia University at the age of fifteen and graduated four years later as class valedictorian, had just declined a spot at Yale Law School to work for this new philanthropy. Their boss, a former Oxford philosopher named Nick Beckstead, was also present, as was their spiritual guru, Will MacAskill—who was of course in some way responsible for everyone, including Sam, even being there.

Lewis introduces the idea of longtermism here, and tries to shoehorn it into his theme of high-IQ kids with no use for feelings who follow the numbers.

One day some historian of effective altruism will marvel at how easily it transformed itself. It turned its back on living people without bloodshed or even, really, much shouting. You might think that people who had sacrificed fame and fortune to save poor children in Africa would rebel at the idea of moving on from poor children in Africa to future children in another galaxy. They didn’t, not really—which tells you something about the role of ordinary human feeling in the movement. It didn’t matter. What mattered was the math. Effective altruism never got its emotional charge from the places that charged ordinary philanthropy. It was always fueled by a cool lust for the most logical way to lead a good life.

And this one sentence reminded me how exciting things were, very briefly:

After handing out $30 million in 2021, they were on pace to hand out $300 million in 2022, and then $1 billion in 2023. As Nishad had put it to me not long before, “We’re finally going to stop talking about doing good and start doing it.”

And then the rest of the book is about the collapse of FTX. I won’t get into it; I expect the trial over the next six weeks will cover this much better than the book could.

It’s weird - whenever Lewis wants to talk about Sam, Caroline, Nishad and Gary, he calls them “the effective altruists”. And I guess that is how they met and a way to distinguish them from other key players in FTX! But I found it jarring every time.

I’ve spent so much time thinking about how those people were naive utilitarians, or possibly just bad actors - but either way, criminals who make upstanding people like us look bad - that I forgot how relatable Nishad and Gary and Caroline and Sam are. I’m glad the book helped me to remember.

This post first appeared on the EA Lifestyles Substack. Subscribe for free weekly posts.

Comments

Just noting, for people who might not read the book, that there are many more mentions of "effective altruism" than this post includes.

I agree that EA often seems painted as "high-IQ immature children", especially from Chapter 6 or 7.

To me, EA also seems painted as kind of a cult[1], where acolytes sacrifice their lives for "the greater good" according to a weird ideology, and people seem to be considered "effective altruists" mostly based on their social connections with the group.

I'm surprised you didn't mention what was for me the spiciest EA quote, from SBF in ~2018:

This combos really badly with the current EA shitshow I’m supposed to be, in some ways, adjudicating.

  1. ^

    Same way as this Washington Post article puts it

It seems somewhat irresponsible to title this post "every mention of EA in Going Infinite" if it only includes a handful of the many mentions of EA in Going Infinite. Appreciate you for clarifying!

Yes, I think the title should be changed.

"A Summary of Every Mention of EA in Going Infinite"?

"How EA is portrayed in Going Infinite"?

Yeah the latter is good.

I wrote about every mention, but some were summaries rather than direct copies and pastes, which I thought was straightforward for readers.

For example when I say, "He devotes several pages to talking about Peter Singer, Toby Ord and Will MacAskill, and the early version of 80,000 Hours Will was promoting on his visit to Harvard", I mean there were many mentions of effective altruism on those pages!

I also include sections of the book that talk about effective altruism without using that exact phrase.

I don't think there are any I didn't either quote or summarise, but I only read it once, so I could have missed some.

I didn't really understand that SBF quote to be honest! What was he referring to - the conflict he caused?

Note that the author uses "the effective altruists" as shorthand for "Caroline, Nishad, Gary, and Sam". E.g. "that's where the effective altruists all lived, at least until Caroline booted Sam out" is just referring to the four of them.

So I think there are fewer references to EA per se than a raw search for the term might imply.

He does, but at the same time I think it matters that he uses that shorthand rather than some other expression (say CNGS), since it makes the EA connection more salient.

Agreed, it's just important to understand that what the author means by that term is not what most of us would mean by the term.

"Sam became a vegan and an effective altruism because he thought through the arguments and concluded they were correct, not because of feelings of guilt or empathy."

There is so much in this sentence that captures the entire EA / non-EA division. Most especially, though, it captures the general public's misunderstanding of what empathy is. 

First is the implication that "following the arguments" means you don't have empathy. This is so patently false. If anything, EAs have more empathy - so much empathy that we want to do the most effective things for those who need help, rather than the things that give us personal satisfaction or assuage our guilty feelings.

The non-EA would say "I saw a blind person today, with a guide dog. I felt so much empathy that I decided to donate $50K to the charity that trains guide dogs." (Net result: one more guide dog trained for a person in a wealthy country, but massive feelings of self-satisfaction for the donor, who may even get to meet the dog personally and be thanked by its new owner.)

The EA sees the same thing, and thinks "Imagine how terrible it must be to be blind and not even have a guide dog, maybe living in a country which doesn't accommodate blind people the way the US does." And, after some research, donates $50K to a charity that prevents blindness, saving the sight of maybe 1000 people in a poor country. 

But still, in the eyes of many, the non-EA has shown more empathy, while the EA has just "followed the arguments". People think empathy is about a warm, fuzzy feeling they get when they help someone. But it's not. Empathy is about getting inside someone's head, seeing the world from their perspective and understanding what they need. 

 

Second is the focus on the giver rather than the receiver. 

The EA understands that a person in need needs help. They do not need empathy or sympathy or guilt. They need help. If they get that help from a cynical crypto-billionaire or from a generous kid who gives away her birthday savings, it makes no difference to them. 

The non-EA focuses on the generosity of the kid, giving up toys and chocolate to help (and that is wonderful and fully to be encouraged) and on the calculated logic of the billionaire who will not even notice the money donated. 

The EA focuses on the receiver, and on whether that person's needs are met. This is far closer to true empathy. 

 

I wonder if there's a way for EAs to fight back against our critics by explaining (in a positive way) that what we do and the way we think is empathy to the power of n, and that the suggestion that we don't have empathy is utterly false.

This is tangential to your point, but I don't think using the "prevent blindness in poor countries vs train guide dogs" comparison the way you're doing it is a great idea; more here: Fact checking comparison between trachoma surgeries and guide dogs.

Part of the problem is that there's nowhere you can actually donate $50k and prevent or cure 1k cases of blindness. The only [1] EA recommendation in this area is Helen Keller International (recommended by GiveWell), but looking at their vision benefits writeup and BOTEC, linked from their cost-effectiveness model, most of the supplementation is going to people who wouldn't otherwise become blind. Yes, it's two pills for ~$2.70, but my reading of GW's rough estimate is that only about 1 in 1,100 people who get the pills would otherwise become blind. [2] So about $3k to prevent a case of blindness via vitamin A supplementation.

(Note that vitamin A is unlikely to be the cheapest option, since the mortality benefits are the main reason GW has been working on evaluating it, so this is not an estimate of the cheapest way to avert blindness.)

[1] You might also think that Sightsavers would have something to do with vision, but GiveWell's review is specifically for their deworming program.

[2] 1.3% fraction of kids with night blindness, 10% of those progress to full blindness, relative risk of night blindness post vitamin A supplementation is 32%. Combining these (1/(1.3% * 10% * (1-32%))) gives me ~1,100x.
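If it helps to see that arithmetic laid out, here's the same back-of-the-envelope calculation as a small Python sketch. The figures are the rough estimates from the footnote above, not authoritative GiveWell numbers, and the variable names are my own:

```python
# Back-of-the-envelope: cost to prevent one case of blindness via
# vitamin A supplementation, using the rough estimates quoted above.

night_blindness_rate = 0.013      # fraction of kids with night blindness
progression_to_blindness = 0.10   # fraction of those who progress to full blindness
relative_risk_with_vit_a = 0.32   # relative risk of night blindness after supplementation

# Probability that a given pill recipient would otherwise have gone blind
# and is prevented from doing so by supplementation:
prevented_per_recipient = (
    night_blindness_rate * progression_to_blindness * (1 - relative_risk_with_vit_a)
)

recipients_per_case = 1 / prevented_per_recipient  # ~1,131, i.e. roughly 1 in 1,100
cost_per_recipient = 2.70                          # ~$2.70 for two pills
cost_per_case = recipients_per_case * cost_per_recipient

print(f"Roughly 1 in {recipients_per_case:,.0f} recipients would otherwise go blind")
print(f"Roughly ${cost_per_case:,.0f} to prevent one case of blindness")
# -> about 1 in 1,100 and about $3k, matching the estimate above
```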

Thanks Jeff, 

It's helpful to have the facts. I will look for a better example next time! 

Cheers

Denis

People think empathy is about a warm, fuzzy feeling they get when they help someone. But it's not. Empathy is about getting inside someone's head, seeing the world from their perspective and understanding what they need. 

Not a psychologist, but I've heard this distinction described as "cognitive empathy" vs "affective empathy".

Cognitive and Affective Empathy in Eating Disorders: A Systematic Review and Meta-Analysis - PMC (nih.gov)

Empathy can be separated into two major facets. Cognitive empathy refers to the ability to recognize and understand another's mental state (part of theory of mind (ToM) or mentalising) while affective empathy is the ability to share the feelings of others, without any direct emotional stimulation to oneself 

I'm not aware of surveys being done, but I strongly suspect EAs have average to above-average affective empathy, and unusually high levels of cognitive empathy.

Semi-related: There's this great video from Dr K about how smart kids develop a reliance on cognitive empathy, and that reminds me of a lot of EAs.

What a great build! Thank you for this! 

I hadn't looked at it that way, but it makes so much sense. 

But I would still say (as you might too) that EAs tend to be more affectively empathetic than average. We do care. 

There is this misrepresentation of EAs as if it's some kind of game for us, like Monopoly, but that is absolutely not representative of the EAs I've interacted with. 

I really appreciated this comment, nice one.

"People think empathy is about a warm, fuzzy feeling they get when they help someone. But it's not. Empathy is about getting inside someone's head, seeing the world from their perspective and understanding what they need."

I think this comment is especially excellent. Technically there are different definitions of empathy - the starkest example of the difference being psychopaths, who can have amazing cognitive empathy but zero affective empathy - but I still think this comment captures the heart of the issue well.


“The demographics of who this appeals to are the demographics of a physics PhD program,” he said. “The levels of autism ten times the average. Lots of people on the spectrum.”

yeah.

 

What's the problem with acknowledging the existence and success of people on the spectrum? Sam is very clearly on the spectrum and I take issue with the fact that Michael never addressed it. It seems like people are too scared to have real discussions about the autism spectrum.
