
Sometimes the high impact game feels weird, get over it.

I have been in lots of conversations recently where people expressed discomfort with the longtermist community's spending (particularly at events).

I think my general take here is "yeah, I can see why you think this, but get over it". Playing on the high-impact game board when you have $40B in the bank and only a few years to use it involves acting like you are not limited financially. If top AI safety researchers want sports cars because it will help them relax and therefore be 0.01% more productive (and I trust their judgment and value alignment), they are welcome to my money. Giving them my money is winning, and as far as I am concerned it's a far better use of money than basically anything else I could do. To be clear, I know that there are optics issues, community health issues, etc., but sometimes we can spend money without worrying about these things (e.g. retreats for people already familiar with longtermism).

Yes, this would feel weird, but am I really going to let my own feelings of weirdness stop me from helping billions of people in expectation? That feels much weirder.


I agree with the sentiment, but I wouldn't put it quite as drastically. (If someone actually talked about things that make them 0.01% more productive, that suggests they have lost the plot.) Also, "(and I trust their judgment and value alignment)" does a lot of work. I assume you wouldn't say this about just any researcher who self-describes as working on longtermism. If some grantmakers have poor judgment, they may give away large sums of money for regranting to other grantmakers who have even worse judgment or could be corrupt, and then you get a pretty bad ecosystem where it's easy for the wrong people to gain more influence within EA.

I want to be clear that I am endorsing not only the sentiment but also the drastic framing. At the end of the day, a few hundred thousand dollars here and there is literally a rounding error on what matters, and I would much rather top researchers spent this money on weird things that might help them slightly than that we had a few more mediocre researchers working on things that don't really matter.

I certainly wouldn't say this about just any researcher; if they could work at Constellation/Lightcone, they have maybe a 30% chance of hitting my bar. I am much more excited about this for the obvious top people at Constellation/Lightcone.

(If someone actually talked about things that make them 0.01% more productive, that suggests they have lost the plot.)

I don't really like this; presumably if impact is extremely heavy-tailed, we can get a lot of value from finding these activities, and a general aversion to this because it might waste mere money seems very bad. Things like optics are more of a reason to be careful, but I don't know, maybe we should just make anonymous forum accounts to discuss these things and then actually take our ideas seriously.

Superforecasters can predict more accurately if they make predictions at 1% increments rather than 2% increments. It either hasn't been studied, or they've found negative evidence, whether they can make predictions at lower % increments. 0.01% increments are way below anything that people regularly predict on; there's no way to develop the calibration for that. In my comment, I meant to point out that anyone who thinks they're calibrated enough to talk about 0.01% differences, or even just things close to that, is clearly not a fantastic researcher and we probably shouldn't give them lots of money.

A separate point that makes me uneasy about your specific example (but not about generally spending more money on some people with the rationale that impact is likely extremely heavy-tailed) is the following. I think even people with comparatively low dark personality traits are susceptible to corruption by power. Therefore, I'd want people to have mental inhibitions against developing tastes that are too extravagant. It's a fuzzy argument, because one could say the same thing about spending $50 on an Uber Eats order, and for that sort of example my intuition is "Obviously it's easy to develop this sort of taste, and if it saves people time, they should do it rather than spend willpower on changing their food habits."
But on a scale from $50 Uber Eats orders to spending $150,000 on a sports car, there's probably a point somewhere where someone's conduct becomes too dissimilar to the archetype of a "person on a world-saving mission." I think someone you can trust with a lot of money and power would be wise enough that, if they ever form the thought "I should get a sports car because I'd be more productive if I had one," they'd sound a mental alarm and start worrying they'd been corrupted. (And maybe they'll end up buying the sports car anyway, but they certainly won't be thinking "this is good for impact.")


 
