I do independent research on EA topics. I write about whatever seems important, tractable, and interesting (to me). Lately, I mainly write about EA investing strategy, but my attention span is too short to pick just one topic.
I have a website: https://mdickens.me/ — most of the content there gets cross-posted to the EA Forum.
My favorite things that I've written: https://mdickens.me/favorite-posts/
I used to work as a software developer at Affirm.
I'm not particularly knowledgeable about this but my take is:
I would not interpret that as the community being complacent.
I had an idea for a different way to evaluate meta-options. A meta-option behaves like a call option where the price equals the current value of the equity and the strike price equals the cash salary you'd be able to get instead.[1]
If I compare an equity package worth $100K per year versus a counterfactual cash salary of $100K and assume a volatility of 70% (my research suggests that small companies have a volatility around 70–100%), the call option for the equity that vests in the first year is worth $29K, and the call option for the equity that vests in the 4th year is worth $56K (which is equivalent to a 12% annual return). So on average, a meta-option on a 4-year equity package is worth somewhere in the ballpark of an 18% annual return.
(But if the equity has a lower face value than the counterfactual cash salary, it pretty quickly becomes not worth it.)
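As a sanity check on those figures, here's a minimal Black-Scholes sketch. The risk-free rate isn't stated above, so the 4% here is my assumption; with it, the model lands close to the $29K and $56K figures.

```python
# Rough Black-Scholes check of the meta-option numbers above.
# Assumptions (mine, not from the comment): ~4% risk-free rate, no dividends.
from math import log, sqrt, exp
from statistics import NormalDist

def bs_call(S, K, sigma, T, r):
    """Black-Scholes price of a European call option."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + sigma**2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

S = K = 100_000   # $100K equity package vs. $100K counterfactual cash salary
sigma = 0.70      # 70% annual volatility (low end of the 70-100% range)
r = 0.04          # assumed risk-free rate

year1 = bs_call(S, K, sigma, T=1, r=r)  # tranche vesting after 1 year
year4 = bs_call(S, K, sigma, T=4, r=r)  # tranche vesting after 4 years
print(f"1-year tranche: ${year1/1000:.0f}K, 4-year tranche: ${year4/1000:.0f}K")
```

This prices each vesting tranche as a separate call; longer-dated tranches are worth more because more volatility accrues before you "pay" the strike.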
[1] This is kind of wrong: with a normal stock option you don't have to pay the strike until you exercise, but with an employee meta-option you have to give up your counterfactual salary as soon as you start working. And because you don't vest at all during the first year, you give up a full year of cash salary no matter what. With monthly vesting after that, the fact that you pay at the beginning of each month instead of the end doesn't matter much.
(edited to make the numbers make more sense)
I disagree-voted to indicate that I did not donate my mana because of this post. (I use Manifold sometimes, but I have only a trivial amount of mana.)
I feel your pain. I hope the number of upvotes and hearts you're getting helps you feel better, but I know brains don't always work that way (mine doesn't).
I believe this sort of thing doesn't get much attention from EAs because there's not really a strong case for it being a global priority in the same way that existential risk from AI is.
It's really hard to judge whether a life is net positive. I'm not even sure when my own life is net positive. Sometimes, if I'm going through a difficult moment, I ask myself as a mental exercise: "if the rest of my life felt exactly like this, would I want to keep living?" And it's genuinely pretty hard to tell. Sometimes it's obvious (right at this moment my life is definitely net positive), but when I'm feeling bad, it's hard to say where the threshold is. If I can't even identify the threshold for myself, I doubt I can identify it in farm animals.
If I had to guess, I'd say the threshold is something like
it seems important for my own decision making and for standing on solid ground while talking with others about animal suffering.
To this point, I think the most important things are
If we're talking about financial risk, I enjoyed Deep Risk, a short book by William Bernstein.
The use of quantitative impact estimates by EAs can mislead audiences into overestimating the quality of quantitative empirical evidence supporting these estimates.
In my experience, this is not a winnable battle. Regardless of how many times you repeat that your quantitative estimates are based on limited evidence / embed a lot of assumptions / have high margins of error / etc., people will say you're taking your estimates too seriously.
Could you say more about your thoughts on animal welfare vs. x-risk? I agree that animal welfare is relatively neglected, but it also seems to me that x-risk needs a lot more funding and marginal dollars are still really valuable. (I don't have a strong opinion about which to prioritize but those two considerations seem relevant.)