DC

1704 karma · Joined Aug 2016

Comments: 216

Topic contributions: 2

DC · 2d

I recommend asking clarifying questions to reduce confusion before confidently expressing what turn out to be, at least in part, spurious criticisms. I guarantee you it's not fun for the people announcing their cool new project to receive them.

I feel a little alienated by the emphasis on elite education from both sides of this kind of debate. Not that there's necessarily much that can be changed there; it's probably just the nature of the game, mostly. But I find it a little odd that the "be more normal [with career capital]" camp presumes normal to include being in the upper middle class of the Anglo world. That's usually the sort of person making the critique, though I could see a blue-collar worker levying it too.

And? Do you have a particular solution to guarantee pandemic prevention that deals with the specific logistical complexities inherent in the task and can be applied to every country on Earth without being resisted?

"Step 2: Draw the rest of the owl."

I see you state that your solutions will come in later posts, but I think it's better to present them upfront, given that your rhetoric is currently not justified. Given your title, I expect to see a theory of change that attempts to address the overwhelming challenges involved.

DC · 2mo

It would be helpful to know what events have been hosted there by now.

DC · 2mo

"X-Risk" Movement-Building Considered Probably Harmful

My instinct for a while now has been that it's probably really, really bad for the majority of the population to be aware of the meme of x-risk, or at least that awareness does more harm than good. See climate doomerism. See (attempted) gain-of-function research at Wuhan. See asteroid deflection techniques that are dual-use with respect to asteroid weaponization, which, while still a far-off risk, is orders of magnitude worse than natural asteroid impact. See gain-of-function research at Anthropic, which, idk, maybe it's good, but that's kind of concerning, as well as all the other resources provided to questionably benevolent AGI companies under the assumption it will do good.

"X-risk" seems like something that will make people go crazy in ways that will cause destruction; for example, people still use the term "pivotal act" even when I'd claim it's been superseded by Critch's "pivotal process". I'm also worried about dark triad elites or bureaucrats co-opting these memes for unnecessary power and control, a take from the e/acc vein of thought that I find to be their most sympathetic position, because it's probably correct when you think in the limit of social memetic momentum. Sorta relatedly, I'm worried about EA becoming a collection of high modernist midwittery as it mainstreams, watered down and unable to course-correct away from co-options and simplifications. Please message me if you want to riff on these topics.

A minor point that doesn't engage with the substance of your post (I basically agree with its main point), but a negative externality here is that fundraising is often annoying. There is adverse selection: organizations that fundraise are often corrupt (see: Wikipedia) and ineffective. If an org is fundraising, it makes me implicitly think, "Why do you need my money? What has caused this scarcity? Are you ineffective and have been passed over?" Personally, I'd prefer to move past the social technology of donations and towards impact-market-like mechanisms.

DC · 3mo

One part of me is under the impression that more people should commit themselves to things that probably won't work out but would pay off massively if they do. The relevant conflict here is that this means losing optionality and taking yourself out of the game for other purposes. We need more wild visions of the future that may work out if, e.g., AI doesn't. Playing to your outs is very related, but I'm thinking more generally: we do in fact need more visions based on different epistemics about how the world is going, and someone might necessarily have to adopt some kind of provisional story of the world that will probably be wrong but is requisite for modeling any kind of payoff their commitment may have. Real change requires real commitment. Also, most ways to help look like particular bets towards building particular infrastructural upgrades, vs. starting an AGI company that Solves Everything. On the flip side, we also need people holding onto their wealth and paying attention, ready to pounce on opportunities that may arise. And maybe you really should just get as close to the dynamo of technocapital acceleration as possible.

Not noticing big obvious problems with impact certificates/markets

What problems are you thinking of in particular?

Ancestor worship also came to mind, à la What We Owe The Past, but I wasn't sure if OP had something different in mind than that post.

https://forum.effectivealtruism.org/posts/ndvguMbcdAMXzTHJG/what-we-owe-the-past
