  • I am very confident that there are stages (or levels) to awakening that are accessible to the average person.
  • I think some of the earlier stages covered here could be achieved by the average person with ~a month of work.
  • I think achieving this would improve the average person's wellbeing by 1–20%.
  • I think achieving the higher stages of enlightenment could raise wellbeing by more than 20%, especially if emptiness or the cessation of suffering is achieved for prolonged periods.

Why is this field not taken seriously by EAs?

2 Answers

A lot of EAs are into mindfulness/meditation/enlightenment. You link to Clearer Thinking, and I consider Spencer Greenberg to be part of our community. If you want to get serious about tractable, scalable mental health interventions, SparkWave (also from Spencer Greenberg) has a bunch of very cool apps that focus on this. 

I'm personally not into enlightenment/awakening because meditation doesn't do much for me, and a lot of the "insights" I hear from "enlightened" people strike me as the sensation of insight more than the discovery of new knowledge. I'm also skeptical that meditation actually solves problems for most people. Poverty and preventable diseases cause a huge amount of suffering; the solutions to these are systemic changes to society and medical interventions, not meditation.

This is all just my personal opinion; I know plenty of people in the EA community across the spectrum, from loving enlightenment techniques to hating them.

This is not central to the original question (I agree with you that poverty and preventable diseases are more pressing concerns), but for what it's worth, one shouldn't be too surprised that the "insights" one hears from "enlightened" people sound more like the sensation of insight than the discovery of new knowledge. Most people who've found something worthwhile in meditation (and I'm speaking here as an intermediate meditator who's listened to many advanced meditators) would agree that progress, breakthroughs, and the goal of meditation are not about gaining new knowledge, but about seeing more clearly what is already here. (And doing so at an experiential level, not a conceptual one.)

Just saw how strongly downvoted this parent comment is! OP asked "Why do EA people think a thing?" And I responded with "This is why I, an EA person, think a thing." You can disagree with my opinion, but you can't deny that I have this opinion. I'm not obsessed with EA forum karma, but it's kind of annoying how badly people are following discourse norms here by downvoting opinions that they simply don't like. (There's a disagree button for this exact purpose, people!)

yanni kyriacos
I find this a strange feature of this forum, tbh. I don't think I've ever downvoted anything? But yeah, the best strategy is not to care, imo.

I think Yanni actually works at SparkWave :)

My guess is:

  • AI safety people would argue that if the world is destroyed, this improved happiness doesn't buy us much
  • Animal welfare people would argue that there is a lot more low-hanging fruit for improving animals' lives, so focusing on humans isn't the best we can do
  • Global health/wellbeing people tend towards less speculative/less "weird" interventions (?)

I still think there are probably a lot of people who could get excited about the topic and it might be the right time to start pitching it to EAs.

(Also, side note: maybe you're already aware of it, but Sasha Chapin is basically researching enlightenment: https://sashachapin.substack.com/p/hey-why-arent-we-doing-more-research )
