In response to Open Thread #40
Comment author: Farhan 21 July 2018 05:55:00PM 6 points

Just a newbie exploring the forum!

Hi, I'm new here!

By the looks of it, there is SO much to learn about effective altruism, and I absolutely love that. I've really come to accept learning as a never-ending process, and it's liberating to look at it that way.

I'm hoping to earn some karma points so I can make my own posts here and interact with members of this lovely forum, to keep learning and maybe contribute along the way.

I've totally bought into the concepts of effective altruism; the ideal of working as a community to edge closer to a better society really resonates with me. I'm so excited about EA that I've decided I want to help host an Effective Altruism Global X event in my home country, Bangladesh.

I know for a fact that not many people know about effective altruism in Bangladesh. I've seen there is a group from Bangladesh listed on the Effective Altruism Hub and I've emailed them to get in touch, but I hadn't seen much activity from them before reaching out.

I just feel that concepts such as "Earning To Give" and "Cause Neutrality" should be known by more people. So many people do not fully understand the potential for impact each individual holds; they underestimate their capacity to do good and do not invest the time to find out what they could do with their careers to have more impact. So many incredibly intelligent people, for lack of information that could easily have been made available, prematurely decide that earning money is the best they can do with their lives.

I absolutely believe that coming across 80,000 Hours was one of the luckiest moments of my life. The way they use scientific evidence to show people the sheer capacity in their own hands, whether through donating effectively, advocacy, or direct work (as explained in parts 2 and 3 of the career guide), inspires people to go out there and dedicate their lives to learning and doing good better.

Bangladesh, a lower-middle-income country, is an area that desperately needs more effective altruists. The career guide also suggests that, when identifying problem areas, they should be large in scale, neglected, and solvable. Dhaka, the capital of Bangladesh, is the most overcrowded city in the world according to a 2017 piece in The Telegraph, and 5th in the world in population density according to Wikipedia. Dhaka is a regular in the Economist Intelligence Unit's annual rankings of the "Least Liveable Cities" in the world, coming 2nd in 2014. Whether the misfortune of the people of Bangladesh is due to a weak government swayed by corruption or a lack of unity among the people, one thing is clear: spreading the ideals of effective altruism has potential for massive impact in Bangladesh. The fact is that even if an effective altruist group exists in Dhaka, it has not been active, so there is a huge opportunity to turn some heads and galvanize the doers of our society with the effective altruism movement, so that we can stride towards the long-overdue development of our society.

I'm very excited to meet more wonderful people and learn many new things. It keeps me awake at night to think of what a success an effective altruist community fostered in Bangladesh could prove to be. I have not yet applied to be an organizer, because I thought I should first get at least a little bit involved with the community.

Lots of things to look forward to, and that is always how life should be.

Love,

Farhan

In response to comment by Farhan on Open Thread #40
Comment author: Peter_Hurford (EA Profile) 22 July 2018 09:55:09PM 1 point

This is really sweet to hear. :) I wish you the best and hope you find a lot in the effective altruism community.

Comment author: [deleted] 10 July 2018 05:23:54PM 0 points

Oh man, that's not what I meant, sorry! I wasn't deliberately overdoing it for practice (and I've generally been much more critical on here than I am in person; I haven't singled out Brendon). I have doubts about people's reasoning in my mind all the time, but it's very rare that I say them out loud and thereby give others the chance to learn from them, present evidence to the contrary, or say how they think I'm being irrational. I was just trying to express my doubts out loud the way other people seem to, but I knew I'd make some mistakes and I really did fuck up.

Don't worry, I've given up on the idea. I'll shut down my account if I can (struggling to find the option right now), and I don't plan on starting any more.

Sorry again.

Comment author: Peter_Hurford (EA Profile) 11 July 2018 01:58:30AM 1 point

Maybe just send the feedback privately next time?

FYI I spent five karma points to say this to you, so you better take it seriously.

In response to comment by Peter_Hurford  (EA Profile) on Open Thread #40
Comment author: Milan_Griffes 11 July 2018 12:15:00AM 0 points

I'd argue we don't necessarily know yet whether this is true. It may well be true, but it may well be false.

I think it's almost certainly true (confidence ~90%) that far future effects account for the bulk of impact for at least a substantial minority of interventions (like at least 20%? But very difficult to quantify believably).

Also seems almost certainly true that we don't know for which interventions far future effects account for the bulk of impact.

Comment author: Peter_Hurford (EA Profile) 11 July 2018 01:54:55AM 1 point

Separately, I feel pretty confident that, even taking into account all the possible long-term effects I can think of (population ethics, meat eating, economic development, differential technological development), the effect of AMF is still net positive. I wonder if you really can model all these things? I previously wrote about five ways to handle flow-through effects in analysis and like this kind of weighted quantitative modeling.
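As a rough illustration of what that kind of weighted quantitative modeling could look like, here is a minimal sketch; all effect sizes and credences are made-up placeholders, not figures from Peter's post or from any GiveWell/AMF analysis.

```python
# Minimal sketch of credence-weighted modeling of flow-through effects.
# All numbers are hypothetical placeholders for illustration only.

# Direct effect of an intervention, in arbitrary "units of good".
direct_effect = 100.0

# Candidate long-term / flow-through effects:
# (description, effect size if real, credence that it is real at roughly this size).
flow_through_effects = [
    ("population ethics adjustment", -20.0, 0.3),
    ("meat eating (poor meat eater problem)", -10.0, 0.4),
    ("economic development spillovers", 40.0, 0.5),
    ("differential technological development", 5.0, 0.2),
]

# Credence-weighted expected contribution of the flow-through effects.
weighted_flow_through = sum(size * credence for _, size, credence in flow_through_effects)
total = direct_effect + weighted_flow_through

print(f"Direct effect only:              {direct_effect:.1f}")
print(f"Including weighted flow-through: {total:.1f}")
```

Under these toy numbers the intervention stays net positive, which is the shape of the claim being made; the real question is how believable the weights themselves are.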

Comment author: Peter_Hurford (EA Profile) 11 July 2018 01:51:52AM 1 point

I recently played two different video games with heavy time-travel elements. One of the games heavily implied that choosing differently made small differences for a little while but ultimately didn't matter in the grand scheme of things. The other heavily implied that even the smallest of changes could butterfly into dramatically different outcomes. I kind of find both intuitions plausible, so I'm just pretty confused about how confused I should be.

I wish there were a way to empirically test this, other than with time travel.

In response to Open Thread #40
Comment author: Milan_Griffes 10 July 2018 03:15:37PM 5 points

Why I'm skeptical of cost-effectiveness analysis

Reposting as a comment because the mods told me this wasn't thorough enough to be a post.

Briefly:

  • The entire course of the future matters (more)
  • Present-day interventions will bear on the entire course of the future, out to the far future
  • The effects of present-day interventions on far-future outcomes are very hard to predict
  • Any model of an intervention's effectiveness that doesn't include far-future effects isn't taking into account the bulk of the effects of the intervention
  • Any model that includes far-future effects isn't believable because these effects are very difficult to predict accurately
Comment author: Peter_Hurford (EA Profile) 10 July 2018 07:14:20PM 5 points

I'm glad you reposted this.

Any model of an intervention's effectiveness that doesn't include far-future effects isn't taking into account the bulk of the effects of the intervention

I'd argue we don't necessarily know yet whether this is true. It may well be true, but it may well be false.

Any model that includes far-future effects isn't believable because these effects are very difficult to predict accurately

This doesn't account for the fact that there are still gradients of relative believability here, even if the absolute believability is low. There's also an interesting meta-question of what to do when under various levels and kinds of uncertainty (and of getting a better handle on just how bad the uncertainty is).

Comment author: [deleted] 10 July 2018 05:26:24PM 0 points

Ack, never mind, I think it's best I just shut down my account.

Comment author: Peter_Hurford (EA Profile) 10 July 2018 07:06:56PM 0 points

Why?

In response to Open Thread #40
Comment author: Naryan 09 July 2018 10:19:58PM 4 points

Impact Investing from an EA Perspective

This is just a teaser, since I don't have enough karma for a full post yet!

Picture a scale that has charity on one side (good social utility, -100% financial return) and investing on the other (zero social utility, 7% financial return). Impact investing is a space that can give risk-adjusted market returns similar to traditional investments, while also providing social utility.

In my research, I've found several factors that make me excited about this area:

  • Impact investing is about 5% of the size of charitable donations ($22B vs $410B in 2016), and is growing much faster (17% vs 4% annually)

  • Impact investing makes up only 0.16% of the total capital markets - huge room for growth

  • Philanthropic enterprises with sustainable business models can use existing capital markets to get funded on a large scale

  • Due to the market's current inability to accurately value the 'social utility' provided, there are many greatly under-valued investment opportunities that provide social utility similar to comparable charities

I've got more detail, logic and sources in the full post, but in the meantime, I'll tell you about one example opportunity that I've zoomed in on.

WorldTree is a company that lets you buy an acre of fast-growing Empress Splendor trees. Its goal is to generate income from the harvest of the trees and offset the carbon footprint of investors:

  • $2,500 CAD minimum investment, enough to plant 1 acre of trees

  • One acre is enough to offset your lifetime carbon footprint

  • The timber is sold after 10 years; the conservative return to the investor is $20k

From an EA perspective, I compared the stated carbon cost of WorldTree ($1.72/tonne) to Cool Earth ($1.34/tonne) and traditional carbon offset programs ($10/tonne). This investment could return a 23% annual return, while the Cool Earth 'investment' would be a loss of 100%. On its surface, this example does look quite promising when counting both the social utility generated and the future good my $20k could do in 10 years' time.
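As a quick sanity check on the quoted return figure, here is the implied compound annual growth rate from the numbers above ($2,500 growing to a stated $20,000 over 10 years); this is just arithmetic on the poster's figures, not a verified projection.

```python
# Sanity check of the implied annual return from the figures quoted above.
# These are the poster's stated numbers, not verified WorldTree projections.

initial_investment = 2_500   # CAD, minimum investment for one acre
payout = 20_000              # CAD, stated conservative return after harvest
years = 10                   # timber sold after 10 years

# Compound annual growth rate: (payout / initial) ** (1 / years) - 1
cagr = (payout / initial_investment) ** (1 / years) - 1
print(f"Implied annual return: {cagr:.1%}")  # ~23.1%, matching the ~23% figure
```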

Looking forward to posting a more detailed write-up on the space once I'm able, and to hearing your feedback on these ideas!

In response to comment by Naryan on Open Thread #40
Comment author: Peter_Hurford (EA Profile) 10 July 2018 05:50:58AM 4 points

This investment could return a 23% annual return

That's insanely high... social arguments would be irrelevant if you could safely get that kind of return. Every investor would want in.

In response to Open Thread #40
Comment author: vollmer 09 July 2018 06:50:16AM 2 points

Side note: I'd encourage commenters to put a title at the top of their comments (maybe this can be done in the OP).

In response to comment by vollmer on Open Thread #40
Comment author: Peter_Hurford (EA Profile) 09 July 2018 05:02:27PM 2 points

I edited the OP to mention it.

Comment author: [deleted] 05 July 2018 09:01:40PM 3 points

Thank you so much for taking the time to explain :-) This is really useful.

Ha, yes, this post started as a long comment that I then thought might warrant a post, and I was hoping it might just cause a lightbulb moment for some people and not require much explanation. Alas!

I have a more substantial post r.e. community strategy in the works anyway. I'll probably just delete this one soon. I apologise for my first post being a bit of a disaster.

Comment author: Peter_Hurford (EA Profile) 06 July 2018 03:52:53AM 3 points

I have a more substantial post r.e. community strategy in the works anyway. I'll probably just delete this one soon. I apologise for my first post being a bit of a disaster.

Hey, no worries. Sorry this forum can be kinda intense for newcomers. I'll be really excited to see your more substantial post. :)

Comment author: [deleted] 05 July 2018 07:49:58AM 0 points

I don't want to distract from my point even further by going into the common criticisms of Leverage (although to some extent I fear it's already too late for that...the Leverage reference was the bit I was most unsure about, so I'm assuming that's what's driving the downvotes until I see evidence to the contrary).

Comment author: Peter_Hurford (EA Profile) 05 July 2018 04:38:27PM 4 points

I don't really care about Leverage one way or another. I initially downvoted to try to avoid the needless community drama, but I've removed my downvote now that you've removed that.

However, I'm still not going to upvote, and the reason is the wider and more important point: your post still doesn't actually argue for anything. Why do we need full-time EA community strategisers? What would they do? What's wrong with our current approach? I'd love to see more elaboration here.
