Oct 10 2015 · 1 min read

Here's a place to discuss projects, ideas, events and miscellanea relevant to the world of effective altruism that don't need a whole post of their own!

Comments (36)

Hi, I've recently written an article about what I think are some image problems that effective altruism has and how we can combat them. I'd love to post it here so that I can get feedback and stimulate discussion, but I don't have enough Karma points to do so. Please like this post so that I can post it!

If you're worried about the material you can see an earlier draft of the article on the Effective Altruism fb group (https://www.facebook.com/groups/effective.altruists/) or the EA Hangout fb group (https://www.facebook.com/groups/eahangout/?fref=ts).

You're now good to go.

I have a fear/uncertainty/doubt about excessive cause-prioritization as a focus of the Effective Altruism movement's message.

Let's say Tess goes to an Ivy League school and wants to make an impact through work in education. She does Teach for America and teaches underprivileged kids in the US for a few years, and gradually rises within the schools she works at until she is an administrator and can allocate resources for an entire district. Because she's very good at it and deeply cares about her work, she ends up making an enormous impact with her career, transforming a bad public school system into a great one, and substantially positively affecting the lives of thousands of kids per year.

I think the EA movement would disapprove of the early steps in this career path. So if Tess discovered the current EA movement too early in her career, she would either become disenchanted with EA, or "drop out" of TFA and instead go earn-to-give, or something like that -- i.e., follow a career path which is more approved by EAs, but ultimately less impactful. Of course, this would be justified since she would have no way to know that she has the "plot armor" to succeed at her original path. But by dropping out of something she is passionate about and doing earning-to-give, she is sacrificing her potential upside through the passion/resonance she has with her work.

I guess I worry about a lot of people saying such-and-such isn't really EA because it doesn't maximize a narrow ideal of "effectiveness", and that can either turn people off of EA, or turn them off of careers in which they might actually have real upside through resonance.

That scenario only makes sense with the benefit of hindsight. Generally speaking, people who choose impactful career paths tend to do more for the world, so we recommend accordingly. EA career advice, e.g. from 80,000 Hours, already accounts for the fact that personal enjoyment and passion are part of being successful in one's career, although they are far from the only components and are difficult to plan for.

I believe this is largely the reason why 80K has recently been trying harder to downplay the importance of earning to give. It's not the only good career path, and it's not something everyone should be doing.

Let's say Tess goes to an Ivy League school and wants to make an impact through work in education. She does Teach for America and teaches underprivileged kids in the US for a few years, and gradually rises within the schools she works at until she is an administrator and can allocate resources for an entire district. Because she's very good at it and deeply cares about her work, she ends up making an enormous impact with her career, transforming a bad public school system into a great one, and substantially positively affecting the lives of thousands of kids per year.

This career seems good ex post. It turned out well! But unfortunately forecasting is hard, especially about the future. The relevant question for Tess at age 21 is whether, in expectation, doing TfA is the best use of her time. And the answer is probably no - most TfA graduates achieve very little. In your story she got lucky - but 80k cannot advise 'be lucky' as a career strategy!

[anonymous]

Have you seen Holden Karnofsky's thoughts on career choice? I tend to agree with him (and with your concern here) that doing outstanding work in any career is more important than abstractly considering "which career is best" when it comes to opportunities to do good. Hence very specific personal factors may be the most important concerns. I've heard that the people at 80,000 Hours have been talking to him and moving in that direction, but if so they seem to have some trouble communicating this.

I tend to agree with him (and with your concern here) that doing outstanding work in any career is more important than abstractly considering "which career is best" when it comes to opportunities to do good.

This conclusion seems weird to me. The space of careers is wide and the space within any career is much narrower, so I'd expect that choosing which career to pursue is one of the most important decisions, similarly to how choosing a cause area is one of the most important parts of choosing a charity. Individual skill sets matter more for career choice, but I'd expect that most people (especially EAs) have many careers they could choose where they'd perform about as well in any of them.

[anonymous]

Interesting! I honestly don't think charity and career choices are all that similar. My impression has been that there are huge differences between positions in the same field, and even between positions with the same employer. What's the team/management/environment like? How well does it fit your working style and skills? What will you learn, who will you meet? What would the prospective employer do if they didn't hire you, and what will other applicants do if you take the job? What further opportunities will you be able to pursue that others wouldn't be able or motivated to?

Holden's angle as I interpret it is that answers to the first sorts of question might be the difference between doing good work -- the level you'd perform about as well as in any career -- and doing outstanding work. And doing outstanding work is what often turns up the best further opportunities, which often couldn't have been known in advance (e.g. new initiatives, organizations, collaborations, leadership positions, policy change), and which make for much larger differences in outcomes than the between-career differences in acceptably good work.

But you can't really get at the first questions at the level of "doctors may be replaceable" or "an additional PhD researcher gives marginal speedup X to biomedical progress." At best this can narrow things down slightly from things you think you might be a good fit for, depending on how broad that category is to begin with.

Beyond that, there's not really a substitute for doing a bunch of informational interviews, taking a variety of internships, volunteering in fields of interest, turning up as many positions as you can through your network, asking important questions during real interviews, and so on; then finally evaluating your specific options. And at that point it should not be surprising to find a better opportunity (for you personally) in what was a "worse" field in a more zoomed-out analysis.

Hi all, I'm new to the forum so thanks for having me! I've just launched a blog called Science for Seekers (www.scienceforseekers.com), with an emphasis on effective giving (in the context of a broader focus on uniting rationality and meaning/purpose). It's just a little fledgling project, and I'd love any comments/feedback from all of you deep thinkers.

  • Andrew

Would you like to leave money in your will to GiveWell's top rated charities at the time of your passing? If so, Charity Science will help you write it for free.

To make it as easy as possible for you, we at Charity Science have made a simple form that takes as little as 5 minutes to complete. After that you come out with a ready-made will. And don't worry if you're not sure what to put in it; it's easy to change and you can always come back to it later. So give it a shot here. The default option should be to set one up just in case something terrible does happen; that way you always have something ready.

A few more reasons to take the time to write a will include:

  • Reducing the inheritance tax incurred; leaving money to charity is an excellent way to do so.

  • Making provisions for your children if you have any, for example by choosing who will take care of them and setting aside funds for this.

  • Making any other necessary provisions, such as for your pets, or your business, or other responsibilities that you have.

  • Specifying what sort of funeral you would like, which will spare your family from having to make the decision.

  • Naming your executors for your will (family members are a standard choice).

But most of all it’s because you have the incredible opportunity to do an epic amount of good.

You can set it up here. After that, consider talking to your friends, parents and grandparents to see if they would be interested in doing the same. It's really important you mention it, because the average amount left to charities in a will is in the thousands of dollars, so a few words may go a very long way.

If this doesn't appeal to you, then there are other things that you could do. You can always run a fundraiser for Christmas, your birthday or any event you like.

I've been meaning to get a will sorted out for over five years. Can I name FHI or MIRI or CFAR?

Hey Dale,

The system only works for GiveWell/Charity Science recommended charities.

Are you thinking of adding other EA charities at some point in the near future?

I have a question for those who donate to meta-charities like Charity Science or REG to take advantage of their multiplier effect (these charities typically raise ~$5-10 per dollar of expenditure). Do you donate directly towards the operations expenses of these meta-charities? For example, REG's donations page has the default split of your donations as 80% towards object-level charities (and other meta-charities), while 20% is towards REG's operating expenses, which include the fundraising efforts that the multiplier presumably is coming from. It seems to me that in order to get the best multiplier for your donation, you would donate 100% towards operating expenses, since any dollar not spent on operating expenses wouldn't have any multiplier. Is this right?

Do you donate directly towards the operations expenses of these meta-charities?

Yes. I donate to both GiveWell top charities (which Charity Science supports) and Charity Science's operations.

It seems to me that in order to get the best multiplier for your donation, you would donate 100% towards operating expenses, since any dollar not spent on operating expenses wouldn't have any multiplier. Is this right?

This seems largely right, though it's important to note that donating to the recommended charities of a meta-charity does help that charity.
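To make the arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. The numbers are illustrative assumptions, not figures from REG or Charity Science: a 7x fundraising multiplier (roughly the middle of the ~$5-10 range quoted above) and the 80/20 default split mentioned in the question.

```python
# Back-of-the-envelope sketch of the leverage question above.
# All numbers are illustrative assumptions: a ~7x fundraising
# multiplier and an 80/20 pass-through/operations split.

def money_moved(donation, share_to_operations, multiplier):
    """Rough estimate of dollars ending up at object-level charities.

    share_to_operations: fraction of the donation spent on the
        meta-charity's fundraising/operations.
    multiplier: assumed dollars raised for object-level charities per
        dollar of operating expenditure.
    """
    to_operations = donation * share_to_operations
    passed_through = donation * (1 - share_to_operations)
    return passed_through + to_operations * multiplier

if __name__ == "__main__":
    for split in (0.0, 0.2, 1.0):
        print(f"{split:.0%} to operations -> "
              f"${money_moved(100, split, multiplier=7):,.0f} moved per $100")
```

Under these assumptions, $100 split 80/20 moves about $220 to object-level charities, while $100 given entirely to operations moves about $700. So yes, if the multiplier really held at the margin, funding operations would give the best leverage, subject to the caveats above about the meta-charity's room for more funding.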

I am wondering if someone can explain, or point me to a link on, why they think global poverty charity matters compared with policy. For example, one statistic from GWWC was that the Iraq war cost more than all government foreign aid from the developed world for 50 years, and I would guess that the war's economic effects on Iraq were comparable to its costs. Also, African exports and imports are worth about $35 billion each but total US international charity (to all countries, not just Africa) was $19 billion in 2012, according to this source. This suggests to me that (from an EA standpoint) policies on trade alone are more important than charity.

Merely comparing the amount of money spent by the government or the economy on X or Y doesn't tell you where you can best maximize your impact. Not that policy is always a waste of time, of course. It could very well be a great thing to pursue, but the benefits of policy advocacy are not that clear once you frame the problem correctly.

If you have no issue-specific info about crowdedness or tractability, then total effects seem a decent starting point. Right now what I see about policy (trade and wars) vs. charity is that there are a group of ~300 million US citizens who are somehow producing both. Therefore your expected contribution for both is something like (size of total effect)/(300 million).

Edit: I am also forgetting about immigration. Apparently there are 3.8 million black immigrants in the United States, who probably increased their standard of living enormously.
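As a toy illustration of the per-capita heuristic above, here is a short Python sketch. It uses the rough figures quoted earlier in this thread (African exports and imports of about $35 billion each, and about $19 billion of US international charity in 2012), a population of 300 million, and the crude assumption that every citizen contributes equally, so treat the output as nothing more than a starting point.

```python
# Toy version of the per-capita heuristic: (size of total effect) / (population).
# Dollar figures are the rough ones quoted in this thread; the equal-contribution
# assumption is obviously a simplification.

US_POPULATION = 300e6

def naive_per_capita_share(total_effect_usd, population=US_POPULATION):
    """Expected contribution if you assume every citizen contributes equally."""
    return total_effect_usd / population

print(f"Trade with Africa: ${naive_per_capita_share(35e9):,.0f} per citizen")
print(f"US int'l charity:  ${naive_per_capita_share(19e9):,.0f} per citizen")
```

Note that the trade figure is the value of goods traded rather than a welfare effect, so the two numbers are not directly comparable; the point is only that dividing a headline total by the population gives a first-pass sense of an individual's expected share.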

Well, you can just point out that the amount of effort being put in by the average citizen is much greater for trade and war. You can tell by the amount of focus given to these issues in political debates and the number of interest groups focused on these policies. So the impact per person-hour and dollar that Americans put into producing wars and trade isn't clearly better than the impact per person-hour and dollar they put into producing charity.

I would agree there is more interest in war than international charity. On the other hand it could be that charity is limited in the interest it is capable of drawing, so there is effectively a hidden obstacle, or apathy. This would not pertain to your personal donations (since EAs are presumably not apathetic), but if you were thinking about outreach to build a movement with others, it would matter.

Also, even if changing these policies as a whole is not cost-effective, I don't see why changes orthogonal to partisan disputes would be ineffective as well. For example, EAs prioritize Africa because of its low living standard. On immigration policy, instead of just fighting for more immigration, you might push for less immigration from, say, Mexico, and more from Africa.

I think the default explanation is that it's surprisingly ineffective in practice to try for stuff that requires overcoming intelligent opposition. Obviously there are cases where this doesn't apply, but it does seem reasonably sensible as a default, both from a decision-theoretic perspective and a practical perspective. You quote a 50x impact multiplier; I suppose it depends on how smart you think the opposition is, but it doesn't seem unreasonable that a smart opposition would be able to reduce your impact by 50x.

Yes, your opposition is intelligent, but so are you. I think with politics it's true that your median impact is lower because political policies often depend on getting a majority vote, so typically as an individual you will make zero difference. But your average impact, I think, ought to be fine.

I'm about to donate some money to charity this year. It seems rational to pick one of Givewell's top charities. However, I'm a gay man and so have a special loyalty to gay issues. Unfortunately, no gay rights (or any human rights) campaigns are listed among the top items in terms of effectiveness, so I'm a little conflicted, especially as many of the people benefiting would probably not want to speak to me if they met me and knew what I was.

Then I thought about the movie Pride and the inspiring real events behind it. In the 1980s, a group of gays and lesbians decided to collect money for some miner communities whose lives were being made difficult by the government (because they were striking). The groups ended up breaking barriers, and the miner unions later became significant supporters of gay rights.

So I'm wondering if it's possible to do something similar in the developing world. Donate effectively, helping people unconditionally, but making it clear to the recipients who the help is coming from. Perhaps there's an LGBT group out there that already does this? Or perhaps it's possible to include a personal message in your donation to some effective charity? Or perhaps there's some other way? Any suggestions or advice would be appreciated.

Of course, there wouldn't be the kind of personal connection here as they had with the miners (unless the aid is given by some local gay activists). But then again, the effectiveness of the giving would be much higher, as would (eventually) the amount of money given. And I also might well be motivated to donate more if I get this emotional reward.

I'm soliciting advice from Wall Street businessmen on how to maximize earnings, and they're telling me I shouldn't worry about money and should just follow my passion in life.

I don't know who to believe anymore.

I expect they see a lot of people who are interested in maximising earnings in order to become rich. They infer that this is because the person believes that will make them happy, and they are trying to steer them away from that error by recommending following one's passion (a reasonable heuristic for personal happiness).

I work on Wall Street and have a strong instinctive negative reaction to your question. You sound crass. People who want to get into finance have to pretend to be in it for reasons other than the money.

If you actually want advice I will overcome that visceral reaction and try to help. Could you tell me a little about yourself? Age, sex, education, work experience, interests, computer skills, etc.

You asked them what to do to have an impact in the world? If they don't share your values, their all-things-considered advice isn't automatically valid.

No, just maximizing earnings. It's hard to get people to answer specific questions like that; they prefer to answer the questions which they want you to ask them.

So you asked them how to maximize your earnings, and they objected to the question? Did they say why?

Is it publicly given advice?

Sort of, internet forums

Dale

People might enjoy a joke article I wrote, where I argue that Ashley Madison (the Infidelity website) was an Effective Altruist plot all along. Categorize it under the EA equivalent of crazy startup ideas.

  • Ashley Madison was set up by an activist who wanted to promote ethical behavior and punish the unjust.
  • Firstly, it took money from people who wanted to commit infidelity. Taking money from people makes them worse off.
  • Then, it didn’t provide any services. It never matched any cheaters up.
  • After having handed over credit card details but not received anything, the would-be cheaters realized it was a scam.
  • They can't take Ashley Madison to court, because that would be public record.
  • So they try to get out … but realize Ashley Madison has them in an incriminating position.
  • Ashley Madison extorts more money from them to delete their data.
  • Ashley Madison does not delete the data.
  • Ashley Madison discusses a possible IPO purely for the publicity. It knows it’s a fraud and could never stand up to auditing.
  • Ashley Madison then hacks itself. This explains why they were able to access the data so easily. They had previously hacked another competing service.
  • Ashley Madison then releases the data. This provides early downloaders with the opportunity to extort the would-be cheaters.
  • Eventually all the would-be cheaters are revealed, and face the wrath of their poor spouses.
  • No-one ever trusts an infidelity website again, making it harder to commit infidelity in the future.

So the net result is:

  • Would-be cheaters are effectively fined a significant amount of money.
  • And then exposed.
  • And no-one can ever create an infidelity website.

Do you know that:

a) AM did not verify email addresses? I.e., you could register someone else's email address and they might not know it.

b) AM had users in repressive regimes where non-heterosexuals faced violence/death? For some of AM's users, the promise of a discreet forum represented a less dangerous way to find partners.

c) Additionally, AM was generally known to be a good place for queer/gay/bi/etc. users to hook up even in non-repressive regimes.

d) It's unknown how many users were single or ethically non-monogamous.

e) It's unknown how many users were researchers/journalists/or just simply curious.

I understand your post is a joke; however, it's in poor taste. Also, even if everybody involved was demonstrably a cheater, I don't think it's good for EA's image to be seen as a finger-wagging movement.

To expand on my last point: my understanding of effective altruism is that it is expansive. Generous. About becoming "more the people we wished we were". I do not see it as a movement that ridicules or comes from schadenfreude or is punitive. The AM hack is the result of horribly unethical business and software practices, and its fallout is causing a lot of suffering. That's why I think it's bad for EA's image if 'we' are seen to be joking about it.

its fallout is causing a lot of suffering.

Committing adultery causes a lot of suffering. Punishing people for anti-social behavior is an important part of any society, to incentivize good behavior. Given that western societies hardly punish this behavior at all, despite the huge amounts of suffering it causes, appropriately disincentivizing it could be an extremely effective way of improving the world.