
A Q1 update and 2018 in review

By Baxter Bullock and Catherine Low

Since launching in 2016, Students for High-Impact Charity (SHIC), a project of Rethink Charity, has focused on educational outreach for high school students (primarily ages 16-18) through our interactive content. In January 2018, we began implementing instructor-led workshops, mostly in the Greater Vancouver area. Below, we summarize our experiences of 2018 and explain why we are choosing to suspend our outreach operations.

Summary

  • 2018 saw strong uptake, but difficulty securing long-term engagement - Within a year of instructor-led workshops, we presented 106 workshops, reaching 2,580 participants at 40 (mostly high school) institutions. We experienced strong student engagement and encouraging feedback from both teachers and students. However, we struggled to get students to opt into advanced programming, which was our behavioral proxy for further engagement.
  • By the end of April, SHIC outreach will function in minimal form, requiring very little staff time - Over the next two months, our team will gradually wind down the workshops we deliver at schools. We plan on maintaining a website with resources and fielding inquiries through a contact form for those looking for information on how best to implement EA education.
  • The most promising elements of SHIC may be incorporated into other high-impact projects - The SHIC curriculum could likely be repurposed for other high-impact projects within the wider Rethink Charity umbrella. For example, it could serve as a tool for engaging potential high-net-worth donors, or as content for local group leaders.
  • We believe in the potential of educational outreach and hope to revisit this in the future - While we acknowledge the possibility that poor attendance at advanced workshops is indicative of the general level of interest in our program and/or EA more broadly, it's also possible that the methods we used to facilitate long-term engagement were inadequate. We think that under the right circumstances, educational outreach could be more fruitful.
  • SHIC will release an exhaustive evaluation of our experience with educational outreach in the coming months.

2018 in review

In late 2017 we made a strategic shift towards a high-fidelity model of student engagement through instructor-led workshops. We tested this model throughout 2018, with our instructors visiting schools in Greater Vancouver, Canada[1].

Most students (56%) participated in a single-session workshop lasting approximately 80 minutes. These workshops consisted of a giving game[2], followed by an overview of the core ideas of effective altruism[3], including coverage of key cause areas. The remaining 44% took part in in-depth, multi-session workshops (typically three sessions), which usually included a giving game, interactive explorations of the topics mentioned above, a cause prioritization activity, and a discussion of effective career paths.

Our goal for the second half of 2018 was to identify high-potential students from our school visits, and engage them further with supplementary advanced workshops at a central location in Vancouver. To gauge initial interest, we used an opt-in approach: interested students could provide an email address to receive more information. We ran a workshop in November which primarily consisted of an in-depth activity on cause prioritization, and a workshop in December focused on effectively creating online fundraisers for the holidays.

Our results

The metrics we identified to gauge our success were:

  1. Teachers and school uptake
  2. Student survey results indicating shifts of opinion and/or behavior
  3. The number of students who continue to engage with the material beyond our initial visit, and the extent to which they remain involved.

We exceeded our expectations for metric 1 - Demand from teachers was higher than initially expected, with 25% of the schools we contacted booking at least one workshop. We also saw some success through representation at local teacher conferences. All told, in 2018 we reached a total of 2,580 students at 40 institutions, mostly in high school Social Studies and Math classes throughout Greater Vancouver[4]. Qualitative feedback from teachers and students, which includes our reads on the reactions of workshop participants, suggested that students were engaged. Teachers generally thought the content and delivery were excellent and well-suited to their curriculum area. The number of teachers who asked us to return for additional classes and/or for the autumn semester after our initial visit serves as further evidence of continued interest in the program.

We met our expectations for metric 2 - Qualitative and quantitative data gathered from pre- and post-workshop surveys showed small but significant improvements in knowledge, attitudes, and values for the majority of our questions, suggesting that students generally shifted their perspectives as a result of our program. However, there is likely a social desirability effect to consider[5]. Additionally, it was unclear to what extent these students were likely to act upon these newfound perspectives.

We attempted to gauge whether these shifts were temporary, but only 37 of the 388 students who were emailed a 3-month follow-up survey chose to start it, so we were unable to draw conclusions from such a small sample.

We did not meet our expectations for the third, and most important, metric - Despite the success of our first two metrics, we struggled to further engage students with opt-in advanced workshops. We feel the turnout was poor enough to suggest that this was not an effective method for SHIC to engage students beyond the initial workshop. 1,247 students were eligible to be informed about our advanced workshops, and we sent out 392 invitations[6]. 17 students expressed interest, five attended our first advanced workshop in November, and two (both returning from the first) attended the December workshop. There may have been logistical reasons why the December workshop was less well attended: the decrease could be attributed to a change in the workshop time (moved to begin on a Sunday morning), its proximity to the holidays, or the particularly discouraging weather (cold and heavy rain).

Based on these results, we’re left with one or more of the following three conclusions:

  1. The students were engaged in the SHIC program in class, but our methods for engaging students beyond the classroom were ineffective.
  2. Students had the will, but not the bandwidth to engage further with the SHIC program.
  3. Students were not as engaged by the SHIC program as our post-program survey data and experience suggested, and were therefore disinclined to participate in further programming.

Our best guess is that all three of these conclusions are true to some extent. High school-aged students in North America tend to engage in a plethora of extracurricular activities, often leaving them with little bandwidth. We also believe that email was not the optimal way to contact students, as it tends not to be a widely used form of communication with this age group[7].

Our reasoning for suspending workshop operations

Our decision to suspend SHIC’s operations was an unusually difficult one because, despite our failure to deliver on our key metric, we still believe educational outreach in some form could be effective. For example, we could have shifted our audience to more targeted groups beyond high schools, or used other methods of engagement within high schools. It’s also possible that SHIC workshops had more of an effect than we were able to measure, such as increasing the chances that students will donate or fundraise for effective charities in the future, or be more receptive next time they encounter EA ideas.

However, as a result of not reaching our goals, we came to believe SHIC is now less marketable to new and existing donors who were also interested in long-term engagement of students. Had this not been the case, we would have more strongly considered exploring the effectiveness of our program with new audiences, such as university students and adults.

Finally, the section below outlines our belief that organizational bandwidth and resources would be more valuable if reallocated to other promising projects, such as RC Forward and the Local Effective Altruism Network (LEAN).

We are nonetheless proud of what SHIC has accomplished, and believe there is a possibility that this program made an impact on many students’ lives, despite our difficulties in tracking that impact.

Looking Forward

A SHIC website containing our materials and a contact form will remain active - While active outreach will be suspended, we think it’s important to maintain accessibility to the user-friendly content SHIC has created. We would also like to put volunteer time and minimal staff time towards advising those who inquire about how best to implement EA education.

A detailed evaluation of SHIC and high school outreach is forthcoming - By May 1, we plan on releasing an in-depth look not only at our experiences with high school workshops, but at all programs SHIC has conducted, as well as aggregate data from other similar projects related to EA educational outreach. It is our hope that this post will provide a foundation upon which experimentation with educational outreach can continue within EA.

Repurposed SHIC resources could be very valuable - A dissolution of one of our projects allows us to reallocate some of the more promising elements of that project towards other initiatives. Through SHIC, we have the resources and staff to make complex philosophical and mathematical concepts accessible to the public. We believe this could prove very valuable for other projects within Rethink Charity, such as educating high-net-worth donors in Canada about effective giving opportunities as part of RC Forward, and providing high-quality materials and guidance for group leaders who wish to run workshops on EA concepts as part of the Local EA Network (LEAN).

A potential return to educational outreach in the future - We still believe that educational outreach could be very impactful, and are pleased that others within the EA community are continuing with their outreach programs. Pending clearance of several hurdles from a strategic standpoint, this could be worth returning to in the future. Those hurdles include reconsidering the optimal audience, methods of data collection, and methods of long-term engagement.

_______________________________________________

Endnotes

1. Prior to becoming SHIC Manager, Catherine Low tested SHIC materials in the classroom as a high school teacher in 2016 and 2017. Her experiences provided evidence that an instructor-based model may be more effective than student-led models.

2. SHIC’s giving game involves students analyzing 3 to 4 charities comparatively, then deciding which charity will receive a donation.

3. In most cases, the term “effective altruism” was not used as part of the SHIC program.

4. In Vancouver high schools specifically, we reached 2,228 students in 31 institutions.

5. We included a social desirability test in our survey to attempt to measure this bias; however, the results have not been factored into our survey analysis.

6. For students who participated in the full SHIC program in the Winter and Spring terms of early 2018, providing a contact email was mandatory. In Summer and Fall 2018, 19% of the students voluntarily gave us a contact email.

7. We considered other (social media focused) methods of communication that ultimately felt inappropriate.

Comments

This was very interesting. Retrospectives on projects that didn't work can be extremely helpful to others, but I imagine they can also be tough to write, so thanks very much!

Just because they didn't get the evidence of impact they were aiming for doesn't mean it "didn't work".  

I understand if EAs want to focus on interventions with strong evidence of impact, but I think it's terrible comms (both for PR and for our own epistemics) to go around saying that interventions lacking such evidence don't work.

It's also pretty inconsistent; we don't seem to have that attitude about spending $$ on speculative longtermist interventions! (although I'm sure some EAs do, I'm pretty sure it's a minority view).

Kudos for pursuing this, and for not being so attached to it that you weren't willing to give it up when the evidence showed that made sense.

Here's a question: do you have thoughts on how this could have failed faster? If you were to go about this again, what would you have done differently in order to spend even fewer resources on it?

Good question Ozzie. At the start of 2018 we mostly focussed on getting into schools and on the surveys (metrics 1 and 2 above), because they were our first hurdles and we were very uncertain about how these would go until we started the project.

However, that meant we didn't optimise our workshops for engaging students long term (metric 3) until several months after starting the project. As a result, we weren't confident in making decisions based on the first indications that we were not meeting metric 3, and ran the project for several more months. If we had planned our long-term engagement strategy at the start of 2018 and set success criteria earlier, we could have learnt what we needed to in less time.

Reiterating my other comments: I don't think it's appropriate to say that the evidence showed it made sense to give up.  As others have mentioned, there are measurement issues here.  So this is a case where absence of evidence is not strong evidence of absence.  

Strong upvote for publishing this summary. Reading it, I feel like I have a good sense of the program's timeline, logistics, and results. I also really appreciated the splitting up of metrics by "success level" and "importance" -- a lot of progress updates don't include the second of those, making them a lot less useful.

Sounds like any future project meant to teach EA values to high-school students will have to deal with the measurement problem (e.g. "high school students are busy and will often flake on non-high-school things"). Maybe some kind of small reward attached to surveys? At $10/person, that seems affordable for 380 students given the scale of the program, though it might make social desirability bias even stronger.

Thanks Aaron. Measurement problems were a big issue. We experimented with incentives a bit: students who completed the post-program survey were entered into a random draw, and those selected would receive money to give to a charity of their choice. That didn't seem to make a difference, or at least we weren't in a position to offer a significant enough incentive to make a noticeable difference.

The other measurement problem that we ran into was knowing that, given the age of workshop participants, in most cases we wouldn't be able to measure actionable impact for another ~5 years.

I think this illustrates a harmful double standard.  Let me substitute a different cause area in your statement:
"Sounds like any future project meant to reduce x-risk will have to deal with the measurement problem".


 

I think that X-risk reduction projects also have a problem with measurement!

However, measuring the extent to which you've reduced X-risk is a lot harder than measuring whether students have taken some kind of altruistic action: in the latter case, you can just ask the students (and maybe give them an incentive to reply).

Thus, if someone wants me to donate to their "EA education project", I'm probably going to care more about direct outcome measurement than I would if I were asked to support an X-risk project, because I think good measurement is more achievable. (I'd hold the X-risk project to other standards, some of which wouldn't apply to an education project.)

Others have said this but it bears repeating: thank you for writing this up! This sort of detailed post-mortem is a resource the whole community can learn from. Kudos!

Thanks for writing this up. Great to see people testing things and then adjusting their plans in light of the results.

This is probably a relatively minor question, but it wasn't something you mentioned so I thought I'd ask: was transportation a problem in people getting to the advanced workshops? I can imagine that, if a student needed to be driven to the workshop, that would make it much harder to attend.

Hi Michael, I'm the Vancouver-based Educator for SHIC and one of two people who ran the advanced workshops (the other being Baxter).

Your question is a good one! Whilst we did our best to accommodate students from all parts of Greater Vancouver - picking a central location that was very close to a train station - the trek for many students would have been quite a long one (Greater Vancouver is very spread out!). Since we currently don't have any feedback regarding this, it's hard to say to what extent students were deterred by the prospect of a long commute, and whether having access to a car factored into their decision to come (I should probably note that some of the students who did attend one or both of the advanced workshops had a long way to come, and made the journey by public transit). My own guess is that it was probably a minor factor overall, but one still worth addressing if we, or anyone else, ever decides to take another crack at something like this.

Online meetings could be an alternative/supplement, especially in the post-COVID world.

I thought this post was a very thoughtful reflection on SHIC and what went wrong in approaching high schoolers for EA outreach, made all the more interesting given that, as of 2021, high school outreach is now a pretty sizable part of EA movement building. SHIC was in many ways ahead of its time. I hope that the lessons learned from SHIC have made current high school outreach attempts more impactful in their execution.

Disclaimer: I am on the board of Rethink Charity which SHIC was a part of at the time of this post, but I am writing this review in a personal capacity as a personal reflection. I did not share this review with anyone at Rethink Charity, so it's quite possible other people might disagree and I may be wrong here, so do not take this as an official Rethink Charity position.

Thanks for this update, and for your valuable work.

I must admit I was frustrated by reading this post.  I want this work to continue, and I don't find the levels of engagement you report surprising or worth massively updating on (i.e. suspending outreach).

I'm also bothered by the top-level comments assuming that this didn't work and should've been abandoned.  What you've shown is that you could not provide strong evidence of the type you hoped for regarding the program's effectiveness, NOT that it didn't work!

Basically, I think there should be a strong prior that this type of work is effective, and I think the question should be how to do a good job of it.  So I want these results to be taken as a baseline, and for your org to continue iterating and trying to improve your outreach, rather than giving up on it.  And I want funders to see your vision and stick with you as you iterate.  

I'm frustrated by the focus on short-term, measurable results here.  I don't expect you to be able to measure the effects well. 

Overall, I feel like the results you've presented here inspire a lot of ideas and questions, and I think continued work to build a better model of how outreach to high schoolers works seems very valuable.  I think this should be approached with more of a scientific/tinkering/start-up mindset of "we have this idea that we believe in and we're going to try our damnedest to make it work before giving up!" I think part of "making it work" here includes figuring out how to gauge the impact.  How do teachers normally tell if they're having an impact?  Probably they mostly trust their gut.  So is there a way to ask them (the obvious risk is that they'll tell you a white lie)?  Maybe you think continuing this work is not your comparative advantage, or you're not the org to do it, which seems fine, but in that case I'd rather you try to hire a new "CEO"/team for SHIC (if possible) and preserve the existing institutional knowledge, rather than suspend the outreach.

-------------------------
RE evaluating effectiveness:
I'd be very curious to know more about the few students who did engage outside of class.  In my mind, the evidence for effectiveness hinges to a significant extent on the quality and motivation of the students who continue engaging.

I think there are other ways you could gauge effectiveness, mostly by recruiting teachers into this process.  They were more eager for your material than you expected (well, I think it makes sense, since it's less work for them!). So you can ask for things in return: follow-up surveys, assignments, quiz questions, or any form of evaluation from them in terms of how well the content stuck and if they think it had any impact.

A few more specific questions:
- RE footnote 3: why not use "EA" in the program?  This seems mildly dishonest and liable to reduce expected impact.
- RE footnote 7: why did they feel inappropriate?

Hi capybaralet, 
Thanks for your comments and enthusiasm for the program!

> I must admit I was frustrated by reading this post.  I want this work to continue, and I don't find the levels of engagement you report surprising or worth massively updating on (i.e. suspending outreach).

I admit that when the decision was made to stop actively working on SHIC, I was pretty sad and frustrated too. However, for our team, and our funders too, the main question was "do we think this is worth continuing compared to other things we could spend our time and money on?", and the answer was "probably not".

You might also be interested in this post which combines the experience of all the EA outreach attempts I was aware of at the time: 

https://forum.effectivealtruism.org/posts/L5t3EPnWSj7D3DpGt/high-school-ea-outreach

This probably will answer many of your questions about why we didn't continue to test out different ideas for engaging students and teachers - we'd already tried quite a few different things and learnt from the work of other EAs. The post is now nearly two years old and there have been other efforts in the EA community to work with high schoolers since then. But I still basically agree with my conclusion, which was:

> I don’t think our outreach described in this post was a particularly effective use of resources. However, outreach could be effective if you are able to attract highly promising students to sign up for a program over a longer term. This might be possible if you have a strong brand (such as an association with an elite university) allowing you to attract suitable students through schools and other networks, and the resources to run a fellowship-type program with these students.
 
To answer your specific questions:
1. There were only a few students who engaged significantly out of class, so it is hard to know what to conclude from such a small number. Some were quite keen on EA concepts; others were eager to do good, but didn't seem particularly excited about applying EA principles to their career path or volunteering. So we didn't feel that the impact these students could have was sufficient to outweigh their small number.
2.  The question of whether to use "effective altruism" was discussed a lot within our team. We ended up using the term a little in the program, referring to EA on our website, providing copies of "Doing Good Better" to the teachers to lend to interested students, and using the term with the advanced workshop students. The reason for not using the term prominently was partly that we felt some teachers/parents might be put off by it, and partly to provide brand separation between the EA community and SHIC - if we used "EA" and did a poor job, that would reflect poorly on EA as a whole. Similarly, if EA got a poor reputation we might still be able to continue with SHIC.
3. Some high schools have policies around teachers not connecting with students on social media (except through official school pages), and we know some schools and parents are cautious about how minors interact over social media, especially with older folks like us! So we were wary about using social media to have one-on-one or small group conversations. Our hope was that the advanced workshop students would eventually drive their own EA group, and make their own social media presence. 

Thank you to everyone who put so much time and energy into this program, as well as into this summary, and for the willingness to share something which may be deemed as not having had the best outcome. Strong upvote for sure.
