


By Baxter Bullock, Catherine Low, David Moss and Tee Barnett

Contents

Summary

Our Current Strategy

Summary and Interpretation of Workshop Data

Outreach Data

Quantitative Survey Data

Qualitative Results

Conclusion

Plans for Autumn 2018 and Beyond

The following post details SHIC's 2018 strategic shift, evidence gathered, and future plans. You can learn more about what SHIC does and see our most recent promotional video here.

Summary

In the first half of 2018, Students for High-Impact Charity (SHIC) successfully piloted its workshop model with 855 students across 16 schools in Vancouver. This pilot allowed for a strategic pivot towards a higher-fidelity model for communicating key concepts, more robust data collection, and more opportunities for long-term student engagement. Our data thus far, consisting of both formal feedback (outreach data and the post-workshop survey) and informal feedback (teacher, student and administrator reactions to the program), has helped affirm the value of SHIC in several ways:

  • Demand from both teachers and students for this type of program is substantial.
  • The program is appealing enough that the majority of schools have requested repeat visits.
  • Our limited data shows broad success across a variety of metrics.
  • There is at least some interest from promising students for further engagement.

We believe this data suggests strong value in continuing to test key components of the workshop model.

While our goal in early 2018 was to measure the broad demand for and appeal of our program, we've shifted our focus for the upcoming school year (September 2018) toward engaging promising students in further education and long-lasting involvement. We believe this should be the primary metric for measuring the success of our program.

 

Aside from the workshop model, SHIC is maintaining a scaled-down version of our Student Leader program due to a couple of notable successes. We will also explore how the SHIC program can be implemented in a university setting.

Our Current Strategy

In Autumn 2017, we released a post detailing our reasons for moving away from our Student Leader model, in which we supported interested students in implementing the SHIC program for their peers around the world. Our current workshop-based model focuses on higher-quality content delivery and data collection.

Since our strategic pivot, SHIC carried out the following:

  • Prepared two ‘SHIC Instructors’ to run workshops in high schools.1
  • Reformatted and consolidated the SHIC program to fit into 3-5 hours of programming, the amount of time we recommended schools provide for our workshops.
  • Reached out to high schools throughout the greater Vancouver area.
  • Presented 41 workshops in 16 schools to 855 students in the greater Vancouver area, gathering data, including surveys and charity choices,2 in the process.

There are several reasons we've found this model to be a stronger fit with our overarching goals:

1. Controlling content delivery.

With our previous model, student leaders could run as little or as much of the SHIC program as they wished, cherry-picking elements that suited their needs or starting the program without finishing it. While we agree that some of the program is better than none at all, we also feel that many students were missing out on the impact of SHIC as a cohesive narrative: a program that ties many ideas into a set of overarching concepts.

2. Having a presenter equipped with the knowledge to answer questions and respond to uncertainty.

SHIC was created to spark interest and to provide foundational knowledge and skills. What a student chooses to build upon this foundation will vary with the individual. SHIC Instructors are knowledgeable enough in these concepts to guide students in the right direction. Student leaders would be less equipped to address concerns on the spot, which could lead to miscommunication or misunderstanding.

3. Controlling data collection.

As we detailed in last year’s post, our ability to collect post-program data was severely hindered by attrition. Most student leaders either failed to complete the entire program or were unable to gather sufficient survey data from participants. With our workshop model, we have stronger assurances that post-workshop data will be collected, either through direct contact with the teacher after we leave, or in many cases, by giving students time at the end of the workshop to complete a survey.

4. Beginning long-lasting dialogues with students with high potential.

The interactive nature of the SHIC workshop allows a rapport to form between students and the Instructor, increasing the chances of us continuing the conversation after the workshop. It is far easier for a SHIC Instructor to identify students with high interest and/or aptitude and engage them further. It is also clearer whom an interested student should approach for more information. SHIC is currently experimenting with structured methods by which student engagement can be extended. These are detailed in the "Plans for Autumn 2018 and Beyond" section below.

 

Summary and Interpretation of Workshop Data

Outreach Data

SHIC staff began reaching out to teachers within the greater Vancouver area in December 2017. Our primary method of onboarding schools was cold-emailing teachers of Grade 11 and 12 Social Studies, Social Justice, Theory of Knowledge or Philosophy. We also attended three teacher association meetings to promote our workshop. Of the 125 schools contacted, 25 (20%) responded with interest, and 16 (13%) invited SHIC into the classroom, a much higher response rate than we expected from cold emailing. Based on this, we are moderately confident that the SHIC program is scalable. By the end of the 2017-18 school year, SHIC had presented 25 full workshops (514 students) of three to five hours, and 16 short workshops (342 students) of one to one and a half hours.

Quantitative Survey Data

We administered surveys to students before and immediately after our full 3-5 hour workshops.

These surveys investigated changes in agreement or disagreement (on a 6-point scale) with a range of statements about beliefs, attitudes, plans and behaviours. Students were also asked to select two factors (from a list of five) that would be most important to them when choosing a charity. Two of these factors were coded as 'effective' and scored 1, and three as 'not effective' and scored 0. Students were also asked to choose a charity (or cause) to donate to; charities were coded as 'effective' if they were recommended by selected charity evaluators or otherwise mentioned in our workshop.3 In the pre-workshop survey, this was a free write-in response and donations were incentivised (a random sample of the charities chosen received actual donations); in the post-workshop survey, students were directed to our giving platform,4 where they were asked to donate the $10 they received from SHIC to the charity of their choice.

Data was somewhat limited due to practical difficulties in data collection (we received ~500 pre-workshop responses and ~300 post-workshop responses). In addition, only 168 pairs of responses could be matched, because a large number of respondents failed to enter their anonymous 'matching codes' correctly. As a result, the final sample may be a more highly engaged/attentive self-selected sample.

We conducted paired-sample t-tests on the 168 matched pairs of responses.5
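For readers who want a concrete picture of this kind of analysis, the sketch below shows one way to match pre- and post-workshop responses by anonymous code and run a paired-sample t-test per survey item. This is an illustration only, not SHIC's actual pipeline; the file names and column names (matching_code, item_1, and so on) are hypothetical.

```python
# A minimal, hypothetical sketch of the analysis described above: match pre- and
# post-workshop survey responses by anonymous code, then run a paired-sample
# t-test for each survey item. File and column names are illustrative only.
import pandas as pd
from scipy import stats

pre = pd.read_csv("pre_workshop.csv")    # one row per student, 6-point Likert items
post = pd.read_csv("post_workshop.csv")

# Keep only respondents whose anonymous matching code appears in both surveys.
matched = pre.merge(post, on="matching_code", suffixes=("_pre", "_post"))

items = ["item_1", "item_2", "item_3"]   # stand-ins for the actual survey statements
for item in items:
    t_stat, p_value = stats.ttest_rel(matched[f"{item}_post"], matched[f"{item}_pre"])
    mean_change = (matched[f"{item}_post"] - matched[f"{item}_pre"]).mean()
    print(f"{item}: mean change = {mean_change:+.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```

Because the test is paired, each student's post-workshop answer is compared with their own pre-workshop answer, which removes between-student variation from the comparison but also restricts the analysis to respondents who completed both surveys with a usable matching code.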

In the charity selection task, pre-workshop 325 students stated a charity, 1 of which was coded as an ‘effective’ charity (0.3%). Post-workshop, 131 students stated a charity, 84 of which were ‘effective’ charities (64%). It is reassuring that post-workshop a majority of respondents chose an effective charity, though this is likely partially due to increased familiarity (these charities just having been mentioned in the workshop).

Funding totaling $4,400 CAD was provided to students through the online donation platform. $1,638 was directed to charities by the students, with $1,004 (61%) of donated money going to ‘effective’ charities. The remaining $2,762 has yet to be donated.


We found significant, if small (by our standards), improvements in knowledge, attitudes and values for the majority of our questions. We found null results for three items: "I have a duty to take significant action to help people that are suffering", "I plan to choose a career that will allow me to do as much good as I can", and "I have compared charities against each other on my own time". The first two of these may be partly explained by the high mean responses in the pre-workshop survey. The null result for students comparing charities "on their own time" is unsurprising, as the post-workshop survey was administered immediately after the workshop, before most students would have had time to independently compare charities.

 

On the whole, we view these results as fairly promising; they slightly increase our confidence that SHIC workshops are having a positive effect on students' beliefs and attitudes.

 

Two significant limitations are high rates of respondent attrition and the likely influence of social desirability bias and/or demand effects, as it was likely clear (post-workshop) which responses were desired. In addition, it is hard to get a sense of how significant the impact of these self-reported changes is. For these reasons, in future iterations we plan to place less emphasis on such survey data, focusing instead on evaluating our impact on behavioural changes within a smaller number of highly engaged students. Read more about our plans for further engagement in the "Plans for Autumn 2018 and Beyond" section below.

 

Qualitative Results

Every teacher we worked with gave us positive, sometimes glowing, feedback, and most expressed a desire to have us come to their school again. Some thought our workshop should be compulsory for all students, and some teachers later said that students continued to mention the content of the workshop weeks after we left. The response from students in the classroom was also very positive. In all classes, most students were engaged in the presentations, and in most classes, several students participated enthusiastically in discussions.

Our post-workshop survey included a qualitative question: "Has the SHIC program changed how you might act in the future? If so, how?" There were 122 responses from the 176 students offered this question. 33 students described actions they would take as a result of the workshop, such as donating to better charities, donating more money, or consuming fewer animal products. 36 students gave responses stating their increased motivation to help others or something they had learned, and 13 stated that the SHIC workshop had not changed how they will act in the future. Overall, these responses indicated a high level of endorsement of the ideas in the workshop, but it is hard to know how they will translate into future actions or impact.

To gauge interest levels beyond the workshop, at the end of the 2017-18 school year we made an impromptu attempt to re-engage students who had previously participated in a full SHIC workshop. We first worked to schedule an advanced workshop with interested students, but demand was inadequate and we were unable to find a time that suited more than a few of the students, so the workshop did not go ahead. Further attempts to engage via one-on-one coaching also yielded no results. While this trial could potentially indicate insufficient student interest beyond the initial workshop, we feel that more planning (informing students of opportunities during the workshop and engaging shortly after) and better timing (not at the end of the school year) will result in more student engagement. We remain hopeful that future, more structured attempts to engage students will lead to a more fruitful uptake.

 

Conclusion

Overall, though we are still ascertaining whether SHIC workshops are cost-effective relative to other movement-building initiatives, we feel confident based on the evidence thus far that we should continue to explore this strategy, and moderately confident that we should attempt to expand to other cities in early 2019. Impact assessments of our program will continue for the foreseeable future.

 

Plans for Autumn 2018 and Beyond

The most significant update we’ve made since initiating the workshop experiment in January 2018 is placing more emphasis on the need to create an intentional engagement funnel through which high-potential students can continue their education.

Our current plan is to initiate engagement in two different ways:

  • The first is to run advanced SHIC workshops (called Next Level SHIC) locally for high-potential students from across the schools SHIC visits. These workshops will generally be more free-form, and we will be prepared to provide content and resources specific to the particular students in the workshop. Our goal is to build a self-sustaining local community of young people that allows them to remain engaged throughout high school and university.
  • The second is to offer high-potential students one-on-one coaching in order to create a lasting dialogue and increase the chances for significant future impact. This is likely to involve recommending resources that deepen the student’s knowledge, and could include information about potential career paths, choosing charities for school fundraisers, or the use of rationality and productivity tools.

Key metrics for both coaching and advanced workshops will be the number of students we engage through these methods, and the actions students take as a result of their involvement, such as education and career decisions, pursuing volunteering opportunities with effective charities, and attending EA meetups and conferences.

In 2019, pending analysis of data to be gathered in Autumn 2018, we hope to expand SHIC workshops into one or more additional cities beyond Vancouver.

Other changes we are making for Autumn 2018:

  • We are now placing less emphasis on the future collection of quantitative survey data during workshops, as this data doesn't appear to properly reflect our impact, for reasons outlined in the 'Quantitative Survey Data' section above. As a result, we are now administering a shortened post-workshop survey with qualitative questions after all workshops, and omitting the pre-survey.
  • We are in the process of creating a more mathematically dense workshop, which we have started advertising to Grade 11 and 12 Calculus and Statistics teachers, and we have already had a promising number of responses from teachers. This should allow us to reach students we may not have been able to reach so far.
  • SHIC intends to test methods of engaging tertiary students through guest lectures and workshops in relevant student clubs. We think connecting with tertiary students could be very effective; however, we currently do not know how easy it will be to get an audience in this way.
  • Students around the world will continue to be supported in running the SHIC program as part of a student organization. Despite the high attrition rate of student leaders, the few who are successful are sometimes profoundly affected by the program, and these few students may make the Student Leader model worthwhile. For the 2018-19 school year, most of the recruitment and one-on-one support will be provided by SHIC's volunteer team with staff oversight.


This post was written by Baxter Bullock, with sections written by Catherine Low and David Moss. Editing and research by Tee Barnett, David Vatousios and Marisa Jurczyk. Data analysis by David Moss and Tamara Stimatze. Data collection by David Vatousios and Catherine Low (SHIC Instructors). Thanks to the CHIMP Foundation, who provided us the platform through which we could provide workshop participants money for donations, and a special thanks to all of the schools that allowed SHIC into the classroom and data collection to take place.

 


1. Since April, one of the two instructors is no longer Vancouver-based.

2. At various points throughout the data collection process, SHIC provides workshop participants with $10 donations through an online platform where students can give the money to their choice of charities.

3. 'Effective' charities include Cool Earth, GiveWell and its top charities, Animal Charity Evaluators and its recommended charities, and charities supported by Open Philanthropy and Founders Pledge.

4. Provided by the Charitable Impact Foundation (CHIMP).

5. Information about t-tests, t-values and p-values can be found here.


Comments

Do you think the general knowledge of EA that a typical EA has is sufficient to run a SHIC workshop? It seems to me that having local groups and university groups give EA lectures at high schools on career day is potentially both high impact and a way for those groups to do direct work.

As a former teacher, I'd suggest that better-than-average presentation skills would also be essential for volunteers. But I'm also curious to hear the response to this question.

We experimented with this model in the early days of SHIC and didn't have much success, but it may have been partly because we didn't have the bandwidth to adequately prepare and support the university students who had volunteered. We are considering a second attempt with a stronger training/support system in place. Some of the limitations we suspect are:

  • As @Khorton mentioned, we place a lot of value on good presentation skills in order to engage the students. Thus we prefer instructors to have some background in teaching or public speaking.

  • Better-than-average knowledge of the relevant topics may also be a key component, as participant questions can be fairly complex. We have been slowly compiling a list of FAQs that future instructors can use in training.

  • As a substantial amount of time and effort goes into training an instructor, it may only be cost-effective if that instructor is able to commit to running a fair number of workshops. Many university students/local group members are unable to make that commitment.

We believe that these limitations are potentially surmountable, but haven't made any plans as of yet to test this model further.

Could you put together a handbook and/or video that could be sent to all trainees or is it critical that there be interaction between the trainer and trainee?


This would be along the lines of what we would consider doing if we explore this model further. However, I think there would still need to be a vetting process of some kind so that we can be confident about the quality of the content and delivery.

Perhaps a mentorship model could also work with a few dedicated volunteers. They could shadow and watch a presentation by a trained staff member the first time, then team teach with the staff member 2-3 times, before teaching on their own. This model would hopefully mean that minimal extra staff time is spent on training, but volunteers are still able to deliver high quality presentations.

Great idea, thank you!

Have you tried / considered tracking career plan changes, and if so, do you have any tentative results you could share? (If not, what's your reasoning for not focusing on this more?)

Speaking just for myself here, I think tracking career outcomes for SHIC students is important, but Canadian high schoolers in affluent areas are typically 4-7 years away from being able to start a career, so this may take a while to track well. I also don't expect high schoolers to have meaningful and stable views on their career since it would be so early in their life.

Right, when I wrote "career plan changes" I mostly meant that they end up studying a subject different from their previous best guess (if they had one) at least partly for EA reasons. (Or at a different university, e.g. a top school.)

I really don't think doing your undergrad at a "top school" is as important in Canada as it is in the US or UK, and I'm not sure it's worth the money for a Canadian undergrad to study out of the country.

Great point, and we should have mentioned more about our intention to track things like career or career path changes as a result of the program. We don't currently have data on this because our audience is generally too young to show reliable signs of moving toward an effective career, but part of what we hope to accomplish with extended engagement (detailed in the Plans for Autumn 2018 section above) is to follow high-potential participants more closely so we can monitor changes like that.

We have had several participants state their intention to take actions like this to make a bigger impact, but it is uncertain whether they will follow through.

Thanks, makes sense! Would be great to see such data in the future, though I agree it seems hard to track.

Two significant limitations are high rates of respondent attrition and the likely influence of social desirability bias and/or demand effects, as it was likely clear (post-workshop) which were the desired responses.

It seems to me one indication of social desirability bias and/or selective attrition is that there is a nearly half-point shift in the average response to "I currently eat less meat than I used to for ethical reasons." On the other hand, it's possible students interpreted it as "I currently plan on eating less meat than I used to for ethical reasons."

I wonder if it is possible to add a check for this in a future survey. Maybe ask them if they intentionally conserve their water usage to save the environment? There should be no reason for that to change from pre- to post- without a change in social desirability or attrition.

My guess is the main problem occurs when it is very clear to students what the instructors want them to say. Since we don't talk about water usage, students may not change their answer to a water-usage question even while changing their answers to the questions relating to the content (whether or not they have actually been affected by the program), so such a check might show no shift on the water question but still a shift on the other questions.

We did test out some different social desirability scales in the surveys, which is a common method. The preliminary analysis suggested that social desirability is a factor, but we haven't finished that analysis on the full data yet.

Definitely true on both counts. I suspect that many answers are signalling intentions, but social desirability certainly has a role to play, as we mentioned above. This is one of the reasons we are now placing less emphasis on the future collection of quantitative survey data.

What do you see as a better way of gathering data going forward?

In the future, SHIC will place more weight on our impact on the understanding and trajectory changes of the smaller number of students who progress from the primary workshops (of the kind we described in this report) to our advanced workshops and individual coaching. Because we'll be working more closely with these students and discussing concrete actions (e.g. education and career decisions, pursuing volunteering opportunities with effective charities, and attending EA meetups and conferences), we hope to have much more reliable insight into whether we're actually producing valuable changes in their understanding and plans.