Here at Student Hubs, our 2016-17 year began in September when university students started or returned to their studies. December and January therefore present a great opportunity for mid-year impact analysis, the results of which we are sharing in this post.
We separate impact analysis from programme evaluations. Evaluations focus on which parts of a programme work and which don’t. For example, we ask volunteers how well our training prepared them, we review how students heard about each opportunity so that we can adjust our marketing, and we monitor our equal opportunities data to measure the extent to which Hub activities are inclusive and representative of each university population.
Impact measurement, on the other hand, is about monitoring the outputs and outcomes tied to our theory of change: metrics that we believe can tell us whether or not our activities benefit both students and communities in the way that we intend.
Outputs
Our output metrics monitor how well we engage students and community partners in our work. We want to see growth, as well as an increasing focus on our Key Social Action Activities (Practical Volunteering, Skilled Placements, Incubation), which provide a double benefit to students and communities.
- Across our seven Hubs measuring their impact, we have seen an increase in all but one output metric compared to this point last year. The decrease is a 1% drop in the number of subscribers to The Week, our Hubs’ newsletters.
- We have 821 active long-term and one-off volunteers, a 21% increase compared to the same group of Hubs last year. This is ~55% of our total projection for this year.
- There are 125 students in the Social Innovation Programme, a 20% increase compared to last year. That is 4 cohorts so far, with 4-5 more planned in Term 2 and Term 3.
- We are incubating 15 new projects with students across the network, and 35% of these launched in Term 1. We are behind our projections (18%), but Term 2 is typically busier for incubation activities.
- We have 2,397 event attendees and 2,170 training attendees: 46% and 63% of this year’s projections, respectively.
- Our newsletter open rates have increased at all but two Hubs. Our average open rate is now 21%, up from 17% last year.
Outcomes
Following a pilot last year, we are applying before and after measurement to our student outcomes this year. All students complete a survey at the beginning and end of their activity, and we then take a ‘distance travelled’ measure, which tracks a group’s progress during a programme. Response rates to surveys are always a challenge, and this challenge doubles with before and after surveying, so we have to be diligent about collecting responses.
- Kingston and Cambridge both used the surveys for their Term 1 cohorts of the six-week Social Innovation Programme. Both Hubs saw an improvement in all three of our outcomes:
  - Cambridge saw a 5% increase in participants’ confidence, while Kingston saw a 13% increase.
  - Cambridge and Kingston both saw a 6% improvement in participants’ ability to work with others to make change.
  - Cambridge saw a 12% improvement in participants’ ability to lead others to make change, while Kingston saw a 16% increase.
- We intend to collect and analyse outcomes data from as many of the 1000+ volunteers and skilled placement participants as possible when all of our programmes end in the spring.
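To illustrate how a group-level ‘distance travelled’ measure of the kind described above can be computed, here is a minimal sketch in Python. The 1–5 rating scale, the function name, and the example scores are all assumptions for illustration, not our actual survey instrument or data:

```python
# Hypothetical sketch of a 'distance travelled' calculation.
# Assumption: each participant rates an outcome (e.g. confidence)
# on a 1-5 scale before and after a programme; the group measure
# is the percentage change of the group average from its baseline.

def distance_travelled(before, after):
    """Percentage change of the group's average score, before vs after."""
    if not before or len(before) != len(after):
        raise ValueError("need equal-length, non-empty before/after lists")
    baseline = sum(before) / len(before)
    follow_up = sum(after) / len(after)
    return (follow_up - baseline) / baseline * 100

# Example: one cohort's confidence scores before and after a programme
before_scores = [3, 4, 2, 3, 4]
after_scores = [4, 4, 3, 3, 5]
print(f"{distance_travelled(before_scores, after_scores):.1f}%")  # prints 18.8%
```

Because the measure depends on paired responses, dropping participants who only answered one of the two surveys (as the length check above enforces) is what makes diligent response collection so important.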
Looking ahead
We are pleased with this progress so far and our Hubs are already in the midst of Term 2 activities. Monitoring and evaluating our activities, and sharing our results, is essential for us to prove and improve the impact of our work with universities, students and local communities. We review our outputs data monthly in each Hub, so we will be carefully monitoring progress from this mid-year point until June, when our 2016-17 year ends. In the meantime, you can read last year’s impact report, including case studies, here. Look out for our 2016-17 report later this year.