Evaluation in 2011
In June 2011 we came to the end of a grant from the Wellcome Trust to run I’m a Scientist in 2010 and 2011. We’ve evaluated throughout the project, and the final evaluation report is available to download below, along with a shorter executive summary and a summary sheet.
2010-2011 Evaluation Report – at 75 pages long, it contains a lot of information, including the methodology, aims, key figures and data, findings and recommendations.
Executive Summary – this outlines the main findings from the 2010-2011 Evaluation Report and looks at whether the objectives set at the start of the project have been met.
Summary sheet of the event in 2010-2011 – this highlights some of the key numbers and quotes from evaluations, and reasons why the event works so well.
We also produce summary reports of each zone and send them to the scientists and teachers in the zone. These give more facts and figures about the zone – how many students asked questions, how many questions they asked, how many votes were cast, how many answers the scientists gave, how many pageviews the zone received, and so on.
An example of one of these zone reports, from the Space Zone in March 2012, is available here.
How do we evaluate?
Evaluation is absolutely key to understanding what students and scientists have got out of the event, and it helps us improve it continuously. We’ve done two types of evaluation on I’m a Scientist, right from the first pilot event back in 2008.
Formative: We have done formative evaluation throughout the project – asking students, scientists, teachers and stakeholders for their views and acting upon them.
Summative: This is the more formal type of evaluation, where we ask participants for their views before the event starts, observe the event in action in the classroom, and survey views after the event has finished.
One of the benefits of running an online event is that we collect an enormous amount of data about how students and scientists engage with the event. We know how many students asked questions, how many questions they asked, how much they participated in live chats. We can tell the scientists how many students they’ve reached, how many questions they’ve answered and whether they’ve changed the attitudes of those students.
We’re always looking for ways to improve our evaluation, but there’s no point in evaluating if no-one reads and learns from what you’ve done. We’re genuinely interested in hearing what measures you’d like to see. What do you want from an evaluation? Send us an email with your suggestions.
Previous evaluation reports
You can find the evaluation report on the 2010 event here
You can find the evaluation report on the 2009 event here
You can find a summary of the 2008 pilot event evaluation report here