This week we hosted a workshop to kick off the pilot phase of the Digital Inclusion Outcomes Framework and evaluation toolkit. Julia tells us more below.
In July we published the Digital Inclusion Outcomes Framework and Evaluate-IT Toolkit, a step-by-step guide setting out four key stages in evaluating the impact of digital inclusion activities and projects. This collection of resources was designed to help local and national stakeholders measure and evidence the wider economic, health and social benefits of their digital inclusion activities.
Now that we’ve created the Framework and Toolkit, we want to find out how these work in practice. We want to know if the Toolkit adds the value we think it will. Will it be easy to use? Will it be too resource intensive for organisations? Is it relevant and useful to those using it?
Basically, we want to make the Toolkit better. To do this, we will spend the next six months running a pilot project with a number of organisations that are evaluating their own projects and using the Toolkit in real life.
To begin the pilot process, we invited representatives from local government, social housing, the private sector and charities to a workshop to develop their understanding of the Toolkit: how it works and how to use it.
Attendees tried out the processes described in the Toolkit, from filling in a Theory of Change table to having a go at questionnaire data entry. This gave them a taster of the evaluation process and helped build their confidence to apply it in their own areas of work. They also gave us valuable feedback and first impressions of the Toolkit.
So, what next?
With the workshop over, evaluation of the Toolkit begins! Our pilot participants will get on with evaluating their projects, following the steps outlined in the Toolkit, and we will seek feedback at crucial points in the process to understand where the Toolkit works well and where it could be improved.
By the end of the pilot, we will have iterated the Evaluate-IT Toolkit on the basis of the feedback we've gathered. It will be tried, tested and improved, ready to go forward as an even stronger tool for effective digital inclusion evaluation.