
https://digitalinclusion.blog.gov.uk/2016/02/04/the-evaluate-it-toolkit-what-does-everyone-think/

The Evaluate IT Toolkit: What does everyone think?

Categories: About digital engagement, Digital Economy, User research

 


We are almost halfway through our six-month Outcomes Framework and Evaluation Toolkit pilot process. Following our workshop in October, seven organisations signed up to use and test the Evaluation Toolkit to evaluate their own digital engagement projects. We’ve been following them to find out exactly what they think.

Our seven pilot participants cover a number of bases (local authority, charity, housing association), and the focus of their projects varies widely: upskilling organisation staff, IT training for social housing residents, digital upskilling for people with disabilities, empowering residents to get online and use online services, bringing together engagement activities to address community needs, and providing digital support for young people.

This variety is great: it means we can really test the toolkit under all sorts of circumstances, and each project is using it in different ways. Some are using the whole toolkit, while others pick and choose the sections most relevant to them. Some organisations are using the toolkit in conjunction with other guidance, and some are asking partners to carry out evaluation independently, guiding them through the Toolkit as a step-by-step resource to complete the evaluation process.

So, what do they think?

Everyone thought it was (mostly) great! …

The Toolkit is a great catalyst for the evaluation process. It does the preparation and planning for you, so you can get on with engaging your users and analysing the data. This takes out some of the resource intensity of doing evaluation.

The guide is flexible: depending on the sort of evaluation you are doing, you don’t have to use the whole thing. You could focus primarily on the outcomes and data collection resource alone, or skip to the second step if you have already engaged with your stakeholders.

The language is easy to follow, the definitions are useful, and the hints are helpful (especially the one about starting your report as early in the process as possible!).

The outcomes and indicators have been received well. They are mostly simple and straightforward, and generally cover the needs of all the different projects in our pilot. Organisations have had to add some new questions, but found that the ones already provided acted as useful templates to adapt to their needs.

The fact that the toolkit is a proper published ‘thing’ gives the evaluation process some purchase and authority. People have found this helpful when justifying evaluation methods, processes and indicators.

… But there are some key areas for improvement.

Formatting and instructions could, in some places, be improved.

Parts of the Evaluation Toolkit have caused confusion and we are looking to make them clearer.

There was mixed feedback on the Toolkit’s length. Some think that we should develop ‘add-on’ documents with extra information on certain areas, like ethics or how to use the toolkit in conjunction with funding requirements. Others think that it is already too long and quite daunting, especially for smaller organisations with less resource. It was suggested that we make a ‘lite’ version to tackle this.

What about the things that get in the way?

The toolkit takes you through evaluation as an ideal, but sometimes things don’t run so smoothly; a funder might ask you to go through the process in a certain way, or you might not be able to collect data the way the toolkit suggests.

People asked for a troubleshooting piece to help when things don’t go as planned.

Finally, data collection in the toolkit focusses mostly on the quantitative. Qualitative data features primarily in step one (engaging stakeholders), but for many it also plays a vital role later in the process. It enriches quantitative responses with added detail, and lends itself better to alternative data collection methods, like gathering data through blogs and social media. People think that this could be better acknowledged.

What next?

We will be back in touch with our pilot participants in the coming weeks and months to find out how they’re getting on as they delve deeper into the evaluation process and the Toolkit. The more we can find out about practical use of the Toolkit, the better future iterations will be.

If you are looking to begin evaluation on a digital engagement project, give our Outcomes Framework a go! We’d love to know how you get on with it and if it improves your evaluation experience.
