Evaluation is a necessary part of arts and cultural practice. As funding systems become more sophisticated, there is a growing imperative to evaluate the outcomes of activities, not simply to count the activities themselves.
Culture Counts has been working on evaluation planning, data collection, analysis and reporting for arts, cultural and sporting events for over 10 years. We have distilled five rules for conducting best-practice evaluations. Some of these might seem obvious, but it’s surprising how often the basics are messed up! We hope you find these useful.
Rule 1: Understand why you need to evaluate
Understanding the purpose of an evaluation is a common stumbling block for event presenters and funders alike. Before considering what to evaluate and how to evaluate it, you first need to understand why you are evaluating at all.
Many evaluations are one-off happenings with no strategic justification or usefulness outside of the narrow needs of the individual who organises them. Long-term success in evaluation comes from aligning program outcomes with strategic objectives. If no strategic objectives exist, take a minute to record what they might be. If you’re a presenting organisation, consider what outcomes might appeal to funders or sponsors. If you’re a sponsor, publish your desired outcomes before providing funding, and hold funded organisations to account for achieving those outcomes. In all cases, be very clear about your program goals so that you can align them with strategic objectives. This step makes the rest of the evaluation process meaningful and sustainable. Find out more about our Outcomes Framework.
Rule 2: Measure the right things
Once you understand why you are evaluating, deciding what to measure is the next step.
Know the difference between inputs, outputs and outcomes – you can watch our video explaining the difference here. Most organisations measure inputs and outputs and believe they are doing a great job. If you are only counting program expenditure, audience sizes or staff numbers, you will need to think harder about what outcomes your audiences and stakeholders seek from their engagement with you. If in doubt, ask them what their needs are – and then choose measures that directly address those needs. You will probably need one set of questions for your audience, another for sponsors, and others for special-interest stakeholders. An Evaluation Plan is a great place to store this information; a minimal sketch of that mapping follows below.
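As an illustration only, here is one way the stakeholder-to-questions mapping might be sketched out before it is written into a formal Evaluation Plan. The group names and questions are hypothetical examples, not part of any prescribed framework:

```python
# Illustrative sketch: mapping each stakeholder group to the outcome
# questions it will be asked. Group names and questions are
# hypothetical examples, not a prescribed framework.
evaluation_plan = {
    "audience": [
        "It made me feel something (emotional impact)",
        "I gained new insight or knowledge (learning)",
    ],
    "sponsors": [
        "The event strengthened our brand's association with the arts",
        "The partnership delivered value against our objectives",
    ],
    "special_interest_groups": [
        "The program reflected the voices of our community",
    ],
}

# Print a quick summary of who gets asked what.
for group, questions in evaluation_plan.items():
    print(f"{group}: {len(questions)} question(s)")
    for question in questions:
        print(f"  - {question}")
```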
Rule 3: Use the best tools
Now that you know why you’re evaluating and what you want to measure, the question is how to conduct the evaluation.
From old-school manual approaches to the latest in online tools, here are some options:
- Print some questions out on paper and send interviewers out to gather public feedback. Analyse the results in a spreadsheet, make some graphics and write a report (a minimal sketch of this analysis step appears after this list)
- Use an online survey builder, like the Culture Counts Evaluation Platform, to create and distribute your surveys. The platform collects data in real time, generating charts for you to analyse and include in your reporting
When choosing the best approach for you, consider the total cost of gathering, analysing and reporting data. If your chosen option requires a large amount of your own time, that cost should be built into your estimate. Incomplete or unsuccessful research projects are most often caused by underestimating the time required to 'do it yourself'.
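To make the manual option concrete, here is a minimal sketch assuming survey responses have been exported to a CSV file. The file name, column names and 0–100 rating scale are illustrative assumptions, not a format produced by any particular tool:

```python
import csv
from collections import defaultdict

# Minimal sketch of the DIY analysis step. Assumes a hypothetical
# export file "survey_responses.csv" with columns "respondent_group"
# (e.g. audience, sponsor) and "score" (a 0-100 agreement rating);
# adjust the file name and column names to match your own export.
scores = defaultdict(list)
with open("survey_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        scores[row["respondent_group"]].append(float(row["score"]))

# Report the sample size and average score for each group.
for group, values in sorted(scores.items()):
    average = sum(values) / len(values)
    print(f"{group}: n={len(values)}, average score={average:.1f}")
```

Even a sketch like this makes the time cost visible: cleaning, grouping and charting real-world responses by hand quickly adds up.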
Rule 4: Know what to do with insights
Once you understand the insights from your evaluation, the next step is to communicate them.
You can share your report internally: with the management team to inform tactical changes in resource allocation, or with the board to inform more significant changes in strategic direction. You can share it with investors to demonstrate how your outcomes meet their funding criteria, or with the public in support of your brand messaging. In fact, going back to Rule 1, it's important to know what you will do with the results before you start the evaluation process. The worst thing you can do is bury the results because they do not align with your preconceptions – an awkward challenge for those new to evaluation.
Rule 5: Understand how evidence should influence decision-making
Does your organisation practise evidence-based decision-making, or decision-based evidence-making? Insights driven by rigorous research are valuable precisely because they can confirm or contradict presumptions about the returns on investment associated with funded programs. If your decision-making systems are not set up to respond to objective feedback, no evaluation will be useful. If decisions are based on evidence, resources will flow to the activities that deliver the best returns. Remember that you have the power to specify what returns you seek in any of the economic, social, cultural, civic or other domains; and if you follow Rule 2, you will always have the evidence you need to measure the impact of the decisions you make.