Owning the Evidence Agenda in Arts and Cultural Learning

As part of our Learning About Culture project, we're exploring how cultural learning practitioners can support each other to improve the use of evidence and evaluation. With the Beast from the East doing its best to disrupt us, the RSA recently held a series of workshops with the Arts Council Bridge Organisations to scope out a national network of cultural learning 'evidence champions'. Their brief: to move the conversation around evidence from a primary focus on advocacy to one that focuses on learning and improvement.

Evidence and evaluation serve three key functions for any organisation wanting to understand the difference it makes:

  • Accountability – knowing how well I am doing (against agreed criteria)
  • Improvement – understanding how I can improve
  • Advocacy – demonstrating my success and persuading others of the importance of my work.

Tighter budgets and fears about declining arts participation in schools have led to a prioritisation of the third of these functions. 'Proofs' of the transfer benefits of the arts are used to persuade others of their value. The Culture and Sport Evidence Programme, the ACE Cultural Value report and the Cultural Learning Alliance's (CLA) ImagineNation, for example, all use evidence that arts activity supports improvements in non-arts outcomes to advocate for continued support from policy-makers. All have been welcomed by a sector with a strong awareness of its own vulnerability. Conversely, when reports like the EEF's systematic review show that the evidence isn't strong enough to support claims of this kind of impact, they have been received by some as anti-advocacy and harmful to the cause. Criticism is levelled at their failure to include (discontinued) projects that did lead to improved outcomes, as if the point of reviewing evidence were to find the one study that proved once and for all that cultural learning 'works'.

While this interest in evidence to advocate is motivated by good intentions, without a stronger interest in evidence to improve it may end up undermining the quality of the thing it's trying to protect. Our recent conversations with teachers and arts educators at a series of workshops around England revealed that they struggle to find and engage with academic research. Instead, they often look to advocacy campaigns and documents for evidence that justifies their approach. This was reflected in applications to the RSA's Cultural Learning Fund, many of which used headline claims from advocacy campaigns as supporting evidence for the design of their programmes – even when the projects, learning contexts or art forms differed from those in the studies cited.

The trouble is that understanding what works in education is complex, with many independent variables influencing outcomes. Successful approaches tend to be context-dependent: even if an approach works in one classroom, there is no guarantee it will work in the next. When practitioners read claims in arts education advocacy documents such as 'participation in structured arts activities increases cognitive abilities by up to 17%' (which appears in several prominent papers), they often fall into the trap of imagining that because some arts projects have led to impact, all projects can be expected to do so. Many assume that the benefits to learning derive from the art itself and apply equally, regardless of which art form is involved or how learning is supported by the activity's design and delivery. The different cognitive demands of different art forms, the different ways of engaging with art, whether through practice or observation, and the question of what it means for an activity to be 'structured' are overlooked. Using evidence primarily as a tool for advocacy means that, when asked to explain their programme designs, practitioners too often rely on evidence that justifies the presence of the arts in education, not the specifics of what they are going to do.

Different activities work in different ways, and some are designed and delivered better than others, so it is wishful thinking to imagine that the strongest advocacy for arts and cultural education will come from a once-and-for-all-now-leave-us-alone proof of worth. Instead of relying on evidence to advocate for the arts wholesale, those involved in arts and cultural education need to demonstrate a commitment to using evidence to hold themselves accountable for what they hope to achieve and to improve over time. The scoping workshops for the Evidence Champions Network reinforced our recent survey finding that cultural learning practitioners are keen to evaluate their work in order to improve how it benefits participants, but aren't regularly using the best methods. Better use of evidence in the sector requires whole-system action, with cultural organisations, schools and funders all having roles to play in helping practice learn from experience and become more effective. That's why we're keen to listen to our workshop participants and to include practitioners from all these sectors in our network of Evidence Champions (details of which we'll be announcing shortly).

A key part of this shift to what we're calling Evidence Rich Practice is that practitioners should take more ownership of evidence and evaluation. As Caroline Sharp of the National Foundation for Educational Research put it in a recent interview with the RSA, this requires practitioners "to see evidence and evaluation as being for them, rather than primarily for others." If evidence and evaluation are only ever for advocacy, and never for learning, we risk relying on hollow claims of impact and missing chances to make a bigger difference.
