
Evaluation - Focus on 'Diversity'

I delivered a webinar for the Diversity Arts Australia Fair Play Program on evaluation with a focus on ‘diversity.’ You can access the slides here. I have copied in the main ones below.

There is also a lot more detail on various aspects of evaluation in the other resources on this website if you want to get into the nitty gritty of developing an impact framework or understanding theory of change.

Evaluation is an exercise in power

When evaluating a project which involves people with lived experience of marginalisation, it’s very important to understand that evaluation is not neutral, but is an exercise in power. It can be an exercise in sharing power, giving it away; or it can be an exercise in entrenching power, and maintaining the status quo.

[Slides 3 and 4]

I recommend that before you start the process of evaluation, ask yourself honestly:

Will your evaluation itself make things better or worse for the people you would like to help?

[Slide 5]

As a rule of thumb, when you go into evaluation of projects with people with lived experiences of marginalisation, remember your starting point should be: justice, co-design, working with evaluators with lived experience, and ownership.

[Slide 12]

When you decide what to evaluate, the most important thing to do is: ask the intended ‘beneficiaries’ of the project. I use quotation marks for ‘beneficiaries’ because it is not necessarily the case that your project is benefiting others more than it is benefiting your own organisation - see above re power.

What do the people, who are the intended beneficiaries of this project, want to see change?

[Slide 13]

There are a bazillion ways to collect data. The main ones are: self-assessment, observations (be super careful with these, so you don’t disempower the people you are trying to support), interviews and questionnaires.

I generally recommend questionnaires (which someone asks you) over surveys (which you send out and hope people respond to). This is because people with lived experiences of marginalisation tend to be under-represented in survey responses.

[Slide 28]

Play devil’s advocate. Try to prove the opposite of what you hope are the outcomes of your project.

[Slide 35]

All the way through the evaluation you need to check in on a range of ethical considerations around consent, privacy, risk of harm and power.

[Slide 40]

Make sure you have put in place the support people need to respond to your evaluation.

[Slide 41]

When you design your tools, make sure you work with ‘critical friends’ who also have lived experience of the forms of marginalisation of the group you are working with. And be prepared to change things.

[Slide 42]

We often use a ‘before and after’ tool, which allows people to jot down words or do drawings of how they felt ‘before’ and ‘after’ a project experience.

[Slide 43]

I also love Voice Lab by Polyglot, which allowed kids to say what they really felt in a beautiful private cocoon space.

Once you have your data, again try to hand the power over to the people with lived experience of marginalisation.

[Slide 46]

Top Tips

Here are my top tips for evaluating projects which involve people with lived experience of marginalisation.

[Slide 48]

Values-based planning

I recently delivered a short presentation for the Theatre Network Australia (TNA) as part of a seminar entitled ‘Planning for a future we can’t imagine.’ I am including here a quick overview of the main points:

  • Theory of Change as an impact-driven planning tool

  • Action research as a useful model for iterative planning

  • Values-driven planning

TNA also has a useful bunch of links to other planning tools and resources.

[Slide 2]

Theory of Change

[Slide 3]

The key to a good Theory of Change is to work backwards from the big picture goal to the activity, rather than the other way around. A Theory of Change has the following elements:

  • Big picture goals - these are your legacy goals, the difference you want your project or organisation to make to the long arc of history

  • Project impacts - these are the medium to long-term impacts that your project or organisation needs to achieve to contribute to the big picture goals

  • Outcomes - these are the intermediate or short-term outcomes that your project or organisation needs to achieve to get to the project impacts

  • Activities - this is what your project or organisation will do to lead to the outcomes.

You decide your activity based on a simple equation:

What will lead to the change we want to see?

+

What can we do that no one else can?

For example, I might want to contribute to world peace. To do that, perhaps I should become an international human rights lawyer. But when I tried to do a law degree, I hated it. So what can I do that no one else can? I can write this blog post, for starters - and work in the arts for social change.

[Slide 4]

Below is a rough example of what a Theory of Change might look like for an art gallery which wants to support cultural equity and justice.

The most useful thing about a Theory of Change is the arrows. The example may look like an electrical circuit diagram, but it is essential that you use a Theory of Change to unpack your cause-and-effect assumptions. Adding the arrows allows you to critically evaluate:

Will X really lead to Y, or am I basing that on false assumptions, poor research or unconscious bias?

What else do I need to do to make sure X leads to Y?

Who do I need to ask to make sure my assumptions are valid?
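If it helps to make the arrows concrete, here is a minimal sketch in Python of a Theory of Change as a set of cause-and-effect links, each carrying the assumption it relies on and a note on who has checked it. Everything in it (the gallery example, the assumptions, the advisory group) is hypothetical and just shows the structure, not a template you must follow.

```python
# A minimal sketch: a Theory of Change as a list of "arrows" (cause -> effect),
# where each arrow records the assumption it relies on and who validated it.
# The gallery example and all of its content are hypothetical.

from dataclasses import dataclass

@dataclass
class Arrow:
    cause: str         # the activity or outcome at the tail of the arrow
    effect: str        # the outcome or impact the arrow points to
    assumption: str    # "X leads to Y because..." - the thing to test
    checked_with: str  # who you asked to validate the assumption ("" = not yet)

theory_of_change = [
    Arrow(
        cause="Exhibitions co-curated, and paid, with artists of colour",
        effect="Audiences see their own communities reflected in the gallery",
        assumption="Representation on the walls makes audiences feel welcome",
        checked_with="Paid community advisory group",
    ),
    Arrow(
        cause="Audiences see their own communities reflected in the gallery",
        effect="The gallery contributes to cultural equity and justice",
        assumption="Feeling represented shifts who holds cultural power",
        checked_with="",  # not validated yet - this arrow needs checking
    ),
]

# Check-in: flag every arrow whose assumption has not been tested with the
# people it is actually about.
for arrow in theory_of_change:
    if not arrow.checked_with:
        print(f"Unchecked assumption: {arrow.cause} -> {arrow.effect}")
```

The point of the sketch is simply that every arrow is a claim, and every claim needs an owner who has checked it with the people affected.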

[Slide 5]

For more details about Theory of Change, go to our dedicated resource. Also thanks to Ian David Moss for schooling me in Theory of Change in the first place.

Action Research

Now let’s turn to action research. This is the term given to a cycle of observe, reflect, plan and act. You may have heard this called all sorts of things:

  • In the innovation field this is called the fast fail model

  • In design it might be called beta testing or design-led thinking

  • In business it is called agile project management

In essence, it just means:

  • Do something (‘Act’)

  • See how it goes (‘Observe’)

  • Ask yourself and others what worked, and what needs to be done differently (‘Reflect’)

  • Get ready (‘Plan’)

  • Do something a bit different this time (‘Act’)

  • etc

[Slide 6]

Values-Based Planning

‘Values-based’ or ‘values-driven’ planning is about checking in with your values, rather than your KPIs or immediate metrics, to make sure you are on track. It is a useful grounding exercise, to bring you back to what you really believe in and the change you want to see in the world as a result of your existence.

Values-based planning is super useful for individuals and organisations, and is something I do in general, with my life.

For an organisation, checking in with values makes sure you are not working your tail off to achieve what Vu Le calls the organisation’s ‘shadow mission.’ For example, you may have set up your organisation wanting kids to experience live music. In a Theory of Change, this is actually an outcome - not a goal.

Big picture goal: Kids have access to a source of replenishable joy throughout their lives.

Impact: Kids aged Y in X neighbourhood know where to find music and how to enjoy it.

Outcomes: Kids have access to live music experiences; kids enjoy the experiences; the experiences are accessible; the venue and logistics all work well.

Activity: Set up live music experiences for kids (from X neighbourhood and Y age range); work with schools and families to make sure kids can get there (e.g. look at transport, timing, costs).

If you understand this, then you can stay focused on the big picture goal.

But what often happens is, your organisation becomes focused on building its brand, or fundraising, or beating competitors.

Over time, instead of being about helping kids access joy, most of your activities seem to become focused on organisational survival, and fighting with other organisations which have the same mission, over resources.

[Slide 7]

Values-driven planning is about checking in and challenging yourself and your organisation.

Identify

  • What are your values? Be honest and specific! For example, I might say my values are cultural equity and justice, but are they really? Or as a writer and a woman, are my values actually ‘cultural equity and justice for women of colour working in literature?’

  • For example, you might say your values are to help people experience the joy you experience when you see theatre. Which is lovely. But let’s unpack it. There are two potentially conflicting goals here: joy for people, and theatre. They might not go together for everyone, so get specific.

  • Also - which people? If your organisation ends up with a price point for its activities too high for the majority of society, then you have to acknowledge that your organisation is actually delivering ‘the joy of theatre experiences for the wealthiest people in my city.’ Which may not be where you started, or what you value. So that means you have to challenge what you are doing.

Ask

  • Who do you need to ask to make sure what you want to do matches what you say you value?

  • For example, if I say I am all about cultural equity and justice for people of marginalised identities, I need to bring on people of marginalised identities into ownership, planning and decision-making. Otherwise I am just feathering my own nest. If your organisation says it wants to help, say, First Nations Australians access employment opportunities, then you have to have as a goal the transfer of power and ownership to First Nations Australians. Otherwise your ‘shadow mission’ starts to take over, and instead of transferring power, you start to hoard it for your organisation’s survival.

Design

  • Co-design activities which will enact your values. Again this is about making sure that, if you have social change goals, you centre the people you want to support.

Act

  • You implement the activities - again centring the people you want to support in delivering your activities.

Measure and reflect

  • Check in. Are your activities actually achieving your values? Or are there unintended or unforeseen consequences that are not achieving your values, or may even be undermining them?

  • For example, I once worked with an organisation which wanted to support better understanding and safety for people of marginalised identities, via online videos marketed to the general public. But these videos generated so much Internet vitriol that the project was unsafe for the intended beneficiaries.

Change

  • Make changes to meet core values. In the example above, the organisation ‘pivoted,’ shifting the target audience to other people of marginalised identities who could create a wider, safe community for the target beneficiaries.

Communicating results

Communicating evaluation findings

You’ve collected all your data, you’ve analysed it and you’ve written the report.  Job done?

Well that depends. 

There are a few more questions to ask yourself. For example:

  • Are my key stakeholders likely to actually read my report? 

  • If my results are persuasive, will a report alone do the trick in getting the word out?

Usually, in my experience, the answer is, ‘probably not.’

That’s why it is really important, from the very beginning (if possible), to think about how you’d like to communicate your results. 

Why it is important

A lot of time and effort goes into conducting evaluations.  Thinking through how you will communicate your results will help you get the most ‘bang for buck’ out of that effort. For example:

  • You may have useful lessons to share with the project team and/or the sector

  • You may have some compelling or persuasive statistics or stories that could be used to support things like:

    • brand building

    • fundraising

    • influencing (policy-makers, senior management etc.)

But don’t confuse evaluation with advocacy. Evaluation results can be used to support advocacy, but advocacy mustn’t drive the evaluation (more on that later).

What kind of communication materials are there?

Good communication materials pull key information from your report and present it in an engaging, accurate and easy to understand manner.

Common materials include:

  • Executive summaries or summaries of findings

  • Fact sheets

  • Infographics

  • Video presentations

  • Content for web or social media campaigns

  • Conference presentations

  • Articles, press releases etc.

Ethical considerations in communicating results

When using findings outside the context of the report, it is really important to be ethical. All the same ethical considerations apply as when you report your results (see Tips for Writing an Evaluation Report). But it is also important to think through how they apply to the new context in which you are reporting.

There are also some other considerations you may need to think through. For example:

  • Don’t cherry pick only positive results

  • Don’t confuse advocacy with evaluation activity. You can use evaluation findings to support advocacy, but advocacy shouldn’t drive the evaluation. You must create a safe space for people to tell you confidentially about their experience, good or bad.

For example, if you want to bring your results to life with interviews, that’s OK, but do it as a separate exercise to any interviews you have planned for the evaluation. Interviewees might feel pressure to tell you only the things they think you want to hear on camera (i.e. social desirability bias).

For your inspiration…

There are many examples of great communications materials out there for your inspiration.  You have probably come across many yourselves.

Here are some examples that inspire me….

Helpful tools

Here are some handy free and low cost tools I have used in the past to help with producing communications materials for evaluation results.  There are lots out there.  Find the ones that you like and work best for your needs. 

Free/low-cost graphic design software: provides great adaptable templates for reports, infographics, social media content etc.

Two I have used in the past are:

Explainer video software: provides great templates for designing video presentations

One I have used in the past is:

This video gives examples of these tools in action

How do I decide what is necessary?

Start by developing a communication strategy, sometimes known as a dissemination plan. It will help you identify the best methods to communicate your results and the resources you’ll need.   

Top tips

  • Don’t do everything

  • Prioritise the things that will resonate with your audience

  • Allocate budget and time to do these tasks e.g. photographer, videographer, software, graphic designer etc.

  • Assign responsibility

  • Build in time for learning design software.  It is designed to be user-friendly but it is easy to get carried away!

  • Put your plan together asap because you may need to collect materials during your project (in addition to research data) e.g. photos, video etc.

The diagram below steps you through a dissemination plan if you are doing it for the first time.

[Diagram: steps for developing a dissemination plan]

Step 1. Identify your audience

The first step is to work out who you want to communicate your results to.  This might include the following stakeholders:

  • The project team

  • Funders

  • Participants

  • Your followers

Step 2. Articulate why you want them to know about your results

Spelling out why you want each group to know about your results will help you work out what materials are going to be most suitable. Some reasons might include:

  • Celebrating the project

  • Accountability requirements

  • Advocacy

Step 3. Identify any special requirements

The next step is to think about whether anyone has any special requirements.  For example:

  • Are you dealing with time-poor people who won't have time to read a report?

  • Is English a second language for some stakeholders? Would they respond better to audio visual materials etc.?

  • Do any stakeholders have accessibility requirements?

  • Do some of your stakeholders have stakeholders of their own? For example, your Executive might need to report to the board; what kind of materials might be useful for them?

Steps 4-5. Identify appropriate materials and resources you’ll need

Steps 2 and 3 will help you work out the most appropriate materials for each group, i.e. the materials they will respond to and that will meet their needs.

The final step is to work out which materials to prioritise and whether you can afford them.
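If you like to rough things out before filling in the template below, here is a small sketch in Python of what a couple of dissemination plan rows (steps 1-5 combined) might hold. The audiences, reasons, materials and resources are invented purely for illustration.

```python
# A rough sketch of two dissemination plan rows (steps 1-5 combined).
# All audiences, reasons, materials and resources are invented examples.

dissemination_plan = [
    {
        "audience": "Funders",
        "why": "Accountability, and making the case for future funding",
        "special_requirements": "Time-poor; unlikely to read the full report",
        "materials": ["Two-page summary of findings", "Infographic"],
        "resources": ["Graphic designer (half day)", "Design software"],
        "priority": 1,
    },
    {
        "audience": "Participants and their families",
        "why": "Celebrating the project and closing the loop",
        "special_requirements": "English is a second language for some; access needs",
        "materials": ["Short captioned video", "Plain-language fact sheet"],
        "resources": ["Videographer", "Translation and captioning"],
        "priority": 2,
    },
]

# Prioritise: list materials in priority order so you can trim to budget.
for row in sorted(dissemination_plan, key=lambda r: r["priority"]):
    print(f"{row['audience']}: {', '.join(row['materials'])}")
```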

Can I have a template?

Why yes! Here is a template that I have used in the past. It also includes a worked example.

Evaluation results dissemination plan template

[Screenshots: dissemination plan template with worked example]

Qualitative Methods - A Non-Exhaustive List

Sometimes, people shy away from qualitative methods, thinking that they are only descriptive and don’t give you “hard” data. In fact, you can use qualitative methods to gather deeper insights, as well as to derive quantitative outcomes. All you need is the right questions and the right coding techniques.

In this section we look at some key qualitative methods:

  • Observation

  • Interviews and focus groups, including how to code results

  • Arts-based evaluation

Observation

Play Me, I’m Yours, Artist: Luke Jerram, Arts Centre Melbourne 2014. Source: Bailey & Yang Consultants (2014) Play Me, I’m Yours Project Evaluation (Betty Amsden Participation Program), Arts Centre Melbourne, Melbourne.

Observation: Pros and Cons

[Table: pros and cons of observation]

Observation: Steps

[Diagram: steps for conducting an observation]

Observation Form: Example

[Example observation form]

Interviews and Focus Groups

Interview styles can range from open-ended through to quite formally structured. We tend to conduct semi-structured interviews, using a discussion guide to make sure we cover key areas but allowing us flexibility to follow the interviewee’s line of thought. We also make sure there are a handful of key questions we ask all the interviewees to ensure we have some consistency.

With some groups, e.g. young people, I prefer to conduct interviews with “friendship triads”, which just means groups of three people who know each other. This seems to help people bounce off each other. But one-on-one interviews are fine.

Focus groups are useful when you have to talk to a lot of people, but you do need to try to avoid group-think. We tend to limit online focus groups to 4-5 people, and in-person groups to 5-8 at the most.

Walking interviews are a useful approach when you are interested in people’s relationship to a particular place. They are also great if you want to hand the “power” back to the interviewee, so you are not in the position of expert any more, but in a position of not-knowing and learning from the interviewee.

Interview Process

The following is a rough process when using interviews to evaluate programs:

  • Stage 1: Design the interview guide

  • Stage 2: Conduct the interview

  • Stage 3: Transcribe the interview

  • Stage 4: Code the interview

  • Stage 5: Analyse the codes

  • Stage 6: Report, and add illustrative quotes for key thematic findings

We ask permission to record interviews and focus groups so we can transcribe and code later. This is good to do, because it helps us avoid simply summarising our own interpretations of what people have said.

In the pre-amble to an interview, we always emphasise a few key points:

  • We ask permission to record

  • We explain that we are just here to listen and understand, and that there are no right or wrong answers

  • We ask for their frank and honest feedback, because that is the only way to improve a program

Sometimes we ask an interviewee to reflect on an experience and tell us about moments that have stayed with them. It can be useful to flag that you are going to ask this, e.g. “You don’t have to answer this now, but later it would be great to hear about a moment that has stayed with you, if any…”. This allows the question to simmer away in the back of the interviewee’s head so they aren’t taken by surprise.

It can also be a good idea to ask your most “important” questions towards the end of the interview. By then you have hopefully established some rapport with the interviewee.

Throughout the interview, be mindful of your body language, the noises you make, your facial expression - anything which might unconsciously bias the interviewee to try to tell you what you want to hear. A lot of communication is non-verbal. This doesn’t mean you have to be deadpan - but whatever they say, you have to be ready to be sympathetic and encouraging. Your job is not to get them to answer questions in a particular way, but to let them reflect on an experience. You are like a journal, with prompts.

Coding

Coding is the thing that seems to scare people so I will run through it here. Basically, coding means reviewing the interview transcript and identifying the key themes that the person has talked about.

You’ll start out with fairly verbatim themes, then gradually, as you do more transcripts, common themes will start to emerge and you can start grouping them.

It’s a cumulative process: you start out allowing for a fair bit of detail in a theme, and then gradually refine it.

The way I do it: I print out the transcript and jot down on the left-hand side the category of enquiry the person seems to be talking about, and on the right-hand side I jot down the theme of what the person is saying. It looks something like the below.

What my first stage of “coding” looks like.

The “themes” become the initial codes, and the “categories” become the areas of enquiry which I gather the themes against.

From here, I start to transcribe my themes into a spreadsheet, which ends up being very wide - the images below show parts of it. Because a person might say more than one thing relating to a category, you have to include multiple columns for each category.

[Spreadsheet excerpts: coding categories and themes]

I type in the “code”, e.g. under personal goals I might include “working with artists from overseas”. I do this for each thing the participant said. Gradually I refine the codes so they start to cohere, e.g. “working with artists from overseas” could eventually be grouped under “artistic development”.

Categories may correspond to some of the main areas of questions in the interview. However, don’t assume this is the case: there might be some unexpected comments which it is important to include and categorise.

The categories may also match up with the areas I identified as part of my impact framework. But they won’t always match up, which is why it is important to code the interviews as they are, and not attempt to code them using your impact framework language in the first instance. You have to be as objective as possible, and not subconsciously shoe-horn the interviewee’s comments into what you thought the impact would be.
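If you prefer to build the coding spreadsheet programmatically, here is a minimal sketch of the same idea using Python and pandas: one row per interviewee, multiple code columns per category, raw codes gradually refined into broader themes and then counted. The interviewees, categories and codes below are invented for illustration, not from a real evaluation.

```python
# A minimal sketch of the coding spreadsheet: one row per interviewee,
# multiple code columns per category (because people say more than one
# thing per category), then a frequency count of the refined themes.
# All interviewees, categories and codes are invented for illustration.
import pandas as pd

coded = pd.DataFrame([
    {"interviewee": "P01",
     "personal_goals_1": "working with artists from overseas",
     "personal_goals_2": "building confidence",
     "barriers_1": "cost of travel"},
    {"interviewee": "P02",
     "personal_goals_1": "learning new techniques",
     "personal_goals_2": None,
     "barriers_1": "caring responsibilities"},
])

# Gradually refine raw codes into broader themes.
refine = {
    "working with artists from overseas": "artistic development",
    "learning new techniques": "artistic development",
    "building confidence": "personal growth",
}

goal_columns = ["personal_goals_1", "personal_goals_2"]
themes = (coded[goal_columns]
          .melt(value_name="code")["code"]   # collapse the columns into one list
          .dropna()                          # ignore empty cells
          .map(lambda code: refine.get(code, code))
          .value_counts())
print(themes)  # e.g. artistic development: 2, personal growth: 1
```

This is also how qualitative coding turns into quantitative data: once the codes are refined, you can count how often each theme appears.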

Templates

Here are some templates for interview discussion guides, note templates and coding templates.

Walking Interviews

Walking Interviews were used in the “Connected Lives Project” (UK). These are examples of the routes and photos taken on two walking interviews. Source: Clark, A. and Emmel, N. (2010) “Realities Toolkit #3: Using Walking Interviews.” Realities, part of the ESRC National Centre for Research Methods, Manchester University, UK.

Walking interviews are interviews which you do whilst walking. They are useful when you want to understand people’s relationship with a particular place. They are also useful when you want to hand the “power” back to the interviewee, so you are not in the role of “expert” - the interviewee chooses the route and may talk more comfortably in motion.

Walking Interviews: Pros and Cons

[Table: pros and cons of walking interviews]

Walking Interviews: Steps

[Diagram: steps for conducting a walking interview]

Arts-Based Evaluation


When you are working in the arts, it seems to make sense to use creative tools to evaluate the impact of a program. However, it can be tricky to interpret the artefacts which participants create.

Arts-Based Evaluation: Pros and Cons

[Table: pros and cons of arts-based evaluation]

Arts-Based Evaluation: Steps

[Diagram: steps for arts-based evaluation]

Arts-Based Evaluation: Examples

Youthrex Webinar, 2015

Art Jam 

Art Jams are gatherings of individuals who make art through collaboration, improvisation and responsiveness. They are like focus groups, but meaning about an evaluation topic arises through the interactive mode of art-making rather than conversation; facilitators can document the jam to collect important data about the experience.

Transformational Self-Portraits

Participants create a series of self-portraits at different intervals of a project (painting, drawing, collage, or any other artform). 

  • Stage 1: Participants depict how they see themselves, things that have influenced their sense of self, and their motivation for participating in the project.

  • Stage 2: Participants represent themselves in what they see as their role or contribution to the project.

  • Stage 3: Participants create a portrait of how they have changed as a result of the project and/or how they imagine taking what they have learned to shape their future selves.


Before and After Tool

We use this quite a lot, especially with kids and young people. We ask respondents to reflect on how they were before the activity and afterwards. Then we code the responses according to emotional valence and arousal, to see if there has been an overall change in mood amongst participants.

Use this Before and After Coding Spreadsheet Template to code responses. You can also use this type of spreadsheet to code the transformational self-portraits.
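As a rough illustration of that coding step, here is a small sketch in Python. The word-to-score mapping below is invented for illustration (it is not a validated scale), and in practice you would code words and drawings by hand into the spreadsheet above before comparing averages.

```python
# A rough sketch of coding "before and after" responses for emotional
# valence (negative to positive) and arousal (calm to energised).
# The word-to-score mapping is invented, not a validated scale.

valence_arousal = {   # word: (valence, arousal), each roughly -1 to 1
    "bored":   (-0.5, -0.6),
    "nervous": (-0.6,  0.6),
    "happy":   ( 0.8,  0.5),
    "excited": ( 0.7,  0.9),
    "calm":    ( 0.4, -0.7),
}

responses = [         # one (before, after) word pair per participant
    ("bored", "excited"),
    ("nervous", "happy"),
    ("bored", "calm"),
]

def mean_scores(words):
    """Average valence and arousal for a list of coded words."""
    valences = [valence_arousal[w][0] for w in words]
    arousals = [valence_arousal[w][1] for w in words]
    return sum(valences) / len(valences), sum(arousals) / len(arousals)

before_v, before_a = mean_scores([before for before, _ in responses])
after_v, after_a = mean_scores([after for _, after in responses])

print(f"Valence: {before_v:.2f} -> {after_v:.2f}")
print(f"Arousal: {before_a:.2f} -> {after_a:.2f}")
```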

Sources

Better Evaluation, “Collect and/or Retrieve Data.”

Burns, L. and Frost, J. (2010) “Arts Informed Evaluation: A Creative Approach to Assessing Community Arts Practices.” Arts for Children and Youth / VIBE Arts, Toronto, Canada.

Clark, A. and Emmel, N. (2010) “Realities Toolkit #3: Using Walking Interviews.” Realities, part of the ESRC National Centre for Research Methods, Manchester University, UK.

Daykin, N. (2015) “Creative and Arts Based Evaluation Methods.” Creative and Credible.

Kinney, P. (2017) “Walking Interviews.” Social Research Update, Issue 67, Summer, University of Surrey.

Macfarlane, A. (2017) “Non-Participant Observation.” Better Evaluation, Australia and New Zealand School of Government (ANZSOG), Melbourne.

Searle, M. (2016) “Capturing the Imagination: Arts-Informed Inquiry as a Method in Program Evaluation.” Canadian Journal of Program Evaluation, Spring: 34-60.

Spicksley, K. (2018) “Walking Interviews: A Participatory Research Tool with Legs?” The BERA Blog: Research Matters, British Educational Research Association.

Van der Vaart, G., van Hoven, B. and Huigen, P.P.P. (2018) “Creative and Arts-Based Research Methods in Academic Research: Lessons from a Participatory Research Project in the Netherlands.” Forum: Qualitative Social Research (FQS), Vol. 19, No. 2, Art. 19, May.

Wang, Q., Coemans, S., Siegesmund, R. and Hannes, K. (2017) “Arts-Based Methods in Socially Engaged Research Practice: A Classification Framework.” Art/Research International: A Transdisciplinary Journal, 2(2): 5–39.

Wimpenny, K. and Savin-Baden, M. “Using Theatre and Performance for Promoting Health and Wellbeing Amongst the 50+ Community: An Arts-Informed Evaluation.” The International Journal of Social, Political and Community Agendas in the Arts, 8(1): 47-64.

Introduction to quantitative and qualitative research methods

We recommend that arts companies use a mix of narrative and numbers when telling the story of impact. This means gathering two main types of data:

  • Qualitative data - this generally refers to the findings from observations, interviews and case studies (although once you code interviews, you can translate this into quantitative data - more on this later)

  • Quantitative data - this refers to findings from surveys, questionnaires, and economic proxies for impact (e.g. quantifying impact in dollar terms)

By using both narrative and numbers, you can communicate to people who respond to stories and people who respond to figures.

Delivering a robust and trustworthy data set can be time-consuming and takes a degree of expertise.

So ask yourself:

  1. What are we trying to measure, and why?

  2. How will we use the data we collect?

  3. Is a quantitative approach the best method to achieve what we need?

  4. Is there an existing, validated survey we can use?
