Back in 2018, I reflected on some of the reactions I had to my leaving the security of the University to start a business (was I crazy?).

Four years on, and Matter of Focus stands tall – a growing team, a successful software tool, and an ever-increasing number of organisations in the UK and beyond using our approach to understand their outcomes and track their impact.

5 tethered purple balloons floating in front of a row of trees

As I was cycling home from meeting a colleague recently, I thought about the unbelievable fact that we are five years old already. We have learnt so much – about impact and outcome assessment, about building a business in these strange times, and about how anchoring purpose into the heart of our work has been such a powerful way to stay on track.

Nobody could have predicted the curveball thrown by the pandemic. Nor the bizarre effect it had on both slowing and accelerating time. But if ever there was a test for our purpose-led plans, this was it.

We already had a flexible working model – communicating regularly by Zoom with our CTO Steve, who works remotely. We already understood the importance of context and were here for our clients when theirs shifted so dramatically. While known for our sticky walls, we took our outcome mapping approach fully online. And having developed OutNav as cloud-based software, it really stepped up in its role as a central place to bring everything together.

So, I am proud to say we are still very much inhabiting our space amongst purpose-led companies, building tech for good.

Our journey, so far…

Click to enlarge

To mark our fifth birthday, I’d like to share and celebrate five pillars of our success…


5 pillars of our success

  1. Pioneering organisations, who care deeply about making a difference to people and communities. We have learned a lot about what it takes to stay focused on outcomes, to embed learning into daily work, to build the capacity and capability of staff teams, and to embrace working well with outcome and impact evaluation. I am super excited that lots of this learning has been consolidated into our forthcoming book.
  2. A fabulous staff team, who have been happy to help us grow our approach, build our systems and processes, and jump in when others need help. It is sometimes exciting and occasionally scary building a start-up and moving on to scale up. We are fortunate to have attracted talented and thoughtful people to work with us – you can find out more about them here.
  3. A sense of purpose that guides big and small decisions. We are a certified B Corporation, a member of the Zebras Unite cooperative, joining with like-minded companies who want to make a difference. We make big and small decisions with our mission and values in mind. You can hear me talking more about what this means to me personally and the challenges in building this culture in the podcast I recently recorded for the Turing Trust.
  4. A strong partnership at the core of Matter of Focus. Often business success is promoted as resting on one brilliant person. But I can’t imagine going on this journey without my co-founder Ailsa Cook. We have supported and challenged each other, had difficult conversations and made difficult decisions. The business and what we offer is so much richer through our combined efforts.
  5. Ecosystem support has allowed us to build a business without taking external investment. And while it has been challenging to develop our software bankrolled through our consultancy work, we have done most of it now and are in a really good place. This would not have been possible without support through the business ecosystem – Research and Development tax credits, Scottish Enterprise part-funding, and grants.

Seated in a team photo formation, the Matter of Focus team taking great pleasure in playing with a large foil 5 balloon.
Simple pleasures: nearly all of the Matter of Focus team together to celebrate our 5th birthday. Back row L-R: Craig Roberts, Allan Middlemass, Charlie Mills, Adeola Akisanya-Ali, Helen Berry, Grace Robertson. Front row L-R: Alex Perry, Simon Bradstreet, Steve D’Souza, Sarah Scott, Sarah Morton. Photographed by Kathleen Taylor. Ailsa Cook, unfortunately absent, but present in spirit!

Looking forward…

So, as I look forward to the next five years, we have some ambitious and exciting plans for the sort of company we want to develop that will support many more organisations to understand their impact, learn and improve, use data and evidence well, and stay focused on what matters to them.

Meanwhile we will continue to stay focused on what matters to us – being a purpose-led company and a great place to work – supporting people in public services to solve social challenges.

Life may sometimes be crazy, but the path we’re carving certainly isn’t.

The network seeks to understand and improve children’s experiences – positive and negative – of being online, with a focus on children in low- and middle-income countries.

This team was confident that their work was having an impact but didn’t have a way of demonstrating or tracking it. They could see impacts emerging in partner countries and in relation to the international agenda.

What we did

In 2018, Matter of Focus was commissioned to evaluate and provide independent verification of self-reported emerging national and global research outcomes and impact from the collective activities of the Global Kids Online network.

We used our method of impact mapping, context mapping, auditing existing data and collecting more to build a contribution story for this initiative and produce an impact report.

For more about our method, see The Matter of Focus approach to tracking research impact.

For how we do this work, see An overview of the Matter of Focus approach.

An early outcome mapping session with the team

We worked together with the team to set out three main pathways highlighting this initiative’s contribution:

  1. How the work of the project team inspires and supports people around the world to conduct their own research about children’s experiences online.
  2. How the partner countries contribute to changes in policy and practice locally to improve children’s online experiences.
  3. How the project team pulls together information from partner countries and uses this to influence the international agenda.

These three pathways then formed the basis for data organisation and analysis. We collated existing data from social media, web, press, and event feedback, and collected further data to fill the gaps. This included case studies of four countries, interviews with stakeholders and policy analysis.

We used OutNav to hold these pathways and collate and analyse data against them.

Screenshot of pathway 3 in OutNav (click to enlarge)

Who with

The Global Kids Online management team: Daniel Kardefelt Winther, Kerry Albright and Alessandra Ipince from UNICEF Innocenti, and Sonia Livingstone and Mariya Stoilova from LSE.

(Left to right) Daniel Kardefelt Winther, Kerry Albright, Sonia Livingstone, Mariya Stoilova, and Sarah Morton

The study draws on in-depth evidence from three research sites in Bulgaria, Ghana and Uruguay, as well as a review of the activities of the whole network.

What they learned and gained

Our approach allowed broad capture of impacts internationally, as well as the specific impacts in the partner countries where the case studies were focused. The team could see that there were many impacts, but also learned where they could improve their focus, and how to sustain impact planning in the longer term.

What difference did this make?

This work highlighted wide ranging impacts in the specific countries, and on the international agenda. This supported the creation of an impact case study for the UK Research Excellence Framework, and enabled UNICEF, LSE and partner countries to make a case for continued funding of the research.

The published impact report is available online:

CHILDREN’S EXPERIENCES ONLINE: Building global understanding and action


Navigating the challenges of assessing research impact

In this webinar I explore some of the challenges of understanding and assessing research impact, especially where research is used to influence, inspire and educate people, and where issues of cause and effect are more complex.


In traditional models of evaluation these are often referred to as ‘outputs’, but we like to think about ‘what we did’ in a broad way that might cover what was delivered, but also how it was delivered and how any challenges were overcome. This gives a rounded, more engaging and interesting story, which can help demonstrate the uniqueness of the work, and can really ground the learning, behaviour change or wider outcome into the delivery aspects of the work.

When we work with initiatives to help them understand how the activities they deliver contribute to the outcomes or impacts that are important to them, we use a process we call outcome mapping. An outcome or impact map sets out a theory of change across our simple framework of six column headings: What we do, Who with, How they feel, What they learn and gain, What they do differently, What difference they make.

The Matter of Focus headings: 1 What we do, 2 who with, 3 how they feel, 4 what they learn and gain, 5 what they do differently, 6 what difference does this make?
The Matter of Focus headings
An outcome map
An example of an OutNav outcome map for a wellbeing service

Reporting on ‘What we do’

It is interesting to work with many different clients and organisations as they set up their outcome maps and start to tell their contribution story. It should be easy to evidence, and be confident in, the column that explains ‘what we do’, but it can be a struggle to put into words how a programme or initiative has been delivered and what changed over time. There are a few tips to help with this:

Regular recorded reflections on the delivery of a programme or initiative create rich data and can capture issues that might otherwise be lost over the longer term. Many of our clients use a reflective impact log to record their own reflections on delivery. You can download a copy of our reflective impact log from this post: 3 feedback tools to help you track your outcomes and impact.

‘Use pictures or other images to show how it was done’

Reporting on ‘Who with’

Who is being engaged and involved is the core of work seeking people-based change. We believe that it makes for a stronger and clearer contribution story if there is a separation between what was delivered (‘what we did’) and who was engaged and involved (‘who with’).

This part of any outcome map provides some nice opportunities for quantitative data, graphs and numbers. It can be good to record who came in terms of types of people, numbers of sessions, repeat returners, social media and web stats or reach of a programme. These are the things that lend themselves to traditional quantitative data visualisation like graphs or pie charts.

While it is fun to geek out on these numbers, it is also important to tell the story of how a programme or initiative engaged people. For many initiatives we work with, a lot of effort has gone into ensuring that the people involved were the right people – the intended recipients of the work being planned – and that they were engaged throughout, where appropriate.

Alongside the numbers it is important to capture:

Anchoring your contribution story in a clear description of what you did and who with will help to clearly demonstrate the links between these and higher-level outcomes when it comes to reporting.


Text reads: Report in an outcome-focused way

For more on telling a compelling outcome-focused story of the contribution of your work, see our post: Report in an outcome-focused way


Assessing the impact of research on policy, practice, behaviour or society is not easy. I have written more about this in an earlier post – Understanding and assessing research impact.

I have been assessing the impact of research and evidence-to-action projects and programmes, and advising others on impact assessment for the last 13 years. One of the things I’ve learned is that if you capture evidence as you go the whole process is so much easier.

Here are four easy things you can do to start building up the evidence of the impact of your research or evidence-to-action initiative. These can be done at project, programme, research centre, or strategic level.

1. Track who is interested and engaged with your work

If you consistently keep records of who came to your events, training, webinars or partnership meetings, and anything else about the people who interact with your research or knowledge mobilisation activities, you will have a good grounding for an impact story. This is the cornerstone of demonstrating the reach of your work.

You should also track what is important about these people in relation to your initiative. Is there something about who they are – age, geography, profession, sector, interests – that matters to the impact of this work? If so, make sure you capture that information too, and capture it consistently across different types of engagement. For example, if your work is important to young people then make sure you have defined that (e.g. 16–25 years old) and that you consistently capture age whether people attend webinars, meetings or anything else.

You probably have some kind of social media data, maybe across several platforms. A regular social media report that shows reach and engagement for all the social media platforms you use can also be a good building block for telling an impact story. Make use of the platforms’ in-built analytics tools. This can highlight engagement beyond the shares and likes you see in your feed. There are also third party tools that can tell you more about your following, such as SparkToro.

Your social media report might include updated overall reach, downloads of key documents, growth of interest and how that happened, and any particular highlights. Whatever you decide, just pulling this together in the same format on a regular cycle can make this information much more useful.
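To illustrate, a report that pulls per-platform figures into one consistent format could be sketched like this – the platforms and numbers are made up for the example:

```python
from collections import defaultdict

# Illustrative monthly figures, as exported from each platform's own analytics
monthly_stats = [
    {"platform": "Twitter",  "reach": 12000, "engagements": 340, "link_clicks": 85},
    {"platform": "LinkedIn", "reach": 4300,  "engagements": 210, "link_clicks": 60},
]

def build_report(stats):
    """Roll per-platform figures into one consistently formatted summary."""
    totals = defaultdict(int)
    for row in stats:
        for key in ("reach", "engagements", "link_clicks"):
            totals[key] += row[key]
    lines = [
        f"{row['platform']}: reach {row['reach']:,}, "
        f"engagements {row['engagements']}, clicks {row['link_clicks']}"
        for row in stats
    ]
    lines.append(
        f"TOTAL: reach {totals['reach']:,}, "
        f"engagements {totals['engagements']}, clicks {totals['link_clicks']}"
    )
    return "\n".join(lines)

print(build_report(monthly_stats))
```

Because the format never changes, each cycle’s report can be compared with the last – which is what makes the information useful over time.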

Twitter analytics will show some of the less obvious engagements on a tweet, such as the number of clicks on a link you’ve shared. You can also export data for all your tweets over a designated timeframe of up to three months.

The Matter of Focus Research Impact School will equip you with the knowledge, skills and resources you need to unpick and start tracking the impact of your research.

The school combines four live, engaging workshops with a three month subscription to our impact tracking software, OutNav.

Find out more.


2. Record what people are saying

Capture informal evidence wherever it comes from. This might be email, blog comments, meaningful social media engagement such as a conversation, or what a stakeholder tells you in the corridor or at an event. Make a habit of doing this and you will build up a useful source of evidence.

It is good practice to ask permission from the people saying things; although not required for engagement made publicly on social media or blogs, this is important for emails or things people say to you informally. If it’s not possible to ask permission in the moment, make a mental or written note of the comment and follow up with it later by email.

We have set up an Informal Evidence Record for our own use here at Matter of Focus, in which we record:

screen capture of a conversation on Twitter
We often screencapture meaningful engagement from our Twitter notifications that doesn’t come out in the aggregate analytics data.

3. Find out what people are doing and why they are interested

There are some easy and quick ways to gather more formal feedback.

My plea is always to do this early and often because this is information that is so much harder to gather afterwards.

These sorts of data gathering methods should focus on what people are intending to do with the information they have got and how useful they think it is. For example:

When sharing your content online

If you are hosting content online for people to download, for example a report or research briefing, ask people why they want the content and what they intend to do with it. Even if only 1 in 5 people completes a pop-up form (for example) before downloading, this can be a rich source of information and help you understand potential impact and where you might track further.

At events or meetings

Collect quick and simple feedback from any kind of face-to-face or virtual event or meeting. This doesn’t have to be long and complicated – just a quick question on MentiMeter, or a short postcard feedback form. Again, focus on what people will do next, to draw out their impact intentions. You can find an example of one of our quick feedback forms and how we adapted it for online events in our post 3 feedback tools to help you track your outcomes and impact.

If you combine these feedback options with a request for a contact email address and permission to follow up, you will create a list of contacts who have interacted with the initiative, have something to say about it, and can be followed up later.
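As a hypothetical sketch, turning those feedback responses into a follow-up list is a simple filter on consent and contact details – the field names here are illustrative:

```python
# Illustrative feedback entries: only people who gave an email address
# AND permission to follow up go onto the contact list.
feedback = [
    {"name": "A", "next_step": "share with my team", "email": "a@example.org", "consent": True},
    {"name": "B", "next_step": "not sure yet",       "email": None,            "consent": False},
    {"name": "C", "next_step": "cite in a strategy", "email": "c@example.org", "consent": True},
]

# Keep the stated intention alongside the contact, so a later follow-up
# can ask "did you do what you planned to do?" – the heart of impact tracking.
follow_up_list = [
    {"email": entry["email"], "intention": entry["next_step"]}
    for entry in feedback
    if entry["consent"] and entry["email"]
]
print(len(follow_up_list))  # 2
```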

a screenshot of a virtual noticeboard with feedback on multiple post-it notes
Feedback we gathered from participants on a Miro board at one of our online Research Impact School workshops

4. Keep communication open

Keeping communication open is essential if you want to track impact over time. If you see each interaction with someone as the start of a mutually beneficial relationship you will be creating channels to achieve impact as well as to collect evidence about what that impact is.

It is really important to always thank people for any feedback they do give you – however you get it. Make it clear how you will use it and how valuable it is to you. As described above, ask for permission and contact details to follow people up. This can then be done on a cycle of 3 or 6 months, to keep tracking impact via emails, more formal surveys or other methods.
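A 3- or 6-month follow-up cycle can be as simple as checking each contact’s last touch-point against the cycle length – a rough sketch, with illustrative dates and field names:

```python
from datetime import date, timedelta

CYCLE_MONTHS = 6  # or 3, depending on the pace of your initiative

contacts = [
    {"email": "a@example.org", "last_contacted": date(2022, 1, 10)},
    {"email": "c@example.org", "last_contacted": date(2022, 5, 2)},
]

def due_for_follow_up(contacts, today, cycle_months=CYCLE_MONTHS):
    """Return contacts whose last touch-point is older than the cycle."""
    cutoff = today - timedelta(days=cycle_months * 30)  # rough month length
    return [c for c in contacts if c["last_contacted"] <= cutoff]

due = due_for_follow_up(contacts, today=date(2022, 8, 1))
print([c["email"] for c in due])  # ['a@example.org']
```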

Simple steps with multiple benefits

Create the discipline to do these four things consistently, as early as possible and on an ongoing basis, and you will:


Ready for a system to start tracking your impact?

The Matter of Focus Research Impact School will equip you with the knowledge, skills and resources you need to unpick and start tracking the impact of your research.

The school combines four live, engaging workshops with a three month subscription to our impact tracking software, OutNav.

Find out more.


In this post I share my approach to understanding and assessing research impact. I developed this approach through my PhD and have since used it in seven independent impact studies, in advising several research programmes, and in supporting academics to write impact case studies.

When I set out studying for a mid-life PhD in 2008, little did I know that my topic – research impact assessment – was about to take off in the UK and around the world, as interest grew in understanding the wider benefits of research to society and the economy. I was fortunate to gain insights into this by studying a successful research partnership, and building a framework based on contribution analysis1.

Defining research impact

Definitions are important in a PhD, so I developed one that has been useful to many colleagues grappling with how research and knowledge exchange activities contribute to high level outcomes like improved well-being.

My definition separates research impact into three distinct levels:

The challenge then is to trace impact through these levels, and find the evidence to demonstrate that change has occurred and can be reasonably linked to research.

This definition is based on the conceptualisation of research use as a complex process, and follows work by Weiss2 and Gabbay & le May3, who have unpicked the processes of research use in policy and practice to show that it is not a linear process. People who use research are not passive recipients of knowledge; instead, they take up and use research according to the complex needs of the situation they are in, and combine it with other forms of knowledge to support their policy or practice. This presents many challenges for those of us who want to understand impact.

How to assess research impact

I advocate setting out a theory of change to show how research has contributed to different levels of outcomes. (We refer to this as an impact or outcome map – see our post Understand the outcomes and impacts that matter for more on this).

Theories of change are dynamic – they can be used as a planning tool for knowledge exchange, as well as the framework for assessing impact. The Research Contribution Framework, published in Research Evaluation, shows how to use a theory of change approach to understand and demonstrate the impact of your research. This method of understanding impact aligns well with the case study approach used by many research excellence frameworks, such as the UK REF.

Having set out your theory of change, a variety of data and evidence can then be assembled against it.

At Matter of Focus, we use an outcome or impact map to help us clarify and clearly articulate the contribution story of our research. Our software OutNav makes it easy to add data as we go, and collaborate on evidencing the impacts of our research.


Text reads: An overview of the Matter of Focus approach

To learn more about the Matter of Focus approach we recommend starting with our overview post.


Working collaboratively is particularly helpful for identifying the assumptions we’ve made about how impact might occur. We advise spending time interrogating these assumptions and naming any risks that your intended impact might not occur. These discussions form the basis for assembling and analysing data. Common risks and assumptions, and their data potential, are also listed here.

Data for research impact assessment

One example is our mapping of the contribution of the Global Kids Online programme – the assessment can be found on the UNICEF Office of Research-Innocenti website.

Top tips for assessing impact

  1. Be a detective. Assessing the impact of research is a bit like detective work – finding clues and following them up. Sometimes one research user has a rich story to tell of how they have used research to influence policy or practice. I discuss this in another paper published here.
  2. Collect as you go. It really helps if you collect data and feedback as you go, rather than waiting until time has passed and trying to fill the gaps. Ideally, if you have set out a theory of change or impact map, you will be able to see what feedback will be needed to evidence your contribution. I would advocate using every interaction with research users as an opportunity to collect feedback – formal feedback where possible, but observations and reflections if not.
  3. You don’t need lots of data – just the right data. You don’t need a large volume of impact data, just enough convincing, well-analysed data to demonstrate your contribution. Often a few key research users or collaborators can give rich insights that you can follow up and, if you are looking for policy change, can help shape any documentary analysis that is needed.
  4. It takes time for impact, and impact assessment. Impact takes time to emerge, so just stick with it, collecting data as you go. It can be useful to ask people if you can follow up with them after six months or a year to track emerging impacts.

Of course, I have taken this approach into Matter of Focus and some research teams are using our software OutNav to embed this way of working. If you want to see this in action, please contact us – or book a place at our research impact school.

How the Research Contribution Framework aligns with the Matter of Focus approach.

Watch my recorded live webinar: Navigating the challenges of assessing research impact


The Matter of Focus Research Impact School will equip you with the knowledge, skills and resources you need to unpick and start tracking the impact of your research.

The school combines four live, engaging workshops with a three month subscription to our impact tracking software, OutNav.

Find out more.


Some publications using the approach

Research and knowledge exchange on resilience and intimate partner violence: ‘“Make Resilience Matter” for Children Exposed to Intimate Partner Violence Project: Mobilizing Knowledge to Action Using a Research Contributions Framework’. [Go to report (pdf)]

Knowledge to action in healthcare: ‘Developing a framework to evaluate knowledge into action interventions.’ [Go to report]

An assessment of the Global Kids Online programme: ‘Children’s experiences online: Building global understanding and action’. [Go to report]

References

1 Morton, S. (2015). Progressing research impact assessment: A ‘contributions’ approach. Research Evaluation, 24(4), 405–419. https://doi.org/10.1093/reseval/rvv016

2 Weiss, C. H. (1979). The many meanings of research utilisation. Public Administration Review, 39(5), 426–431.

3 Gabbay, J., & le May, A. (2004). Evidence based guidelines or collectively constructed mindlines? Ethnographic study of knowledge management in primary care. BMJ, 329(7473), 1013. https://www.bmj.com/content/329/7473/1013

Our approach has been used in independent research impact studies of initiatives and projects around the world. We have many clients using our approach and OutNav to track the impact of their research.

Here are four key ways in which our approach and OutNav will help you to track your research impact:

1. Understand, plan and set out your pathway to impact

How your research or evidence-to-action initiative makes a difference can be planned and clearly articulated as a pathway to impact, using our outcome and impact mapping approach. This will help you think beyond what you are doing and who is engaged, towards what difference it makes. You can read more about this process in our post What is outcome mapping.

2. Gather and assemble your feedback, data and evidence

Your pathway to impact becomes the foundation for gathering evidence and telling your impact story.

Through a process of data audit, we support you to think about the data, feedback and evidence you have, and where it can be enhanced and improved to tell your impact story.

If you are fortunate enough to be doing this at the start of a project or programme, it has the additional benefit of helping you learn as you go about what is most effective.

During or after a project it can show how good your current evidence is and help you plan how to improve where needed.

3. Improve your evidence and strengthen your impact story

OutNav gives a quick visual overview of where your data is weaker and stronger, helping you to focus on collecting additional evidence where it is most needed. Once you can see clearly where data needs to be improved you can seek additional sources.

Your pathway to impact helps to frame the questions you need to ask to strengthen your impact case. It helps you think about who you want to target for further feedback, and what you need to know.

4. Create and share impact reports at the touch of a button

OutNav enables you to create and share impact reports simply and effectively – pulling out your impact story for funders, stakeholders or other reporting requirements (like REF impact case studies).


If this sounds interesting, there are several ways you can find out more:

* Watch my recorded live webinar: Navigating the challenges of assessing research impact.

* Register for one of our live demos to find out: Can OutNav help?

* Get in touch to request a call. We will send you my Calendly link so you can find a time that suits you.

The Matter of Focus Research Impact School will equip you with the knowledge, skills and resources you need to unpick and start tracking the impact of your research.

The school combines four live, engaging workshops with a three month subscription to our impact tracking software, OutNav.

Find out more.

Because we can’t be certain of the future, we need to think deeply about the contextual factors that shape any work, and the underlying change mechanisms we are relying on. This includes thinking through what will need to be in place, why we think our initiative will work, and the potential risks we might face: our risks and assumptions.

What do we mean by risks and assumptions?

Risks are things that may happen that could get in the way of the change we hope to see. For example:

Assumptions are the change mechanisms at the core of the project and things that you are relying on being in place for an initiative to contribute to the outcomes it seeks. For example:

Making risks and assumptions real

When we support organisations to develop their outcome maps, we also go through a process of assessing the context for delivery and key risks and assumptions that need to be considered.


We have written a post about our approach to understanding and talking about context, which uses the ISM Framework, adapted from the original developed by the Scottish Government.


Defining risks and assumptions helps ground the outcome map into context and to highlight where attention needs to be focused to achieve success.

Assumptions include why we think the initiative will unfold in the way we have set out – sometimes this is based on experience of delivering something similar, learning from other initiatives, or from research into similar work. Other times we might be trying something innovative where we want to test these assumptions.

Two sides of the same coin?

When looking at wider assumptions and risks, we can often flip them between a risk and an assumption (e.g. risk: we can’t reach the right people; assumption: we reach the right people).

In some cases they fall more squarely into a risk or assumption category – we just need to make sure they are recorded in a way that makes sense to the work. What matters is that we think clearly about the most important of these, and how we will include them in the way we monitor and evaluate the work.

For example, nearly every initiative working with complex change faces challenges around engagement – it is usually really important to engage the right people with the work, but often a risk is that we engage the people who are easiest to reach, or that people are not ready for the initiative.

If we get engagement wrong, then the initiative probably won’t contribute to its intended outcomes. So, we need to pay close attention to making sure we understand who we are engaging with and how they are feeling.

Taking risks and assumptions into account when analysing your progress

When you add something to your list of risks or assumptions, you can also reflect on the implications it will have for analysis, and ensure you collect feedback that lets you monitor it. For example, if one of your biggest risks is that the people you want to reach are not ready for your service, then in your analysis you need to reflect on the steps you put in place to overcome this risk and how successful they were.


Text reads: Managing your risks and assumptions in OutNav

In OutNav, your lists of risks and assumptions are held in a tray alongside your implications for analysis – see how this looks in our OutNav feature post.


Two young girls play with a stream of water
Photo Abigail Keenan, Unsplash

Adopting outcome approaches

Many governments, funders, programmes and projects around the world are adopting outcome approaches to express their vision for social change, and commissioning services based on high-level outcomes (for example the Sustainable Development Goals).

These outcome approaches include aspirational ideas like ending poverty or ending violence against children. 

Research shows that these approaches can be a great way to get people together to work towards changes that are needed to improve lives and make the world a better place. They can help people stay focused in a complex and changing environment, bring partnerships and teams together around a shared vision, and help highlight what really matters to people and communities. 

But working towards these high-level goals or outcomes also brings with it some challenges. 

One of the biggest of these challenges relates to how we measure progress. If the focus is solely on data about the high-level outcome, then there will be some obstacles to overcome.

It gets worse before it gets better

The key problem is that data on outcomes usually gets worse before it gets better. 

For example, when evaluating a campaign to make schools safer, outcome data might include the number of pupils reporting that they have experienced violence, gathered via opinion surveys, or administrative data on the number of violent incidents. However, both of these measures are likely to increase when a school adopts a safer schools approach, because raised awareness, greater trust and better reporting routes mean more of the violence that was already happening comes to light.

So, a good safer school or violence reduction programme should expect to see these rises in measures of violence in the short and medium term. However, this may alarm those implementing, authorising or funding a programme. 

The leadership challenge

The leadership challenge is to understand this data and work with it confidently while a programme is being implemented. 

For leaders to be able to do this they need a strong understanding of the context they work in, the relationship between activities and outcomes, and some really solid data telling them that implementation is going well.

The cornerstone for working well with this leadership challenge is an understanding of the difference between contribution and attribution.

For complex programmes like making schools safer or reducing violence against children, the relationship between programmes and their outcomes is not simple and direct. Instead, there are many other factors that influence the same outcomes.

If all of these other factors are working together (e.g. violence reduction programmes elsewhere, a policy environment that is working towards violence reduction, positive changes in community cohesion, wealth, employment, domestic abuse etc.) then changes in violence levels will be achieved more quickly and more easily than if the wider context presents difficulties and challenges. 

For this reason we focus on the contribution of any programme, initiative or policy, rather than attributing simple cause and effect. Indeed contribution analysis is the foundation of our approach at Matter of Focus. 

What other data is needed?

For leaders driving change like safer schools programmes, different kinds of data and evidence are needed to maintain confidence that the programme is making a difference in the short term, even as high-level outcomes look worse. 

At Matter of Focus we work with programmes, projects and initiatives to understand their contribution by including the following ways of thinking about and working with complex change:

  1. Understand how activities contribute to outcomes: this means having a strong theory of change for all aspects of the programme.
  2. Understand who is engaging: this means knowing that the programme is reaching the right people and that they are engaging positively. 
  3. Understand the learning that underpins behaviour change: this means knowing and getting feedback on the knowledge, skills and capacities that are important for the programme.
  4. Monitor behaviour and practice changes: this means collecting real-time data on early changes in behaviours and practices, however small.
  5. Monitor external factors that might affect the programme or initiative: wider policies, services and unexpected events.

This can be done by collecting immediate feedback from people engaged in the programme and ensuring systematic reflection from those delivering it, and it must include qualitative insights in order to understand the effectiveness of programme implementation. 

Working towards ending violence and creating safer schools

Over a longer time frame, if implementation goes well, changes in high level outcomes will start to emerge. 

By using a complexity-informed approach like ours, people involved in delivering these kinds of programmes will be able to tell a really credible story of how they made a difference. That story will include how violence reduction has been influenced by their programmes and projects, as well as an assessment of the wider influences of the delivery context. 

This evidence will provide valuable insights for other people seeking to implement similar programmes or initiatives in other places. More than that it will allow deep learning as programmes are implemented, meaning they are more likely to reach their outcomes and make a difference to people and communities.

There is an imperative to work in this way if public services are going to be successful in tackling some of the big and complex challenges that we currently face.

A theory of change is not as grand as it sounds – it’s just a term that refers to making explicit the thinking behind why a programme, project or intervention will make a difference to the people or communities it seeks to serve.

Outcome mapping is the centrepiece of our approach, and the cornerstone of our software OutNav.

Why ‘outcome mapping’?

We call our approach to understanding change ‘outcome mapping’ for a few reasons:

Outcomes are a good way to visualise and work towards the change you seek in the world. Many organisations or projects have outcomes expressed in their mission or are working to outcomes determined by funders.

We work with organisations or programmes to map how the activities they deliver reach the outcomes that are important to them. Outcome mapping is a simple way of describing this process.

We use the terms ‘outcomes’ and ‘impacts’ interchangeably as we find that different sectors have their own preferences on this. Some just use outcomes, some impact and some use both.

Deeper dive

Where does outcome mapping come from?

For many years, Ailsa and I have worked with a variety of organisations with a mission for social change, helping them understand and work with outcomes, and along the way we have developed a distinctive approach.

Scotland is a great place to be pioneering this approach because the Scottish Government has promoted an outcomes approach to service commissioning and delivery for over ten years, which means many people are grappling with outcome evaluation challenges.

Outcome approaches are also gaining traction globally through the Sustainable Development Goals, with governments in other parts of the world also adopting an outcomes-based commissioning approach.

We have built our outcome mapping approach on strong foundations, and we like to think it has great pedigree!

I first developed the approach for research impact assessment (see my 2015 article Progressing research impact assessment: A ‘contributions’ approach). Ailsa and I have since refined and reworked it through experimentation and learning with many different kinds and sizes of project, programme and organisation over the last ten years.

We built on work by Steve Montague who had taken the basic ideas of contribution analysis and turned them into practical approaches. (See Montague, S., (2012) Theory-based Approaches for Practical Evaluation – pdf.)

We also brought a strong understanding of outcomes from Ailsa’s work, (See Cook, A. (2017) Outcomes-based approaches in public service reform, and Cook, A. & Miller, E. (2012) Talking Points Practical Guide – pdf) as well as experience and commitment to participatory approaches, knowledge to action and action research.

A clear and accessible approach to understanding change

Some ways of representing theories of change can result in complicated diagrams, which can make it difficult to get a good understanding of the change processes and are challenging to evaluate.

For us, outcome mapping is an interactive approach to setting out a theory of change based on a framework we express using our headings:

The Matter of Focus outcome mapping headings

We believe that this plain language approach helps to refine thinking about the programme or project in question. We separate outcomes into different levels that help us understand the change mechanism underpinning people-based work: outcomes at the level of reactions; knowledge, skills and capacities; changes in behaviour, policy or practice; and longer-term social change.

We have another insight post that explains the rationale for our headings and how they work.

Our approach to mapping outcomes provides a clear and accessible way of breaking down and understanding the change process. It is important to note that whilst it is possible to show an overall change process in a progressive way, we recognise that change doesn’t happen in simple, linear ways. Many organisations we work with use their outcome maps to tell the stories of these complex and often circular change journeys.

An outcome map
An example of an outcome map held in our software OutNav

Outcome mapping on different scales

Since we set up Matter of Focus in 2017, we have held outcome mapping workshops for more than 160 public service organisations and partnerships.

We usually work with organisations to create one or more outcome maps that set out how their activities reach outcomes.

An outcome mapping workshop with the Thistle Foundation

In some cases this has been for simple projects and community groups, such as Reeltime Music. However, we specialise in supporting complex organisations and partnerships where external facilitation and a conceptually robust process are essential to making progress. Examples include working with the Scottish Government Public Health Reform team to map how different people contribute to improving public health in Scotland; mapping whole policy areas, such as Self-Directed Support; and helping organisations such as Penumbra demonstrate how their innovative approaches make a difference across multiple delivery areas.

Online mapping workshop with Obesity Action Scotland

Find out more about who is using our approach.

Putting your outcome map into action

Outcome maps, and the pathways you plot through them, provide a lens through which you can see what data, information and feedback you need to collect and analyse. This helps you understand the change processes at work and evidence your initiative’s progress towards the change it seeks to make.

While we have been supporting organisations to map out their contribution to outcomes for many years, we set up Matter of Focus with the ambition of building software that would hold and support the approach – which we have done.