Forward Thinking: Connecting strategy and evaluation, do’s and don’ts

20 March 2019

In our Forward Thinking blog series, we’ve featured several examples of how we learn and adapt to the ever-changing health research environment: from our program learning and improvement cycle, where we use data as part of a continuous feedback loop to inform future iterations of our funding programs, to our work on public and patient engagement.

In this blog, Zena Sharman, MSFHR’s director of strategy, and Julia Langton, former manager of evaluation & impact analysis, reflect on how best to connect strategy and evaluation to help plot a course, define success, track our progress and learn along the way.

Forward Thinking is MSFHR’s blog, focusing on what it takes to be a responsive and responsible research funder.


Connecting strategy and evaluation: Our do’s and don’ts

Today’s organizations are operating in complex, rapidly shifting environments. Some people call it a VUCA world: volatile, uncertain, complex and ambiguous! To work in this environment, organizations are having to become more adaptive and nimble. Approaches like adaptive strategy and developmental evaluation better equip organizations to meet the demands of this ever-changing environment and to work toward key goals by using data to learn and course-correct along the way.

Despite these developments, evaluation and strategy are often siloed, or practiced irregularly off the sides of already-full desks. At MSFHR, we believe the greatest benefits come from bringing evaluation and strategy together, and we have been exploring how best to connect the two practices at the Foundation. This is helping us develop the data-driven, context-informed strategies we need to navigate the complex world of health research funding, and ensure we have the best tools available to continue to support and grow BC’s health research system.


Finding our way through strategy and evaluation

For organizations, strategy is about understanding who you are, what contexts you’re working in, where you want to go (your overall goals or aims) and how you’re going to get there. But strategy is a compass, not a map. Designing and implementing strategy is a process of sense-making filled with educated guesses, experimentation, learning and adaptation.

Evaluation is also a way of making sense of things, but from a very different perspective. It is the systematic assessment of the design, implementation or results of an initiative, and a key component of learning and decision-making, and therefore of strategy. But evaluating complex and evolving things such as an organizational strategy can be challenging, which is where methods such as developmental and participatory evaluation come in. They help us learn as we go and involve the right people in interpreting evaluation results so we can transform them into recommendations for action (it’s important that evaluators not do this in isolation!).

Strategy and evaluation both require us to consider past, present and future simultaneously, but from very different vantage points. Integrating these perspectives — the broad and the specific — can be challenging. But connecting the two can also be hugely valuable. While strategy provides our compass, evaluation helps us define what success looks like and set up a plan to track our progress and learn along the way.


Connecting strategy and evaluation

As we develop our new organizational strategic plan (our current one comes to an end in 2019), we’re reflecting on what we’ve learned so far from our evaluation activities, and working closely together to build evaluation into both the development and implementation of the new strategy right from the start.

Wherever you are in your own strategy and evaluation journey, here are some do’s and don’ts for connecting the two, drawn from what we’ve learned so far.

DO:

  • Build integrated teams with a core group of in-house strategy and evaluation experts.

If possible, embed and integrate your evaluation and strategy units. Tearing down traditional silos and hiring (or training up) in-house experts is key to ensuring an active connection between strategy and evaluation.

Our organizational structure at MSFHR has evolved in recent years, with new, dedicated leadership roles in both strategy and evaluation; the evaluation unit sits within the broader strategy team. We also ensure that each person on the strategy and evaluation team not only does dedicated work in their own area, but also offers support to colleagues across the Foundation.

  • Grow knowledge and capacity across your organization, and cultivate champions.

Evaluation and strategy are most effective when staff understand how and why they are relevant to, and helpful for, their day-to-day work. Upskilling staff in both areas can help facilitate this, and build in-house skills and knowledge. This can be done through formal learning opportunities, as well as hands-on training and coaching over time.

At MSFHR, we strive to be a learning organization. We work hard to give our team access to the knowledge, tools and support they need to approach their work in a goal-driven, evidence-informed way. To support this, we’ve done a lot of internal work to build a data culture, ranging from intensive one-on-one sessions with our heaviest data users to trialling an organization-wide data culture project using learning modules developed by researchers at Emerson College and the MIT Center for Civic Media in the US.

For us, finding creative ways to cultivate champions has helped get others excited about the potential benefits of thinking and acting with strategy and evaluation in mind.

  • Strike the right balance between strategy and evaluation.

Full inboxes and heavy workloads can make it difficult to give strategy and evaluation the thought and time they deserve, but focusing on one without the other can diminish the impact of both.

For example, focusing too heavily on strategy at the expense of evaluation can limit an organization’s ability to make evidence-informed decisions and may lead to reactive decisions and solutions. Conversely, focusing too heavily on evaluation and analysis (at the expense of strategy) can leave you with detailed granular information about how different parts of the system are performing, but you may struggle to see the forest for the trees.

  • Learn from others with similar experiences and challenges.

We have used a variety of resources to support our strategic learning at MSFHR. We learn from our peers through the National Alliance of Provincial Health Research Organizations (NAPHRO). We have also learned a great deal from the philanthropic sector, which shares many of our challenges. A few notable resources include a comprehensive report from the Center for Effective Philanthropy on evaluation practices at philanthropic foundations (its discussion questions were helpful conversation starters for us), and resources and toolkits on engaging leadership in strategic learning from FSG, a social impact consulting group.


DON’T:

  • Don’t be afraid to experiment or make mistakes.

This work is best done with a spirit of courage and curiosity, and a willingness to experiment and make mistakes. You might not get it exactly right the first time, and that’s okay.

The information you learn through experimentation, whether successful or not, can help you decide what to start, or stop, doing, and how to embed new practices into existing structures. Don’t assume that the most expensive or sophisticated tools and techniques are required to transform evaluation data into strategic insights.

  • Don’t leave it to gather dust on a shelf.

Developing strategies and carrying out evaluations takes a lot of time and resources. Why then do we often find the results gathering dust?

At MSFHR, we make strategy and evaluation part of our everyday operations by actively and routinely engaging with both, together and separately. This means building in time for learning and reflection, getting together for a data party to discuss evaluation findings, or regularly gathering as a team to assess and explore our progress toward strategic goals.

  • Don’t make it overly complicated, mysterious or technical.

Strategy and evaluation can seem like mysterious, technical, amorphous things that get done by experts. But they don’t have to be that daunting.

We try to avoid complex frameworks and models, and to simplify wherever possible, so that staff who aren’t embedded in the day-to-day practice of strategy and evaluation can still engage with the work.

In our experience, connecting strategy and evaluation helps organizations be more purposeful, innovative and adaptive, and fosters a culture of experimentation and rigorous learning. Not only does it enable you to more effectively chart your path through the complex, rapidly changing environments many of us are working in, but it also helps you enjoy the journey.

But we are always learning! We’re keen to hear how other organizations are connecting strategy and evaluation. What are your do’s and don’ts? What have you learned along the way? If you’ve got ideas, resources or questions to share, we’d love to hear them.