
Learning from our COVID Recovery Fund

Alex van Vliet - Research and Learning Manager, Lloyds Bank Foundation

One year ago, we opened applications for a new grant programme, one specifically designed to support small and local charities to emerge from the social and economic crisis of the first national lockdown.

 

In designing this programme, we wanted to leverage the distinctive characteristics of our existing grantmaking: unrestricted core costs, a relationship-based approach and a comprehensive programme of capacity building support. By offering two-year grants, we wanted to provide some stability at a time when most available funding was short term. The volatility and uncertainty that characterised the first stages of the pandemic have endured: in the changing needs of individuals and communities facing the greatest risks, in the operating, regulatory and financial landscape for charities, and in the welfare and morale of leaders, staff and volunteers. That meant the pace and timescales of monitoring, evaluation and learning that we relied on for our legacy programmes needed to be significantly sped up.

 

Working with the independent consultancy BrightPurpose, we set out to design and build an evaluation and learning framework for the Covid Recovery Fund. A critical early decision was that the evaluation should be delivered internally by our research team. While this meant sacrificing the independence of an external evaluator, it gave us tighter feedback loops and closer proximity to the work, helping us to gather learning faster.

We are now six months into the evaluation of the fund. Over the coming weeks, the Foundation’s research and learning team will be sharing short outputs on what we’ve learned from the work.

 

What is different about the Covid Recovery Fund?

The Covid Recovery Fund builds on many elements of our pre-Covid grantmaking practice, but has a number of distinctive features that we want to learn from to inform our future grantmaking strategy.

 

Capacity building and Development Support

  • Each charity is paired with an experienced independent management consultant - a Development Partner – tasked with providing bespoke one-to-one support that aims to help organisations adapt to the new normal and build resilience for the future
  • The role departs from our existing capacity building support model, where uptake of support is voluntary and primarily provided by third party consultants brokered through our Regional Managers. The Recovery Fund introduces:
      • Engagement in capacity building support expected as a condition of the grant
      • More in-depth support, in both time and resources, for each charity, with each Development Partner managing a portfolio of 10-15 charities
      • Independence from the Foundation’s staff team in the commissioning of additional support
      • Peer networks among regional grant holder cohorts

 

Application criteria, processes and decision-making

 

Length and type of funding

  • Funding in the Covid Recovery programme is entirely unrestricted and set at a fixed amount of £50,000.
  • A two-year grant, during which the charity can access capacity building support through their Development Partner (for the first 12 months) and the broader Enhance funder-plus programme for the entire duration.
  • Additional funds available through the Foundation’s partnership with the DCMS Matched Funding Covid response.

 

Accountability

  • The Development Partners will be responsible for monitoring the programme in the first year, with limited written documentation required from charities.

 

Our Evaluation Questions

The first step of our evaluation design was to work with colleagues from across the organisation to generate ideas and prioritise the strategic evaluation questions for the fund. These questions give shape to the focus, methods and tools of the evaluation, and help us think through the decisions the work should inform in future.

 

Evaluating our grantmaking processes

  • Who did we support? How do they compare to the charities we have previously funded?
  • How effective was our recruitment and application process for charities?
  • How fairly and efficiently did we make our award decisions?

 

Learning about the operating environment for grant holders

  • How are the needs of the communities served by the charities we fund changing, and how are they adapting their services in response?
  • What did charities identify and prioritise as their development needs?
  • What can we learn from the changes we made to improve equity, diversity and inclusion in our grantmaking, and how should it shape our future approaches?

 

Feedback from grant holders

  • How did charities engage with the developmental non-financial support?
  • What was charities’ experience of:
      • Working with a Development Partner for a year?
      • Peer support?
      • The combination of unrestricted funding, Development Partner, peer support and Enhance support?
      • The relationship between Regional Manager, Development Partner and charity?
  • How well did this programme align with charities’:
      • Needs and aspirations?
      • Capability and capacity to take full advantage of the support?
      • Stage on their Covid journey: crisis, recovery, building forward?

 

Medium term outcomes of the fund

  • What did charities change about their service, delivery and/or business models as a result of the combined funding and development support?

 

To answer these questions properly, we need to recognise that different groups of grant holders are likely to have different experiences. We want to pay particular attention to these differences by segmenting all our analysis by the following groups:

  • Size of charity by income
  • Charities led by-and-for Black, Asian and minority ethnic communities
  • Region
  • Sector – the complex social issue and focus of their work
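
In practice, segmentation like this is largely a matter of recording grant holder characteristics consistently and then applying each lens in turn. The sketch below is a minimal illustration in Python with pandas, using entirely hypothetical column names and example values rather than the Foundation’s actual data, of how a single measure might be broken down by income band, by-and-for status, region and sector.

    import pandas as pd

    # Hypothetical example data: column names and values are illustrative only,
    # not the Foundation's actual grant holder records.
    grant_holders = pd.DataFrame({
        "charity": ["A", "B", "C", "D"],
        "annual_income": [85_000, 240_000, 60_000, 410_000],
        "led_by_and_for": [True, False, True, False],  # led by-and-for Black, Asian and minority ethnic communities
        "region": ["North West", "London", "Wales", "West Midlands"],
        "sector": ["homelessness", "domestic abuse", "mental health", "young people"],
        "engaged_with_support": [True, True, False, True],  # e.g. took up Development Partner support
    })

    # Band charities by income so results can be compared across size groups
    grant_holders["income_band"] = pd.cut(
        grant_holders["annual_income"],
        bins=[0, 100_000, 250_000, 500_000],
        labels=["under £100k", "£100k-£250k", "£250k-£500k"],
    )

    # Apply each segmentation lens in turn to the measure of interest
    for lens in ["income_band", "led_by_and_for", "region", "sector"]:
        print(grant_holders.groupby(lens, observed=True)["engaged_with_support"].mean())

The same grouping would be applied across application, monitoring and feedback data, so that differences between cohorts show up in every output rather than only in aggregate figures.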

 

A new role for the research team

The design of the framework has been strongly influenced by Michael Quinn Patton’s developmental evaluation method, where evaluators work alongside decision-makers and programme delivery staff using a mix of process, formative and (eventually) summative qualitative and quantitative data collection.

 

Alongside our primary research, we’re convening Development Partners together with the grantmaking team every two months for a half-day learning call. We’re using these sessions to gather insight and identify lessons learned, share emerging findings from the evaluation, sense-check our analysis and build a community of practice around our development support.

 

For a small internal team, delivering an in-house evaluation on this scale for the first time brings risks, challenges and opportunities. We are providing inputs for and supporting the learning of colleagues, while also playing an evaluative role that requires making judgements about the value of their work.

 

As we share outputs from the work, we are keen to gather feedback from others in the sector - researchers and evaluators, funders, infrastructure organisations and frontline charities. Get in touch with Alex and Tom.