Measuring the performance of the new Global Fund Strategy
GFO Issue 415

Author:

Aidspan

Article Type:
News

Article Number: 4

The M&E and KPI Framework and draft KPI Handbook up for discussion

ABSTRACT Global Fund Board members met for their one-day Retreat on 8 July and, among other issues, discussed a draft of the high-level Monitoring and Evaluation Framework, including key performance indicators (KPIs), for the 2023-2028 Strategy 'Fighting Pandemics and Building a Healthier and More Equitable World'.

At the Board's one-day Retreat on 8 July, members considered a paper from the Strategy Committee (SC) meeting held on the previous day. The draft document is the high-level Monitoring and Evaluation (M&E) Framework, including key performance indicators (KPIs), for the 2023-2028 Strategy 'Fighting Pandemics and Building a Healthier and More Equitable World'.

Background

The Global Fund wants to have in place a comprehensive M&E Framework to provide an accurate and timely overview of the Global Fund's performance in delivering the new Strategy. It aims to do so in the spirit of a "highly contributory environment to enable accountable performance management of both the Partnership and the Secretariat".

Reviews of previous Global Fund KPI Frameworks have systematically flagged challenges linked to the complexity of measuring Global Fund partnership and performance across different, yet interconnected, topics through existing quantitative measures. The Secretariat has tried to address these challenges in developing the new M&E Framework and the KPIs embedded within it.

The new M&E Framework

The M&E Framework has five components:

  1. KPIs on key areas where accountable performance metrics are available.
  2. An evaluation calendar that examines cyclical stages of the business model and areas of the Strategy that are challenging to measure.
  3. Grant Implementation and performance monitoring.
  4. Management information related to the efficiency and effectiveness of business processes.
  5. Other Secretariat and Partner reports that provide additional assurance on how the Global Fund is operating and on the overall direction of the Global partnership.

These components provide the highest-level information to track progress and to support oversight and strategic steer by Global Fund governance bodies. Earlier consultations had been held on the measurement processes. Based on these discussions, the Secretariat developed a list of proposed KPIs alongside preliminary evaluation topic areas that will complement the KPIs (recognizing that further consultation on evaluation is required under the new evaluation function), as well as an Evaluation Calendar.

The specific details on all KPIs are provided in the Draft KPI Handbook.

Board discussion

The Board discussed how lessons from previous KPI Frameworks had been applied in the development of the M&E Framework. Members asked what the M&E Framework considered and what information was available to the Board. They discussed the specific KPIs that had been proposed through the measurement consultations and issues to be flagged to the Committees for guidance. Finally, they wanted to know which early proposals for the Multi-Year Evaluation Calendar would be taken forward under the new evaluation function and what progress had been made on the transition to the new function.

The Secretariat sought the Board's inputs on: (i) an overall steer to refine the M&E Framework; (ii) how comfortable Board members were with the proposed KPIs, as well as the "strategic steer" (presumably, the direction which the Secretariat wanted the Board to agree to) in relation to certain KPIs; and (iii) other comments to be considered in developing the first draft of the Multi-Year Evaluation Calendar.

Both the historical and current challenges of measuring performance on diverse and often hard-to-measure topics in a highly contributory environment have informed the approach to M&E Framework development, as depicted in the figure below.

Figure 1. Measurement challenges addressed by the new M&E Framework

Stakeholder feedback

People welcomed the new Independent Evaluation Panel (IEP) and look forward to the Chief Evaluation and Learning Officer's (CELO) arrival. Their leadership, independence and adequate resourcing/support will be important for ensuring the M&E Framework is fully harnessed to improve learning, accountability, and programmatic assurance to the Board.

Constituents appreciated the considerable effort that has gone into continuous improvements in how the Global Fund is able to track performance and results. They noted in particular the extensive work to embed the KPIs within a wider and more coherent M&E framework that is better rooted within the Secretariat's wider work and prioritises evidence-based decision-making. They especially appreciated the closer link between grant-performance and partnership-level indicators and the initial thinking on how evaluation or other qualitative analysis complements quantitative measures. For significant measurement areas that are not suitable for a KPI, it will be important to understand how they are covered by other parts of the M&E Framework and complement the KPIs.

In general, people commented that the framework is robust and covers most important strategic areas. In particular, they felt that the efforts around creating new sets of KPIs for community engagement, pandemic preparedness and response (PPR) and integrated and people-centred systems are very promising. They especially commended the introduction of a Gender Marker throughout the Framework.

Many of the points on re-orienting performance incentives and reporting towards programmatic outcomes/impact highlight the importance of the new M&E Framework, including KPIs. As work on these progresses, the Secretariat needs to articulate clearly how it will use the M&E Framework to give the Board a fuller picture of what is working (or not) based on lessons across different sources of evidence, types of information (qualitative and quantitative), and levels of accountability (e.g., how grant indicators "roll up" to KPIs), and to trigger timely responses.
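To make the idea of grant indicators "rolling up" to a partnership-level KPI more concrete, the short sketch below is purely illustrative; the grant names, targets, weights and capping rule are assumptions for the example, not the Secretariat's actual aggregation method.

  # Hypothetical roll-up of grant-level results into one partnership-level KPI.
  # All names, figures, weights and the capping rule are illustrative assumptions.
  grants = [
      # (grant, result achieved, target, weight, e.g. by budget share)
      ("Grant A", 900_000, 1_000_000, 0.5),
      ("Grant B", 450_000,   500_000, 0.3),
      ("Grant C", 150_000,   250_000, 0.2),
  ]

  # Cap each grant's achievement at 100% so over-performance on one grant
  # cannot hide under-performance on another, then take the weighted sum.
  kpi = sum(w * min(result / target, 1.0) for _, result, target, w in grants)

  print(f"Illustrative partnership-level KPI: {kpi:.0%} of target achieved")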

Many participants requested that the Theory of Change referenced in the documents be shared more explicitly with the SC and Board and then used to inform the evaluation calendar to fill gaps in knowledge and the understanding needed to advance objectives.

The distinction between partnership-level indicators and grant-performance indicators, and the strengthening of RSSH indicators, were viewed as positive. However, while stakeholders were happy to see more indicators on health system strengthening and equity/gender/human rights barriers, there are still relatively few compared to disease-related indicators, particularly lower down the "conifer of control". They said there are still opportunities to harmonise indicators with those of other global health funders.

One aspect that people felt was missing is the measurement of partnership/collaboration with other partners, in particular signatories of the Sustainable Development Goal (SDG) 3 Global Action Plan (GAP). It features neither among the KPIs nor in the multi-year evaluation calendar. Stakeholders expect the Secretariat to propose concrete measurements to capture progress on this important strategic enabler (e.g., on the alignment of strategies with other actors, on collaboration under the SDG 3 GAP and on joint activities).

The Framework does mention "partner versus grant performance". If partner performance includes all Global Fund grant recipient countries, regardless of whether the grant covers the areas corresponding to the partner KPIs, how does this relate to tracking Global Fund performance? The KPI is an aggregate, so this is less useful in making a statement on performance. What, they wanted to know, is the definition of partner performance?

On RSSH integration and quality metrics, there was some contention. For example, some stakeholders strongly recommended that the Fund think again about restarting health facility assessments (HFAs), saying that it is likely to be particularly difficult to collect data on the proposed indicators or to determine sampling, and that the quality of the data may make them unusable. They wondered about the cost implications of the Global Fund funding HFAs and the risks of failing to produce useful data.

Others, while in principle supporting the RSSH health facility survey and reiterating the importance of measuring progress on this key priority, echoed the need for costs and the reporting burden to be kept to a minimum, and asked to hear implementers' views on this proposal. However, others were strongly in favour of dedicating additional resources, if necessary, to allow surveys and studies to be undertaken to monitor health facilities and assess the Global Fund's impact on health systems.

Questions were raised concerning the denominator of persons on antiretroviral therapy (ART), which does not account for access to viral load (VL) testing. If 30-40% of the population on treatment do not have a VL test performed during a year, the suppression rates will be artificially low. Using the number of VL tests performed in a specific time period as the denominator, with the number of suppressed results as the numerator, provides a more accurate representation of viral load suppression (VLS) among those with access to testing. Among those without access to testing, many patients may in fact be suppressed but have no way of confirming this. Why, stakeholders asked, is the threshold determined with a maximum or minimum? The target should be 95% suppression as per the Global AIDS Strategy.
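The arithmetic behind this concern can be shown with a short, purely illustrative calculation; the figures below are invented for the example and are not Global Fund or country data.

  # Hypothetical illustration of the viral load suppression (VLS) denominator issue.
  on_art = 100_000      # people on ART during the year
  tested = 65_000       # people who received a VL test (i.e. 35% untested)
  suppressed = 58_500   # tested people with a suppressed result

  # Denominator = everyone on ART: untested people are counted as not suppressed,
  # so the rate is pulled down even if many of them are in fact suppressed.
  vls_all_on_art = suppressed / on_art      # 58.5%

  # Denominator = people actually tested: suppression among those with access to testing.
  vls_among_tested = suppressed / tested    # 90.0%

  print(f"VLS, all on ART as denominator: {vls_all_on_art:.1%}")
  print(f"VLS among those tested:         {vls_among_tested:.1%}")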

Regarding 'know your status', it would be prudent to acknowledge that this is a proxy and not always accurate unless duplicate testing has been eliminated. This is further complicated by repeat testing in HIV prevention programs, where staying negative is the goal and is confirmed through routine testing.

Regarding the draft evaluation calendar, stakeholders noted that none of the proposed evaluations focuses explicitly on Global Fund business model practices and how they operate to deliver on priority programmatic areas. Participants recommended that this focus be explicitly built into the proposed evaluation topics.

Moreover, since these evaluations are key to informing SC and Board decisions, many stakeholders strongly disagreed with the proposal to cut the number of annual evaluations from eight to between four and six. Rather than a reduction in evaluation ambitions, stakeholders would like to see an assessment of the Global Fund's M&E capacity before voting on the forthcoming decision on the 2023 OpEx. For example, was the Technical Evaluation Reference Group (TERG) budget sufficient for its ambitions? And how much of the OpEx is dedicated to evaluation (internal staff and TERG, costs of the evaluations themselves, dissemination activities, etc.)?

On financial KPIs, people had doubts about the proposal to create a "share of OpEx" KPI. They strongly questioned the so-called "cost-effectiveness" paradigm, which is contrary to the values promoted by the Global Fund. Indeed, this paradigm leads to a rationale of concentrating technical resources on those already benefitting most from Global Fund funding, which is contrary to both the principles of efficiency and equity endorsed by the Global Fund. Some therefore felt they could not support the introduction of this indicator as a KPI based on a principle of cost-effectiveness at the expense of qualitative aspects. Nevertheless, they also considered this indicator useful for monitoring OpEx. It could be more detailed (the share of expenditures dedicated to monitoring and evaluation and to assurance in particular) and could be included in other M&E frameworks as a mere business-process metric instead of a KPI. Indeed, some stakeholders strongly believe that an extreme control of OpEx would be inconsistent with "strategic ambitions" and would not reflect the organization's performance.

Regarding the overarching M&E framework, stakeholders expressed disappointment concerning the lack of information provided on the monitoring and evaluation of business processes. In general, they did not support the notion that the KPIs and the multi-year evaluation calendar are the only two items which the Board should be allowed to see and comment on. They pointed out that the SC has repeatedly asked for disaggregated data, including on grant performance and the Strategic Initiatives results frameworks. They wanted to know what prevented the Secretariat from sharing this important information with the Board and Committees. See Article 3 in this GFO, 'Transparency is a founding principle of the Global Fund: so where is it?', for more on the need for better communication practices.

The indicator for the introduction of new products is important for implementing and monitoring the NextGen Market Shaping approach. In addition to the quantity of innovative products, the type/need of the product should also be covered in order to ensure that new products are having a positive effect on the Fund's mission.

As data quality and availability have been, and continue to be, a global challenge, some stakeholders wanted to know what strategies and "facilitators" the Global Fund will offer countries for improving the generation and use of high-quality data, in order to enable reporting and foster a better Strategy performance outcome.
