Could less reporting mean more critical self-reflection? (Credit: David Stewart/Flickr CC BY 2.0)

Public aid performance reporting: could less be better?

By David Green and Kaisha Crupi
30 June 2020

One under-the-radar contribution of Australia’s COVID-19 Development Response Strategy is a new aid acronym for our lexicon: MERLA – “monitoring, evaluation, research, learning and adaptation”. For wonks like us, the strategy makes welcome remarks about the importance of MERLA to “adaptive management” and “dialogue with … partners”, and not just reporting to Australian taxpayers.

In line with previous commitments to “streamline” DFAT’s aid performance framework, the Strategy also signals the replacement of Aid Program Performance Reports (APPRs) with “brief annual progress reports”. Whereas APPRs (about 30–40 pages) cover changes in country context, progress against country program objectives and investment quality issues, their streamlined successor will merely report on “progress with [agreed] management actions”.

In his review of the Strategy, Stephen Howes calls this a downgrade. Indeed, it does seem at odds with the Strategy’s fine words on MERLA. But we are not so sure. Like the farmer in this zen parable, we’re inclined to say this development is “maybe good; maybe bad. Let’s see.”

Our view is informed by a rapid analysis of APPRs we presented at the 2020 Australasian Aid Conference, which looked at whether APPRs were achieving their stated objectives – to “strengthen program management, demonstrate accountability and improve effectiveness”.

Our premise was that, to achieve any of these objectives, APPRs had to be (1) balanced and (2) self-critical. So, we conducted a rapid review of the last four public APPRs for the five largest spending country programs (20 reports), focusing on the sections that described progress towards objectives.

On the first score – balanced assessment of both successes and shortcomings – the results were telling. Across the 20 reports we reviewed, about 145,000 words described achievements, and just 5,000 described shortcomings. So, in aggregate, just 4% of words were devoted to describing what was not going so well. This imbalance was fairly consistent across the sample, with 16 of the 20 reports devoting 6% or less of their words to shortcomings.

On the second score – whether reports were self-critical – the picture was more mixed. We took a closer look at the 4% of words devoted to shortcomings, and unpicked whether they were linked to issues:

  • within DFAT’s control, e.g. implementing partner program management, DFAT budget cuts; or
  • outside of DFAT’s control, e.g. partner government priorities, political instability.

In aggregate, 26% of the described shortcomings were linked to issues that were within DFAT’s sphere of control. The rest were either attributed to external factors or not linked to any cause, leaving it unclear what could or should be done to improve things.

We struggled to find equivalent examples of public reporting at the program (rather than departmental) level elsewhere in the Commonwealth. One example we did review – the Closing the Gap Report 2019 – was even less balanced than the APPRs in our sample. Just 2% of its words were devoted to shortcomings, and 17% of these were attributed to internal issues.

While we are the first to admit the limitations of our rapid review, we also think the findings raise questions about the place of APPRs in the aid performance framework.

The COVID-19 development response adds to these questions, given the “navigation by judgement” that it requires. Now more than ever, there is a need for active and adaptive management by DFAT of its country programs. From this perspective, labouring over lengthy annual public reports is going to be less useful than more regular sense-making, learning and adaptation among DFAT, its counterparts and implementing partners.

One among many tools that can support this adaptation is a learning agenda. As a complement to “performance assessment frameworks” that focus on monitoring predefined indicators of progress, learning agendas define topics or questions that would benefit from shared inquiry, to inform practical day-to-day decision-making. For example, which groups are most vulnerable to COVID-19-related impacts and how are they faring? What policy reform windows are opening due to the COVID-19 crisis? Information is gathered against questions in a variety of ways, depending on what is ‘good enough’ given intended uses and resource constraints. This could range from longitudinal research to rapid evaluations to one-off field visits. Multi-stakeholder reflection processes help to validate this evidence and ensure a range of views are considered.

Investing in learning agendas is not without risks. They can be time- and resource-intensive, become too process-heavy and lose sight of their original intent, or lead to ‘talkfests’ that do not translate into action. However, on balance, and particularly in the current context, we think they are worth trying as part of a broader system for regular review and adaptation.

Of course, robust annual reporting, and more regular internal learning and adaptation, are not at odds. It is possible to do both – or, indeed, neither. So, trimming back APPRs may open up spaces for critical self-reflection and adaptation in some country programs, and close down these spaces in others. Which of these paths is taken will determine whether the Strategy’s welcome commitments on MERLA are backed up by action. As suggested by Dan Honig (and others), the difference may come down to the “agent initiative” of the personnel involved: “If one oarsman changes his stroke, and then others follow, that turns the ship, too.”

Maybe good; maybe bad. Let’s see.

Listen to the podcast “Where’s the dirty laundry? DFAT APPRs and the public diplomacy imperative” from the 2020 Australasian Aid Conference session with the authors, recorded for Devpolicy Talks.
About the author/s

David Green
David Green is a Principal Consultant at Clear Horizon.

Kaisha Crupi
Kaisha Crupi is a Consultant at Clear Horizon.