The R4D iXc review

By Stephen Howes
3 October 2019

A report on the innovationXchange (iXc) by Results for Development (R4D), a US NGO, “Experimentation, partnership and learning: insights from a review of the first three years of DFAT’s innovationXchange”, is prominently linked from both the iXc website and R4D’s.

I had done my own review of iXc in mid-2018 (summarised in this blog). I had been drawn to write about iXc because it was the Coalition government’s flagship aid initiative, and Australian aid is one of my core research interests. Why had R4D, a US-based NGO, taken an interest in this Australian initiative, I wondered.

Reading the R4D report took me back all the way to 2012 (I had to check the year) when I criticised a Brookings Report that I thought was unreasonably kind to Australian aid, and which failed to acknowledge that Brookings itself received a significant amount of funding from the very program it was praising.

The R4D report isn’t particularly full of praise for iXc, an initiative I was critical of in my report, and one which generally has a pretty mixed reputation at best. If I had to sum it up in one word, I would say that the R4D report is convoluted. There are no failures, only the wonderfully worded “challenges that have historically undermined impact.” (Note how ‘historically’ points to the possibility that the criticisms are yesterday’s business, and no longer apply today.) There are also “several promising examples”, but no overall assessment. Despite the report’s heading, there are in fact more recommendations than insights. Of course, this wouldn’t be the first evaluation to pull its punches. But I have rarely come away from an evaluation with so little sense of what the authors really thought. There is no summary of findings, let alone an executive summary.

The fact that the iXc report was both so convoluted and completely silent on who had funded it made me suspicious. The R4D report has a section of acknowledgements in which the authors thank iXc staff for their contributions to the report, but not for any funding. I asked DFAT, and sure enough had it confirmed to me that, yes, the Department had paid R4D to carry out the study. It hardly needs to be stated that it is an elementary failure of disclosure requirements for an organisation not to state that the entity it is reviewing is funding the review. Such disclosure is standard practice with DFAT evaluations.

In this case in particular the funding arrangements are highly pertinent. iXc was Foreign Minister Julie Bishop’s baby, and Bishop was Foreign Minister until August last year. Any DFAT-funded review of iXc would have to be an exercise in caution.

The other thing that bothers me about the R4D report is that it doesn’t even mention my own. R4D certainly had time to. My report came out on 2 August 2018; R4D says it workshopped its initial findings late that same month. The R4D report was posted on the DFAT evaluation website in May this year.

Surely somewhere in my 7,200 words there was something worth drawing on, or contesting. I’m not aware of any other review. The R4D report may have been based mainly on interviews, but it does have references and hyperlinks — just not to my report.

Comparing the two reports, I can at least say that mine is more straightforward. I said at the outset of my review that a lack of publicly-available information made an overall assessment impossible. I therefore focused on, “a range of factors that are likely to have undermined iXc performance and reduced its potential.” I concluded that, “iXc has suffered from: insufficient focus; a lack of transparency and learning; too high a political profile; and what I call the downsides of innovation – a disregard for the fundamentals of aid effectiveness, an under-appreciation of prior experience, and an over-reliance on one-shot approaches, with insufficient follow-up.” (You can also see a response from Dr Sarah Pearson, DFAT’s Chief Scientist and Innovation Officer, here.)

As mentioned, the R4D report has no summary of findings, and I honestly could not attempt one myself. But to give just one example of how it might have, but didn’t, engage with my report: the word ‘transparency’ (or any variant on it) does not appear in the R4D report. Is that because R4D thinks iXc is sufficiently transparent, or because it thinks transparency doesn’t matter?

To my mind the R4D review is discredited. A report that is so lacking in rigour that it fails to disclose that it is funded by its subject and fails to acknowledge other work on the subject lacks credibility. The damage to credibility goes beyond the report itself to iXc and to R4D, the latter an organisation that professes to promote learning and practice “analytic rigor”.

The Office of Development Effectiveness (ODE) leads DFAT’s aid evaluation effort. ODE deserves credit for getting more program and project reviews into the public domain. But the quality of DFAT (and earlier AusAID) evaluations is a long-standing problem. An ODE review of DFAT evaluations in 2017 found that only 41% were “good” quality, and only 71% at least adequate. Based on that finding, and this one example, a big effort is needed to improve the quality of DFAT aid project evaluations.

Postscript: Since I shared my draft of this article with DFAT, a disclaimer has been added to the R4D report that it was paid for by iXc.

About the author/s

Stephen Howes
Stephen Howes is Director of the Development Policy Centre and Professor of Economics at the Crawford School of Public Policy, at The Australian National University.
