Where have all the evaluations gone?

Written by Stephen Howes

In 2006, when the Howard Government released its aid White Paper to guide the expansion of the Australian aid program from just over $2 billion to $4 billion by 2010, it announced the establishment within AusAID of the Office of Development Effectiveness (ODE) to "improve the effectiveness of Australia's aid program" and to "monitor the quality and evaluate the impact of AusAID."

In November 2009, as part of its review of the management of the expanding aid program, the Australian National Audit Office commended AusAID on the establishment of the ODE, and gave a positive verdict on its impact. The ANAO recommended that ODE publish a forward work plan of evaluations, and that management responses be included in all evaluations. AusAID agreed to both recommendations.

Since then, however, ODE has gone awfully quiet. Only one evaluation has been published by ODE since 2009: the 2009 Annual Review of Development Effectiveness, released in November 2010, but actually covering the 2008-09 fiscal year. (It didn’t contain a management response.)

Only one other 2010 report can be found on the ODE evaluation website (and none for 2011). This is an input for the OECD review of the Paris Declaration, but is not an evaluation itself. It is more an explanation of recent trends and policies in the Australian aid program.

ODE has put up a workplan for 2011-12. This currently says we can expect soon to see a rush of evaluations published, with an evaluation on civil society engagement due for completion in October 2011, evaluations on the Philippines and rural development in November 2011, and an evaluation on the response to HIV/AIDS in December.

Further inspection of the ODE website reveals that these evaluations have all been a long time in the making. The civil society evaluation dates back to at least June 2009, while the rural development evaluation was originally scheduled for release in May 2010. The HIV/AIDS report was intended to have been published in December 2010. No detail is available on the Philippines evaluation, but the draft evaluation is quoted in the April 2011 report of the Independent Review of Aid Effectiveness, so it has obviously been around for a while.

It is said that justice delayed is justice denied, and the same might be said for evaluations. Aid evaluations are critical tools for promoting learning and accountability in what can often be an “out of sight, out of mind” industry. The most common complaint about them, or perhaps evaluations in general, is that by the time they are produced they are out of date, and that the world has since moved on. These particular ODE evaluations were clearly not intended to be such long and drawn-out affairs (the ODE workplan now on the web has been recently revised, with planned completion dates pushed back presumably on account of the delays in getting the evaluations out). One can only speculate what the causes of delay have actually been, but their effect is to undermine both the relevance and the credibility of the evaluations concerned.

More and more official donors, bilateral and multilateral, are moving in the direction of independent evaluation. The Independent Review of Aid Effectiveness, in which I participated, recommended that ODE’s evaluations be subject to an Independent Evaluation Committee, which would ensure both quality control and independence for the evaluations. By giving this Committee rather than AusAID responsibility for the oversight of ODE’s evaluations, both independence and the prospects for timely release should be enhanced. The Government has endorsed this recommendation, though the Committee has not yet been announced.

It can only be described as worrying that the entity set up just a few years ago to monitor the quality of Australia’s aid has only published a single evaluation since 2009, and that this, its last published evaluation, refers to the year 2008-09, now almost two and a half years ago. The next round of scaling up of the aid budget will be more challenging than the first. ODE should grow stronger over time, not weaker. The sooner the Government follows through on establishing the Independent Evaluation Committee the better.

Stephen Howes is Director of the Development Policy Centre at the Crawford School of Economics and Government, ANU.



4 Comments

  • I think the answer is quite simple, actually: "Paris and Accra, bye bye" seems to be the mantra many at post repeat in silence, holding their breath, while pretending to do the service delivery that the governments of recipient countries should, with the help of post, do.

  • Satish and Tess, Thanks for your comments on my blog.

    On Tess’s point, ODE does a good job in terms of being transparent about which questions it is asking. For some of its evaluations (I noticed rural development and civil society) you can see their terms of reference up on the ODE website.

    On Satish’s point, it’s not the case that ODE was modelled on the Bank’s Evaluation Department (now called the Independent Evaluation Group or IEG). IEG reports to the Bank’s Board, not to its President, and so is able to maintain its independence. There is no such protection for ODE, since it reports to the head of AusAID. I don’t see the Independent Evaluation Committee as another layer of bureaucracy, but as a mechanism for ensuring independence and timeliness.

    Which takes me back to Tess’s question. The IEG produces a lot more output than ODE, but then it’s a much bigger outfit. In fact, it does a brief review of every completed Bank project, not to mention a number of country evaluations, thematic reports, and an annual aid effectiveness review. A more relevant comparison is on timeliness. I don’t have the details, but, as far as I can tell, IEG manages to avoid long delays around the release of its reports. It certainly manages to get its annual review out every year, something ODE has struggled with.

  • I can’t help feeling that evaluating the delivery of aid is a bit like the old story of tourists asking for directions in Ireland to be told “Well I wouldn’t be starting from here if I were you”. It is often hard to be sure that the fundamental questions have been framed properly before the “busy work” of evaluating has begun. It would be good to know how ODE compares with similar outfits such as the World Bank’s Evaluation Department referenced by Satish in terms of output, timeliness, etc.

  • Thanks to Stephen for seeking explanations for the silence of ODE. I mused at the creation of ODE, as I do now, on whether this function should be housed within AusAID rather than in the Australian National Audit Office (ANAO). The merits of placing ODE within AusAID were to allow access to in-house data. ODE’s independence was to be protected through an organisational structure modelled on the World Bank’s Evaluation Department. With Stephen’s thoughts now, is it time to fold ODE into ANAO? ANAO, after all, is independent, reports directly to parliament, and is mandated to provide: “independent assessment of selected areas of public administration, and assurance about public sector financial reporting, administration, and accountability”. This may be a superior strategy to the creation of yet another layer of bureaucracy, in the form of an Independent Evaluation Committee, to oversee the functions of ODE.
