An ODE to success? 10 years of Office of Development Effectiveness evaluations

Scott Dawson speaking at Devpolicy-ODE Aid Evaluation Forum, April 2016 (image: Devpolicy)

The Office of Development Effectiveness (ODE) within DFAT is this year marking its tenth anniversary.

Back in 2006, the then Coalition government’s White Paper on overseas aid, which came up with the idea, envisaged ODE as a “small, high-profile” office, to “monitor the quality and evaluate the impact of AusAID and, as appropriate, other Australian Government agencies’ ODA programs”. ODE reported directly to the Director General of AusAID.

The 2011 Independent Review of Aid Effectiveness endorsed the approach of having an evaluation unit within the aid agency (as against the British approach of establishing an Independent Commission for Aid Impact) but recommended an Independent Evaluation Committee be established to safeguard ODE’s independence and the quality of its work. The Coalition government abolished AusAID after being elected in 2013, but retained ODE as a branch of DFAT reporting directly to the Deputy Secretary with principal responsibility for aid-related matters.

In the past we have been critical of ODE on several fronts, most notably in November 2011, when one of us (Howes) said it had “gone awfully quiet,” publishing only one report since 2009. The blog ended in typical Devpolicy rallying style:

It can only be described as worrying that the entity set up just a few years ago to monitor the quality of Australia’s aid has only published a single evaluation since 2009, and that this, its last published evaluation, refers to the year 2008-09, now almost two and a half years ago. The next round of scaling up of the aid budget will be more challenging than the first. ODE should grow stronger over time, not weaker. The sooner the Government follows through on establishing the Independent Evaluation Committee the better.

Well, it was worrying, and we do like calls to action, but we also pride ourselves on balance, so here’s some good news. We’ve been looking back at all the evaluations ODE has produced since its creation. The figure below shows the results, with both the annual numbers and a two-year average to reduce volatility.

Number of ODE evaluations published per year (2006–2015)

There is a clear upward trend over the last couple of years. Very few evaluations were published in 2013, an election year and of course the year of AusAID’s abolition, but the bump that followed in 2014 was sustained into 2015. It does seem that ODE has succeeded in putting out more evaluations more regularly. This year is also looking good, with four evaluations already out the door.

Being good social scientists, we’d like to know why. Was it the establishment of the Independent Evaluation Committee in 2012? Was it the integration of AusAID into DFAT, and a resultant more relaxed approach to criticism? As always, it is difficult to know, and probably reflects a mix of things, including the value of sticking with one approach and making it work better, rather than chopping and changing. A vote for incremental reform rather than innovation.

Of course, simply counting the number of evaluations says nothing about their quality. Our perception, having read most of them, and discussed many of them at our twice-yearly aid evaluation fora (related blogs are collated here), is that average quality has also improved.

There is also the point that in the past ODE published other material (such as issues notes) alongside its evaluations, and produced a podcast. This work seems to have reduced since the integration. But if this is a case of shifting from being an internal think tank for the aid program to being a dedicated evaluation unit, as was urged by the Independent Review of Aid Effectiveness (which recommended ODE become an office of aid effectiveness), that may not be a bad thing.

This is a good news story, but it wouldn’t be a Devpolicy post without a complaint, so here it is: we had to use the Wayback Machine (a web archiving tool) to find pre-2012 ODE evaluations, which are not on the ODE section of the DFAT website. What a shame. Surely for its tenth anniversary, ODE should put together a complete collection of its outputs.

Ashlee Betteridge is a Research Officer at the Development Policy Centre, and Stephen Howes is the Centre’s Director. Thanks to Devpolicy intern Sienna Lake for her assistance in compiling some of the data for this post.

We will be holding our next Australian Aid Evaluation Forum with ODE on 2 September at ANU, where we will be discussing two of this year’s ODE reports: one on teacher development, and one on operational evaluations. DFAT will also be presenting its new evaluation policy. More details and registration available here.

Ashlee Betteridge

Ashlee Betteridge is the Program Manager (Research Communications and Outreach) at the Development Policy Centre. She was previously a Research Officer at the centre from 2013-2017. A former journalist, she holds a Master of Public Policy (Development Policy) from ANU and has development experience in Indonesia and Timor-Leste.

Stephen Howes

Stephen Howes is the Director of the Development Policy Centre and a Professor of Economics at the Crawford School. Stephen served in senior economic positions for a decade at the World Bank before becoming AusAID’s first Chief Economist. In 2011 he was a member of Australia’s Independent Review of Aid Effectiveness.
