Karen Jorgensen is Head of the Review, Evaluation and Engagement Division of the OECD’s Development Co-operation Directorate. Since 2006, she has overseen and participated in some thirty OECD Development Assistance Committee (DAC) peer reviews of members’ aid programs, including reviews of Australia’s program in 2008 and 2012 (ongoing). Robin Davies interviewed Karen during a recent visit she made to Canberra.
Robin: What are the objectives of the DAC peer review process and how does it operate?
Karen: We conduct five peer reviews of the members of the DAC every year. They are always conducted by two other members of the committee as examiners, together with a team from the Development Co-operation Directorate, which I usually lead. The objective is to hold our members to account for the commitments they have undertaken, whether domestically or internationally, and particularly in the context of the policy approaches they have agreed to in the DAC. The committee has also made some specific recommendations: one concerns untying aid, while others relate to the composition of members’ Official Development Assistance (ODA) and the purposes for which it is used, which must be developmental.
Robin: Are the reviews standardized? Do they reflect an agreed set of norms and values?
Karen: Yes, in a way they do. When I took up the job there was no standard analytical framework for the reviews. We are now working with a third-generation analytical framework known as the reference guide. In fact, this latest version was only adopted in July and we are applying it for the first time in the current review of Australia. This framework sets out the analytical components of a review – the areas that we analyse in all reviews to make sure that all the members get a fair deal. They all know what they are going to be reviewed against. This approach will give us more comparable reviews of each member over time and also between members.
It’s important that there’s a certain core to the reviews – things that we look at all the time. Particularly issues relating to the money that an agency or a country has available to spend as ODA: how that money is allocated; how rigorously they assess how it is spent; what processes they have in place to learn from their experience and if necessary, to ensure that money is spent better and more effectively in the future.
So we look at the political interest in spending money in key partner countries, but also at performance and how the money is allocated to sectors, countries and regions. We also look at the capability of our members to manage their aid programs in accordance with international principles. These things form the core of each review.
We also try to be current, looking at what issues are really important in the aid business at any given time. For the last four years we have looked at how our members work with the international aid effectiveness principles: promoting ownership; promoting alignment; using country systems; working in partnerships; and harmonising to lower transaction costs for partner countries.
In other words, we try both to retain a core to the reviews so that they are comparable over time, and to be topical and give good guidance on issues that are currently important.
Robin: You mentioned the adoption of an explicit framework for conducting peer reviews. Since you took up your role in 2006, what other major changes have occurred in the way that reviews are conducted?
Karen: We have seen that people from bilateral donor agencies who serve as examiners of other members are more centrally placed in their agencies. They are often quite senior people, which is really important because we have mutual learning as one of the key objectives of the peer review process. Senior people are able both to share lessons on how they have managed their programs and tackled problems, and to take lessons from the reviews they are conducting and implement them in their own agencies. So getting more senior people to serve as aid reviewers is really important.
I think we’ve also made progress in moving towards a set of common reference points for the reviews. They are not yet standards, strictly speaking, but that progress is something that is really important for the reviews.
Robin: Do you have to differentiate your approach to some extent when you’re dealing with smaller donors? I’m thinking for example of the Republic of Korea, the newest member of the DAC.
Karen: We apply a couple of different principles to the reviews. One is to be critical in the way we conduct reviews, because if we’re not, we’re probably not credible. Another is to be respectful, which really goes to the heart of understanding the context in which the aid program sits in a country. That means understanding both the domestic political context and the country’s relationships with other countries. It’s also about understanding the maturity of the program itself.
While we measure all members against the same yardstick in principle, the review process will play out differently in different countries. There are great divergences in the capabilities of our members to undergo the reviews, and also in the size of their programs, which gives a completely different context. There is one context in Greece and another context in New Zealand, and there is a completely different context when we review the United States or the European Union.
Robin: The Multilateral Organization Performance Assessment Network (MOPAN) has recently found a permanent home within the OECD Development Co-operation Directorate. It has an established way of operating and its own staff. Do you see, over time, some convergence of the review processes for bilateral and multilateral organisations?
Karen: It’s important to remember that our job is to run peer reviews of bilateral donors, and if we want to do peer reviews, we can really only do that with bilateral donors looking at bilateral donors.
We do however look at how members assess their multilateral contributions – how they review the effectiveness of their multilateral partners and how they base their allocations on the performance of those partners. We also look at whether our members pay attention to what their multilateral partners are doing and how that is relevant for what our members want to achieve with their own aid programs.
Additionally, we ask multilateral agencies what they think of our members as multilateral donors: How do they contribute to the multilateral agencies’ own performance and programs? Are they behaving as good donors within those agencies? Those questions are very important and I think we can get some convergence on that score. It’s looking at performance going both ways, and that’s what I’m hoping we can gain from having the MOPAN secretariat in our directorate.
Robin: I’m interested in your engagement with some of the emerging donors. Have you done anything resembling peer reviews at their request or shared your experience with them in other ways, drawing on lessons from DAC peer reviews?
Karen: Yes. We’ve done that with a number of new and upcoming donors. Particularly with the members of the OECD who are not yet members of the DAC. We’ve run special reviews with five of them. Korea, incidentally, was one country that had a special review to help them build their performance and their systems to become a member. Others who have undergone special reviews are Poland, the Czech Republic, Slovakia and Slovenia.
I think the work we’re doing with China is really exciting in the context of the China DAC Study Group, which looks at learning from each other, because China already has 60 years of experience working in partnerships for development. The study group has looked particularly at China in Africa, but also at our donors in Africa, over a two-year period. Now we’re beginning to look more at experience managing aid and managing development cooperation relationships. We’ve also worked with China on evaluation practices, which they’re particularly interested in, and we’ve talked a lot to them about how you collect statistics on your aid spend or your ODA.
We’ve also had seminars in India on statistics and how to manage aid. There are a lot of opportunities to work with emerging donors on triangular cooperation, which is something that the DAC is becoming much more interested in.
Robin: Are there some outstanding examples in your experience where recommendations from a peer review have had a striking impact in terms of a member government’s aid or development policies?
Karen: I’ve learned over time that the peer reviews rarely work in isolation. Where they are most effective is when they can lend a hand to an ongoing process, or where a review that has been initiated domestically follows what we’ve done and can give additional impetus to some of our recommendations.
This goes back some time but in the cases of Germany, New Zealand and also Japan, we have seen reviews influencing significant organisational changes that probably would not have happened if the governments had not already been going through public sector reform processes.
It is incumbent on us to understand the context and know where we can pitch a review, so that it will have the maximum influence on all the processes that might be going on.
Robin: I know you are not in a position to say anything about this year’s peer review of Australia’s aid program, which is ongoing. However, can you say a little about the major recommendations from the last review (2008) that you’re particularly tracking?
Karen: There was already a signal in 2008 about the growth of the program, so we’ve been interested to see how Australia’s program has grown in terms of money, where the money is spent, and how well it’s managed. And it’s been clear that there has been a phenomenal rise in the aid spend, which is really good news of course, and that a lot of effort has been put into designing the processes that are needed – getting staff in place, ensuring that the money is well spent – and so on.
Those are the areas that we’re looking at. There’s a lot of work in progress, which is very encouraging. We’re seeing changes on a wide range of issues that should give assurance that the program is well run. We will be looking forward to seeing how that settles down over the next couple of years.
Robin Davies is a Visiting Fellow at the Development Policy Centre.