After going through a competitive recruitment process for a dream position in a tropical locale, you finally get the call—you’ve landed an Australian Volunteers for International Development (AVID) role.
In the several weeks or months between getting the gig and going to pre-departure training, you wonder what it will be like. Perhaps you communicate with your host organisation, who tell you the ten million things they would like you to do on assignment. Perhaps you talk to returned volunteers who have romanticised their experience to the point where it could be a paperback novel. Perhaps every person you meet and talk to about your upcoming adventure gushes praise upon you like you are a modern-day Mother Teresa, piling on tired clichés that you are going off to save the world.
You take all of this with a grain of salt. But you are excited about what lies ahead.
When you finally get to pre-departure training, this is the main piece of advice on what to do over there:
“Don’t have any expectations!”
OK, no expectations. Easy to erase, right?
The next part of the briefing?
“Here are all of the things that we expect from you…”
Expectations play a huge part in the Australian volunteers program. There are not only the expectations of the enthusiastic volunteer. There are the expectations of the host organisations, the partner organisations delivering the program and the Australian government. And that’s before we even get to the expectations of recipients, or partners of the host organisation, or communities.
This is hardly unusual in aid. At the start of any project or grand endeavour there are always big expectations, couched in the foreign language of proposal-writing land. And the expectations held by different stakeholders might not always match up.
One of the things that sets the volunteer program apart from numerous others in the aid program is that it runs largely on goodwill. Instead of well-paid consultants or experienced staff implementing a program, you have everyday people signing up—fresh-faced 20-somethings, retirees and anyone in between. Sure, they get something out of it—living expenses covered, career development opportunities and the intangible, more spiritual remuneration that comes from doing good, taking on a challenge, absorbing and living within a new culture and building new relationships and friendships. But at the end of the day, volunteers are volunteering—if there wasn’t a significant element of goodwill involved then surely these volunteers would be searching out more lucrative roles.
The recent evaluation of the volunteer program by the Office of Development Effectiveness (ODE) at DFAT raised many questions—my colleague Stephen Howes has already critiqued the evaluation in a series of blog posts (and you can listen to a podcast of the discussion at our recent forum here), so I won’t go over the same ground.
But one question that sprang to mind from reading the evaluation and Stephen’s analysis was: how much should we expect from volunteers?
How far can you push the program towards aid world professionalism before you deplete the goodwill that underpins it?
And how could some of the mismatches in expectations between the different stakeholders in the program actually be addressed?
Firstly, as Stephen explored, the on-paper capacity building message often doesn’t meet the expectations of host organisations, nor match what most volunteers actually end up doing. This makes things harder for everyone involved.
For example, the evaluation notes that failing to build capacity was the major source of dissatisfaction for volunteers.
Volunteer dissatisfaction is mostly related to an inability to be effective in developing capacity of their host organisations, mainly because their organisation was poorly functioning, inadequately prepared to host a volunteer or had different expectations of the volunteer’s role in their organisation. (p.41)
Is it reasonable to expect volunteers to achieve long-term organisational capacity development, something that professional consultants and aid managers often struggle to do themselves?
The evaluation suggests a move towards role-based assignments, which makes sense. But if the institutional capacity focus remains, while the host organisation is instead expecting a freebie in-line worker or someone who is there to share skills (which as Stephen points out, is curiously not considered to be the right kind of capacity building by the evaluation), then this problem will persist.
No job description anywhere ever fully reflects a role, so there will always be a gap between expectation and reality. But ditching this capacity building obsession could cut the spin in assignment descriptions, so that volunteers have a better idea of what they are signing up for and host organisations get the chance to articulate more honestly what they want and need, bringing everyone’s expectations into closer alignment.
If we don’t change this, volunteers will remain stuck between what the Australian aid program expects of them and what their host organisation (HO) is hoping for.
It’s a tricky place to be—especially when you are expected to be a walking, talking ambassador for your country as well. The volunteer evaluation notes this tension between the capacity building push and the public diplomacy goals of the volunteer program.
…host organisations’ satisfaction was not significantly associated with any activities affecting the capacity of the host organisation as an entity… This finding highlights the inherent tension of AVID’s public diplomacy and capacity development objectives. To satisfy host organisations, volunteers must do tasks that fulfil immediate needs and transfer skills to individuals in the host organisation. Although this approach is immediately satisfying for host organisations and no doubt a positive public diplomacy outcome, it is a limiting proposition for capacity development. (p. 47)
The evaluation never addresses how to reconcile this tension. One way would be to scrap the capacity building imperative: helpful Australians doing what host organisations would like them to do seems quite diplomatic.
But if we insist on institutional capacity building as a focus, volunteers also need some clarity on what their priorities should be: are they there to primarily serve the host organisation; are they there mainly to be nice ambassadors for Australia; or are they there predominantly to serve the aid program’s institutional capacity development agenda? In theory it is possible to do all three of these things simultaneously, but in practice it is much harder. So which one comes first?
Even if volunteers contribute on all of these objectives, which we know many of them do to some extent, how much of this should be a programmatic expectation and how much of this should just be considered a bonus?
If we start being too prescriptive about how volunteers should contribute, then we not only risk a loss of goodwill. We also risk volunteers or HOs simply finding workarounds for unrealistic or incoherent program logic instead of actually following it. The data collected by the evaluation seems to show that this is already happening, and it’s something that I’m sure many volunteers could attest to. Program planning that looks good on paper, but that rarely translates to reality, is probably not doing the volunteer program any favours, nor making a tough gig any easier.
Not that we could track this clearly from the program’s current monitoring and evaluation (M&E) processes.
The ODE evaluation finds that M&E is a major gap. It asserts that none of the three levels of M&E outlined in the program’s shared standards (individual assignments, host organisation capacity against three-year plans, or the contribution of volunteers to development outcomes) is being completely implemented.
End of assignment reports got an extra slap on the wrist. Each core partner uses a different template, and volunteer and host organisation contributions to the reports were found to be “poor quality”.
One of the frustrations that I wrote about in my blog post last year on the volunteer program was writing reports that never seemed to get read. In my six months in-country, I met all the reporting expectations for my assignment: a one-month initial report, a six-month report and an end of assignment report. The reports collected no useful or comparative data, took up a lot of time (not that this was a problem for me, since one of my complaints was that I had too much free time) and, in my experience, weren’t used to address any issues flagged by the volunteer. The feedback loop was broken. So I can well understand why many volunteers wouldn’t bother to fill the reports in comprehensively by the end of a longer assignment.
Like most evaluation reports, this one calls for more rigorous evaluation, complete with Key Performance Indicators (KPIs), a favourite of most M&E advocates.
Streamlining M&E across the core partners is a must, as is improving the quality and usefulness of the information received. But we should be wary of introducing an M&E regime that treats volunteer assignments like consultancies with deliverables. This is especially so when we are asking people to ‘voluntarily’ build capacity in challenging contexts, and when it is essential to avoid placing volunteers in a position where their HOs feel they are being assessed or judged by an outsider who is supposed to be integrating into their team.
Perhaps what the volunteer program needs most is a stripping back of formal expectations and a refocusing on support rather than outcomes. The volunteer program is a good thing, as the evaluation shows, so we should be careful not to strangle it or to burn out the goodwill that it runs on in the race for results.
In my view, we need a volunteer program that creates the best enabling environment for people to go out and do good, by making a positive contribution to development as well as Australia’s image in the region, in line with the needs of host organisations and their wider communities. Getting everyone’s expectations on the same page as much as possible and having practical and realistic objectives for the program would probably be a good place to start.
And, whatever is done to the volunteer program, don’t introduce KPIs! There’s already an oversupply of lofty expectation.