Evaluating the effectiveness of non-government and government projects and programs is a minefield.
But a four-year research project is making it a bit easier.
A team of researchers from RMIT’s Centre for Applied Social Research has led the development of a new online resource that brings together information on evaluation methods previously scattered across websites, unpublished reports and undocumented personal knowledge.
The result is BetterEvaluation, which provides access to information about more than 200 different evaluation methods and processes through a website, webinars and live activities.
Structured around a rainbow of evaluation tasks, the resource helps users choose the methods or processes they need and use them well.
The website is being accessed by more than 10,000 users worldwide each week and is supported by more than 2,000 individual registered members.
Project director Professor Patricia Rogers said the project documented the collective wisdom of people doing evaluation across different sectors, organisations, regions and languages.
“We’re very fortunate to be linked to many expert individuals and organisations across the world who are generous in sharing their knowledge,” she said.
RMIT researchers Dr Greet Peersman, Nick Herft, Kaye Stevens, Alan Mountain, Julia Laidlaw and Farida Fleming are part of an internationally dispersed team working on the project, which includes contributors from the other three founding partners: the Overseas Development Institute (UK), Institutional Learning and Change (an initiative of the Consultative Group on International Agricultural Research, based in Italy) and Pact, a Washington-based NGO that supports local capacity building worldwide.
The project is also working with evaluation associations, development agencies and individual volunteers from across the world.
Recent related projects have included the development of an Evaluation Toolkit for the NSW Government, guidance on impact evaluation for UNICEF program managers and the South African Government, and advice on developing terms of reference for USAID program managers.
One of the focus areas is how to evaluate programs and policies in complex and complicated situations, using methods such as theory of change, systems approaches and complexity theory.
“Simple input-output-impact factory-line models no longer fit how programs work,” Rogers said.
“We need evaluation methods that can encompass complicated programs with multiple contributors, multiple levels and multiple causal pathways.
“And we need methods that can encompass complex programs which are inherently emergent and changing, where traditional methods of planning need to be replaced by more real-time learning and adaptation.”
The project has received financial support from the Rockefeller Foundation, Australia’s Department of Foreign Affairs and Trade, Canada’s International Development Research Centre, the Ministry of Foreign Affairs of the Netherlands, and IFAD (the International Fund for Agricultural Development, a specialised agency of the United Nations).
A spokesperson for the Department of Foreign Affairs and Trade said its Office of Development Effectiveness supports the BetterEvaluation website because it sees great benefit in a free site that brings together comprehensive, high-quality and engaging evaluation tools and guidance from around the world.
“A number of DFAT staff helped to trial and test the website and ensure that it is user-friendly and meets their needs. We have enjoyed working with RMIT on the BetterEvaluation partnership.
“BetterEvaluation has been flexible and responsive and our cooperative work has included a small project with the Indonesian Development Evaluation Community, developing local language evaluation resources and case studies. We look forward to our ongoing partnership.”
Story: Louise Handran
Photo: Carla Gottgens
This story was first published in RMIT’s Making Connections magazine.