The value of learning: understanding and measuring the impact of KM in international development

  • Stacey Young, USAID
Keywords: impact, evidence, networks, knowledge sharing, lessons learned

Abstract

We need a demonstrable evidentiary basis for understanding what works and what doesn’t in international development, and we need to use that evidence to guide programming decisions. The challenge is that some things are easier to measure than others, and so we tend to focus on the results and impacts that are easy to measure. Neither ‘evidence’ nor ‘results’ are limited to phenomena that are easily measurable, but we tend to lose track of this fact: we let the proxy of our limited definition of evidence stand in for what it was originally supposed to suggest, namely results. We often make a related error, focusing on the proxy of a static plan instead of on actual, dynamic implementation contexts and processes. Static plans are easier to develop and implement than dynamic ones, but – just as easily measurable evidence isn’t necessarily the most important evidence – easily implemented static plans aren’t the most effective ones. We need to develop methods for capturing, assessing and understanding the value we create by investing in learning, and this is what the KM Impact Challenge attempted to do for the field of knowledge management and learning for international development. Relatedly, to be more effective we need to be more dynamic and adaptable in our strategy, design and implementation, which in turn requires that we place more emphasis on sharing knowledge and learning about new technical learning, tacit/experiential knowledge and contextual knowledge, so that we and our implementing partners can learn and adapt for maximum aid effectiveness.

Published: 2019-09-06