Who We Are

IDRC's Evaluation System
Our Principles for Evaluation
Strategy 2000-2005
Our Vision
Our Mission
Our Outcome Challenges
Our Team



IDRC's Evaluation System

Evaluation at IDRC is decentralized, with Program Initiatives responsible for their own evaluation work; the Evaluation Unit provides central coordination and support. Evaluations are carried out to fulfill specific goals or to respond to specific needs. Program Initiatives are required to have performance frameworks and evaluation plans, and to include evaluation results in their progress reporting.

A companion table outlines the project-, program-, and corporate-level data collection components of IDRC's evaluation system.



Our Principles for Evaluation

The Evaluation Unit has captured the evaluation principles by which the Centre operates as follows:
  • Evaluations should enlist the participation of relevant users. To be useful, evaluations need to produce relevant, action-oriented findings, and this is fostered by sustained involvement and ownership by the client and users throughout the process.
  • Evaluation processes should develop capacity in evaluative thinking and evaluation use. Be it IDRC managers, program staff, or project partners, evaluation should increase participants’ capacities and comfort with evaluation. Exclusive reliance on external expertise can limit an organization’s ability to be clear and specific about its goals and to learn and apply lessons. Specific strategies can be built into evaluations that are explicitly aimed at fostering these organizational characteristics.
  • Evaluative thinking adds value from the outset of a project or program. Evaluative thinking can make a project or program more effective by helping clarify the results to be achieved, the strategies that will contribute to their achievement, and the milestones that will demonstrate progress. This is true from design through implementation.
  • Evaluation should meet standards for ethical research and evaluation quality. To ensure the validity of evaluation findings, accepted social science research methods and procedures should be followed. The quality of evaluations is assessed against four internationally accepted standards: utility, feasibility, accuracy, and propriety.
  • The decision to evaluate should be strategic, not routine. Evaluation is designed to lead to action and can contribute to decision-making and strategy formulation at all levels. To ensure that evaluations provide useful findings, the Centre is selective in determining which issues, projects, and programs are assessed, and when.
  • Evaluation should be an asset for those being evaluated. Evaluation can impose a considerable time and resource burden on partner organizations, and their participation should not be taken for granted. Partners should benefit from the process and should have control over the evaluation agenda when they are the intended users.



Our Vision

Useful evaluation that promotes innovation and social change.



Our Mission

To realize this vision, and in support of the Centre’s mandate, the Evaluation Unit promotes methodology development and processes of evaluative thinking that balance the opportunity for learning with the needs of accountability. Following its guiding principles, the Unit works with IDRC partners in the field, IDRC program staff, and IDRC managers to strengthen the use, influence, and quality of evaluation by engaging in four areas:

  1. Strategic evaluations;
  2. Capacity development;
  3. Tools and methods development and use;
  4. Organizational learning processes.



Our Outcome Challenges

In order to understand the implications of our mission and establish a means of monitoring progress, the Evaluation Unit has set the following outcome challenges for each of our three primary boundary partners. The outcomes describe the changes we will try to help bring about in the behaviour, relationships, activities, and/or actions of our boundary partners. See Outcome Mapping for a more in-depth description of the methodology.

Outcome Challenge: IDRC Partners
IDRC Partners promote and include utilization-focused evaluation in their projects, programs, and organizations to influence research-for-development activities, innovation, and social change. These partners actively engage in opportunities to strengthen their ability to think evaluatively and to improve their understanding and use of monitoring and evaluation processes for their own needs. They engage in and support high-quality evaluation, are recognized as regional experts, and provide mentoring, training, and technical advice to others.

Outcome Challenge: IDRC Program Staff
IDRC Program Staff promote and include high-quality, utilization-focused evaluation in support of their programs and projects while also participating in corporate-level evaluation processes. They engage in opportunities to build their own monitoring and evaluation capacities and to systematically build the capacity of those with whom they work to think and act evaluatively. IDRC Program Staff work collaboratively with the Evaluation Unit and with their Southern partners to develop innovative evaluation processes that respond to the needs of diverse programs.

Outcome Challenge: IDRC Senior Management
IDRC Senior Managers actively demonstrate stewardship of a culture of evaluative thinking by maintaining a balance between learning and accountability within the Centre. They are active and persistent in their engagement in corporate evaluation processes: they request evaluations, reinforce the development of high-quality evaluation work, and use the findings in their ongoing management of IDRC.



Our Team

Fred Carden, Director
Fred joined IDRC's Evaluation Unit in 1993 and became the Director in March 2004.  He has written in the areas of evaluation, international cooperation, and environmental management.  He has taught and carried out research at York University, the Cooperative College of Tanzania, the Bandung Institute of Technology (Indonesia) and the University of Indonesia.  He holds a PhD from the Université de Montréal and a Master’s degree in environmental studies from York University.  His current work includes the development of evaluation tools and methods in the areas of organizational assessment, participatory monitoring and evaluation, program evaluation and outcome mapping.  Recent co-publications include Outcome Mapping, Organizational Assessment, and Evaluating Capacity Development.

Sarah Earl, Senior Program Officer
Sarah has worked at IDRC since 1998. She has carried out research and worked in Eastern Europe and the former Soviet Union. She holds a Master’s degree in Russian politics and development from Carleton University and an MA in Russian history from the University of Toronto; her research focused on the role of the intelligentsia in Russian democratization efforts. Sarah led the conceptual development of outcome mapping and has authored various publications on the methodology and its use by projects, programs, and organizations. She now supports knowledge activists in using outcome mapping to research the social dimensions of development assistance and improve their effectiveness. Her current work also includes designing and implementing organizational learning processes, researching international knowledge networks, and developing the evaluation capacity of research and non-government organizations. She has extensive experience in group facilitation and training, and has worked in various parts of Asia, Africa, and Latin America. She is a founding member of the Board of Directors of the Sharp New Start Foundation.

Amy Etherington, Evaluation Officer
Amy has been working with the Evaluation Unit since 2003. Previously, she volunteered with participatory development projects in rural agricultural communities in Northern India and on plantations in Sri Lanka. Amy holds a BA in Sociology and is currently working towards her Master’s in Public Policy and Administration (development stream) at Carleton University.

Katherine Hay, Senior Regional Program Officer
Katherine joined the Evaluation Unit in August 2005. She is based in IDRC’s Regional Office in New Delhi, India. She has been working and carrying out research in South Asia for over a decade and joined IDRC in May 2000. Prior to joining IDRC, she worked as a consultant with the International Institute for Sustainable Development and with Canadian and South Asian non-governmental development organizations. Her past research includes analysis of the impact of modernization on gender norms and household power structures in Ladakh, India. Katherine is experienced in social and gender analysis, participatory monitoring and evaluation, and project management; she is also a skilled trainer and facilitator. While at IDRC she coordinated a five-year women’s health and empowerment project (co-funded by the Canadian International Development Agency) that explored using women’s micro-credit groups as platforms for broader social change in rural India. In her regional partnership role, Katherine facilitates strategic partnering and dialogue with international agencies, foundations, government, and the private sector to promote development research and improve donor coordination. Her current interests include the strategic role of organizational partnerships, organizational development and capacity building for applied research institutions, and intersecting issues of gender, empowerment, and citizenship in development research and evaluation. Katherine holds a degree in Environment and Resource Studies from the University of Waterloo and an MA in International Affairs from Carleton University, Ottawa.

Kevin Kelpin, Senior Program Specialist
Kevin joined the Evaluation Unit in May 2004 as a Senior Program Specialist. He has a social science background, with a PhD in Anthropology from the University of British Columbia and an MA in Visual Anthropology from the University of Southern California. He has taught development studies at Wilfrid Laurier University and has worked and undertaken research on community-based natural resource management projects in Nepal and India. His current work and research interests include organizational assessment, participatory monitoring and evaluation, and the capacity-building role of NGOs in community-based natural resource management programs. He has also been involved in the production of documentary films on social and development issues in Nepal, Mexico, and the United States.

Rebecca Lee, Intern
Rebecca joined the Evaluation Unit in July 2005. She recently returned from an internship at a human rights centre in South Africa. Rebecca holds a Master’s degree in Understanding and Securing Human Rights from the Institute of Commonwealth Studies, University of London. Her research focused on comparing and contrasting indigenous rights in Canada and Botswana. Rebecca has volunteered for numerous NGOs in the area of minority rights, specifically pertaining to women, indigenous peoples, and disability rights.

Martine Lefebvre, Evaluation Unit Coordinator
Martine joined the Evaluation Unit in March 2005 as Coordinator. Previously, she worked as a Program Assistant with Pan Asia Networking in the Programs and Partnership Branch. She has a wide range of administrative experience in the education and private sectors.




 Document(s)

Evaluation Unit: Strategy 2000-2005 (Evaluation Unit, February 2000)



