
Guide to Project Evaluation: A Participatory Approach

Population Health Directorate

Health Canada

August 1996


Our mission is to help the people of Canada
maintain and improve their health.
Health Canada

Additional copies in English or French, as well as additional resource materials on family violence, are available from:

National Clearinghouse on Family Violence
Family Violence Prevention Unit
Public Health Agency of Canada
Health Canada (Address Locator: 1909D1)
Ottawa, Ontario K1A 1B4

Telephone: (613) 957-2938
or call our toll-free number:
1-800-267-1291
For TTY users: (613) 952-6396
or call toll-free:
1-800-561-5643
Facsimile: (613) 941-8930
FaxLink: (613) 941-7285
or toll-free: 1-888-267-1233

Published by the authority of the Minister of Health Canada, May 1996

Également disponible en français sous le titre : Guide d'évaluation de projet : une démarche participative

This publication can be made available in alternative formats (computer diskette, large print, audio-cassette, braille) upon request.

Please feel free to photocopy, with acknowledgement to Health Canada, any useful information in this document.
 

Cat. No. H39-355/1996E
ISBN 0-662-24231-9

ACKNOWLEDGEMENTS



The following people contributed their time, energy and creative thinking towards the production of this Guide:

Yolande Samson
Michel Boyer
Tina DeRita-Oliveira
Dina Juras
John Stinson
Penny Mosmann
Brenda Simpson
Lillian Baaske
Carole Legge
Moffatt Clarke
Barbara Brady
Denise Annett
Carol MacLeod
Pat Corbett
Des O'Flaherty  
We would also like to thank the many other Health Canada program staff involved in reviewing this Guide during its various stages of development.

Special thanks go to the writers of this Guide, Donna Denham and Joan Gillespie, of Denham Gillespie Associates, who were instrumental in the design and development of this document.
 

CONTENTS
 
Chapter 1. Introducing the Guide
1.1 Why evaluate?
1.2 Getting started
1.3 A note on terminology 
Chapter  2. Evaluation for Learning 
2.1 Participatory evaluation
2.2 Putting participatory evaluation into practice
2.3 Activity: Introducing a participatory approach to evaluation
2.4 Handout: Principles of a participatory approach to evaluation
Chapter 3. A Framework for Project Evaluation
3.1 The five key evaluation questions
3.2 The five evaluation process steps
3.3 Tools for using the evaluation framework
3.4 A framework for project evaluation 
Chapter 4. Defining Project Work 
4.1 Developing project goals and objectives
4.2 Writing Project Objectives
4.3 Activity: Writing Effective Project Objectives 
4.4 Role of the outside evaluator
4.5 Points to remember
Chapter 5. Developing Success Indicators
5.1 Purpose of success indicators and their measures 
5.2 The process of developing success indicators and their measures
5.3 Activity: Defining success indicators 
5.4 Success indicators for project activity types
Chapter 6. Collecting Evaluation Data
6.1 Determining information collection needs
6.2 Information collection tools
6.3 Sample evaluation tools
Chapter  7. Analysing and Interpreting Data
7.1 Analysing project evaluation information 
7.2 Preparing useful evaluation reports
7.3 Activity: Analysing and interpreting data
Chapter 8. Using Evaluation Results
8.1 Using evaluation results
Chapter 9. Putting It Together 
Appendices
1. Definitions of evaluation terms
2. Annotated bibliography of evaluation resources
3. Framework worksheet for the five key evaluation questions and examples of developing the questions by project activity type 
4. Framework worksheet for success indicators and examples of developing the indicators by project activity type
5. Success indicators of increased public participation and strengthened community groups
6. Reaction sheet for evaluation workshop
7. Guided telephone interview - Community Resource Handbook for Women with Breast Cancer
8. Focus group interview guide - Child Safety Awareness Program
9. Guidelines for keeping a project diary - Child Safety Awareness Program
10. Mail-out questionnaire - Advisory Committee, Health and Disabled Women's Project
11. Mail-out questionnaire - Health Care Providers, Health and Disabled Women's Project
12. We Want Your Feedback

1    INTRODUCING THE GUIDE



Evaluation can be a useful, exciting and important knowledge development tool. This evaluation guide has been developed to help make all these things happen. The goal of this evaluation guide is to provide an easy-to-use, comprehensive framework for project evaluation. This framework can be used to strengthen evaluation skills and knowledge and to assist in the development and implementation of effective project evaluations.

1.1     Why evaluate?

Effective project evaluations can

  •  account for what has been accomplished through project funding
  • promote learning about which health promotion strategies work in communities and which don't
  • provide feedback to inform decision-making at all levels: community, regional and national
  • contribute to the body of knowledge about health promotion
  • assess the cost-effectiveness of different health promotion strategies
  • position high quality projects for future funding opportunities
  • increase the effectiveness of project and program management
  • contribute to policy development.
A good project evaluation provides an extremely useful tool to manage ongoing work, identify successes and plan effectively for new health promotion initiatives.

1.2     Getting started

The Guide to Project Evaluation: A Participatory Approach provides direction for your work in planning and implementing effective project evaluations.

While no single resource can answer all your questions, we hope that the Guide to Project Evaluation: A Participatory Approach provides you with clear directions. Add to it, adapt it, and customize it to meet your own needs.

1.3     A note on terminology

For many people the language of evaluation is a barrier that prevents them from getting on with the real evaluation work. This guide attempts to avoid this problem by using plain language throughout. Appendix 1 provides a brief overview of definitions of the more common evaluation terms.

To make the guide as practical as possible it includes

  • a framework to guide the step-by-step process of developing effective evaluations.
  • activities to introduce and plan for project evaluation.
  • examples demonstrating the application of the evaluation framework to health promotion projects.
  • an annotated bibliography of a selected number of useful evaluation resources (see Appendix 2).
2    EVALUATION FOR LEARNING

This guide is based on the belief that evaluation can be a useful and positive experience that promotes learning and action. What is learned from project evaluation is as important as what the project produces or creates.

2.1     Participatory evaluation

Health promotion activities enable people to take more active roles in defining their health needs, setting priorities among health goals and influencing and assessing efforts to improve their health. Participatory evaluation work supports these activities because it is a collaborative approach that builds on strengths and that values the contribution of everyone involved. While there are other approaches to evaluation, a participatory approach seems most consistent with the goals of the Public Health Agency of Canada's strategies and programs.
 
Principles of participatory evaluation
  • Participatory evaluation focuses on learning, success and action.
  • The evaluation must be useful to the people who are doing the work that is being evaluated.
  • The evaluation process is ongoing and includes ways to let all participants use the information from the evaluation throughout the project, not just at the end.
  • Recognition of the progression of change - knowledge, attitudes, skills and behaviour - is built into the evaluation.
  • The project sponsors are responsible for defining the specific project evaluation questions, the indicators of success and realistic timeframes.
  • Participatory evaluation makes it possible to recognize shared interests among those doing the work, the people the work is designed to reach, the project funders and other stakeholders.

For a more detailed examination of these principles, refer to the handout on page 6, Principles of a participatory approach to evaluation.

2.2     Putting participatory evaluation into practice

Participatory evaluation calls for collaboration among those who share a common interest in improving health. The collaborative process starts at the beginning of a project and continues throughout the life of the project. This type of evaluation is never a one-time, end-of-project event.

Refer to Chapter 9 of this guide, Putting it Together, for a checklist of common points to consider in each stage of project evaluation.

Collaboration allows those involved in the project to

  • work in partnership with community groups to do evaluation
  • recognize the experience and expertise of community groups
  • recognize the health outcomes of the project
  • make evaluation questions and findings relevant to all stakeholders
  • increase the acceptability of and support for the evaluation process and outcomes
  • produce more meaningful results that can be used by both programs and projects to learn how to improve the work being done and to influence policy and program directions.
The activity on the next page - Introducing a participatory approach to evaluation -  outlines a process for beginning the discussion about this type of collaborative evaluation. You may want to facilitate it yourself with the groups with which you work, or you may decide to copy it and give it to the project coordinators to use on their own.

For a thorough discussion of the principles and application of participatory evaluation, we highly recommend the following two resources:

  • Keeping on Track: An Evaluation Guide for Community Groups, produced by the Women's Research Centre of Vancouver
  • The Royal Society of Canada, Study of Participatory Research in Health Promotion, prepared by the Institute of Health Promotion Research, University of British Columbia.
2.3     Activity: Introducing a participatory approach to evaluation
 
Topic: Introducing a participatory approach to evaluation
Purpose:
  • To increase the group's comfort with evaluation
  • To discuss the key principles of participatory evaluation
Suggested uses: This discussion is useful for a group to have at the beginning of a new project so they can think about and build in evaluation measures right from the start.
Time: 30 minutes
Materials:
  • flipchart
  • handout: Principles of a participatory approach to evaluation
Activity: 
  • Ask participants to work in pairs to prepare responses to the following question: "What does evaluation mean to you?"
  • Record the group's responses on the flipchart.

Often at this point you will get both negative and positive comments about evaluation. It is important to acknowledge all the participants' previous experiences with evaluation, good and bad. You can learn from their comments how project sponsors want to make evaluation practical and useful.

  • Distribute the handout: Principles of a participatory approach to evaluation (page 6).
  • Divide the participants into small groups. Ask each group to discuss the handout and to identify the three principles of a participatory approach to evaluation that they think are most important for their type of project activity.
  • Bring all participants together again to get the feedback and to discuss their ideas on how these principles could be practised in their project. Use this time to answer questions about the method. The Keeping on Track manual is a good backup resource to have available.

     This discussion provides an opportunity to identify the principles that are most important to the group. It sets guidelines to which evaluators will be held accountable.

     2.4     Handout: Principles of a participatory approach to evaluation

Participatory evaluation encourages a positive experience with the evaluation of health promotion activities. The key principles of this approach are outlined below. They have been adapted from Keeping on Track: An Evaluation Guide for Community Groups, produced by the Women's Research Centre in Vancouver.

    •  Participatory evaluation focuses on learning, success and action.

• An important question to ask in evaluation is what we learned about what worked and what did not work. Then we need to ask how we can use these learnings to move to action. The people and groups most directly involved decide what determines success.
    • The evaluation is useful to the people who are doing the work that is being evaluated.

    • The project's goals and objectives - what the project intends to accomplish - must be the standards against which the project work is measured. Evaluators must pay special attention to the project's specific needs and available resources.
       
    • The evaluation process is ongoing and includes ways to let all participants use the information from the evaluation throughout the project, not just at the end.

    • The material produced for the evaluation must be given back to the participants on an ongoing basis in a format that is useful and clearly written in plain language.
       
    • Recognition of the progression of change - in knowledge, attitudes, skills and behaviour - is built into the evaluation.

    • To measure people's success in changing knowledge, attitudes, skills and behaviour, think in advance about the kinds of changes the project strategies and activities can produce. It is important to describe how these changes can be recognized and measured in a way that is possible and practical within the timeframe and resources available to the project.
       
    • The project sponsors are responsible for defining the specific project evaluation questions, the indicators of success and realistic timeframes.

• Community sponsors of projects must participate in decisions about what questions will be asked and what information will be collected to measure the difference the work made in a given period.
       
    • Participatory evaluation makes it possible to recognize shared interests among those doing the work, the people the work is designed to reach, the project funders and other stakeholders.

    • The evaluation must include information and input from the people doing the work, the people who the work is designed to help or reach and the project funders.
    3    A FRAMEWORK FOR PROJECT EVALUATION


Project evaluation is challenging work because of the great diversity in the types of projects funded. To be effective, an evaluation framework must respect and respond to this diversity. It must also provide a consistent and common process that applies across projects, ensures accountability and produces evidence-based results that promote learning about what contributes to better health practices for Canadians.

    The evaluation framework presented in this guide meets this challenge. It is composed of two parts:

    • five key evaluation questions
    • five evaluation process steps
The five evaluation questions form the core of the framework and can be applied to all types of project activities. The five evaluation process steps outline a systematic approach to the tasks that projects need to complete to answer the evaluation questions. Groups work through the steps to plan and implement the evaluation.

    The following two sections discuss the evaluation questions and the process steps. An overview of the evaluation framework is on page 14.

    3.1     The five key evaluation questions

    The process of developing the answers to the evaluation questions will vary, as each project varies, but the five fundamental questions remain the same.
     
5 key evaluation questions¹
What? 1. Did we do what we said we would do?
    Why? 2. What did we learn about what worked and what didn't work?
    So what? 3. What difference did it make that we did this work?
    Now what? 4. What could we do differently?
    Then what? 5. How do we plan to use evaluation findings for continuous learning?



¹ This approach is based on work done by Ron Labonte and Joan Feather of the Prairie Region Health Promotion Research Centre.

    1.     Did we do what we said we would do? "What?" (Description of activities)

    The responses to this question describe the work done in the project and the relevance of this work in meeting the project goals and objectives. The project success indicators provide the criteria against which success is measured. They assist the project sponsor to collect the information needed to answer this and subsequent evaluation questions. (Chapter 5 discusses how to develop project success indicators.)

Some of the more specific questions that may need to be answered to describe the project work include the following:

    •  What activities were undertaken and how did they link to meeting the project goals and objectives?

    • Examples:
                  - Describe the resources that were developed to increase awareness.
                  - Describe the training workshops that were conducted for skill development.
                  - Describe the new partnerships that were formed to work on accessibility issues.
    • What were the major achievements of the project and what resources did they require?
    • If the objectives changed during the course of the project, how and why did they change?
    2.     What did we learn about what worked and what didn't work? "Why?" (Reasons for success)

Participatory evaluation focuses on success, learning and action. Finding out what worked well in a project and what didn't puts this principle into practice. Here are some of the questions that could be included in this discussion:

• What strategies worked well for involving the target population in the project? Why?
• What strategies didn't work well for involving the target population in the project? Why?
• What strategies worked best for broadening the base of community support for the project? Why?
• What strategies didn't work well for broadening the base of community support for the project? Why?
• Which activities and strategies did we change? Why?
    • What was learned about the relative cost-effectiveness and efficiency of various project strategies and activities?
    • How realistic and relevant were the project goals and objectives?
    • In what ways did the project planning process work most effectively?
    • What did we learn about working together as a group?
    3.     What difference did it make that we did this work? "So what?" (Impact)

    The answers to this question measure a project's success in changing knowledge, attitudes, skills and behaviour. The project success indicators represent the group's assumptions about what changes should be expected from the project work and provide the criteria against which to measure change both during and at the end of the project. (Chapter 5 discusses how to develop success indicators.)

    There are two main ways project sponsors can assess impact: by using summarized data related to the success indicators and by asking specific impact questions of people who were involved in the project and who were the target of the project's work.

    The following types of questions may be helpful in discussions about this part of the project evaluation:

• What changed as a result of the project?
  - knowledge
  - attitudes
  - skills
  - behaviour
• What changed as a result of the project for
  - members of the target population?
  - community groups?
  - service providers?
  - caregivers?
  - project sponsors and staff?
    • Were there any unexpected changes resulting from the project work? Describe them.
    • In what ways did this project contribute to increased public participation?
    • In what ways did this project help to strengthen community groups?
    • To what extent did the project reduce barriers to health?
    • What evidence is there to attribute any of the above changes to the project? What other factors outside the project might have contributed to the changes?
    • Were other initiatives started, alternative services proposed or new funding resources acquired as a result of this project?
    • In what ways did this project contribute to better health practices?
    • What new partnerships developed from this project? What was the nature of the partnerships and what was their contribution?
    • Is the model or approach continuing beyond the initial funding?
    • To what extent is this model or approach transferable to other communities?
    4.     What could we do differently?    "Now what?" (Future of this and other projects)

    Evaluation is for learning and often the best learning comes from examining the challenges that projects present. Here are some of the questions that could be included in this discussion:

    • What more effective methods for achieving the objectives emerged from the work?
    • What additional knowledge development is required to do the work more effectively?
    • What additional support from the funders and community sponsoring agencies would have been useful to the project in meeting its goals and objectives?
    • Are there more cost-effective ways to achieve the project's objectives?
    • Who else could have been involved in the work?
    • What could we do to expand the network of people involved in working on this issue?
    • Were all the project's needs met?
    • Is there a better way of developing realistic project goals and objectives in the initial planning stage?
    • How did management and administrative systems change through the project to become more effective?
5.     How do we plan to use evaluation findings for continuous learning? "Then what?" (Use of evaluation results)

    Participatory evaluation includes ways to use the evaluation results throughout the project as well as at the end. Some questions to consider in developing the evaluation are as follows:

    • How were evaluation findings used on an ongoing basis to contribute to the planning and implementation of the project strategies and activities?
    • How will project findings be used for future knowledge development?
    • How will the final evaluation learnings be documented and distributed?
    • Are there alternative ways to present the evaluation results so that more people can make use of the learnings?
    • How will the evaluation results be used for new project planning?
    • How will the evaluation results be used to influence policy and research priorities?
    Seeking answers to the five key evaluation questions will guide the evaluation process throughout a project. The learnings from answering the questions can then be used to shape current and future work.

    3.2     The five evaluation process steps

    The steps to developing answers for the five key evaluation questions are briefly outlined below, and then are further developed in the next five chapters of the guide.

    1.     Define the project work.

    To evaluate a project there must be clear, measurable project goals and objectives that outline what the project plans to accomplish. While this may seem self-evident, many evaluations have gone off the track because this initial work has not been done.

    Chapter 4, Defining Project Work, provides ideas on how to strengthen the development of clear project goals and objectives.

    2.     Develop success indicators and their measures.

    The process of defining what constitutes success for a project is another important step in developing evaluations. Project sponsors need to define the success indicators for their projects. The success indicators allow project sponsors to evaluate whether they accomplished what they set out to do and what the impact of their project has been.

    Chapter 5, Developing Success Indicators, discusses this process in more detail, gives some examples of specific indicators and describes an activity that could be used to help identify success indicators for projects.

    3.     Collect the evaluation data.

    After the first two steps have been completed, it is necessary to decide

    • what information the project needs to collect
    • who has the information
    • how the information will be collected.
    Chapter 6, Collecting Evaluation Data, gives a brief overview of types of evaluation instruments and ideas on how to develop evaluation tools that are appropriate for projects. It also outlines some of the tips and cautions for using these tools.

    4.     Analyse and interpret the data.

    As the evaluation data is collected, it should be summarized and analysed and key learnings should be identified. This ongoing process will help projects prepare their final evaluation reports.

    Chapter 7, Analysing and Interpreting Data, provides some ideas to help with this process.
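For simple quantitative data, summarizing can be as basic as tallying responses. The hypothetical sketch below (the rating scale and responses are invented for illustration, not taken from any real project) counts how workshop participants rated a session on a reaction sheet:

```python
# Hypothetical sketch: tallying reaction-sheet ratings from a workshop.
# The responses below are invented example data.
from collections import Counter

responses = ["very useful", "useful", "very useful", "not useful",
             "useful", "very useful", "useful"]

tally = Counter(responses)   # count each rating
total = len(responses)

# Report each rating with its count and share of all responses.
for rating, count in tally.most_common():
    print(f"{rating}: {count} of {total} ({count / total:.0%})")
```

A summary like this, shared with participants on an ongoing basis, keeps the evaluation useful throughout the project rather than only at the end.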

    5.     Use the evaluation results.

    Evaluation findings can be used throughout the project to improve the planning and implementing of project activities. By sharing project results with others, each project adds to the body of knowledge about health promotion.

    Chapter 8, Using Evaluation Results, provides ideas on how to use evaluation findings during and after the project.

    Working through these five steps will provide project sponsors with the information and tools they need to answer the five key evaluation questions. For small projects with limited resources, the process will be simple and straightforward. For large projects with greater resources, the work involved in each step will vary to reflect the complexity of project goals and objectives.

    For all projects, project sponsors should:

    • set realistic limits on the number of project-specific evaluation questions and on the amount of evaluation information to be collected, as determined by the evaluation resources available to the group
    • remember that the quality of information collected, not the quantity, is the most important factor in evaluation.
    Remember, the most successful evaluations are
    clear and easy to understand.

    3.3     Tools for using the evaluation framework

To help in applying the evaluation framework, several different tools have been developed for this guide. The examples provided reflect the most common Health Canada project activity types, which are

• needs assessments
• education and awareness
• resource development
• skills development
• developing innovative models
• reducing barriers to health.

The tools are as follows:

• One-page overview of the Framework for Project Evaluation (see Section 3.4)
• Framework Worksheet for the Five Key Evaluation Questions and Examples of Developing the Questions by Project Activity Type (see Appendix 3)
  The blank worksheet can be used by projects to develop the five evaluation questions. The examples show how the questions can be further developed to reflect the specific evaluation needs of projects.
• Framework Worksheet for Success Indicators and Examples of Developing Indicators by Project Activity Type (see Appendix 4)
  The blank worksheet can be used by projects to develop their own project-specific success indicators and their measures. The examples provide ideas for developing success indicators and measures that projects may find useful.
    3.4     A framework for project evaluation

An overview of the framework for project evaluation is presented on the next page. This overview is a useful tool that can be used for

• introducing the framework
• reviewing the evaluation work.

A Framework for Project Evaluation
    5 key evaluation questions
    What? 1. Did we do what we said we would do?
    Why? 2. What did we learn about what worked and what didn't work?
    So what? 3. What difference did it make that we did this work?
    Now what? 4. What could we do differently?
    Then what? 5. How do we plan to use evaluation findings for continuous learning?
                                    Steps in the project evaluation process
    1. Define the project work
    • clear, measurable project goals and objectives

    • Project activity types:
        - needs assessments
        - education and awareness
        - resource development
        - skills development
        - developing innovative models
        - reducing barriers to health
    2. Develop success indicators and their measures
    • process for identifying indicators
    • ideas for success indicators linked to process and impact
    3. Collect the evaluation data
    • written questionnaire
    • telephone survey
    • reaction sheet 
• interview - face-to-face or by phone
• focus group
• participant observation
    • project diary
    • program records
    • before and after questionnaires
    • non-traditional methods of documentation
    4. Analyse and interpret the data
    • data analysis
    • identification of learnings, recommendations, actions
    5. Use the results
    • sharing of the results on an ongoing basis
• use of learnings to inform future planning

    4    DEFINING PROJECT WORK



Evaluation isn't something that happens at the end of a project. It is a process that begins when the project begins, with the development of goals and objectives, and it continues throughout the life of the project. It is through the evaluation process that we learn whether projects are meeting their goals and having an impact on the attitudes and health practices of Canadians.

    4.1     Developing project goals and objectives

    The project goals and objectives describe what the project wants to accomplish and provide the context in which the five evaluation questions are answered. If the project goals and objectives are not clear, it will be very difficult to answer the first evaluation question, "Did we do what we said we would do?"

    Goals are general statements of what a project is trying to do.

    Objectives are specific, measurable statements of the desired change(s) that a project intends to accomplish by a given time.

    4.2     Writing Project Objectives

    Clear project objectives are essential to project work and effective evaluation. Good project objectives set the groundwork for demonstrating the impact of the project. Writing project objectives, however, can be challenging for many groups.

Many people confuse objectives with activities. For example, a project may state that its objective is to create a video explaining how HIV/AIDS is transmitted. Creating a video is an activity. The objective the activity aims to achieve is an increase in knowledge of how HIV/AIDS is transmitted.

    There are two helpful guidelines to use in writing good project objectives: (1) identify the specific changes the project is designed to accomplish, and (2) ensure that these changes are measurable.

    To help identify the specific project objectives, it is useful to ask the question:

    What are we trying to change?

    Projects generally focus on change in the following key areas:

    •     knowledge (increasing knowledge on a particular issue or subject)
    •     attitudes (creating an attitude that favours a desired behaviour)
    •     skills (developing the individual capacity to adopt a given behaviour)
    •     behaviour (maintaining or adopting a healthy behaviour)

    These key areas may be seen as a kind of continuum of change. A change in knowledge can lead to new attitudes. Developing skills can enable people to make positive changes in their behaviour.

    Once the areas of change have been identified, it is important to ensure that they are measurable. There are five important elements to consider when creating project objectives that are specific and measurable. These elements are listed below in no particular order:

    •    the date by which the change will occur
    •    the specific change desired (use an action verb)
    •    a measure (number or percentage)
    •    the target group
    •    the location

    Although their use may vary from one project to another, a good rule of thumb is to write project objectives that include all five elements.

    For example, a project with the goal of increasing awareness of factors related to HIV/AIDS transmission among high school students might create a project objective that reads:

    By August 1, 1996 (date) / the knowledge of the factors involved in HIV/AIDS transmission (specific change) / will increase by 30% (measure) / among high school students (target) / in Montréal (location).
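    As a rough illustration, the five elements can be treated as a simple checklist. The sketch below represents an objective as a small record and flags any missing element; the field names are our own labelling, not part of this Guide.

    ```python
    # Hypothetical sketch: check a project objective against the five key elements.
    # Field names are illustrative only.

    OBJECTIVE_ELEMENTS = ["date", "specific_change", "measure", "target_group", "location"]

    def missing_elements(objective):
        """Return the list of the five key elements absent from an objective."""
        return [e for e in OBJECTIVE_ELEMENTS if not objective.get(e)]

    objective = {
        "date": "August 1, 1996",
        "specific_change": "increase knowledge of HIV/AIDS transmission factors",
        "measure": "30%",
        "target_group": "high school students",
        "location": "Montréal",
    }

    print(missing_elements(objective))  # an empty list means all five elements are present
    ```

    A group drafting objectives could run each draft through such a checklist before settling on final wording.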

    The following page contains an activity that can help in practicing how to write good project objectives.

    4.3     Activity: Writing Effective Project Objectives
     
    Topic: Writing project objectives
    Purpose: To give project sponsors a chance to write effective project objectives for their project.
    Time: 1-2 hours
    Materials:
     
    • flipchart
    • project proposal
    • Guide to Project Evaluation, Chapter 4
    Activity: 
    • Have participants refer to the project proposal for their project.

    • Working in small groups, have participants review the project objectives. Do the project objectives provide enough information to answer the question "What are we trying to change?"

    • Have participants examine the project objectives to see if they contain the five key elements (refer to Chapter 4 of the Guide):

    • date
    • specific change desired
    • measure of change
    • target group
    • location

    • If the objectives contain the five key elements, have participants break down the objectives into the five elements. If the objectives do not contain the five key elements, have participants rewrite the objectives to include these elements.

    • Bring all participants together to share their results and to discuss their ideas on which objectives are most useful and on how to keep the number of project objectives manageable given the scope and resources of the project.

    4.4     Role of the outside evaluator

    In small projects with limited resources, the evaluation can usually be done by the project sponsors. Larger projects, having correspondingly larger evaluation requirements, often hire an outside evaluator.

    If an outside evaluator is being used, it is essential that project sponsors clarify the evaluator's roles and responsibilities.

    Questions to consider when hiring an outside evaluator:

  • What will be the relationship between the project sponsor and the outside evaluator?
  • What work will the evaluator be responsible for? A detailed workplan should be agreed upon in advance.
  • What credentials and experience will be required of the evaluator?
  • How will the evaluator be informed of and held accountable to the evaluation framework and the principles on which it is based?
  • How does the project sponsor plan to handle any disputes with outside evaluators?
    To assist in the effective use of outside evaluators, it is helpful to have the following information available:
    • a list of possible evaluators, including their profiles: what their strengths and weaknesses are, projects they have worked on, any experience working with them previously
    • ideas on different roles for outside evaluators, e.g. working with project sponsors to develop the evaluation plan, developing some or all of the data collection tools, analysing the data, writing the summary reports
    • sample contracts with outside evaluators
    • guidelines on when to use outside evaluators for projects.
    4.5     Points to Remember
     
    Three tasks need to be done at the start of an evaluation:
    • Develop realistic and clear project goals.
    • Develop specific, measurable project objectives and success indicators.
    • Define the roles and responsibilities of the people involved in the evaluation.

    5    DEVELOPING SUCCESS INDICATORS


    Identifying what success will look like during the developmental phase of a project may seem a little like putting the cart before the horse. Many project sponsors spend a lot of time developing goals and objectives, planning activities and thinking about budgets. The real challenge is to think to the end of the project and name the identifiable changes that they expect to occur as a result of doing the work. These identifiable changes, the success indicators, should be developed as soon as clear project goals and objectives have been established. Identifying success indicators is therefore the second step in developing a high-quality project evaluation plan. Project sponsors should identify the success indicators that are most appropriate and best reflect the reality of their own projects.

    5.1     Purpose of success indicators and their measures

    Success indicators are a group's assumptions about what changes should be expected from doing the project work. These indicators are quantified by specific measures: for example, a number, a percentage or a level of satisfaction.

    Success indicators and their measures need to link directly to project goals and objectives since they provide the objective and measurable criteria by which groups judge the degree of success they have had in reaching their goals and objectives.

    Through their project activities, project sponsors attempt to change the knowledge, attitudes, behaviour or skills of a selected group of people - sometimes referred to as the target group. To measure or evaluate the amount of change, it is useful to know the status of the target group's knowledge, attitudes, behaviour and skills at the beginning of the project. Determining this initial status or starting point is called setting a benchmark. This initial benchmark helps the project determine the amount of change it is trying for in the project. The example below may help to illustrate this process. Other examples of success indicators can be found in Appendix 4 (Examples of Developing Indicators by Project Activity Type) and in Appendix 5 (Success Indicators of Increased Public Participation and Strengthened Community Groups).

    Example: Breast Cancer Network Project

    • One of the project goals is to promote the development of survivor-directed, self-help groups for women with breast cancer.
    • A project objective linked to this goal is to raise the awareness of the need for and success of such groups among health care professionals and cancer societies.
    • The target group for the project is 200 health care professionals in the project's urban area.

    • The project benchmark was established using the questionnaire completed for the initial needs assessment for the project. The results of the questionnaire indicated that 15% of the 200 health care professionals in the community knew about the advantages of self-help groups and referred breast cancer patients to them.
    • The project success indicator is to have 60% of the 200 target group health care professionals know about and refer to self-help groups for breast cancer patients.
    Project sponsors can use the success indicators to identify some of the specific questions they will ask throughout the project. The information that is collected and summarized in relation to these success indicators can be used by the groups to help answer the first three questions in the evaluation framework: Did we do what we said we would do? What did we learn about what worked and didn't work? What difference did it make that we did this work?
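    The arithmetic behind benchmarks and success indicators is simple but worth making explicit. The sketch below uses the figures from the Breast Cancer Network example (benchmark 15%, indicator 60%, target group of 200); the function name is ours, for illustration only.

    ```python
    # Hypothetical sketch using the figures from the example above.
    # Benchmark: 15% of 200 professionals already knew about self-help groups;
    # the success indicator targets 60%.

    def people_needed(target_size, benchmark_pct, indicator_pct):
        """Number of additional people who must change for the indicator to be met."""
        baseline = round(target_size * benchmark_pct / 100)
        target = round(target_size * indicator_pct / 100)
        return target - baseline

    print(people_needed(200, 15, 60))  # 90 additional professionals
    ```

    Working the numbers this way helps a group judge whether an indicator is challenging but feasible before committing to it.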

    5.2     The process of developing success indicators and their measures

    Choosing which indicators are the "best" is not an exact science. The process that project sponsors go through to identify their success indicators is as important as the final list of indicators created. Done well, this process can contribute to the building of commitment and excitement for doing an evaluation. It also helps groups develop reasonable expectations of what can be achieved.

    Some guidelines for developing success indicators

    Success indicators should:

    1.     Be results-focused, i.e., refer to the results or outcomes of the funded activity and not the activity itself.

    2.     Be challenging but feasible.

    3.     Involve a meaningful comparison - a comparison over time, a comparison with other similar activities or (preferably) a comparison against a reasonable standard.

    4.     Be measurable, using quantitative or qualitative measures. In developing indicators, consideration should be given to data availability and data collection, given the resources available.

    5.     Refer to a result or outcome that can be reasonably attributed to the project activity.

    6.     Be as valid (directly related to the work done and not attributable to other factors) and reliable (able to be replicated) as possible.

    7.     Meet the criteria of
                - selectivity, i.e., the number of indicators is limited to and focused on the key areas of concern
                - balance, i.e., the indicators refer to a range of project activities and results that together will provide a balanced assessment of project success
                - usefulness, i.e., the potential use of the evaluation information should be taken into account when developing indicators to ensure that they capture the relevant information.
     
    Benefits of developing good success indicators:
    • clarification of project goals and objectives to make them measurable
    • identification of innovative success indicators that reflect unique community characteristics and needs
    • strengthened strategies and workplans to address some of the identified barriers to success
    • increased commitment to assess impact questions

    The activity on the next page has been used with a number of community groups to help them identify success indicators for their projects. Some of the most useful indicators of success have been developed when members of the target population and project sponsors have undertaken this activity together.

    5.3    Activity: Defining success indicators
     
    Topic: Defining project success indicators
    Purpose: To give project sponsors a chance to define the indicators of success for their project.
    Time: 2-3 hours
    Materials:
    • flipchart
    • handout: an outline of the project objectives and a list of the activities or strategies that are to be undertaken to achieve the objective
    • handout: Success Indicators of Increased Public Participation and Strengthened Community Groups (Appendix 5)

    • Note: Appendix 5 is for use as an example only. The project sponsors need to develop their own success indicators that are relevant to their project.

    Activity: 
    • Divide the participants into pairs or groups and assign one of the project objectives and the activities associated with that objective to each group. Ask the groups to list five things for each activity that would indicate that the activities were successful. Examples of possible activities: media campaign, organization of a mother's group, development of brochures, needle exchange program, creation of a 1-800 line.
    • Encourage participants to use all their senses in developing indicators: e.g., What changes do they expect to see? hear? feel? Put quantities on criteria when possible. The participants must be realistic about what they hope can be accomplished. Remind them that useful indicators are measurable, specific, easy to collect information on and ultimately can provide useful information to the group.

    • Discuss the success indicators with the total group and add new ones. You may want to review the list of indicators for increased public participation and strengthened community groups to see if any of them are appropriate for the project.

    • With the total group, review the indicators and order them by priority so that only the ones that are most useful and important are selected. The task is to keep the number of success indicators manageable for the project resources.

    • Once there is agreement on the indicators, the group could begin to look at the kind of information that needs to be collected to document the degree of success.

    5.4     Success indicators for project activity types

    Although there are many different types of projects funded under Health Canada programs, certain project activity types appear more frequently than others. They have been identified as

    • needs assessments
    • educational and awareness
    • resource development
    • skills development
    • developing innovative models
    • reducing barriers to health
    Appendix 4 provides examples of individual projects by activity types and their possible indicators and measures of success. Remember, these are only guidelines.

    Appendix 5 provides examples of indicators of success for two health promotion program/project impacts - increased public participation and strengthened community groups.

    These examples have been included to stimulate thinking and to start the process of developing project-specific success indicators.

    6    COLLECTING EVALUATION DATA


    Participatory evaluation relies on a systematic and rigorous collection of information from project staff and stakeholders. It draws on both quantitative and qualitative data to measure success and to clarify and make decisions about project characteristics, activities and effects.

    6.1     Determining information collection needs
     
    Three questions to ask in
    determining evaluation information needs:
    1.  What information is needed?

    2.  Who has the information?

    3.  How will the information be collected?

    1.     What information is needed?

    Projects need to collect evaluation information that will provide answers to the five key evaluation questions. The specific type of information to be collected is determined by the work done at the beginning of each project to define the project goals, objectives and success indicators.

    2.     Who has the information?

    Depending on the nature of the project, the people with information useful to the project evaluation will vary widely. People from whom it may be important to collect information include

    - project sponsors, staff and volunteers
    - program consultants
    - target population
    - consumers of the service
    - general public
    - advisory committee members
    - other service providers
    - partners associated with the project.
    3.     How will the information be collected?

    Project sponsors decide how best to collect evaluation information based on their project's needs and resources. Designing the information collection tools should be done in collaboration with the people who will be using them. Most community projects don't have the time or the resources to put into extensive recording of data and statistics. The goal is to find ways of collecting information that do not put too much of a burden on the people doing the project work but that still provide the information required to answer the evaluation questions.
     
    Characteristics of a good information collection process:
    • useful
    • practical
    • collaborative
    • systematic
    • ongoing
    • accurate
    • ethical

    6.2     Information collection tools

    There is a wide variety of information collection tools that can be used depending on the project's evaluation needs. Examples of tools that have been used in other projects are listed below.

    Written survey questionnaire

    • structured questionnaires used to reach large numbers of people
    • provides quantitative data (numbers) that can be statistically analysed and qualitative information that can be summarized
    • used to survey target population in terms of knowledge, attitudes, beliefs and behaviour.
    Tips and cautions:

    - When developing the questions for the questionnaire, ensure that they are not worded in ways that lead to biased or misleading responses.
    - While mass mailing of survey questionnaires has the advantage of reaching large numbers of people, there is no guarantee that people will fill out and return the questionnaire, so the actual response rate may be low.
    - Limiting the number of questions may increase the response rate.
    - Using smaller but targeted mailings, followed up by a phone call, may increase the response rate.
    - Paying attention to respondents' literacy level, language and visual capacity may increase the response rate.
    - All survey questionnaires need to be pilot tested to ensure that the questions succeed in getting the information that is required.
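    The response-rate concern in the tips above comes down to simple arithmetic: returns divided by mailings. The figures in the sketch below are invented, purely to show why a smaller targeted mailing can yield more usable data than a mass one.

    ```python
    # Illustrative sketch; all figures are invented.

    def response_rate(returned, mailed):
        """Response rate of a mailed questionnaire, as a percentage."""
        return 100 * returned / mailed

    # A mass mailing with a low return can produce a weaker picture
    # than a smaller, targeted mailing with telephone follow-up.
    print(response_rate(80, 1000))  # 8.0  (mass mailing)
    print(response_rate(60, 100))   # 60.0 (targeted mailing)
    ```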

    Telephone survey

    • can ask for the same types of information as the written survey questionnaire
    • has the advantage of increasing accessibility and allowing immediate clarification of questions if the respondent is experiencing any difficulties.
    Tips and cautions:

    - Telephone interviewers may face resistance from people who are tired of answering this type of call or who are suspicious because of their experiences with telephone soliciting.
    - Ensuring that the respondent is provided with clear information on the credibility of the group doing the survey, the purpose of the survey and how the collected information will be used may increase the response rate.
    - Finding a convenient time for the respondent to answer the survey questions may increase the response rate.

    Reaction sheet

    • simple kind of questionnaire that asks questions about people's satisfaction with a particular activity
    • easy and fast to administer and summarize
    • useful tool for getting an immediate response to new resource materials, workshop models and public education events.
    Tips and cautions:

    - Avoid using leading questions that prompt positive responses. Instead of asking, "Did you enjoy the workshop?" ask, "Did the workshop provide you with enough information to answer your questions about health promotion?"
    - Limit the number of questions to increase the response rate.
    - Include open-ended questions to obtain qualitative data. Shape these questions carefully to control the amount of material received. Examples: "Give three words to describe your reaction to this workshop." "What were the two key learnings for you from this workshop?"

    Face-to-face interview

    • individual interviews structured around a set of open-ended questions that are developed to guide the interview and to provide consistency in the information collected
    • useful method for getting in-depth information on project activities
    • provides an opportunity to clarify responses and probe for further information
    Tips and cautions:

    - This tool can be used with a specific group of people (e.g., project staff to gather their opinions about the strengths and weaknesses of the project) or with key informants who are knowledgeable about the project (e.g., frontline service providers about how best to conduct outreach within their community).
    - It is a good method to use with respondents who have low literacy levels and might be uncomfortable with written data collection tools.
    - The interviewer needs to be trained not to bias the responses through the use of leading questions.

    Telephone interview

    • similar process and function as face-to-face interview but conducted by phone
    • less expensive to administer than face-to-face interviews.
    Tips and cautions:

    - Sending the respondent a copy of the interview guide in advance may promote a more thoughtful discussion.
    - Interviews, both in person and by phone, are an alternative to focus groups when you want to avoid group influences on the responses people give.

    Focus group

    • group discussion in which 10 to 12 people are brought together in a single session of approximately an hour to generate ideas and suggest strategies
    • facilitated using a specific agenda of structured questions, similar to the interview guide, that focuses the discussion in the meeting
    • used to obtain in-depth understanding of attitudes, behaviour, impressions and insights (qualitative data) on a variety of issues from a group of people, e.g., project staff or a project advisory committee.
    Tips and cautions:

    - The facilitator must remain neutral and non-judgmental and have the skills to keep the discussion moving and on track.
    - This is a particularly useful method for reflecting on evaluation findings and identifying key learnings. It may also be useful for developing preliminary ideas for new programs or for testing messages that will be used in educational and media packages.
    - It is not a useful method for developing consensus or making final decisions.

    Participant observation

    • involves actual observation rather than asking questions
    • used to better understand behaviours, the social context in which they arise and the meanings that individuals attach to them
    • observers compile field notes describing what they observe; the analysis focuses on what happened and why.
    Tips and cautions:

    - This may be the most feasible way to collect data from some hard-to-reach populations (e.g., individuals who frequent public sex environments or drug shooting galleries).
    - As with all qualitative techniques, the results may not be fully generalizable to the entire study population.

    Project diary

    • project managers, staff or participants are asked to keep a record of their experiences while working on the project
    • provides qualitative evaluation data.
    Tips and cautions:

    - It is important to provide the participants with clear guidelines on keeping a log book: the type of information you are looking for, how it will be used, confidentiality, etc.
    - This is a useful method for identifying unintended consequences of a project.
    - Some people are very uncomfortable with this method because of the unstructured nature of the writing required.

    Program documentation

    • analysis of written records (minutes of meetings, telephone logs, intake forms, policy directives, financial records, attendance records)
    • can provide information on people's interests, preferences and patterns of usage of services and service locations
    • can often, through systematic review, provide important evaluation information, both quantitative and qualitative
    • inexpensive source of information.
    Tips and cautions:

    - This tool is limited in that records document only existing alternatives; they don't show other needs, wants or preferences.
    - It is important to identify evaluation information needs at the beginning of a project to ensure that the necessary records are kept throughout the project.

    Non-traditional methods of documentation

    • non-verbal or non-written evaluation tools used to respect diversity and accessibility issues
    • examples include cartooning, drawing, poster making, photography, videotaping, audio taping, scrapbooks.
    Tips and cautions:

    - Qualitative data collected may be difficult to analyse and generalize.
    - This is a useful method for getting responses from respondents who are uncomfortable with written tools.

    No single evaluation tool can provide all the evaluation information required. A combination of different tools that suit the project needs and available resources has to be developed. Regardless of which tools are selected, they should reflect the following tips to be effective.
     
    Tips for designing effective evaluation tools
    • Keep them short and simple.
    • Use plain language with no jargon.
    • For tools requiring written responses:
      - use large print
      - avoid clutter
      - leave lots of white space
      - provide ample room for responses.
    • Ask for key words and key learnings.
    • Develop evaluation tools in collaboration with the people who will use them.
    • Ask only for information that will be used.

    6.3     Sample evaluation tools

    Sample evaluation tools are provided in the appendices at the end of this guide. They were developed for use in community-based projects and are included here to give project sponsors some ideas, which they can adapt and build on to develop their own project-specific tools. The questions in each of the sample forms have been shaped specifically to provide data to answer the five key evaluation questions.

    Sample evaluation forms provided:
     
    Appendix 6. Reaction sheet for evaluation workshop
    Appendix 7. Guided telephone interview - Community Resource Handbook for Women with Breast Cancer
    Appendix 8. Focus group interview guide - Child Safety Awareness Program
    Appendix 9. Guidelines for keeping a project diary - Child Safety Awareness Program
    Appendix 10. Mail-out questionnaire - Advisory Committee, Health and Disabled Women's Project
    Appendix 11. Mail-out questionnaire - Health Care Providers, Health and Disabled Women's Project

    7     ANALYSING AND INTERPRETING DATA



    Most evaluation projects have no problem with collecting large amounts of evaluation information. What they sometimes do have difficulty with is effectively analysing, summarizing and using the results.

    The emphasis throughout this guide is on evaluation for learning and action. This section focuses on practical ways that people at the national, regional and community levels can turn evaluation information into usable, accessible summaries and reports that add to the body of knowledge about project success and promote change in attitudes, skills and behaviour. Committing adequate resources at all levels to do the evaluation work is essential if everyone is to benefit from the valuable learnings that can be gained from evaluating health promotion projects.

    7.1     Analysing project evaluation information

    Analysing evaluation information begins with a review of all the collected data to find the emerging themes or patterns. The five key evaluation questions provide useful categories around which to group information and develop the themes. Look for and record the information that is in the data about how well the project is doing, what is working, what should be done differently and what difference it is making.

    Project sponsors may want to record notes on the data on file cards or sheets of paper - one for each question, issue or topic. This makes it possible to see the emerging patterns more easily. Include exact quotations from the interviews and questionnaires. It is essential to stay with what people have said and let the data guide the analysis. Too much detail is better at this stage than not enough. It is always easier to cut down than to add information later.

    Once the material has been grouped into themes, it can be analysed to see how the results compare to the changes that were expected as identified by the success indicators. Take the time to reflect on what the analysis reveals. What was learned to answer the "what", "why", "so what", "now what" and "then what" evaluation questions? People who have been involved in the project should be involved in the interpretation of the findings.

    Project sponsors or the project evaluator should prepare short summaries of the key learnings from the analysis on a regular basis - for example, every three months or after each project activity. The importance of preparing these brief summaries, which highlight two or three key learnings, cannot be overemphasized. The summaries provide an excellent means of letting the key players in the project know about and begin to use the evaluation findings throughout the project - one of the basic principles of participatory evaluation. By completing summaries of key learnings at regular intervals, the work at the end of the project will be greatly reduced.
     
    Summary - Analysing evaluation information
    • Review the collected evaluation material for emerging themes and patterns.
    • Use the key evaluation questions to group the material into themes.
    • Analyse the material by themes, comparing the results to the changes that were expected as identified by the success indicators.
    • Reflect on what the analysis means. Ask other key project players for their interpretations.
    • Prepare short summaries of key learnings under each theme.
    • Prepare summaries of key learnings on an ongoing basis.
    • Submit the summaries to the participants for their feedback and verification of the findings.
    • Develop the final analysis.

    Analysis of quantitative data

    Quantitative data looks at the incidence and quantity of events. Data gathered through quantitative methods (surveys, questionnaires, administrative records) is numerical and may be analysed by calculating averages, ranges, percentages and proportions. Descriptive statistics simply account for what is happening in numerical terms. For example, when evaluating the use of a needle exchange system, an estimate may be made of the average number of people using the facility each week or the percentage of users returning needles. Bar charts, pie charts, graphs and tables can be effective ways to present the statistical analysis in a clear and concise manner.
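    The needle exchange example above can be sketched with basic descriptive statistics. The weekly figures below are invented for illustration; only the calculations (an average and a percentage) reflect the text.

    ```python
    # Invented weekly figures for a hypothetical needle exchange facility.
    from statistics import mean

    weekly_users = [42, 55, 48, 61, 50, 44]   # users counted each week
    needles_distributed = 1200
    needles_returned = 1020

    avg_users = mean(weekly_users)
    return_pct = 100 * needles_returned / needles_distributed

    print(f"Average users per week: {avg_users:.1f}")  # 50.0
    print(f"Needles returned: {return_pct:.0f}%")      # 85%
    ```

    Summaries like these are the natural inputs to the bar charts, pie charts and tables suggested above.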

    Analysis of qualitative data

    Qualitative data is information that is primarily expressed in terms of themes, ideas, events, personalities, histories, etc. Data is gathered through methods of observation, interviewing and document analysis. These results cannot be measured exactly, but must be interpreted and organized into themes or categories. The primary purpose of qualitative data is to provide information to the people involved in the project. This standard of usefulness is an important one to keep in mind when analysing qualitative data.
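    Organizing qualitative material into themes, as described above, can be as mechanical as grouping tagged quotations. The themes and interview excerpts below are invented; the point is only the grouping step.

    ```python
    # Invented interview excerpts, each tagged with a theme during review.
    from collections import defaultdict

    coded_quotes = [
        ("what worked", "The workshop format made it easy to ask questions."),
        ("barriers", "Transportation was a problem for many participants."),
        ("what worked", "Having a peer facilitator built trust quickly."),
        ("impact", "I now refer patients to the self-help group."),
    ]

    themes = defaultdict(list)
    for theme, quote in coded_quotes:
        themes[theme].append(quote)

    for theme, quotes in themes.items():
        print(f"{theme}: {len(quotes)} quotation(s)")
    ```

    Keeping the exact quotations attached to each theme, as the Guide recommends, lets the data rather than the analyst drive the interpretation.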

    Note: Neither the quantitative nor the qualitative approach to the collection and analysis of data is inherently superior. Each has advantages and disadvantages. For both, it is important to know the context within which they have been used in order to understand the analysis. Whenever possible, project evaluations should include several types of information collection tools. The analysis and summaries of key learnings should draw on information collected from all of them.

    7.2     Preparing useful evaluation reports

    Once the evaluation information has been analysed, the next challenge is to present the learnings in ways that are both informative and interesting.

    The brief summaries of key learnings, described in the preceding section, are often all that is needed to provide information on an interim basis. However, the final project report requires more data. The next section provides some ideas that might be useful for clarifying the expectations about the final report with project sponsors.

    Evaluation report outline

    Having an outline at the beginning of a project about how the final report will be developed is extremely useful. It helps shape the thinking about what information is needed and how it will be collected, analysed and used.

    There are two questions to consider when planning evaluation reports.

    • Who is writing the report?
      Small projects with very limited resources should have different expectations placed on them than larger projects or projects with funding for an external evaluator.
       
    • Who is the report for?
    While every evaluation report should be written in an interesting and clear style, the structure and emphasis of the report may vary depending on who it is for. For example, is it intended primarily for the funder or for the project participants? The former might focus more heavily on learnings about cost-effectiveness strategies; the latter might be more interested in learnings about how to implement a specific health promotion activity.


    The following sections form the basic structure - the bare bones - of an evaluation report. Personal stories and quotations from the project participants put a human face on the evaluation results and can make the report much more interesting and user-friendly. Groups can adapt and build on the following guidelines to develop evaluation reports that reflect the unique nature of specific projects.

    Example of an outline for a project evaluation report
     
    Section 1:  Executive Summary

    This section is for people who are too busy to read the whole report.
    - One page is best - never more than three.
    - It comes first but is the last piece written.
    - It usually looks at what was evaluated and why and lists the major conclusions and recommendations.

    Section 2: Background Information - Getting started

    This section provides background leading up to the evaluation:
    - how the project was conceived
    - why it was needed
    - the project goals and objectives
    - who was involved in the work
    - the project organizational structures.

    Section 3:  Description of the Evaluation - How we learned

    This section describes
    - the evaluation approach and how it was chosen
    - evaluation goals and objectives
    - how the evaluator was selected and managed
    - how the data collection tools were designed and used
    - how well the data collection tools worked
    - any limitations of the methodology
    - how people were selected to be interviewed, or to receive questionnaires, etc.
    - who did the interviewing, the number of people interviewed and their situation
    - how questionnaires were distributed and returned.

    Section 4: Evaluation Results - What we learned

    One way to organize this section is around the first four evaluation questions:

    • Did we do what we said we would do?

      - Outline goals and objectives of the project.
      - Record what happened as a result of the project - e.g., resources developed, training sessions completed, etc.
      - Describe the changes that occurred in relation to the success indicators.
       
    • What did we learn about what worked and what didn't work?

      - Outline key learnings from the project about making things work. Examples: producing effective resource materials, structuring productive advisory committees, conducting needs assessments in rural and isolated communities, building community ownership of health promotion projects, etc.
      - Identify learnings about what strategies didn't work and why.
       
    • What difference did it make that we did this work? (outcomes)

      - Outline results from the evaluation that show how the project made a difference to consumers, project sponsors and the wider community.
      - Identify any changes - of attitudes, knowledge, skills or behaviour - that occurred from the project work, e.g., how health practices have improved.
      - If appropriate, show how the project contributed to increased public participation and strengthened community groups.
      - Include personal statements and anecdotal material from project evaluations which illustrate the impact an activity has had on project participants. Example: "One thing I plan to use right away in my work which I got from the training is..."
       
    • What could we do differently?

      - List learnings from the projects about different ways to do the work. Examples: improving the cost-effectiveness of projects, adapting the project model to make it more responsive to volunteers, changing the reporting role for outside evaluators to improve accountability, etc.
      - Reflect on cautions and challenges about doing the project work.
    Section 5: Conclusions and Recommendations - Final thoughts on what we would like others to know

    - Conclude with a summary of the work done and how well the goals and objectives were reached.
    - Include recommendations for further work.
    - Include recommendations on how the evaluation results can be used.

    Section 6:  Appendices

    - These may include copies of questionnaires or interview schedules, statistical information, program documents or other reference material important to the evaluation but not important enough to go into the text.
    - It is useful to include a bibliography - a list of the sources used to compile the evaluation results, other research studies and articles. A list of who was interviewed or organizations contacted may also be included.

    7.3     Activity: Analysing and Interpreting Data
     
    Topic: Analysing and interpreting data from project evaluations.
    Purpose: To give project sponsors practice in completing the analysis and interpretation of project evaluation results for inclusion in the project evaluation report.
    Time: 1-2 hours
    Materials:
    • flipchart
    • raw project evaluation data
    • Guide to Project Evaluation, Chapter 7
    Activity:
    • Have participants review the raw project data
    • Working in small groups, divide the raw data among groups and have participants analyse the data to find themes (refer to Chapter 7 of the Guide) that relate to the third evaluation question: "What difference did it make that we did this work?"
    • Have participants prepare short summaries of learnings for each theme
    • Have participants develop one example of quantitative analysis and one example of qualitative analysis
    • Bring all participants together to share their results and to discuss their ideas on which information is most useful
    • Have the total group develop a final analysis plan based on the information presented by the small groups
    • Have the total group brainstorm on informative and interesting ways to present the evaluation results, including how to organize the evaluation report for maximum effect

    8    USING EVALUATION RESULTS


    The fifth and final key evaluation question in the framework is, "How do we plan to use evaluation findings for continuous learning?" This is a question that needs to be considered at the very beginning of a project and not only at the end, as is often the case. Having ideas at the start of a project about uses for the evaluation findings helps ensure that the evaluation is conducted and the results reported in a way that meets people's needs. If key stakeholders are involved from the beginning, it increases their support for the process and their likelihood of using the results as they become available.

    There are several major ways in which project evaluations can be used to maximize their benefit. A few ideas are listed below.

    8.1     Using evaluation results

    • Bring together all project staff on a quarterly basis to discuss the evaluation results and look at ways the results can be used to increase performance, improve project administration, enhance planning activities, etc.
    • Present the report orally to staff, funders and community members.
    • Develop a news release outlining the main learnings from the evaluation and some of its more important conclusions. Send the news releases to key community contacts and evaluation participants.
    • Involve project participants in developing ways to present the project findings. Build on their stories and personal experiences to give a human face and to create interest in the evaluation results.
    • Make a presentation on the evaluation results to the local health council or social planning group, highlighting the accomplishments and describing how the results can be used to promote better planning.
    • Use the evaluation results to shape requests for new or continued funding or for suggesting alternative health practice models.
    • At the start of each new project proposal process, review evaluations from past projects to discover what learnings are transferable.
    • Send a letter thanking all project participants for their work on the project and include a summary copy of the key evaluation results.
    • Develop a short video of project participants discussing what they learned from the project. Use it to promote the project with community groups and with funders.
    • Build the evaluation results into presentations to local service clubs to show how their funding support could be effectively used.
    • Commit 15 minutes of time at meetings to information sharing about key learnings from project evaluations.
    • Extract highlights of project evaluation reports and distribute them regularly.
    • Develop a workshop to present the project evaluation results at a regional or national conference of health promotion professionals.
    • Identify other projects that are doing related work and share evaluation reports with them.
    • Organize a brainstorming session involving staff to come up with creative ideas to document and promote project successes.
    • Develop a user-friendly yearly summary of key evaluation results from across projects. Include ideas for using the results to strengthen planning and distribute the summary to key stakeholders.
    • Make presentations to other health care practitioners, using project evaluation results to show how they can benefit from involvement in health promotion work.
    • Systematically review and summarize all project evaluation results on a twice-yearly basis. Use the evidence-based outcomes to develop and improve health practice models.

    9    PUTTING IT TOGETHER

    This section provides a checklist to use throughout the project evaluation process.

    Developing the project evaluation

    Have you used the principles of participatory evaluation?

    Have you reviewed the evaluation framework and example worksheet?

    Have you identified the evaluation resources required to plan and carry out the evaluation?

    Have you discussed the roles and responsibilities of those involved in the evaluation?

    Reviewing the project evaluation plan

    Are the project goals clear and realistic? Are the project objectives specific and measurable?

    Are the project goals consistent with the overall goals of the funding program?

    Is an evaluation framework prepared and included in the plan?

    Does the evaluation proposal demonstrate a process that will provide information to answer the five key evaluation questions?

    Does the evaluation proposal demonstrate a participatory process that includes others, e.g., target group members?

    Are the success indicators for the project identified in clear, measurable terms?

    Is there a practical outline of how evaluation information will be collected and from whom?

    Does the proposal give ideas on how the evaluation results will be used both throughout the project and at the end?

    Assisting and monitoring project evaluation work

    Is the project sponsor regularly informed of evaluation findings?

    Have the roles and responsibilities for reporting purposes been negotiated for the project sponsor and the outside evaluator?

    Does the final evaluation report address all five evaluation questions?

    Using project evaluation results

    Is there a plan in place for identifying different ways to share evaluation information?

    Are project evaluation results being used to contribute to future project planning?

    Common Evaluation Terms and What They Mean

    •  evaluation - a way of measuring if a project is doing what it says it will do.
    • goals - general statements of what an organization is trying to do.
    • objectives - specific, measurable statements of what an organization wants to accomplish by a given point in time.
    • objective approach - an approach which values the perspective, views and opinions of those outside of or distanced from the situation, event, organization, project, etc., as the primary basis for making an assessment or judgment.
    • informant - in research and evaluation terminology, the person you interview or question is called the "informant".
    • impact or outcome evaluation - gathers information related to the anticipated results, or changes in participants, to determine if these did indeed occur. It may also be used to test the effectiveness of a new program relative to the results of an existing form of service. An impact evaluation will tell you about the effects of a project.
    • process or formative evaluation - an ongoing dynamic process where information is added continuously (typically using a qualitative approach), organized systematically and analysed periodically during the evaluation period. A process evaluation will tell you how the project is operating.
    • quantitative approach - an approach that tries to determine cause and effect relationships in a program. A quantitative approach will use measurements, numbers and statistics to compare program results. The information that is found is considered "hard" data.
    • qualitative approach - an approach that examines the qualities of a program using a number of methods. This approach uses non-numerical information - words, thoughts and phrases from program participants, staff and people in the community - to try and understand the meaning of a program and its outcome. The information that is found is considered "soft" data.

    Annotated Bibliography

    This section contains a list of selected resources which you may find useful in your work with participatory evaluations.

    • Cultural Competence for Evaluators: A Guide for Alcohol and Other Drug Abuse Prevention Practitioners Working with Ethnic/Racial Communities 

      Mario A. Orlandi (Editor) (1992)

      This document contains 10 insightful articles that help evaluators understand the cultural factors involved when working in diverse ethnic communities.
       
        Available in English from:

        Division of Community Prevention and Training, Office for Substance Abuse Prevention,
        Alcohol, Drug Abuse, and Mental Health Administration, Public Health Service,
        U.S. Department of Health and Human Services, 5600 Fishers Lane, Rockwall II,
        Rockville, Maryland 20857 U.S.A.
         

    • Doing it Right: A Needs Assessment Workbook

      Edmonton Social Planning Council (1988)
       

        Comprised mainly of activity checklists and work sheets, this workbook should be useful to agencies conducting needs assessments to develop new programs or to evaluate current ones.
          Available in English from:

        Edmonton Social Planning Council,
        #41, 9912 106 St.,
        Edmonton, Alta. T5K 1C5
         

    • Making a Difference: Program Evaluation for Health Promotion

      Tammy Horne (1995)

         
      The purpose of this guide is to present basic principles and processes of program evaluation in the context of health promotion. Cost is $21.95 plus shipping and handling.
       
        Available in English from:

        WellQuest Consulting Ltd.
        11521-125 St.
        Edmonton, Alta. T5M 0N3
        Tel.: (403) 451-6145 Fax: (403) 451-5280

  • How to Choose a Research Consultant

      BC Health Research Foundation (BCHRF)

      This handbook is a step-by-step guide to employing a research consultant to assist with community-based research. It is free of charge within B.C., and $10.00 + GST elsewhere. Payment should be made to Waterford Press and sent to the BCHRF.
        Available in English from:

      BC Health Research Foundation
      Suite 710, Metrotower II
      4270 Kingsway
      Burnaby B.C. V5H 4N2
      Tel.: (604) 436-3573; Fax: (604) 436-2573
       

  • Evaluating HIV/AIDS Prevention Programs in Community-Based Organizations

      National Community AIDS Partnership (1993)

      This guide serves community-based service providers, government funders and managers of prevention programs. Its focus is on the selection of the appropriate methodology for the type of program being evaluated and on the needs and resources of the organization conducting the evaluation.

      Available in English from:

      National Community AIDS Partnership,
      1140 Connecticut Ave., NW, Suite 901,
      Washington, D.C. 20036-4001
      Tel.: (202) 429-2820
       

  • Evaluation in the Voluntary Sector

    Mog Ball (1988)

    A useful guide based on work in the British voluntary sector illustrating a variety of evaluation methods. It gives practical examples of good practice techniques and of lessons learned.
     
      Available in English from:

      The Forbes Trust, Forbes House,
      9 Artillery Lane,
      London E1 7LP England
      Tel.: 01-377 8484

  • A Guide to Program Evaluation for Accountability in Non-Profit Organizations


      David Lewis and Toni MacEachern (1994)

     A practical guide to the uses and benefits of program evaluation. The guide takes the reader through the steps of deciding to perform a program evaluation, preparing for it, creating the evaluation tools and analyzing and writing the results.
     
      Available in English from:

      Lamp Consultants to Non-Profits,
      10 Water St. N.,
      Kitchener, Ont. N2H 5A5
       

  • A Hands-On Guide to Planning and Evaluation: How to Plan and Evaluate Programs in Community-Based Organizations


    Durhane Wong-Rieger and Lindee David (undated)

    A comprehensive guide to planning and evaluating education and prevention programs for community-based AIDS groups. The purpose of the book is to demystify structured planning and evaluation processes. It provides a step-by-step guide to planning and evaluation with sample work sheets and models.
     

      Available in English and French from:

      National AIDS Clearinghouse,
      1565 Carling Ave., Suite 400,
      Ottawa, Ont. K1Z 8R1
      Tel.: (613) 725-3769, Fax (613) 725-9826
       

  • How About ... Evaluation: A Handbook about Project Evaluation for First Nations and Inuit Communities


    Jacqueline D. Holt (1993)

     A clearly written, concise handbook, complete with straightforward work sheets for use by First Nations and Inuit communities who will be operating child development and mental health programs and projects in their communities.

     Available in English and French from:

    Mental Health Advisory Services,
    Medical Services Branch,
    Health Canada,
    Jeanne Mance Building, 11th Floor,
    Tunney's Pasture,
    Ottawa, Ont. K1A 0L3

  • Keeping on Track: An Evaluation Guide for Community Groups.

    Diana Ellis, Gayla Reid and Jan Barnsley (1990)

    Developed especially for community groups, this guide describes how to prepare for participant-focused evaluation, develop the evaluation design, and analyze and use the results. It is well written in plain language and grounded in the realities of community-based groups.
      Available in English and French from:

      Women's Research Centre,
      101 - 2245 West Broadway,
      Vancouver, B.C. V6K 2E4
       

  • Research for Change: Participatory Action Research for Community Groups.
    Jan Barnsley and Diana Ellis (1992)
       
    A step-by-step outline of a research method developed especially for community groups. Written in clear language, it has lots of examples and how-to information.
     
      Available in English and French from:

      Women's Research Centre,
      101 - 2245 West Broadway,
      Vancouver, B.C. V6K 2E4
       

  • Study of Participatory Research in Health Promotion: Review and Recommendations for the Development of Participatory Research in Health Promotion in Canada

    Institute of Health Promotion Research, University of British Columbia, and the B.C. Consortium for Health Promotion Research (1994)

    The report contains useful sections on applications of participatory research, and guidance to funding agencies reviewing health promotion research grant applications.
     

    Available in English and French from:

    The Royal Society of Canada
    225 Metcalfe Street
    Suite 308
    Ottawa, Ontario K2P 1P9
    Tel. (613) 991-6990
    Fax. (613) 991-6996
     


     
    Framework Worksheet for the Five Key Evaluation Questions
    Developing the Questions by Project Activity Type

    Activity Type:

    Objective:
    Did we do what we said we would do? - "What"
    What did we learn about what worked and what didn't work? - "Why"
    What difference did it make that we did this work? - "So what?"
    What could we do differently? - "Now what?"
    How do we plan to use evaluation findings for continuous learning? - "Then what?"

     
    Framework Worksheet for the Five Key Evaluation Questions
    Developing the Questions by Project Activity Type

    Activity Type: Needs Assessments

    Objective: to identify barriers to health and potential strategies to reduce the barriers
    Did we do what we said we would do? - "What"
    What did we learn about what worked and what didn't work? - "Why"
    What difference did it make that we did this work? - "So what?"
    What could we do differently? - "Now what?"
    How do we plan to use evaluation findings for continuous learning? - "Then what?"

    • Describe:

      - how the needs assessment was done
      - where it happened
      - who was involved
      - how they were involved.
    • Describe the relevance of the needs assessment to reducing barriers to health

     

    • What strategies worked well to involve members of the target population in the needs assessment? Why did they work? What strategies didn't work and why?
    • What were the key learnings from the needs assessment?
    • What did we learn about building partnerships with specific populations through the development and implementation of the needs assessment?
    • How appropriate were the project goals and objectives for the task of carrying out the needs assessment?
    • How were the findings of  the needs assessment used? By whom?
    • What were some of the unexpected impacts of the project work?

      - what factors in the project might account for the above changes?
      - what other factors in the community might account for the changes?
    • What more effective methods for carrying out needs assessments were identified through the project work?
    • What barriers to health that emerged through the needs assessment require future attention?
    • How were the evaluation results of the needs  assessment used? By whom?
    • How will the results of the needs assessment continue to be used to influence program and policy development?

     
    Framework Worksheet for the Five Key Evaluation Questions
    Developing the Questions by Project Activity Type

    Activity Type: Education and Awareness

    Objective: to raise awareness, increase knowledge and promote attitudes and practices that contribute to improved health
    Did we do what we said we would do? - "What"
    What did we learn about what worked and what didn't work? - "Why"
    What difference did it make that we did this work? - "So what?"
    What could we do differently? - "Now what?"
    How do we plan to use evaluation findings for continuous learning? - "Then what?"

    • Describe the activities carried out to meet the project goals and objectives.
    • Describe who was involved in project work and their roles:

      - # of target population involved and their roles
      - # of community members involved and their roles
      - # of community agencies involved and their roles
    • Describe how the project activities link to the goals and objectives of education and awareness.
    • Describe any changes in project goals and objectives during the project and explain why the changes were made.

     

    • What strategies worked well to involve members of the target population in the project? Why did they work? What strategies didn't work and why?
    • What did we learn about conducting participatory education and awareness programs?
    • What project structures, relationships and skills worked well to enhance the project work of educating and raising people's awareness about the conditions that affect their health? Why did they work well? Which ones did not work well and why? 
    • What did we learn about building partnerships with specific populations to do education and awareness work?
    • What did we learn about costs and savings associated with different strategies for increasing education and awareness?
    • How appropriate were the project goals and objectives for the task of increasing knowledge and raising awareness of the conditions that affect people's health, both individually and collectively?
    • What changes of attitudes/ skills/behaviour were identified by:

      - target population?
      - project staff and volunteers?
      - community groups?
      - healthcare providers?
      Examples of changes:
      - increased interest in promoting health issues
      - increased involvement of target group in accessing services
      - increased requests for additional information on health issues
      - increased number of participants involved in prevention work
    • What were some of the unexpected impacts of the project work? 

      - What factors in the project might account for the above changes?
      - What other factors in the community might account for the changes?
    • What strategies for conducting effective education and awareness activities do we recommend for future groups?
    • What new skills and resources are required to more effectively conduct education and awareness activities? What barriers to conducting education and awareness activities require future attention?
    • What did we learn that can be applied to future work to conduct education and awareness activities?
    • Describe the process followed throughout the project to use the evaluation findings to make adjustments and strengthen project work.
    • How can learnings from the project be used to strengthen future education and awareness activities? 
    • What did this project learn about doing evaluation that contributes to continuous learning about raising awareness and increasing knowledge?

     
    Framework Worksheet for the Five Key Evaluation Questions
    Developing the Questions by Project Activity Type

    Activity Type: Resource Development

    Objective: to create resources and tools to use for increasing knowledge and developing skills
    Did we do what we said we would do? - "What"
    What did we learn about what worked and what didn't work? - "Why"
    What difference did it make that we did this work? - "So what?"
    What could we do differently? - "Now what?"
    How do we plan to use evaluation findings for continuous learning? - "Then what?"

    • Describe the resources and tools developed in the project: 

      - type of resources
      - how resources were used
      - who used them.
    •  Describe the process followed for developing the resources.
    • Describe the distribution plan for the resources.
    • Describe who was involved and the process used to develop, review and use the resources.
    • Describe how the resources developed can contribute to better health practices.
    • Describe any changes in project goals and objectives during the project and explain why the changes were made.
    • What strategies worked well to involve members of the target population in the development of resources? Why did they work? What strategies didn't work and why?
    • What did we learn about making resources more accessible to specific populations?
    • What did we learn about the costs and savings associated with the development of resources?
    • What skills and activities were required to ensure optimum use of the new resources? e.g.

      - training on how to use the resource
      - development of a speaker's guide to introduce new resources.
    • What did we learn about building partnerships with specific populations to develop resources?


    • How do the resources contribute to better health practices? e.g., by increasing knowledge, skills and/or by changing behaviour, policy.
    • Which groups in your community benefited from the development of the resources and how?
    • What other communities and groups benefited from the development of the resources and how?
    • What were some of the unexpected impacts of the project work?

      - What factors in the project might account for the above changes?
      - What other factors in the community might account for the changes?
    • As a result of creating these resources, what gaps were identified that need to be addressed in the future?
    • What would we do differently if we were developing new resources in the future?
    • What new skills and resources are required to more effectively develop resources?
    • How can learnings from the project be used to strengthen the development of effective resources?
    Framework Worksheet for the Five Key Evaluation Questions
    Developing the Questions by Project Activity Type

    Activity Type: Skills Development

    Objective: to build the capacity of individuals, groups and communities to reduce the barriers to health
    Did we do what we said we would do? - "What"
    What did we learn about what worked and what didn't work? - "Why"
    What difference did it make that we did this work? - "So what?"
    What could we do differently? - "Now what?"
    How do we plan to use evaluation findings for continuous learning? - "Then what?"

    • Describe the activities carried out to meet the project goals and objectives. 
    • Describe who was involved in project work and their roles:

      - # of target population involved and their roles
      - # of community members involved and their roles
      - # of community agencies involved and their roles
    • Describe what skills were learned, by whom and how they were used to improve health practices.
    • Describe the factors that contributed to the development of new skills.
    • Describe how the new skills link to improved health practices.
    • Describe any changes in project goals and objectives during the project and explain why the changes were made.
    • What strategies worked well to involve members of the target population in the skills development process? Why did they work? What strategies didn't work and why?
    • What did we learn about developing effective skills for improving health practices?
    • What project structures/ relationships worked well to enhance the project work and why? Which ones did not work well and why?
    • What did we learn about building partnerships with specific populations to develop skills?
    • What did we learn about costs and savings of enhancing individual, group and community skills capacity to reduce barriers to health?
    • What skills were learned, by whom, and how were they used to reduce health barriers?
    • What were some of the unexpected impacts of the project work?

    • - What factors in the project might account for the above changes?
      - What other factors in the community might account for the changes?
    • What other skills and resources are required to carry out skills development among the target group?
    • What barriers to health emerged in this project that require future skills development?
    • What did we learn that can be used to increase the effectiveness of future skills development strategies?
    • Describe how ongoing evaluation data was used to improve the effectiveness of the skills development process.
    • How can learnings from the project be used to strengthen future skills development work?

     
    Framework Worksheet for the Five Key Evaluation Questions
    Developing the Questions by Project Activity Type

    Activity Type: Developing Innovative Models

    Objective: to contribute to the development of knowledge about alternative health action initiatives that lead to a
                      more flexible, responsive and cost-effective health system
    Did we do what we said we would do? ("What")
    What did we learn about what worked and what didn't work? ("Why")
    What difference did it make that we did this work? ("So what?")
    What could we do differently? ("Now what?")
    How do we plan to use evaluation findings for continuous learning? ("Then what?")

    • Describe the activities carried out to meet the project goals and objectives
    • Describe who was involved in project work and their roles:

    • # of target population involved and their roles?
      # of community members involved and their roles?
      # of community agencies involved and their roles?
    • Describe how the model contributed to reducing barriers to health for specific populations.
    • Describe any changes in project goals and objectives during the project and explain why the changes were made.

     

    • What were the key learnings from this project model about improving health practices with specific populations?
    • What did we learn about costs and savings associated with this model?
    • What did we learn from this model about building partnerships with specific populations?
    • How appropriate were the project goals and objectives for developing and testing the model?
    • What changes of attitudes, skills, behaviour and policy resulted from the new model? Who identified them:

    • - target population?
      - project staff and volunteers?
      - community groups?
      - health care providers?
    • What evidence is there to attribute these changes to this model?
    • What were some of the unexpected impacts of the project work?
    • - What factors in the project might account for the above changes?
      - What other factors in the community might account for the changes?
    • What recommendations for reducing health barriers can we make from the results of this project?
    • What barriers to health were identified from this project model that require future attention?
    • Who received the project findings and how did they use them?
    Framework Worksheet for the Five Key Evaluation Questions
    Developing the Questions by Project Activity Type

    Activity Type: Reducing Barriers to Health

    Objective: to empower individuals, groups and communities to reduce or overcome barriers to health through:
    - broadening access to health information, practices and care for specific populations
    - developing and enhancing coalitions and partnerships
    - promoting healthy public policy
    Did we do what we said we would do? ("What")
    What did we learn about what worked and what didn't work? ("Why")
    What difference did it make that we did this work? ("So what?")
    What could we do differently? ("Now what?")
    How do we plan to use evaluation findings for continuous learning? ("Then what?")

    • Describe the activities carried out to meet the project goals and objectives
    • Describe who was involved in project work and their roles:

    • # of target population involved and their roles?
      # of community members involved and their roles?
      # of community agencies involved and their roles?
    • Describe how the project activities link to reducing barriers to health for specific population groups.
    • Describe any changes in project goals and objectives during the project and explain why the changes were made.

     

    • What strategies worked well to involve members of the target population in the project? Why did they work? What strategies didn't work and why?
    • What did we learn about making health services more accessible to specific populations?
    • What project structures/ relationships/skills worked well to enhance the project work of reducing barriers to health and why? Which ones did not work well and why?
    • What did we learn about building partnerships with specific populations to reduce barriers to health?
    • What did we learn about costs and savings associated with different strategies for reducing health barriers?
    • How appropriate were the project goals and objectives for the task of reducing health barriers for specific population groups?
    • What changes of attitudes, skills, behaviour and policy related to reducing health barriers were identified by:

    • - target population?
      - project staff and volunteers?
      - community groups?
      - healthcare practitioners?
    • Examples of changes:

    • - new groups formed
      - increased interest in promoting health
      - community services more accessible
      - issue on the agenda of local health council
    • What were some of the unexpected impacts of the project work?

    • - What factors in the project might account for the above changes?
      - What other factors in the community might account for the changes?
    • What strategies to reduce health barriers do we recommend for future groups?
    • What new skills and resources are required to more effectively address the barriers to health?
    • What barriers to health emerged in this project that require future attention?
    • What did we learn that can be applied to future work to address health barriers?
    • Describe the process followed throughout the project to use the evaluation findings to make adjustments and strengthen project work.
    • How can learnings from the project be used to strengthen future work in reducing health barriers?
    • What did this project learn about doing evaluation that contributes to continuous learning about reducing barriers to health?
    • How will the findings be used for future knowledge development?
    Framework Worksheet for Success Indicators
    Developing Indicators by Project Activity Type

    Activity Type:
    Objective:
    Project Activity
    Indicators of Success and their Measures / Evaluation Tools / Who Has the Information

    Framework Worksheet for Success Indicators
    Developing Indicators by Project Activity Type

    Activity Type: Needs Assessment

    Objective: to identify barriers to health and potential strategies to reduce the barriers

    Project Activity
    Indicators of Success and their Measures / Evaluation Tools / Who Has the Information
    For example, a project consists of conducting a needs assessment of 100 needle users to determine the factors which contribute to their commitment to reduce risk factors and to increase the awareness of the needle exchange program by needle users.

    1. Knowledgeable observers and other researchers agree that the factors identified are credible
       1.1 At least two knowledgeable observers or researchers agree that the identified factors are reasonable and credible
       Evaluation tool: face-to-face interviews
       Who has the information: expert collaborators

    2. The needs assessment findings are used as a basis for developing intervention programming
       2.1 At least one major project or program adopts the needs assessment findings and implements appropriate programming
       Evaluation tool: project documentation
       Who has the information: staff

    3. Awareness of the needle exchange program is increased among needle users interviewed for the needs assessment investigation
       3.1 At least 30% of new users of the needle exchange program report that they became aware of the exchange program because of the needs assessment process
       Evaluation tool: project records (intake forms)
       Who has the information: staff

    Framework Worksheet for Success Indicators
    Developing Indicators by Project Activity Type

    Activity Type: Education and Awareness

    Objective: to raise awareness, increase knowledge and promote attitudes and practices that contribute to improved health

    Project Activity
    Indicators of Success and their Measures / Evaluation Tools / Who Has the Information

    For example, local youth develop a series of 3 "skits" on HIV/AIDS prevention and present them to 5 local intermediate schools.

    1. Student audience has gained awareness and knowledge of HIV/AIDS issues
       1.1 Average student audience score on a basic facts, multiple choice HIV/AIDS knowledge test increases by at least 15% after viewing the skits
       Evaluation tool: pre/post questionnaire
       Who has the information: target group
       1.2 At least 30% of the student audience report an increased awareness of HIV/AIDS issues on a self assessment evaluation form
       Evaluation tool: presentation questionnaire
       Who has the information: target group

    2. Increased visibility of HIV/AIDS issues in the community through press coverage of the project activity
       2.1 At least one article in the local press about the presentation
       Evaluation tool: project diary
       Who has the information: staff

    3. On-going program activities are established as a result of project activities
       3.1 At least 2 of the schools develop and implement a curriculum module to address HIV/AIDS issues and those responsible agree that these modules have been developed as a direct result of the project activities
       Evaluation tool: survey questionnaire
       Who has the information: school contact people
       3.2 Community groups and other organizations provide financial and other support to ensure that the skits will continue on an annual basis after project funding ends
       Evaluation tool: project records (financial records)
       Who has the information: project sponsors/staff
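    A pre/post measure such as 1.1 above can be checked with a small calculation. The sketch below uses hypothetical scores and helper names (`average`, `meets_increase_target` are not from the guide); the 15% threshold comes from the measure itself.

    ```python
    # Minimal sketch with hypothetical data: testing a pre/post indicator such as
    # measure 1.1 above, where the average knowledge-test score must increase by
    # at least 15% after the presentation.

    def average(scores):
        return sum(scores) / len(scores)

    def meets_increase_target(pre_scores, post_scores, min_increase=0.15):
        """True if the average post score exceeds the average pre score
        by at least min_increase (a proportion: 0.15 means 15%)."""
        pre_avg = average(pre_scores)
        post_avg = average(post_scores)
        return (post_avg - pre_avg) / pre_avg >= min_increase

    # Hypothetical class scores: the average rises from 50 to 60, a 20% increase.
    print(meets_increase_target([40, 50, 60], [55, 60, 65]))  # True
    ```

    The same pattern applies to any indicator stated as "average score increases by at least X%": compute the benchmark average, the follow-up average, and compare the relative change to the threshold.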
     

    Framework Worksheet for Success Indicators
    Developing Indicators by Project Activity Type

    Activity Type: Resource Development

    Objective: to create resources and tools to use for increasing knowledge and developing skills

    Project Activity
    Indicators of Success and their Measures / Evaluation Tools / Who Has the Information

    For example, a resource package is developed to help health agencies and institutions become more accessible to persons with disabilities (but the actual distribution of the package is not included in this resource development project).

    1. The need for this package has been well established
       1.1 A survey of targeted institutions and agencies shows that over 30% of agencies and institutions agree that such a package was needed
       Evaluation tool: survey questionnaire
       Who has the information: community agencies
       OR
       1.2 A preliminary Needs Assessment Study for the development of this package was carried out and a credible and independent reviewer of this study agrees that the study was well conducted and that the study has established the need for such a package
       Evaluation tool: project documentation (external review report)
       Who has the information: staff

    2. The package consists of relevant, accurate, useful and credible information and recommendations
       2.1 At least one independent "expert", such as a senior member of an organization representing persons with disabilities, who is knowledgeable about the issues associated with each area of disability, has reviewed the package and agrees that the information and recommendations are relevant, accurate, useful, and credible
       Evaluation tool: project documentation (external review report)
       Who has the information: staff
       2.2 Over 75% of surveyed agencies and institutions agree that the information and recommendations in the package are relevant and useful to these agencies and institutions
       Evaluation tool: survey questionnaire
       Who has the information: community agencies

    3. A distributor or distribution mechanism has been identified to distribute the package to targeted agencies and institutions
       3.1 At least one credible distributor organization or distribution mechanism has been identified that will distribute the package to at least 80% of targeted agencies and institutions
       Evaluation tool: project records
       Who has the information: staff

    Framework Worksheet for Success Indicators
    Developing Indicators by Project Activity Type

    Activity Type: Skills Development

    Objective: to build the capacity of individuals, groups and communities to reduce the barriers to health

    Project Activity
    Indicators of Success and their Measures / Evaluation Tools / Who Has the Information

    For example, a project will train volunteers to answer a 1-800 Breast Cancer Line.

    1. Participating volunteers report that the training has been effective
       1.1 At the end of the training session at least 80% of participants give an overall assessment of the training as "effective" or "very effective"
       Evaluation tool: post-training questionnaire
       Who has the information: target group participants

    2. The telephone answering skills of volunteers have improved
       2.1 At least 75% of participants show an improvement in relevant answering skills after completion of the training
       Evaluation tool: pre/post test using test situation scenarios
       Who has the information: target group participants

    3. Recently trained volunteers perform well
       3.1 Supervisors responsible for recently trained participants rate 85% of trained volunteers as performing at a "good" or "excellent" level in actual answering situations during the 2 month period immediately following the training
       Evaluation tool: project records (answering skills performance assessment sheet)
       Who has the information: staff (supervisors)
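    A proportion-type measure such as 2.1 above asks what fraction of participants improved, rather than how much the average changed. A minimal sketch, with hypothetical ratings and helper names of our own choosing:

    ```python
    # Minimal sketch with hypothetical data: testing a proportion-type indicator
    # such as measure 2.1 above, where at least 75% of participants must show an
    # improvement in answering skills after the training.

    def proportion_improved(pre_scores, post_scores):
        """Fraction of participants whose post-training score beats their
        pre-training score (scores are paired by participant)."""
        improved = sum(1 for pre, post in zip(pre_scores, post_scores) if post > pre)
        return improved / len(pre_scores)

    def meets_proportion_target(pre_scores, post_scores, threshold=0.75):
        return proportion_improved(pre_scores, post_scores) >= threshold

    # Hypothetical skill ratings for 4 volunteers: 3 of the 4 improved (75%).
    print(meets_proportion_target([3, 5, 4, 2], [4, 6, 4, 5]))  # True
    ```

    Note the difference from an average-based measure: one participant with a large gain cannot compensate for many who did not improve at all.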
     

    Framework Worksheet for Success Indicators
    Developing Indicators by Project Activity Type

    Activity Type: Developing Innovative Models

    Objective: to contribute to the development of knowledge about alternative health action initiatives that lead to a more flexible, responsive and cost-effective health system

    Project Activity
    Indicators of Success and their Measures / Evaluation Tools / Who Has the Information

    For example, a project consists of an outreach program linking through computer communications youth aged 11-14 in 5 remote fishing villages in Newfoundland.

    1. Targeted youth regularly communicate using this technology
       1.1 At least 3 contacts of at least 15 minutes a week are made for each participating youth during the project period
       Evaluation tool: project records (weekly time sheets)
       Who has the information: target group

    2. Targeted youth increase their use of other communications technologies to communicate with other participants
       2.1 Participants report an increased use of other communications technologies
       Evaluation tool: guided telephone interviews
       Who has the information: target group

    3. Participants feel less isolated
       3.1 Participants report reduced feelings of isolation and alienation
       Evaluation tool: guided telephone interviews
       Who has the information: target group

    4. High youth interest and participation in the use of this technology
       4.1 At least 50% of the youth aged 11-14 in these communities make use of this technology to communicate with youth outside their own communities
       Evaluation tool: project records (participant log books)
       Who has the information: target group

    Framework Worksheet for Success Indicators
    Developing Indicators by Project Activity Type

    Activity Type: Reducing Barriers to Health

    Objective: to empower individuals, groups and communities to reduce or overcome barriers to health through:
    - broadening access to health information, practices and care for specific populations
    - developing and enhancing coalitions and partnerships
    - promoting healthy public policy

    Project Activity
    Indicators of Success and their Measures / Evaluation Tools / Who Has the Information

    For example, a project consists of conducting homophobia training for health and social service professionals who work with HIV+ gay men, their friends and families.

    1. Workshop participants have increased their knowledge and awareness of issues facing HIV+ gay men, their friends and families and have developed more positive attitudes towards the HIV+ gay population
       1.1 Average participants' score on a true/false knowledge and gay stereotypes test increases by at least 15% after workshop participation
       Evaluation tool: pre/post questionnaire
       Who has the information: target group (health and social service professionals)
       1.2 At least 50% of participants report an increased awareness and sensitivity towards the gay population and HIV/AIDS issues
       Evaluation tool: self assessment reaction sheet
       Who has the information: target group

    2. Less homophobic behaviour and attitudes are experienced by gay men at institutions where workers received training
       2.1 A majority of HIV+ gay clients report a reduced level of homophobic behaviour and attitudes among workers who received training
       Evaluation tool: face-to-face interviews
       Who has the information: HIV+ gay clients

    3. More HIV+ gay men use institutional facilities because of reduced homophobic attitudes and behaviour
       3.1 An increase of at least 10% in usage by HIV+ gay men at institutions where workers received training
       Evaluation tool: program records
       Who has the information: staff
       3.2 A majority of new HIV+ gay users state that they have increased their use of facilities because of the reduced level of homophobic behaviour and attitudes
       Evaluation tool: survey questionnaire
       Who has the information: new HIV+ gay users

    Success Indicators of Increased Public Participation and Strengthened Community Groups

    Listed below are additional examples of indicators of success for two health promotion program/project impacts: increased public participation and strengthened community groups. For each impact, sample indicators of success are given. Below the indicators are the types of questions project staff can ask themselves in order to determine these indicators of success.

    Remember, setting indicators of success presumes that first you have determined your benchmarks. As described in Chapter 5, section 5.1, benchmarks are the status of your target group's knowledge, attitudes, behaviour or skills before project work begins.
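    As a rough illustration of how benchmarks support later comparison (all measure names and values below are hypothetical, not from any project in this guide):

    ```python
    # Minimal sketch with hypothetical measures and values: benchmarks record the
    # target group's status before project work begins, so follow-up measurements
    # can be compared against them to show change.

    benchmarks = {"aware_of_program": 0.20, "uses_services": 0.10}  # before the project
    follow_up  = {"aware_of_program": 0.55, "uses_services": 0.25}  # after the project

    for measure, baseline in benchmarks.items():
        change = follow_up[measure] - baseline
        print(f"{measure}: {baseline:.0%} -> {follow_up[measure]:.0%} (change {change:+.0%})")
    ```

    Without the baseline column, a follow-up figure like "55% are aware of the program" says nothing about what the project changed.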

    Indicators of increased public participation:

    • The people who experience the health or social issue the project addresses are involved in making decisions about the project.
    • The project reaches the consumers it wants to reach.
    • Skills and knowledge are transferred from individuals to the community. (e.g., increased sensitivity, interpersonal skills)
    • Social support networks are expanded.
    • Consumers are involved in every aspect of the project, from planning to evaluation.
    • Those involved with the project gain knowledge and skills through their involvement (e.g., increased self-confidence, organizational skills).
    • Increased collective action.
    • Those involved with the project form a foundation for ongoing social change.

    Examples of evaluation questions for project staff:

    • Who was reached by the project? How were they reached? How many were reached?
    • How did we involve consumers in the planning, implementation and evaluation of the project? List the roles assumed by consumers. How did the involvement change throughout the project?
    • What barriers existed to involving consumers in the project?
    • What steps did we take to encourage consumers to get involved?
    • What changes in skills and knowledge did consumers experience as a result of their involvement in the project? What opportunities will consumers have to use them after the project?
    • What did we learn from consumers about: specific health needs? Factors that affect their health? How to improve conditions that affect their health? How has this learning been used after the project?

    Indicators of Strengthened Community Groups:

    • Positive visibility, recognition and acceptance in their communities.
    • Increased knowledge, skills and understanding by the group's members.
    • Building of coalitions and the formation of partnerships.
    • Creation of a foundation to support future activities.
    • A greater sense of group identity and purpose.
    • Ability to influence and/or participate in decision making that affects the community.
    • The exchange of knowledge, skills and understanding (e.g., sharing of knowledge of health needs of the community with other groups).
    • Cooperation with other groups and networks.
    • Ability to sustain the participation of the community.
    • A sense of the group's collective power.

    Examples of evaluation questions for project staff:

    • In what ways is our group stronger than at the beginning of the project? (e.g., stronger sense of identity, incorporated, money raised from diverse sources, media coverage, larger number of volunteers, new skills)
    • What different groups in our community did we work with? What new groups did we work with? What did we do together?
    • In what ways will the work started during the project be continued after the project ends?
    • To what extent and in what ways did the image of our group change in the community?
    • What experiences did our group have in affecting community decisions to improve the conditions that affect the health of our community? How will our ability to influence decision making be continued?


    Sample Reaction Sheet for an Evaluation Workshop

    1.     How useful was this training for you?
     
    not useful    so-so    useful    very useful

    Comments:
     

    2.     What 3 words would you use to describe the training?

    •  
    •  
    3.     How did this training contribute to your understanding of evaluation?
     

    4.     What suggestions do you have to make the training more useful?
     

    5.     What comments would you like to make about the trainers?
     

    6. What is one thing that you got from the training that you could use right away in your work?

    Sample Questions for Guided Telephone Interview

    Community Handbook on Resources for Women with Breast Cancer

    • How did you hear about the Community Handbook on Resources for Women with Breast Cancer? (health professional, friend, advertising poster, other)
    • Which sections of the handbook were most helpful to you? How were they helpful?
    • What is missing from the book that you would like to see included?
    • What suggestions do you have for making the handbook more useful?
    • Do you plan to start using any of the resources in the community because of ideas you received from the handbook? If so, which resources?
    Sample Focus Group Interview Guide

    Children's Safety Awareness Project

    •  What contributed to the success of the child safety awareness activities in your community?
    • What are the barriers you experienced when you implemented child safety awareness programs in your community?
      • - timing
        - money
        - human resources
        - attitudes
        - culture
        - politics
  • What happened in your community as a result of being chosen as a pilot site for the child safety awareness project?
    • - activities directly related to project
      - ripple effect of being involved
      - unexpected consequences
    • What ideas do you have about different ways to set up national child safety awareness programs?
    • What advice do you have for other communities who are planning to set up child safety awareness programs?

    Sample Guidelines For Keeping a Project Diary

    Children's Safety Awareness Project

    We are asking a few people in each of the five pilot communities to keep a project diary during this next year. The diary is your own record of your experience of using the resource materials to do child safety awareness work in your community. There is no right or wrong kind of information to collect. You may decide to note thoughts that you have, ideas that come to you about what you think would work better, or anecdotes that describe what you did and how you felt about what happened. All of this information will help us put the human side and your unique community experience into the evaluation. Everything we learn from the collective learnings will be identified and shared with you. Anonymity of individuals will be preserved.

    To help us put some structure to the analysis of the material and provide some consistency across all of the communities, we will be using the information in your project diary to answer the following questions:

    1.     What difference did each of the following elements of the project make to your community's experience of carrying out child safety awareness activities?
            (a) money (b) project sponsor's presence in the community (c) resource materials (d) program consultant.

    2.     What are the major challenges to getting awareness activities going and to keeping them going in your community?

    3.     Were there any ways in which the information in the resource material provided help to individuals, coordinating and planning groups and agencies in the
            communities to address the challenges? Describe.

    4.     What are you learning about planning and implementing child safety awareness aimed at changing beliefs, attitudes and behaviours?

    5.     What suggestions do you have about making the resource materials more accessible and useful?

    6.     How do community projects keep volunteers interested, committed and actively involved in child safety awareness work? Who gets involved?

    Remember - you just jot down experiences and thoughts you have through the year that relate to your involvement with the child safety awareness project. You do not have to directly answer the above questions. Building collective answers to these questions is our job. Keeping the diary should be enjoyable, not onerous, so do what works for you. Thanks.

    Have Fun!
     

    Sample Mail Out Questionnaire

    Advisory Committee, Health and Disabled Women's Project

    The terms of reference for the advisory committee are to:

    • hold two meetings a year in Toronto
    • participate in conference calls
    • provide input on project issues as they develop
    • assist in the development of health brochures and other literature
    • assist in creating a workplan and strategy to establish working groups in communities outside Toronto who will organize local workshops and will form the framework for DAWN Ontario.
    1.     How successful was the advisory committee in carrying out its terms of reference as listed above?

    2.     Which task(s) did the advisory committee do well? Why?

    3.    What helped you personally to contribute to the work of the committee?

    4.     What got in the way of your contributing to the work of the committee?

    5.     What impact did being on the committee have on

    a) your personal life (time, money, stress)

    b) your attitudes and approaches to health issues of women with disabilities

    c) how you address issues of accessibility in your own work

    d) your own confidence and commitment to work on health issues of women with disabilities?

    6.     What was the key learning for you as a member of the advisory committee?

    7.     What suggestions can you give DAWN Ontario for structuring future advisory committees?

    8.     What recommendations do you have for making advisory committees in the health care system more accessible to women with disabilities?

    Thank you
     

    Sample Mail Out Questionnaire

    Health Care Providers, Health and Disabled Women's Project



    1.     How familiar are you with the Health and Disabled Women's Project?
     
    Not familiar    So-so    Very familiar


    2.     Please identify the type(s) of contact you had with the Health Project.
     
    a. attended a workshop in which Health Project staff participated
    b.  reviewed education material developed during the project
    c. attended the symposium at Geneva Park in 1992
    d. requested and received information
    e. responded to the initial questionnaire on accessibility and services in 1991
    f. other
     

    3.     As a result of your contact with the DAWN Health Project did you:
     
    No Yes
    a. increase your own awareness of health issues as experienced by women with disabilities (WWD)?

    b. change any of your own assumptions or practices with WWD?

    c.  increase your own interest in providing appropriate care to WWD?

    d. become involved in increasing the accessibility of your services or facilities to better accommodate WWD?

    Comments :
     

    4. How would you rate the following education tools developed through the project?
     
    Not useful So-So Very Useful
    a.     The Education Brochures
           • WWD talk about sexuality
           • You and your doctor
           • Mothering and WWD
           • Guide for Health Care Professionals
    b.     The Access checklist
    c.     Les actes du Colloque

    Comments:
     

    5. What kinds of activities would you consider doing in the future to address health care issues of WWD? For example, would you:
     
    No Yes Maybe
    a. facilitate a workshop with a disabled woman?


    b. talk to other professionals about the health care needs of WWD?


    c. distribute the education materials mentioned above?


    d. advocate for an accessibility audit to be done on your facility by a barrier-free consultant?


    e. refer a WWD to DAWN Ontario for support and information?


    f. other
     

    6. Based on your experience with the Health Care Project, what advice would you give other health care providers about addressing the needs of women with disabilities?






    7. DAWN Ontario wants to be a more active partner in the process of defining and addressing health concerns of disabled women. What suggestions do you have for making this happen in your agency/program?






    8. What is your title? Briefly describe the kind of work you do.






    THANK YOU
     

    We Value Your Feedback

    Thank you for using this evaluation guide! Your comments and suggestions can help us keep the guide current and useful. If you have a moment, please answer and return the following questionnaire. Respondents are eligible to receive future updates to the guide; please complete Question 5 if you are interested in receiving future updates.

    1.     Is the guide useful to you?

           (1 = not very useful)
           (5 = extremely useful)

           1  2  3  4  5

           a) Which part of the guide did you find most helpful?

           b) How do you use the guide?



    2.     Is the guide easy for you to use?

           (1 = not very useful)
           (5 = extremely useful)

           1  2  3  4  5

           a) How could it be made easier to use?
     
    Comments:





    3.     Are there additional resources that should be added to the Annotated Bibliography? If so, please provide:
     
    Name of resource:
    Date: 
    Language:
    Source:
    Tel.: 
    Fax :
    E-Mail:
    Short description:



    4.     Do you have any other comments or suggestions to improve the guide?







    5.     Are you interested in receiving future updates of the guide?

            Yes  ___ No     ___

            If so, please answer Question 6.

    6.     The following information is optional (unless you answered "yes" to Question 5). These details will help us identify the types of organizations that most commonly use the guide and what their specific needs are.

    This information will be used for our statistics only.
     
    Your Name:
    Name of your organization:
    Complete mailing address:


     
    Telephone number:
    Please return this questionnaire by mail or fax to:

    National Program Consultant
    Evaluation and Coordination
    Health Canada
    Population Health Directorate
    st Floor, Finance Building
    Address Locator 0201C
    Ottawa, Ontario K1A 1B4
    Fax: (613) 957-1565
     

     
     

    Last Updated: 2005-06-10