
Family Violence Initiative

Project Managers' Guide to Performance Measurement and Evaluation



APPENDIX 3:
Tools for Project Managers


TABLE OF CONTENTS

Case Scenarios & Logic Models

Case Scenario 1: Special Edition of a Provincial Newsletter on “Abuse of People with Disabilities”

Case Scenario 2: Child Victims and the Criminal Justice System – Legislative Review

Case Scenario 3: Expert Consultation on Key Research Issues

Case Scenario 4: Child Witnesses of Family Violence: Training/Research Project

Road Map—Template

Project Level Evaluation Plan - Checklist

An Overview of Information/Data Collection Methodologies

Guidelines for Tool Development & Examples

Workshop or Conference Evaluation Feedback Forms

Interviews

Surveys

Focus Groups

Cluster Evaluations


Case Scenarios & Logic Models

Case Scenarios & Logic Models – Road Maps to Results

Here are four case scenarios and related results-oriented logic models. Think of the logic model as a road map that leads to results. Use these as examples in developing other models; a minimal structural sketch in code follows the steps below.

Steps in Developing an Outcome Logic Model:

1.  Define your goals and objectives

2.  Add your activities (overall activity and specific activities)

3.  Identify your “target” or “client” groups

4.  Identify the outputs of your activities

5.  Add the immediate outcomes (within 1 year) that will result from your objectives, activities and outputs

6.  Add the intermediate outcomes that your objectives, activities and outputs will lead to in one to two years

7.  Add the long-term outcomes that your objectives will lead to further down the road (2+ years)

8.  Check the overall logic of your model

9.  Refine your logic model as your project develops.
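
For readers who find it helpful to see the structure laid out explicitly, here is a minimal sketch (not an official template) of an outcome logic model expressed as a simple data structure, using the newsletter project as a hypothetical example; all field values are illustrative only.

```python
# A hypothetical outcome logic model for the newsletter project, with one
# field per step above. Values are illustrative, not prescribed.
logic_model = {
    "goals_and_objectives": ["Educate provincial citizens on the abuse of people with disabilities"],
    "activities": ["Provincial consultation tour", "Draft, review, print and distribute newsletter"],
    "target_groups": ["Service providers", "Legal community", "General public"],
    "outputs": ["Consultation report", "10,000 newsletters distributed"],
    "immediate_outcomes": ["Increased awareness of the issues (within 1 year)"],
    "intermediate_outcomes": ["Service providers use the newsletter in their work (1-2 years)"],
    "long_term_outcomes": ["Improved provincial response to abuse (2+ years)"],
}

# Step 8, checking the overall logic, can start with confirming that no
# element of the model has been left empty.
for element, entries in logic_model.items():
    assert entries, f"Logic model element '{element}' is still empty"
```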

See the W.K. Kellogg Foundation Evaluation Handbook for further information on developing different types of logic models.

Case Scenario 1: Special Edition of a Provincial Newsletter on “Abuse of People with Disabilities”

Purpose

  • Newsletter will contain six legal information articles on issues related to the abuse of people with disabilities.
  • Six articles (eight pages in total) strike the right balance: a manageable amount of information for the reader.
  • Involves provincial consultation to determine most important issues.

Goals

  • Produce a newsletter to educate provincial citizens on the abuse of people with disabilities
  • Partner and consult with BC service providers to identify the most important issues and the best way to address them in print
  • Distribute the Newsletter throughout BC with emphasis on distribution to service providers
  • Evaluate the Newsletter and report on findings.

Workplan

A. Consultation Process

Purpose:

  • To generate guidance and feedback on content and contacts [not a full-scale research project and should not be held to such a standard].

Extent:

  • A minimum of 25 organizations provincially/regionally distributed.

Process:

  • Two-week provincial consultation tour through key regions.
  • Consultation with service providers and advocates (community mental health workers, mental health associations, women's, seniors' and youth centres, multi-cultural and aboriginal organizations, and individuals with disabilities), and with partners.
  • Consultation with legal community (Crown Prosecutors, lawyers, advocates).

Consultation Questions:

  • How do you identify people with disabilities who may be abused? What warning signs do you look for?
  • Is there any one group that is at higher risk than the others? (Women? Youth? People with physical disabilities? People with mental disabilities? Multi-cultural communities?)
  • What services exist to help people with disabilities who are being abused?
  • What are the biggest hurdles these people have to face when it comes to receiving help?
  • Do you think the public is aware of this issue? What more do you think the public needs to know?
  • If you were writing a booklet to educate people about this issue, what would you include? (This is the key question and the main purpose of the consultation.)

Consultation Report:

  • Consultation report to inform newsletter.

B. Newsletter Production

  • Draft the Newsletter
  • Departmental review
  • Prepare it for printing
  • Print

C. Newsletter Distribution

  • Update the mailing list and distribute to: the non-profit sector; associations and organizations associated with physical and mental disabilities; and various multi-cultural and aboriginal groups that are not included under another heading.
  • 10,000 Newsletters
  • Evaluation Forms
  • Self Addressed Stamped Envelopes

Evaluation

Plan:

  • Of the 10,000 newsletters, 1,000 packages (10% of the total mail-out) will include an evaluation form and a self-addressed stamped envelope. This distribution process will ensure appropriate and relevant contact lists, and will facilitate the prompt return of evaluations.
  • These evaluations will be tabulated and incorporated into the reporting process (a tabulation sketch follows the "Data to be tracked" list below).

Data to be tracked:

  • The number and percentage of returned evaluations
  • What our audience liked and disliked about the publication
  • Suggestions for improving our publications
  • Suggestions for future topics and services
  • Suggestions for other distribution targets
  • Previous knowledge of the services mentioned (the People's Law School and the other organizations listed)
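
As an illustration only (the guide does not prescribe any software), a short script could compute the return rate and tally the tracked data once the returned forms are keyed in; the file name and column names below are hypothetical.

```python
import csv
from collections import Counter

MAILED_WITH_FORMS = 1000  # 10% of the 10,000-copy mail-out

# Hypothetical file layout: one row per returned evaluation form, keyed in
# from the paper forms, with one column per tracked item.
with open("newsletter_evaluations.csv", newline="") as f:
    rows = list(csv.DictReader(f))

returned = len(rows)
print(f"Returned: {returned} of {MAILED_WITH_FORMS} ({returned / MAILED_WITH_FORMS:.1%})")

# Tally the answers that feed the reporting process.
for field in ("liked", "disliked", "suggested_topic"):
    counts = Counter(row[field] for row in rows if row.get(field))
    print(field, counts.most_common(5))
```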

Evaluation Resources:

  • Evaluation is a deliverable of this Newsletter project, but is not funded by DOJ.
  • The organization notes that DOJ funds the preliminary work that goes into creating an evaluation framework.
  • Due to timing, the actual compilation of the data received takes place in the fiscal year following funding; this therefore represents the project sponsor's contribution to the project.

Reporting

  • Report #1 (March 31st): Distribution of the Newsletter and first draft of the final report. This will include a first look at the evaluation but, as with past practice, the evaluation cannot be accurately reported on until six months after distribution.
  • Report #2 (September 30th): A report on the evaluation will be provided to the department six months after the completion date of March 31st.

National Sharing

  • Consultation report, newsletter and evaluation to be shared with community partners and national colleagues, including PLEAC and consultation participants.

Expected Outcomes

  • Increased awareness of the issues surrounding abuse of people with disabilities.
  • Increased awareness of provincial resources and services available to address this issue.

Budget

$24,500.00 for salaries and benefits, contract consulting and writing fees, printing, distribution, consultation travel and hosting.

Road Map for Newsletter Project (graph)

Case Scenario 2: Child Victims and the Criminal Justice System – Legislative Review

Purpose

  • Consultation on, and review of, criminal legislation to examine the need for criminal law reform related to specific offences against children, facilitating child victim/witness testimony, sentencing, and the age of consent to sexual activity.

Workplan

  • Establish & carry out FPT consultation process.
  • Establish & carry out public consultation process.
  • Determine/coordinate related research activities.
  • Report on consultation and research results.
  • Coordinate/develop follow-up options at FPT level.
  • Legislative amendments and implementation as appropriate.

Expected Outcomes

  • Criminal law reform.
  • Strengthened, coordinated criminal justice response to child victim issues.

Road Map for Legislative Review (graph)

Case Scenario 3: Expert Consultation on Key Research Issues

Purpose

  • Consultation with key experts in the field of family violence, to identify priorities in family violence research from a justice perspective.

Goals

  • Seek information & advice from key experts.
  • Exchange ideas & information on research needs.
  • Consider results in research planning.

Workplan

A. Identify & Invite Experts

  • Include a range of experts in the fields of violence against women and children.
  • Include those with expertise in specific population groups or living contexts (e.g. rural, remote, aboriginal, cultural diversity).

B. Plan Meeting

  • Prepare agenda, consultation questions and consultation package.
  • Arrange for presentations and logistics.
  • Prepare the evaluation form.

C. Hold Meeting

  • Conduct meeting.
  • Record proceedings.

D. Meeting Report

  • Draft meeting report & evaluation results.
  • Disseminate report.

E. Planning Implications

  • Consider results in research planning including opportunities for collaboration.

Evaluation

  • Evaluation questionnaire will be administered at the meeting.
  • Information will be considered in research and evaluation planning.

Budget

$40,000 for meeting, travel for participants, facilitation, reporting, evaluation.

Road Map for Expert Consultation Project (graph)

Case Scenario 4: Child Witnesses of Family Violence: Training/Research Project

Purpose

  • Develop a resource for casework staff and program volunteers.
  • Utilize existing resources to develop a training program for volunteers that can be incorporated into the current orientation and training material delivered within a national, volunteer-based child and youth-serving organization.
  • The resource will be flexible: it will provide an overview and will help volunteers who are dealing with child witnesses to obtain more detailed information to support the child.

Goals

  • Develop a user-friendly, comprehensive training package to assist volunteers and caseworkers in supporting children and youth who witness family violence in the home.
  • Increase awareness of the need to support child witnesses of family violence.
  • Provide BBBSC member agencies, along with all other child and youth serving organizations, with a resource that will increase their capacity to serve children living in violent homes (a transferable and sustainable product).

Workplan

A. Establish Advisory Committee

  • Role is to identify materials and provide insights.
  • Includes experts in the fields of violence against women and children.

B. Research & Design

  • Research existing material and assess its appropriateness for the project.
  • Design the resource to increase caseworkers' understanding of child witness issues, allow for incorporation into existing training, and explain the caseworkers' role as mentors.

C. Pilot

  • 5 agencies to pilot and evaluate materials.
  • Further evaluation by Program Volunteers matched with child witnesses.

D. Revise & Finalize

  • Revise based on evaluation feedback from pilot.

E. Share

  • Share training with other organizations.
  • Post on web site.
  • Sustained dissemination.

Evaluation

Plan:

  • Questionnaire will be administered at the pilot.
  • Information will be used to revise the program.
  • Results of evaluation will be made available to pilot agencies.

Budget

Research: $4,000
Writing training program: $4,000
Pilot agencies: $2,500 (5 agencies @ $500)
Printing: $2,000
Translation: $3,500
Promotion and distribution of material: $500
Administration: $1,500
Project management: $2,000
Total: $20,000

Sustainability

  • The training program will be posted on BBBSC's on-line library in downloadable form, and will be accessible to all organizations that wish to use it.
  • The program may be transformed into an on-line interactive training module that is available through BBBSC’s on-line library.
  • Promotion of the material through regular announcements to other organizations (by e-mail, quarterly newsletters, other publications etc).
  • Promoting the program to other organizations through BBBSC’s 181 local member agencies.
  • Partnering with other organizations that reach a wider group of children and youth, to provide this training on a more extensive basis.

Road Map for Training Research Project (graph)

Road Map—Template (graph)

Project Level Evaluation Plan - Checklist

Element: Project Description
Look for:
  • Project objectives
  • Target group or beneficiaries
  • Activities
  • Outputs
  • Expected results (outcomes)
Tools:
  • Consider providing a logic model or project "road map"

Element: Indicators of success/impact
Look for:
  • What are the indicators of success/impact?
  • Are they measurable?
Tools:
  • Specific indicators

Element: Data collection
Look for:
  • Methods (qualitative and quantitative)
  • Data sources
  • Feasibility
  • Logistics
  • Timing/frequency of data collection
  • Roles and responsibilities
  • Protocols for collecting and monitoring
  • Appropriate methods that are sensitive to the situation and population (gender, culture, language, literacy, age, community, disability)
Tools:
  • Data collection plan and protocols
  • Ethical standards and confidentiality provisions

Element: Who is responsible for conducting the evaluation?
Look for:
  • Is the evaluator internal or third party?
  • Does the evaluator have the appropriate knowledge and skills, including cultural/diversity competence?
  • Are there any conflict of interest issues to consider?
  • How will privacy and confidentiality be addressed?
  • Is there good communication between the evaluator and the project sponsor?
Tools:
  • Agreements, contracts and protocols

Element: Partner and stakeholder involvement
Look for:
  • How will partners be involved in the evaluation?
  • How will stakeholders (e.g. funders) be involved?
Tools:
  • Agreements, terms of reference for committees

Element: Evaluation resources
Look for:
  • Are sufficient resources allocated to carry out the plan?
  • Is the evaluation cost-effective?
Tools:
  • Evaluation budget as a % of project budget
  • Actual and in-kind resources

Element: Utilization of results
Look for:
  • How will the project use the results?
  • How will DOJ use the results?
Tools:
  • Project statement of how the results will be used to improve the project
  • DOJ statement of how the results will be used to inform decision-making

Element: Does the evaluation make sense?
Look for:
  • Is the type of evaluation planned appropriate? Realistic?
  • Is the evaluation plan practical and achievable?
  • Will results be meaningful and credible?
  • Will results be timely?
Tools:
  • Your overall assessment
  • Advice of others

Element: Considerations
Look for:
  • Are there more suitable methods that would be better matched to the project?
  • Are there more cost-effective strategies?

An Overview of Information/Data Collection Methodologies

There are various types of information or data, and various collection methods. Here’s an overview of some of the most commonly used methods.

Quantitative data

  • Closed-question surveys (mail-out, e-mail, web site, telephone): You can gather information from many people, and you can count and measure the answers to produce statistics.
  • Project records/statistical reviews (client processing information; project dissemination log analysis): You can provide a quick overview of your project's activities (e.g. how many clients you served, how many pamphlets you disseminated, costs per activity).

Qualitative data

  • Project file or document reviews: You can build an understanding of the context and experiential process from the project record.
  • Literature reviews: You can assess the relevance of your work within the broader state of knowledge development in the field.
  • Policy reviews: You can situate your work within the broader state of policy development in the field.
  • Key informant interviews: You can discover the context and meaning of people's experience with the project.
  • Case studies: You can get in-depth information, or a story of what happened and what the results were.
  • Expert panels: You can acquire further knowledge and insights.
  • Focus groups: Like a group interview; you can get collective insight on a specific topic or set of questions.
  • Dialogue or learning circles: You can gather stakeholders together to share experiences and identify key learnings in a culturally appropriate way.

Guidelines for Tool Development & Examples

There are many different ways to collect project evaluation information, including the compilation of basic statistical information. This appendix briefly describes several of the tools that can be used to evaluate projects – and to determine, in particular, project impacts:

  • Workshop or Conference Evaluation
  • Interviews
  • Surveys
  • Focus Groups
  • Cluster Evaluations

Workshop or Conference Evaluation Feedback Forms

Workshops and conferences bring individuals together to share their experiences, exchange ideas, develop knowledge and acquire new skills. Participant feedback from such events can provide valuable information about the immediate impact of the event. You can also use evaluation feedback forms to get a sense of how people intend to use the knowledge or skills they acquired at the event. You would need to do further follow-up at a later point in time (such as participant interviews or a survey) to find out whether and how people have applied the knowledge and skills they acquired, and how this has affected their work.

What’s Involved?

Before the event: Once you have set your agenda, design a brief feedback form and include it in the participant package. Participants should fill out this form anonymously.

At the event: Have participants fill out the form and hand it in at the end of the event.

After the event: Compile the answers to assess what worked well, what did not work so well, and participants’ suggestions for improvements and/or next steps. Use this information in future work (e.g. future workshops or conferences, follow up steps).
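
Where the number of forms makes hand-tallying impractical, a short script can compile the answers. A minimal sketch, assuming the closed-question ratings have been keyed into a CSV file with one column per question (the file name and contents are hypothetical):

```python
import csv
from collections import Counter

# Hypothetical file layout: one row per feedback form; each column holds a
# rating such as "not met", "partially met" or "fully met".
with open("feedback_forms.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Summarize each question as percentages across all returned forms.
for question in rows[0]:
    counts = Counter(row[question] for row in rows)
    total = sum(counts.values())
    summary = ", ".join(f"{answer}: {n / total:.0%}" for answer, n in counts.items())
    print(f"{question}: {summary}")
```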

Overall Design

A participant feedback form should:

  • Be one page or less
  • Be printed on coloured paper to stand out
  • Be easy to read and complete
  • Provide space for additional comments
  • Indicate to whom to submit the form
  • Explain how you will use the feedback, and
  • Thank participants for completing the form.

Designing the Questions

  • Ask only a few questions that participants can read and answer quickly
  • Make sure the questions are clearly worded
  • Use either close-ended or open-ended questions depending on the topic (see definitions below).

Close-ended (or closed) questions provide individuals with a set of answers to choose from, such as a multiple choice list of answers, “yes” or “no” boxes to check, or a rating scale to complete.

Open-ended (or open) questions do not provide individuals with a set of answers to choose from – the individual is expected to formulate their own answer, in their own way.

Here are some examples of topics suited to closed questions:

  • Objectives Achievement: To what extent do you think the event’s objectives were met? (Not met, partially met, fully met)
  • Satisfaction: How satisfied were you with the presentations? (not at all satisfied, satisfied, very satisfied)
  • Usefulness: How useful did you find this event to the work you do? (not at all useful, somewhat useful, quite useful)
  • Creature Comfort: How satisfied were you with the facility? (not satisfied, satisfied, very satisfied). How satisfied were you with the food? (not satisfied, satisfied, very satisfied).

Here are some examples of topics that may require “open-ended” questions:

  • Intentions: How will you apply the [knowledge, skills] you acquired at this event?
  • Lessons learned: What was the most important thing… least important thing you learned?
  • Opinions: What do you think about issue/idea/suggestion X?
  • Comments: Do you have any additional comments about this event?

Interviews

Interviewing individuals who have been involved in – or impacted by – a project can provide in-depth and detailed information about their perspectives and experiences.

One-on-one interviews permit individuals to make anonymous comments and express their opinions freely.

Interview data can supplement – and permit a crosscheck of – information obtained from various sources.

Interviews can be conducted in-person or on the telephone.

What’s Involved?

Before conducting the interviews

  • Develop a list of those individuals who will be most knowledgeable. Think about who can best provide the information you need. It may be helpful to develop selection criteria to choose your key informants.
  • Decide what type of interviews you will conduct. The options include, for example, informal conversational interviews, interviews that focus on a list of key topics, or interviews that include a standardized set of (open and/or closed) questions.
  • Prepare an interviewer protocol to familiarize interviewers with the process to be used to contact, book, conduct and report on interviews. Confidentiality is a key issue to be addressed in an interviewer protocol.
  • For standardized interviews, develop an interview guide that contains all of the questions to be asked (include prompts where needed in the interviewer’s version).
  • Prepare an information package to send out to interviewees. The package should include information about the purpose of the interview, background information about the project, and a list of the topics (or the specific questions) that will be asked in the interview.
  • Contact potential interviewees to request their participation. Be clear about issues such as: recording of the interview, confidentiality, how the information will be used, how long the interview is likely to take, and the format (in-person or by telephone).

When conducting interviews

  • Follow the interviewer protocol closely.
  • Be prepared to handle situations such as cancellations, "no shows", and requests for additional information or copies of the interview notes.
  • Evaluation project managers may want to monitor the first few interviews and review the resulting interview notes to ensure quality control.

After conducting the interviews…

  • Finalize the interview notes according to the protocol.
  • Review each set of notes systematically and synthesize the answers to each of the questions (a grouping sketch follows this list).
  • Analyze the overall results of the interviews.
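
A minimal sketch of the synthesis step, assuming each interview's finalized notes have been saved as a small JSON file; the directory layout and file format are hypothetical, not part of the guide.

```python
import json
from collections import defaultdict
from pathlib import Path

# Hypothetical layout: one JSON file of finalized notes per interview, each
# mapping a question to the interviewee's (anonymized) answer.
answers_by_question = defaultdict(list)
for notes_file in Path("interview_notes").glob("*.json"):
    notes = json.loads(notes_file.read_text())
    for question, answer in notes.items():
        answers_by_question[question].append(answer)

# Reading all answers to one question side by side makes it easier to
# synthesize common themes across the interviews.
for question, answers in sorted(answers_by_question.items()):
    print(f"\n{question}")
    for answer in answers:
        print(f"  - {answer}")
```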

Overall Design

Interviews should:
  • Be carefully planned
  • Focus on key issues
  • Be as time-efficient as possible
  • Follow a logical sequence, and
  • Provide interviewees with opportunities to ask questions and to provide additional comments.

Developing Interview Questions

Interview questions should be:

  • Clearly stated
  • Brief and to the point
  • Relevant, and
  • Objective.

Here are a couple of examples of interview questions that could be asked of those involved in a newsletter project:

  • How did you (or others in your organization) use the newsletter in your work? [Open-ended]
  • To what extent was the newsletter useful in your work? (Not at all useful, somewhat useful, very useful). [Close-ended]

Surveys

A survey (or questionnaire) is a set of questions that is given to a group of individuals to complete. A survey can be used in a variety of different settings to collect information about the same set of questions from many different people. Surveys may consist of a few brief questions – or they may be more detailed and lengthy.

Although surveys may include either close-ended or open-ended questions (see definitions above), they usually consist primarily of close-ended questions, because these take less time to complete, and the results are easier to analyze statistically.

A survey can be administered in a number of different ways: the questions can be printed and sent (or given) out; an electronic survey form can be emailed out or posted on a web site; or individuals can be asked to respond to a telephone survey.

What’s Involved?

Before conducting the survey….

  • Decide how you will collect the completed surveys and record and analyze the answers.
  • Design the survey (see below)
  • Pilot test the survey with a small group and obtain feedback about the clarity of the questions, the time needed to complete the survey, etc.
  • Refine the survey based on the feedback from the pilot test.

While conducting the survey…

  • Collect and record/keep track of all completed surveys.

After conducting the survey…

  • Organize the answers to completed surveys and input the data
  • Conduct a statistical analysis (this will require software and some technical expertise; a minimal sketch follows this list)
  • Report on the findings.
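
A minimal sketch of such an analysis, assuming the answers have been entered into a CSV file and the pandas library is available; the file name and column names are hypothetical.

```python
import pandas as pd

# Hypothetical file layout: one row per completed survey, one column per question.
df = pd.read_csv("survey_responses.csv")

# Frequency and percentage breakdown for each close-ended question.
for question in ("plan_reflects_priorities", "other_priorities"):
    counts = df[question].value_counts()
    percent = df[question].value_counts(normalize=True).mul(100).round(1)
    print(pd.DataFrame({"count": counts, "percent": percent}))
```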

Overall Design

  • Use a clear and easy-to-read format (large enough font, enough space for answers, etc.)
  • Provide clear instructions about how to complete the questions
  • Use as few questions as possible
  • Ensure there is a logical flow to the questions
  • If appropriate, develop a coding system to make it easier to input and analyze the data (this will require some technical expertise; a sketch of a simple codebook follows this list).
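
A sketch of what a simple coding system might look like, using the expert-consultation survey questions shown later in this section; the question identifiers and numeric codes are hypothetical.

```python
# A hypothetical codebook: each close-ended answer is given a short numeric
# code so responses can be keyed in quickly and analyzed consistently.
CODEBOOK = {
    "q1_plan_reflects_priorities": {
        1: "Does not reflect the priority issues",
        2: "Reflects some of the priority issues",
        3: "Reflects most of the priority issues",
        4: "Reflects all of the priority issues",
    },
    "q2_other_priorities": {0: "No", 1: "Yes", 9: "No answer"},
}

def decode(question: str, code: int) -> str:
    """Translate a keyed-in numeric code back into its answer text."""
    return CODEBOOK.get(question, {}).get(code, "Invalid code")

print(decode("q1_plan_reflects_priorities", 3))
```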

Developing the Questions

  • Keep each question as brief as possible.
  • Ensure each question focuses on only one topic or issue.
  • Use plain language.
  • Avoid biased questions.
  • Provide an “other” category for answers that do not fit elsewhere.

Here are a couple of examples of survey questions (open and closed) that could be asked of those who participated in an expert consultation to develop a research plan:

  • How well does the research plan reflect the priority issues in the field? (Does not reflect the priority issues; reflects some of the priority issues; reflects most of the priority issues; reflects all of the priority issues) (Close-ended).
  • Are there other priority issues that should be reflected in the research plan? (Open-ended).

Focus Groups

A focus group is a type of “group interview” in which a small number of people are asked to provide their perspectives on a specific topic. The group’s facilitator encourages all participants to express their views, but the group is not expected to reach consensus. For evaluators, focus groups can provide diverse perspectives and insights on an issue. The opportunity for group interaction and discussion may stimulate participants to make observations and comments that they otherwise may not have offered.

What’s Involved?

  • Determine who should participate in the focus group – usually the participants will be a group whose shared characteristics or experiences allow them to provide relevant insights and feedback on a specific issue or topic.
  • Invite participants to attend and provide them with sufficient information, e.g. an information package that describes the purpose of the group, the process that will be used, and your expectations. It is important to decide whether or not the focus group participants will be given incentives or honoraria for attending.
  • Focus on logistics, including arranging for a comfortable space, refreshments if needed, etc.
  • Find a facilitator with the right blend of expertise, experience, and skills.
  • Determine whether or not the discussion will be recorded via audio/videotape or note-taking (or both) and advise participants about confidentiality.

Overall Design

  • Determine what the group will focus on and develop the specific questions to be asked.
  • Consider timing: when will it be easiest for participants to attend (during the day? in the evening?)
  • Find the appropriate (accessible, comfortable) setting.
  • Restrict the number of participants (focus groups usually include 6-8 individuals).
  • Limit the duration of the discussion to 1-2 hours.

Developing Questions

  • The questions that will be asked of the group should be determined beforehand.
  • Ask only a limited number of questions (to avoid rushing participants).
  • Avoid controversial or very personal issues, as participants may not be comfortable discussing these in a group.

Here are some questions that might be asked of a small group of practitioners who have been involved in implementing an amended (or new) piece of legislation on specific offences against children:

  • How has the amended/new legislation affected your capacity to address offences against children [the specific ones addressed by the legislation]?
  • How has this amended/new legislation strengthened or weakened the criminal justice system’s response to the victimization of children?
  • How has this amended/new legislation contributed to/hampered the coordination of the criminal justice system’s response to the victimization of children?

Cluster Evaluations

Cluster evaluations look at how well a collection of similar projects meets a particular objective of change. Cluster evaluations are a potential way for the Family Violence Initiative to look across projects to identify common threads, themes and impacts, and to identify overall lessons learned.

Some potential goals of a cluster evaluation are to:

  • Identify innovative, good or promising practices.
  • Assess the cluster’s progress towards the stated FVI goals and objectives.
  • Enable implementation adjustments throughout the course of the FVI.
  • Provide evaluation information to inform policy development.

Cluster evaluations are not a substitute for project-level evaluations. They are typically conducted by a third-party cluster evaluator, who may rely in part on data collected by project-level evaluators. See the W.K. Kellogg Foundation's Logic Model Development Guide and Evaluation Handbook, p. 17: www.wkkf.org/Programming/Overview.aspx?CIA=281.

What’s Involved?

  • Determine which projects have commonalities in project design.
  • Identify what you expect to learn from a cluster evaluation.
  • Invite project participation.
  • Develop evaluation questions based on the expected impacts and outcomes of the FVI as a whole.
  • Establish – and reach agreement with stakeholders – on the terms of reference for the cluster evaluation.
  • Select a cluster evaluator to carry out the evaluation.

Overall Design

  • Determine who will conduct the cluster evaluation, how information will be collected, and by whom (a simple roll-up sketch follows this list).
  • Consider the confidentiality provisions (e.g. will projects be identified in the cluster evaluation?).
  • Consider individual project time frames and coordinate with the cluster evaluation time frame.
  • Consider bringing project recipients and evaluators together periodically, to share insights and learn from each other.
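
A minimal sketch of what rolling results up across a cluster might look like; the projects, the indicator and the themes below are entirely hypothetical.

```python
from collections import Counter

# Hypothetical per-project results supplied by project-level evaluators.
project_results = {
    "Project A": {"participants_reached": 120, "themes": ["awareness", "training"]},
    "Project B": {"participants_reached": 300, "themes": ["awareness"]},
    "Project C": {"participants_reached": 85, "themes": ["training", "policy"]},
}

# Roll a shared indicator up across the cluster.
total_reached = sum(r["participants_reached"] for r in project_results.values())
print(f"Cluster total participants reached: {total_reached}")

# Count recurring themes to surface the cluster's common threads.
theme_counts = Counter(t for r in project_results.values() for t in r["themes"])
print("Common threads:", theme_counts.most_common())
```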

Cluster evaluation is a good method for obtaining information on projects that are cumulatively designed to bring about policy or systemic change. Such evaluations can lead to important "lessons learned", which makes cluster evaluation particularly attractive for family violence issue-oriented projects.

