In many cases evaluation programs are offered as a specific focus within an existing graduate degree, at either the master's or the doctoral level. Several of these involve the study of quantitative methods, measurement or statistics. In some cases (e.g., Claremont Graduate University, Western Michigan University, University of Minnesota) degrees specifically in evaluation are offered. Ten of the seventeen programs that we profiled are offered through Faculties or Colleges of Education. Unlike in Canada, education in the US is a federal matter, and standard evaluation texts observe that much of the early work in evaluation focused on the evaluation of federal nation-wide educational programs. Other locations for programs include several schools or departments of psychology and, in the case of the American University, a School of Public Affairs; this was the only program located in such a school. The Ph.D. program at Western Michigan University is, in fact, an interdisciplinary program involving four colleges at the university (A. Gullickson, personal communication, April 11, 2006; Stufflebeam, 2001). According to Stufflebeam (2001), this program took its lead from the Stanford Evaluation Consortium headed by L. Cronbach: the ideal evaluation degree program would include (1) disciplinary preparation in the social sciences, (2) participation in interdisciplinary seminars that examine evaluations, (3) apprenticeship to practicing evaluators (preparation of critiques, assistance in drafting proposals, interpretation of data, communication of findings), and (4) an internship at an agency where policy is formulated. In programs that were clearly identifiable as evaluation training programs, we did not find explicit evidence that the curriculum is based on an identified set of evaluator competencies, a route toward quality assurance proposed by Stevahn et al. (2005b).
In addition to degree programs, several universities (7, or 41%) offer graduate certificate programs in evaluation, usually involving about 15 credits of study (5 courses). In the case of the University of Minnesota, the certificate is available only to those holding a graduate degree, but this is not the case elsewhere. Usually the certificate is offered at the master's level, and in most cases credits from the certificate can subsequently be applied toward a graduate degree. Melbourne offers the certificate program in distance mode; this program was recently offered under contract to the Government of the Northwest Territories in a partial distance, partial onsite delivery mode (J. Owen, personal communication, April 7, 2006). The number of full-time regular professors teaching in the evaluation program was often difficult to determine. In many cases, the only information available was the number of faculty who are full-time members of the department or academic unit. Nevertheless, programs that did specify faculty teaching in the evaluation program typically identified at least 5 members. Finally, there was a mix of programs offering practica and internships and those offering course and thesis work only. In the case of the former, it was usually explicitly stated that the development of effective practitioners of evaluation or policy analysis was part of the mission of the academic unit. Professional development activities were sometimes offered alongside formal certificate or degree programs (e.g., Claremont Graduate University, University of Minnesota). In some cases the mission is explicitly to develop potential researchers and scholars in evaluation; such programs usually offered Ph.D. degrees.
3.1.2 Other training options
In addition to these training and professional development opportunities, there are two training options that potentially contribute to quality assurance in evaluation.
The first is government-directed training and the second is enrolment in a single university-based course on evaluation. Both options might involve certification of achievement rather than of participation; in other words, unlike many professional development institutes and workshops, successful demonstration of knowledge and skills would be required to pass the course(s). We now turn to a discussion of such training options for evaluators. In 1998 the General Accounting Office (GAO) of the US federal government established its own Training Institute to separate training and education from career counseling and personnel and organizational development support. According to Kingsbury and Hedrick (1994), the Training Institute was deemed critical to meeting the information needs of the US Congress. It comprised 17 classrooms in a Washington, DC location plus training facilities in all regional offices. In order to continue to be deemed qualified to do the GAO's audit and evaluation work, every evaluator must obtain a minimum of 80 hours of training every two years. The Institute offered six focal areas of study for evaluator training: mission, policies and individual responsibilities; assignment planning and evaluation; communication; computer use; workplace relations and management; and issue area training. Self-paced training was also available in the form of internet courses. In their critique of the program, Kingsbury and Hedrick (1994) concluded that matching training to job relevance is critical, that involving both line managers and staff in training increases its credibility and impact, that training needs to deliver consistent messages at all levels, and that there is an ongoing need to evaluate the training programs in order to assure quality. As mentioned above, the Training Institute has now evolved into the Center for Learning, but it continues to be governed by government audit standards.
Even with its emphasis on internal training, the US GAO has recently and consciously altered its hiring practices to focus on highly trained personnel, typically at the Ph.D. level. Some such individuals may have been trained in specialized evaluation programs, but this is not necessarily the case (N. Kingsbury, personal communication, April 7, 2006). Even where government-level evaluation training is available, the single university-level evaluation course represents another option adopted by many practicing evaluators. Several of the university evaluation programs identified in the Altschuld et al. (1994) survey were not included in our own review in Table 2 because our survey of their websites revealed that only one or two evaluation courses may have been offered, rather than a set of courses contributing to a program. Yet such courses provide aspiring and practicing evaluators with worthwhile options for intensive study in evaluation, normally about 40 hours or so (3 credits) at the graduate level. Candidates might choose to do the single evaluation course as part of a related degree program such as education or psychology. Morris (1994) observed that this scenario is probable, since only 11 percent of AEA members indicated training in formal evaluation degree and certificate programs in the 1992 membership directory. Another option would be to enroll in such a course on a special student basis. Candidates must be eligible for study at the graduate level in order to enroll in single courses of this sort (e.g., upper second class standing in a relevant bachelor's degree). Morris (1994) provided an in-depth examination of the single evaluation course phenomenon, giving explicit consideration to the role of such courses and what they might include. He sees the single course as a valued part of professional training, not as a cut-and-dried competency development exercise.
Yet its major contribution would be to the production of educated consumers as opposed to competent practitioners. But that, says Morris, can be a good thing, particularly in light of the need to educate non-evaluators about the power and virtue of the field. "Although a little knowledge can be a dangerous thing, program evaluation is a field in which total ignorance is much worse. Evaluation is most likely to achieve its dual goals of demonstrating scientific credibility and bettering the human condition in an environment where it is not just the professional evaluation community that has access to relevant knowledge and skills" (1994, p. 57). Clearly, however, something deeper and more intensive is required in order to develop evaluator competencies and to inform practice.
3.1.3 Summary
Name | Location | Mission | Credentials Offered | Human Resources | Practicum / Internships |
Carleton University, School of Public Management | Ottawa, ON | Training of practitioners, researchers / scholars | M.A., Ph.D., Diploma in public administration; evaluation courses also available in international development, social work, psychology | 21 faculty teaching in the school; 4 professors emeriti; 11 adjunct professors | No practical component in evaluation mentioned. Practical component in public service management available in master's program. |
Dalhousie University, School of Public Administration | Halifax, NS | Training of practitioners | M.P.A.; M.P.A. with various specializations (management, law, library sciences) | 10 faculty members associated with the School of Public Administration; 8 adjunct faculty | No practicum mentioned. |
École nationale d'administration publique | Montréal, Québec and Gatineau, PQ | Training of practitioners, researchers / scholars | Ph.D.; M.P.A. with courses in evaluation; Graduate Certificate in program evaluation (15 cr.) | 4 faculty members in program evaluation; 1 faculty member in performance measurement | No mention of practicum requirement. Situated within public administration. |
Georgian College, Research Analyst Program | Barrie, ON | Training of practitioners | Diploma (3 semesters) | 5 faculty; 3 associated faculty; 23 advisory committee members | Yes, internship placement for third semester. |
University of Guelph, Department of Psychology | Guelph, ON | Training of practitioners, researchers / scholars | Ph.D. in Applied Social Psychology; program evaluation-related courses offered | 8 faculty members | Placement opportunities and consulting work with all levels of government |
Université Laval, École de psychologie | Québec, PQ | Training of practitioners, researchers / scholars | Ph.D. in research and intervention (community orientation); courses in program evaluation, psychosocial assessment of settings, and consultation and management in professional practice | Number of professors associated with the program not specified | Practica focusing on evaluation, consultation, management, or supervision; practicum sites not specified |
University of Ottawa, Faculties of Education and Social Sciences | Ottawa, ON | Training of practitioners, researchers / scholars | Graduate Certificate in Evaluation (pending); M.A., Ph.D. in Educational Measurement and Evaluation; Clinical Psychology | 6 faculty in education and psychology teach evaluation | Practicum training and internship placement in evaluation offered through the School of Psychology at the Centre for Research on Community Services. Placements generally in community/social services and governmental contexts. |
Université du Québec à Montréal, Faculté d'éducation et Département de psychologie | Montréal, PQ | Training of practitioners | Short second-cycle graduate program in evaluation of education and training programs (15 cr.); Psy.D. in psychology (professional profile); Ph.D. in psychology (scientific-professional profile); course in intervention research methods (6 credits) | Not indicated | Integration workshop in evaluation (1 cr.); evaluation practica (15 cr.); psychological and evaluation processes, systemic/social approach (3 cr.) |
University of Saskatchewan, Applied Social Psychology Program | Saskatoon, SK | Training of practitioners, researchers / scholars | Ph.D., M.A. in Applied Social Psychology, including courses in evaluation; evaluation courses also offered in the Faculty of Education | 5 faculty members associated with the Applied Social Psychology Program | Practicum in evaluation; internship in evaluation |
University of Victoria, School of Public Administration | Victoria, BC | Training of practitioners, researchers / scholars | M.P.A.; Ph.D. or diploma programs; graduate certificate program in performance management (12 credits), including evaluation and performance measurement as a core course | 5 faculty members teach in the School | Evaluation practicum and field experience courses can be part of the Ph.D. degree. |
University of Waterloo, Health Studies and Gerontology | Waterloo, ON | Training of practitioners, researchers / scholars | Ph.D., M.Sc.; evaluation courses can be taken as part of the degree programs | 33 regular faculty, 17 adjunct members of the Health Studies and Gerontology Program | No practicum mentioned |
University of Windsor, Department of Psychology | Windsor, ON | Training of practitioners, researchers / scholars | M.A. and Ph.D. programs in Applied Social Psychology; program evaluation, research methods and measurement courses offered | 9 faculty members | Practica and internships in government agencies, community organizations, schools / colleges; provincial research funding evident |
Wilfrid Laurier University, Department of Psychology | Waterloo, ON | Training of practitioners, researchers / scholars | M.A. and Ph.D. programs in Community Psychology; courses on research in community settings at the M.A. level and on program evaluation and community research and action at the Ph.D. level | Number of faculty members dedicated to the community psychology area not specified | Practica in community settings, with students receiving training in consultation, program development, program planning, and program evaluation. |
While there are no degree programs offered in evaluation in Canada, we did locate three graduate certificate programs in evaluation. Two of these are located at French-language universities, one in public administration (École nationale d'administration publique), the other devoted specifically to education (Université du Québec à Montréal). A third program is structured as a bilingual joint program of the Faculty of Social Sciences and the Faculty of Education and, pending final approval, will be offered starting fall 2006 at the University of Ottawa. In each case, the graduate certificate requirement is 15 credits (5 courses) completed on a full- or part-time basis, which is very much consistent with what we observed in programs beyond Canadian borders.
Evaluation courses, and the potential for specialization in evaluation under an associated discipline, are offered at several universities in various disciplines, including Applied Social Psychology or Community Psychology (e.g., Université Laval, University of Guelph, University of Saskatchewan, University of Windsor, Wilfrid Laurier University), Education (University of Ottawa, Université du Québec à Montréal), Health Studies and Gerontology (University of Waterloo) and Public Administration (Carleton University, Dalhousie University, École nationale d'administration publique, University of Victoria).
As was the case with our international survey of programs, it is difficult to determine how many faculty members are teaching evaluation courses in each program, but certificate programs appear to be staffed by a minimum of five faculty members.
Practical experiences in evaluation are mostly associated with evaluation certificate and diploma programs, but also with some universities currently not offering a certificate (e.g., Université Laval, University of Saskatchewan, University of Victoria, University of Windsor, Wilfrid Laurier University). Practical experiences range from internship opportunities, such as those offered by the University of Saskatchewan, to practicum courses in which candidates participate in practical evaluation work under the supervision of university instructors (e.g., University of Ottawa, University of Victoria).
3.2.2 Other training options
As would be the case in the US and elsewhere, candidates may enrol in graduate-level university courses in evaluation on a special student basis. Such persons are required to be eligible for graduate study at the various universities (e.g., a bachelor's degree with second class standing), and typically they would be permitted to complete only two such courses on a special student basis. The advantages of taking such courses are that they are recorded as achievements on the respective university transcript, they confirm candidate eligibility for graduate study, and in many instances they may subsequently be applied toward university degree or certificate programs (subject to such conditions as program requirements and recentness of completion). The list available at the CES website (www.evaluationcanada.ca) shows the breadth of universities offering such courses across the country.
Continuing with post-secondary options in Canada, the Research Analyst Program at Georgian College (see Table 3) is not a university-based program but candidates are expected to have at least three years of post-secondary education prior to admission. The program consists of 12 primarily skill-building courses in various research-oriented topics and an internship is required for the third and final semester. This program represents an alternative to university-level study that results in a diploma. A variety of other training options are available in Canada for those interested in developing knowledge and skill in evaluation, but the majority of these might best be thought of as professional development activities that result in a certificate of participation rather than credentials of achievement such as credit courses or certificates that are recorded on university transcripts.
In Canada, the Canada School of Public Service provides internal training nation-wide for Canadian federal employees, and periodically offers courses on evaluation. Such courses are considered to result in credentials of achievement. Also offered on a regular basis by the CEE is a series of topical workshops available to public servants with an interest in evaluation. These opportunities are similar to the Essential Skills Series (ESS) and intermediate-level training opportunities (i.e., programs in survey design and logic models) offered by CES, inasmuch as they result in a certificate of participation rather than a credential of achievement. The ESS certificate is awarded after participation in four day-long modules (CES, n.d.-a; see also Nagao, Kuji-Shikatani & Love, 2005). The CES also sponsors workshops and professional development activities at the chapter level as well as annually at the CES national conference. Finally, universities occasionally offer such opportunities as summer institutes in evaluation and applied research methods, although these do not appear to be offered on a regular basis such as we have observed elsewhere (Claremont Graduate School; University of Minnesota).
Despite the existence of these opportunities, in contrast to the US, for example, the Canadian federal government does not currently require that public servants employed in evaluation and policy analysis related jobs hold attestation that they have undertaken such training (see discussion of the Training Institute above).
3.2.3 Summary
Evaluation training opportunities in Canada are widely available, but opportunities for advanced-level university training appear to be quite limited. There are currently no degree programs in program evaluation in the country, and we located only three graduate certificate programs (one pending approval) and one diploma program at a community college. While a wide array of universities offer graduate study in evaluation, this is most often limited to course-level experiences. Such courses may be integrated into degree programs (concurrently or subsequently), and it is likely that candidates could specialize in evaluation in degree programs in related disciplines such as education or applied social psychology. It is encouraging to note, however, that university courses in evaluation and related topics exist on such a broad basis and that several universities offer more than one evaluation course within single faculties or departments. The potential for the development of certificate programs, for example, would be increased in circumstances where faculties or departments could build on existing courses rather than developing programs from scratch. Finally, a wide variety of other training and professional development opportunities in evaluation exist both inside government and out, but at present there are no regulations requiring candidates to have undergone such training in order to hold evaluation-related posts within the federal government. We now examine an M.Sc. program in the UK that represents a partnership between the national government and a university. This program is unique and bears quite directly on considerations of government's role in fostering evaluation quality assurance.
The M.Sc. in Policy Analysis and Evaluation developed in the UK and co-sponsored by the Government Social Research Unit, Cabinet Office, and the Institute of Education, University of London (2005) provides a model of institutional collaboration in the interests of evaluation quality assurance in government. The program, which began in October 2005, is in essence a modular degree to be completed part-time over two years, and it offers a unique opportunity to government social researchers looking to enhance their professional skills and career prospects (http://www.gsr.gov.uk/professional_development/msc/index.asp). The aims of the program are as follows.
The program is designed to provide students with an understanding of the major quantitative research skills relevant to designing, analyzing and evaluating government policy. Participants are expected to gain a high level of critical insight into a range of research methods, to apply their understanding to policy and research questions, and to communicate their understanding clearly both to academic specialists in research and to non-specialists. In so doing, students would develop their existing skills in critiquing and applying research methods. The program was actually the brainchild of the Cabinet Office's Government Social Research Unit (GSRU) and built on a series of courses already developed for government researchers. The GSRU was interested in raising the quality of government social research and had some evidence that university programs did not provide people with the research skills needed in government (R. Taylor, personal communication, April 7, 2006).
The two-year modular program structure, laid out in Table 4, consists of 5 compulsory and 2 optional modular credits. The program is taken over a total of 24-26 days within a two-year period, during regular working hours within the public service. The program is delivered collaboratively by the Cabinet Office and the Institute of Education, but the Institute assumes responsibility for the assessment of the program, drawing up and marking assignments, establishing a board of examiners and appointing an external examiner.
Table 4: Modular Program Structure of the MSc in Policy Analysis and Program Evaluation (adapted from GSRU, 2005)
Modular elements | Module provider (a) | Credit equivalents |
Compulsory modules | ||
Research & research management | CO | 1.5 |
Statistical analysis | CO | 3.0 |
Experimental & quasi experimental design | CO | 1.5 |
Research synthesis for policy and practice | IoE/CO | 3.0 |
Report (10,000 words) | IoE | 3.0 |
Optional modules (2 required) | ||
Sampling design & data collection | CO | 3.0 |
Qualitative research & analysis | IoE | 3.0 |
Economic & econometric analysis | IoE | 3.0 |
Longitudinal research & analysis | IoE | 3.0 |
(a) CO = Cabinet Office; IoE = Institute of Education, University of London |
Compulsory modules focus quite heavily on quantitative methodology for social research including methods, statistics and design. There is also a module on evidence-based policy, or synthesis of research for policy and practice. Optional modules can be taken to extend methods, analysis and design capabilities including courses on qualitative methods and econometrics.
The program teaching team comprises eight Institute of Education faculty members, four Cabinet Office staff members and one consultant. Students of the program become members of the Institute of Education’s Doctoral School and the Bedford Group of ‘Lifecourse and Statistical Studies’, another one of the Institute’s Schools. The Bedford Group provides the teaching for the Institute’s Doctoral School courses in statistics, multivariate analyses and survey methods. It is also responsible for course leadership for a MA and MSc in the Economics of Education and a new module on Quantitative Evaluation Methodology. In 2004/05 there were approximately 30 doctoral students registered in the Bedford Group.
The collaborative MSc in policy analysis and evaluation represents, to our knowledge, the first of its kind: a degree program jointly offered by a university and a governmental organization, exclusively for members of the governmental organization. The program meets academic standards for the degree by assigning responsibility for marking and candidate examinations to the Institute of Education, which is part of a chartered degree-granting institution. While the academic standards of the degree speak to program quality, the heavy involvement of government in the design and delivery of compulsory and optional modules ensures the relevance of the program to the policy analysis and evaluation exigencies of government. The program represents a clearly hands-on role for government in enhancing quality assurance in evaluation.
Table 5: Centres of Excellence in Evaluation and Related Fields
Name | Location | Mission | Practice / Consultancy | Human Resources | Sustainability / Link to Government |
Claremont Graduate University, Institute of Organizational and Program Evaluation Research, School of Behavioral and Organizational Studies | USA: Claremont, CA | Research, Practice, Training, Teaching | Program design, development and evaluation; consultation in design and proposal preparation; evaluation in HR and organizations; needs assessments; organizational and management consulting | 6 faculty; 1 staff; students on a project basis | Fees charged to students and clients for workshops and degree/certificate programs; service provider relationship; sponsorship unclear: mention of 'generous donations' for fellowships. |
George Washington University, Center for Equity and Excellence in Education, Graduate School of Education and Human Development | USA: Arlington, VA | Training, Research, Practice | Conducts national and local education policy and applied research; designs and conducts program evaluation for states, districts, and schools; analyzes policy. | 2 administrators (director, assistant director); 14 research scientists and associates; 3 staff | Funding sponsorship unclear; service provider relationship with state education agencies, local education agencies, and various offices of the U.S. Office of Education |
Harvard University, Harvard Family Research Project (HFRP), Graduate School of Education | USA: Cambridge, MA | Research, Teaching, Practice | Evaluation of varied initiatives for foundations, non-profit organizations and public agencies; family involvement in education; dissemination of research and theory on evaluation | 16 faculty and staff members; graduate and undergraduate student assistants | Funding sources include private foundations and corporations, public agencies |
Indiana University, Center for Evaluation and Education Policy, School of Education | USA: Bloomington, IN | Research, Practice, Training | Evaluation literacy; education policy research and technical assistance; health, human services, and community development; math, science, and technology | 5 management and academic staff; 5 faculty associates; 6 research staff | Service provider relationship with state, regional and national governmental agencies and institutions, educational institutions and community organizations |
Massey University, Centre for Public Policy Evaluation, College of Business | NEW ZEALAND: Palmerston North | Research, Practice | Evaluation in law and economics, health, economics and education, and family | Unclear | Unclear; likely sponsorship from government. |
RMIT University, Collaborative Institute for Research, Consulting and Learning in Evaluation | AUSTRALIA: Melbourne | (Teaching), Training, Practice | Program evaluation in education, community development, labour policy, health; implementation and outcome evaluation; performance measurement | 3 faculty members; 4 research staff; staff; postgraduate students | Fees to students for short courses and certificates; service provider relationship with Australian state and federal public agencies and the New Zealand federal government |
University of Aberdeen, Health Economics Research Unit, College of Life Sciences and Medicine | UK | Research, Practice, Teaching, Training | Economic evaluation; behaviour, performance and organisation of care; evaluation of health improvement; valuation and implementation programme | 22 researchers including faculty members and research fellows; support staff; Ph.D. students | Sponsorship: Chief Scientist Office of the Scottish Executive Health Department; competitive research grants; service provider relationships: public agencies. |
University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing, Graduate School of Education and Information Studies | USA | Research, Practice, (Training) | Conducts major program evaluations; research-based assessments; technology as an assessment tool; aid to schools and districts in responding to the many accountability demands | 4 faculty members | Partnership/consortium with 4 American universities, 1 UK university and the Educational Testing Service. |
University of East Anglia, Centre for Applied Research in Education, School of Education and Professional Development | UK | Teaching, Research, Practice | Applied research, including action research; programme and policy evaluation; consultancy; methodological development; research training; research degrees | 16 faculty members and researchers; 8 visiting fellows; support staff; students | Sponsorship from the European Commission, local and central governments, foreign national and state/provincial governments; service provider relationships unclear. |
University of Illinois at Urbana-Champaign, Center for Instructional Research and Curriculum Evaluation, College of Education; Department of Psychology | USA | Research, Teaching, Practice | Evaluation of programs in schools and education-related social services; technical and philosophical review of evaluation projects; examination of questions of validity, utility of findings and ethical issues. | Unclear: faculty, associated faculty and students | Service provider relationship with schools and communities, state and federal programs, professional associations, and others. |
University of Iowa, Iowa Center for Evaluation Research, College of Public Health | USA | Practice, Training, Teaching, Research | Services in the design and conduct of evaluation procedures in ongoing university and state public health projects and programs | 4 faculty members; 2 graduate students | Funding support from federal and state agencies, and private foundations; research grants |
University of Melbourne, Centre for Programme Evaluation, Faculty of Education | AUSTRALIA | Practice, Research, Teaching, Training | Commissioned evaluations; survey research in education, health, welfare, training; developmental activities and workshops | 6 faculty/research fellows; 5 research associates; administrative staff | Revenue generated through contract work, fees for training activities and services; service provider relationship with all levels of government, scope ranging from local to national |
University of Minnesota, Center for Applied Research & Educational Improvement; College of Education & Human Devel |
USA Minneapolis, MN |
Research Practice Teaching Training |
Independent evaluation and policy research of school district or community-based programs, state agency-funded programs, and other projects; Collaborative research projects in schools | 10 principal investigators and research fellows; research assistants; staff |
Service provider relationship: state and federal agencies and foundations Fees for training services |
University of Nebraska, Center for At-Risk Children’s Services; Spec Ed/Communication Disorders Department |
USA |
Research Practice Teaching Training |
Program evaluation, including survey research, needs assessment, data management, proposal writing services | 5 Faculty members, administrative personnel, students, staff including data coordinators | Service provider relationship: federal and state levels of government; community and school-based agencies. |
University of New Mexico, Health Evaluation and Research Office; Dept of Family and Community Medicine |
USA Albuquerque, NM |
Research Practice Training |
Program evaluation, continuous improvement, research design, strategic planning, research methods analysis | Unclear: Director, associate director, staff, students |
Service provider relationship to: clinicians, coalitions, community health educators, federal and state agencies, foundations, not-for-profit agencies, policymakers, public health program developers, and researchers. |
University of North Carolina, Chapel Hill: Evaluation, Assessment & Policy Connections; School of Education |
USA Chapel Hill, NC |
Practice Training Teaching |
Program evaluation and technical assistance and development services in childcare, higher education; school-university partnerships; substance abuse prevention; community planning | 6 Faculty members; 5 graduate students |
Sponsorship: Federal, state and local agencies, and private foundations. Service provider relationship: some state and local government |
University of Technology Sydney, Centre for Health Econ. Research & Evaluation; Faculties of Business and Nursing |
AUSTRALIA Sydney |
Research Practice Teaching Training |
Economic evaluation, technology assessment, program evaluation in health, complex interventions; quality of life assessment; policy analysis; economic forecasting | 5 academic staff, 10 research associates, 3 post-doc fellows, 6 research officers, 6 administrative staff |
Sponsorship: State government, public health agencies. Unclear if service provider relationship. Fees charged to students for courses and workshops. |
University of Wisconsin Extension: Program Development and Evaluation, Cooperative Extension |
USA Madison, WI |
Practice Training |
Training and technical assistance to plan, implement and evaluate high quality extension educational programs. | 5 Faculty members, associated staff | Sponsorship: State government funding. Service provider relationship with state agencies. |
University of York, Centre for Health Economics; Dept of Economics and Related Studies and the Dept of Health Sciences |
UK Heslington, England |
Research Practice Teaching Training |
Health economic policy analysis, evaluation and health technology assessment: primary care; addiction research; resource allocation; outcomes research; econometrics |
40 research staff including Faculty, support staff, students |
Service provider/client relationship with central and local public agencies and European Union, among others. Sponsorship: unclear. Fees for training |
Vanderbilt University, Center for Evaluation Research and Methodology, Vanderbilt Institute for Public Policy Studies |
USA Nashville, TN |
Research Practice |
Meta-analytic techniques for policy research; evaluation of programs in juvenile justice; school readiness; dissemination to policy makers and practitioners | 2 Admin: director, research coordinator; 7 research associates and analysts; graduate student assistants |
Sponsorship: Federal and state research grants; private foundation grant. No mention of fee-for-service provision relationships. |
Vanderbilt University, Center for Evaluation and Program Improvement, Peabody College (ed., human development) |
USA Nashville, TN |
Research Practice |
Program evaluation and program improvement in health and education, child, adolescent and family mental health services; contextualized feedback intervention theory | 3 Faculty members, 10 researchers, research assistants, postdoctoral fellows, graduate student assistants |
Service provider/client relationship with federal and state agencies. Sponsorship: federal and state agencies, private foundations, and private corporations |
Western Michigan University, The Evaluation Center: Vice President of Res. |
USA Kalamazoo, MI |
Research Practice Training Teaching |
Program evaluation and community development in higher ed; schools; personnel; science education; social/youth standard setting; state/regional educational services; testing |
4 Faculty members, 6 researchers; consultants, graduate students, and other Faculty members as associates |
Sponsorship from a wide variety of national and regional associations and organizations. Service provider relationship: some government, most often community sector |
a Teaching implies link to university degree programs; Training implies professional development services to clients; Research implies creation of academic knowledge which may or may not be directly related to evaluation; Practice implies delivery of evaluation services including consultation and project management. |
The implementation of the model was not without its challenges, however, according to its Director, R. Taylor (personal communication, April 7, 2006). The GSRU ran a competition among a number of leading UK universities, but few were interested in becoming an academic partner under the terms proposed; they wanted to run their own degree programs and could not understand the government’s interest in partnership. Once the partner was identified, the major challenge was the degree accreditation system, which was highly bureaucratic and meant that approval by the University took time. The most significant obstacle was the establishment of a viable business model.
The GSRU paid a start-up fee to the University partner and now charges a per-student fee that covers all costs. Even non-MSc students participate in the courses at a commercial rate. The first cohort, in 2005, consisted of 18 students. To date, feedback has been very positive: students are quite satisfied with the program, and Government departments are of the view that the degree offers good value for money and is helping raise skill levels. The program will undergo formal evaluation in the near term (R. Taylor, personal communication, April 7, 2006). We now continue our examination of government-university relationships by examining the concept of a ‘centre of excellence in evaluation’ and how such centres have been and may be involved in ensuring quality in government-level evaluation.
Unlike training and educational programs in evaluation, very little appears in the literature about the nature, roles and consequences of centres of excellence in evaluation. We sought to survey extant university centres both outside of Canada (in English-speaking countries) and within its borders. It should be noted at the outset that such units are most often called research centres, or sometimes research groups or institutes. We sampled primarily through internet search engines, but also through bibliographic follow-up and telephone and email consultations, and endeavoured to describe centres in terms of structural arrangements, functions and mission, sustainability, links to training and degree programs, and relationships with sponsors and clients (e.g., foundations, government). As with training programs, our intention was to be comprehensive, but we acknowledge that many centres that do extensive evaluative work are difficult to locate by virtue of the term ‘evaluation’ not appearing in their official name or mission statement. Nevertheless, in the sections to follow we attempt not only to describe extant centres but to develop a sense of their potential relationships with government, either as service providers (contractors) or as recipients of government support or sponsorship. We begin with an examination of centres located outside of Canada and conclude with a look at domestic centres of excellence with significant interest in evaluation.
4.1.1 Sample characteristics
In the end we located 21 centres of excellence with what we judged to be significant interests in evaluation. These are summarized in Table 5, and further details on each are located in Appendix B[3]. The majority (14, or 66%) are located in the US, with 3 each in the UK and Australia and 1 in New Zealand. Twelve of the centres (57%) are located in faculties of education and human development (e.g., Harvard University, University of East Anglia, University of Melbourne), while 3 are located in Faculties of Medicine or Schools of Public Health (University of Aberdeen, University of Iowa, University of New Mexico), and 1 each in organizational behaviour/psychology (Claremont Graduate University), business studies (Massey University), and public policy (Vanderbilt University). A second centre located at Vanderbilt University was in education and human development. One centre (Royal Melbourne Institute of Technology) is not located within a Faculty or department but rather reports directly to senior administration of the Institute. Three centres were interdisciplinary: University of Illinois Urbana-Champaign (psychology and education), University of Technology Sydney (business and nursing) and University of York (economics and health sciences). It is interesting to note that 17 of the 21 centres (81%) had the term ‘evaluation’ explicitly represented in the centre name (‘applied research’ in one case).
4.1.2 Centre activities and supports
From mission statements and lists of activities we coded the principal activities of the centres into research (usually discipline-specific but sometimes research on evaluation), practice (most often evaluation and related practices), teaching (formal links to degree programs offered at the university), and training (professional development in the form of non-credit workshops, seminars, and institutes). Nine centres (e.g., Claremont Graduate University, University of Iowa) are engaged in all four types of activity, whereas most others are engaged in three of the four. A small number of centres are engaged in only two of these activities, usually research and practice, with no professional development or educational services offered.
As was the case with evaluation training programs, it was sometimes difficult to identify precisely how many faculty members and associated staff were affiliated with the respective centres. Faculty complements of those with direct accountability for centre activities ranged enormously, from about 2 to well over 10. Usually, centres had (sometimes extensive) lists of affiliated or associated members from various departments and faculties around campus. While most had identifiable support staff in the roles of research coordination, project leadership, research assistance, and financial and administrative assistance, many also directly involved students in some of these roles. In one of the columns in Table 5, we elaborated practice and consultancy activities in an effort to capture the nature of the evaluation-related business of the centres. Such activities could usually be categorized as one of three main types: consultation and advisory activities; evaluation and applied research delivery services (conducting the inquiry); and dissemination and follow-up. In some instances, given the centre’s mandate or mission, evaluation was integrated with other activities and responsibilities. For example, at the University of Illinois, Urbana-Champaign, evaluation was integrated with program development activities. In other locations, such as the University of East Anglia and the University of Minnesota, evaluation represented one choice on a menu of inquiry activities that included policy research, needs assessment or even action research. Consultation services are provided by many centres on such issues as funding proposal development, evaluation planning and design, and instrument design and validation. Consultation might also include providing specific technical services such as computerized data scanning or statistical analysis of data sets.
In terms of dissemination activities, we observe that in some cases the centre acted as a knowledge brokerage with explicit goals of diffusing research and best practices not actually produced by centre personnel. In many cases, however, such dissemination and follow-up related directly to project work undertaken by centre staff.
In the final column in Table 5, we attempted to capture a sense of centre sustainability and, where possible, to identify links to government. It would be safe to assume that virtually all centres receive some sort of internal support from the university, but with the expectation that they be largely self-sustaining over time. We found it virtually impossible to identify from websites to what extent centres relied on internal support. We determined that significant internal support is provided in two cases: University of Melbourne (J. Owen, personal communication, April 7, 2006) and Western Michigan University (A. Gullickson, personal communication, April 11, 2006). In both cases, there is a significant expectation that university support be augmented through the generation of external contract work and other revenue streams.
In many cases, we were able to ascertain that significant support was derived from external sponsorship, service provision (contract work), or fees for services such as workshops, conferences, short courses, and consultation. Sponsorships took one of two forms. First, external agencies (government, private foundations, private corporations) were sometimes identified as official sponsors of the centre, which we took to imply that the centres were supported by grants and contributions. Some centres actually got their start this way. For example, at UCLA the original Center for the Study of Evaluation (now the Center for Research on Evaluation, Standards, and Student Testing) was funded as a national research centre by the Department of Education. Other such centres were located at the University of Wisconsin and the University of Pittsburgh, and they all receive ongoing renewable funding (M. Alkin, personal communication, April 11, 2006). In other cases, sponsorship takes the form of grants obtained through competitive processes, usually from government sources or foundations. Federal, state and municipal governments were implicated as sponsoring agencies, with municipal government being mentioned quite infrequently. At the University of Melbourne, state government contracts (and some federal and municipal) represent 80% of Centre revenues (J. Owen, personal communication, April 7, 2006). In other instances community agencies (often para-governmental) served as clients for service provision. A similarly high proportion of government sourcing was reported with respect to Western Michigan University (A. Gullickson, personal communication, April 11, 2006).
4.1.3 Summary
In summary, we located a wide range of centres in five English-speaking countries around the globe. Many of the centres were located in faculties of education or human development, but health services and interdisciplinary centres also had a presence. Centres varied quite substantially in size and in the scope of their work. Most were involved in some combination of research; evaluation-related practices including consultation, service delivery, and dissemination; and training or education. Most centres were dependent in some way on government (usually federal or state) for sponsorship, competitive grant funds, or contracted project work. We observed that private foundations often provided support as well. Centre business often included disciplinary research (e.g., child welfare, public health) in addition to evaluation-related services. In some instances we observed formal links to degree programs, but sometimes centre activities did not involve education or training. We now turn to an examination of domestic Canadian centres of excellence by way of examining similarities and differences with those located in the international context.
4.2.1 Sample characteristics
Whereas over 80% of the centres comprising the international sample included ‘evaluation’ in the centre name, such was the case in only 5 (62%) of our final sample of 8 Canadian centres. At least partly for this reason, centres with a significant interest in evaluation were comparatively more difficult to locate. Ultimately, as shown in Table 6, we located 1 centre in the Maritime Provinces (University of New Brunswick), 5 in Ontario (Carleton University, Queen’s University, University of Ottawa, University of Toronto, University of Waterloo) and 2 in the west (University of Calgary, University of Saskatchewan/University of Regina). The centre in Saskatchewan is actually a jointly managed organization comprising two universities and a health foundation. Three of the centres are either not affiliated with specific faculties or departments or that information is not clear on the website (University of New Brunswick, University of Calgary, University of Saskatchewan/University of Regina). The others are located in public management (Carleton University), education (Ontario Institute for Studies in Education/University of Toronto), social sciences (University of Ottawa), or health sciences (University of Waterloo). Like the unaffiliated centres, the centre at Queen’s University is interdisciplinary (Faculties of Education and Health Sciences).
We located four other centres but decided that their interest in evaluation (as understood for the purposes of this paper) was marginal although they are very much involved in important work with great potential to influence government policy. These were:
In contrast to the international sample, none of the Canadian centres demonstrated a scope of activity spanning all of the identified areas – research, practice, teaching, training – although most appeared to focus on three of the four. In all cases, research was a main activity of the centre, which we took to imply disciplinary research (policy, public health) as opposed to research on evaluation. Practice activities related to evaluation were similar in Canadian centres to those in the international sample. Specifically, several centres provided program evaluation services, consultation, and/or dissemination and follow-up services designed to foster evidence-based practice. In some cases, such as University of Calgary and Carleton University, evaluation services were somewhat incidental to the more mainstream policy research activities.
Formal links to university degree programs were observable in only one of the centres (University of Saskatchewan/University of Regina), whereas several centres (e.g., University of Ottawa, University of Waterloo) identified evaluation training and capacity building as central activities. It is interesting to note that only two of the centres (University of New Brunswick, University of Ottawa) make explicit reference to the involvement of students in centre business. Although the number of faculty dedicated to the centre was difficult to determine in some instances, most identified 4 or 5 faculty in addition to research fellows, associates or faculty affiliates. It was not possible to determine staffing parameters for Carleton University or OISE/UT from their respective websites. Perhaps not surprisingly, given the Canadian population, we did not observe exceptionally large centres as was the case in the US and the UK.
Table 6: Centres of Excellence in Evaluation and Related Fields in Canada
Name | Location | Missiona | Practice /Consultancy | Human Resources | Sustainability / Link to Government |
Carleton University, Centre for Policy and Program Assessment, School of Public Management | Ottawa, ON | Practice Research |
Applied research in numerous public policy fields and program areas at the federal, provincial, municipal and international levels of government. | Unclear: several Faculty from different departments and disciplines | Service provider relationship with federal government. Sponsorship: unclear (federal government research grants) |
Ontario Inst for Studies in Education/UT Centre for the Advancement of Measurement, Evaluation, Research & Assessment | Toronto, ON | Practice Research Training |
Collaboration on R & D; program evaluation design; instrument development and validation; data collection and analysis; report writing. Workshops, symposia and seminars on methods issues |
Unclear; 1 Faculty person |
Revenue generated through contract work. Service provider relationship with provincial government |
Queen’s University, Social Program Evaluation Group; Faculty of Education, Faculty of Health Sciences |
Kingston, ON | Practice Research |
Basic, applied and policy research; program evaluation and monitoring; dissemination activities with partner agencies | 1 Director, 2 Faculty, 4 project managers, 1 research associate, 2 support staff. | Service provider relationship: contracts with federal and provincial government. Sponsorship: federal and provincial research grants. |
University of Calgary, Institute for Advanced Policy Research, Unaffiliated |
Calgary, AB | Research Training Practice |
No mention of evaluation practice. Focus on policy research. Policy briefs and technical report dissemination. Cities, disabled, well being, climate change. | 1 Director 4 Faculty 28 affiliated Faculty and staff |
Funding and sustainability are unclear. Sponsorship through competitive research funding. |
University of New Brunswick, Canadian Research Institute for Social Policy, Unaffiliated |
Fredericton, NB | Research Practice |
Conducting detailed evaluations of local, national, and international policy initiatives, and analyzing large complex databases | 2 administrators (director, associate), 4 research fellows, 2 research associates, 2 staff and 6 students |
Funding and sustainability are unclear. Sponsorship through competitive research funding. |
University of Ottawa Centre for Research on Community Services: Faculty of Social Sciences |
Ottawa, ON | Research Practice Training |
Social research studies; community program devel through research and training; needs assessment; program evaluation; survey design and analysis | 2 principal Faculty, 9 senior researchers, research coordinator, student assistants, admin staff |
Revenue generated through contract work, fees for training activities and services, and research grants. Service provider relationship: some government, most often community sector |
University of Saskatchewan & University of Regina, Saskatchewan Population Health and Evaluation Research Unit; University affiliation unclear |
Saskatoon, Regina, Prince Albert, SK |
Research Practice Teaching |
Research focus on Aboriginal, northern and rural health; children’s health; health policy and governance. No evaluation services identified | 1 Administrator: director; 6 Faculty; 6 support staff |
Partnership among two universities and the Saskatchewan Health Research Foundation. Funded research. |
University of Waterloo, Centre for Behavioural Research and Program Evaluation; Faculty of Applied Health Sciences |
Waterloo, ON |
Research Practice Training |
Evaluation planning; data collection tools and protocols; data analysis and interpretation; knowledge synthesis and translation; capacity building and training | 3 admin (director, 2 assistant directors) 3 scientists, 7 admin support staff, 12 evaluation and research staff. |
Sponsored by Canadian Cancer Society. Other funding unclear: competitive research funding likely. |
a Teaching implies link to university degree programs; Training implies professional development services to clients; Research implies creation of academic knowledge which may or may not be directly related to evaluation; Practice implies delivery of evaluation services including consultation and project management. |
Finally, funding and modes of sustainability were generally difficult to determine, but it would be safe to say that at least a portion of centre budgets comes from the university, while other revenue is generated through sponsorship, competitive grants, and contract work. There is ample evidence that centre sustainability depends in part on government sponsorship or contract services to government. In the case of the University of Waterloo and the University of Saskatchewan/University of Regina, formal partnerships were established with supporting non-governmental agencies and foundations.
4.2.2 Summary
To summarize, compared to the international sample, Canadian centres of excellence with significant interest in evaluation-related activities appear to be somewhat more homogeneous in size and less prevalent in faculties of education. We observed a tendency toward interdisciplinary centres and centres not affiliated with a particular faculty or department. Centres that participate in evaluation-related activities were difficult to locate by virtue of ‘evaluation’ not being represented in the centre name. Nevertheless, there is substantial involvement of university-based centres in Canada in evaluation activities, whether in consultation, service delivery, or training. There is also a good deal of interest in fostering evidence-based practice in the respective fields of practice, in some cases through disseminating policy research or brokering research done elsewhere. Finally, it seems clear that centres depend to a significant degree on funds generated through their relationship with government, whether as recipients of sponsorship, as grant recipients, or as contractors to government at provincial and/or federal levels.
The foregoing literature review and survey of university-based evaluation training options, and of the existence, function and sustainability of university-based centres of excellence with an interest in evaluation, provides a comprehensive platform from which to consider potential roles for government in fostering evaluation quality assurance. Given these deliberations and the evidence we uncovered concerning university-based interests in evaluation in Canada, we are persuaded that it would be premature at this time to move to an individual-level certification model. Although significant recent progress has been made, in Canada and elsewhere, in developing core competencies for evaluators, diversity in the field of practice is substantial, and it militates against the implementation of a licensure approach that would restrict entry into the field by virtue of tests of minimum levels of knowledge and skill. Further, we have seen that graduate-level degree programs in evaluation are simply not available in Canada, unlike the situation in other jurisdictions. There is, however, substantial graduate-level instruction concerning the evaluation function occurring in Canadian universities, and we found some indication that graduate certificate programs are becoming a realistic advanced-level alternative to existing professional development activities that result in a certificate of participation rather than achievement.
Our primary conclusion from this analysis is that a system of credentialing, which would acknowledge a set of courses or other experiences a person must complete to be recognized, would be the most prudent and realistic route to meeting current demands for quality assurance in evaluation in the Canadian context. Such a system could in time form the basis of a more elaborate and stringent certification system, should consensus emerge on the definition and bounds of the competencies that evaluators should possess. It is from this central plank that we now turn to considerations of the role of government in fostering evaluation quality assurance in Canada. We address such issues under the banners of training, centres of excellence, other implications for universities, and links with the professional society, the CES.
5.1.1 Graduate certificate programs in evaluation
The development of pilot projects for graduate-level university certificate programs in program evaluation represents a reasonable and potentially powerful step for government to take in fostering its quality assurance agenda. These would be master’s-level programs comprising five or six 3-credit courses taken on a part-time or full-time basis. The Ontario Council for Graduate Studies has become quite open to the concept in recent years, as Ontario universities have benefited from the development and implementation of graduate certificates in a variety of applied fields and domains of inquiry. We note that the concept of the graduate certificate program is becoming commonplace in other jurisdictions in Canada, as it is globally. The focus would be on preparing qualified and competent persons to assume evaluation roles. Programs should include solid grounding in the methods and practice of evaluation, evaluation theory and models, and experiential learning opportunities through practica or internship placements.
A small number of these programs currently exist in Canada. It might be possible to partner with these programs to offer specialized versions of the certificate program that would be tailored to the needs for evaluation in the federal government. As well, it may prove beneficial to assist universities to develop distance education approaches to delivering these programs so that they are made available to federal sector employees across the country.
Other possibilities would be to negotiate pilot opportunities with other promising sites in Canada, such as university faculties or departments that currently offer multiple evaluation courses on a regular basis. Support in the form of guaranteeing a certain number of federal government placements (i.e., government personnel to be retrained for evaluation) over coming years would be useful to help establish and develop the programs within the university structure.
The advantage of federal support for the development of graduate certificate programs in evaluation might be realized in the form of federal-level credentialing of evaluators. That is, the federal government could move ahead and require its program evaluators or contractors to have completed a graduate-level certificate program in evaluation (presumably one that would have participated in, or been modelled on those involved in, the pilot initiative). The credential then becomes the certificate which graduates of the program receive. This route assumes reasonable similarity between the different programs across the country, which there should be if coordinated via a pilot project. (Another possibility would be to work with CES to develop a registry of credentialed evaluators – see discussion below.)
5.1.2 Develop a graduate degree program in program evaluation in partnership with a university
A somewhat more ambitious option would be to collaborate with one or more universities to develop a graduate degree specifically tailored to meeting government evaluation training and certification needs. Such a program would be a significant challenge to develop and install, and would entail an ongoing commitment by government to running the program. A core curriculum could be developed on the basis of what is currently known about evaluator competencies, juxtaposed with identified, and perhaps somewhat unique, government needs with regard to the evaluation function (e.g., fit with the expenditure management and accountability framework). The program could be offered on a part-time basis to select public servants on a pre-service basis, or perhaps on an in-service basis in the short run. The degree would be recognized on a university transcript and therefore transferable anywhere, which would likely be highly attractive to a good many public servants. Another advantage of such a program is that the curriculum would be tailored to government needs and therefore highly relevant, while at the same time quality would be assured by mandatory compliance with provincial regulations for graduate instruction. To accomplish the latter, there may be implications for the involvement of university faculty in aspects of courses given by members of the public service. (Recall, for example, that provisions for assignment development and marking to be the purview of the Institute of Education are associated with the London program – see section 3.3 above.) Another consideration would be geography: would the program be made available to public servants across the country, and if so, on what basis? Finally, the program would necessarily be given in both official languages, which would carry resource implications.
5.1.3 Workshops, short courses, summer institutes and other learning experiences
Several possibilities exist here to continue to provide more basic-level training and exposure to evaluation principles and practices. First, CEE should continue to develop and offer workshops and learning events associated with a CEE evaluation learning strategy. These events are highly relevant to evaluation in government and may serve to augment more advanced training such as degree or graduate certificate programs. Similarly, it would be beneficial to encourage participation in the CES Essential Skills Series and intermediate short courses for basic-level training. It would be prudent to evaluate the curricula of these courses against contemporary government exigencies in order to ensure that they would at least partly meet quality assurance needs. Another option might be to develop partnerships with universities to offer summer institutes, either theme-based or oriented more generally to evaluation capacity building. Universities have considerably more flexibility to offer such courses because they are not governed by central accrediting agencies, and the courses ultimately are not included on the university transcript. Such institutes are quite popular in other jurisdictions and offer the opportunity to bring in high-profile guest speakers from within the evaluation community.
5.2.1 Support for development of university-based centres of excellence in evaluation
Our analysis shows the multiple dimensions of value that university-based centres of excellence can add to the evaluation quality assurance agenda. Yet in Canada, the presence of centres with wide scope with regard to the evaluation function is somewhat muted as compared with other jurisdictions. Only some existing centres have formal ties to graduate degree or certificate programs, and since there are virtually no evaluation degree programs in Canada, such centres represent a largely underdeveloped avenue for training highly qualified evaluation personnel. Some existing centres of excellence do include training and evaluation capacity building among their core activities. Such functions could be invaluable to public servants needing to develop specific knowledge and skill sets on an in-service basis. Centres of excellence also serve a consultative function. It would be beneficial for government departments to cultivate such relationships and to take advantage of this function for advisory and peer review services. Centres of excellence also carry out contract work and provide bona fide evaluation services. Government investment in stimulating centre development would be well spent to the extent that such centres could provide alternatives to the usual firms on standing offers for service provision. In other jurisdictions, particularly the US, evaluation centres have the infrastructure to handle very large scale evaluations of national programs. There is no reason why this could not be the case in Canada. It is true that individual professors are unable to drop their myriad commitments to pick up demanding contracts on tight timelines, yet appropriately resourced centres would comprise research associates, coordinators, post-doctoral fellows, and students, all of whom would be in a position to provide valued expertise to evaluations of major program or policy initiatives.
How could government help bring this about? For one, through internal restructuring, contract work could be made more readily available or accessible to universities, and to university centres in particular. In the interest of developing a core of highly qualified personnel in evaluation, another consideration would be to establish funded post-doctoral positions within such organizations. Fostering research on evaluation, another potentially strong interest of centres of excellence, would provide a further option for consideration. Despite their different purposes, research interests can often be piggybacked on evaluation activities. A caveat is that universities would require at least shared ownership of intellectual property, but stimulating research on evaluation could be highly advantageous in the long run. Good research attracts interest and in and of itself may serve a capacity building function with regard to the appreciation of the power and potential of evaluation as a management function. In addition to post-doctoral opportunities, the development of a small number of research chairs dedicated to program evaluation and associated with certificate programs could be considered as a means of advancing the ability of universities to contribute to the development of program evaluation capacity across the country.
5.2.2 Continue/expand academic liaison
Quality assurance contributions such as serving on advisory committees for ongoing evaluation planning and integration with strategic plans, peer review of evaluation frameworks and reports, and meta-evaluation of clusters of evaluation reports are all valued contributions that academics can make regardless of whether they are affiliated with centres of excellence. This is current practice in many federal government departments and agencies, but it is far from widespread. It would be prudent to encourage departmental and agency evaluators and members of the decision and policy community to cultivate relationships with academics in such advisory capacities. We observe that many evaluation courses are offered at universities across the country and that a good deal of evaluation expertise likely exists in universities despite the paucity of official evaluation degree and certificate programs.
5.3.1 Exchanges between government sector and academe
Longstanding support exists in the organizational change literature for developing organizational knowledge and learning by attracting external persons, such as academics, to work in the government sector on short-term assignments. In the present case these might take the form of one-year secondments, sabbatical placements, or short-term replacements for persons on temporary leave. Datta (2003) points out that some of the more influential figures in developing evaluation as a domain of inquiry have been academics who have worked in the government sector with significant evaluation responsibility and influence.
5.3.2 Support for student development
Students represent the next generation of evaluators, and efforts to stimulate the development of their knowledge and skill in evaluation represent a class of strategies that would be likely to pay off. Many evaluation centres of excellence routinely engage students in evaluation contract work, which is typically extraordinarily beneficial to all concerned. In addition to helping students achieve financial stability, knowledge and skill developed in a practical milieu are likely to be robust. Such opportunities need not be limited to contract work in centres of excellence. Summer placement programs and other modes of involving students in work placements, such as government internships, could be highly beneficial. Currently, cooperative education programs exist at the graduate level at only a small number of universities and, we can say with some certainty, none with evaluation as a focus. Yet this mode of learning carries a long tradition of support in terms of attendant benefits to both students and the organizations that take them on. In addition to such direct support, continued sponsorship of worthwhile competitions such as CES paper contests and case competitions would be likely to pay off for government as well.
5.4.1 Support CES to develop credentialing system
In section 5.1.1 above we argued that an easy and logical feature of a pilot program to set up graduate certificate programs at universities across the country would be for the government to institute a de facto credentialing system. Public servants and external persons completing the graduate certificate would be considered competitively qualified for upcoming positions. Yet credentialing systems can and should be more sophisticated than that in order to minimize the production of false negatives (persons not credentialed who should be) and false positives (persons credentialed who should not be). In addition to the certificate program, other experiences, such as disciplinary graduate degrees with a specialization in evaluation, practical evaluation experience, and contributions to the advancement of evaluation theory and practice, might ultimately help to address the problem. Yet such systems would imply the establishment of procedures for adjudicating applications and for maintaining a registry of qualified persons. Such persons could be working directly within government or external to it as private consultants.
Developing such a credentialing system, as we suggested above, might be viewed as an incremental strategy toward eventual certification. Given the professional interests at stake, it would be logical for the professional association to take the lead in installing such a system. Of course, the development of a credentialing system would incur substantial start-up and maintenance costs for which subsidization might be required. For example, government might consider supporting a conference or some other process for agreeing on a common set of standards for credentialing, accreditation of programs, and the like. Or a formalized partnership in a credentialing system might be an option worth considering. Once the system was installed, full-time staff would be required to maintain a registry of credentialed evaluators, which might entail substantial increases in member dues. With a credentialing system in place, and a mechanism to equate other experiences with the demands of a graduate certificate, TBS might consider the implications for hiring practices within the federal sector and for contracting out to private service providers.
This brings to a close our thoughts about possible roles for government in assuring quality in evaluation. The prospects for the development and cultivation of government-university relationships are many and varied. Both sectors, it seems, stand to benefit enormously from heightened assurances of evaluation quality and, ultimately, the full integration of evaluation into the management function. We hope that this discussion paper lays the groundwork for further concrete dialogue, deliberation and, ultimately, action toward these ends.
References
Alkin, M. C., Kosecoff, J., Fitz-Gibbon, C., & Seligman, R. (1974). Evaluation and decision making: The Title VII experience. Los Angeles, CA: Centre for the Study of Evaluation.
Altschuld, J. W. (1999). A case for a voluntary system for credentialing evaluators. American Journal of Evaluation, 20(3), 507-517.
Altschuld, J. W. (1999). Certification of evaluators: Highlights from a report submitted to the Board of Directors of the American Evaluation Association. American Journal of Evaluation, 20, 481-493.
Altschuld, J. W. (2005). Certification, credentialing, licensure, competencies, and the like: Issues confronting the field of evaluation. Canadian Journal of Program Evaluation, 20(2), 157-168.
Altschuld, J. W., Engle, M., Cullen, C., Kim, I., & Macce, B. R. (1994). The 1994 directory of evaluation training programs. In J. W. Altschuld & M. Engle (Eds.), New Directions in Program Evaluation: The preparation of professional evaluators: Issues, perspectives, and programs, No. 62 (pp. 71-94). San Francisco: Jossey-Bass.
Amara, N., Ouimet, M., & Landry, R. (2004). New evidence on instrumental, conceptual and symbolic utilization of university research in government agencies. Science Communication, 26, 75-106.
American Evaluation Association. (2004). American Evaluation Association response to U.S. Department of Education notice of proposed priority, Federal Register RIN 1890-ZA00, November 2003, "Scientifically based evaluation methods." Retrieved April 19, 2004, from http://eval.org/doestatement.htm
Aucoin, P. (2005, April). Decision making in government: The role of program evaluation. Ottawa: Centre of Excellence in Evaluation, Treasury Board of Canada, Secretariat.
Auditor General of Canada. (1997, Oct.). Moving toward managing for results. Report to Parliament, Chapter 11. Ottawa: Author.
Bickman, L. (1999). AEA, bold or timid? American Journal of Evaluation, 20, 519-520.
Borys, S., Gauthier, B., Kishchuk, N., & Roy, S. (2005, November). Survey of evaluation practice and issues in Canada. Paper presented at the Joint Canadian Evaluation Society/American Evaluation Association Conference, Toronto.
Breen, G., & Associates. (2005, May). Interviews with deputy ministers regarding the evaluation function. Ottawa: Centre of Excellence in Evaluation, Treasury Board of Canada, Secretariat.
Canadian Evaluation Society. (n.d.-a). Essential skills series. Retrieved January 20, 2006, from http://evaluationcanada.ca
Canadian Evaluation Society. (n.d.-b). CES guidelines for ethical conduct. Retrieved January 20, 2006, from http://evaluationcanada.ca
Centre of Excellence for Evaluation. (2004). Review of the quality of evaluations across departments and agencies. Ottawa: Author, Treasury Board of Canada, Secretariat.
Cousins, J. B. (2005). Interview with Joe Hudson founding editor of the Canadian Journal of Program Evaluation. Canadian Journal of Program Evaluation, 20(3), 199-221.
Cousins, J. B., Goh, S., Clark, S., & Lee, L. (2004). Integrating evaluative inquiry into the organizational culture: A review and synthesis of the knowledge base. Canadian Journal of Program Evaluation, 19(2), 99-141.
Cousins, J. B., Goh, S., Aubry, T., Elliott, C., Lahey, R., & Montague, S. (2006, January). What makes evaluation useful? A concept mapping study. Presentation made at a session of the Performance and Planning Exchange, Ottawa.
Datta, L.-E. (2003). The evaluation profession and the government. In T. Kellaghan & D. L. Stufflebeam (Eds.), International Handbook of Educational Evaluation (pp. 345-360). Boston: Kluwer.
Engle, M., & Altschuld, J. W. (2003). An update on university-based evaluation training. Evaluation Exchange, 9(4), 13.
Engle, M., Altschuld, J. W., & Kim, Y. C. (in press). 2002 survey of evaluation preparation programs in universities: An update of the 1992 American Evaluation Association-sponsored study. American Journal of Evaluation.
Government of Canada. (2000). Results for Canadians. Ottawa: Author.
Government of Canada. (2006). Canada's new government: Federal accountability action plan. Turning a new leaf. Ottawa: Author. Retrieved April 15, 2006, from http://www.faa-lfi.gc.ca/docs/ap-pa/ap-pa00_e.asp
Grasso, P. (2003). What makes an evaluation useful? Reflections from experience in large organizations. American Journal of Evaluation, 24, 507-514.
Gussman, T. K. (2005, May). Improving the professionalism of evaluation. Ottawa: Centre for Excellence in Evaluation, Treasury Board Secretariat of Canada.
Joint Committee on Standards for Educational Evaluation. (1994). The program evaluation standards. Thousand Oaks: Sage.
Jones, S. C., & Worthen, B. R. (1999). AEA members' opinions concerning evaluator certification. American Journal of Evaluation, 20, 495-506.
King, J. A., Stevahn, L., Ghere, G., & Minnema, J. (2001). Toward a taxonomy of essential evaluation competencies. American Journal of Evaluation, 22, 229-247.
Kingsbury, N., & Hedrick, T. E. (1994). Evaluator training in a government setting. In J. W. Altschuld & M. Engle (Eds.), New Directions in Program Evaluation: The preparation of professional evaluators: Issues, perspectives, and programs, No. 62 (pp. 61-70). San Francisco: Jossey-Bass.
Landry, R., Amara, N., & Lamari, M. (2001). Utilization of social sciences research knowledge in Canada. Research Policy, 30, 333-349.
Leviton, L. C. (2003). Evaluation use: Advances, challenges and applications. American Journal of Evaluation, 24, 525-535.
Long, B., & Kishchuk, N. (1997). Professional certification: A report for the National Council of the Canadian Evaluation Society on the experience of other organizations. Ottawa: Canadian Evaluation Society.
Love, A. (1994). Should evaluators be certified? In J. W. Altschuld & M. Engle (Eds.), New Directions in Program Evaluation: The preparation of professional evaluators: Issues, perspectives, and programs, No. 62 (pp. 29-40). San Francisco: Jossey-Bass.
Mathison, S. (Ed.). (2005). Encyclopedia of evaluation. Thousand Oaks: Sage.
May, R. M., Fleisher, M., Schreier, C. J., & Cox, G. B. (1986). Directory of evaluation training programs. In B. G. Davis (Ed.), Teaching evaluation across the disciplines. New Directions in Evaluation, No. 29 (pp. 71-98). San Francisco: Jossey-Bass.
McGuire, M., & Zorzi, R. (2005). Evaluator competencies and professional development. Canadian Journal of Program Evaluation, 20(2), 73-99.
Morris, M. M. (1994). The role of single evaluation courses in evaluation training. In J. W. Altschuld & M. Engle (Eds.), New Directions in Program Evaluation: The preparation of professional evaluators: Issues, perspectives, and programs, No. 62 (pp. 51-59). San Francisco: Jossey-Bass.
Morris, M. M. (2003). Ethical considerations in evaluation. In T. Kellaghan & D. L. Stufflebeam (Eds.), International Handbook of Educational Evaluation (pp. 303-328). Boston: Kluwer.
Muller-Clemm, W. J., & Barnes, M. P. (1997). A historical perspective on federal program evaluation in Canada. Canadian Journal of Program Evaluation, 12(1), 47-70.
Nagao, M., Kuji-Shikatani, K., & Love, A. (2005). Preparing school evaluators: Hiroshima pilot test of the Japan Evaluation Society's accreditation project. Canadian Journal of Program Evaluation, 20(2), 125-155.
Office of the Comptroller General. (1981a). Guide on the program evaluation function. Ottawa: Minister of Supply and Services.
Office of the Comptroller General. (1981b). Principles for the evaluation of programs by federal departments and agencies. Ottawa: Minister of Supply and Services.
Patton, M. Q., Grimes, P. S., Guthrie, K. M., Brennan, N. J., French, B. D., & Blyth, D. A. (1977). In search of impact: An analysis of the utilization of federal health evaluation research. In C. H. Weiss (Ed.), Using Social Research in Public Policy Making (pp. 141-163). D. C. Heath and Company.
Perrin, B. (2005). How can the information about the competencies required for evaluation be useful? Canadian Journal of Program Evaluation, 20(2), 169-188.
Russon, C., & Russon, G. (Eds.). (2005). International perspectives on evaluation standards. New Directions in Evaluation, No. 104. San Francisco: Jossey-Bass.
Scriven, M. (2003, November 12). Comments to feds re: proof of causation. Message posted to EVALTALK [Msg. 41]. http://bama.ua.edu/archives/evaltalk.html
Segsworth, R. V. (2005). Program evaluation in the Government of Canada: Plus ça change... Canadian Journal of Program Evaluation, 20(3), 173-195.
Shadish, W., Newman, D., Scheirer, M. A., & Wye, C. (Eds.). (1995). Guiding principles for evaluators. New Directions in Evaluation, No. 66. San Francisco: Jossey-Bass.
Smith, M. F. (1999). Should AEA begin a process for restricting membership in the profession of evaluation? American Journal of Evaluation, 20, 521-531.
Smith, M. F. (2003). The future of the evaluation profession. In T. Kellaghan & D. L. Stufflebeam (Eds.), International Handbook of Educational Evaluation (pp. 373-386). Boston: Kluwer.
Stevahn, L., King, J. A., Ghere, G., & Minnema, J. (2005a). Establishing essential competencies for program evaluators. American Journal of Evaluation, 26, 43-59.
Stevahn, L., King, J. A., Ghere, G., & Minnema, J. (2005b). Evaluator competencies in university-based training programs. Canadian Journal of Program Evaluation, 20(2), 101-123.
Stufflebeam, D. L. (2001). Interdisciplinary Ph.D. programming in evaluation. American Journal of Evaluation, 22, 445-455.
Treasury Board of Canada, Secretariat. (1977). Evaluation of programs by departments and agencies. (Treasury Board Circular 1977-47) Ottawa: Author.
Treasury Board of Canada, Secretariat. (2001). Evaluation policy. Ottawa: Author. Retrieved January 20, 2006, from http://www.tbs-sct.gc.ca/pubs_pol/dcgpubs/tbm_161
Treasury Board of Canada, Secretariat. (2006a). Policy on internal audit. Ottawa: Author. Retrieved April 15, 2006, from http://www.tbs-sct.gc.ca/pubs_pol/dcgpubs/ia-vi/ia-vi_e.asp
Treasury Board of Canada, Secretariat. (2006b). Guidelines on the responsibility of Chief Audit Executive. Ottawa: Author. Retrieved April 15, 2006, from http://www.tbs-sct.gc.ca/pubs_pol/dcgpubs/ia-vi/rcae-rdv_e.asp
United States Department of Education (2003). Scientifically based evaluation methods. (Federal Register RIN 1890-ZA00, November 4, 2003, vol. 68, no. 213, pp. 62445-62447). Washington, DC: U.S. Government Printing Office.
Weiss, C. H., & Bucuvalas, M. J. (1980). Social science research and decision making. New York: Columbia University Press.
Worthen, B. R. (1999). Critical challenges confronting certification of evaluators. American Journal of Evaluation, 20, 533-555.
Worthen, B. R. (2003). How can we call evaluation a profession if there are no qualifications for practice? In T. Kellaghan & D. L. Stufflebeam (Eds.), International Handbook of Educational Evaluation (pp. 329-344). Boston: Kluwer.
Zorzi, R., Perrin, B., McGuire, M., Long, B., & Lee, L. (2002). Defining the benefits, outputs, and knowledge elements of program evaluation. Canadian Journal of Program Evaluation, 17(3), 143-150.
Zorzi, R., McGuire, M., & Perrin, B. (2002). Canadian Evaluation Society project in support of advocacy and professional development: Evaluation benefits, outputs, and knowledge elements. Retrieved January 20, 2006, from http://consultation.evaluationcanada.ca/pdf/ZorziCESReport.pdf