Canadian Institutes of Health Research

Evaluation Study of the Operating Grants Program

Report

Evaluation and Analysis Unit
September 2004

Also available in French

Table of Contents

Executive Summary
1. Introduction
1a. Context
1b. Purpose of Report and Audience
1c. Scope
1d. Approach to Development of Evaluation
2. Background
2a. Nature of the Operating Grants Program
3. Methodology
3a. General Overview
3b. Survey of OGP Applicants
3c. Administrative Data Analysis
3d. Interviews
4. Findings
4.1 Administrative data
4.1a. Scope of the OGP
4.1b. Characteristics associated with Successful OGP Applications
4.1c. Association between OGP funding and top health research articles
4.2 Surveys of OGP applicants
4.2a. Web Survey of Successful OGP Applicants
4.2b. Survey of Unsuccessful OGP applicants
4.3 Interviews
5. Key conclusions and recommendations
Management Response and Action Plan

Executive Summary

CIHR has promised transparency and accountability to Canadians for its investments of public dollars in all its health research programs and activities. As part of its commitment to transparency and accountability, CIHR undertook its first evaluation of its largest single program of health research funding, the Operating Grants Program (OGP), to assess the OGP's operation, impact and achievement of objectives.

The Operating Grants Program represented roughly 46% of CIHR's total grants and awards expenditures in 2003-2004 (approximately $265,000,000 out of $576,000,000). The program offers research grants in peer-reviewed open competitions to all eligible health researchers. While the program has been in existence for over twenty years, the evaluation is primarily focused on the program outputs and outcomes that have occurred since transition from the Medical Research Council of Canada (MRC) to CIHR in 2000.

The evaluation was conducted to provide CIHR's Governing Council, senior management and the health research community with evidence on whether the program was meeting its expected results of:

  1. improved capacity for generating and developing new knowledge and
  2. improved production of highly qualified personnel.

The evaluation also addresses a recommendation from the Auditor General to monitor and evaluate the OGP.

Given the scope of the program, the evaluation took place in several phases and relied on multiple lines of evidence. The first phase of the evaluation was conducted by an external contractor (BearingPoint) and was overseen by a steering committee of independent evaluation and content-area experts. The first phase included a web survey of over 600 OGP recipients, one-on-one interviews with a sub-sample of 24 OGP recipients, and interviews with key informants such as Directors of other health research organizations and CIHR Institute Scientific Directors. The second phase of the evaluation was implemented by staff within CIHR's Evaluation and Analysis Unit and included an analysis of administrative data, a survey of approximately 115 unsuccessful OGP applicants and a small bibliometric study. Given the importance of this program and evaluation, all phases of the report have been presented to the CIHR Governing Council's Standing Committee on Performance Measurement, Evaluation and Audit.

The evaluation design, using multiple methodologies and incorporating input from multiple stakeholders, helps to ensure that the findings and conclusions are based on reasonable and verifiable evidence. Each of these methodologies had its own strengths and weaknesses which are outlined in the report and which were taken into consideration during the analyses and development of conclusions.

While the evaluation has provided a good starting point regarding the perceptions and performance of the program, it does not provide a substantive handle on what OGP funded research accomplishes related to CIHR's mission. CIHR recognized in its corporate strategic plan that "it must demonstrate to Canadians not only the value of individual programs within its suite of programs but also the overall return on investment (ROI) to Canadians. What good comes from health research? For some it is an intrinsic good - the search for new knowledge and discovery is inherently worthwhile, the more of it, the better. The process of discovery is incremental, and as such tangible benefits do not materialize immediately. Others take a more agnostic view - that health research is instrumental and subordinate to other goals (e.g. better health, economic productivity) and, moreover, competes with other claims for public dollars."

CIHR has committed to developing an ROI framework that will enable measurements of the impact of research funded through OGP and other programs, in future evaluations.

Conclusions and Recommendations

The findings of the evaluation concerning the success of OGP in achieving results are presented below.

5.1. How successful is the OGP in improving capacity for generating and developing new knowledge?

Conclusion 1: Additional data collection strategies are required to further review the OGP and its role in Canadian health research.

Additional data collection strategies should be put in place to collect appropriate data at the end of funded research projects, in order to better measure outputs, and again several years after project completion, in order to assess the overall impacts of funded research. CIHR should also collect appropriate information on the type and level of dissemination and translation strategies used by OGP researchers. Without pre-existing benchmarks or a valid impact factor, it is difficult to judge the true meaning of the data collected during this evaluation regarding knowledge dissemination and translation. While there is clearly dissemination and translation activity resulting from OGP research, the question of how much more (or less) can and should be achieved remains unanswered. Similarly, while there is clearly more health research taking place as a result of budget increases, the question of "value for money", and therefore of how much more (or less) can and should be funded, also remains unanswered.

In addition, we did not address several issues that might have furthered our understanding of the OGP's role in producing highly qualified personnel. We did not, for instance, probe whether the OGP is more or less effective than other funding tools in supporting and training new researchers. Moreover, there were no reliable data on the number of research trainees supported by the OGP. Here, as with the dissemination and translation of results, on-going data collection by CIHR would have been very useful to the evaluation.

Finally, there are still a number of areas within the issue of research excellence that could be explored in greater depth. For some areas of health research, CIHR could consider using its administrative data to generate test groups by funding type and funding level and produce reliable and sizeable comparison groups for analysis of researchers in different categories and then compare impact using bibliometric analysis. Further evaluative work could also examine how the OGP impacts the formulation and design of research.

Recommendation #1: CIHR should develop better on-going performance measurement for the research it funds. The approaches outlined above are examples; CIHR should also consider a variety of additional studies in greater detail.

Conclusion 2: The OGP is making a growing contribution to generating new knowledge through the amount and breadth of the health research funded.

The OGP contributes over $250,000,000 a year to Canada's health research funding, the single largest investment in health research in Canada. The average team size is also increasing, suggesting that the OGP is funding both more projects and more scientists per project. In addition, since the transition from MRC to CIHR, the number of peer review committees in the OGP has almost doubled, suggesting a greater scope in the type and diversity of health research projects funded under the OGP.

Conclusion 3: Researchers see the OGP as a very important component of Canada's ability to generate new health research.

The vast majority of successful OGP applicants indicated that the OGP was very important to their research programs. The two highest ranked indicators (in terms of percent indicating "high" or "very high") were becoming established in a chosen field (91%) and allowing freedom to explore ideas (87%), suggesting that the OGP plays a key role in generating and facilitating innovative ideas in health research. The qualitative data from both funded and unfunded researchers also point to the importance of the OGP as a tool for creating new health research by allowing the free exploration of ideas, providing a high level of support relative to other health research funding sources and having a reputation as a funding program of credible health research.

Conclusion 4: There are a growing number of high quality applications that cannot be funded.

Administrative data reveal that CIHR is typically unable to fund a significant number of the high quality research proposals it reviews during its OGP competitions, and the gap between the number of quality proposals and the number actually funded is continuing to grow.

As well, the survey of unfunded researchers suggests that many of these highly regarded but initially unfunded proposals were subsequently funded through other funding streams. This suggests that when the OGP selection process gave high ratings to proposals, it was indeed identifying research of high scientific quality, and that these proposals were not funded only because of CIHR budget restrictions. We cannot conclude whether or not the selection process might have erroneously excluded proposals of high scientific quality by giving them low ratings. Given the importance of the OGP to many Canadian health researchers, the growing gap between fundable proposals and those that are actually funded is cause for some concern.

Conclusion 5: There is early evidence of mechanisms of knowledge translation and dissemination.

The data collected regarding knowledge dissemination and translation point to the fact that OGP-funded research can be, and is, applied in a variety of commercial, policy, management, and clinical arenas. However, given the evaluation's purposive, non-random sampling approach, the results cannot be generalized to the entire population of OGP-funded research. Industry applications were the most frequently cited practical application among the full sample of web survey respondents, although the numbers of clinical, health services and public/population health researchers reporting applications of their research suggest that there were also a large number of applications within these fields.

Recommendation #2: CIHR should maintain the Operating Grants Program.

Perhaps not unexpectedly, respondents to the evaluation were almost unanimously supportive of the OGP. The evidence presented here suggests that the OGP remains a highly valued funding tool. Through budget increases the program has also enabled the conduct of a growing number of research projects and funded more researchers. Some would view this as an intrinsic good. Increasingly however, in an environment of scarce public dollars, more evidence is required about the results of research funded through programs such as OGP.

5.2. How successful is the OGP in improving the production of highly qualified personnel?

Conclusion 6: Researchers see the OGP as a very important component of Canada's ability to generate highly qualified personnel.

The indicators associated with the production of highly qualified personnel were highly ranked by respondents, 82% indicating that the OGP was important or very important for building research teams and 80% indicating that the OGP was important or very important for enhancing their ability to train researchers. While these indicators should be considered indirect measures of the production of highly qualified personnel, the OGP does appear to be one important source of capacity building by supporting training and development and building research teams. Qualitative data also support the important role of the OGP in generating highly qualified personnel. The data suggest that the OGP is very important to maintaining research infrastructure, such as labs, and to supporting highly qualified personnel. Further, respondents indicated that losing an operating grant can have significant career ramifications for health researchers.

Conclusion 7: The OGP is one factor in retaining and attracting researchers.

Though the OGP does not appear to be a major factor in the retention of researchers, evidence suggests that the program (and CIHR in general) exerts a slight draw on researchers who are interested in moving to or remaining in Canada. In both cases, the role of CIHR funding is one of many factors.

Conclusion 8: Researchers with an established CIHR track record are considerably more successful in obtaining OGP funding.

The analysis of administrative data revealed that researchers with little previous CIHR granting experience were significantly less likely to be successful in an OGP competition. Fifty percent of applicants with over five previous CIHR grants were successful, for example, compared to 13% of those with no previous CIHR grant. The qualitative data suggested similar conclusions, though respondents tended to view the lack of access to the OGP as being based on the age of the applicant rather than experience.

The administrative data do clearly indicate that experience, as defined by a pre-existing track record of successful CIHR funding, is highly associated with obtaining a grant through the program. The OGP selection process is designed to favour applicants with more experience and a proven track record. The low success rates for inexperienced researchers, however, may be cause for concern if there is evidence that excellent proposals are not funded simply as a result of the researcher's previous CIHR experience. This evaluation was not able to probe this issue in greater detail.

In addition, there does appear to be some disconnection between the program's goals and the expectations of the research community, particularly with regard to what they consider to be lower success rates for young researchers. Although one of its objectives is to improve the production of highly qualified personnel, the OGP is primarily a mechanism to fund research; it has no explicit mandate to support either young or inexperienced researchers. The expectations for the level and type of support available through the OGP need to be clear to the health research community, reinforcing recommendation number 3 below.

Recommendation #3: CIHR should review and then clearly communicate the goals of the OGP in the context of other CIHR funding opportunities.

During this review, it became clear that CIHR has not adequately defined and positioned the OGP in relation to the other funding opportunities available at CIHR. There appeared to be considerable confusion as to the role of the OGP and its mandate to support various types of researchers. CIHR needs to review and then clarify the goals and structure of the program to the research community in the context of other CIHR funding opportunities. This would include clarifying the OGP's role in explicitly supporting young or inexperienced investigators, and which funding mechanisms exist to support the next generation of researchers.

Recommendation #4: CIHR should ensure that its peer review practices do not unnecessarily disadvantage proposals from applicants without an established CIHR track record.

While this evaluation was not focused on peer review, many respondents referred to it as an area inviting improvement. CIHR has already engaged in several studies that, while they generally revealed a judicious and carefully monitored peer review system, also revealed differences across research communities.1 Peer review is clearly central to CIHR's operations and is therefore an area inviting further analysis to ensure that the goals of CIHR are reflected in the peer review process. CIHR should consider reviewing peer review criteria regarding excellence and established track record in order to ensure that researchers new to CIHR continue to feel that they can apply and have a reasonable chance of success. Specifically, we suggest that CIHR review the relative weight assigned to an applicant's track record in the peer review process to ensure that the OGP continues to fund high quality research regardless of the applicant's previous experience with CIHR.

1 See, for example, the "Thorngate Report": Mining the Archives: Analyses of CIHR research grant adjudications.
Warren Thorngate, Neda Faregh and Matthew Young. Carleton University. November 1, 2002.


1. Introduction

CIHR has promised transparency and accountability to Canadians for its investments of public dollars in all its health research programs and activities. As part of its commitment to transparency and accountability, CIHR undertook an evaluation of its largest single program of health research funding, the Operating Grants Program (OGP), to assess the OGP's operation, impact and achievement of objectives. This evaluation also addresses the recommendations set out by the Auditor General of Canada asking CIHR to improve its monitoring, tracking of research results and the assessment and reporting of program performance.

1a. Context

The Canadian Institutes of Health Research (CIHR) was established in June 2000 to transform Canada's federal approach to health research. Its vision is to position Canada as a world leader in the creation and use of knowledge through health research that benefits Canadians and the global community.

The objective of the Canadian Institutes of Health Research (CIHR) is:

To excel, according to internationally accepted standards of scientific excellence, in the creation of new knowledge and its translation into improved health for Canadians, more effective health services and products and a strengthened health care system.

CIHR's mandate and structure are unique in the world. The organization is structured around 13 virtual, geographically distributed Institutes, each of which supports health research in the biomedical, clinical, health systems and services, and population health domains. The Institutes are based in universities or teaching hospitals across the country, but may also have staff located in a variety of other venues. The Institutes are part of a larger national health research network that links researchers and other stakeholders across the country.

1b. Purpose of Report and Audience

The report represents the first evaluation of the OGP. Given the size and complexity of the OGP, and its importance to CIHR, the program will be undergoing a series of reviews and analyses. The report will be used to communicate preliminary results to Canadians and to provide information to CIHR management and the Governing Council on how well CIHR's largest funding program is enabling the transformation envisioned with the creation of CIHR.2 This report is intended for two main audiences. First, it is intended to inform CIHR's Governing Council (through the Standing Committee on Performance Measurement, Evaluation and Audit) in order to ensure that Governing Council is provided with information for evidence-based decision-making. Second, the report is intended for the Canadian health research community in order to demonstrate CIHR's transparency in reporting results and to provide data on the operation and impact of CIHR's largest funding program.

2 Please note that this report is a condensed version of a longer technical report. Readers seeking more detailed information on the program, the methodology or the results should consult that report (available in English only, upon request from CIHR).

1c. Scope

The evaluation was designed to examine the preliminary outcomes of the Operating Grants Program. Given the size and breadth of the program, the evaluation team decided to focus on two specific areas that could lead to a greater understanding of the program's functioning and impact within CIHR and the Canadian health research universe: the impact on the improved capacity for generating and developing new knowledge, and the impact on the improved production of highly qualified personnel.

1d. Approach to Development of Evaluation

Given the scope of the program, the evaluation took place in several phases and relied on multiple lines of evidence. An Evaluation Steering Committee was engaged throughout the design phase and the first part of the evaluation, which was conducted by an external consultant, BearingPoint. The committee consisted of 12 representatives of CIHR, the Social Sciences and Humanities Research Council, six Canadian universities, the Alberta Heritage Foundation for Medical Research, and Rand Corp. (in the UK). The first phase included a web survey of over 600 OGP recipients, one-on-one interviews with a sub-sample of 24 OGP recipients, and interviews with key informants such as Directors of other health research organizations and CIHR Institute Scientific Directors. The second phase of the evaluation was implemented by staff within CIHR's Evaluation and Analysis Unit and included an analysis of administrative data, a survey of approximately 115 unsuccessful OGP applicants and a small bibliometric study. CIHR's Governing Council's Standing Committee on Performance Measurement, Evaluation and Audit reviewed the report during various stages of its development.

The list of evaluation issues and questions was developed after reviewing the OGP Results-based Management and Accountability Framework (RMAF) to help structure the data collection activities. These questions were:

  1. What is the role and importance of the program to the research programs of Canadian health researchers?
  2. What impact do operating grants have on the development and maintenance of Canadian health research capacity?
  3. Are there any program design or operational problems that impact the operation, outputs and outcomes of the Operating Grants Program?
  4. How and to what extent are program research results disseminated and translated to potential user groups?
  5. To what extent does the program fund excellent research?
  6. Are there improvements and alternatives to the program?


2. Background

2a. Nature of the Operating Grants Program

The Operating Grants Program offers research operating grants in peer-reviewed open competitions to all eligible researchers. By open competitions, CIHR means competitions open to the most meritorious research proposals, as opposed to competitions targeted at specific, pre-determined research subjects. Prior to the development of the Operating Grants Program Evaluation Framework, CIHR did not have a stated program objective or goal that could be used to guide the evaluation. Implicitly, the goal was understood to be the conduct of original, high-quality research. There were no specific goals related to the dissemination and use of OGP-funded research results or to the development of Canadian research capability. In February 2002, the Evaluation Steering Committee, with an eye towards an effective and useful evaluation, developed the following program objective statement:

The objective of the CIHR Operating Grants program is to contribute to the creation, dissemination and use of health-related knowledge, and to help develop and maintain Canadian health research capacity, by supporting original, high quality projects proposed and conducted by individual researchers or groups of researchers, in all areas of health research.

This objective statement was validated by key internal and external stakeholders as part of the process of formalizing the framework for the evaluation. The logic model for the Operating Grants Program was developed at the same time by an external consultant and included input from members of the OGP Evaluation Steering Committee. The model is summarized below. It identifies the key activities, outputs, reach and immediate, intermediate and final outcomes for the Program.

[Logic model of the Operating Grants Program, flattened for text presentation:]

Activities: promotion, awareness building and notification; application process; peer review process; approval process.
Outputs: funding and payments; communications; distribution and posting of applications; reviewing applications and making recommendations to Governing Council; Governing Council approval.
External processes: excellent investigator-initiated research projects undertaken.
Direct / immediate outcomes: improved capacity for generating and developing new knowledge (includes knowledge and expertise).
Indirect / intermediate outcomes: knowledge gained and disseminated; knowledge translation; improved production of highly qualified personnel (HQP); improved reputation of researchers and affiliated institutions; improved broad capacity for, and expertise in, health research.
Ultimate / long-term outcomes: positive impacts on health care and improved health and economic impacts in Canada and around the world.
Reach: target users are health researchers in Canadian institutions, students and lab technicians; co-deliverers are the Governing Council, SCOGAC and the peer review committees; other beneficiaries and stakeholders include Canadians (in general and specific target groups), other countries and their citizens, other levels of government (e.g., provincial), the health care system, industry, universities, the Natural Sciences and Engineering Research Council (NSERC), the Social Sciences and Humanities Research Council (SSHRC), other federal funding agencies, non-governmental organizations (NGOs) and other researchers.
Logic models provide a useful graphic overview of a program's structure and provide focus for an evaluation, but evaluations generally examine specific areas and links in a model. Key stakeholder consultations during the early phases of the evaluation revealed that the most pressing concerns were related to the direct and indirect outcomes of the OGP program. As a result, the evaluation's design and implementation was based upon the need to collect data on whether the program was meeting its objectives through the improved capacity for generating and developing new knowledge, and improved production of highly qualified personnel.

As mentioned above, the Logic Model and specific goal statements were the first ever developed for the program, and were articulated prior to any evaluation data collection activities. These goal statements, as a result, were used as guides for the evaluation team, but should be reviewed in the context of evaluative data that has now been collected. As recommended later in this report, the evaluation team suggests that CIHR use the findings and results contained in this report to revisit the program goals, expected results and related performance measurement and evaluation documents prior to conducting further studies.

A brief description of the program is found below; more information can be found on the CIHR website: http://www.cihr-irsc.gc.ca/e/22377.html. All OGP research proposals are submitted for review to committees of scientific experts in specific health research fields. The CIHR OGP selection process is, not surprisingly, a complex undertaking. A more comprehensive discussion of the CIHR peer-review process is available at: http://www.cihr-irsc.gc.ca/e/4656.html.

Applications to OGP are evaluated by peer review carried out by committees of experts (grants committees) that span the spectrum of biomedical and health research disciplines. Within each committee, experts generally have similar expertise. New, renewal and resubmitted applications are normally reviewed at the same committee meeting, and the same criteria and cut-offs are used for all types of applications. In advance of the grant committee meetings, each application is read by one to four external reviewers who provide written reviews, as well as by two of the committee members and one reader. (Additional reviews may be conducted where more expertise is required for specific competitions or applications.) Recommendations from these committees are first considered by Council's Standing Committee on Oversight of Grants and Awards Competitions (SCOGAC), which in turn makes recommendations for funding to the Governing Council, which makes the final funding decisions.

Specific review criteria for all CIHR programs (i.e., OGP and others) are based mainly on the applicant's track record and the quality of the proposed research.

OGP grant terms are usually for two, three or five years. As a general guide, five-year grants are for highly rated research programs that are judged to be stable and consistently productive. Two or three-year grants are for projects which are less well rated or for which the committee has some concerns and believes that a fairly rapid re-evaluation is justified.

OGP applications are divided mainly between new grants and renewals. The concept of "new" and "renewal" applies to the research, not to the applicant. A renewal application continues the same line of investigation as the expiring grant; a new application is for a new line of enquiry. New and renewal applications are evaluated together, on a "level playing field", using the same rating scale, and CIHR applies the same cut-off for funding to both types of applications. A renewal application that is resubmitted after an initial, unsuccessful, grant application would be coded as a new grant in CIHR's database. CIHR has generally not funded the full amount recommended by the review committee. This practice, applied by Governing Council, enables the funding of more applications. A formula is generally used to cut funding, with cuts being pro-rated so that the highest ranked proposals receive the smallest cut.
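The report does not give CIHR's actual formula, but the pro-rating it describes can be sketched as follows. The function name, the cut percentages and the linear scaling by rank are illustrative assumptions only, not CIHR's method:

```python
def prorated_cuts(recommended, min_cut=0.05, max_cut=0.15):
    """Apply a sliding percentage cut to recommended grant budgets.

    `recommended` lists amounts ordered from the highest-ranked
    proposal to the lowest. The top-ranked proposal receives the
    smallest cut (min_cut); cuts grow linearly to max_cut for the
    lowest-ranked funded proposal. All rates here are assumptions.
    """
    n = len(recommended)
    funded = []
    for rank, amount in enumerate(recommended):
        # 0.0 for the top-ranked proposal, 1.0 for the bottom-ranked.
        step = rank / (n - 1) if n > 1 else 0.0
        cut = min_cut + (max_cut - min_cut) * step
        funded.append(round(amount * (1 - cut), 2))
    return funded

# Three equal $100,000 recommendations: the top-ranked proposal
# keeps the most, the bottom-ranked the least.
print(prorated_cuts([100_000, 100_000, 100_000]))
# [95000.0, 90000.0, 85000.0]
```

Any such scheme funds more applications from a fixed budget than paying each recommendation in full, at the cost of a modest reduction to every award.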

Success rates are discussed in greater detail in the findings sections of the report; however, success rates for applications to the program have been fairly consistent. The average success rate for the 1995 to 1998 competitions was 28.9%; by comparison, the average success rate after the transition to CIHR (2000 to 2003) rose slightly to 31.0%.

The OGP had already been a long-standing program when the Medical Research Council (MRC) separated from the National Research Council decades ago. However, the mandate of MRC had focussed mainly on biomedical and clinical research. When the MRC was transformed into the CIHR in 2000, the mandate was expanded to encompass all major forms of health research, regardless of the discipline. The CIHR concept involves a multidisciplinary approach, organized through a framework of 13 virtual institutes; each one dedicated to a specific area of focus, linking and supporting researchers pursuing common goals. The Institutes embrace four themes of research:

  1. Biomedical
  2. Clinical
  3. Health Services
  4. Social, Cultural, Environmental and Population Health

Researchers should identify their projects with one of the Institutes and one of the themes when they apply for CIHR support, a process put in place during the transition from MRC to CIHR. These data are therefore available beginning with the 2000 awards. The four themes are meant simply to operationalize and categorize the broad and expanded array of research supported by CIHR. Definitions of the four themes can be found at: http://www.cihr-irsc.gc.ca/e/3738.html


3. Methodology

3a. General Overview

The methods used have been consistent with standard practices regarding program evaluations in general, as well as those used for research-based programs in particular. All methods were developed in collaboration between evaluators and subject matter experts to ensure valid and reliable information3. It should be noted that the sheer size and scope of the OGP create a methodological challenge. The findings and conclusions must be understood in light of the fact that the OGP is the largest single source of funding for Canadian health researchers. With no other comparable program in Canada, there is likely some bias towards a positive view of the OGP, particularly from OGP recipients. In addition, some of the methodologies used in this study relied on self-reported data and may therefore be subject to self-report bias. Other lines of evidence, including data from unsuccessful OGP applicants and administrative data, are used to compensate for these limitations.

3 Copies of all data collection instruments are available with the technical report, upon request from CIHR.

A description of each method is below.

3b. Survey of OGP Applicants

Two surveys were conducted during the course of the evaluation: a web-based survey of successful OGP applicants and an e-mail survey of unsuccessful OGP applicants. Successful OGP applicants were e-mailed to notify them of the survey and were provided with a link to the survey site. The web-based survey was hosted by BearingPoint Consulting to assure respondents of independence from CIHR; CIHR's involvement was limited to distributing the e-mail notification.

A total of 2,010 OGP grant holders were notified by e-mail; 184 of these notices were undeliverable (wrong e-mail address), leaving a population of 1,826 grant holders with valid e-mail addresses. Responses totalled 629, for a response rate of about 34%.
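The response-rate arithmetic above can be reproduced directly:

```python
notified = 2010        # grant holders e-mailed
undeliverable = 184    # notices that bounced (wrong e-mail address)
responses = 629

valid_population = notified - undeliverable   # 1,826 reachable grant holders
response_rate = responses / valid_population  # about 0.34, i.e. roughly 34%
```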

The survey sample of unsuccessful OGP applicants was drawn at random from applications reviewed by five committees between the March 2000 and March 2003 Operating Grant competitions. These five committees were: Health Policy and Systems Management; Cell Physiology; Immunology and Transplantation; Experimental Medicine; and Behavioural Sciences A. A total of 392 researchers were e-mailed the survey questions between November 2003 and June 2004. One hundred and fifteen researchers responded, for a response rate of 29%. The survey questions were open-ended; researchers were instructed to respond in the body of the e-mail text and return their responses to CIHR. A consent form ensuring respondent anonymity was included in the e-mail.

The e-mail survey of unfunded researchers was designed to complement the survey of funded researchers by gathering data on the importance and potential impacts of the OGP from a group of researchers who were not currently reliant on the OGP for funding. Given the size and scope of the OGP, however, a number of researchers in the sample had been funded by the OGP at some point in their careers (although care was taken to exclude researchers who had had a successful, long-term history of funding with CIHR).

3c. Administrative Data Analysis

CIHR houses its administrative data in an Electronic Information System (EIS). The data used in the analysis consist of information entered by the research applicant onto application forms and a CV form. The evaluation team used the administrative data to examine the scope of the OGP, the characteristics associated with successful OGP applications, and an internal analysis linking 150 top health research articles in 1996, 1999 and 2002 with OGP funding data. While administrative data do not provide direct information on program outcomes, the information is essential for understanding both the scope of the program and the extent to which certain variables (e.g., characteristics of successful applications) may influence program outcomes.

There are a number of limitations to CIHR's administrative database, particularly as it relates to the analysis of the characteristics associated with success in obtaining OGP funding. It should first be noted that the success rates reported here are based on applications, not on individuals. CIHR allows researchers to re-submit applications to the OGP competition if they are initially unsuccessful. Success rates by application are therefore only one estimate of success rates, and one that tends to underestimate the level of support provided by the OGP. An analysis of two OGP peer review committees, Health Services and Policy Management and Cell Physiology, between 2000 and 2003 revealed that, while the success rate for applications was roughly 30%, the success rate for researchers in this sample was over 40%, as these individuals were eventually funded through either original or re-submitted applications.
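The gap between application-level and researcher-level success rates can be sketched as follows. The helper function and data are illustrative only, not part of CIHR's EIS: a researcher counts as successful if any of his or her applications was funded.

```python
def success_rates(applications):
    """Compute two success rates from a list of (researcher_id, funded) pairs.

    Returns (application_rate, researcher_rate). The researcher-level rate
    counts an individual as successful if ANY application was funded, which
    is why it exceeds the application-level rate when re-submissions occur.
    """
    application_rate = sum(funded for _, funded in applications) / len(applications)
    by_researcher = {}
    for rid, funded in applications:
        by_researcher[rid] = by_researcher.get(rid, False) or funded
    researcher_rate = sum(by_researcher.values()) / len(by_researcher)
    return application_rate, researcher_rate
```

For example, six applications from four researchers, two of them funded on re-submission, give an application rate of 1/3 but a researcher rate of 1/2.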

Re-submission data also pose an additional methodological challenge: it can be difficult to determine whether certain applications are merely re-submissions of earlier proposals or applications that have undergone radical revision. As a result of this constraint, CIHR reports success rate data based on each unique application. The impact of analyzing data by application, while allowing for rapid and reliable analysis, is that individual researchers are counted on a per-application basis, and may therefore be double or triple counted if they participate in more than one competition.

The main analysis uses data from the eight competitions held between March 2000 and September 2003. The data include the demographic (e.g., region of country) and other application characteristics (e.g., size of research team) for the lead applicant from 11,192 OGP applications. We did not perform significance testing when presenting full population statistics as even moderate differences can appear significant with sample sizes of that magnitude. We do use significance testing in all other sections, however. Spearman's Rho (for correlations) and Chi square (for crosstabular data) statistics are used in those sections.
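For readers unfamiliar with the two statistics, minimal pure-Python implementations are sketched below. These are illustrative, not the evaluation team's actual code: Spearman's Rho is the Pearson correlation of the ranked data, and the chi-square statistic compares observed crosstabular counts with the counts expected under independence.

```python
def ranks(xs):
    """Average 1-based ranks of a sequence, with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                      # extend over a run of tied values
        avg = (i + j) / 2 + 1           # mean of the tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation applied to ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

def chi_square(table):
    """Pearson chi-square statistic for a 2-D contingency table."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    return sum((table[i][j] - row[i] * col[j] / total) ** 2
               / (row[i] * col[j] / total)
               for i in range(len(table)) for j in range(len(table[0])))
```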

The other use of CIHR's administrative data was based on linking a list of the top 150 health research articles, as defined by the number of citations, with OGP funding data to determine what, if any, association existed between the bibliometric data and OGP funding. The data were derived from the Web of Science® database produced by Thomson ISI, which contains data from the roughly 8,500 most frequently cited scientific journals.

For each of the three periods of the study (1996, 1999 and 2002), the 50 articles (first author Canadian) receiving the highest number of citations up to 2003 were included. Readers should note, therefore, that the 1996 articles would have had six years of citations whereas the 2002 articles would have had only one. This limitation does bias the study somewhat, in that certain fields of research will not accumulate the same levels of citations, particularly in as short a period as one year.

The bibliometric search provided information on article title, address, first author and field of study. Using these data, CIHR was then able to examine its internal database to determine: a) whether the research was funded by CIHR; b) if so, whether it was funded via the OGP or another source; and c) if the researcher had a history with CIHR, whether there were any demographic characteristics of note (e.g., was he or she typically involved in large, multi-researcher grants?).
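The linkage step can be sketched as a simple author lookup. The field names and records below are hypothetical, not the actual EIS schema; the sketch simply matches each article's first author against a dictionary of funding records.

```python
def link_articles(articles, funding_records):
    """Hypothetical linkage sketch: match each top-cited article's first
    author against funding records keyed by researcher name.
    Returns a list of (article, matching record or None) pairs."""
    by_researcher = {rec["researcher"]: rec for rec in funding_records}
    return [(art, by_researcher.get(art["first_author"])) for art in articles]
```

A real linkage would need disambiguation (name variants, institutions), which is exactly the kind of difficulty the limitations below describe.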

There are several limitations to this methodology. First, the above grouping is a conservative list of health research topics. CIHR might fund research that would have been categorized under Veterinary Sciences (such as animal models of disease), for example. Given the small number of articles collected, a more conservative approach was felt to be most appropriate in order to limit the likelihood of non-health research articles appearing. Second, there is not always a clear link between an article and the source of funding, a factor which might limit the number of clear links between funding and publication. An additional limitation was linking the CIHR database with only the first author; in certain health research fields, the researcher who received the funding is listed last, not first.

3d. Interviews

The evaluation team interviewed both OGP recipients and senior research administrators such as Vice Presidents of Research in universities and CIHR Institute Scientific Directors. Data were obtained from 24 web survey respondents who were OGP recipients. The interview sample was created based on results of the web survey, in which respondents were asked to describe interesting research results and "practical" impacts stemming from their operating grants research. It should be noted, therefore, that the interview population was purposive, not random, in that the evaluation team sought out researchers who had applied their research in some practical way. Respondents were sent the interview guide ahead of time.

Data were also obtained from 22 senior research administrators and CIHR Scientific Directors who were responsible for overseeing large health research programs and could be expected to have a broad overview of health research trends and factors influencing health research, as well as an overview of the role and importance of the OGP. As with the OGP recipients, this was not a random sample; CIHR identified individuals who were thought to be very knowledgeable, specifically with regard to the main evaluation questions being addressed.


4. Findings

4.1 Administrative data

4.1a. Scope of the OGP

The Operating Grants Program is CIHR's largest single research funding program, representing roughly 46% of the total grants and awards expenditures in 2003-2004 (approximately $265,000,000 out of $576,000,000).



As Table 4.1 demonstrates, even as more programs have been introduced since the transition from the MRC, the OGP budget has continued to climb. Even taking all other programs into account, the OGP is the main source of funds for health researchers applying to CIHR.

Operating grants are traditionally viewed as "single-investigator" driven models of research. However, this view is no longer an accurate representation of a typical operating grant. Up until the CIHR/MRC transition, the average number of researchers on a grant remained constant at around 1.5 applicants per grant. During the late 1990s, there was a slight increase in the size of OGP research teams to nearly two members per team; an incremental change, but still a small number overall. Since the transition period, the average team size has doubled to four researchers per team. This growth is indicative of the general trend in all scientific fields towards a more team-based approach, but it also supports the notion that the OGP is adapting to these changes.

As noted earlier, success rates for applications to the program have been fairly consistent. The average success rate for the 1995 to 1998 competitions was 28.9%; by comparison, the average success rate after the transition to CIHR (2000 to 2003) rose very slightly to 31.0%.

Success rates, though critical, are only part of the story. As Table 4.2 demonstrates, CIHR is often unable to fund a large number of grants deemed fundable by peer review committees due to budget restrictions. Not only is there typically a large pool of well-ranked but unfunded proposals in the OGP (proposals rated 3.5 out of 5 are considered fundable research), but that pool also appears to be growing. In 1999, 18% (177 out of 986) of all proposals fell into the highly rated but unfunded category; by 2003, 33% (501 out of 1,517) of all proposals were in this category. CIHR clearly receives more high-quality applications than it can afford to fund.



While peer review is not an explicit focus of this evaluation, it is nevertheless an important component of the program's delivery. All research proposals are submitted for review to committees of scientific experts in specific health research fields. The number of committees has risen, reflecting the organization's expanded mandate to fund a broad spectrum of health research; in fact, it has nearly doubled in the last seven years, from 26 committees to 45.

Many of the new peer review committees in the OGP reflect either the change in the CIHR mandate to increase the scope of research activities (e.g., Humanities, Perspectives on Health) or a recognition by the scientific community that certain key research areas merit their own peer review bodies (e.g., Virology and Viral Pathogenesis).

4.1b. Characteristics associated with Successful OGP Applications

The OGP has no mandate beyond funding high-quality health research. However, there may be characteristics of a grant application that are associated with success in an OGP competition, and these variations in success rates may affect the nature, scope and type of research funded by the OGP. Given the large number of variables analyzed, including interactions between variables, the full table of results can be found in the technical report. Table 4.3 provides overall descriptive data on all variables used in the analysis. Subsequently, information is provided with special focus on key variables found to have a significant impact on success rates. Statistical techniques, such as partial correlations, were used to identify key factors related to success.

Table 4.3: Descriptive Data on OGP Applications: 2000-2003 (n=11,192)

Variable / Category                                 Pop. Distribution   Success Rate

Total CIHR grants held prior to application
  0 grants (n=2742)                                 25%                 13%
  1 to 2 grants (n=3042)                            28%                 27%
  3 to 4 grants (n=2327)                            21%                 35%
  5 or more grants (n=3097)                         28%                 50%

Renewal status
  New application (n=8937)                          80%                 25%
  Renewal application (n=2253)                      20%                 55%

Age
  35 and below (n=1255)                             11%                 30%
  Between 36 and 45 (n=4692)                        42%                 32%
  Between 46 and 55 (n=3704)                        33%                 32%
  56 and above (n=3097)                             13%                 29%

Fiscal size of grant application
  Smallest quartile (n=2799)                        24%                 20%
  Second quartile (n=2797)                          25%                 24%
  Third quartile (n=2798)                           25%                 32%
  Largest quartile (n=2796)                         26%                 49%

Size of institution
  Small institution (n=5704)                        51%                 30%
  Large institution (n=5486)                        49%                 33%

Total CIHR grants held concurrently while applying to OGP
  0 grants (n=4355)                                 39%                 20%
  1 grant (n=3152)                                  28%                 32%
  2 grants (n=1865)                                 17%                 40%
  3 grants (n=1818)                                 16%                 47%

Years since last degree
  Less than five years (n=3166)                     31%                 27%
  Five to fifteen years (n=4047)                    39%                 33%
  Over fifteen years (n=3101)                       30%                 33%

Region of research
  Maritimes (n=440)                                 4%                  25%
  Quebec (n=3204)                                   31%                 33%
  Ontario (n=3680)                                  36%                 32%
  Prairies (n=645)                                  6%                  21%
  Alberta (n=1238)                                  12%                 37%
  British Columbia (n=1121)                         11%                 28%

Gender
  Male (n=8148)                                     73%                 32%
  Female (n=3040)                                   27%                 28%

Size of research team
  One member (n=2278)                               21%                 33%
  2 or 3 members (n=3788)                           35%                 30%
  4 or 5 members (n=2650)                           25%                 31%
  Over 6 members (n=2103)                           19%                 32%

Language
  English (n=10736)                                 96%                 31%
  French (n=454)                                    4%                  28%

Research theme
  Biomedical (n=8030)                               72%                 34%
  Clinical (n=1467)                                 13%                 23%
  Health Services (n=738)                           7%                  24%
  Population Health (n=873)                         8%                  24%

The variables showing the largest differences in success rates were explored further to determine whether any mitigating factors were influencing them.

Table 4.4 provides partial correlation statistics between each variable of interest and success rates. A partial correlation is a measure of association between two variables when all other variables are taken into account. The partial correlation for renewal status, for example, is the association between renewal status and success rates when the impact of all the other variables is accounted for.
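For a single control variable, a partial correlation can be computed directly from pairwise Pearson correlations. The sketch below shows this first-order case only; it is illustrative, not the evaluation's code, which controlled for many variables simultaneously.

```python
def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def partial_corr(x, y, z):
    """First-order partial correlation of x and y, controlling for z:
    the association left between x and y once z's influence is removed."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / (((1 - rxz ** 2) * (1 - ryz ** 2)) ** 0.5)
```

Equivalently, this is the correlation between the residuals of x and y after each has been regressed on z.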

Table 4.4: Measures of association between predictor variables and success rates: 2000 to 2003 (4)

                                                    Partial correlation
High Association
  1. Total grants held prior to application**        0.1611
  2. Renewal status**                                0.1595
  3. Age**                                          -0.0956
  4. Fiscal size of grant application**              0.0899
Moderate Association
  5. Institution size**                              0.0311
  6. Number of grants held concurrently*            -0.0309
  7. Years since last degree*                       -0.0258
  8. Region                                          0.0084
  9. Gender                                         -0.0060
No Association
  10. Size of team                                   0.0030
  11. Language of application                       -0.0010
  12. Research theme                                 0.0001

Source: CIHR administrative data
**statistically significant at the 0.01 level
* statistically significant at the 0.05 level

4 We cross-validated these results by running a logistic regression with the same set of variables, using success rate as the dependent variable. We also ran a similar analysis (with the exception of research theme) with OGP data from 1995 to 1998 to examine whether these relationships were stable over time. The results in both cases were extremely similar to the data in Table 4.4.
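As an illustration of the cross-validation approach described in the footnote, a minimal logistic regression with success as the 0/1 dependent variable might look like the sketch below. The data, learning rate and function names are made up for illustration; the report's actual model and software are not specified.

```python
import math

def fit_logistic(X, y, lr=0.5, steps=3000):
    """Fit a logistic regression by batch gradient descent.

    X is a list of feature vectors (e.g., grants held prior to application);
    y is a list of 0/1 outcomes (funded or not). Returns (weights, bias).
    Illustrative sketch only -- no regularization or convergence checks."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(steps):
        grad_w, grad_b = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted success probability
            err = p - yi
            for j in range(d):
                grad_w[j] += err * xi[j]
            grad_b += err
        w = [wj - lr * g / n for wj, g in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b

def predict(w, b, xi):
    """Predicted probability of success for one application."""
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1.0 / (1.0 + math.exp(-z))
```

On toy data where funding success rises with the number of prior grants, the fitted coefficient on that predictor comes out positive, mirroring the direction of the partial correlation in Table 4.4.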

The purpose of the above analysis, however, is not to draw specific inferences from the coefficients themselves but to identify the variables that have a potential association with success rates. A significant variable therefore has a significant association with success rates even when all other variables are taken into account.

There appear to be three main categories of association with success rates. The first category, comprising the total number of grants held at time of application and renewal status, contains those variables with a high degree of association with success rates, based on their significance and the size of the association. The second category, comprising age, fiscal size of grant application, size of institution, number of grants held concurrently and years of experience, contains those variables with a modest association with success rates. The final category contains those variables with no apparent association with success rates: region, gender, size of team, language of application and research theme.

The short analysis in the following section centres on the variables most associated with access to the OGP: the total number of grants held at time of application and renewal status. These two variables have a clear connection to the applicant's experience with CIHR, and our analysis is therefore most concerned with the concept of researcher experience as defined by them. Tables 4.5 and 4.6 illustrate the impact of applicant experience on success rates. There is a clear variation in OGP success rates associated with the applicant's previous CIHR experience (Table 4.5): applicants with no previous CIHR grant experience had a 13% success rate, compared to 50% for those with five or more grants, suggesting that applicants with less CIHR granting experience are significantly less likely to access the OGP than experienced CIHR researchers.



Table 4.6 presents data on renewal status. While most applications to the OGP are new grant proposals, renewal applications are significantly more likely to be successful (approximately 25% for new applications, compared to 55% for renewal applications). While renewal applications are therefore a small portion of the total number of applications, they are more than twice as likely to be successful in an OGP competition.

The data generally appear to indicate that experience (in particular, whether the applicant had higher numbers of CIHR grants held prior to application and whether the grant is a renewal of an existing grant) leads to a significant variation in access to OGP funding.

There are also moderate associations between success rates and other variables: age of lead applicant, fiscal size of grant application, size of institution, number of grants held concurrently and years of experience. These variables also have a significant association with success rates, though the strength of the association does not appear to be as large as for total number of CIHR grants prior to application or renewal status. Many of even these variables are related to the researcher's experience with CIHR: for example, the success rate for applicants with no concurrent OGP grant is 20%, compared to 47% for those with three or more concurrent grants.

Our final grouping includes those variables with no significant association with success rates: region, gender, size of team, language of application and research theme. Although there were large intra-group differences for some of these variables, the differences are most likely due to the relationship between these variables and the variables related to experience with CIHR funding. There are wide variations in the distribution of research themes across renewal status: 91% of renewing applications are biomedical, compared to 68% of new applications. Biomedical researchers, in other words, are much more likely to have renewal grants, a factor associated with higher success rates, than researchers from any other theme. By contrast, renewal applications from researchers in health services and population and public health are almost non-existent. These results do not indicate that research theme is irrelevant to accessibility. Rather, they indicate that the nature of biomedical research (more likely to renew an existing project, for example) may be more aligned with factors that appear to increase accessibility to the OGP. If we consider only new applications, there is very little difference in success rates between themes. Additional quantitative analysis will continue to probe differences and similarities between the various research communities served by CIHR.

4.1c. Association between OGP funding and top health research articles

As a potential measure of the degree to which "excellent" Canadian health research has had OGP support, the evaluation team used bibliometric techniques to generate a list of the top 150 health research articles in order to determine whether any of these articles had been funded by CIHR, and in particular by the OGP. Our preliminary findings were inconclusive: the evaluation team was able to link approximately 20 highly cited articles to OGP funding, a sample considered too small to support any further substantive analysis.

The low number of articles with a link to the OGP was somewhat surprising, given that respondents frequently cited the OGP's critical role in Canada's health research infrastructure. A number of methodological problems may have contributed to an underestimation of the OGP's role in funding excellent Canadian health research, however. First, the methodology was deliberately exclusive: only the very top articles (by citation) were chosen, since a large sample size would have been inconsistent with an approach designed to identify "the very best". The original sample size of 50 articles per year, however, may have been too exclusive. Canadian health researchers publish approximately 12,000 health research articles every year, so we effectively operationalized "excellence" as less than 0.5% of the total population of published articles. Examining the top 5% would have yielded approximately 600 articles and might have provided a more substantive basis upon which to judge the OGP's role in funding excellent research.
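The arithmetic behind the 0.5% and 5% thresholds above is straightforward:

```python
annual_articles = 12000               # approx. Canadian health research articles per year
top_per_year = 50                     # articles sampled per study year

share = top_per_year / annual_articles         # ~0.004, i.e. under 0.5% of annual output
top_5_percent = round(annual_articles * 0.05)  # ~600 articles: the broader alternative
```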

Another possible explanation may lie with the methodology itself. In many biomedical fields, the last author, not the first, is the researcher who received the project funding. Using the last Canadian author as well as the first might therefore have revealed additional OGP-funded articles. In many cases, the first author may have been a post-doctoral fellow or a very junior faculty member, researchers who are generally less likely, or ineligible, to obtain an OGP grant.

There are a variety of bibliometric approaches that could yield insightful data on the outputs and impacts of OGP-funded research, although no specific technique has yet become established as standard in research and development evaluation. Though the approach used in the current evaluation did not yield useful data, further bibliometric approaches are still considered valuable and are discussed in the recommendations section.


4.2 Surveys of OGP applicants

4.2a. Web Survey of Successful OGP Applicants

The evaluation team also probed the impact of the OGP through a web survey of successful OGP applicants. Respondents were asked to rate the importance of the OGP in five key areas of their research programs; Table 4.7 provides data on these ratings. The vast majority of respondents indicated that the OGP was very important to their research programs, be it for their own establishment as researchers or the development of young investigators. The two highest-ranked indicators (in terms of percent indicating "high" or "very high") were becoming established in a chosen field (91%) and allowing freedom to explore ideas (87%), suggesting that the OGP is used as a tool for developing research programs. Even the responses to the lowest-ranked indicator, contributing to ability to access other funds (65% indicating "high" or "very high"), suggest that the OGP is important for sustaining the research programs of a significant portion of Canadian health researchers by facilitating additional research grants.

Source: Web Survey: Throughout your career, how important has the research funding from OGP been in terms of ... (e.g., enhancing your ability to train new researchers)?

The evaluation team also explored whether the high ratings in Table 4.7 simply reflect the fact that most survey respondents obtained the bulk of their total funding from the OGP. What about those for whom the OGP was a relatively small source of support? Table 4.7(a) shows the distribution of ratings according to the proportion of research funding from the OGP (the response population was divided into two groups: those who indicated that under 50% of their funding was derived from the OGP, and those who indicated 50% and above).

Table 4.7(a): Importance of OGP to investigators' research projects and programs, by proportion of total funding from OGP

                                                 Very Low   Low   Moderate   High   Very High
                                                     1       2        3        4        5
Becoming established in chosen field (n=627)**
  Funding below 50%                                  1%      4%      16%      22%      56%
  Funding above 50%                                  1%      1%       3%      16%      78%
Allowing freedom to explore ideas (n=628)**
  Funding below 50%                                  2%      5%      21%      22%      50%
  Funding above 50%                                  1%      1%       3%      16%      78%
Building research team (n=627)**
  Funding below 50%                                  2%      8%      26%      22%      41%
  Funding above 50%                                  1%      3%       9%      19%      67%
Enhancing ability to train researchers (n=626)**
  Funding below 50%                                  3%      9%      23%      24%      40%
  Funding above 50%                                  0%      5%      12%      18%      64%
Contributing to ability to access other funds (n=626)**
  Funding below 50%                                  4%     15%      24%      20%      36%
  Funding above 50%                                  1%      9%      17%      22%      44%

** significant at the 0.01 level.

Every inter-group difference in Table 4.7(a) was significant, suggesting that the more support researchers have received from the OGP, the more important the program has been to their careers. This is perhaps best regarded as a confirmatory finding, given that one would expect higher levels of financial support to be associated with greater importance of the program to researchers. An analysis of the respondents who received less than half their total funding from the OGP does suggest, however, that the OGP's importance is not strictly limited to cases of large funding support.

We also examined the role of the OGP in attracting and/or retaining researchers in Canada. We do not hypothesize that the OGP, on its own, would prevent a top researcher from leaving Canada. However, given the important role of the OGP to health research, the evaluation team felt it appropriate to probe if and to what extent the OGP may play a role in increasing the number of health researchers in Canada.

Data from the web survey indicated that about 41% of respondents had been offered a job outside Canada within the past five years, suggesting substantial opportunities for OGP recipients to move abroad if they so desired. Length of support made very little difference in this figure, suggesting that it is not only Canada's most senior researchers who are recruited by foreign institutions. Thirteen percent of respondents indicated that they had recently moved to Canada.

The majority of respondents stated that they had chosen to remain in Canada, despite offers from other countries, mainly because of quality of life, family responsibilities, or appreciation for the Canadian research climate, not because of specific granting opportunities. A number of respondents also indicated that, while they had passed up foreign opportunities in the past, they might be forced to move if the funding situation in Canada does not improve.

The web survey also examined a related issue: the extent to which the OGP influenced researchers to move to Canada from abroad5. The responses suggest that the OGP was quite important to attraction: almost 60% of researchers in this group rated its influence on moving to Canada as high or very high. Researchers who had made a conscious decision to come or return to Canada offered some insight on the drawing power of the OGP. Family or personal reasons were still the most frequently cited reasons for attraction, but the promise of increased funding opportunities was also a contributing factor in some researchers' decisions.

5 We made no distinction in the survey between immigrants and Canadian citizens who had moved back to Canada.

Web survey respondents were also asked about awareness and the perceived availability of OGP grants for researchers in their fields (i.e., in their own research pillar). Table 4.8 provides the awareness data. We provide data by research theme as researchers were asked to comment on their perceptions of the level of awareness within their own theme, not on the research community as a whole.

Table 4.8: Perceived awareness of the OGP by research theme

                                 Very Low   Low   Moderate   High   Very High
                                     1       2        3        4        5
Overall (n=621)                      0%      1%       8%      28%      63%
Biomedical                           0%      1%       5%      27%      68%
Clinical                             0%      0%      19%      39%      42%
Health services and policy           0%      3%      23%      43%      30%
Pop/Public Health                    0%      5%      24%      29%      42%

Source: Web Survey: From your perspective, how much awareness do researchers in your research stream have of opportunities to apply to the OGP?

Perceived awareness is high or very high in clinical and biomedical areas (81% and 95% respectively). The data suggest that awareness is lower in the health services and policy areas and public health, although even here very few respondents thought it was low. While the MRC provided operating grants to health services and population health researchers prior to 2000, the results appear to indicate that the two groups historically linked with OGP funding (biomedical and clinical), are also the two groups that are most aware of the program.

On average, investigators in all themes believed their chances of success were the same as, or worse than, those of researchers in other fields. Those in biomedical disciplines were the least likely to hold this opinion, but even among them, 28% believed they had a lower or much lower chance of success than scientists in other research areas. For researchers in the clinical, health services and policy, and public health disciplines, the equivalent figures were 50%, 53% and 70%, respectively. This finding contrasts with the administrative data, which suggested that biomedical researchers are generally more likely to be successful (34% success rate for 2000 to 2003) than researchers from any of the other themes (roughly 24% over the same period).

OGP recipients were also asked about the proportion of OGP-funded research that had resulted in practical applications. Table 4.9, based on data from the full web survey, shows that many researchers reported that their OGP-supported research had either already been applied in some "practical" way, or was in the process of being applied. Many others reported that they were actively considering such applications.

Table 4.9: Practical applications of OGP research (n=628)

  Applications already exist Applications in active development Possibilities being explored No applications N/A
Industry applications 19% 13% 22% 33% 14%
Clinical practice 14% 19% 33% 24% 10%
NGOs 16% 11% 16% 35% 23%
Health services and policy 7% 9% 14% 42% 28%
Government 4% 7% 13% 48% 28%

Source: Web Survey: Has any of your research attributable to OGP support led to practical applications in the following area (e.g., health services and policy)? By "attributable" we mean that OGP funds contributed to the research or skills that led to that application

There were logical differences by research theme in the link between research and application areas. Investigators in health services and policy disciplines were the most likely to have existing applications in the health services and policy field, for example, while those in the clinical disciplines were most likely to have been involved in clinical applications. Investigators in biomedical disciplines (which tend to be thought of as more "fundamental") were most likely to have been involved in industrial applications, although they were also active in developing uses in clinical practice. The relatively large number of industry applications should be seen in the context of the large number of OGP researchers in the biomedical sciences. The relatively large numbers of NGO and clinical applications can likewise be viewed as indicating strong areas of research application. Without any data on dissemination activities prior to this data collection, however, Table 4.9 might best be viewed as a set of benchmark indicators that CIHR can use as a reference point for future data collection.

4.2b. Survey of Unsuccessful OGP applicants

We asked respondents to this e-mail-based survey to identify other funding sources they used to conduct research that had not been rated highly enough by the OGP peer review committees to receive OGP funding. The OGP peer review system is designed to judge the level of scientific excellence of each application. If peer review is working effectively, the committee ratings of excellence should be a predictor of eventual funding success, even if that funding comes from an alternate source. To judge whether the committee ratings of excellence were indeed valid, respondents were divided into two categories: the "high" (i.e., "excellent") group, representing researchers whose proposals were fundable but went unfunded due to budget restrictions, and the "low" (i.e., not excellent) group, who had submitted proposals that CIHR had not considered of fundable quality. It should be noted that CIHR considers a rating of 3.5 on a 5-point scale to be fundable by CIHR, and a rating of 3.0 to indicate generally fundable research. As our focus is on CIHR's selection system, a rating of 3.5 was chosen as the cut-off for fundable research.

Table 4.10: Success Rates of individuals not funded by CIHR in obtaining funding through other competitions/organizations

  Successful Unsuccessful Not yet known Did not Reapply Total
High 38 (69%) 5 (9%) 7 (13%) 5 (9%) 55 (100%)
Low 24 (40%) 15 (25%) 5 (8%) 16 (27%) 60 (100%)

Source: survey of unfunded researchers: Did you seek funding from another funding agency (public or private) for this research? If so, when? Where? Were you successful?
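The percentages in Table 4.10 follow directly from the raw counts. As an illustrative arithmetic check (a minimal sketch, not part of the original analysis):

```python
# Reproduce the percentages in Table 4.10 from the raw counts.
# Column order: Successful, Unsuccessful, Not yet known, Did not reapply.
rows = {
    "High": [38, 5, 7, 5],
    "Low": [24, 15, 5, 16],
}

for group, counts in rows.items():
    total = sum(counts)  # row total: 55 for "High", 60 for "Low"
    pcts = [round(100 * c / total) for c in counts]
    print(group, total, pcts)
# High 55 [69, 9, 13, 9]
# Low 60 [40, 25, 8, 27]
```

The rounded values match the table, which is why the "High" row sums to 100% while rounding can make other such rows sum to 99% or 101%.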

As can be seen in Table 4.10, researchers in the "high" group generally appeared to be able to find alternate sources of funding. A significant majority of "excellent" respondents in this sample (69%) were successful in finding additional sources to conduct their research, and only 9% were unable to find additional funding. Most researchers sought funding from the major provincial health research funding agencies (such as Quebec's FRSQ), from other federal funding agencies such as the Natural Sciences and Engineering Research Council (NSERC) or the Social Sciences and Humanities Research Council (SSHRC), or from the National Institutes of Health (NIH) in the United States, or they reapplied to CIHR (in strategic competitions).

Those in the "low" group, by contrast, were considerably less likely to obtain funding elsewhere, or they simply did not reapply. Forty percent of the "low" group were able to obtain additional funds, while 25% were unsuccessful elsewhere. In addition, the alternate funding programs for the 24 successful researchers in this group were considerably smaller. One researcher, for example, was funded through a small Health Canada program that is no longer operating. Another was funded directly through a hospital. The following quotes from researchers ranked below the OGP fundable cut-off illustrate the trend towards smaller and less productive grants.

I received a pilot grant from the … Society of Canada. It was small compared to an OGP so obviously I did not continue to produce in the same way.

I did the administrative work that would have been done by someone else if I had received proper research funding.

The foundation … is a grant that I had used periodically, but the maximum … is $15,000 per year. How can you run a laboratory?

The evaluation team also compared the open-ended comments from the web survey of successful OGP applicants with those from researchers who were unsuccessful in their OGP application. An analysis of the responses from funded and unfunded researchers suggested three general roles for the OGP in the current funding climate: supporting individual initiative, providing levels of support not available from other funding sources, and maintaining its role as a "prestige" grant. The categorization of the three roles was based on an analysis of responses from the survey of unfunded researchers: approximately 40% indicated that the OGP supported individual initiative, approximately 30% indicated that it provided support not available elsewhere, and approximately 10% indicated that it was a prestigious grant. The quotes presented were judged by evaluation team members to be most representative of the "typical" response.

Supporting individual initiative

The OGP's role in supporting individual initiative (i.e., allowing researchers to explore research ideas independently) was frequently mentioned. This finding is perhaps not surprising, given that the OGP is Canada's largest open competition in health research. The comments from respondents nevertheless suggest that the OGP's role as a source of independent inquiry is valued by health researchers. Even in the current funding climate of strategic opportunities and disease-specific organizations, many respondents claimed that their research could only be supported by the OGP. The following comments were made by both funded and unfunded researchers.

[the OGP] is not subject to fashion, fads, bureaucrats' ideas of what should be done, or political ideas.

This fundamental research did not suit the various foundations that I knew about. Nor was it focused on a particular disease and therefore on a particular institute.

Better support than other funding sources

Respondents also frequently cited the role of the OGP as the largest single source of health research funding. While researchers may obtain additional funding from other sources, the OGP appeared to be the main source of funds for sustaining and maintaining research projects. Again, this conclusion may seem somewhat commonsensical given that the OGP is such a large funding program and the average grant size is over $100,000 per year. Even with other sources of support available for health research6, respondents nevertheless singled out the level of support offered by the OGP as key to their health research. The following comments are from both funded and unfunded researchers.

6 CIHR itself has a number of other programs, and there are some significant provincial programs, (e.g., the Alberta Heritage Foundation for Medical Research, the Michael Smith Foundation for Health Research, Le fonds de la recherche en santé du Québec), national targeted programs (e.g., Genome Canada), and disease-specific foundations and programs (e.g., Heart and Stroke Foundation of Canada, Canadian Breast Cancer Research Initiative).

The level of CIHR funding allows a project of this scope, which is difficult with other organizations.

I applied to CIHR for an operating grant because, as a new investigator, my start-up funds were inadequate to provide student salaries or to finance necessary reagents. Further, research foundations such as NSERC do not offer grants sufficiently large enough to cover the research operating costs … There are other foundations out there, but these are subject to many conditions (often non-renewable). Therefore CIHR is the only granting agency that I can rely on for operation.

Many respondents indicated that their research program would suffer a significant loss if they were unsuccessful with their OGP application. The extent of the expected loss varied from the loss of a few students, to the termination of a particular aspect of a project, to total "collapse" of the research program. Not all respondents, however, indicated that the OGP was crucial to their research programs. Self-identified population and public health researchers, in particular, indicated that operating grants were helpful but not crucial. One of these respondents explained that he does not have to maintain a lab as his work is more project-specific. CIHR is just one of many organizations that he applies to for project-specific grants. In his view, those that are doing more basic research and those who are trying to maintain a lab are much more reliant on the OGP.

Highly credible

The credibility of OGP funded research is as much a quality of the OGP as it is a role. The role of the OGP as a source of highly credible research should not be discounted, however. Comments regarding the prestige of the OGP were made by unfunded researchers, a population that is perhaps most sensitive to the prestige of an OGP grant as they also seek funding from other, less well-regarded or well-known, sources.

A CIHR grant is giving not only the money needed for research but also the prestige of a well designed, scientific based study.

It also has more prestige than any other grant … available to me, with the possible exception of NIH.

Respondents indicated that an OGP grant conveyed a level of legitimacy on their research projects due to the OGP's very high standards of acceptance. Several respondents noted that the credibility of OGP grants is important as it facilitates securing additional funds from other granting bodies. Conversely, the prestige of the OGP does hold a potential cost for health researchers. A number of respondents indicated that not having or losing an OGP grant could have significant professional repercussions, including job loss.


4.3 Interviews

The interviews with both OGP applicants and senior research administrators probed challenges and possible OGP program improvements. Data from the interviews with successful OGP applicants included further probing of knowledge translation and dissemination issues.

During the one-on-one interviews, OGP applicants were asked how often they interacted with the potential users of their research results or expertise. Researchers reported that they engaged in such translation activities quite regularly. Knowledge transfer and collaborative activities were quite common: 33% of respondents reported collaborating with non-academic users very often, and 29% reported helping create new linkages and collaborations very often. Translation activities related to business were considerably less frequent: only 4% indicated very often. It may be that translation activities related to business development are more time consuming than collaborative activities, although even collaborations with non-academic users are likely to involve a considerable amount of time and resources.

The evaluation did not address the types of supports that might be required to facilitate knowledge translation within universities and research institutions. Data from respondents suggest, however, that collaborative activities involving non-academic users are somewhat easier to implement than commercial activities. Even taking into account that the sample presented here is small and non-random, the findings do perhaps point to areas that CIHR can target if it wishes to pursue additional translation activities.

Biomedical investigators responded that they tended to use what might be termed "traditional" technology transfer mechanisms, such as seeking venture capital, many of which naturally take considerable time to come to fruition:

I am the founder and president of an international society of artificial cell and blood substitutes … there are lots of companies using it (OGP research findings) but not so many patents.

Based on results from the lab we applied to get some venture capital money.
Created a company and got some spin off money.

Researchers from the other research themes, while not engaged in business applications, also noted the length of time required to translate research results. The clinical respondents believed that there is a shorter time-frame for the application of their results than in biomedical fields. One respondent, for example, is currently conducting workshops with practitioners in order to expedite awareness of her findings. Another estimated that realization in the clinical field takes about three years. Health services and policy researchers noted that their studies are often conducted with the beneficiaries of the findings, and that dialogue surrounding improvements and application of findings is therefore an ongoing process. One population and public health respondent noted that many public health studies are applied primarily through activities to increase awareness, and estimated that it takes three to five years to create awareness of a new issue. The general finding, across all research themes, is that the application and dissemination of research requires a substantial expenditure of time and energy.

The evaluation team grouped the suggestions for improvements and alternatives into three main categories: increased funding, peer review, and support for young and inexperienced researchers.

It is a normal occurrence during evaluations of research granting programs to hear requests for more funding, both for the program overall and per grant. In the case of the OGP, however, there is reason to believe that these requests may be more substantive than usual. As was demonstrated in the program overview, the OGP does not have the funds to support all fundable research projects (and, indeed, many of the fundable applications need to find funding outside the OGP). The data on retention suggest that there is indeed substantial pressure for scientists to move abroad, especially to the US. The following quotes are indicative of these suggestions for improvements:

In one 13 month period a few years ago I wrote 16 grant applications to various agencies. While funding may have increased somewhat (although with questionable distribution), because grant sizes have not appreciably increased, it requires even more effort to access additional funds. Because of dispersion of research focus needed to obtain additional funds (i.e., to minimize perceived overlap), my own average grant quality has substantially decreased, as has that of many of my colleagues. CIHR grants should be consolidated with an average size of $250-300K per year with automatic 5 year terms so that two grants would be sufficient to run a moderate to large group.

We definitely need more resources going to OGP. It is an extremely important program . . . The current success rates are too low for sustaining an active research community.

Aside from increasing the OGP budget, a number of alternative approaches for improving funding were mentioned by researchers, including asking CIHR to take steps to ensure sustainable funding and to limit the number of OGP grants held by any one individual researcher.7

7 CIHR recently instituted a policy restricting lead applicants to one new application per competition.

This is not new to you, but given the current success rates of applications, it is very difficult to maintain good continuity in one's research program.

I would suggest that CIHR look at the number of grants an applicant holds. If the applicant has only one grant proposal and if the grant is not funded, the impact of closing one's lab and having to start over again is significant. Having said this, it stands to reason that the grant must be good but possibly just below the cutoff for funding (e.g., 3.9-4.0). Despite what some investigators may feel, my thoughts are that applicants holding 2-5 grants from different agencies are not impacted to any great extent.

Peer review was also mentioned, by both funded and unfunded researchers, as an area for improvement. The nature and quality of peer review in CIHR was generally perceived to be very good and to constitute a strength of the agency. Many respondents, however, indicated that it is difficult to find a representative group of independent experts who can cover all types of applications within Canada's small pool of investigators, particularly within new research areas. Unfunded researchers were particularly sensitive to this issue. The alternatives suggested by respondents generally focused on increasing the number and diversity of OGP peer reviewers. The following quotes are from both funded and unfunded researchers.

The comments from reviewers … most of whom work on a research subject far removed from the applicant's subject, are totally useless and generally unkind and destructive for no valid reason. This is a very serious problem that has persisted for many years.

The comments from the committee were superficial, lacking in expertise and not focussed on the.relevance of the proposed research. The committee did not take into consideration the external reviews to the extent that they should, nor did they attempt to reconcile the opinion of the committee reviewers with those of the externals in formulating the final score.

Support for young and inexperienced researchers was a frequent theme among respondents, who perceived success rates for young and less experienced researchers as too low, and the amount of funding as not high enough to set up a laboratory or a sustained research program. Support for young and inexperienced investigators has been examined in previous sections and, while the data do support the fact that less experienced researchers are less likely to obtain OGP support, the concerns raised by respondents may represent a misunderstanding of the OGP's role as a mechanism for supporting young researchers. We nevertheless present the comments from respondents here, since many respondents commented that better mechanisms are needed to ensure support for rising "stars." The following quotes represent comments regarding improving support for young and less experienced investigators.

I believe that the CIHR, like other organizations, should give promising young researchers a chance and, primarily for them, base their judgement on the applicant's file, his/her skills and the quality of the project.... The upshot of all this has been that the effective launch of this project has been delayed for at least two years: impossible to hire staff, impossible to invest sufficient amounts of money to move things ahead, etc.

The recurrent uncertainty about the levels of funding in the near future are not conducive to the recruitment of new Assistant Professors (who currently tend to prefer positions in the USA), or encouraging to those that are already in the system.

Several possible solutions were mentioned, including a separate application stream for younger researchers (for which, it was noted, additional funding would ideally be required), forewarning review panels when an application from a new investigator is being reviewed, and asking more senior researchers to mentor younger ones during the application process8 (several universities already provide mentoring assistance for more junior applicants).

8 The NIH uses the latter two methods.


5. Key Conclusions and Recommendations

The specific design and implementation of the evaluation was centered on two key outcome areas identified in the program logic model: the impact on the improved capacity for generating and developing new knowledge and the impact on improved production of highly qualified personnel. We summarize our conclusions using the two main outcome areas examined in the evaluation. We also include management recommendations related to each section where applicable.

5.1. How successful is the OGP in improving capacity for generating and developing new knowledge?

Conclusion 1: Additional data collection strategies are required to further review the OGP and its role in Canadian health research.

Additional data collection strategies should be put in place to collect appropriate data at the end of funded research projects, in order to better measure outputs, and again several years after completion of research, in order to assess the overall impacts of funded research. CIHR should also collect appropriate information on the type and level of dissemination and translation strategies used by OGP researchers. Without pre-existing benchmarks or a valid impact factor, it is difficult to judge the true meaning of the data collected during this evaluation regarding knowledge dissemination and translation. While there is clearly dissemination and translation activity resulting from OGP research, the question of how much more (or less) can and should be achieved remains unanswered. Similarly, while there is clearly more health research taking place as a result of budget increases, the question of how much more (or less) can and should be funded also remains unanswered.

In addition, we did not address several issues that might have furthered our understanding of the OGP's role in producing highly qualified personnel. We did not, for instance, probe whether the OGP is more or less effective than other funding tools in supporting and training new researchers. Moreover, there were no reliable data on the number of research trainees supported by the OGP. Here, as with dissemination and translation of results, on-going data collection from CIHR would have been very useful to the evaluation.

Finally, there are still a number of areas within the issue of research excellence that could be explored in greater depth. For some areas of health research, CIHR could consider using its administrative data to generate reliable and sizeable comparison groups of researchers by funding type and funding level, and then compare their impact using bibliometric analysis. Further evaluative work could also examine how the OGP affects the formulation and design of research.

Recommendation #1: CIHR should develop better on-going performance measurement for the research it funds. The preceding paragraphs offer examples, though CIHR should consider a variety of additional studies in greater detail.

Conclusion 2: The OGP is making a growing contribution to generating new knowledge through the amount and breadth of the health research funded.

The OGP contributes over $250,000,000 a year to Canada's health research funding, the single largest investment in health research in Canada. The average team size is also increasing, suggesting that the OGP is funding both more projects and more scientists per project. In addition, since the transition from MRC to CIHR, the number of peer review committees in the OGP has almost doubled, suggesting a greater scope in the type and diversity of health research projects funded under the OGP.

Conclusion 3: Researchers see the OGP as a very important component of Canada's ability to generate new health research.

The vast majority of successful OGP applicants indicated that the OGP was very important to their research programs. The two highest ranked indicators (in terms of percent indicating "high" or "very high") were becoming established in a chosen field (91%) and allowing freedom to explore ideas (87%), suggesting that the OGP plays a key role in generating and facilitating innovative ideas in health research. The qualitative data from both funded and unfunded researchers also point to the importance of the OGP as a tool for creating new health research by allowing the free exploration of ideas, providing a high level of support relative to other health research funding sources, and having a reputation as a funding program of credible health research.

Conclusion 4: There is a growing number of high quality applications that cannot be funded.

Administrative data reveal that CIHR is typically unable to fund a significant number of the high quality research proposals it reviews during its OGP competitions, and the gap between the number of quality proposals and the number actually funded is continuing to grow.

As well, the survey of unfunded researchers suggests that many of these highly regarded but unfunded proposals were funded via another funding stream. This suggests that when the OGP selection process gave high ratings to proposals, it was indeed identifying research proposals that were of high scientific quality, and that these were subsequently not funded only due to CIHR budget restrictions. We cannot conclude whether or not the selection process might have erroneously excluded proposals that were of high scientific quality by giving them low ratings. Given the importance of the OGP to many Canadian health researchers, the growing gap between fundable proposals and those that are actually funded is cause for some concern.

Conclusion 5: There is early evidence of mechanisms of knowledge translation and dissemination.

The results of the data collected regarding knowledge dissemination and translation do point to the fact that OGP-funded research can be, and is being, applied in a variety of commercial, policy, management and clinical arenas. However, given the evaluation approach of purposive, non-random sampling, the results cannot be generalized to the entire population of OGP-funded research. Industry applications were the most frequently cited practical application among the full sample of web survey respondents, although the numbers of clinical, health services and public/population health researchers reporting applications of their research suggest that there were also many applications within those research fields.

Recommendation #2: CIHR should maintain the Operating Grants Program.

Perhaps not unexpectedly, respondents to the evaluation were almost unanimously supportive of the OGP. The evidence presented here suggests that the OGP remains a highly valued funding tool. Through budget increases, the program has also enabled a growing number of research projects and funded more researchers. Some would view this as an intrinsic good. Increasingly, however, in an environment of scarce public dollars, more evidence is required about the results of research funded through programs such as the OGP.


5.2. How successful is the OGP in improving the production of highly qualified personnel?

Conclusion 6: Researchers see the OGP as a very important component of Canada's ability to generate highly qualified personnel.

The indicators associated with the production of highly qualified personnel were highly ranked by respondents, with 82% indicating that the OGP was important or very important for building research teams and 80% indicating that it was important or very important for enhancing their ability to train researchers. While these indicators should be considered indirect measures of the production of highly qualified personnel, the OGP does appear to be one important source of capacity building, supporting training and development and the building of research teams. Qualitative data also support the important role of the OGP in generating highly qualified personnel. The data suggest that the OGP is very important to maintaining research infrastructure, such as labs, and to supporting highly qualified personnel. Further, respondents indicated that losing an operating grant can have significant career ramifications for health researchers.

Conclusion 7: The OGP is one factor in retaining and attracting researchers.

Though the OGP does not appear to be a major factor in the retention of researchers, evidence suggests that the program (and CIHR in general) exerts a slight draw on researchers who are interested in moving to or remaining in Canada. In both cases, the role of CIHR funding is one of many factors.

Conclusion 8: Researchers with an established CIHR track record are considerably more successful in obtaining OGP funding.

The analysis of administrative data revealed that researchers with little previous CIHR granting experience were significantly less likely to be successful in an OGP competition. Fifty percent of applicants with over five previous CIHR grants were successful, for example, compared to 13% with no previous CIHR grant. The qualitative data suggested similar conclusions, though respondents tended to view the lack of access to the OGP as being based on the age of the applicant instead of experience.

The administrative data do clearly indicate that experience, as defined by a pre-existing track record of successful CIHR funding, is highly associated with obtaining a grant through the program. The OGP selection process is designed to favour applicants with more experience and a proven track record. The low success rates for inexperienced researchers, however, may be cause for concern if there is evidence that excellent proposals are not funded simply as a result of the researcher's lack of previous CIHR experience. This evaluation was not able to probe this issue in greater detail.

In addition, there does appear to be some disconnect between the program's goals and the expectations of the research community, particularly with regard to what respondents consider to be lower success rates for young researchers. Although one of its objectives is to improve the production of highly qualified personnel, the OGP is primarily a mechanism to fund research, and it has no explicit mandate to support either young or inexperienced researchers. The expectations for the level and type of support available through the OGP need to be clear to the health research community, reinforcing Recommendation #3 below.

Recommendation #3: CIHR should review and then clearly communicate the goals of the OGP in the context of other CIHR funding opportunities.

During this review, it became clear that CIHR has not adequately defined and positioned the OGP in relation to the other funding opportunities available at CIHR. There appeared to be considerable confusion as to the role of the OGP and its mandate to support various types of researchers. CIHR needs to review and then clarify the goals and structure of the program to the research community in the context of other CIHR funding opportunities. This would include clarifying the OGP's role in explicitly supporting young or inexperienced investigators, and which funding mechanisms exist to support the next generation of researchers.

Recommendation #4: CIHR should ensure that its peer review practices do not unnecessarily disadvantage proposals from applicants without an established CIHR track record.

While this evaluation was not focused on peer review, many respondents identified it as an area for improvement. CIHR has already undertaken several studies that, while generally revealing a judicious and carefully monitored peer review system, also revealed differences across research communities.9 Peer review is central to CIHR's operations and therefore warrants further analysis to ensure that the goals of CIHR are reflected in the peer review process. CIHR should consider reviewing peer review criteria regarding excellence and established track record to ensure that researchers new to CIHR continue to feel that they can apply with a reasonable chance of success. Specifically, we suggest that CIHR review the relative weight assigned to an applicant's track record in the peer review process to ensure that the OGP continues to fund high-quality research regardless of the applicant's previous experience with CIHR.

9 See, for example, the "Thorngate Report": Mining the Archives: Analyses of CIHR research grant adjudications. Warren Thorngate, Neda Faregh and Matthew Young. Carleton University. November 1, 2002.


Management Response and Action Plan

Recommendation #1: CIHR should develop better ongoing performance measurement.

CIHR management agrees with this recommendation. The Operating Grants Program is CIHR's largest and most important funding tool; it will therefore be the subject of ongoing quantitative and qualitative performance measurement and evaluation. CIHR will also continue to develop better measurement tools to assess the impact of programs such as the OGP.

In the Federal Government's budget of March 2004, CIHR and the other granting agencies were asked to develop a more comprehensive system to track, evaluate and report on the outputs of the research they fund. This will improve accountability for federal support of university research and contribute to the high standards of excellence researchers strive for. The Evaluation and Analysis Unit of the Corporate Affairs Portfolio will work with the Research Portfolio to develop a strategy for meeting this accountability requirement efficiently and effectively, imposing minimal additional administrative burden on researchers. The strategy, including its resource and other implementation requirements, will be developed by early 2005.

The Evaluation and Analysis Unit will also work with the Research Portfolio to update the logic model and performance measurement strategy for the OGP in the context of overall CIHR performance measurement, which is being updated through the Federal Government's new Management, Resources and Results Structure (MRRS). The MRRS performance measurement strategy, from which performance measurement for programs such as the OGP will flow, will be completed by March 2005.

Finally, how to measure research excellence and the return on investment in health research will continue to be important questions for CIHR. The Evaluation and Analysis Unit will continue to liaise with other science-based agencies to monitor best practices and, where possible, incorporate them into our performance measurement activities.

Recommendation #2: CIHR should maintain the Operating Grants Program.

CIHR Management agrees with this recommendation. The program will be continued; decisions regarding the level of support are the responsibility of senior decision-making bodies at CIHR, including the Research Planning and Priorities Committee (RPPC) and, ultimately, Governing Council (GC).

When faced with a potential budget shortfall in planning for fiscal year 2004-2005, ensuring an adequate number, value and success rate for the in-year Operating Grant competitions was paramount in determining budget allocations. However, the long-term projection for the number and value of operating grants, in light of alternative forms of support for the direct costs of research projects (such as the new Team Grants program), has not been adequately discussed at either the governance or executive levels within CIHR. A framework for this discussion was established at the September 2004 meeting of the Research Planning and Priorities Committee, which agreed to adopt a 'simpler, better' approach to CIHR's programs of support for research, focusing attention on a smaller number of key programs. In addition, GC has requested a three-year budgetary planning cycle, which will provide better information about medium-term support for the program.

Recommendation #3: CIHR should review and then clearly communicate the goals of the OGP in the context of other CIHR funding opportunities.

CIHR management agrees with this recommendation. With support from the Evaluation and Analysis Unit, CIHR needs to review the goals of the OGP and how well they align with CIHR's objectives. In communicating with the research community, CIHR needs to emphasize that many other types of funding opportunities are now available, particularly when addressing those who have not previously applied to CIHR and for whom the probability of success in the OGP is lower. The Research Portfolio will take the lead in improving communications with the research community. The Portfolio will continue to work with the CIHR Communications Branch, and recently hired its own officer responsible for communications with the research community about CIHR's funding programs. This is an ongoing requirement, but improved communications vehicles, such as an e-mail newsletter to researchers, are planned for April 2005. Increased communications efforts of this nature may require additional resources and will be submitted to the CIHR budgeting and prioritization exercises for 2005-2006.

Management wishes to add that it remains important that the OGP be accessible to all health researchers with excellent proposals, irrespective of their past success in the program's competitions. The findings of the OGP evaluation study will be included in a review of evaluation criteria and rating scales now in progress by the Sub-committee on Monitoring and Innovation in Peer Review (SMIPR), which is addressing the appropriate weight to be given to "track record" in the overall evaluation of applications. This work should be complete by the end of fiscal year 2004-2005 and can be accomplished within existing resources.

Recommendation #4: CIHR should ensure that its peer review practices do not unnecessarily disadvantage proposals from applicants without an established CIHR track record.

CIHR management agrees with this recommendation and has already created SMIPR (see above), a joint management/peer reviewer/researcher group, to review this and other potential improvements to the peer review system. Major policy changes dealing with funding allocation methodology, evaluation criteria and rating scales are under consideration, with recommendations to go to GC before the end of fiscal year 2004-2005. The Committee's work can be completed with existing resources, although additional resources may be required depending on the nature of its recommendations.


