Canadian Institutes of Health Research

First Report on Peer Review Innovations

June 2005


Table of contents

Section 1: Introduction
1.1 Reporting on Peer Review Innovations
1.2 Setting the Stage for Systematic Reporting

Section 2: The Scope and Magnitude of the CIHR Peer Review System
2.1 Foundation of CIHR's reputation of excellence
2.2 Peer Review Facts and Figures

Section 3: Taking Stock of CIHR Peer Review Innovations
3.1 Enhancing the Effectiveness of CIHR's Peer Review System
3.1.1 Adapting the portfolio of peer review committees
3.1.2 Involving the Canadian public in CIHR peer review committees
3.1.3 Evaluative studies of CIHR's peer review system
3.1.4 Evaluation Study of the Operating Grants Program
3.1.5 Improving Awards Selection Processes
3.1.6 Development of CIHR's Relevance Review Process
3.1.7 Continued implementation of the ResearchNet initiative

Section 4: Strengthening CIHR's Peer Review System


Acknowledgments

The Canadian Institutes of Health Research (CIHR) wishes to acknowledge and thank the individuals who contributed to the development of this report. In particular, CIHR would like to express its appreciation to the members of the Subcommittee on Monitoring and Innovation in Peer Review, who graciously provided their expertise and support throughout the development process. We would also like to recognize the contributions of the various CIHR units that provided materials presented in this report: Evaluation and Analysis; Information Technology Management Services; Knowledge Creation Programs; Research Capacity Development; Research Planning and Resourcing; and Special Studies.

How to read this report

This report is intended as a reference tool and is not designed to be read from cover to cover. Some sections will be more useful for some purposes than for others.

We do not consider this to be an exhaustive document. We have tried to incorporate the relevant information that will help all audiences understand the ways in which CIHR is seeking to improve its most important business process, the peer review system. We would appreciate receiving any feedback regarding this report, as we strive to make a document that is useful to you.


Section 1: Introduction


1.1 Reporting on Peer Review Innovations

At its March 16-17, 2005 meeting, the Standing Committee for Oversight of Grants and Awards Competitions (SCOGAC) supported the work plan of its subcommittee, the Subcommittee on Monitoring and Innovation in Peer Review (SMIPR).

One element of the work plan consisted of designing an annual report on peer review innovations. The annual report is intended to take stock of improvements to CIHR's peer review system and to inform plans for improvement activities in future years. It will cover innovations large and small, whether inspired by Governing Council, peer reviewers, or CIHR staff, or arising from the implementation of process improvements such as e-business. The report will inform Governing Council, management, and CIHR staff discussions on peer review innovations. In addition, systematic reporting will contribute to the development of CIHR reports such as the Annual Report, the Departmental Performance Report and the Report on Plans & Priorities.

SMIPR

The mandate of SMIPR is to advise SCOGAC and inform the Research Planning and Priorities Committee (RPPC) with respect to continuous quality improvement in peer review at CIHR, with the goal of ensuring that excellence, equity of opportunity, and due diligence in the use of taxpayer funds are the overarching principles guiding CIHR's allocation of competitive grants and awards funding.

The roles and responsibilities of SMIPR are as follows:


1.2 Setting the Stage for Systematic Reporting

In CIHR's Blueprint 2007, there is an explicit commitment to account to Canadians on performance. CIHR has developed a framework to facilitate the ongoing evaluation and reporting on the outcomes of the organization and its programs according to key strategic outcomes.

Since the peer review process is the backbone upon which CIHR's reputation for excellence rests, SMIPR has taken up the mandate to guide the development of a systematic and explicit approach to monitoring the performance of the peer review system. A commitment was made to develop and implement a systematic reporting regime with three types of reports:

During the course of fiscal year 2005/2006, CIHR will proceed with the design and development phases for its peer review reporting regime. The work will consist of:

The challenge in developing a performance measurement regime is keeping it simple and manageable. Doing so requires resisting the urge to over-complicate: having a lot of data does not mean having a lot of meaningful performance measurement information. Success in this domain means not being swamped with information that has little or no value for assessing the wisdom of process changes and making appropriate decisions.

The foundation has been established to start reporting on CIHR's most critical business process, the peer review system. This report on peer review innovations is the first step in meeting our commitment.

Before looking back at the peer review process improvement activities which took place in 2004/2005, let us first appreciate the importance of CIHR's peer review system.


Section 2: The Scope and Magnitude of the CIHR Peer Review System


2.1 Foundation of CIHR's reputation of excellence

The process of peer review consists of a system of expert review of research/scientific work by peers. It is often defined as the process of reviewing and deciding the merit of research proposals submitted for funding or the merit of scientific manuscripts submitted for publication. The peer review process is more than a mechanism for allocating funds and judgment of merit. It is a fundamental component of the norms of higher education. It is also a process by which faculty are evaluated and promoted. It is an essential element in the scientific process by which knowledge is developed and judged to be accurate. It is the mechanism of scientific self-regulation, a method used to ensure appropriateness of research procedures, and to evaluate the scientific merit and plausibility of research results. Proponents of peer review assert that the process of peer review enhances the progress of science and is a key mechanism through which the best science is attained.

Each year, CIHR receives thousands of applications. Each application is evaluated by a peer-review committee composed of volunteer reviewers, who write detailed reports on its strengths and weaknesses. Through a process of consensus, the committee arrives at a numerical rating for each proposal by averaging the individual reviewers' ratings. As a result, only applications that meet internationally accepted standards of excellence are funded.
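As a rough illustration of the consensus-scoring step described above, the sketch below averages individual reviewers' ratings and ranks proposals by the result. The proposal names, ratings, and scale are hypothetical, and the real process includes committee discussion and budget cut-offs that are not modelled here.

```python
def committee_rating(individual_ratings):
    # Average the individual reviewers' ratings into a single
    # committee score for one proposal (illustrative values only).
    return sum(individual_ratings) / len(individual_ratings)

def rank_proposals(scored):
    # Order (name, rating) pairs by rating, highest first; funding
    # then flows down this list until the budget is exhausted.
    return sorted(scored, key=lambda item: item[1], reverse=True)

# Hypothetical proposals, each with three reviewers' ratings
scores = {
    "proposal-A": committee_rating([4.3, 4.5, 4.4]),
    "proposal-B": committee_rating([3.1, 3.4, 3.0]),
}
ranking = rank_proposals(scores.items())
```

In practice the committee's discussion can move a score away from the simple average; the sketch captures only the arithmetic.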

Peer reviewers are accountable to the wider research community and ultimately, to the Canadian public. They help to ensure that CIHR is accountable to the Government of Canada for the money it invests in health research. CIHR recognizes the dedication of over 2,575 experts who volunteered their time on one of CIHR's peer-review committees over the past year and thanks them for their continued contribution to improving the health of Canadians. CIHR also appreciates the efforts of thousands of external referees who submitted written reports for consideration by the peer-review committees.

The origin of peer review

The process of peer review dates back to 1665 and the founding of the Royal Society's journal, Philosophical Transactions, whose contents were to be licensed "under the charter of the Council of the Society, being first reviewed by some members of the same" (Chubin & Hackett, 1990). Since then, peer review has been accepted as an indispensable element of the process of science and academia. It has been both esteemed and criticized. Despite the criticisms, it remains one of the most widely accepted processes for examining grant proposals and determining their merits.


2.2 Peer Review Facts and Figures

Grants and Awards - 1999-2000 / 2004-2005
(dollar figures in millions)

                                             1999-2000       2004-2005            Increases
Program                                       $      #        $      #        $      %       #      %
Grants - Open Operating
  (includes Randomized Controlled Trials)   170  2,341      324  3,393      154     91   1,052     45
All Other Grants (open & strategic)          57    612      213  2,047      156    274   1,435    234
Salary Programs (open & strategic)           23    497       40    736       17     74     239     48
Training (open & strategic)                  26  1,372       42  1,759       17     64     387     28
Partner Contributions                        52    578       88  1,063       36     69     485     84

Peer Review Committees - Fiscal Year 2004/2005

Program                                      Number of Peer      Number of
                                             Review Committees   Reviewers
Operating Grant Programs
  (includes Randomized Controlled Trials)           51              838
Research Personnel Awards Programs                  17              397
Partnered Programs                                   2               49
Strategic Initiatives                               49              406
Total                                              119            1,690

* Number of Invited Members: 682
* Total Number of Reviewers: 2,372


Section 3: Taking Stock of CIHR Peer Review Innovations


3.1 Enhancing the Effectiveness of CIHR's Peer Review System

Many activities were undertaken to improve the peer review system during fiscal year 2004/2005. In the past, several individuals and groups have studied CIHR's peer review processes and systems. An early initiative was the Working Group on Programs and Peer Review, which had the mandate to identify key issues related to CIHR's programs and peer review activities. Following the establishment of the Standing Committee on the Oversight of Grants and Awards Competitions (SCOGAC) and the Research Planning and Priorities Committee (RPPC), the Working Group on Programs and Peer Review was judged redundant and was dissolved in November 2001.

In May 2002, the Peer Review and Grants and Awards Administration Process Design working group tabled its reports, which included many recommendations representing a cross-functional perspective for improvement. Many of the recommendations in this early report have since been implemented, albeit in an unsystematic way; other recommendations remain to be validated. Later, in response to issues raised in the Thorngate report, SCOGAC created an ad hoc committee in January 2003: the FAIRR (FAirness In Ratings and Rankings) committee.

In the meantime, CIHR created the position of Peer Review Innovation Coordinator, which was filled in January 2004. In February 2004, the ad hoc FAIRR committee reviewed its mandate in light of the March 2003 SCOGAC recommendation that FAIRR remain active and work with staff to develop processes and action plans to implement changes over time, and a recommendation emerging from a January 2004 staff retreat on peer review to "establish a committee that would guide the peer review process".

Following a discussion of current and planned activities, the FAIRR committee members noted the potential for conflicting opinions and advice from the various initiatives and concurred with the suggestion that a single portal would be preferable to bring together and consolidate the recommendations from multiple sources. To this end, on March 5th, 2004 SCOGAC determined that the ad hoc FAIRR committee would become a Standing Subcommittee of SCOGAC, co-chaired by members of SCOGAC and RPPC to ensure a healthy interface between the governance and executive functions. This committee is known as SMIPR (the Subcommittee on Monitoring and Innovation in Peer Review).

All of these efforts are testimony to CIHR's desire to continuously improve the peer review system. With the establishment of SMIPR and the staffing of the Peer Review Innovation Coordinator position, CIHR now has in place the structures and mechanisms to support the publication of its first report on peer review innovations. The following sections provide a brief overview of the many improvement activities undertaken within CIHR in support of excellence in our peer review system.

Strategic Outcome 1: Outstanding Research
Priority: Advance health knowledge, through excellent and ethical research, across disciplines, sectors, and geography
Key plans & activities to achieve the priority: Enhancing the effectiveness of CIHR's peer review system


3.1.1 Adapting the portfolio of peer review committees

What we aim to achieve

CIHR is aiming to have a portfolio of peer review committees that can appropriately respond to demand generated by the research community itself (application pressure) and to needs in relation to specific research priorities. At its Peer Review Innovations - Directions Workshop, the Subcommittee on Monitoring and Innovation in Peer Review agreed that the peer review process and the committees that support it are in constant evolution. CIHR needs to better understand the mix of applications received by a committee, whether for an open competition, a strategic competition, or both, and to clearly define realistic measures of workload that ensure quality review can occur in all cases.

  Leads: Research Portfolio & SMIPR
Actions Taken / Results Achieved

On February 23rd, SMIPR identified as a priority for 2005 the development of a transparent, well-supported, and strategic decision-making process to guide evolution of CIHR's portfolio of open operating grants peer review committees.

  • SMIPR recommendations endorsed by SCOGAC on March 8th and work plan established for FY 2005/2006
  • Key considerations:
    • CIHR must implement a proactive, annual review and planning process that links into the appropriate CIHR decision making and governance structures. The process will be initially applied to the open operating grants peer review committees.
    • It is critical that there is capacity within CIHR to build / package requests for new or modified committees to ensure that decisions are appropriately supported by evidence. This capacity is required to identify and gather supporting information to prepare a business and science case for a given transformation recommendation.
    • The proposed timeline for implementation in fiscal year 2005/06 would result in decisions that impact the portfolio of peer review committees in the subsequent year.
    • There is a need to appropriately manage the transition of existing committees to transformed peer review committees. This was acknowledged as a key success factor in managing the change process.
  • Developed draft process for committee evolution - currently in consultation phase

Creation of two new open operating grants peer review committees

  • Implemented additional peer review committees in the areas of Aboriginal Peoples' Health and Palliative Care (IAPH & IC)

Since the creation of CIHR in April of 2000, the number of open operating grants peer review committees has grown from 25 to 45 for the 2004/2005 Open Operating Grants Competition. There is a need for a clear, transparent process for addressing committee transformation/evolution. The current ad hoc, reactive process is not a sustainable solution. An annual review and planning process that links into the appropriate CIHR decision making and governance structures is called for.


3.1.2 Involving the Canadian public in CIHR peer review committees

What we aim to achieve

As a federal agency, CIHR has an ongoing responsibility to demonstrate to Canadians that tax dollars are spent wisely. In recent years, public-sector activities have become subject to increased scrutiny, raised expectations, and value-for-money audits and evaluations. Consumers are the ultimate recipients and beneficiaries of the knowledge derived from research. It is therefore not only desirable, but essential that they be involved in developing and implementing strategies for research at CIHR. Indeed, in CIHR's Blueprint 2007, a key component of its success is an integrative approach that brings together all members of the health research enterprise, including those who fund research, those who carry it out, and those who use its results.

CIHR has committed to enhancing public and stakeholder engagement in health research in Canada. To this end, CIHR enhanced its transparency by introducing reviewers representing the Canadian public into its peer review processes used to assess applications for funding. The introduction of community reviewers has three objectives:

CIHR community reviewers may play all three roles, or may play the first and third roles only, depending on the reviewers' backgrounds and experience, and the nature of the review committee on which they serve.

  Leads: Peer Review Innovation Coordinator & Knowledge Creation Programs
Actions Taken / Results Achieved

Development of the Community Reviewers' Project

  • As a response to the study entitled: Towards Integration of Community Review in CIHR Peer Review Processes (Schneider, 2004), the Community Reviewers' Project was launched.
  • Project objectives, process details and orientation materials were developed.
  • A Pre-trial was conducted at the September 2004 Open Operating Grants Competition with four (4) peer review committees.
  • Evaluation data was collected to inform the decision-making process.
  • Process details were reviewed in light of comments received from project participants.

Implement medium scale Community Reviewer's pilot project

  • For the Spring 2005 Open Operating Grants Competition, a web call was posted inviting members of the Canadian public to participate as community reviewers; 70 nomination packages were received over a two-week recruitment period.
  • 14 peer review committees participated in the pilot project.
  • Evaluation data is currently being collected and analyzed to assess the project's success in meeting its stated objectives.


3.1.3 Evaluative studies of CIHR's peer review system

The Subcommittee on Monitoring and Innovation in Peer Review addressed various issues related to the peer review system. In response to the Fairness in Ratings and Rankings (FAIRR) Committee Synthesis Report, two studies were commissioned. The first addressed CIHR's funding allocation methodology; the second addressed issues related to the rating scale currently used to assess the scientific merit of applications submitted to the open operating grants program.

Funding allocation methodology

As part of CIHR's commitment to ensuring a fair funding allocation process, and as a necessary precursor to modifying or replacing the current 80:20 allocation system, the FAIRR committee, in its Synthesis Report of February 2003, recommended that a comparative evaluation be conducted into the impact and implementation requirements of the 80:20 method, the 100:0 method and the Historical Percentile Ranking (HPR) method, each of which would include an agreed-upon funding floor, based on data from previous competitions.

To this end, a report was commissioned. The report (Thorngate and Wang, From merit to money: Five methods of funding allocation, 2004) includes a clear description of each funding allocation methodology assessed and data on the outcomes of applying each method.

The issue is ensuring that CIHR's funding allocation methodology for the open operating grant competition allows fair competition for funds, is transparent and is applied equitably to all types of health research.

  Lead: Peer Review Innovation Coordinator
Actions Taken / Results Achieved

Perform a study on funding allocation methodologies

  • Spreadsheet comparisons were made of four alternatives to CIHR's current 80:20 funding allocation method to see how use of the alternatives would affect funding rates for CIHR peer review committees in the past six grant competitions.
  • Production of the study report - From merit to money: Five methods of funding allocation (Thorngate & Wang, May 2004)

SMIPR's recommendation to SCOGAC

  • SCOGAC supported SMIPR's recommendation to adopt the 100:0 method at its March 2005 meeting
  • The 100:0 method was applied by SCOGAC for the Awards Competition (March 2005)

SCOGAC's recommendation to the Governing Council (GC)

  • GC approves the adoption of the 100:0 method as CIHR's funding allocation methodology. The new methodology will be applied to all open operating grant competitions from the September 2005 competition forward. Additionally, to ensure stability, the 100:0 method will be adopted for a period of five years at the end of which a review of the impact will be reported by SCOGAC.

CIHR recognizes without hesitation the ability of an individual peer review committee to rate fairly the grants placed before it with respect to one another (intra-committee validity). On the other hand, cross-sectional and longitudinal analyses of rating outcomes have revealed significant heterogeneity in rating distributions and success rates across committees and themes. Given this heterogeneity in the rating process, pooling results across committees is not justifiable. In particular, the 'Thorngate Report' showed that reviewers of theme 3 and 4 applications valued different aspects of applications than reviewers of theme 1 and 2 applications. Based on historical data and additional studies, the recommendation to adopt the 100:0 method was approved by the Governing Council at its March 2005 meeting.
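The report does not spell out the arithmetic behind the allocation methods, but a spreadsheet-style comparison of the kind Thorngate and Wang performed can be sketched. The interpretation below, in which an 'X:Y' method pays out X% of the budget in proportion to each committee's application pressure and Y% in proportion to its mean rating, is an assumption for illustration only, as are the committee names and figures; it is not CIHR's documented formula.

```python
def allocate_budget(total, committees, pressure_weight=1.0):
    # ASSUMED interpretation (illustration only): an 'X:Y' method gives
    # X% of the budget out in proportion to each committee's application
    # pressure (dollars requested) and Y% in proportion to its mean
    # rating. pressure_weight=1.0 models 100:0; 0.8 models 80:20.
    total_requested = sum(c["requested"] for c in committees)
    total_rating = sum(c["mean_rating"] for c in committees)
    shares = {}
    for c in committees:
        by_pressure = total * c["requested"] / total_requested
        by_rating = total * c["mean_rating"] / total_rating
        shares[c["name"]] = (pressure_weight * by_pressure
                             + (1 - pressure_weight) * by_rating)
    return shares

# Two hypothetical committees competing for a $400M budget
committees = [
    {"name": "A", "requested": 300, "mean_rating": 4.0},
    {"name": "B", "requested": 100, "mean_rating": 4.0},
]
```

Under this assumed interpretation, the 100:0 method makes a committee's funding track its application pressure exactly, removing the rating-pooled component that the observed cross-committee heterogeneity calls into question.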

Rating scale studies

In April 2004, CIHR sought proposals for the development and presentation of data and options that would enable members of the Subcommittee on Monitoring and Innovation in Peer Review (SMIPR) to critically review the rating scale currently used by CIHR. This review was initiated in response to evidence that the current scale is not applied in the same fashion across peer review committees and potentially even within peer review committees. The key concern is the impact of inconsistent use and application of the rating scale across various fields of research as represented by different peer review committees in the Open Operating Grants Program.

The first commissioned study examined the rating scales and procedures used by funding agencies around the world to evaluate grant proposals. The second assessed the psychological distances between, and inter-reviewer reliability of, the verbal descriptors that anchor ranges along the current CIHR rating scale. A supplemental study estimated how reviewers weigh five features of research grant proposals in making their evaluations. These studies were undertaken to suggest alternatives to the scales and procedures CIHR currently employs that could be used in future field tests.

SMIPR has recommended that any planned reviews of the rating scale should be deferred until the approved changes in funding allocation methodology are implemented. Furthermore, it was recommended that further analysis is required to better understand the impacts of proposed rating scale changes, should this be contemplated in the future.

  Lead: Peer Review Innovation Coordinator
Actions Taken / Results Achieved

Perform a survey of the rating scales and procedures of 21 research funding agencies around the world

  • Survey Report: A review of rating scales and procedures for assessing research grant proposals (Thorngate, Wang & Tavakoli, July 2004)
  • Highlights: Great variety
    • Numerical scales ranged from 4 to 100 points
    • Scales often reversed: lower numbers meant lower quality in some scales, higher quality in others
    • Some had no numerical ratings

2nd Study - Experiment 1: Scaling the rating scale descriptors

  • 168 former CIHR internal reviewers contacted to rate 47 anchor words/phrases on a 100-point scale; 21 individuals responded (limited sample size)
  • Great variety in anchor words
  • Popular components include: scientific merit, practical importance, track record, methods & design, training, ethics
  • Subjective distance between the two CIHR descriptors outstanding and excellent is quite small, and the distance between excellent and very good is quite large, relative to their prescribed use. This implies that there would be far less subjective difference between an outstanding and an excellent proposal than between an excellent and a very good proposal

2nd Study - Experiment 2: Rating proposal features

  • Reviewers use non-compensatory rules of assessment
  • The interaction effects argue strongly against using a simple, weighted-additive formula to combine ratings of five features into a single score; interactions/context must be part of formula
  • On average, Methods/Design was the most important feature in determining a high rating, Scientific Importance was the 2nd most important.
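The study's objection to a weighted-additive scoring formula can be made concrete with a toy model. The two features, weights, and single interaction term below are hypothetical stand-ins for the five features and context effects the study actually examined.

```python
def additive_score(features, weights):
    # Simple weighted sum of feature ratings -- the kind of formula
    # the study's interaction effects argue against.
    return sum(weights[name] * value for name, value in features.items())

def interaction_score(features, weights, synergy=0.1):
    # Adds one context effect: strong methods amplify the value of
    # scientific importance (an illustrative interaction term).
    base = additive_score(features, weights)
    return base + synergy * features["methods"] * features["importance"]

# Hypothetical ratings on a 0-5 scale for two of the five features
features = {"methods": 4.0, "importance": 3.0}
weights = {"methods": 0.6, "importance": 0.4}
```

Because the interaction term depends jointly on two features, no single set of fixed weights in the additive formula can reproduce its scores across all rating profiles, which is the substance of the study's objection.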


3.1.4 Evaluation Study of the Operating Grants Program

CIHR has promised transparency and accountability to Canadians for its investment of public dollars in all its health research programs and activities. As part of its commitment to transparency and accountability, CIHR undertook its first evaluation of its single largest program of health research funding, the Operating Grants Program (OGP), to assess the OGP's operation, impact and achievement of objectives.

  Lead: Evaluation & Analysis Unit
Actions Taken / Results Achieved

Evaluation of the Open Operating Grants Program (OGP)

  • Evaluation, analysis and communication of results
  • Recommendation #1: CIHR should develop better ongoing performance measurement.
  • Recommendation #2: CIHR should maintain the OGP.
  • Recommendation #3: CIHR should review and then clearly communicate the goals of the OGP in the context of other CIHR funding opportunities.
  • Recommendation #4: CIHR should ensure that its peer review practices do not unnecessarily disadvantage proposals from applicants without an established CIHR track record.

Management's response

  • Agree to recommendation #1. The OGP is CIHR's largest and most important funding tool - it will therefore be the subject of ongoing quantitative and qualitative performance measurement and evaluation.
  • Agree to recommendation #2. The OGP program will be continued and decisions regarding the level of support are the responsibility of senior decision making bodies at CIHR including the Research Planning and Priorities Committee (RPPC) and ultimately Governing Council (GC).
  • Agree to recommendation #3. The Research Portfolio will take the lead in improving communications with the research community and will continue to work with the CIHR Communications Branch.
  • Agree to recommendation #4. CIHR management agrees with this recommendation, and has already created SMIPR, a joint management/peer reviewer/researcher group to review this and other potential improvements to the peer review system.

The Operating Grants Program represented roughly 46% of CIHR's total grants and awards expenditures in 2003-2004 (approximately $265,000,000 out of $576,000,000). The program offers research grants in peer-reviewed open competitions to all eligible health researchers. While the program has been in existence for over twenty years, the evaluation was primarily focused on the program outputs and outcomes that have occurred since transition from the Medical Research Council of Canada (MRC) to CIHR in 2000.

The evaluation was conducted to provide CIHR's Governing Council, senior management and the health research community with evidence about whether the program was meeting its expected results of:

  1. improved capacity for generating and developing new knowledge and
  2. improved production of highly qualified personnel.

The evaluation also addressed a recommendation from the Auditor General to monitor and evaluate the OGP.


3.1.5 Improving Awards Selection Processes

Streamlining the Review of Fellowship Applications

Since 2002, CIHR has been considering ways of improving the fairness, effectiveness and efficiency of its review process for applications to the Fellowship program. In 2003, CIHR introduced more systematic pre-meeting or "at-home" scoring of applications. Reviewers were provided with clear guidelines and forms with which to rate applications in advance of the Fellowship selection committee meeting. These preliminary scores enable early triaging of the applications, reducing the amount of time a committee might spend identifying and discussing non-competitive applications. The at-home work on each application also helps ensure that the reviewers have fully prepared an analysis for presentation to the committee and, in developing their assessments, have used the same criteria and weighted them in the same way.
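The triage step enabled by pre-meeting scoring can be sketched as follows. The cutoff, scores, and application records are hypothetical; the report does not state how CIHR sets the triage threshold.

```python
def triage(applications, cutoff):
    # Split applications by pre-meeting ('at-home') score: those at or
    # above the cutoff are discussed in committee; the rest are streamed
    # out, freeing meeting time for competitive applications.
    discuss = [a for a in applications if a["score"] >= cutoff]
    streamed_out = [a for a in applications if a["score"] < cutoff]
    return discuss, streamed_out

# Hypothetical pre-meeting scores
apps = [{"id": 1, "score": 4.2}, {"id": 2, "score": 2.9}, {"id": 3, "score": 3.6}]
discuss, streamed_out = triage(apps, cutoff=3.5)
```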

A key issue in the Fellowships selection process is the value added by committee discussion. Convening an in-person meeting of reviewers from across Canada is one of the most resource-intensive activities in the current process in terms of both reviewer time and direct costs. In January 2005, staff reported that a project team had assessed the experience with at-home reviewing in two Fellowship competitions and had further fine-tuned the off-site review process. In March, staff followed up with evidence indicating that the at-home reviewing system was advanced enough that CIHR could consider the possibility of relying on it entirely for selection of Fellowship recipients in future competitions.

  Leads: Research Capacity Development and Special Studies
Actions Taken / Results Achieved

Development of an at-home (off-site) review process for the Fellowships Program.

  • Introduction of off-site review as the initial stage for the Fellowship selection process in the October 2003 and February 2004 competitions.
  • The pre-meeting scores enabled earlier triaging of applications and thus more systematic treatment of triaged applications at the committee meetings.

An analysis of reviewers' pre-meeting and post-discussion scores from the October 2003 and February 2004 Fellowship competitions

  • The analyses examined:
    1. The impact of committee discussion on the outcome of the review process;
    2. The influence of committee discussion on scores;
    3. Impact of committee discussion on inter-committee variation in scores;
    4. Committee experience and reviewers' scores; and,
    5. The influence of disciplinary focus on scores.
  • The results suggested that shifting to an entirely off-site, structured review would not reduce, and might even improve, the fairness and effectiveness of the selection process.

Questionnaire sent to all Fellowships committee members

Focus group meeting with representatives from all five Fellowships panels

  • These two activities yielded information about: committee members' views on criteria and weights; the effectiveness of the application form and the guidelines for review; the amount of time required for at-home review and for attendance at committee meetings; the fit between applications and reviewers; perceptions of the effectiveness of committee discussions; and ideas for improving the review process.
  • The new information led to: reduction in the number of review criteria (from eight to six); minor adjustment to the relative weighting of criteria; new benchmarks for rating each criterion; improvements to application form modules; revision of the guide for reviewers; reassignment of committee responsibilities for various types of application; a new form for providing feedback to applicants; and, an improved formula for triaging applications.

Presentation to SCOGAC, in March 2005, of evidence from seven lines of investigation indicating that off-site review would be fair to applicants, effective in achieving program objectives, and efficient in its use of reviewers' time.

  • Options development: SCOGAC agreed that staff should present options for the Fellowship review process at the June 2005 meeting. One of the options would be structured off-site review.

Other Improvements to CIHR Award Selection Processes

What we aim to achieve: CIHR wants to link effectively to other review processes, such as those that already exist within the universities, aiming for overall efficiency of national review resources. We also want to contribute to the training of reviewers to shorten the learning curve as they take on responsibilities with CIHR committees.

  Lead: Research Capacity Development
Actions Taken Results Achieved
Linking to other review processes

Introduced a pre-selection process for the Canada Graduate Scholarships Master's Awards program.
  • Made effective use of university capacity for pre-selecting applicants to the program, thus reducing the pool of applicants to be reviewed by CIHR.
Training reviewers

Introduced pre-meeting teleconferences to discuss procedures for reviewing applications for CIHR awards.
  • Provided new committee members with an opportunity to learn about the review process and to ask questions before they begin assessing applications.
  • Provided a verbal update to established committee members on improvements to review processes.


3.1.6 Development of CIHR's Relevance Review Process

What we aim to achieve

CIHR is aiming to develop and implement a transparent, simple and consistent relevance review process. As the number of CIHR's strategic funding opportunities increases, the process for reviewing applications to decide whether or not they are in alignment with (or relevant to) strategic research priorities is becoming a key component of CIHR's funding decision-making. While an application submitted to an open competition is likely to be funded if it receives a high score during peer review, an application submitted to a strategic competition will not receive funding, or may not even be accepted for peer review, if it is found to be not relevant to strategic interests, regardless of the application's research potential.
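The contrast drawn above between open and strategic competitions can be summarized as a small decision function. This is a minimal sketch under stated assumptions; the labels and the fundable cutoff are illustrative, not CIHR policy.

```python
# Minimal sketch of the decision logic described above: in a strategic
# competition, an application judged not relevant is screened out regardless
# of its peer-review score, whereas in an open competition the score alone
# drives the recommendation. The cutoff and labels are illustrative.

def funding_decision(competition: str, relevant: bool, score: float,
                     fundable_cutoff: float = 4.0) -> str:
    if competition == "strategic" and not relevant:
        # Relevance review gates strategic applications before (or regardless
        # of) scientific merit.
        return "not funded (fails relevance review)"
    return "fundable" if score >= fundable_cutoff else "not funded"
```

The asymmetry is the point: a high-scoring application can still fall out of a strategic competition at the relevance gate, which is why a transparent, consistent relevance review process matters.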

  Lead: Research Planning and Resourcing Branch
Actions Taken Results Achieved

Cross-functional working group researched current practices in conducting relevance review, consulted with colleagues and drafted recommendations.

  • December 2004: Draft recommendations presented for consultation to major CIHR internal stakeholder groups (Institute Assistant Directors, Research Portfolio Deputy Directors, Knowledge Translation, Partnerships).
    • Key issues emerged.

Workshop held March 10, 2005 to resolve key issues. 39 participants, the majority from Institutes (ADs) and Research Portfolio (DDs), with additional representation from KT and Partnerships.

  • 7 key points of consensus reached:
    1. conduct relevance review before (and blind to) peer review;
    2. final result of relevance review is a yes/no answer;
    3. the RFA team as a collective should decide who conducts relevance review;
    4. registration or letter of intent to be required in the majority of cases;
    5. criteria for deciding to conduct relevance review at the LOI stage identified;
    6. criteria for requiring peer review committees to conduct relevance review identified;
    7. minimum documentation requirements identified.
  • Workshop summary and action plan developed

Workshop summary and action plan presented to RPPC on April 15, 2005.

  • Approved.

Target completion date: September 2005 (guidelines/procedures document, support tools, implementation of learning strategy)

Report to RPPC in October 2005 and March 2006.

  • On-going implementation.

CIHR has an established process for research funding decision-making for the open competitions, founded upon its peer review process. Standardized procedures and documentation requirements, clearly defined roles and responsibilities, a formal committee member selection process, and an established timeline within the competition cycle are in place. In contrast, the foundation for making research funding decisions for strategic competitions is less well established. CIHR's relevance review process is inconsistent: documentation requirements are undefined, roles and responsibilities are unclear, there is no formal framework for appointing decision-makers, and the review is conducted at different points within the competition cycle. A transparent, simple and consistent relevance review process is required to support CIHR's strategic funding decision-making. CIHR has taken steps to develop such a process.

Strategic Outcome 2: Outstanding Researchers in Innovative Environment
Priority: Develop and sustain Canada's health researchers in vibrant, innovative and stable research environments
Key plans & activities to achieve the priority: Utilizing technology to enhance service delivery


3.1.7 Continued implementation of the ResearchNet initiative

What we aim to achieve

A robust research environment requires an infrastructure that makes it easy for Canada's researchers to do their work. CIHR created its e-Services strategy to lessen the application workload on researchers and to help research organizations manage a growing number of complex funding programs. Its centrepiece is ResearchNet, a Canadian research portal that offers electronic services and tools to support collaboration and information sharing amongst researchers, research organizations, government, industry and the public. The pilot version of ResearchNet was launched in April 2004 and links to the previously created Canadian Research Information System (CRIS) and the Common CV. FY 2004/05 efforts featured a Peer Review Pilot, which allowed for the electronic submission and distribution of internal and external reviews to committee members and applicants.

  Leads: ITMS and Knowledge Creation Programs
Actions Taken Results Achieved

Increased the number of Peer Review Committees using the ResearchNet online Peer Review system.

  • 19 Committees (15 Operating Grant and 4 Award Committees) using ResearchNet Peer Review System.

Improved ResearchNet Peer Review functionality.

  • 92% overall satisfaction rating; 96% of users found it made the peer review process more efficient.
  • 80% of external reviews, 92% of committee reviews received online.

Integrated the eRegistration/Submission pilots with the ePeer Review Pilots to improve and streamline interactions with committee members. Committee members from the Behavioral and Neurosciences Committees viewed registrations, declared their conflicts and suggested potential reviewers online (paperless). Data is automatically loaded into the EIS (corporate database), and all reviewers for these committees have online access to applications.

  • 85% take up by internal (committee) reviewers who provided conflict responses and suggested reviewers online.
  • Provided online copies of applications to Peer Reviewers.
  • Provided electronic version of Summary of Research Proposal.

Launched online paperless Registration process for March 2005 Open Operating Grant Competition for Behavioral and Neurosciences Committees.

  • Complete online process; no paper copies required for registration.
  • 96% compliance rate

Launched online Application process for March 2005 Open Operating Grant Competition for Behavioral and Neurosciences Committees. Less paper: only the original signed application was submitted (eight copies eliminated).

  • 98% compliance rate
  • Only one copy of the application was required.

Kicked off a project with NSERC to deliver a peer review pilot for 4 of their committees in 2005/2006.

  • Partnership development activities led to this pilot agreement with NSERC. The pilot leverages work done for CIHR's ePeer Review and provides an excellent testing ground for collaborative efforts on ResearchNet.

Analyzed and documented how funding opportunities are communicated by Canadian universities and funding agencies, and by selected non-Canadian institutions.

  • Developed a draft approach and design based on the best of current approaches. It will be used to bring potential partners to the table and to inform the forward work plan.


Section 4: Strengthening CIHR peer review system

Enhancing our understanding

One of the largest hurdles CIHR faces in ensuring that operational performance and direction support its strategic vision is the need to measure and analyze corporate performance efficiently and accurately, at both the strategic and tactical levels.

Most organizations collect large amounts of data in the course of their usual operations. However, without well-defined performance metrics and adequate reporting and analysis capabilities, it is difficult to translate this data into usable, actionable information. SMIPR believes that obtaining relevant performance information is necessary to enhance our understanding of the peer review system and to properly manage its evolution.

For fiscal year 2005/2006, efforts will be undertaken to improve CIHR's capacity to appropriately monitor the peer review activities related to the open operating grant program.
Leveraging data into aggregated, focused information will allow CIHR to capitalize on the depth and breadth of the available information.

Increasing the efficiency of peer review

Significant advances are expected from the ResearchNet initiative, and CIHR will continue the careful implementation of its e-strategy. Broader use of information technologies should yield substantial process efficiencies and user satisfaction. Once fully implemented, this new technological platform will permit many peer review system innovations for which data will be easily obtainable. This emerging opportunity is yet one more reason to establish systematic planning and reporting with respect to peer review innovation.

Other improvement activities will include:

Concluding Remarks

It is our hope that this and future annual reports on innovation in the CIHR Peer Review System will serve to support thoughtful, evidence-based evolution of CIHR's key business system. Opportunities for innovation abound. Our challenges will continue to be ones of prioritization, change management and resource allocation in support of our ultimate objective - to improve the health of Canadians.


1 More information on activities related to the Fellowships selection process redesign is available. See: A CIHR Process for Awarding Fellowships, October-November 2003. Documents to Guide the Development of a Valid, Reliable and Cost-Effective Process. See also: Improving Awards Review Processes - Phase 2 Plans. Special Studies Report 07-04.
2 See: Fellowship Applications, 2003-2004. Guide for Reviewers. February 2004.
Modified: 2007-08-02