Office of the Auditor General of Canada - Bureau du vérificateur général du Canada

Opening Statement to the Standing Committee on Industry, Science and Technology

Roundtable on Science and Technology Policies

8 May 2001

Richard Flageole, FCA
Assistant Auditor General

Madam Chair, thank you for this opportunity to participate in the Committee's review of Canada's science and technology policies and programs. This is an important subject, and one that the Office of the Auditor General has reported on several times over the past nine years.

As you are well aware, one of the principal functions of the Office is to carry out value-for-money audits. This morning, I would like to briefly review the audit reports that have dealt with the government's management of science and technology activities. Our reports have looked at science and technology issues government-wide, at the portfolio level, and at the level of departmental programs. Our most recent audit looked at what are known as "big science" projects.

Also, I want to present the Committee with the criteria we used to determine if value for money was achieved at each level. We hope that this discussion will be relevant to the goals of your study.

Madam Chair, let me begin with a brief overview. The federal government spends several billion dollars a year on science and technology, in addition to providing tax incentives to encourage research and development in the private sector. The government accounts for the largest share of Canada's total investment in research and development. It is difficult to find an S&T issue that the government is not somehow involved in.

Because of the importance of federal S&T activities, our Office has produced a series of reports that promote a mission-driven, results-based approach to federal spending on S&T. These reports stress that the government's investment in science and technology can and should be managed; and in particular, that the performance of the investment can and should be measured.

This series of reports goes back to 1993, when we first talked about mission-driven, results-based research in what was then the Department of Forestry. The idea was that a department does research in support of its mission and mandate, and that these responsibilities should direct its research efforts. With this general direction set out, good management required identifying the specific research results needed to carry out the mission, as a focus for specific research programs and activities.

In 1994 we took this idea government-wide. We looked at whether the federal government was doing all it could to achieve value for money in its science and technology activities.

At the government level we asked the following:

  • Are there strategic directions and priorities to guide federal science and technology activities?
  • Are these activities coordinated?
  • Are results being assessed and reported to Parliament?

At the department level, we asked the following:

  • Has management clearly set out what it wants to achieve?
  • Are these goals being met?

We reported significant shortfalls in each area.

In 1996 the government announced its Science and Technology Strategy, in part as a response to our 1994 report on governance and management.

In 1998 we reviewed the government's progress in implementing the S&T Strategy. We asked whether the government was meeting its 1996 commitments. Was it acting on the principles it had set for the management of departmental science and technology activities? We found that progress was slow. To achieve value for money in its science and technology activities, the government needed to act aggressively on its 1996 promises and increase attention in the following three areas:

  • develop mission-driven, results-based management frameworks for science and technology activities;
  • use external peer reviews to ensure scientific excellence;
  • develop partnerships inside and outside government to better leverage federal expenditures.

In 1999, we looked at federal investment in innovation in Canadian industry. We examined four federal contribution programs in the Industry Portfolio that had combined expenditures of $1.3 billion over three years. In particular, we asked whether:

  • these programs were based on a sound understanding of innovation performance problems in the economy - in other words, was there a strategy in place to target spending?
  • there was a business case justifying why specific projects were funded;
  • there was a strong rationale for government support;
  • management knew if value for money was achieved.

We found that no strategy existed to explain how the Portfolio addressed performance problems in innovation or what results it was trying to achieve. We also found problems with due diligence in two of the contribution programs. We will follow up on the report this year and will be pleased to discuss the results with the Committee after tabling.

In 1999 we also reported the results of our study on good management in science-based organizations, which I think is particularly relevant to the Committee's current interests. The purpose of the study was to help federal science managers manage better by describing what good management should look like in research organizations. Our study found that well-managed science organizations share a number of attributes (or characteristics) that we grouped under four themes:

  • These organizations focus on people. They know who they need, and they develop and retain the right mix of talent in a positive and supportive environment.
  • They show leadership. They align themselves with the needs of those who depend on them for results. They achieve buy-in for their vision, values, and goals; and they undertake the right research, at the right time, with the right investment.
  • They manage research to ensure excellence and needed results. They leverage resources, and capture organizational learning.
  • They strive for exceptional performance, are widely known and respected, and meet the needs of those who depend on the results of their work.

The extent to which the attributes are demonstrated by an organization is a measure of how well it is managed. Said another way, measuring performance against these attributes would show the extent to which value for money is achieved. Since we reported our work, several federal science organizations, as well as others, have used these attributes to assess their own performance.

In 2000, we reported the results of our audit of the government's administration of the tax incentive program for scientific research and experimental development. We asked whether management:

  • had set out clear objectives for the program;
  • had set clear rules and guidelines to help claimants and staff;
  • had procedures in place to manage the risk of ineligible claims; and,
  • was resolving claims efficiently and effectively and treating taxpayers consistently.

We found that there was significant room for improvement in each of these areas, and we will be doing a follow-up next year on management's actions.

Finally, in December 2000, we reported on how the federal government decides to invest in big science projects and used the Sudbury Neutrino Observatory as an example. We identified the following lessons learned:

  • The government needs complete and accurate information to properly assess the costs and benefits of big science projects when deciding whether to invest. For example, the government needs good information on the nature of the science, project risks, life-cycle costs, expected scientific benefits, and any economic benefits.
  • Interdepartmental systems for reviewing and handling science activities should be used to manage decision making for big science projects.
  • Accountability within the government and to Parliament needs to be improved. There needs to be a single federal authority for each project that would report on performance.

Madam Chair, with these reports over the past nine years, the Office has developed a range of value-for-money criteria that I hope the Committee will consider and find useful in the course of its study.

This concludes my opening statement. We would be pleased to answer any questions the Committee might have.