Informing Research Choices: Indicators and Judgment
In 2010, the Minister of Industry, on behalf of the Natural Sciences and Engineering Research Council of Canada (NSERC), asked the Council of Canadian Academies to examine the international practices and supporting evidence used to assess performance of research in the natural sciences and engineering disciplines.
The lack of a “one-size-fits-all” solution to monitoring and measuring research performance is driving a growing international need to develop and implement new methods and to share best practices and lessons learned. The Council’s assessment drew on both international and national approaches to evaluating research performance. Overall, the assessment presents an important opportunity to address science performance from a uniquely Canadian perspective. It is also the first in a new series of assessments by the Council that address various parts of the Canadian science and technology environment.
After examining the available evidence, the Council’s Expert Panel came to a small number of key findings, listed below, and concluded overall that quantitative indicators should inform, rather than replace, expert judgment in science assessment for research funding allocation. The Expert Panel also developed four guiding principles to support research funding agencies undertaking science assessments in support of budget allocation: context matters; do no harm; transparency is critical; and expert judgment remains invaluable. The principles are expanded upon in the Panel’s report.
- Many science indicators and assessment approaches are sufficiently robust to be used to assess science performance at the field level in the natural sciences and engineering (NSE).
- Quantitative indicators should inform, rather than replace, expert judgment in science assessment for research funding allocation.
- International “best practices” offer limited insight into science indicator use and assessment strategies.
- Mapping research funding allocation directly to quantitative indicators is far too simplistic, and is not a realistic strategy.
- There is no compelling reason to be certain that past successes will lead to future successes, or past failures to future failures. As a result, science indicators may not always provide a reliable guide to future prospects.
These are just a few of the many observations that are considered in the Expert Panel’s report.
The report was publicly launched on July 5, 2012. To request a hard copy of the report, contact Andrea Hopkins at email@example.com.
What do the scientific evidence and the approaches used by funding agencies around the world have to offer, in terms of performance indicators and related best practices, in the context of research in the natural sciences and engineering carried out at universities, colleges, and polytechnics?
Reports and related publications
- Informing Research Choices: Indicators and Judgment (full report)
- Report in Focus (abridged version)
- Executive Summary
- News Release
- Media Backgrounder
The Expert Panel on Science Performance and Research Funding was chaired by Dr. Rita Colwell, Distinguished University Professor at both the University of Maryland, College Park, and the Johns Hopkins University Bloomberg School of Public Health, and former Director of the National Science Foundation (1998–2004). For a complete list of panel members, please visit the Expert Panel on Science Performance and Research Funding Membership page.
For further information, please contact:
Andrea Hopkins, Program Coordinator, at 613-567-5000 ext. 268 or firstname.lastname@example.org.