Responsible use of research and innovation metrics
Understand the responsible use of research and innovation metrics, the University’s commitment to DORA, and the actions we are taking to support this.
Defining the responsible use of metrics
The term “research and innovation metrics” refers to the quantitative indicators which can be used to evidence research activity, quality and characteristics.
There is a wealth of data on research and innovation outputs and impact which can evidence the quality of research and innovation activity. Such metrics encompass publication data (bibliometrics such as citation counts and journal impact factors); indicators of social and economic impact (such as those incorporated into the Knowledge Exchange Framework); indicators of contributions to research culture and environment (such as supervision, collaboration, and open research practices); and related measures such as funding income and markers of esteem (such as honours, awards, and invitations to present).
The ‘responsible use of metrics’ is defined as “framing appropriate uses of quantitative indicators in the governance, management and assessment of research” (The Metric Tide, 2015). The UK Forum for Responsible Research Metrics defines responsible use through five key principles:
- Robustness: Base metrics on the best possible data in terms of accuracy and scope
- Humility: Recognise that quantitative evaluation should support, but not supplant, qualitative, expert assessment
- Transparency: Keep data collection and analysis open so that those being evaluated can test and verify the results
- Diversity: Account for variation by research field, and use a range of indicators to reflect and support a plurality of research and researcher career paths across the system
- Reflexivity: Recognise and anticipate the systemic and potential effects of indicators, and update them in response.
Research funders, government, institutions, and researchers all place importance on the responsible use of metrics.
Our commitment to responsible metrics
The University published its first institutional statement on the responsible use of metrics in 2017 and, following a commitment in our 2020 Open Research Position Statement to further develop and promote the responsible and fair use of metrics, an institution-wide Responsible Use of Metrics Implementation Plan (2021) set out how the University would achieve this. In 2022 the University signed the San Francisco Declaration on Research Assessment (commonly known as DORA).
Our approach to research metrics forms part of broader efforts to enhance research culture and practice and reflects:
- Our ambition, as set out in the University’s strategy, to be a global leader in research and innovation and a destination of choice for researchers in the UK and internationally for the practice of excellence in research and innovation
- Our values of inclusion, inspiration, innovation, and integrity
- Our culture of open and collaborative research
- Our aspirations, through interdisciplinary collaboration and industry engagement, to bring together diverse groups of researchers to address global challenges.
DORA: The San Francisco Declaration on Research Assessment
In 2022 the University of Surrey signed DORA, and as a signatory the University commits to supporting the adoption of the following practices in research assessment:
DORA’s overarching recommendation states: “Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions.”
Two further recommendations of DORA form the institutional commitment:
The first of these advises that institutions “be explicit about the criteria used to reach hiring, tenure, and promotion decisions, clearly highlighting, especially for early-stage investigators, that the scientific content of a paper is much more important than publication metrics or the identity of the journal in which it was published”
The second recommendation states: “for the purposes of research assessment, consider the value and impact of all research outputs (including datasets and software) in addition to research publications, and consider a broad range of impact measures including qualitative indicators of research impact, such as influence on policy and practice.”
Responsible metrics implementation plan
In 2021 the University of Surrey Open Research Working Group developed a responsible use of metrics implementation plan to support the practical application of responsible metrics across the University, and to promote the benefits of this. The implementation plan comprised ten key steps including the creation of resources, the development of evaluation, and recognition of the value of all relevant research and innovation outputs.
The implementation plan draws upon DORA and the UK Forum for Responsible Research Metrics, as well as the Leiden Manifesto, the INORMS SCOPE model, and further sector statements, alongside our local context, culture and experience at Surrey.
The implementation plan (PDF) was approved by the University Research and Innovation Committee in March 2021.
Best practice guidance
In 2017, the Open Research team developed the following suite of best-practice tips for working with publication data (bibliometrics), designed to help reflect the richness, diversity, and complexity of research and researchers.
- Use metrics to support—not supplant—peer review and expert knowledge
- Recognise that research and its outputs are rich, complex, and varied. In contrast, metrics are usually one dimensional. Be realistic about the limits of what they can tell you
- Foster humility in your approach to metrics and take care not to imbue them with undue power
- Understand that most metrics are designed to work at larger scales. At the level of the individual, prefer qualitative information.
- Use data that are of sufficient accuracy and scope
- Where such data are unavailable, the usefulness of metrics diminishes. At best, results will be incomplete. At worst, they may be inaccurate and misleading
- Weigh up the pros, cons, and risks of proceeding under such circumstances.
- Recognise that almost all uses of metrics involve comparison of some sort, no matter how small or inconsequential that comparison may seem
- Ensure that only appropriately normalised indicators are used when comparisons are made, even in cases where comparisons are deemed to be ‘within the same field’ or for the same person
- Recognise that some questions can only be addressed using metrics that cannot be systematically normalised
- Only use these non-normalised metrics in non-comparative, and ideally, narrative contexts.
- Recognise that most metrics are indicators of potential effects or states
- Indications are informed inferences—not indisputable facts
- Avoid misplaced precision in interpreting indicators
- Accompany point estimates with stability (confidence) intervals, where possible.
- Align metrics work transparently with mission or strategy
- Undertake bibliometric and altmetric studies with a stated purpose in mind
- Define questions and aims in sufficient detail, and only then, choose methods and indicators
- Ensure that the metrics selected capture the concept probed by the question asked.
- Most bibliometric and altmetric studies explore more than one aspect of research
- Check that the metrics you select capture the concepts you think they do
- To look at multiple aspects of research, you need to use multiple metrics
- Even if looking at a single aspect, capture this with several metrics if you can.
- Recognise that all metrics have limitations. Use these limitations to frame your results
- Accept that in some contexts, metrics aren’t useful at all.
- Never make vital decisions on metrics alone
- Consider not only the positive impacts of using metrics, but also the negative ones
- Don’t overuse metrics
- Appreciate that where metrics are applied to individuals, careers can be at stake.
- Insist on reproducibility and transparency in data collection and analysis
- Be on the lookout for bias. Seek to reduce it and acknowledge its effects
- Cherish objectivity and neutrality
- Be fair, mindful, and considerate of all parties involved.
- Choose the most straightforward approach sufficient for addressing the question asked
- Provide plain English, non-technical explanations of bibliometric or altmetric methods used
- Raise awareness and increase understanding of metrics
- Listen to concerns that are raised
- Encourage questions and thoughtful debate and discussion.
- Keep pace, expect change, and adjust metrics and processes accordingly. Areas of work related to both research and metrics are rapidly developing and evolving
- Recognise that metrics establish incentives. These can change human behaviours and the research ecosystem
- Look out for unintended consequences. When you spot them, determine their effect
- Manage the goal displacement and gaming that can arise in response to the use of metrics.
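The guidance above on using appropriately normalised indicators can be made concrete with a small sketch. The snippet below shows one common normalisation approach (dividing a paper’s citation count by the average for publications in the same field and year, so a score of 1.0 means “cited as often as the world average for comparable papers”); all figures are invented for illustration and do not come from any real dataset.

```python
# Illustrative sketch of a field- and year-normalised citation score.
# All numbers are hypothetical, for illustration only.

from statistics import mean

# Hypothetical reference set: citation counts of publications in the
# same field and publication year as the paper being assessed.
field_year_citations = [0, 2, 3, 5, 8, 12, 30]

paper_citations = 10  # hypothetical count for the paper being assessed

# Normalise against the reference-set average: a score above 1.0
# indicates above-average citation impact for that field and year.
field_average = mean(field_year_citations)
normalised_score = paper_citations / field_average

print(f"Field-year average: {field_average:.2f}")
print(f"Normalised citation score: {normalised_score:.2f}")
```

Note how the raw count of 10 only becomes meaningful relative to its field: the same count would yield a very different score against a high-citation field average, which is why the guidance warns against comparing raw counts across fields.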
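The advice to accompany point estimates with stability (confidence) intervals can likewise be sketched. One standard technique is a bootstrap interval: resample the data with replacement many times and report the spread of the resampled means rather than a bare point estimate. The citation counts below are invented for illustration.

```python
# Illustrative sketch: a bootstrap "stability interval" for a mean
# citation count, instead of reporting a bare point estimate.
# The sample data are hypothetical.

import random
from statistics import mean

random.seed(42)  # fixed seed, in the spirit of reproducible analysis

citations = [0, 1, 1, 2, 3, 3, 5, 8, 13, 40]  # hypothetical sample

point_estimate = mean(citations)

# Resample with replacement 10,000 times, recording each resample's mean.
boot_means = sorted(
    mean(random.choices(citations, k=len(citations)))
    for _ in range(10_000)
)

# Take the 2.5th and 97.5th percentiles as an approximate 95% interval.
lower = boot_means[int(0.025 * len(boot_means))]
upper = boot_means[int(0.975 * len(boot_means))]

print(f"Mean citations: {point_estimate:.1f} "
      f"(95% stability interval {lower:.1f} to {upper:.1f})")
```

With skewed data like citation counts, the interval is typically wide and asymmetric, which makes visible just how much uncertainty a single point estimate hides.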
Resources: Sector statements
- San Francisco Declaration on Research Assessment (DORA)
- DORA resources
- Overview of the UK Forum for Responsible Research Metrics
- The Metric Tide (2015 Independent Report)
- Wellcome’s Guidance for research organisations on how to implement responsible and fair approaches for research assessment
- Leiden Manifesto for Research Metrics
- UK Reproducibility Network (UKRN)’s Statement on Responsible Research Evaluation
- Plan S Principles (Principle 10)
- International Network of Research Management Societies (INORMS) SCOPE model
- UKRI’s Research England terms and conditions of grant (Item 45) (PDF)
- European University Association (EUA), Science Europe, and the European Commission agreement on reforming research assessment
- CESAER next generation metrics (white paper)
- World Conferences on Research Integrity Hong Kong principles.