Measure for Measure
The role of metrics in assessing research performance
Society for Scholarly Publishing - June 2013
Michael Habib, MSLS
Product Manager, Scopus
habib@elsevier.com
Twitter: @habib
http://orcid.org/0000-0002-8860-7565
1. What level am I assessing?
– Article, Journal, Researcher, Institution, etc.
2. What type of impact am I assessing?
– Research, Clinical, Social, Educational, etc.
3. What methods are available based on above?
– Metrics: Citation, Usage, Media, h-index, SNIP, SJR, etc.
– Qualitative: Peer-Review, etc.
Which metric should I use?
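Of the quantitative options listed above, the h-index is simple enough to sketch directly: it is the largest h such that a researcher has h papers with at least h citations each. A minimal illustrative implementation, not tied to any particular data source:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still supports an h-index of `rank`
        else:
            break
    return h

# Five papers with 25, 8, 5, 3, and 0 citations give an h-index of 3:
# three papers have at least 3 citations each.
print(h_index([25, 8, 5, 3, 0]))  # → 3
```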
Article-Level-Metrics Altmetrics
Image from: http://www.plosone.org/article/metrics/info%3Adoi%2F10.1371%2Fjournal.pone.0006022
APIs are available to easily + freely embed Scopus Cited-by counts on your article pages
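A sketch of what such an embed call might look like on the article-page side. The endpoint path and parameter names below are illustrative assumptions only; the authoritative details are in the Scopus API documentation at http://www.developers.elsevier.com/.

```python
from urllib.parse import urlencode

# Hypothetical endpoint shape for a per-article cited-by count lookup;
# consult http://www.developers.elsevier.com/ for the actual API.
BASE = "https://api.elsevier.com/content/abstract/citation-count"

def cited_by_url(doi, api_key):
    """Build the request URL for the cited-by count of one article (by DOI)."""
    return BASE + "?" + urlencode({"doi": doi, "apiKey": api_key})

print(cited_by_url("10.1371/journal.pone.0006022", "YOUR_KEY"))
```

An article page would issue this request (server-side or via a badge script) and render the returned count next to the article metadata.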
http://am.ascb.org/dora/
Background & approach
Who & when: 54,442 individuals were randomly selected from Scopus. They were approached to complete the study in October 2012. To ensure an unbiased response, Elsevier's name was only revealed at the end of the survey.
Responses: The online survey took around 15-20 minutes to complete. 3,090 respondents completed it, representing a response rate of 5.7%. Data has not been weighted. There was a representative response by country and discipline.
Statistical testing: Error margin ± 1.5%, at 90% confidence levels. When comparing the scores for the main group and sub-groups we have used a Z test of proportion to identify differences between the overall average and the sub-group (90% confidence levels), when there are 30 or more responses.
Adrian Mulligan, Gemma Deakin and Rebekah Dutton
Elsevier Research & Academic Relations
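The two statistical figures quoted above can be sketched in a few lines: the ±1.5% error margin is the worst-case half-width of a 90% confidence interval for a proportion with n = 3,090, and the subgroup comparison is a standard two-proportion Z test. The survey team's exact formulas are not given here, so treat this as an illustration:

```python
import math

Z90 = 1.645  # z value for a 90% confidence level

def margin_of_error(p, n, z=Z90):
    """Half-width of the confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

def z_test_proportions(p1, n1, p2, n2):
    """Two-proportion z statistic (subgroup vs. overall); |z| > 1.645
    indicates a significant difference at the 90% level."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Worst case (p = 0.5) for the full sample reproduces the quoted margin:
print(round(margin_of_error(0.5, 3090), 3))  # → 0.015, i.e. ±1.5%
```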
Most widely known by researchers

Metric                         Awareness
Impact Factor (n=2,520)*       82%
H-Index (n=1,335)              43%
Journal Usage Factor (n=309)   10%
Altmetrics (n=41)*             1%

Impact Factor is published by Thomson Reuters. Altmetrics were least well known.
Researcher perception of most useful
Q2 Which of these do you think are most useful at measuring research quality? (Select up to 3)

Metric                         Awareness (TOTAL, n=3,090)   % rating it among the most useful*
Impact factor (n=2,530)        82%                          64%
SNIP (n=51)                    2%                           29%
SJR (n=126)                    4%                           29%
Eigenfactor (n=285)            9%                           28%
h-index (n=1,335)              43%                          58%
Journal Usage Factor (n=309)   10%                          37%
F1000 (n=155)                  5%                           34%
Alt-metrics (n=41)             1%                           42%

* Only people who said they were aware of a particular metric in Q1 were given the opportunity to select that metric in Q2. See appendix for background and approach. Research by Elsevier Research & Academic Relations. Impact Factor is published by Thomson Reuters.
Generally, metrics with the highest awareness are
also considered to be the most useful
[Scatter plot: percentage of respondents aware of each metric (x-axis, 0-90%) against the percentage of aware respondents that chose the metric as one of the most useful (y-axis, 0-70%), with points for Impact factor, SNIP, SJR, Eigenfactor, h-index, Journal Usage Factor, F1000, and Alt-metrics, and a linear trendline with R² = 0.6972]

The trendline shows the linear trend for the relationship between awareness and usage of metrics.
Metrics above the line have lower levels of awareness, but are more likely to be rated as useful than the typical awareness-usage relationship.
Metrics below the line have higher levels of awareness, but are less likely to be rated as useful than the typical awareness-usage relationship.

*See appendix for background and approach. Research by Elsevier Research & Academic Relations. Impact Factor is published by Thomson Reuters.
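The R² behind that trendline is the squared correlation of a least-squares fit. A minimal sketch using the awareness and "most useful" percentages from the previous slide; the charted values are rounded, so this only approximates the slide's quoted 0.6972 rather than reproducing it:

```python
def r_squared(xs, ys):
    """R-squared of a simple linear fit = squared Pearson correlation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

# Awareness vs. "% rating it most useful" for the eight metrics (Q1/Q2 slide):
awareness  = [82, 2, 4, 9, 43, 10, 5, 1]
usefulness = [64, 29, 29, 28, 58, 37, 34, 42]
print(r_squared(awareness, usefulness))  # a moderately strong positive fit
```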
13
Assessing the usefulness of potential quality
metrics: by age

Significantdifferencebetween
subsetandtotal (subset higher)

Significantdifferencebetween
subsetandtotal (subset lower)
Under 36 (n=540) 36-45 (n=920) 46-55 (n=819) 56-65 (n=507) Over 65 (n=242)
TOTAL
(n=3,090)
Article
views/downloads (for
articles)
 43%
Citations from
materials thatarein
repositories
  43%
Share in social
network mentions (for
articles)
   16%
Number of readers
(for articles)  40%
Number of followers
(for researchers)   31%
Votes or ratings (for
articles)   24%
A metric that measures
the contribution an
individual makes to peer
review (for researchers)
 28%
A score basedon
reviewer assessment (for
articles)
 28%
Q3 Thinking about possible new measures of research productivity, how useful do you think the below would be in assessing the quality of a
researcher or a researcharticle?(By age) % Think it would be extremely/veryuseful
43%
49%
21%
42%
38%
35%
34%
33%
44%
45%
18%
41%
33%
24%
29%
29%
45%
41%
15%
39%
28%
22%
27%
27%
44%
41%
12%
41%
30%
22%
26%
27%
36%
37%
13%
35%
30%
19%
24%
27%
Assessing the usefulness of potential quality metrics: by region (1 of 2)
Q3 Thinking about possible new measures of research productivity, how useful do you think the below would be in assessing the quality of a researcher or a research article? (By region, slide 1 of 2) % who think it would be extremely/very useful

Metric                                               Africa    APAC      Eastern Europe   Latin America   TOTAL
                                                     (n=72)    (n=803)   (n=183)          (n=182)         (n=3,090)
Article views/downloads (for articles)               56%       50%       50%              50%             43%
Citations from materials that are in repositories    51%       55%       49%              49%             43%
Share in social network mentions (for articles)      26%       27%       19%              21%             16%
Number of readers (for articles)                     49%       46%       45%              45%             40%
Number of followers (for researchers)                36%       46%       41%              34%             31%
Votes or ratings (for articles)                      33%       29%       30%              24%             24%
A metric that measures the contribution an
individual makes to peer review (for researchers)    40%       35%       28%              32%             28%
A score based on reviewer assessment (for articles)  44%       36%       26%              35%             28%
Assessing the usefulness of potential quality metrics: by region (2 of 2)
Q3 Thinking about possible new measures of research productivity, how useful do you think the below would be in assessing the quality of a researcher or a research article? (By region, slide 2 of 2) % who think it would be extremely/very useful

Metric                                               Middle East   North America   Western Europe   TOTAL
                                                     (n=47)        (n=770)         (n=1,033)        (n=3,090)
Article views/downloads (for articles)               40%           41%             36%              43%
Citations from materials that are in repositories    40%           42%             32%              43%
Share in social network mentions (for articles)      19%           10%             11%              16%
Number of readers (for articles)                     43%           36%             36%              40%
Number of followers (for researchers)                32%           23%             23%              31%
Votes or ratings (for articles)                      28%           19%             22%              24%
A metric that measures the contribution an
individual makes to peer review (for researchers)    32%           26%             23%              28%
A score based on reviewer assessment (for articles)  34%           26%             22%              28%
“For publishers
Greatly reduce emphasis on the journal
impact factor as a promotional tool, ideally by ceasing to
promote the impact factor or by presenting the
metric in the context of a variety of journal-
based metrics … that provide a richer view of journal
performance.”
– from The San Francisco Declaration on Research Assessment (DORA) (http://am.ascb.org/dora/)
Advantages of SNIP & SJR
• Transparency: calculated by independent third parties; freely and publicly accessible at www.journalmetrics.com
• Subject field normalization: allows comparison independent of the journals' subject classification. Reflects the most current journal scopes, thereby taking ongoing changes into account
• 3-year citation window: demonstrably the fairest compromise
• Manipulation-resistant: article-type consistency; only citations to and from articles, reviews, and conference papers are considered
• Breadth of coverage: Scopus has over 20,500 sources: 19,500 journals as well as trade publications, proceedings, and book series
• Metrics based on Scopus.com: underlying database available for transparency; titles indexed based on transparent criteria by an independent advisory board

CONS:
• More complex methodology
• Do not take the amount of review content into account
• Low awareness
Modified SNIP
• Refined metric calculation, better corrects for field differences
• Outlier scores are closer to average
• Readily understandable scoring scale with an average of 1 for easy comparison

Modified SJR
• More prestigious nature of citations that come from within the same, or a closely related, field
• Overcomes the tendency for prestige scores to decrease as the quantity of journals increases
• Readily understandable scoring scale with an average of 1 for easy comparison

http://www.journalmetrics.com/
SNIP: Source-normalized impact per paper

SNIP combines a journal's raw impact per paper (peer-reviewed papers only) with the citation potential of its subject field. Citation potential reflects a field's frequency and immediacy of citation, the database coverage of the field, and the journal's scope and focus, and is measured relative to the database median.
SNIP: Molecular Biology vs. Mathematics

Journal                    RIP    Cit. Pot.   SNIP (RIP/Cit. Pot.)
Inventiones Mathematicae   1.5    0.4         3.8
Molecular Cell             13.0   3.2         4.0
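The arithmetic behind the table is just the ratio in the SNIP column header, RIP divided by citation potential; normalizing by field is what lets a mathematics journal and a molecular-biology journal land on comparable scores despite an almost ninefold difference in raw impact:

```python
def snip(rip, citation_potential):
    """Source-normalized impact per paper: raw impact per paper (RIP)
    divided by the citation potential of the journal's subject field."""
    return rip / citation_potential

# Inventiones Mathematicae: 1.5 / 0.4 = 3.75, shown as 3.8 on the slide.
print(round(snip(1.5, 0.4), 2))
# Molecular Cell: 13.0 / 3.2 = 4.06; the slide reports 4.0, presumably
# computed from unrounded inputs.
print(round(snip(13.0, 3.2), 2))
```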
APIs are available to easily + freely embed these
metrics on your journal homepages
Snowball Metrics …
• Support universities’ strategic decision
making processes
• Aim to encompass the entire scope of key
research and enterprise activities
• What is the driver? – universities’ metrics tend to
suit their data and priorities. With Snowball, they
agree to a single method so that they can
benchmark themselves against their peers
• What is special about them?
– Owned by distinguished universities, including Oxford and Cambridge, and not imposed by e.g. funders. Universities taking control of their own destiny!
– Tried and tested methodologies that are available free-of-charge to the higher education sector
– Academia-industry collaboration
Vision for Snowball Metrics
Snowball Metrics drive quality and efficiency across higher
education’s research and enterprise activities, regardless of
system and supplier, since they are the preferred standards used
by research-intensive universities to view their own performance
within a global context
Snowball Metrics Project Partners
Snowball Metrics Recipe Book
Agreed and tested methodologies for new Snowball
Metrics, and versions of existing Snowball Metrics, are
and will continue to be shared free-of-charge.
None of the project partners will at any stage
apply any charges for the methodologies.
Any organisation can use these methodologies for
their own purposes, public service or commercial.
(Extracts from Statement of intent, October 2012)
www.snowballmetrics.com/metrics
In summary
1. Choose methods + metrics appropriate to the level and impact type being assessed (DORA)
2. Don't confuse level with type (ALMs ≠ altmetrics)
Free + easy to embed Scopus Cited-by counts on article pages: http://www.developers.elsevier.com/
3. Awareness of metrics correlates with acceptance; raising awareness matters
4. APAC + younger researchers are open to new metrics
5. Don't use just one metric; promote a variety of metrics
Free + easy to embed SNIP/SJR on journal homepages: http://www.journalmetrics.com/
6. Choose transparent and standard methods + metrics
Learn more about Snowball Metrics: http://www.snowballmetrics.com/
Michael Habib, MSLS
Product Manager, Scopus
habib@elsevier.com
Twitter: @habib
http://orcid.org/0000-0002-8860-7565
http://www.mendeley.com/profiles/michael-habib/
Thank you!