Title: Tools for Effective Evaluation of Science: InCites
Slide 1: Tools for Effective Evaluation of Science: InCites
David Horky, Country Manager, Central and Eastern Europe
david.horky_at_thomsonreuters.com
Slide 2: AGENDA
- Challenges of evaluating science and research
- Bibliometrics as a way of effective evaluation of science
- Normalised metrics
Slide 3: Who are the stakeholders in research evaluation?
Slide 4: Overview
- What problems are people trying to solve?
  - Aggregate, track, and analyze output
  - Create systems to do this
  - Develop reports
  - Publicize their work
  - Enable researchers to do deep analyses
- What do people do today?
  - Disparate approaches
  - Ad hoc
Slide 5: Research Analytics: Source and Foundation
[Diagram: Web of Science data passes through Thomson Reuters expertise and processing (data cleansing, standardization, address unification, normalization and baselines) to produce the Research Analytics resources.]
- Our resources are based on the Web of Science, the gold standard for research evaluation used by universities and governments around the world. The Web of Science is the best resource available because it has:
  - Authoritative and trustworthy content
  - A consistent, well-understood collection with 25 years of archive
  - Multidisciplinary coverage, with no emphasis on any subject area
- Thomson Reuters presents not just citation and record counts, but the meaning behind the numbers. Global averages and percentiles enable our customers to benchmark, make comparisons, and ultimately make the right decisions.
Slide 6: Web of Science: the STANDARD for bibliometrics
- Multidisciplinary coverage
  - enables analysis of the whole context of scientific research
- Multiyear coverage
  - enables analysis of the history and development of the sciences
- Cover-to-cover policy
  - enables following the flow of a topic regardless of communication type
- ALL authors, ALL addresses
  - enables analysis by author name and by institution
- ALL cited references
  - enables analyses of literature that is not indexed

The CONSISTENCY enables large-scale counting and reliable analyses.
Slide 7: Major assessments rely on Web of Science data
- US National Science Foundation: Science & Engineering Indicators
- European Commission / European Union: Science & Technology Indicators
- US National Research Council: Doctoral Program Ranking
- Also governments in France, Australia, Italy, Japan, the UK, Portugal, Norway, Spain, Belgium, South Korea, Canada, etc., to shape higher education policy.
Slide 8: The External vs Internal View of Research
- Research Performance Profiles (Indicators)
  - Macro analysis
  - Compare your institution / country to your peers
  - All institutions within a region
  - All nations/territories of the world
- Global Comparisons (Citation Report)
  - Customized data set
  - Measure and compare the internal entities at your institution
  - Article, author, department, and collaboration analysis
  - Typically all the records from an institution
Slide 9: Research Performance Profiles
Slide 10: Research Performance Profiles: Institutional Comparison
- Time trends
- Output vs Performance
- Where do we stand?
Slide 11: Research Performance Profiles
Compare institutions in a particular field
Slide 12: Research Performance Profiles
What are the overall trends for our fields?
Slide 13: Global Comparisons
Slide 14: CHALLENGES OF RESEARCH EVALUATION USING CITATION METRICS
- Citation behavior is very different for different disciplines
  - Life Sciences: many articles, highly cited, quickly tail off
  - Mathematics: low citations, but articles continue to be cited for many years
  - How can I account for these differences?
- Older material is cited more, so how can I account for the different ages of publications that I wish to compare?
- How can I better compare the performance of a researcher with a long publication history to a researcher with a short publication history?
- Citation distribution is very uneven
  - Only a small number of publications are highly cited
  - Most publications have few or no citations
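The unevenness matters in practice. With made-up citation counts for ten hypothetical papers (not data from the slides), a single highly cited outlier drags the mean far above what a typical paper achieves, which is one reason percentile- and baseline-based indicators are preferred over raw averages:

```python
# Illustrative only: a made-up, heavily skewed set of citation
# counts for ten papers, shaped like real citation distributions.
counts = [0, 0, 1, 1, 2, 2, 3, 4, 7, 80]

mean = sum(counts) / len(counts)           # dominated by the one outlier
median = sorted(counts)[len(counts) // 2]  # what a "typical" paper gets

print(mean)    # 10.0
print(median)  # 2
```

The mean suggests a typical paper earns ten citations; the median shows it earns two.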
Slide 15: Thomson Reuters value-added metrics
- Basic bibliographic information about the article (including the field)
- Number of citations
- The Journal Impact Factor from the latest edition of the Journal Citation Reports
Slide 16: Thomson Reuters value-added metrics
2nd-generation citation data: the articles that have cited the citing articles.
Slide 17: Thomson Reuters value-added metrics
Expected performance metrics. We calculate the number of citations a typical article would be expected to receive. This is calculated for each journal (Journal Expected Citations, JXC) and for each category (Category Expected Citations, CXC); these metrics are also normalized for year and document type.
Slide 18: Thomson Reuters value-added metrics
JXC Ratio: 157 / 45.09 = 3.48. CXC Ratio: 157 / 3.66 = 42.90.
Although it is not displayed on this screen, we also calculate the ratio between actual and expected performance. This provides meaning and understanding of the citation counts and is a normalized performance measure.
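The ratio itself is plain division of actual by expected citations. A minimal sketch using the figures shown on the slide (157 actual citations, journal baseline 45.09, category baseline 3.66); the function name is illustrative, not an InCites API:

```python
def citation_ratio(actual: float, expected: float) -> float:
    """Actual citations divided by the expected (baseline) count.
    A ratio above 1.0 means the paper outperforms its baseline."""
    return round(actual / expected, 2)

# Figures from the slide: 157 citations, JXC 45.09, CXC 3.66.
jxc_ratio = citation_ratio(157, 45.09)
cxc_ratio = citation_ratio(157, 3.66)

print(jxc_ratio)  # 3.48
print(cxc_ratio)  # 42.9
```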
Slide 19: Thomson Reuters value-added metrics
The percentile, as computed against the set of documents in the same field and the same year. This paper is in the top 0.2% of all papers in General Internal Medicine for the year 2007.
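The slides do not show how InCites computes the percentile internally; one common definition, sketched here with a made-up field/year cohort, is the share of same-field, same-year papers with strictly fewer citations:

```python
def citation_percentile(paper_citations: int, field_year_counts: list[int]) -> float:
    """Percentage of papers in the same field and year with strictly
    fewer citations than this paper (higher is better). An assumed
    definition, not necessarily the exact InCites method."""
    below = sum(1 for c in field_year_counts if c < paper_citations)
    return 100.0 * below / len(field_year_counts)

# Made-up citation counts for a small field/year cohort.
cohort = [0, 1, 1, 2, 3, 5, 8, 13, 40, 157]

print(citation_percentile(157, cohort))  # 90.0
```

A paper at the 99.8th percentile of its field and year would be in the top 0.2%, as on the slide.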
Slide 20: Citing articles
Link to the Web of Science
Additional Information
Slide 21: InCites Global Comparisons
Many pre-defined reports are presented for immediate use.
- Citation Metrics provide fundamental information on the papers within a dataset and their collective citation influence.
- Disciplinarity Metrics: the Disciplinarity Index provides a measure of the concentration of a set of source articles across a set of categories; the Interdisciplinarity Index communicates the extent to which a collection of source articles is or is not diversely multidisciplinary in nature.
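The slides do not give the formula behind the Disciplinarity Index. A Herfindahl-style concentration measure is one common way to express how concentrated a paper set is across categories; this sketch uses that assumption, and the function name and formula are illustrative, not InCites' actual definition:

```python
def concentration_index(category_counts: dict[str, int]) -> float:
    """Herfindahl-style concentration of source articles across
    categories: 1.0 when every paper falls in one category,
    approaching 1/N when spread evenly over N categories.
    (Assumed formula for illustration only.)"""
    total = sum(category_counts.values())
    return sum((n / total) ** 2 for n in category_counts.values())

focused = {"Mathematics": 100}
spread = {"Mathematics": 25, "Physics": 25, "Biology": 25, "Chemistry": 25}

print(concentration_index(focused))  # 1.0
print(concentration_index(spread))   # 0.25
```

An interdisciplinarity measure would then reward the opposite: low concentration across many categories.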
Slide 22: InCites Global Comparisons
Collaboration Metrics provide fundamental information for the authors, institutions, and countries represented within the source-article dataset.
Slide 23: View the data in time series to understand trends.
Slide 24: InCites Global Comparisons
The Source Articles Listing and additional ranking reports associated with source articles enable detailed examination and evaluation of the papers, authors, institutions, etc. that have produced the source articles.
Slide 25: InCites Global Comparisons
Slide 26: InCites Global Comparisons
This is a useful tool for making direct comparisons of entities such as two individual authors. The normalized data and multiple indicators make for a comprehensive and accurate evaluation, and the graphical summaries provide instant understanding.
Slide 27: Collaborations
Are these collaborators contributing to our performance? We can make direct comparisons at a glance.
Slide 28: InCites Citation Report
The Citing Articles Listing and associated reports provide unique insight into the body of published papers, authors, institutions, and countries influenced by the source articles.
Slide 29: Summary
- Authoritative, consistent data from the world's leading provider of research-evaluation solutions.
- A tailored data set, plus the ability to create your own sub-sets and associated metrics, provides specificity: answers to questions at a local level.
- Context around the data, such as baselines and percentiles, gives the metrics genuine meaning and comparative value.
- Standard report tools generate consistent results and can be combined with your own datasets, such as funding data.
- A web-based tool, accessible by any number of selected users within your institution. Centralized access ensures everyone gets the same data.
Slide 30: Thank you! More info: http://in-cites.com/rsg/
- David Horky
- Country Manager, Central and Eastern Europe
- david.horky_at_thomsonreuters.com