Bibliometrics are quantitative measures of influence or interest in academic research. They can help identify key authors and journals within a particular area, and they can also guide searching within the academic literature. The main bibliometric databases are Web of Science and Scopus, along with the associated tool SciVal. The 2014 REF considered bibliometrics and, while they are not a replacement for peer review, future REF exercises plan to take them into consideration.
Two 'golden rules' are to always use more than one metric (e.g. do not use Impact Factors alone to assess the quality of journals) and to judge the relevance of metrics by the context in which they were generated (e.g. metrics are less meaningful in some subject areas, such as the arts, because the databases mentioned above do not fully cover these areas).
This page aims to answer some of the most frequently asked questions the Library receives about metrics.
- How can I look up the metrics for a journal? (e.g. CiteScore, SJR and Impact Factors)
- How can I look up the metrics for an article? (e.g. citations)
- How can I compare my department or research group with an equivalent research group or department at another university?
- How can I investigate the effect of adding a particular new academic to my research group or department?
If you need further help, please contact the Research Outputs team (firstname.lastname@example.org) who are based in the Library. Or you may like to come along to the bibliometrics workshop on the Researcher Development Programme.
Bibliometrics include a range of things, some of which you may already be familiar with. For example, the number of times a journal article has been cited, or the Impact Factor of a journal.
They indicate the degree of influence that a particular journal or article has. They can also relate to a specific academic.
There are a couple of things to watch out for:-
- Bibliometrics with the same name can vary depending on the database they were calculated from. E.g. citation counts can be calculated from both Web of Science and Scopus, but results may differ. This is because different databases cover slightly different sets of publications. Web of Science and Scopus both publish a list of publications and sources their databases cover: Web of Science database coverage, Scopus database coverage.
- Some bibliometrics are specific to a particular database. E.g. Impact Factors are produced by Thomson Reuters, who make Web of Science.
The metrics in Scopus, Web of Science and SciVal are derived from the details you add to your publications. These databases run a complex 'matching algorithm' to identify which articles 'belong' to you. There are a number of things you should do to ensure this algorithm is accurate.
- Please ensure that you always put the full affiliation on articles that you publish, i.e. please use "University of Portsmouth", as opposed to the department in which you are based.
- Please get an ORCID iD, as this greatly helps to link your publications to you correctly.
Please check your details on Web of Science and Scopus to ensure they are correct. If you notice errors, then use the contact forms on the Web of Science and Scopus websites to request that they make any necessary corrections. Click here for further instructions.
Please be aware that if you are working in a subject area that the main databases (i.e. Web of Science and Scopus) do not fully cover, then it follows that the metrics about you on these systems will not be accurate. For example, academics working in the arts and some humanities subject areas may find that these databases do not list all of their publications. To see if this applies to you, details of the coverage can be found here: Web of Science database coverage and Scopus database coverage.
Finally, it is worth mentioning that having a profile on Google scholar can also be a valuable way of promoting your work. However, please be sure to either set up your profile using a personal email address, or if you leave Portsmouth make sure that you change the email address to your new institution before your UoP computing account is removed.
Metrics about an individual academic include the number of articles they have published, their total number of citations, their number of co-authors and so on. Each author has a profile on Scopus, which displays these metrics (above). To view yours, log into Scopus and search for your name.
A metric that is often discussed is an academic's H-index. Their H-index is the largest number (n) of papers they have published that have each received n or more citations. For example, an academic's H-index is 7 if they have published 7 papers that have 7 or more citations each, but not 8 papers with 8 or more citations each.
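To make the definition concrete, here is a minimal sketch of the H-index calculation in Python. This is purely illustrative arithmetic on a hypothetical list of citation counts; it is not how Scopus or Web of Science implement the metric, since they must first decide which publications and citations belong to the author.

```python
def h_index(citations):
    """Return the H-index for a list of per-paper citation counts.

    The H-index is the largest n such that n of the papers
    have at least n citations each.
    """
    # Rank papers from most-cited to least-cited.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # at least `rank` papers have `rank` or more citations
        else:
            break
    return h

# Hypothetical academic: seven papers with 7+ citations each,
# plus two papers with fewer, giving an H-index of 7.
print(h_index([10, 9, 8, 8, 7, 7, 7, 3, 1]))  # 7
```

Note that adding a highly cited paper raises the H-index by at most one, which is why the metric rewards sustained output rather than a single popular article.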
Finding my own (or another academic's) H-index:-
- Using Scopus: Log into Scopus and search for their name. After clicking on their name, you should see their h-index (circled above).
- Using Web of Science: Log into Web of Science and search for their name, while restricting the results to the universities they have worked at. Then click the Create Citation Report button in the top-right, manually remove any publications they did not write, and finally look at the h-index score in the top-right. (Or, if an author has a ResearcherID or an ORCID, you can search for this instead of their name. This has the advantage that it will not include publications that do not belong to them.)
Things to be aware of:-
- H-indexes cannot be used to compare authors working in different subject areas. This is because they are based on the number of citations an article receives, and citation conventions differ considerably between subject areas.
- H-indexes are derived from bibliographic databases, such as Web of Science or Scopus. There is much overlap between these databases, but the publications they cover do differ to some degree (see above). This means that an author's H-index can vary depending on which database it was derived from, so when quoting an H-index it's important to also say which database it came from.
These metrics give an indication of the 'quality' of a particular journal. There are a number of these metrics, which are calculated in slightly different ways. A question that's often asked is 'which metric should I use?' This is quite difficult to answer, as there are pros and cons to each. While the Impact Factor is sometimes seen as the 'baseline' standard metric, to gain a full picture it's recommended you explore a range of metrics.
Journal metrics derived from the Scopus database:-
- Source-Normalized Impact per Paper (SNIP): "Source Normalized Impact per Paper (SNIP) measures actual citations received relative to citations expected for the serial's [journal's] subject field."
- CiteScore: "CiteScore measures average citations received per document published in the serial [journal]." This metric is not subject-normalised, so you cannot use this metric to compare journals across different subject areas.
- SCImago Journal Rank (SJR): "SCImago Journal Rank (SJR) measures weighted citations received by the serial [journal]. Citation weighting depends on subject field and prestige (SJR) of the citing serial."
- How to look up these journal metrics and use them to compare journals. (Further detailed information about these metrics can be found here).
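The average-citations-per-document idea behind CiteScore can be sketched as simple arithmetic. The exact citation window and the document types counted are defined by Scopus, so the figures below are hypothetical and only illustrate the shape of the calculation.

```python
def average_citations_per_document(total_citations, num_documents):
    """CiteScore-style average: citations received by a journal's
    documents over a window, divided by the number of documents
    published in that window (window rules are defined by Scopus)."""
    return total_citations / num_documents

# Hypothetical journal: 900 citations to 300 documents in the window.
print(average_citations_per_document(900, 300))  # 3.0
```

Because this is a plain average with no subject normalisation, a journal in a heavily citing field will score higher than an equally good journal in a lightly citing field, which is why CiteScore cannot be compared across subject areas.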
Journal metrics derived from the Web of Science database:-
- Journal impact factors (IF): Allow you to judge the relative importance or impact of a journal. They show the frequency with which the journal's papers are cited. This metric is not subject-normalised, so you cannot use this metric to compare journals across different subject areas.
- Eigenfactor metrics: These use a similar methodology to SJR (above), but they are based on Web of Science data. You cannot use this metric to compare journals across different subject areas.
In addition to these metrics, researchers in business use the ABS Academic Journal Guide, which provides a ranked list of journals.
You may also like to look at the information on how to tell which journals to avoid.
You can also see the SNIP and SJR metrics in Pure, using the Metrics tab as per the screenshot below.
The citation count is the number of times an article has been cited by other academic research. To find this in Web of Science please see the Web of Science library guide. In Scopus the citation count is shown on the right of the article (below). Google Scholar also gives a citation count, though it's important to be aware of the limitations outlined below.
You can do this using SciVal. SciVal draws data from Scopus and presents it in a way that allows you to easily get an overview of Portsmouth, and also compare Portsmouth to other universities (explained below).
All staff and students can access SciVal - login instructions.
When you login, you'll see three main tabs: Overview, Benchmarking and Collaboration.
The Overview tab gives a summary of a particular university, department or research group (see above). You can look at Portsmouth or any other university.
The Benchmarking tab allows you to compare universities, departments and research groups (see above).
The Collaboration tab allows you to explore who Portsmouth (or any other university or business) has published with (see above).
The Quick Guide to SciVal gives more detailed information about what SciVal can do.
You can do this using the Benchmarking tab in SciVal (see above). To access the Benchmarking tab -
- Click on the Benchmarking tab.
- Select the universities you want to compare Portsmouth to from the left-hand menu.
- Then select the metrics you want to compare on, by clicking the x and y axis buttons if you are in the Chart view, or the Metric 1, Metric 2 etc. buttons if you are in the Table view.
- Use the Export button (top-right) if you need to download the data into an Excel spreadsheet etc.
How can I compare my department or research group with an equivalent research group or department at another university?
To do this, you will need to go into the My SciVal tab (top-right of the screen) and create groups containing the relevant researchers from Portsmouth and the researchers from the other university.
You can then use the Benchmarking tab to make comparisons.
For more information, please contact email@example.com
A powerful feature of SciVal is the way in which its Collaboration tab (see above) allows you to identify and analyse existing and potential collaboration opportunities based on publication output and citation impact. You can do this using the Collaboration tab -
- Click on the Collaboration tab.
- Choose between either the Map or the Table view.
- In the left-hand menu, select the main university you want to focus on (e.g. Portsmouth).
- Under where it says "Institutions collaborating with [for example] the University of Portsmouth", use the drop-down menu to select the country you want to see the collaborators from. (Or if you are in the map view, just click on the map).
- Use the Sectors drop-down menu to limit results to academic, government, corporate or medical collaborators.
You can do this in SciVal using the Research Areas facility. To do this -
- Click on the Overview tab.
- Click on the Research Areas in the left-hand menu.
- Click on the plus sign to define a new research area.
- Follow the instructions on the screen.
How can I investigate the effect of adding a particular new academic to my research group or department?
To do this, you will need to create a Research Group containing your existing academics and then add the new academic to that group. You can do this in the My SciVal tab (above).
Once you have done this, you can then see the effect adding this individual has on the metrics using the Benchmarking tab.
For more information, please contact firstname.lastname@example.org
Yes - Pure automatically pulls in metrics from Scopus and attaches them to articles in Pure, for example the number of citations received and the metrics for the journal.
To view these metrics in Pure, open the output record and click on metrics in the left-hand menu (below).
These systems have different purposes. Pure is an internal University of Portsmouth system, which holds details (and increasingly the full-text) of the publications produced by Portsmouth academics, along with detailed information about other aspects of their research, such as funding, impact, press coverage etc. The purpose of Pure is to manage and promote the research activities taking place at Portsmouth.
Conversely, the main bibliometric databases (i.e. Web of Science, Scopus and SciVal) are international databases, which hold data about publications produced by academics across the world. Unlike Pure, they do not cover the other aspects of the research life-cycle, nor do they hold a copy of the full-text.
Therefore, each university now has its own Pure (or equivalent system) and also subscriptions to Web of Science, Scopus and SciVal. However, to integrate the systems, we now pull some metrics into Pure from Scopus and the Web of Science (see above).
This raises the question of whether we actually need to look at Scopus, Web of Science and SciVal directly or whether we can just look at the metrics via Pure? The answer is that if you just want a quick look at specific metrics on a particular article then you can look at them via Pure (see above), but if you want to explore the metrics in any depth (e.g. to answer the questions covered on this page) then you do need to go into Scopus/Web of Science/SciVal themselves.
Whereas bibliometrics look at citation counts, Impact Factors, h-indexes etc., altmetrics look at things like mentions in the news, on Twitter and blogs, and social media traffic across the world. Altmetrics is the name of the overall topic, though one of the main companies is called Altmetric.com, and another is Impact Story.
The library has an account with Altmetric.com. If you would like to know the altmetrics for a particular article then please contact email@example.com (Research Outputs Manager).
This demo page on the PLOS mashup site shows altmetrics in action. Click on the coloured circle icon next to the title of a paper. Altmetrics are increasingly being used as an indicator of research impact. However, they should be treated with caution. They really measure the amount of 'attention' being paid to a piece of research, but they do not show whether this attention is positive or negative. So altmetrics should be considered as additional to bibliometrics, rather than a replacement.
When explaining bibliometrics, a common question is how Google Scholar fits in. There is nothing wrong with using Google Scholar to find research, but it's useful to know its limitations.
Unlike Web of Science, Google Scholar does not disclose how it generates its search results, so you need to judge the validity of the sources for yourself. This also means that you don't know whether the sources included in Google Scholar's bibliometrics (e.g. the citation count) are of high enough quality. As a result, the same bibliometric is often higher when generated from Google Scholar than from Web of Science or Scopus.
Also, bibliometrics aside, while Google Scholar searches some of the resources that the Library subscribes to, it does not cover them all, so relying on Google Scholar alone could mean you miss out on things. Google Scholar also does not offer as many options for refining your search as Web of Science (screen shot below).