Bibliometrics and evaluation of research
Bibliometrics is the quantitative measurement of scientific publications. It is increasingly used in the evaluation of research and the allocation of research funds. Many of the international university rankings that have attracted attention in recent years are at least partly built on bibliometric indicators.
At LUSEM, research funds are allocated based on publication analyses. Research funders also increasingly demand bibliometric statistics from researchers in their calls for research grants. On this page you will find useful links and information about journal ranking, the H-index and citation analysis.
Impact Factor measures the average number of citations received by the articles in a journal. The citation data comes from a citation database.
The number of citations in the database is divided by the number of citable items in the journal. In InCites Journal Citation Reports the measure is called Journal Impact Factor (JIF) and is based on data from Web of Science. SCImago Journal & Country Rank uses data from Scopus; its measure is called the SCImago Journal Rank Indicator (SJR), which also takes the impact of the journals in which the citations appear into account.
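The calculation itself can be sketched in a few lines. This is an illustrative example only: the figures are invented, and real values come from a citation database such as Web of Science.

```python
# Illustrative sketch of an Impact Factor calculation: citations
# received by a journal divided by its number of citable items.

def journal_impact_factor(citations, citable_items):
    """Citations from the database divided by the journal's citable items."""
    return citations / citable_items

# Hypothetical journal: 300 citations to 150 citable items.
jif = journal_impact_factor(300, 150)
print(jif)  # 2.0
```
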
The total number of citing articles is not necessarily included in Web of Science or Scopus, but no other databases can present more accurate and transparent data. Journal Impact Factor (JIF) and SCImago Journal Rank Indicator (SJR) values from different fields cannot be compared.
Source Normalized Impact per Paper (SNIP) is used by the Scopus database. It measures a source's contextual citation impact by weighting citations based on the total number of citations in a subject field, which allows direct comparison of sources in different subject fields.
The citation potential of a source's subject field is the average number of cited references per document citing that source. It represents the likelihood of documents in a particular field being cited. A source in a field with high citation potential tends to have a high impact per paper.
Citation potential is important because it accounts for the fact that typical citation counts vary widely between research disciplines. For example, they tend to be higher in life sciences than in mathematics or social sciences. If papers in one subject field contain an average of 40 cited references while those in another contain an average of 10, then the former field has a citation potential that is 4 times higher than that of the latter.
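The comparison in the example above can be sketched directly. The field labels and reference counts below are hypothetical; only the 40-to-10 ratio is taken from the text.

```python
# Sketch of the citation-potential comparison: the average number of
# cited references per citing document, compared across two fields.

def citation_potential(references_per_citing_doc):
    """Average number of cited references per document citing a source."""
    return sum(references_per_citing_doc) / len(references_per_citing_doc)

field_a = [42, 38, 40]   # e.g. a life-sciences-like field
field_b = [9, 11, 10]    # e.g. a mathematics-like field

ratio = citation_potential(field_a) / citation_potential(field_b)
print(ratio)  # 4.0 -- field A's citation potential is 4 times higher
```
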
The Norwegian Register for Scientific Journals, Series and Publishers (Register over vitenskapelige publiseringskanaler) is commonly known as the Norwegian list. It largely covers all subjects and publishers that are relevant for research conducted at LUSEM. The list is administered by NSD (Norsk senter for forskningsdata / Norwegian Centre for Research Data), which is part of the Norwegian Ministry of Education and Research.
Each year, the LUSEM management requests information about research output from LUCRIS. Based on publishing activities, funds from Lund University are distributed to the departments. Points are given according to publication type and ranking in the Norwegian list.
| Level, Norwegian system | 1 | 2 |
| --- | --- | --- |
| Book chapter (publisher level) | 0.7 | 1 |
| Editorship (publisher level) | 0.7 | 1 |
| Monograph (publisher level) | 5 | 8 |
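As a minimal sketch, the point allocation in the table above can be expressed as a lookup. The function name and structure are illustrative, not LUSEM's actual allocation system, and only the publication types listed in the table are included.

```python
# Points per publication type and Norwegian-list level, as in the
# table above (Level 1 / Level 2).
POINTS = {
    "book chapter": {1: 0.7, 2: 1},
    "editorship":   {1: 0.7, 2: 1},
    "monograph":    {1: 5,   2: 8},
}

def publication_points(pub_type, norwegian_level):
    """Points awarded for one publication, by type and level."""
    return POINTS[pub_type][norwegian_level]

print(publication_points("monograph", 2))  # 8
```
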
A LUCRIS registration is only counted if the publication is associated with your work at LUSEM. The affiliation stated in the publication determines where you are considered active. Example: if you work at a LUSEM department but the published article indicates that you work only at a research institute, no funds will be distributed to your department.
The following minimum level must be achieved for journals, series, or publishers to be included in the Norwegian list:
- Established procedures for external peer review
- An academic editorial board (or equivalent) consisting primarily of researchers from universities, research institutes etc.
- International or national authorship
If the minimum level is reached, the publication channel is placed at Level 1. Level 2 indicates that the channel holds a high international standard and represents the top 20 percent of the total number of channels. As a researcher, you can nominate journals and publishers to the list. The ranking is revised annually, and the list is evaluated by Norwegian scientific experts.
The ABS list is the Chartered Association of Business Schools' Academic Journal Guide. It focuses on journals in business, management and economics, ranking them on a scale of 1-4 based upon peer review, editorial and expert judgements following the evaluation of publications, informed by statistical information relating to citations.
To access the list, you must register online at the Chartered Association of Business Schools website. A new list for 2021 is in progress.
The H-index combines the number of articles a researcher has published with the number of citations per article.
This is best explained with an example: a researcher has published nine articles, six of which have been cited at least six times each, giving an H-index of 6. Even if the 6th article had been cited more times, the H-index would still be 6, because the 7th article has fewer than 7 citations.
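The logic can be sketched in a few lines: the H-index is the largest h such that the researcher has h articles with at least h citations each. The citation counts below are invented to match a nine-article example with H-index 6.

```python
# Compute the H-index from a list of per-article citation counts.

def h_index(citations):
    """Largest h such that h articles have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Nine articles; the 7th most cited has fewer than 7 citations,
# so the H-index stops at 6.
print(h_index([25, 18, 12, 9, 8, 6, 3, 2, 1]))  # 6
```

The same function gives a journal's h5-index if it is fed the citation counts of the journal's articles from the last five complete years.
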
You can use Scopus and Web of Science to find your H-index. Not all documents citing an article are necessarily included in these databases. Google Scholar, on the other hand, includes citing master's theses and other sources not conventionally counted, which makes a Google Scholar H-index higher and less trustworthy.
The H-index logic is also used for journals. For example, Google Scholar Metrics presents the h5-index, which is the H-index for articles published in the last five complete years. If a journal has an h5-index of 20, its 20th most cited article from the last five years has been cited at least 20 times.
Google Scholar Metrics
Scopus is the largest abstract and citation database of peer-reviewed literature. Compared to Web of Science, Scopus has broader coverage, but Web of Science is more complete for citations prior to 1996. Scopus also covers articles in press. While Web of Science is somewhat more selective than Scopus, Google Scholar contains material that has not been quality controlled at all.
Because no citation database is complete, evaluations should only be made at group level.