Bibliometrics: evaluating research via publications

Bibliometric methods can be used to assess scientific publications and their authors.

Scientific publishing can be evaluated both qualitatively and quantitatively, and the two approaches complement each other. In qualitative evaluation, the focus is above all on the publication's factual content and its relevance to the field of science in question, so the evaluators need experience and expertise in that area. Peer review before publication is one form of qualitative evaluation.

In quantitative evaluation, in other words bibliometrics, various indicators are derived from publications using mathematical and statistical methods. These indicators describe productivity, impact and quality in publishing. Indicators are used in research evaluation, for instance in the State of Scientific Research in Finland reports and in university rankings.
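To make the idea of an indicator concrete, the sketch below computes the h-index, one widely used author-level impact indicator: the largest number h such that the author has at least h publications with at least h citations each. The citation counts and the function name are invented for illustration, not taken from any particular data source.

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h
    publications have at least h citations each."""
    # Sort citation counts from highest to lowest.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # still at least `rank` papers with >= rank citations
        else:
            break
    return h

# Hypothetical citation counts for one author's publications.
example_citations = [25, 17, 12, 9, 7, 6, 3, 1, 0]
print(h_index(example_citations))  # -> 6
```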

The development of bibliometrics began in the first decades of the 20th century, at a time when scientific research activities were becoming established. There was a desire to analyse and model the development of science itself. Because research findings appear above all as publications, it was thought that analysing publications could also yield information about the development of science. At this stage, the interest in publications was purely academic.

For a long time bibliometrics kept a low profile within researchers' circles, but in the 2000s it re-emerged. There was a wish to base academic ranking systems, scientific recruitment, tenure track arrangements, evaluations of individuals, funding decisions and the like on "objective" information, and the possibilities of bibliometrics were then recognised. Sources of bibliometric information had become easy to use, and indicators could be produced quickly and in large numbers.

Nevertheless, indicators are only numbers, and they must be interpreted in order to be understood. This is often difficult. What counts as "small" or "large" in a given field of science, and who defines it? Who interprets these numbers in the first place, and for what purpose? An individual scientist applying for funding? The faculty administration preparing for outcome evaluation meetings? The university management working out the worldwide ranking of their institution?

Using bibliometrics to rank different fields of science against one another is perilous. Differences between fields are explained, among other things, by their different publishing and citation conventions. The main rule is that a bibliometric analysis is valid only within a single field of science, as the sketch below illustrates. The "Leiden Manifesto for Research Metrics" sets out ten principles that should be followed when research is evaluated with bibliometric methods.
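The sketch below shows one common way of handling field differences: dividing a publication's citation count by the average citation count of its own field, in the spirit of field-normalized indicators. The field averages, titles and citation counts are all invented for illustration.

```python
# A minimal sketch of field normalization, assuming invented field averages:
# each publication's citation count is divided by the average citation count
# of its own field, so that a value around 1.0 means "about average for the field".

field_average_citations = {
    "cell biology": 30.0,   # hypothetical average for a high-citation field
    "mathematics": 5.0,     # hypothetical average for a low-citation field
}

publications = [
    {"title": "Paper A", "field": "cell biology", "citations": 30},
    {"title": "Paper B", "field": "mathematics", "citations": 10},
]

for pub in publications:
    normalized = pub["citations"] / field_average_citations[pub["field"]]
    print(f'{pub["title"]}: {normalized:.1f} x field average')

# Paper A: 1.0 x field average  (30 raw citations, average for its field)
# Paper B: 2.0 x field average  (10 raw citations, twice its field's average)
```

In raw numbers Paper A looks three times as cited as Paper B, yet relative to its own field Paper B performs better, which is why comparisons across fields based on raw counts are misleading.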

For more information

information.services[at]uef.fi


Information Specialist
Heikki Laitinen
tel. 0294 45 8191
heikki.laitinen[at]uef.fi

 
