Weekly reading 28.2017

Kaare Aagaard, Carter Bloch, Jesper W. Schneider, Jadranka Stojanovski, Ana Marušić, Roland Bal, Jeffrey Beall, Sven E. Hug, Martin P. Brändle

Impacts of performance-based research funding systems: The case of the Norwegian Publication Indicator

Kaare Aagaard, Carter Bloch, Jesper W. Schneider

DOI: https://doi.org/10.1093/reseval/rvv003

The purpose

  1. The purpose of this article is to present and discuss the main results of the Evaluation of the Norwegian Publication Indicator, which took place in 2014.
  2. The article examines the impact of the NPI on publication patterns.
  3. A further purpose is to examine whether the NPI had the same negative impact on the research system as the Australian model did in the early 1990s.

The note

In the paper, the authors describe the Norwegian Publication Indicator (NPI), a performance-based research funding system introduced in Norway in 2004. Such systems are implemented to allocate research funds to the most productive institutions, pursue excellence in research, enhance the accountability of public research, and promote greater alignment of research with societal and economic needs.

The authors surveyed researchers and research managers to answer the questions posed in this paper, and also examined bibliometric data for Norway in Web of Science (WoS).

There are three categories of the research funding systems:

  • peer review (UK);
  • publication-based (previously Australia, Norway, Denmark, Finland);
  • citation-based (Poland, Sweden, Slovakia, Flanders in Belgium).


The NPI distinguishes two publication levels. Level 1 comprises peer-reviewed journals. Level 2 comprises the most exclusive journals: widely known, leading in their research field, and with an international audience.

Every publication is assigned to one level. Publications from the Level 2 group receive more points than those from Level 1, and points are fractionalized by the number of co-authors.
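The point arithmetic above can be sketched as follows. The specific point values and the equal-split fractionalization rule are illustrative assumptions for this sketch, not the official NPI weights:

```python
# Illustrative level weights (not the official NPI values).
LEVEL_POINTS = {1: 1.0, 2: 3.0}

def publication_points(level: int, n_authors: int) -> float:
    """Points for one publication, split equally among co-authors."""
    if n_authors < 1:
        raise ValueError("a publication needs at least one author")
    return LEVEL_POINTS[level] / n_authors
```

For example, a Level 2 paper with three co-authors would yield `publication_points(2, 3)`, i.e. 1.0 point per author under these assumed weights.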

The funding model cannot replace evaluations and research strategies at the institutions, and it cannot be used to assess an individual publication or an individual researcher.

Compared with citation-based models, this model makes publication activity within the social sciences and humanities (SSH) more visible.

The results

There was a strong increase in publication activity among young universities, many of which previously had a weaker focus on research.

There was an 82% increase in the overall number of publications between 2004 and 2012. What explains this? Are more researchers publishing actively, are already-active researchers publishing more papers than before, or is it a combination of the two?

According to WoS, the percentage of short (1–10 page) articles is almost the same for 2004 and 2010, which suggests that after the introduction of the NPI, Norway did not see an increased number of short papers published in less prestigious channels. However, based on this research we cannot rule out a "salami publishing" tendency with complete certainty.

The citation impact remained stable over the 2004–2010 period.

The inspiration

I would like to read https://doi.org/10.1016/j.respol.2011.09.007.


Does small equal predatory? Analysis of publication charges and transparency of editorial policies in Croatian open access journals

Jadranka Stojanovski, Ana Marušić

http://doi.org/10.11613/BM.2017.032

The purpose

The main aim of the paper is to check whether the journals found in the Hrčak repository meet the transparency criteria for journals and which business model they use.

The note

Hrčak is a national Croatian Open Access repository created with the help of governmental funds. More than 400 journals are available on the website.

Open Access suffers from predatory journals, which are created purely for profit. They use the APC (article processing charge) business model to earn money from authors. They have no real editorial boards, there is no peer review of the papers, and their titles are misleading, mimicking those of real, prestigious journals.

The main problem of Croatian journals is their local audience (a situation similar to Polish journals); moreover, they have low visibility, low readership, and low citation impact.

The results

More than 104 journals stored in Hrčak are inactive, while 340 (76.6%) are active.

The results show that almost 55% of all journals have a statement on their peer-review policy, but only 30% provide guidelines for peer reviewers. Almost 90% of those declaring a peer-review policy state that papers are sent to two reviewers, and 94% declare the external affiliation of their reviewers.

Governmental funds support 60% of the journals in the publishing process, and the editorial teams consist almost exclusively of researchers and academic teachers.

The authors note that Croatian journals in the Hrčak repository follow the general standard of 2–4 reviewers per manuscript, although the availability of guidelines for reviewers could be better. The fact that only 30% of the journals in the repository provide reviewer guidelines does not mean the rest are predatory. Only 10 journals (2%) charge authors for publication.


Playing the Indicator Game: Reflections on Strategies to Position an STS Group in a Multi-disciplinary Environment

Roland Bal

https://doi.org/10.17351/ests2017.111

The purpose

The author describes how scholars dealt with the performance-management system in which they had become embedded. There are three main approaches:

  1. Changing the system.
  2. Adjusting to the system.
  3. Ignoring the system.

The note

The author set up a discussion with his departmental colleagues about the distribution of research money within the department.

Changing the system

  • They replaced the absolute Journal Impact Factor (JIF) with a relative one, which removed disciplinary differences because the relative JIF is calculated per scientific field.
  • They added the possibility to count publications written in Dutch, provided they were published in peer-reviewed journals; a list of trusted publishers was also introduced.
  • They added the number of words per paper to the system, to prevent publication slicing.
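A field-relative JIF of the kind described in the first bullet can be sketched as follows. Normalizing a journal's JIF by the mean JIF of its field is an assumption for illustration; the note does not specify the exact formula the department used:

```python
def relative_jif(journal_jif: float, field_jifs: list[float]) -> float:
    """Journal Impact Factor normalized by the mean JIF of the journal's field.

    A value above 1.0 means the journal outperforms its field average,
    which makes journals from different disciplines comparable.
    """
    mean_field_jif = sum(field_jifs) / len(field_jifs)
    return journal_jif / mean_field_jif
```

For example, a journal with a JIF of 4.0 in a field whose journals average a JIF of 2.0 gets a relative JIF of 2.0, regardless of how citation-rich the field is overall.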

Adapting to the system

  • PhD theses changed: especially for candidates who want to pursue an academic career, theses had to be based on international journal articles instead of monographs written in Dutch.
  • PhD candidates were strongly encouraged to write papers with different colleagues, not only with their supervisor, and to publish at least one paper in a peer-reviewed English-language journal.
  • They changed their approach to paper authorship. Project teams were dynamically assembled based on the expertise needed for particular papers, which helped increase research output by targeting higher-impact journals.

Ignoring the system

In some cases, ignoring the system can be a good approach. An example of such a practice is the long-term collaboration with the Dutch healthcare sector: the reports created in this cooperation are almost always written only in Dutch.

Sometimes what is almost impossible for an individual is possible for a team: it is possible both to hit the targets of the performance-management system and to stay engaged in public discussions, which can help raise funds for further research.

Bottom line

Performance-management systems do not only constitute research practices but are also constituted by them, making the relation between evaluation and practical work much more a process of co-creation.


Predatory publishers are corrupting open access

Jeffrey Beall

doi:10.1038/489179a

There are predatory publishers that publish counterfeit journals with names and websites similar to those of real ones.

An author who submits a paper to this kind of journal is invoiced a publication fee of hundreds of dollars. Often the author is asked to sign over the copyright to the paper, which is against the idea of Open Access and puts the author in a very inconvenient position: he or she cannot withdraw from the process without losing the text.

Sometimes predatory journals name someone as a member of an editorial board without his or her knowledge or permission. Many scientists take unethical shortcuts to get published in order to earn tenure and promotion.

According to the author, there should be a catalogue of such publishers, which lack transparency.

This phenomenon can change scholarly communication, because real open access journals sometimes have to shorten the time needed for proper peer review of a paper.


Microsoft Academic is on the verge of becoming a bibliometric superpower

Sven E. Hug, Martin P. Brändle

In this short text the authors present a new service called Microsoft Academic. It combines broad coverage of scientific papers, structured and rich metadata, and a social network for academics.

The system implements an impressive semantic search engine that uses not only word connections, like other search engines, but also entities associated with a paper (journal, author, affiliation, field of study). There is also a wide range of filters and sorting options.

Citation analyses with Microsoft Academic, Scopus, and Web of Science yield similar results with respect to the h-index and rank correlations of citation counts. All three databases have coverage problems for the SSH, non-English publications, and open access publications.
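The h-index mentioned above can be computed in a few lines. This is the standard definition (the largest h such that h of an author's papers have at least h citations each), not anything specific to this particular comparison:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    h = 0
    # Sort descending, then walk down until a paper's citations
    # fall below its rank position.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h
```

For example, an author with papers cited 10, 8, 5, 4, and 3 times has an h-index of 4: four papers have at least four citations each.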


Challenge the impact factor

http://dx.doi.org/10.1038/s41551-017-0103

The percentage of highly cited papers is more informative than the average number of citations. The Journal Impact Factor (JIF) is a metric based on the average number of citations to the content a journal has published. The authors of this paper suggest a new metric called the impact quotient (IQ). According to the authors, it is better to count highly cited papers, regardless of their exact citation numbers.

The advantages of IQ:

  • it ranges from 0% to 100%, while the IF has no upper boundary,
  • it counts only research articles and reviews (news and opinion articles are excluded).

The two serious disadvantages of the IF are:

  1. The influence of the citation distribution.
  2. The variability of the citations in different research fields.
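A minimal sketch of the IQ idea: the share of a journal's papers that clear a citation threshold. The threshold value and the function name are illustrative assumptions here; the note does not specify how "highly cited" is defined in the original paper:

```python
def impact_quotient(citations: list[int], threshold: int) -> float:
    """Percentage of a journal's papers cited at least `threshold` times.

    Unlike an average, this is insensitive to how extremely the top
    papers are cited, so one blockbuster paper cannot inflate the score.
    """
    highly_cited = sum(1 for c in citations if c >= threshold)
    return 100.0 * highly_cited / len(citations)
```

With papers cited 0, 5, 20, and 100 times and a threshold of 10, the IQ is 50%: two of four papers are "highly cited", and the score would be unchanged if the top paper had 1,000 citations instead of 100, which is exactly the robustness the authors argue for.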

I am the author of this photo. It was taken with the Nokia 6210 Navigator mobile phone in 2009.