Bibliometrics, Measurements and Research Evaluation

A special issue of Publications (ISSN 2304-6775).

Deadline for manuscript submissions: closed (30 July 2018) | Viewed by 31184

Special Issue Editors


Dr. Rosa Scoble
Guest Editor
Planning and Policy Department, Brunel University London, Kingston Lane, Uxbridge, Middlesex, UB8 3PH, UK
Interests: research information management; Open Science; research socio-economic impact; research assessment and measurement of research quality; responsible metrics

Prof. Geoff Rodgers
Guest Editor
Brunel University London
Interests: applying statistical mechanics to problems in social science, economics, and finance; complex networks

Special Issue Information

Dear Colleagues,

In the last decade, there has been an increase in the use of bibliometrics as a proxy for the assessment of research quality. This is driven in part by the popularity of league tables and their appetite for simple—some would say simplistic—measures. However, the exponential rise of both online open access research publications and social media has produced a new and unexpected set of alternative, non-traditional metrics. These include all forms of traceable interest in scholarly and scientific publications (Webometrics): views and downloads, numbers of tweets, references in media and policy papers, Facebook pages, blog mentions, and so on.

Within this new world of open access, big data, and social media, one wonders about the future of the measurement of research publication quality. Will traditional bibliometrics evolve into more complex indicators, or will they be supplanted by alternative metrics? Will the peer review system become obsolete and be replaced by real-time information on the appeal and relevance of a scientific publication? Perhaps we should just pause, go back to basics, and re-think what we mean by “research evaluation” and what high-quality research looks like.

Within this context, we invite colleagues to submit papers for this Special Issue, with the aim of pulling together a body of work that will underpin research evaluation in all its many facets. We welcome technically oriented research on bibliometrics as well as more qualitative work on conceptualising the measurement of research quality.

Dr. Rosa Scoble
Prof. Geoff Rodgers
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Publications is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1400 CHF (Swiss francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Research quality evaluation
  • Bibliometrics
  • Alternative metrics
  • Research assessment
  • Peer review

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (5 papers)

Research

12 pages, 1070 KiB  
Article
Grades of Openness: Open and Closed Articles in Norway
by Susanne Mikki, Øyvind L. Gjesdal and Tormod E. Strømme
Publications 2018, 6(4), 46; https://doi.org/10.3390/publications6040046 - 22 Nov 2018
Cited by 7 | Viewed by 4983
Abstract
Based on the total scholarly article output of Norway, we investigated the coverage and degree of openness according to the following three bibliographic services: (1) Google Scholar, (2) oaDOI by Impact Story, and (3) 1findr by 1science. According to Google Scholar, we found that more than 70% of all Norwegian articles are openly available. However, the degrees of openness are profoundly lower according to oaDOI and 1findr at 31% and 52%, respectively. Varying degrees of openness are mainly caused by different interpretations of openness, with oaDOI being the most restrictive. Furthermore, open shares vary considerably by discipline, with the medicine and health sciences at the upper end and the humanities at the lower end. We also determined the citation frequencies using cited-by values in Google Scholar and applying year and subject normalization. We found a significant citation advantage for open articles. However, this was not the case for all types of openness. In fact, the category of open access journals was by far the lowest cited, indicating that young journals with a declared open access policy still lack recognition. Full article
(This article belongs to the Special Issue Bibliometrics, Measurements and Research Evaluation)
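
A minimal sketch of the kind of year and subject normalization described in the abstract: each article's cited-by count is divided by the mean count of articles from the same publication year and field, so a value above 1 indicates an article cited more than its year/field average. The fields, years, and counts below are invented for illustration and are not the authors' data or code.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical articles with Google-Scholar-style cited-by counts.
# Fields, years, and counts are invented for illustration only.
articles = [
    {"id": "a1", "year": 2016, "field": "Medicine",   "cited_by": 24},
    {"id": "a2", "year": 2016, "field": "Medicine",   "cited_by": 6},
    {"id": "a3", "year": 2016, "field": "Humanities", "cited_by": 2},
    {"id": "a4", "year": 2017, "field": "Humanities", "cited_by": 1},
]

# Group cited-by counts by (year, field) to form the reference sets.
reference_sets = defaultdict(list)
for art in articles:
    reference_sets[(art["year"], art["field"])].append(art["cited_by"])

# Normalized score = cited-by count / mean cited-by count of its reference set.
for art in articles:
    baseline = mean(reference_sets[(art["year"], art["field"])])
    score = art["cited_by"] / baseline if baseline else 0.0
    print(art["id"], round(score, 2))
```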

9 pages, 3244 KiB  
Article
A Correlation Analysis of Normalized Indicators of Citation
by Dmitry M. Kochetkov
Publications 2018, 6(3), 39; https://doi.org/10.3390/publications6030039 - 13 Sep 2018
Cited by 5 | Viewed by 5539
Abstract
Recently, more and more countries are entering the global race for university competitiveness. On the one hand, global rankings are a convenient tool for quantitative analysis. On the other hand, their indicators are often difficult to calculate quickly, and they often contradict each other. The author of this paper hoped to use widely available indicators for a quick analysis of a university’s publication strategy and opted for the normalized citation indicators available in the SciVal analytical tool, namely, Source Normalized Impact per Paper (SNIP) and Field-Weighted Citation Impact (FWCI). The author demonstrated the possibility of applying correlation analysis to the impact indicators of a document and a journal on a sample of social and humanitarian fields at Peoples’ Friendship University of Russia (PFUR, “RUDN” in Russian). A dot diagram of university (or country) documents was used to form a two-factor matrix (SNIP and FWCI) that was further divided into four quadrants. Such an analysis illustrated the present situation in that discipline. An analysis of RUDN University publications revealed problems and prospects in the development of social sciences and humanities. A serious problem observed was that high-quality results were often published in low-impact journals, which narrowed the results’ potential audience and, accordingly, the number of citations. Particular attention was paid to the application of the results in practice. Full article
(This article belongs to the Special Issue Bibliometrics, Measurements and Research Evaluation)
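
As a rough illustration of the two-factor (SNIP, FWCI) matrix described above, the sketch below places documents into four quadrants. The threshold of 1.0 on both axes and the quadrant labels are assumptions made here for illustration (a value of 1 is the nominal average for both normalized indicators); the paper may draw its boundaries differently.

```python
# Quadrant classification over journal impact (SNIP) and document impact (FWCI).
# The 1.0 threshold and the labels are illustrative assumptions, not the paper's.
def quadrant(snip: float, fwci: float, threshold: float = 1.0) -> str:
    if snip >= threshold and fwci >= threshold:
        return "high-impact journal, highly cited document"
    if snip < threshold and fwci >= threshold:
        return "low-impact journal, highly cited document"   # the problem case noted above
    if snip >= threshold and fwci < threshold:
        return "high-impact journal, little-cited document"
    return "low-impact journal, little-cited document"

# Hypothetical (SNIP, FWCI) pairs for a set of documents.
for snip, fwci in [(1.8, 2.3), (0.6, 1.9), (1.5, 0.3), (0.4, 0.2)]:
    print((snip, fwci), "->", quadrant(snip, fwci))
```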

22 pages, 10134 KiB  
Article
A Scientometric Study of Neurocomputing Publications (1992–2018): An Aerial Overview of Intrinsic Structure
by Manvendra Janmaijaya, Amit K. Shukla, Ajith Abraham and Pranab K. Muhuri
Publications 2018, 6(3), 32; https://doi.org/10.3390/publications6030032 - 19 Jul 2018
Cited by 28 | Viewed by 6041
Abstract
The international journal Neurocomputing (NC) is considered to be one of the most sought-after journals in the computer science research fraternity. In this paper, an extensive bibliometric overview of this journal is performed. The bibliometric data are extracted from the Web of Science (WoS) repository. The main objective of this study is to reveal internal structures and hidden inferences, such as highly productive and influential authors, the most contributing countries, top institutions, collaborating authors, and so on. CiteSpace and VOSviewer are used to visualize the graphical mapping of the bibliometric data. Further, the document co-citation network, cluster detection, and references with strong citation bursts are analyzed to reveal the intellectual base of NC publications. Full article
(This article belongs to the Special Issue Bibliometrics, Measurements and Research Evaluation)
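
The document co-citation network mentioned in the abstract can be illustrated with a small sketch: two references count as co-cited whenever they appear together in the reference list of the same citing paper. The citing papers and reference identifiers below are invented; the study itself works from Web of Science records and visualizes the network with CiteSpace and VOSviewer.

```python
import itertools
from collections import Counter

# Invented citing papers and their reference lists (illustration only).
citing_papers = {
    "p1": ["ref_A", "ref_B", "ref_C"],
    "p2": ["ref_A", "ref_B"],
    "p3": ["ref_B", "ref_C", "ref_D"],
}

# Count every unordered pair of references that occurs in the same reference list.
cocitations = Counter()
for refs in citing_papers.values():
    for pair in itertools.combinations(sorted(set(refs)), 2):
        cocitations[pair] += 1

# Weighted edge list of the resulting co-citation network.
for (a, b), weight in cocitations.most_common():
    print(f"{a} -- {b}: {weight}")
```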

11 pages, 3283 KiB  
Article
Exploring the Hjif-Index, an Analogue to the H-Like Index for Journal Impact Factors
by William Cabos and Juan Miguel Campanario
Publications 2018, 6(2), 14; https://doi.org/10.3390/publications6020014 - 4 Apr 2018
Viewed by 5881
Abstract
We used the Journal Impact Factor (JIF) to develop the hjif-index, calculated in a similar way to h-like indices. To this end, we mapped the JIFs of one JCR group to natural numbers, and evaluated the degree of correspondence between the interval from zero to the highest JIF in the group and a set of natural numbers. Next, we plotted the straight line y = x to obtain the group’s hjif-index as the JIF corresponding to the journal immediately above the straight line. We call the set of journals above the straight line the hjif-core. We calculated hjif-indices corresponding to the 2-year JIF (hjif2-index) and 5-year JIF (hjif5-index) windows for all 176 JCR groups listed in the 2014 Science edition. We also studied derived indicators such as the distribution of journals in JCR groups according to their hjif-indices, the distribution of journals and JIFs in the hjif-core, and other variables and indicators. We found that the hjif2- and hjif5-index behaved in a similar way, and that in general their distribution showed a peak followed by a relatively long tail. The hjif-index can be used as a tool to rank journals in a manner that better reflects the variable number of journals within a given JCR group and in each group’s hjif-core as an alternative to the more arbitrary JCR-based percentile ranking. Full article
(This article belongs to the Special Issue Bibliometrics, Measurements and Research Evaluation)
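
One way to read the construction in the abstract is as an h-type cut-off over a group's ranked JIFs: rank journals by descending JIF, compare each rank n with the n-th JIF (the line y = x), and take the journals on or above the line as the hjif-core, with the group's hjif-index being the JIF of the last such journal. The sketch below follows that reading with invented JIF values; the paper's exact mapping of JIFs to natural numbers may differ.

```python
# h-type cut-off over a JCR group's impact factors (one reading of the abstract;
# the JIF values below are invented for illustration).
def hjif_index(jifs):
    """Return (hjif_core, hjif_index) for a list of journal impact factors."""
    ranked = sorted(jifs, reverse=True)
    # Journals whose JIF is at least their rank lie on or above the line y = x.
    core = [jif for rank, jif in enumerate(ranked, start=1) if jif >= rank]
    return core, (core[-1] if core else 0.0)

group_jifs = [9.3, 6.1, 4.8, 3.9, 2.2, 1.4, 0.9, 0.7]
core, hjif = hjif_index(group_jifs)
print("hjif-core:", core)   # [9.3, 6.1, 4.8]
print("hjif-index:", hjif)  # 4.8
```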

Other

9 pages, 185 KiB  
Essay
Ethical Concerns in the Rise of Co-Authorship and Its Role as a Proxy of Research Collaborations
by Sameer Kumar
Publications 2018, 6(3), 37; https://doi.org/10.3390/publications6030037 - 16 Aug 2018
Cited by 31 | Viewed by 7710
Abstract
Increasing specialization, changes in the institutional incentives for publication, and a host of other reasons have brought about a marked trend towards co-authored articles among researchers. These changes have impacted Science and Technology (S&T) policies worldwide. Co-authorship is often considered to be a reliable proxy for assessing research collaborations at micro, meso, and macro levels. Although co-authorship in a scholarly publication brings numerous benefits to the participating authors, it has also given rise to issues of publication integrity, such as ghost authorships and honorary authorships. The codes of conduct of bodies such as the American Psychological Association (APA) and the International Committee of Medical Journal Editors (ICMJE) make it clear that only those who have significantly contributed to the study should be on the authorship list. Those who have contributed little have to be appropriately “acknowledged” in footnotes or in the acknowledgement section. However, these principles are sometimes transgressed, and a complete solution still remains elusive. Full article
(This article belongs to the Special Issue Bibliometrics, Measurements and Research Evaluation)