Open Access Article

Opening and Reusing Transparent Peer Reviews with Automatic Article Annotation

1 Institute for Applied Computer Science, University of Bonn, 53012 Bonn, Germany
2 Fraunhofer Institute for Intelligent Analysis and Information Systems IAIS, 53757 Sankt Augustin, Germany
3 GESIS—Leibniz Institute for the Social Sciences, 68159 Mannheim, Germany
* Author to whom correspondence should be addressed.
Publications 2019, 7(1), 13; https://doi.org/10.3390/publications7010013
Received: 3 December 2018 / Revised: 21 January 2019 / Accepted: 30 January 2019 / Published: 3 February 2019
(This article belongs to the Special Issue Social Media and Open Science)
An increasing number of scientific publications are created in open and transparent peer review models: a submission is published first, and then reviewers are invited; or a submission is reviewed in a closed environment but the reviews are published with the final article; or combinations of these. Reasons for open peer review include giving better credit to reviewers and enabling readers to better appraise the quality of a publication. In most cases, the full, unstructured text of an open review is published next to the full, unstructured text of the article reviewed. This approach prevents human readers from getting a quick impression of the quality of parts of an article, and it does not easily support secondary exploitation, e.g., for scientometrics on reviews. While document formats have been proposed for publishing structured articles including reviews, integrated tool support for entire open peer review workflows resulting in such documents is still scarce. We present AR-Annotator, the Automatic Article and Review Annotator, which employs a semantic information model of an article and its reviews, using semantic markup and unique identifiers for all entities of interest. The fine-grained article structure is not only exposed to authors and reviewers but also preserved in the published version. We publish articles and their reviews in a Linked Data representation and thus maximise their reusability by third-party applications. We demonstrate this reusability by running quality-related queries against the structured representation of articles and their reviews.
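To illustrate the approach described in the abstract, the following is a hypothetical sketch of what a fine-grained article structure with an attached review might look like in HTML with RDFa markup. The schema.org terms, fragment identifiers, and element structure shown here are illustrative assumptions, not the exact vocabulary used by AR-Annotator.

```html
<!-- Hypothetical sketch: an article section and a review that targets it,
     each carrying its own identifier so third-party tools can query them.
     Vocabulary and identifiers are assumptions, not AR-Annotator's actual terms. -->
<article prefix="schema: http://schema.org/"
         resource="#article" typeof="schema:ScholarlyArticle">
  <section id="introduction" resource="#introduction" typeof="schema:WebPageElement">
    <h2 property="schema:name">Introduction</h2>
    <p property="schema:text">Open peer review models are gaining adoption.</p>
  </section>
  <aside resource="#review-1" typeof="schema:Review">
    <link property="schema:itemReviewed" href="#article"/>
    <link property="schema:about" href="#introduction"/>
    <p property="schema:reviewBody">The introduction clearly motivates the problem.</p>
  </aside>
</article>
```

Because every section and every review carries its own URI, the published page doubles as a Linked Data graph; a SPARQL query could then, for example, count the reviews targeting each section, which is the kind of quality-related query the abstract mentions.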
Keywords: automatic semantic annotation; open peer review; knowledge extraction; open science; electronic publishing on the Web
Figure 1

MDPI and ACS Style

Sadeghi, A.; Capadisli, S.; Wilm, J.; Lange, C.; Mayr, P. Opening and Reusing Transparent Peer Reviews with Automatic Article Annotation. Publications 2019, 7, 13.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

  • Externally hosted supplementary file 1
    Link: https://github.com/OSCOSS/AR-Annotator
    Description: The supplementary files include a diff between the new version of the manuscript and the previous version provided to the editors. The diff was produced with the latexdiff tool so that reviewers and editors can see exactly which places in the LaTeX manuscript files have changed. The supplementary link points to the HTML-RDFa code related to the article.