Semantic Web and Information Systems

A special issue of Information (ISSN 2078-2489). This special issue belongs to the section "Information Systems".

Deadline for manuscript submissions: closed (15 December 2021) | Viewed by 10547

Special Issue Editors


Prof. Dr. Giuseppe Psaila
Guest Editor
Department of Management, Information and Production Engineering, University of Bergamo, 24044 Dalmine, BG, Italy
Interests: data management; data mining; information retrieval; NoSQL databases; JSON document stores; JSON processing; fuzzy sets and soft querying

Dr. Paolo Fosci
Guest Editor
Department of Management, Information and Production Engineering, University of Bergamo, 24044 Dalmine, BG, Italy
Interests: data management; NoSQL databases; JSON data; soft querying; data mining; smart cities; open data

Special Issue Information

Dear Colleagues, 

The semantic web aims to attach semantics to web pages and web information, so as to increase the amount of knowledge that can be obtained by consulting web sources; indeed, not only traditional web sites but also Open Data portals and social media rely on web technologies to provide Big Data and, more generally, vast amounts of information. How can semantics be added to such an enormous amount of information? How can knowledge be represented and associated with web information? Answering these questions is the general goal of researchers working on the semantic web.
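To give a concrete flavour of what "associating knowledge with web information" can mean in practice, the following minimal sketch (in Python, using the rdflib library; the URIs and the vocabulary are illustrative assumptions, not a prescribed schema) attaches machine-readable statements to a hypothetical open-data resource.

```python
# Minimal sketch: attaching machine-readable statements to a web resource with
# rdflib; the URIs and the vocabulary are illustrative assumptions.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.org/")  # hypothetical vocabulary

g = Graph()
page = URIRef("http://example.org/open-data/air-quality")
g.add((page, RDF.type, EX.Dataset))              # state what the resource is
g.add((page, EX.topic, Literal("air quality")))  # what it is about
g.add((page, EX.publishedBy, EX.CityCouncil))    # who provides it

# Serialize the annotations in Turtle, a common semantic-web exchange format.
print(g.serialize(format="turtle"))
```

Once expressed this way, the same statements can be published alongside the original web resource and merged with annotations coming from other sources.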

On the other side of web technology we find information systems. Information systems are steadily evolving toward web-based architectures and use web technologies and protocols to talk to other information systems, mobile apps, and web sites. An example of this evolution is the microservices paradigm, which has proven effective in keeping the development and evolution of very large information systems under control (as demonstrated, e.g., by Amazon).
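As a minimal sketch of how such a web-based component can expose semantically annotated data, the example below shows a hypothetical Flask endpoint (the route and the vocabulary are assumptions for illustration) that returns its payload as JSON-LD, so that consuming services receive both the data and the context needed to interpret it.

```python
# Minimal sketch of a microservice that exposes semantically annotated data as
# JSON-LD over HTTP (the endpoint and the vocabulary are illustrative assumptions).
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/sensors/<sensor_id>")
def get_sensor(sensor_id):
    # The @context block maps plain JSON keys to shared semantic-web terms,
    # so any client can interpret the payload without out-of-band agreements.
    return jsonify({
        "@context": {
            "name": "http://schema.org/name",
            "value": "http://example.org/vocab#measuredValue",
        },
        "@id": f"http://example.org/sensors/{sensor_id}",
        "name": "PM10 sensor",
        "value": 17.4,
    })

if __name__ == "__main__":
    app.run(port=5000)  # other services, apps, or web sites consume this over HTTP
```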

So, in a world where information systems and web information sources are substantially indistinguishable, what is the impact of the semantic web on information systems? Does the semantic web actually provide practical solutions for building information systems? Can semantic web technologies improve the capabilities of information systems in managing, querying, and analyzing information, even for routine OLTP (On-Line Transaction Processing) activities?
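One direction such an answer could take, shown here only as an illustrative sketch with assumed data and vocabulary, is to expose an RDF view of operational records and query it with SPARQL in terms of domain concepts rather than table layouts (again via rdflib).

```python
# Minimal sketch: a SPARQL query over an RDF view of operational data
# (the graph content and the vocabulary are illustrative assumptions).
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.order42, RDF.type, EX.Order))
g.add((EX.order42, EX.amount, Literal(250)))
g.add((EX.order42, EX.customer, EX.acme))

# Ask a question in terms of the domain vocabulary rather than table layouts.
results = g.query("""
    PREFIX ex: <http://example.org/>
    SELECT ?order ?amount
    WHERE { ?order a ex:Order ; ex:amount ?amount . }
""")
for row in results:
    print(row.order, row.amount)
```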

This Special Issue aims to publish contributions of any kind concerned with the integration of the semantic web and information systems: we welcome high-quality papers that describe successful experiences of semantic web adoption in information systems, as well as research papers that propose novel approaches and techniques for solving both traditional problems and new problems emerging from their practical use.

Papers may focus on the following topics, but are not necessarily limited to them.

  • Ontologies in business and information systems
  • Knowledge representation and knowledge bases
  • Formats for knowledge representation and exchange
  • Knowledge graphs and real-time knowledge graphs
  • Simple knowledge organization systems
  • Interoperability of microservices and information systems
  • Semantic annotation
  • Semantic technologies
  • Data diversity
  • Data design and representation
  • Data governance
  • Enhanced data aggregation, querying, and information retrieval
  • Enhanced data analytics
  • Linked Open Data 

Prof. Dr. Giuseppe Psaila
Dr. Paolo Fosci
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Information is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Semantic Technologies and Information Systems
  • Knowledge Representation and Organizations
  • Data diversity in data management, querying and retrieval
  • Knowledge in business

Published Papers (3 papers)


Research

15 pages, 2369 KiB  
Article
A Semantic Approach for Quality Assurance and Assessment of Volunteered Geographic Information
by Gloria Bordogna
Information 2021, 12(12), 492; https://doi.org/10.3390/info12120492 - 25 Nov 2021
Cited by 2 | Viewed by 2130
Abstract
The paper analyses the characteristics of Volunteered Geographic Information (VGI) and the need to assure and assess its quality for possible use and re-use. Ontologies and soft ontologies are presented as means to support quality assurance and assessment of VGI, and their limitations are highlighted. A possibilistic approach based on a fuzzy ontology is finally illustrated, which allows both the imprecision and vagueness of domain knowledge and the epistemic uncertainty affecting observations to be modelled. A case study example is illustrated.
(This article belongs to the Special Issue Semantic Web and Information Systems)
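The abstract does not report implementation details; purely as a generic illustration of the fuzzy-set idea it builds on (not the author's possibilistic model), one can imagine grading the quality of a volunteered observation as follows, with criteria, thresholds, and weights chosen arbitrarily.

```python
# Generic illustration of graded (fuzzy) quality assessment for a volunteered
# geographic observation; the criteria and thresholds are assumptions, not the
# possibilistic model proposed in the paper.
def positional_accuracy_degree(error_m: float) -> float:
    """Membership degree: 1 for error <= 5 m, 0 for error >= 50 m, linear between."""
    if error_m <= 5:
        return 1.0
    if error_m >= 50:
        return 0.0
    return (50 - error_m) / 45

def contributor_reliability_degree(accepted: int, total: int) -> float:
    """Membership degree based on the contributor's share of accepted reports."""
    return accepted / total if total else 0.0

def observation_quality(error_m: float, accepted: int, total: int) -> float:
    """Combine the criteria with the min t-norm, a common fuzzy conjunction."""
    return min(positional_accuracy_degree(error_m),
               contributor_reliability_degree(accepted, total))

print(observation_quality(error_m=12.0, accepted=8, total=10))  # prints 0.8
```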

36 pages, 1477 KiB  
Article
RADAR: Resilient Application for Dependable Aided Reporting
by Antonia Azzini, Nicola Cortesi and Giuseppe Psaila
Information 2021, 12(11), 463; https://doi.org/10.3390/info12110463 - 09 Nov 2021
Cited by 2 | Viewed by 4264
Abstract
Many organizations must produce many reports for various reasons. Although this activity might appear simple to carry out, it is not: generating reports requires the collection of possibly large and heterogeneous data sets. Furthermore, different professional figures are involved in the process, possibly with different skills (database technicians, domain experts, employees): the lack of common knowledge and of a unifying framework significantly obstructs the effective and efficient definition and continuous generation of reports. This paper presents a novel framework named RADAR, an acronym for “Resilient Application for Dependable Aided Reporting”: the framework has been devised to be a “bridge” between data and the employees in charge of generating reports. Specifically, it builds a common knowledge base in which database administrators and domain experts describe their knowledge about the application domain and the gathered data; this knowledge can be browsed by employees to find the relevant data to aggregate and insert into reports while designing report layouts; the framework assists the overall process from data definition to report generation. The paper presents the application scenario and the vision by means of a running example, defines the data model, and presents the architecture of the framework.
(This article belongs to the Special Issue Semantic Web and Information Systems)
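RADAR's actual data model is defined in the paper; the fragment below is only a hypothetical sketch of the general idea of a shared knowledge-base entry that pairs a technical description of a data item with a domain-level explanation (all names and fields are assumptions).

```python
# Illustrative sketch (not RADAR's actual data model): a shared catalogue entry
# that pairs a technical data description with a domain-level explanation, so
# employees can locate report-ready data without reading database schemas.
from dataclasses import dataclass

@dataclass
class CatalogueEntry:
    name: str             # label shown to employees browsing the knowledge base
    source: str           # where database technicians say the data lives
    domain_meaning: str   # how domain experts explain it
    aggregations: tuple   # aggregations considered meaningful for reports

entry = CatalogueEntry(
    name="Monthly energy consumption",
    source="warehouse.energy_readings (kWh, one row per meter per day)",
    domain_meaning="Energy billed to the organization, grouped by building",
    aggregations=("sum by month", "average by building"),
)

print(entry.name, "->", entry.aggregations)
```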

17 pages, 748 KiB  
Article
A Semi-Automatic Semantic Consistency-Checking Method for Learning Ontology from Relational Database
by Chuangtao Ma, Bálint Molnár and András Benczúr
Information 2021, 12(5), 188; https://doi.org/10.3390/info12050188 - 26 Apr 2021
Cited by 2 | Viewed by 2640
Abstract
To tackle the issues of semantic collision and inconsistency between ontologies and the original data model when learning an ontology from a relational database (RDB), a semi-automatic semantic consistency-checking method based on a graph intermediate representation and model checking is presented. Initially, the W-Graph, an intermediate model between databases and ontologies, was utilized to formalize the semantic correspondences between databases and ontologies, which were then transformed into a Kripke structure and eventually encoded in an SMV program. Meanwhile, description logics (DLs) were employed to formalize the semantic specifications of the learned ontologies, since OWL DL shows good semantic compatibility and DLs offer excellent expressivity. Thereafter, the specifications were converted into computation tree logic (CTL) formulae to improve machine readability. The task of checking semantic consistency could thus be converted into a global model-checking problem that can be solved automatically by a symbolic model checker. An example is given to demonstrate the process of formalizing and checking the semantic consistency between learned ontologies and the RDB, and a verification experiment was conducted to assess the feasibility of the presented method. The results showed that the presented method could correctly check and identify the different kinds of inconsistencies between learned ontologies and their original data model.
(This article belongs to the Special Issue Semantic Web and Information Systems)
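The paper relies on SMV and CTL; the toy sketch below (not the authors' toolchain) only illustrates the underlying model-checking idea by explicitly exploring a small Kripke-style transition system and checking an invariant of the form AG p, the kind of property the abstract refers to. The states and labels are invented for the example.

```python
# Toy illustration of the model-checking idea (not the authors' SMV/CTL toolchain):
# explicitly explore a small transition system and check that an invariant holds
# in every reachable state.
from collections import deque

# Hypothetical states of a schema element mapped to an ontology class.
transitions = {
    "mapped":   {"renamed", "mapped"},
    "renamed":  {"mapped", "orphaned"},
    "orphaned": {"orphaned"},
}
labels = {
    "mapped":   {"consistent"},
    "renamed":  {"consistent"},
    "orphaned": set(),  # no label: the mapping has lost its ontology class
}

def check_invariant(initial: str, proposition: str) -> bool:
    """Return True iff `proposition` labels every state reachable from `initial`
    (a property of the form AG proposition)."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if proposition not in labels[state]:
            print("counterexample state:", state)
            return False
        for nxt in transitions[state] - seen:
            seen.add(nxt)
            queue.append(nxt)
    return True

print(check_invariant("mapped", "consistent"))  # False: "orphaned" is reachable
```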
