
Table of Contents

Informatics, Volume 5, Issue 1 (March 2018)

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
Displaying articles 1-15
Open Access Article A Smart Sensor Data Transmission Technique for Logistics and Intelligent Transportation Systems
Informatics 2018, 5(1), 15; https://doi.org/10.3390/informatics5010015
Received: 22 January 2018 / Revised: 9 March 2018 / Accepted: 13 March 2018 / Published: 16 March 2018
Cited by 1 | PDF Full-text (8757 KB) | HTML Full-text | XML Full-text
Abstract
When it comes to Internet of Things systems that include both a logistics system and an intelligent transportation system, a smart sensor is one of the key elements for collecting useful information whenever and wherever necessary. This study proposes the Smart Sensor Node Group Management Medium Access Control Scheme, designed to group smart sensor devices and collect data from them efficiently. The proposed scheme groups the portable sensor devices connected to a system according to their distance from the sink node and transmits data by setting a different buffer threshold for each group. This method reduces the energy consumption of sensor devices located near the sink node and enhances the IoT system’s overall energy efficiency. When a sensor device is moved and thus becomes unable to transmit data, it is allocated to a new group so that it can continue transmitting data to the sink node. Full article
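The grouping mechanism the abstract describes can be pictured with a minimal sketch. This is illustrative only, not the paper's actual scheme: the band width, threshold values, and class names here are assumptions.

```python
def assign_group(distance_m: float, band_width_m: float = 50.0) -> int:
    """Group index for a sensor node, by distance band from the sink node."""
    return int(distance_m // band_width_m)

def buffer_threshold(group: int, base: int = 4, step: int = 4) -> int:
    """Per-group buffer threshold: farther groups buffer more readings per
    transmission, sparing nodes near the sink from frequent relay traffic."""
    return base + step * group

class SensorNode:
    def __init__(self, node_id: int, distance_m: float):
        self.node_id = node_id
        self.distance_m = distance_m
        self.group = assign_group(distance_m)
        self.buffer = []

    def move_to(self, distance_m: float):
        """Reallocate a moved node to a new group so it can keep transmitting."""
        self.distance_m = distance_m
        self.group = assign_group(distance_m)

    def sense(self, reading) -> bool:
        """Buffer a reading; return True when the group's threshold is
        reached and the buffer is flushed toward the sink node."""
        self.buffer.append(reading)
        if len(self.buffer) >= buffer_threshold(self.group):
            self.buffer.clear()
            return True
        return False
```

A node 30 m from the sink falls in group 0 and transmits every 4 readings; moving it to 120 m puts it in group 2, where it buffers 12 readings per transmission.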

Open Access Article Utilizing Provenance in Reusable Research Objects
Informatics 2018, 5(1), 14; https://doi.org/10.3390/informatics5010014
Received: 5 December 2017 / Revised: 27 January 2018 / Accepted: 2 March 2018 / Published: 8 March 2018
Cited by 1 | PDF Full-text (1576 KB) | HTML Full-text | XML Full-text
Abstract
Science is conducted collaboratively, often requiring the sharing of knowledge about computational experiments. When experiments include only datasets, they can be shared using Uniform Resource Identifiers (URIs) or Digital Object Identifiers (DOIs). An experiment, however, seldom includes only datasets; more often it also includes software, its past execution, provenance, and associated documentation. The Research Object has recently emerged as a comprehensive and systematic method for aggregating and identifying the diverse elements of computational experiments. While a necessary method, mere aggregation is not sufficient for the sharing of computational experiments: other users must be able to easily recompute on these shared research objects. Computational provenance is often the key to enabling such reuse. In this paper, we show how reusable research objects can utilize provenance to correctly repeat a previous reference execution, to construct a subset of a research object for partial reuse, and to reuse existing contents of a research object for modified reuse. We describe two methods to summarize provenance that aid in understanding the contents and past executions of a research object. The first method obtains a process view by collapsing low-level system information, and the second obtains a summary graph by grouping related nodes and edges, with the goal of obtaining a graph view similar to the application workflow. Through detailed experiments, we show the efficacy and efficiency of our algorithms. Full article
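The first summarization step, collapsing low-level system information out of a provenance graph, can be sketched roughly as follows. This is a simplified illustration, not the authors' algorithm; the edge-set representation and the `is_system` predicate are assumptions.

```python
from collections import defaultdict

def process_view(edges, is_system):
    """Collapse system-level nodes out of a provenance graph, reconnecting
    each hidden node's predecessors directly to its successors.

    `edges` is a set of (source, target) pairs; `is_system` flags nodes
    that represent low-level system detail to hide from the process view.
    """
    nodes = {n for edge in edges for n in edge}
    succ = defaultdict(set)
    pred = defaultdict(set)
    for a, b in edges:
        succ[a].add(b)
        pred[b].add(a)
    for n in [n for n in nodes if is_system(n)]:
        # Short-circuit every path that passes through the hidden node.
        for p in pred[n]:
            for s in succ[n]:
                succ[p].add(s)
                pred[s].add(p)
        for p in pred[n]:
            succ[p].discard(n)
        for s in succ[n]:
            pred[s].discard(n)
        del succ[n], pred[n]
    return {(a, b) for a in succ for b in succ[a]}
```

Hiding a low-level `fopen` node, for example, leaves a direct edge from the reading step to the data it produced.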
(This article belongs to the Special Issue Using Computational Provenance)

Open Access Article A Novel Three-Stage Filter-Wrapper Framework for miRNA Subset Selection in Cancer Classification
Informatics 2018, 5(1), 13; https://doi.org/10.3390/informatics5010013
Received: 2 November 2017 / Revised: 20 February 2018 / Accepted: 27 February 2018 / Published: 1 March 2018
PDF Full-text (1077 KB) | HTML Full-text | XML Full-text
Abstract
Micro-Ribonucleic Acids (miRNAs) are small non-coding Ribonucleic Acid (RNA) molecules that play an important role in cancer growth. There are many miRNAs in the human body, and not all of them are responsible for cancer growth. Therefore, there is a need for novel miRNA subset selection algorithms that remove irrelevant and redundant miRNAs and find the miRNAs responsible for cancer development. This paper proposes a novel three-stage miRNA subset selection framework for increasing cancer classification accuracy. In the first stage, multiple filter algorithms rank the miRNAs according to their relevance to the class label, and a miRNA pool is generated from the top-ranked miRNAs of each filter algorithm. In the second stage, we rank the miRNAs of the pool using multiple filter algorithms and use this ranking to weight the probability of selecting each miRNA. In the third stage, Competitive Swarm Optimization (CSO) searches for an optimal subset of the weighted miRNAs of the pool, one that gives the most information about the cancer patients. The balance between exploration and exploitation in the proposed algorithm is maintained by a zero-order Fuzzy Inference System (FIS). Experiments on several miRNA cancer datasets indicate that the proposed three-stage framework performs well in terms of both a low cancer classification error rate and a minimal number of selected miRNAs. Full article
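Stages one and two of such a framework might look like the following sketch. It is illustrative only; the Borda-style rank scoring and the `top_k` parameter are assumptions, not the paper's exact formulas.

```python
def build_pool(rankings, top_k):
    """Stage 1: union of the top-k features from each filter's ranking.

    `rankings` is a list of feature lists, each ordered best-first by
    one filter algorithm (e.g. mutual information, chi-squared).
    """
    pool = set()
    for ranking in rankings:
        pool.update(ranking[:top_k])
    return pool

def selection_weights(rankings, pool):
    """Stage 2: weight each pooled feature by its mean rank score across
    filters (a Borda-style count), normalized to selection probabilities
    for the wrapper search in stage 3."""
    weights = {}
    for m in pool:
        score = 0.0
        for ranking in rankings:
            if m in ranking:
                # Best rank earns len(ranking) points, worst earns 1.
                score += len(ranking) - ranking.index(m)
        weights[m] = score / len(rankings)
    total = sum(weights.values())
    return {m: w / total for m, w in weights.items()}
```

A feature ranked first by every filter ends up with the largest selection probability, biasing the stage-three swarm search toward it without excluding the rest of the pool.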
(This article belongs to the Special Issue Biomedical Informatics)

Open Access Article Using Introspection to Collect Provenance in R
Informatics 2018, 5(1), 12; https://doi.org/10.3390/informatics5010012
Received: 1 December 2017 / Revised: 25 February 2018 / Accepted: 26 February 2018 / Published: 1 March 2018
PDF Full-text (379 KB) | HTML Full-text | XML Full-text
Abstract
Data provenance is the history of an item of data from the point of its creation to its present state. It can support science by improving understanding of and confidence in data. RDataTracker is an R package that collects data provenance from R scripts (https://github.com/End-to-end-provenance/RDataTracker). In addition to details on inputs, outputs, and the computing environment collected by most provenance tools, RDataTracker also records a detailed execution trace and intermediate data values. It does this using R’s powerful introspection functions and by parsing R statements prior to sending them to the interpreter so it knows what provenance to collect. The provenance is stored in a specialized graph structure called a Data Derivation Graph, which makes it possible to determine exactly how an output value is computed or how an input value is used. In this paper, we provide details about the provenance RDataTracker collects and the mechanisms used to collect it. We also speculate about how this rich source of information could be used by other tools to help an R programmer gain a deeper understanding of the software used and to support reproducibility. Full article
(This article belongs to the Special Issue Using Computational Provenance)

Open Access Article LabelFlow Framework for Annotating Workflow Provenance
Informatics 2018, 5(1), 11; https://doi.org/10.3390/informatics5010011
Received: 28 November 2017 / Revised: 4 February 2018 / Accepted: 21 February 2018 / Published: 23 February 2018
PDF Full-text (2623 KB) | HTML Full-text | XML Full-text
Abstract
Scientists routinely analyse and share data for others to use. Successful data (re)use relies on having metadata describing the context in which the data were analysed. In many disciplines the creation of contextual metadata is referred to as reporting. One method of implementing analyses is with workflows. A stand-out feature of workflows is their ability to record provenance from executions. Provenance is useful when analyses are executed with changing parameters (changing contexts) and results need to be traced to the respective parameters. In this paper we investigate whether provenance can be exploited to support reporting. Specifically, we outline a case study based on a real-world workflow and a set of reporting queries. We observe that provenance, as collected from workflow executions, is of limited use for reporting, as it supports queries only partially. We identify that this is due to the generic nature of provenance and its lack of domain-specific contextual metadata. We observe that the required information is available in implicit form, embedded in data. We describe LabelFlow, a framework comprised of four Labelling Operators for decorating provenance with domain-specific Labels. LabelFlow can be instantiated for a domain by plugging in domain-specific metadata extractors. We provide a tool that takes a workflow as input and produces as output a Labelling Pipeline for that workflow, comprised of Labelling Operators. We revisit the case study and show how Labels provide a more complete implementation of the reporting queries. Full article
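A single labelling operator of the kind described might be sketched like this. It is a hypothetical simplification: LabelFlow defines four operators, and the record layout and extractor signature here are assumptions.

```python
def label_provenance(records, extractors):
    """Decorate generic provenance records with domain-specific labels.

    `extractors` maps a label name to a pluggable function that inspects
    a record's data and returns a label value, or None if inapplicable.
    """
    labelled = []
    for rec in records:
        labels = {}
        for name, extract in extractors.items():
            value = extract(rec["data"])
            if value is not None:
                labels[name] = value
        labelled.append({**rec, "labels": labels})
    return labelled
```

A domain instantiation supplies the extractors; for instance, one that pulls an `organism=` field out of the data attached to each provenance record, leaving records without that field unlabelled.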
(This article belongs to the Special Issue Using Computational Provenance)

Open Access Article From Offshore Operation to Onshore Simulator: Using Visualized Ethnographic Outcomes to Work with Systems Developers
Informatics 2018, 5(1), 10; https://doi.org/10.3390/informatics5010010
Received: 19 December 2017 / Revised: 3 February 2018 / Accepted: 6 February 2018 / Published: 9 February 2018
PDF Full-text (9256 KB) | HTML Full-text | XML Full-text
Abstract
This paper focuses on the process of translating insights from a Computer Supported Cooperative Work (CSCW)-based study, conducted on a vessel at sea, into a model that can assist systems developers working with simulators, which are used by vessel operators for training purposes on land. That is, the empirical study at sea brought about rich insights into cooperation, which is important for systems developers to know about and consider in their designs. In the paper, we establish a model that primarily consists of a ‘computational artifact’. The model is designed to support researchers working with systems developers. Drawing on marine examples, we focus on the translation process and investigate how the model serves to visualize work activities; how it addresses relations between technical and computational artifacts, as well as between functions in technical systems and functionalities in cooperative systems. In turn, we link design back to fieldwork studies. Full article

Open Access Article Bus Operations Scheduling Subject to Resource Constraints Using Evolutionary Optimization
Received: 19 December 2017 / Revised: 28 January 2018 / Accepted: 2 February 2018 / Published: 6 February 2018
PDF Full-text (976 KB) | HTML Full-text | XML Full-text
Abstract
In public transport operations, vehicles tend to bunch together due to the instability of passenger demand and traffic conditions. Fluctuation of the expected waiting times of passengers at bus stops due to bus bunching is perceived as service unreliability and degrades the overall quality of service. For assessing the performance of high-frequency bus services, transportation authorities monitor the daily operations via Transit Management Systems (TMS) that collect vehicle positioning information in near real-time. This work explores the potential of using Automated Vehicle Location (AVL) data from the running vehicles for generating bus schedules that improve the service reliability and conform to various regulatory constraints. The computer-aided generation of optimal bus schedules is a tedious task due to the nonlinear and multi-variable nature of the bus scheduling problem. For this reason, this work develops a two-level approach where (i) the regulatory constraints are satisfied and (ii) the waiting times of passengers are optimized with the introduction of an evolutionary algorithm. This work also discusses the experimental results from the implementation of such an approach in a bi-directional bus line operated by a major bus operator in northern Europe. Full article
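The two-level idea, satisfying regulatory constraints and then evolving the schedule to reduce passenger waiting, can be caricatured with a toy (1+1) evolution strategy over dispatch headways. This is purely illustrative; the constraint values, fitness function, and mutation operator are assumptions, not the operator's actual model.

```python
import random

def expected_wait(headways):
    """For passengers arriving at random, E[wait] = E[H^2] / (2 E[H])."""
    mean = sum(headways) / len(headways)
    mean_sq = sum(h * h for h in headways) / len(headways)
    return mean_sq / (2.0 * mean)

def feasible(headways, min_h=3.0, max_h=15.0, horizon=120.0):
    """Level (i): regulatory constraints -- headway bounds and a fixed
    service horizon that the trips must cover exactly (minutes)."""
    return (all(min_h <= h <= max_h for h in headways)
            and abs(sum(headways) - horizon) < 1e-6)

def evolve(init, generations=500, seed=0):
    """Level (ii): a (1+1) evolution strategy. Each mutation shifts time
    between two headways (keeping the horizon fixed); the child replaces
    the parent only when it is feasible and lowers the expected wait."""
    rng = random.Random(seed)
    parent = list(init)
    for _ in range(generations):
        child = parent[:]
        i, j = rng.sample(range(len(child)), 2)
        delta = rng.uniform(-1.0, 1.0)
        child[i] += delta
        child[j] -= delta
        if feasible(child) and expected_wait(child) < expected_wait(parent):
            parent = child
    return parent
```

Starting from a bunched schedule of alternating 5- and 15-minute headways, the strategy drifts toward the even schedule, which minimizes the squared-headway term in the waiting-time formula.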

Open Access Article Embracing First-Person Perspectives in Soma-Based Design
Received: 9 November 2017 / Revised: 18 January 2018 / Accepted: 19 January 2018 / Published: 1 February 2018
Cited by 3 | PDF Full-text (5280 KB) | HTML Full-text | XML Full-text
Abstract
A set of prominent designers embarked on a research journey to explore aesthetics in movement-based design. Here we unpack one of the design sensitivities unique to our practice: a strong first person perspective—where the movements, somatics and aesthetic sensibilities of the designer, design researcher and user are at the forefront. We present an annotated portfolio of design exemplars and a brief introduction to some of the design methods and theory we use, together substantiating and explaining the first-person perspective. At the same time, we show how this felt dimension, despite its subjective nature, is what provides rigor and structure to our design research. Our aim is to assist researchers in soma-based design and designers wanting to consider the multiple facets when designing for the aesthetics of movement. The applications span a large field of designs, including slow introspective, contemplative interactions, arts, dance, health applications, games, work applications and many others. Full article
(This article belongs to the Special Issue Tangible and Embodied Interaction)

Open Access Article Internet of Tangible Things (IoTT): Challenges and Opportunities for Tangible Interaction with IoT
Received: 16 July 2017 / Revised: 4 January 2018 / Accepted: 10 January 2018 / Published: 25 January 2018
PDF Full-text (5496 KB) | HTML Full-text | XML Full-text
Abstract
In the Internet of Things era, an increasing number of everyday objects are able to offer innovative services to the user. However, most of these devices provide only smartphone or web user interfaces. As a result, the interaction is disconnected from the physical world, decreasing the user experience and increasing the risk of user alienation from the physical world. We argue that tangible interaction can counteract this trend and this article discusses the potential benefits and the still open challenges of tangible interaction applied to the Internet of Things. After an analysis of open challenges for Human-Computer Interaction in IoT, we summarize current trends in tangible interaction and extrapolate eight tangible interaction properties that could be exploited for designing novel interactions with IoT objects. Through a systematic review of tangible interaction applied to IoT, we show what has been already explored in the systems that pioneered the field and the future explorations that still have to be conducted. In order to guide future work in this field, we propose a design card set for supporting the generation of tangible interfaces for IoT objects. The card set has been evaluated during a workshop with 21 people and the results are discussed. Full article
(This article belongs to the Special Issue Tangible and Embodied Interaction)

Open Access Article A Hybrid Approach to Recognising Activities of Daily Living from Object Use in the Home Environment
Received: 13 December 2017 / Revised: 5 January 2018 / Accepted: 10 January 2018 / Published: 13 January 2018
PDF Full-text (791 KB) | HTML Full-text | XML Full-text
Abstract
Accurate recognition of Activities of Daily Living (ADL) plays an important role in providing assistance and support to the elderly and cognitively impaired. Current knowledge-driven and ontology-based techniques model object concepts from assumptions and everyday common knowledge of object use for routine activities. Modelling activities from such information can lead to incorrect recognition of particular routine activities, resulting in possible failure to detect abnormal activity trends. In cases where such prior knowledge is not available, these techniques become virtually unusable. A significant step in the recognition of activities is the accurate discovery of object usage for specific routine activities. This paper presents a hybrid framework for automatic consumption of sensor data and association of object usage with routine activities using Latent Dirichlet Allocation (LDA) topic modelling. This process enables the recognition of simple activities of daily living from object usage and interactions in the home environment. The evaluation of the proposed framework on the Kasteren and Ordonez datasets shows that it yields better results than existing techniques. Full article
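The sensor-data consumption step can be pictured as turning event streams into the bag-of-words input that LDA expects, with time windows as "documents" and used objects as "words". This is an illustrative sketch; the window length and event format are assumptions, and the LDA fit itself would be performed by a separate library on the resulting matrix.

```python
from collections import Counter

def windows(events, window_s=60):
    """Slice a stream of (timestamp_s, object) sensor events into
    fixed-length time windows; each window becomes one 'document'
    whose 'words' are the objects used during it."""
    docs = {}
    for t, obj in events:
        docs.setdefault(int(t // window_s), []).append(obj)
    return [docs[k] for k in sorted(docs)]

def bag_of_words(docs):
    """Vocabulary plus per-document object-use counts: the document-term
    matrix an LDA implementation consumes (topics ~ candidate activities)."""
    vocab = sorted({w for d in docs for w in d})
    matrix = []
    for d in docs:
        counts = Counter(d)
        matrix.append([counts.get(w, 0) for w in vocab])
    return vocab, matrix
```

A window containing kettle and cup events, for instance, yields a count row that an LDA topic (plausibly "making tea") can absorb, while a later toothbrush window feeds a different topic.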
(This article belongs to the Special Issue Sensor-Based Activity Recognition and Interaction)

Open Access Editorial Acknowledgement to Reviewers of Informatics in 2017
Received: 11 January 2018 / Revised: 11 January 2018 / Accepted: 11 January 2018 / Published: 11 January 2018
PDF Full-text (230 KB) | HTML Full-text | XML Full-text
Abstract
Peer review is an essential part of the publication process, ensuring that Informatics maintains high quality standards for its published papers. [...] Full article
Open Access Editorial Ambient Assisted Living for Improvement of Health and Quality of Life—A Special Issue of the Journal of Informatics
Received: 31 December 2017 / Revised: 3 January 2018 / Accepted: 3 January 2018 / Published: 9 January 2018
PDF Full-text (146 KB) | HTML Full-text | XML Full-text
Abstract
The demographic change with respect to the ageing of the population has been a worldwide trend[...] Full article
(This article belongs to the Special Issue Ambient Assisted Living for Improvement of Health and Quality of Life)
Open Access Article An Adaptable System to Support Provenance Management for the Public Policy-Making Process in Smart Cities
Received: 1 August 2017 / Revised: 1 December 2017 / Accepted: 29 December 2017 / Published: 8 January 2018
PDF Full-text (4735 KB) | HTML Full-text | XML Full-text
Abstract
Government policies aim to address public issues and problems and therefore play a pivotal role in people’s lives. The creation of public policies, however, is complex given the involvement of large and diverse sets of stakeholders, considerable human participation, lengthy processes, complex task specification and the non-deterministic nature of the process. The inherent complexities of the policy process pose challenges for designing a computing system that assists in supporting and automating the business process pertaining to policy setup, and also raise concerns for setting up a tracking service in the policy-making environment. A tracking service informs how decisions have been taken during policy creation and can provide useful and intrinsic information regarding the policy process. At present, there exists no computing system that assists in tracking the complete process employed for policy creation. To design such a system, it is important to consider the challenges of the policy environment; for this, a novel network- and goal-based approach has been framed and is covered in detail in this paper. Furthermore, smart governance objectives that include stakeholders’ participation and citizens’ involvement have been considered. Thus, the proposed approach has been devised by considering smart governance principles and the knowledge environment of policy making, where tasks are largely dependent on policy makers’ decisions and on individual policy objectives. Our approach accounts for the human dimension in deciding and defining autonomous process activities at run time. Furthermore, with the network-based approach, provenance data tracking is employed, which enables the capture of the policy process. Full article
(This article belongs to the Special Issue Smart Government in Smart Cities)

Open Access Article Modeling and Application of Customer Lifetime Value in Online Retail
Received: 11 December 2017 / Revised: 31 December 2017 / Accepted: 3 January 2018 / Published: 6 January 2018
PDF Full-text (1792 KB) | HTML Full-text | XML Full-text
Abstract
This article provides an empirical statistical analysis and discussion of the predictive abilities of selected customer lifetime value (CLV) models that could be used in online shopping within e-commerce business settings. The comparison of CLV predictive abilities, using selected evaluation metrics, is made on selected CLV models: Extended Pareto/NBD model (EP/NBD), Markov chain model and Status Quo model. The article uses six online store datasets with annual revenues in the order of tens of millions of euros for the comparison. The EP/NBD model has outperformed other selected models in a majority of evaluation metrics and can be considered good and stable for non-contractual relations in online shopping. The implications for the deployment of selected CLV models in practice, as well as suggestions for future research, are also discussed. Full article
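The Markov chain model included in the comparison can be illustrated with a minimal finite-horizon CLV computation. This is a sketch under assumed states and numbers, not the article's calibrated model.

```python
def clv_markov(transition, margin, discount=0.1, horizon=20, start=0):
    """Expected discounted margin over a finite horizon for a customer
    whose state (e.g. active vs. churned) evolves by a Markov chain.

    `transition[i][j]` is the probability of moving from state i to
    state j each period; `margin[i]` is the per-period profit earned
    in state i; `discount` is the per-period discount rate.
    """
    n = len(transition)
    probs = [0.0] * n
    probs[start] = 1.0
    clv = 0.0
    for t in range(horizon):
        # Discounted expected margin this period, then advance the chain.
        clv += sum(p * m for p, m in zip(probs, margin)) / (1.0 + discount) ** t
        probs = [sum(probs[i] * transition[i][j] for i in range(n))
                 for j in range(n)]
    return clv
```

With an 80% retention rate, an absorbing churn state, a 100-unit margin, and no discounting, the undiscounted CLV converges to 100 / (1 - 0.8) = 500 as the horizon grows.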

Open Access Article Designing towards the Unknown: Engaging with Material and Aesthetic Uncertainty
Received: 9 August 2017 / Revised: 12 December 2017 / Accepted: 12 December 2017 / Published: 26 December 2017
Cited by 1 | PDF Full-text (3792 KB) | HTML Full-text | XML Full-text
Abstract
New materials with new capabilities demand new ways of approaching design. Destabilising existing methods is crucial to develop new methods. Yet, radical destabilisation—where outcomes remain unknown long enough that new discoveries become possible—is not easy in technology design where complex interdisciplinary teams with time and resource constraints need to deliver concrete outcomes on schedule. The Poetic Kinaesthetic Interface project (PKI) engages with this problematic directly. In PKI we use unfolding processes—informed by participatory, speculative and critical design—in emergent actions, to design towards unknown outcomes, using unknown materials. The impossibility of this task is proving as useful as it is disruptive. At its most potent, it is destabilising expectations, aesthetics and processes. Keeping the researchers, collaborators and participants in a state of unknowing, is opening the research potential to far-ranging possibilities. In this article we unpack the motivations driving the PKI project. We present our mixed-methodology, which entangles textile crafts, design interactions and materiality to shape an embodied enquiry. Our research outcomes are procedural and methodological. PKI brings together diverse human, non-human, known and unknown actors to discover where the emergent assemblages might lead. Our approach is re-invigorating—as it demands re-envisioning of—the design process. Full article
(This article belongs to the Special Issue Tangible and Embodied Interaction)
