Open Access Article
Design of a Convolutional Two-Dimensional Filter in FPGA for Image Processing Applications
Computers 2017, 6(2), 19; doi:10.3390/computers6020019
Abstract
Exploiting the Bachet weight decomposition theorem, a new two-dimensional filter is designed. The filter can be adapted to different multimedia applications, but in this work it is specifically targeted at image processing applications. The method allows emulating standard 32-bit floating point multipliers using a chain of fixed point adders and a logic unit to manage the exponent, in order to obtain IEEE-754-compliant results. The proposed design allows a more compact implementation of a floating point filtering architecture when a fixed set of coefficients and a fixed range of input values are used. The elaboration of the data proceeds in raster-scan order and can directly process the data coming from the acquisition source thanks to a careful organization of the memories, avoiding the implementation of frame buffers or any aligning circuitry. The proposed architecture shows state-of-the-art performance in terms of critical path delay, achieving 4.7 ns when implemented on a Xilinx Virtex 7 FPGA board.
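The raster-scan elaboration described above can be illustrated in software. The sketch below is not the paper's fixed-point architecture; it is a generic 3×3 convolution that consumes pixels in raster-scan order and buffers only the two previous rows, mirroring how FPGA line buffers avoid a full frame buffer:

```python
from collections import deque

def conv3x3_raster(pixels, width, height, kernel):
    """Convolve a 3x3 kernel over an image delivered pixel-by-pixel in
    raster-scan order, buffering only the two previous rows (no frame
    buffer). Output covers positions where the window is fully valid."""
    line0 = deque([0] * width)  # row y-2, columns x..width-1
    line1 = deque([0] * width)  # row y-1, columns x..width-1
    window = [[0] * 3 for _ in range(3)]  # 3x3 sliding window
    out = []
    for y in range(height):
        for x in range(width):
            p = pixels[y * width + x]
            # shift the window one column to the left
            for r in range(3):
                window[r][0], window[r][1] = window[r][1], window[r][2]
            # load the new rightmost column from the line buffers
            window[0][2] = line0.popleft()
            window[1][2] = line1.popleft()
            window[2][2] = p
            # push current values so they reappear one row later
            line0.append(window[1][2])
            line1.append(p)
            if y >= 2 and x >= 2:  # window fully inside the image
                out.append(sum(window[r][c] * kernel[r][c]
                               for r in range(3) for c in range(3)))
    return out
```

In hardware, the two deques correspond to on-chip line memories of one row each, and the 3×3 window to a register array shifted once per pixel clock.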

Open Access Article
The Right to Remember: Implementing a Rudimentary Emotive-Effect Layer for Frustration on AI Agent Gameplay Strategy
Computers 2017, 6(2), 18; doi:10.3390/computers6020018
Abstract
AI (Artificial Intelligence) is often looked at as a logical way to develop a game agent that methodically weighs options and delivers rational or irrational solutions. This paper is based on developing an AI agent that plays a game with emotive content similar to a human's. The purpose of the study was to see if the incorporation of this emotive content would influence the outcomes within the game Love Letter. In order to do this, an AI agent with an emotive layer was developed to play the game over a million times. A lower win/loss ratio demonstrates that, to some extent, this methodology was vindicated and a 100 per cent win rate for the AI agent did not happen. Machine learning techniques were modelled purposely so as to match extreme models of behavioural change. The results demonstrated a win/loss ratio of 0.67 for the AI agent and, in many ways, reflected the frustration that a normal player would exhibit during game play. As was hypothesised, the final agent investment value was, on average, lower after match play than its initial value.

Open Access Article
Research on Similarity Measurements of 3D Models Based on Skeleton Trees
Computers 2017, 6(2), 17; doi:10.3390/computers6020017
Abstract
There is a growing need to be able to accurately and efficiently recognize similar models from existing model sets, in particular for 3D models. This paper proposes a method of similarity measurement for 3D models, in which the similarity between 3D models is easily, accurately and automatically calculated by means of skeleton trees constructed by a simple rule. The skeleton operates well as a key descriptor of a 3D model. Specifically, a skeleton tree represents node features (including connection and orientation) that reflect the topology of 3D models, and branch features (including region and bending degree) that reflect their geometry. Node feature distance is first computed as the dot product of node connection distance, defined by the 2-norm, and node orientation distance, defined by tangent space distance. Branch feature distances are then computed as the weighted sum of the average regional distance, defined by the generalized Hausdorff distance, and the average bending degree distance, defined by curvature. Overall similarity is expressed as the weighted sum of topology and geometry similarity. The similarity calculation is efficient and accurate because no operations such as rotation or translation are required, and more topological and geometric information is considered. Experiments demonstrate the feasibility and accuracy of the proposed method.
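Two of the distance ingredients above can be sketched as follows, with simplifying assumptions: the plain symmetric Hausdorff distance stands in for the generalized Hausdorff distance, point sets are small 2D tuples, and the weights are illustrative:

```python
from math import dist  # Euclidean distance, Python 3.8+

def directed_hausdorff(a, b):
    """Greatest distance from any point in a to its nearest neighbour in b."""
    return max(min(dist(p, q) for q in b) for p in a)

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point sets."""
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

def overall_similarity(topo_sim, geom_sim, w_topo=0.5):
    """Weighted sum of topology and geometry similarity, as in the abstract."""
    return w_topo * topo_sim + (1 - w_topo) * geom_sim
```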

Open Access Feature Paper Review
Reliability of NAND Flash Memories: Planar Cells and Emerging Issues in 3D Devices
Computers 2017, 6(2), 16; doi:10.3390/computers6020016
Abstract
We review the state-of-the-art in the understanding of planar NAND Flash memory reliability and discuss how the recent move to three-dimensional (3D) devices has affected this field. Particular emphasis is placed on mechanisms developing along the lifetime of the memory array, as opposed to time-zero or technological issues, and the viewpoint is focused on the understanding of the root causes. The impressive amount of published work demonstrates that Flash reliability is a complex yet well-understood field, where nonetheless tighter and tighter constraints are set by device scaling. Three-dimensional NAND has offset the traditional scaling scenario, leading to an improvement in performance and reliability while raising new issues to be dealt with, determined by the newer and more complex cell and array architectures as well as operation modes. A thorough understanding of the complex phenomena involved in the operation and reliability of NAND cells remains vital for the development of future technology nodes.

Open Access Article
Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm
Computers 2017, 6(2), 15; doi:10.3390/computers6020015
Abstract
In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and a non-adaptive genetic algorithm in terms of solution quality.
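The adaptive operator selection at the heart of an algorithm like AGA can be sketched as follows; the proportional-to-success rule and the floor term are illustrative assumptions, not necessarily the paper's exact update rule:

```python
def adapt_probs(successes, trials, floor=0.1):
    """Adaptive operator selection: each crossover/mutation operator's
    selection probability is made proportional to its observed success
    rate, with a small floor so no operator is ever starved of trials.
    successes[i] / trials[i] is the fraction of applications of operator
    i that improved the offspring's fitness."""
    scores = [(s / t if t else 0.0) + floor
              for s, t in zip(successes, trials)]
    total = sum(scores)
    return [sc / total for sc in scores]
```

At each generation the GA would draw its crossover (or mutation) operator from these probabilities, so operators that keep producing better schedules get applied more often.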

Open Access Article
Emotion Elicitation in a Socially Intelligent Service: The Typing Tutor
Computers 2017, 6(2), 14; doi:10.3390/computers6020014
Abstract
This paper presents an experimental study on modeling machine emotion elicitation in a socially intelligent service, the typing tutor. The aim of the study is to evaluate the extent to which machine emotion elicitation can influence the affective state (valence and arousal) of the learner during a tutoring session. The tutor provides continuous real-time emotion elicitation via graphically rendered emoticons, as emotional feedback on the learner’s performance. Good performance is rewarded with the positive emoticon, based on the notion of positive reinforcement. Facial emotion recognition software is used to analyze the affective state of the learner for later evaluation. Experimental results show that the correlation between the positive emoticon and the learner’s affective state is significant for all 13 (100%) test participants on the arousal dimension and for 9 (69%) test participants on both affective dimensions. The results also confirm our hypothesis and show that machine emotion elicitation is significant for 11 (85%) of 13 test participants. We conclude that machine emotion elicitation with simple graphical emoticons has promising potential for the future development of the tutor.
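Per-participant correlations of the kind reported above can be computed with the standard Pearson coefficient; the sketch below omits the significance test, and the sample vectors in the usage test are illustrative:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples,
    e.g. emoticon valence vs. measured learner arousal per time step."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```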

Open Access Article
Towards Trustworthy Collaborative Editing
Computers 2017, 6(2), 13; doi:10.3390/computers6020013
Abstract
Real-time collaborative editing applications are drastically different from typical client–server applications in that every participant has a copy of the shared document. In this type of environment, each participant acts as both a client and a server replica. In this article, we elaborate on how to adapt Byzantine fault tolerance (BFT) mechanisms to enhance the trustworthiness of such applications. It is apparent that traditional BFT algorithms cannot be used directly because they would dictate that all updates submitted by participants be applied sequentially, which would defeat the purpose of collaborative editing. The goal of this study is to design and implement an efficient BFT solution by exploiting the application semantics and by doing a threat analysis of these types of applications. Our solution can be considered a form of optimistic BFT in that local states maintained by each participant may diverge temporarily. The states of the participants are made consistent with each other by a periodic synchronization mechanism.

Open Access Article
Body-Borne Computers as Extensions of Self
Computers 2017, 6(1), 12; doi:10.3390/computers6010012
Abstract
The opportunities for wearable technologies go well beyond always-available information displays or health sensing devices. The concept of the cyborg introduced by Clynes and Kline, along with works in various fields of research and the arts, offers a vision of what technology integrated with the body can offer. This paper identifies different categories of research aimed at augmenting humans. The paper specifically focuses on three areas of augmentation of the human body and its sensorimotor capabilities: physical morphology, skin display, and somatosensory extension. We discuss how such digital extensions relate to the malleable nature of our self-image. We argue that body-borne devices are no longer simply functional apparatus, but offer a direct interplay with the mind. Finally, we also showcase some of our own projects in this area and shed light on future challenges.

Open Access Article
Exploring a New Security Framework for Remote Patient Monitoring Devices
Computers 2017, 6(1), 11; doi:10.3390/computers6010011
Abstract
Security has been an issue of contention in healthcare. The lack of familiarity and poor implementation of security in healthcare leave patients’ data vulnerable to attackers. The main issue is assessing how we can provide security in a Remote Patient Monitoring (RPM) infrastructure. The findings in the literature show there is little empirical evidence on proper implementation of security. Therefore, there is an urgent need to address cybersecurity issues in medical devices. Through the review of relevant literature in remote patient monitoring and the use of a Microsoft threat modelling tool, we identify and explore current vulnerabilities and threats in IEEE 11073 standard devices to propose a new security framework for RPM devices. Additionally, current RPM devices have a limitation on the number of people who can share a single device; therefore, we propose the use of NFC for identification in RPM devices in multi-user environments, where multiple people share a single device, to reduce errors associated with incorrect user identification. We finally show how several techniques have been used to build the proposed framework.

Open Access Article
Discrete Event Simulation Method as a Tool for Improvement of Manufacturing Systems
Computers 2017, 6(1), 10; doi:10.3390/computers6010010
Abstract
The problem of production flow in manufacturing systems is analyzed. The machines can be operated by workers or by robots; since breakdowns and human factors destabilize the production processes, robots are preferred to perform them. The problem is how to determine the real difference in work efficiency between humans and robots. We present an analysis of the production efficiency and reliability of press shop lines operated by human operators or industrial robots. This is a problem from the field of Operations Research for which the Discrete Event Simulation (DES) method has been used. Three models have been developed, including the manufacturing line before and after robotization, taking into account stochastic parameters of availability and reliability of the machines, operators, and robots. We apply the OEE (Overall Equipment Effectiveness) indicator to show how the availability, reliability, and quality parameters influence the performance of the workstations, both in the short run and in the long run. In addition, the stability of the simulation model was analyzed. This approach enables a better representation of real manufacturing processes.
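The OEE indicator mentioned above follows the standard decomposition into availability, performance, and quality rates (the example figures in the test are illustrative, not from the paper):

```python
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """Overall Equipment Effectiveness as the product of the three
    standard rates:
      availability = actual run time / planned production time
      performance  = (ideal cycle time * total units) / run time
      quality      = good units / total units
    """
    availability = run_time / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality
```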

Open Access Article
Traffic Priority-Aware Adaptive Slot Allocation for Medium Access Control Protocol in Wireless Body Area Network
Computers 2017, 6(1), 9; doi:10.3390/computers6010009
Abstract
Biomedical sensors (BMSs) monitor the heterogeneous vital signs of patients. They have diverse Quality of Service (QoS) requirements, including reduced collision, delay, loss, and energy consumption in the transmission of data, which may be non-constrained, delay-constrained, reliability-constrained, or critical. In this context, this paper proposes a traffic priority-aware adaptive slot allocation-based medium access control (TraySL-MAC) protocol. Firstly, a reduced contention adaptive slot allocation algorithm is presented to minimize contention rounds. Secondly, a low threshold vital signs criticality-based adaptive slot allocation algorithm is developed for high priority data. Thirdly, a high threshold vital signs criticality-based adaptive slot allocation algorithm is designed for low priority data. Simulations are performed to comparatively evaluate the performance of the proposed protocol against state-of-the-art MAC protocols. From the analysis of the results, it is evident that the proposed protocol is beneficial in terms of lower packet delivery delay and energy consumption, and higher throughput, in realistic biomedical environments.
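A toy version of priority-aware slot allocation might look like the following; the sensor names, the single criticality score, and the highest-priority-first rule are illustrative assumptions, not the TraySL-MAC algorithms themselves:

```python
def allocate_slots(requests, n_slots):
    """Toy priority-aware slot allocation for a superframe: sensors with
    higher vital-sign criticality get slots first; sensors left over when
    the superframe is full must wait for the next round. `requests` is a
    list of (sensor_id, criticality) pairs."""
    ranked = sorted(requests, key=lambda r: -r[1])  # highest priority first
    schedule = {}
    for slot, (sensor, _criticality) in enumerate(ranked):
        if slot >= n_slots:
            break  # superframe exhausted
        schedule[sensor] = slot
    return schedule
```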

Open Access Review
A Survey of Soft-Error Mitigation Techniques for Non-Volatile Memories
Computers 2017, 6(1), 8; doi:10.3390/computers6010008
Abstract
Non-volatile memories (NVMs) offer superior density and energy characteristics compared to the conventional memories; however, NVMs suffer from severe reliability issues that can easily eclipse their energy efficiency advantages. In this paper, we survey architectural techniques for improving the soft-error reliability of NVMs, specifically PCM (phase change memory) and STT-RAM (spin transfer torque RAM). We focus on soft errors, such as resistance drift and write disturbance in PCM, and read disturbance and write failures in STT-RAM. By classifying the research works based on key parameters, we highlight their similarities and distinctions. We hope that this survey will underline the crucial importance of addressing NVM reliability for ensuring their system integration and will be useful for researchers, computer architects and processor designers.

Open Access Article
Assessing Efficiency of Prompts Based on Learner Characteristics
Computers 2017, 6(1), 7; doi:10.3390/computers6010007
Abstract
Personalized prompting research has shown the significant learning benefit of prompting. The current paper outlines and examines a personalized prompting approach aimed at eliminating performance differences on the basis of a number of learner characteristics (capturing learning strategies and traits). The learner characteristics of interest were the need for cognition, work effort, computer self-efficacy, the use of surface learning, and the learner’s confidence in their learning. The approach was tested in two e-modules, using similar assessment forms (experimental group n = 413; control group n = 243). Several prompts corresponding to the learner characteristics were implemented, including an explanation prompt, a motivation prompt, a strategy prompt, and an assessment prompt. All learner characteristics were significant correlates of at least one of the outcome measures (test performance, errors, and omissions). However, only the assessment prompt increased test performance. On this basis, and drawing upon the testing effect, this prompt may be a particularly promising option to increase performance in e-learning and similar personalized systems.

Open Access Article
A Comparative Experimental Design and Performance Analysis of Snort-Based Intrusion Detection System in Practical Computer Networks
Computers 2017, 6(1), 6; doi:10.3390/computers6010006
Abstract
As one of the most reliable technologies, the network intrusion detection system (NIDS) allows the monitoring of incoming and outgoing traffic to identify unauthorised usage and malicious activity by attackers in computer network systems. To this extent, this paper investigates the experimental performance of Snort-based NIDS (S-NIDS) in a practical network with the latest technology in various network scenarios, including high data speed and/or heavy traffic and/or large packet size. An effective testbed is designed based on Snort using different multi-core processors, e.g., i5 and i7, with different operating systems, e.g., Windows 7, Windows Server and Linux. Furthermore, considering an enterprise network consisting of multiple virtual local area networks (VLANs), a centralised parallel S-NIDS (CPS-NIDS) is proposed with the support of a centralised database server to deal with high data speed and heavy traffic. Experimental evaluation is carried out for each network configuration to evaluate the performance of the S-NIDS in different network scenarios as well as to validate the effectiveness of the proposed CPS-NIDS. In particular, by analysing packet analysis efficiency, an improved performance of up to 10% is shown to be achieved with Linux over other operating systems, while up to 8% improved performance can be achieved with i7 over i5 processors.

Open Access Article
Grouped Bees Algorithm: A Grouped Version of the Bees Algorithm
Computers 2017, 6(1), 5; doi:10.3390/computers6010005
Abstract
In many non-deterministic search algorithms, particularly those analogous to complex biological systems, there are a number of inherent difficulties, and the Bees Algorithm (BA) is no exception. The BA is a population-based metaheuristic search algorithm inspired by bees seeking nectar/pollen. Basic versions and variations of the BA have their own drawbacks, including a large number of parameters to be set, a lack of methodology for parameter setting, and computational complexity. This paper describes a Grouped version of the Bees Algorithm (GBA) addressing these issues. Unlike the conventional version, in this algorithm bees are grouped to search different sites with different neighbourhood sizes, rather than just two types of sites, namely elite and selected. Following a description of the GBA, the results gained for 12 well-known benchmark functions are presented and compared with those of the basic BA, enhanced BA, standard BA and modified BA to demonstrate the efficacy of the proposed algorithm. Compared to conventional implementations of the BA, the proposed version requires fewer parameters to be set, while producing the optimum solutions much more quickly.
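One iteration of a grouped neighbourhood search can be sketched on a 1-D objective; the grouping, radii, and re-scouting bounds below are illustrative assumptions rather than the GBA's exact rules:

```python
import random

def grouped_bees_step(sites, fitness, group_sizes, radii, rng):
    """One sketch iteration of a grouped Bees Algorithm on a 1-D problem:
    sites are ranked by fitness (lower is better); the i-th best site is
    searched by group_sizes[i] bees within neighbourhood radius radii[i],
    so better sites get more bees and tighter radii. Each site itself is
    kept as a candidate, so the best fitness never gets worse."""
    ranked = sorted(sites, key=fitness)
    new_sites = []
    for site, n_bees, radius in zip(ranked, group_sizes, radii):
        candidates = [site + rng.uniform(-radius, radius)
                      for _ in range(n_bees)]
        candidates.append(site)  # retain the site itself
        new_sites.append(min(candidates, key=fitness))
    # sites not covered by any group are re-scouted at random
    # (the [-5, 5] search bounds are an arbitrary choice for this sketch)
    new_sites += [rng.uniform(-5, 5)
                  for _ in range(len(sites) - len(new_sites))]
    return new_sites
```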

Open Access Review
Wearable Food Intake Monitoring Technologies: A Comprehensive Review
Computers 2017, 6(1), 4; doi:10.3390/computers6010004
Abstract
Wearable devices that monitor food intake through passive sensing are slowly emerging to complement self-reporting of users’ caloric intake and eating behaviors. Though the ultimate goal for the passive sensing of eating is to become a reliable gold standard in dietary assessment, it currently shows promise as a means of validating self-report measures. Continuous food-intake monitoring allows for the validation and rejection of users’ reported data in order to obtain more reliable user information, resulting in more effective health intervention services. Recognizing the importance and strength of wearable sensors in food intake monitoring, a variety of approaches have been proposed and studied in recent years. While existing technologies show promise, many of the challenges and opportunities discussed in this survey still remain. This paper presents a meticulous review of the latest sensing platforms and data analytic approaches to solve the challenges of food-intake monitoring, ranging from ear-based chewing and swallowing detection systems that capture eating gestures to wearable cameras that identify food types and caloric content through image processing techniques. This paper focuses on the comparison of different technologies and approaches as they relate to user comfort, body location, and applications for medical research. We identify and summarize the forthcoming opportunities and challenges in wearable food intake monitoring technologies.

Open Access Article
Static Human Detection and Scenario Recognition via Wearable Thermal Sensing System
Computers 2017, 6(1), 3; doi:10.3390/computers6010003
Abstract
Conventional wearable sensors are mainly used to detect the physiological and activity information of individuals who wear them, but fail to perceive the information of the surrounding environment. This paper presents a wearable thermal sensing system to detect and perceive the information of surrounding human subjects. The proposed system is developed based on a pyroelectric infrared sensor. Such a sensor system aims to provide surrounding information to blind people and people with weak visual capability to help them adapt to the environment and avoid collision. In order to achieve this goal, a low-cost, low-data-throughput binary sampling and analyzing scheme is proposed. We also developed a conditioning sensing circuit with a low-noise signal amplifier and programmable system on chip (PSoC) to adjust the amplification gain. Three statistical features in information space are extracted to recognize static humans and human scenarios in indoor environments. The results demonstrate that the proposed wearable thermal sensing system and binary statistical analysis method are efficient in static human detection and human scenario perception.

Open Access Editorial
Acknowledgement to Reviewers of Computers in 2016
Computers 2017, 6(1), 2; doi:10.3390/computers6010002
Abstract
The editors of Computers would like to express their sincere gratitude to the following reviewers for assessing manuscripts in 2016. [...]
Open Access Article
BangA: An Efficient and Flexible Generalization-Based Algorithm for Privacy Preserving Data Publication
Computers 2017, 6(1), 1; doi:10.3390/computers6010001
Abstract
Privacy-Preserving Data Publishing (PPDP) has become a critical issue for companies and organizations that wish to release their data. k-Anonymization was proposed as a first generalization model to guarantee against identity disclosure of individual records in a data set. Point access methods (PAMs) are not well studied for the problem of data anonymization. In this article, we propose yet another approximation algorithm for anonymization, coined BangA, that combines useful features from PAMs and clustering. Hence, it achieves fast computation and scalability like a PAM, and very high quality thanks to its density-based clustering step. Extensive experiments show the efficiency and effectiveness of our approach. Furthermore, we provide guidelines for extending BangA to achieve a relaxed form of differential privacy, which provides stronger privacy guarantees than traditional privacy definitions.
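The k-anonymity guarantee that generalization models such as BangA target can be checked in a few lines; the quasi-identifier names and generalized values in the test are hypothetical:

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True if every combination of quasi-identifier values occurs in at
    least k records, so no individual is distinguishable within a group
    smaller than k. `records` is a list of dicts; `quasi_ids` names the
    quasi-identifier columns (e.g. generalized ZIP code, age range)."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())
```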

Open Access Article
BSEA: A Blind Sealed-Bid E-Auction Scheme for E-Commerce Applications
Computers 2016, 5(4), 32; doi:10.3390/computers5040032
Abstract
Due to an increase in the number of internet users, electronic commerce has grown significantly during the last decade. The electronic auction (e-auction) is one of the best-known e-commerce applications. Even so, the security and robustness of e-auction schemes still remain a challenge. Requirements like anonymity and privacy of the bid value are under threat from attackers. An auction protocol must not compromise the anonymity or the privacy of the bid value of an honest Bidder. Keeping these requirements in mind, we first propose a controlled traceable blind signature scheme (CTBSS), because e-auction schemes should be able to trace the Bidders. Using CTBSS, a blind sealed-bid electronic auction scheme (BSEA) is proposed. We incorporate the notion of the blind signature into e-auction schemes. Moreover, both schemes are based upon elliptic curve cryptography (ECC), which provides a similar level of security with a comparatively smaller key size than discrete logarithm problem (DLP)-based e-auction protocols. The analysis shows that BSEA fulfills all the requirements of an e-auction protocol, and the total computation overhead is lower than that of existing schemes.