
Table of Contents

Future Internet, Volume 10, Issue 8 (August 2018)

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
Cover Story: This invited paper is an overview of current and future issues for the Internet of Things (IoT). [...]
Displaying articles 1-14
Open Access Article: Interactive 3D Exploration of RDF Graphs through Semantic Planes
Future Internet 2018, 10(8), 81; https://doi.org/10.3390/fi10080081
Received: 19 July 2018 / Revised: 13 August 2018 / Accepted: 14 August 2018 / Published: 17 August 2018
PDF Full-text (15435 KB) | HTML Full-text | XML Full-text
Abstract
This article presents Tarsier, a tool for the interactive 3D visualization of RDF graphs. Tarsier is mainly intended to support teachers introducing students to Semantic Web data representation formalisms, and developers debugging applications based on Semantic Web knowledge bases. The tool proposes the metaphor of semantic planes as a way to visualize an RDF graph. A semantic plane contains all the RDF terms sharing a common concept; it can be created, and further split into several planes, through a set of UI controls or through SPARQL 1.1 queries, with full support for OWL and RDFS. Thanks to the 3D visualization, links between semantic planes can be highlighted, and the user can navigate within the 3D scene to find the best perspective from which to analyze the data. Data can be gathered from generic SPARQL 1.1 protocol services. We believe that Tarsier will make semantic technologies more human-friendly by: (1) helping newcomers assimilate new data representation formats; and (2) increasing inspection capabilities so that relevant situations can be detected even in complex RDF graphs. Full article
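The semantic-plane idea can be illustrated with a minimal sketch: below, a toy RDF graph (hypothetical `ex:`/`foaf:` terms, held as plain tuples rather than fetched from a SPARQL endpoint) is partitioned into planes by shared `rdf:type`, and triples crossing plane boundaries are the inter-plane links a tool like Tarsier would highlight.

```python
# Hypothetical mini-graph; Tarsier itself queries SPARQL 1.1 protocol services.
RDF_TYPE = "rdf:type"

def semantic_planes(triples):
    """Group subjects into planes keyed by their rdf:type."""
    planes = {}
    for s, p, o in triples:
        if p == RDF_TYPE:
            planes.setdefault(o, set()).add(s)
    return planes

def inter_plane_links(triples, planes):
    """Non-typing triples whose subject and object sit on different planes.
    (If a subject has several types, the last one seen wins -- fine for a sketch.)"""
    member = {s: t for t, members in planes.items() for s in members}
    return [(s, p, o) for s, p, o in triples
            if p != RDF_TYPE and member.get(s) != member.get(o)]

triples = [
    ("ex:alice", "rdf:type", "foaf:Person"),
    ("ex:bob",   "rdf:type", "foaf:Person"),
    ("ex:acme",  "rdf:type", "org:Organization"),
    ("ex:alice", "org:memberOf", "ex:acme"),
]

planes = semantic_planes(triples)
links = inter_plane_links(triples, planes)
```

Here the `foaf:Person` plane holds `ex:alice` and `ex:bob`, and the single `org:memberOf` triple is the one link drawn between planes.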

Open Access Article: A Fast and Lightweight Method with Feature Fusion and Multi-Context for Face Detection
Future Internet 2018, 10(8), 80; https://doi.org/10.3390/fi10080080
Received: 23 July 2018 / Revised: 6 August 2018 / Accepted: 14 August 2018 / Published: 17 August 2018
PDF Full-text (5005 KB) | HTML Full-text | XML Full-text
Abstract
Convolutional neural networks (CNNs) have made great progress in face detection. They mostly take computation-intensive networks as the backbone in order to obtain high precision, and they cannot achieve a good detection speed without the support of high-performance GPUs (Graphics Processing Units). This limits CNN-based face detection algorithms in real applications, especially speed-dependent ones. To alleviate this problem, we propose a lightweight face detector that takes a fast residual network as its backbone. Our method can run fast even on cheap, ordinary GPUs. To guarantee detection precision, multi-scale features and multi-context information are fully exploited in efficient ways. Specifically, feature fusion is first used to obtain semantically strong multi-scale features. Then multi-context information, including both local and global context, is added to these multi-scale features without extra computational burden. The local context is added through a depthwise separable convolution based approach, and the global context through simple global average pooling. Experimental results show that our method can run at about 110 fps on VGA (Video Graphics Array)-resolution images, while still maintaining competitive precision on the WIDER FACE and FDDB (Face Detection Data Set and Benchmark) datasets compared with its state-of-the-art counterparts. Full article
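The computational advantage of the depthwise separable convolution used for local context can be seen from a simple parameter count (an arithmetic sketch, not code from the paper; the 3x3/128-channel layer is an arbitrary example):

```python
def standard_conv_params(k, c_in, c_out):
    """Weights in a standard k x k convolution from c_in to c_out channels."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Depthwise k x k filter per input channel, then a 1 x 1 pointwise conv."""
    return k * k * c_in + c_in * c_out

std = standard_conv_params(3, 128, 128)            # 147456 weights
dws = depthwise_separable_params(3, 128, 128)      # 17536 weights
```

For this layer the separable form needs roughly 8.4 times fewer parameters, which is why it adds context "without extra computational burden".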

Open Access Article: Queue Spillover Management in a Connected Vehicle Environment
Future Internet 2018, 10(8), 79; https://doi.org/10.3390/fi10080079
Received: 6 July 2018 / Revised: 7 August 2018 / Accepted: 8 August 2018 / Published: 10 August 2018
PDF Full-text (4608 KB) | HTML Full-text | XML Full-text
Abstract
To alleviate queue spillovers at intersections of urban roads during rush hours, a solution to the spillover problem based on vehicle networking technologies is proposed. Connected vehicle technology is used to enable the exchange of information between vehicles and intersection signal control. The maximum control distance between intersections is determined by how vehicles are controlled and travel in the connected environment. A method of estimating the tendency of intersection queues to overflow is also proposed, based on the maximum phase control distance. With this method, intersection overflow is identified, and the signal phases are then re-optimized according to the requirements of the different phases. Finally, overflow prevention control was also performed in this study. The VISSIM simulation results show that the method can better prevent the overflow of queues at intersections. Full article
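A rough illustration of the overflow-tendency idea (the formula and the 0.9 threshold here are hypothetical stand-ins, not the paper's exact model): compare the queued vehicle length on a link against the maximum phase control distance and trigger phase re-optimization when the ratio approaches 1.

```python
def overflow_tendency(queue_len_m, max_control_dist_m):
    """Ratio of the queued length to the usable link length; values near or
    above 1.0 mean the queue is about to spill back into the upstream
    intersection."""
    return queue_len_m / max_control_dist_m

def needs_rephasing(queue_len_m, max_control_dist_m, threshold=0.9):
    """Signal-phase re-optimization is triggered once the tendency crosses
    the (assumed) threshold."""
    return overflow_tendency(queue_len_m, max_control_dist_m) >= threshold
```

With a 300 m link, a 270 m queue would trigger re-optimization while a 200 m queue would not.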
(This article belongs to the Section Smart System infrastructures and Cybersecurity)

Open Access Article: Smart Collection of Real-Time Vehicular Mobility Traces
Future Internet 2018, 10(8), 78; https://doi.org/10.3390/fi10080078
Received: 4 July 2018 / Revised: 26 July 2018 / Accepted: 7 August 2018 / Published: 9 August 2018
PDF Full-text (6197 KB) | HTML Full-text | XML Full-text
Abstract
Mobility trace techniques make it possible to capture the real-life movement behaviors that shape the mobility patterns of wireless networks. In our investigation, several trace mobility models were collected after the devices' deployment. The main issue with this classical procedure is that it produces incomplete records due to several unpredictable problems occurring during the deployment phase. In this paper, we propose a new procedure aimed at collecting traces while avoiding deployment-phase failures, which improves the reliability of the data. The introduced procedure makes possible the complete generation of traces with a minimum amount of damage, without the need to recover mobile devices or lose them, as is the case in previous mobility trace techniques. Based on detecting and correcting all accidental issues in real time, the proposed trace scanning offers a set of relevant information about the vehicle status, collected over seven months. Furthermore, the proposed procedure could be applied to generate vehicular traces. Likewise, it is suitable for recording and generating human and animal traces. The research outcomes demonstrate the effectiveness and robustness of the smart collection algorithm based on the proposed trace mobility model. Full article
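Correcting incomplete records can be sketched as interpolation over a trace with lost samples (a simplification of the paper's detect-and-correct procedure; the `(timestamp, position)` trace format is an assumption):

```python
def fill_gaps(trace):
    """trace: chronologically ordered (t, value) pairs, with value None where
    the sample was lost. Missing values are linearly interpolated from the
    nearest valid neighbours; gaps at the edges are left untouched."""
    out = list(trace)
    known = [i for i, (_, v) in enumerate(out) if v is not None]
    for i, (t, v) in enumerate(out):
        if v is not None:
            continue
        prev = max((j for j in known if j < i), default=None)
        nxt = min((j for j in known if j > i), default=None)
        if prev is None or nxt is None:
            continue  # cannot interpolate without a sample on both sides
        t0, v0 = out[prev]
        t1, v1 = out[nxt]
        out[i] = (t, v0 + (v1 - v0) * (t - t0) / (t1 - t0))
    return out

filled = fill_gaps([(0, 0.0), (1, None), (2, 10.0), (3, None), (4, 20.0)])
```

Both lost samples are reconstructed from their surviving neighbours, so the trace is complete without re-collecting the device.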

Open Access Article: Motives for Instagram Use and Topics of Interest among Young Adults
Future Internet 2018, 10(8), 77; https://doi.org/10.3390/fi10080077
Received: 8 May 2018 / Revised: 1 August 2018 / Accepted: 7 August 2018 / Published: 9 August 2018
PDF Full-text (453 KB) | HTML Full-text | XML Full-text
Abstract
Instagram is currently the most popular social media app among young people around the world. More than 70% of people between the ages of 12 and 24 are Instagram users. The research framework of this study was constructed based on smartphone addiction and the uses and gratifications theory. We used 27 question items divided into five factors, namely social interaction, documentation, diversion, self-promotion, and creativity, to investigate the motives for Instagram use and topics of interest among university students in Taiwan. A total of 307 valid questionnaires were obtained. The results revealed that on the whole, the motives for Instagram use were mostly to look at posts, particularly involving social interaction and diversion motives. The level of agreement expressed toward motives for creating posts was lower. Gender, professional training background, and level of addiction to Instagram all exert influence on motives for Instagram use. Over half of the students majoring in design followed artisans and celebrities (including designers), and female students noticed ads on Instagram more than male students did. Full article
(This article belongs to the Section Techno-Social Smart Systems)

Open Access Article: SCADA System Testbed for Cybersecurity Research Using Machine Learning Approach
Future Internet 2018, 10(8), 76; https://doi.org/10.3390/fi10080076
Received: 17 July 2018 / Revised: 7 August 2018 / Accepted: 8 August 2018 / Published: 9 August 2018
PDF Full-text (3574 KB) | HTML Full-text | XML Full-text
Abstract
This paper presents the development of a Supervisory Control and Data Acquisition (SCADA) system testbed used for cybersecurity research. The testbed consists of a water storage tank's control system, which is one stage in the process of water treatment and distribution. Sophisticated cyber-attacks were conducted against the testbed. During the attacks, the network traffic was captured, and features were extracted from the traffic to build a dataset for training and testing different machine learning algorithms. Five traditional machine learning algorithms were trained to detect the attacks: Random Forest, Decision Tree, Logistic Regression, Naïve Bayes and KNN. The trained machine learning models were then deployed in the network, where new tests were made using online network traffic. The performance obtained during training and testing of the machine learning models was compared with the performance obtained during their online deployment in the network. The results show the efficiency of the machine learning models in detecting attacks in real time. The testbed provides a good understanding of the effects and consequences of attacks on real SCADA environments. Full article
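Of the five algorithms, KNN is simple enough to sketch from scratch. The snippet classifies toy traffic feature vectors by majority vote of the nearest training samples; the features and labels are invented for illustration, and a real deployment would train on the captured SCADA dataset (e.g. with scikit-learn).

```python
from collections import Counter

def knn_predict(train, x, k=3):
    """train: list of (feature_vector, label) pairs.
    Classify x by majority vote among its k nearest neighbours
    (squared Euclidean distance is enough for ranking)."""
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    nearest = sorted(train, key=lambda item: dist(item[0], x))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical 2-D features, e.g. (packet rate, payload size deviation).
train = [((1, 1), "normal"), ((2, 1), "normal"), ((1, 2), "normal"),
         ((9, 9), "attack"), ((10, 9), "attack"), ((9, 10), "attack")]
```

A new observation close to the attack cluster is flagged as an attack; one close to the normal cluster is not.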
(This article belongs to the Section Smart System infrastructures and Cybersecurity)

Open Access Article: A Hierarchical Mapping System for Flat Identifier to Locator Resolution Based on Active Degree
Future Internet 2018, 10(8), 75; https://doi.org/10.3390/fi10080075
Received: 26 June 2018 / Revised: 2 August 2018 / Accepted: 7 August 2018 / Published: 8 August 2018
PDF Full-text (2924 KB) | HTML Full-text | XML Full-text
Abstract
The overloading of IP address semantics calls for a new network architecture based on Identifier (ID)/Locator separation. The challenge of ID/Locator separation is how to solve the scalability and efficiency problems of identity-to-location resolution. By analyzing the requirements of the ID/Locator separation protocol, this paper proposes a hierarchical mapping architecture based on active degree (HMAA). HMAA is divided into three levels: an active local level, a neutral transfer level, and an inert global level. Each mapping item is dynamically allocated to a level according to its activity characteristics so as to minimize delay. The top-level Chord ring is constructed using a Markov Decision Process, which keeps the physical topology and the logical topology consistent. The simulation results on delay time show that HMAA can satisfy the scalability and efficiency requirements of an ID/Locator separation network. Full article
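The Chord layer at the top of HMAA resolves a flat identifier by finding its successor on a hash ring. A minimal sketch (the 16-bit ring size is an assumption, and the paper's MDP-based ring construction is not reproduced here):

```python
import hashlib
from bisect import bisect_right

def ring_id(name, m=16):
    """Hash a node name or flat identifier onto a 2**m identifier ring."""
    return int(hashlib.sha1(name.encode()).hexdigest(), 16) % (2 ** m)

def successor(ring_ids, key_id):
    """First node clockwise from key_id on the ring (wrapping around):
    the node responsible for storing that identifier's mapping entry."""
    ids = sorted(ring_ids)
    i = bisect_right(ids, key_id - 1)  # index of the first id >= key_id
    return ids[i % len(ids)]
```

For example, on a ring with nodes at ids 5, 20 and 40, identifier 25 is resolved by node 40, and identifier 41 wraps around to node 5.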

Open Access Article: Predict and Forward: An Efficient Routing-Delivery Scheme Based on Node Profile in Opportunistic Networks
Future Internet 2018, 10(8), 74; https://doi.org/10.3390/fi10080074
Received: 26 June 2018 / Revised: 26 July 2018 / Accepted: 4 August 2018 / Published: 6 August 2018
PDF Full-text (1268 KB) | HTML Full-text | XML Full-text
Abstract
In the social scene of opportunistic networks, message applications find suitable relay nodes or transmission destinations among the surrounding neighbors through users' specific network addresses. However, at the dawn of big data and 5G networks, the changing location information of nodes is not always available to mobile devices, and a long wait for the destination may cause severe end-to-end delay. To improve the transmission environment, this study constructs an efficient routing-delivery scheme (Predict and Forward) based on node profiles for opportunistic networks. The node profile effectively characterizes nodes by analyzing and comparing their attributes, such as physical characteristics, place of residence, workplace, occupation or hobbies, instead of network addresses. According to optimal stopping theory, the algorithm implements optimal transmission for Prelearn messages by dividing the complex data transmission process into two phases (Predict and Forward). Through simulations and comparison with other routing algorithms in opportunistic networks, the proposed strategy increases the delivery ratio by 80% on average compared with traditional methods, and its average end-to-end delay is the lowest. Full article
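The profile-based Forward phase can be caricatured as follows: a similarity score between attribute sets (Jaccard here, chosen purely for illustration; the paper's actual metric and decision rule may differ) decides whether a neighbor is a better carrier for the message than the current node.

```python
def profile_similarity(a, b):
    """Jaccard similarity between two attribute sets
    (workplace, hobbies, place of residence, ...)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def choose_relay(carrier_profile, neighbors, dest_profile):
    """Hand the message to the neighbor whose profile best matches the
    destination, but only if it beats the current carrier; otherwise keep
    carrying (a greatly simplified Forward step)."""
    best_name, best_profile = max(
        neighbors.items(),
        key=lambda kv: profile_similarity(kv[1], dest_profile))
    if profile_similarity(best_profile, dest_profile) > \
            profile_similarity(carrier_profile, dest_profile):
        return best_name
    return None  # no neighbor improves on the carrier

# Invented profiles for illustration.
dest = {"hospital", "nurse", "downtown"}
carrier = {"student", "campus"}
neighbors = {"n1": {"driver", "suburb"},
             "n2": {"nurse", "downtown", "cycling"}}
relay = choose_relay(carrier, neighbors, dest)
```

Here `n2` shares two attributes with the destination while the carrier shares none, so the message is forwarded to `n2`.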

Open Access Article: Joint AP Association and Bandwidth Allocation Optimization Algorithm in High-Dense WLANs
Future Internet 2018, 10(8), 73; https://doi.org/10.3390/fi10080073
Received: 12 June 2018 / Revised: 1 August 2018 / Accepted: 3 August 2018 / Published: 6 August 2018
PDF Full-text (1836 KB) | HTML Full-text | XML Full-text
Abstract
To address access point (AP) overload and the performance anomaly caused by mobile terminals with different bitrates, a joint AP association and bandwidth allocation optimization algorithm is presented in this paper. Load balancing and proportional fairness are analyzed and formulated as an optimization model. We then present a Fair Bandwidth Allocation algorithm based on clients' Business Priority (FBA-BP), which allocates bandwidth according to clients' bandwidth demands and business priorities. Furthermore, we propose a Categorized AP Association algorithm based on clients' demands (CAA-BD), which classifies APs by the types of their clients and chooses the optimal AP for a new client according to the AP categories and the aggregated demand transmission time calculated by the FBA-BP algorithm. CAA-BD achieves load balance and solves the performance anomaly caused by coexisting multi-rate clients. The simulation results show that the proposed algorithm performs significantly better in terms of AP utilization, throughput, transmission delay and channel fairness at different client density levels than the categorized and Strong Signal First (SSF) algorithms. Full article
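Demand- and priority-aware bandwidth sharing in the spirit of FBA-BP can be sketched as iterative weighted allocation with demand caps: each round splits the remaining capacity in proportion to priority weight, and clients whose demand is satisfied drop out so their leftover share flows to the rest. The exact FBA-BP rules are not reproduced; the weights and demands below are hypothetical.

```python
def fair_allocate(capacity, clients):
    """clients: {name: (demand, priority_weight)} in the same bandwidth units
    as capacity. Returns {name: allocated_bandwidth}."""
    alloc = {n: 0.0 for n in clients}
    active = set(clients)
    while active and capacity > 1e-9:
        total_w = sum(clients[n][1] for n in active)
        share = {n: capacity * clients[n][1] / total_w for n in active}
        satisfied = set()
        for n in active:
            give = min(share[n], clients[n][0] - alloc[n])  # never exceed demand
            alloc[n] += give
            capacity -= give
            if alloc[n] >= clients[n][0] - 1e-9:
                satisfied.add(n)
        if not satisfied:
            break  # everyone took their full share; capacity is exhausted
        active -= satisfied
    return alloc

# 10 Mbps AP, equal priorities: "a" only needs 2, so "b" picks up the slack.
alloc = fair_allocate(10.0, {"a": (2.0, 1.0), "b": (10.0, 1.0)})
```

With unequal weights the same loop skews the split toward higher-priority business traffic before demands cap it.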
(This article belongs to the Section Internet of Things)

Open Access Article: Context Analysis of Cloud Computing Systems Using a Pattern-Based Approach
Future Internet 2018, 10(8), 72; https://doi.org/10.3390/fi10080072
Received: 14 June 2018 / Revised: 21 July 2018 / Accepted: 29 July 2018 / Published: 31 July 2018
PDF Full-text (2040 KB) | HTML Full-text | XML Full-text
Abstract
Cloud computing services bring new capabilities for hosting and offering complex collaborative business operations. However, these advances might bring undesirable side-effects, e.g., introducing new vulnerabilities and threats caused by collaboration and data exchange over the Internet. Hence, users have become more concerned about security and privacy aspects. For secure provisioning of a cloud computing service, security and privacy issues must be addressed by using a risk assessment method. To perform a risk assessment, it is necessary to obtain all relevant information about the context of the considered cloud computing service. The context analysis of a cloud computing service and its underlying system is a difficult task because of the variety of different types of information that have to be considered. This context information includes (i) legal, regulatory and/or contractual requirements that are relevant for a cloud computing service (indirect stakeholders); (ii) relations to other involved cloud computing services; (iii) high-level cloud system components that support the involved cloud computing services; (iv) data that is processed by the cloud computing services; and (v) stakeholders that interact directly with the cloud computing services and/or the underlying cloud system components. We present a pattern for the contextual analysis of cloud computing services and demonstrate the instantiation of our proposed pattern with real-life application examples. Our pattern contains elements that represent the above-mentioned types of contextual information. The elements of our pattern conform to the General Data Protection Regulation. Besides the context analysis, our pattern supports the identification of high-level assets. Additionally, our proposed pattern supports the documentation of the scope and boundaries of a cloud computing service conforming to the requirements of the ISO 27005 standard (information security risk management). 
The results of our context analysis contribute to the transparency of the achieved security and privacy level of a cloud computing service. This transparency can increase users' trust in a cloud computing service. We present results of the RestAssured project related to the context analysis of cloud computing services and their underlying cloud computing systems. The context analysis is the prerequisite for the threat and control identification performed later in the risk management process. The focus of this paper is the use of a pattern for systematic context analysis and scope definition at design time for risk management methods. Full article
(This article belongs to the Special Issue Security Patterns in Industry)

Open Access Article: Hybrid Approach with Improved Genetic Algorithm and Simulated Annealing for Thesis Sampling
Future Internet 2018, 10(8), 71; https://doi.org/10.3390/fi10080071
Received: 11 July 2018 / Revised: 27 July 2018 / Accepted: 27 July 2018 / Published: 30 July 2018
PDF Full-text (2032 KB) | HTML Full-text | XML Full-text
Abstract
Sampling inspection uses sample characteristics to estimate those of the population; it is an important, low-cost, widely applicable and scientifically rigorous method for describing a population. This paper addresses the sampling inspection of master's degree theses to ensure their quality, which is commonly performed by random sampling. Since random sampling has disadvantages, a hybrid algorithm combining an improved genetic algorithm with a simulated annealing algorithm is proposed in this paper. Furthermore, a novel mutation strategy tailored to the specifics of Shanghai's thesis sampling is introduced to improve the efficiency of sampling inspection, which also accelerates the convergence of the algorithm. The new algorithm retains the features of the traditional genetic algorithm, can obtain the global optimum in the optimization process, and provides the fairest sampling plan under the constraint of multiple sampling indexes. The experimental results on the master's thesis dataset of Shanghai show that the proposed algorithm meets the requirements of sampling inspection in Shanghai with lower time complexity. Full article
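A generic simulated-annealing skeleton with a pluggable mutation operator shows the shape of such a hybrid: the GA side contributes the solution encoding and mutation, while the annealing schedule occasionally accepts worse candidates to escape local optima. The cost function below is a toy subset-selection stand-in, not Shanghai's actual sampling indexes.

```python
import math
import random

def simulated_annealing(cost, mutate, init, t0=1.0, cooling=0.95,
                        steps=200, seed=42):
    """Minimize cost starting from init; mutate(state, rng) proposes a
    neighbour. Worse candidates are accepted with probability
    exp(-delta / temperature), which decays as the temperature cools."""
    rng = random.Random(seed)
    cur, cur_c = init, cost(init)
    best, best_c = cur, cur_c
    t = t0
    for _ in range(steps):
        cand = mutate(cur, rng)
        c = cost(cand)
        if c < cur_c or rng.random() < math.exp((cur_c - c) / t):
            cur, cur_c = cand, c
        if cur_c < best_c:
            best, best_c = cur, cur_c
        t *= cooling
    return best, best_c

# Toy objective: pick a subset of theses whose "index weights" sum to a target.
weights = [3, 1, 4, 1, 5, 9, 2, 6]
target = 11
def cost(bits):
    return abs(sum(w for w, b in zip(weights, bits) if b) - target)
def mutate(bits, rng):
    i = rng.randrange(len(bits))           # flip one inclusion bit
    return bits[:i] + (1 - bits[i],) + bits[i + 1:]

best, best_c = simulated_annealing(cost, mutate, (0,) * len(weights))
```

The paper's hybrid would replace `mutate` with its specialized thesis-sampling mutation strategy and `cost` with the multi-index fairness criteria.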

Open Access Article: Multidiscipline Integrated Platform Based on Probabilistic Analysis for Manufacturing Engineering Processes
Future Internet 2018, 10(8), 70; https://doi.org/10.3390/fi10080070
Received: 2 July 2018 / Revised: 22 July 2018 / Accepted: 25 July 2018 / Published: 30 July 2018
PDF Full-text (2091 KB) | HTML Full-text | XML Full-text
Abstract
Researchers from different disciplines, such as materials science, computer science, safety science, mechanical engineering and control engineering, have aimed to improve the quality of manufacturing engineering processes. Considering the requirements of research and development of advanced materials, reliable manufacturing and collaborative innovation, a multidiscipline integrated platform framework based on probabilistic analysis for manufacturing engineering processes is proposed. The proposed platform consists of three logical layers: the requirement layer, the database layer and the application layer. The platform is intended to be a scalable system that is gradually supplemented with related data, models and approaches. The main key technologies of the platform, namely encapsulation methods, information fusion approaches and the collaborative mechanism, are also discussed. The proposed platform will be gradually improved in the future. In order to exchange information on manufacturing engineering processes, scientists and engineers from different institutes of materials science and manufacturing engineering should strengthen their cooperation. Full article
(This article belongs to the Section Big Data and Augmented Intelligence)

Open Access Article: A Watermark-Based In-Situ Access Control Model for Image Big Data
Future Internet 2018, 10(8), 69; https://doi.org/10.3390/fi10080069
Received: 28 June 2018 / Revised: 24 July 2018 / Accepted: 26 July 2018 / Published: 29 July 2018
PDF Full-text (1103 KB) | HTML Full-text | XML Full-text
Abstract
When large images are used for big data analysis, they pose new challenges in protecting image privacy. For example, a geographic image may consist of several sensitive areas or layers. When it is uploaded to servers, the image will be accessed by diverse subjects. Traditional access control methods regulate access privileges to a single image, and their access control strategies are stored in servers, which has two shortcomings: (1) fine-grained access control is not guaranteed for areas/layers in a single image that need to remain secret for different roles; and (2) access control policies stored in servers suffer from multiple attacks (e.g., transferring attacks). In this paper, we propose a novel watermark-based access control model in which access control policies are associated with the objects being accessed (called an in-situ model). The proposed model integrates access control policies as watermarks within images, without relying on the availability of servers or connecting networks. Access control over an image is maintained even if the image is redistributed to further subjects. Therefore, access control policies can be delivered together with the big data of images. Moreover, we propose a hierarchical key-role-area model for fine-grained encryption, especially for large images such as geographic maps. An extensive analysis justifies the security and performance of the proposed model. Full article
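The in-situ idea of carrying the policy inside the image itself can be sketched with least-significant-bit embedding. This is a deliberately naive stand-in for the paper's watermarking and hierarchical key-role-area encryption, and the policy string format is invented, but it shows how a policy survives redistribution because it travels with the pixels.

```python
def policy_to_bits(policy):
    """Serialize a policy string (e.g. 'role:doctor;area:2') to a bit list."""
    return [int(c) for byte in policy.encode() for c in format(byte, "08b")]

def bits_to_policy(bits):
    return bytes(int("".join(map(str, bits[i:i + 8])), 2)
                 for i in range(0, len(bits), 8)).decode()

def embed(pixels, bits):
    """Carry one policy bit in the least-significant bit of each pixel value,
    changing each touched pixel by at most 1 intensity level."""
    assert len(bits) <= len(pixels), "image too small for the policy watermark"
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b
    return out

def extract(pixels, n_bits):
    return [p & 1 for p in pixels[:n_bits]]

policy = "role:doctor;area:2"
pixels = list(range(200))          # stand-in for 8-bit grayscale pixel values
bits = policy_to_bits(policy)
stego = embed(pixels, bits)
```

Extracting the LSBs of the first `len(bits)` pixels recovers the policy exactly, with no server lookup involved.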
(This article belongs to the Section Big Data and Augmented Intelligence)

Open Access Review: Internet of Nano-Things, Things and Everything: Future Growth Trends
Future Internet 2018, 10(8), 68; https://doi.org/10.3390/fi10080068
Received: 22 June 2018 / Revised: 14 July 2018 / Accepted: 25 July 2018 / Published: 28 July 2018
PDF Full-text (7194 KB) | HTML Full-text | XML Full-text
Abstract
The current statuses and future promises of the Internet of Things (IoT), Internet of Everything (IoE) and Internet of Nano-Things (IoNT) are extensively reviewed and a summarized survey is presented. The analysis clearly distinguishes between IoT and IoE, which are wrongly considered to be the same by many commentators. After evaluating the current trends of advancement in the fields of IoT, IoE and IoNT, this paper identifies the 21 most significant current and future challenges as well as scenarios for the possible future expansion of their applications. Despite possible negative aspects of these developments, there are grounds for general optimism about the coming technologies. Certainly, many tedious tasks can be taken over by IoT devices. However, the dangers of criminal and other nefarious activities, plus those of hardware and software errors, pose major challenges that are a priority for further research. Major specific priority issues for research are identified. Full article
(This article belongs to the Section Internet of Things)
