Open Access Article

Mobility-Aware Service Caching in Mobile Edge Computing for Internet of Things

Beijing Key Laboratory of Intelligent Telecommunications Software and Multimedia, Beijing University of Posts and Telecommunications, Beijing 100876, China
* Author to whom correspondence should be addressed.
Sensors 2020, 20(3), 610; https://doi.org/10.3390/s20030610
Received: 30 December 2019 / Revised: 19 January 2020 / Accepted: 20 January 2020 / Published: 22 January 2020
(This article belongs to the Special Issue Fog/Edge Computing based Smart Sensing System)
The mobile edge computing architecture addresses the high-latency problem of cloud computing. However, existing research focuses on computation offloading and pays little attention to service caching. To address the service caching problem, especially in highly mobile sensor-network scenarios, we study a mobility-aware service caching mechanism. Our goal is to maximize the number of users served by the local edge cloud, which requires predicting each user’s target location so that invalid service requests can be avoided. First, we propose an idealized geometric model to predict the target area of a user’s movement. Because it is difficult to obtain all the data the model needs in practical applications, we mine local movement-track information using frequent patterns. Then, by combining the results of the trajectory data mining with the proposed geometric model, we predict the user’s target location. Based on the prediction result and the existing service cache, the service request is forwarded to the appropriate base station through a service allocation algorithm. Finally, to train and predict the most popular services online, we propose a service cache selection algorithm based on a back-propagation (BP) neural network. Simulation experiments show that our service caching algorithm reduces service response time by about 13.21% on average compared with other algorithms, and increases the local service proportion by about 15.19% on average compared with an algorithm without mobility prediction.
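As an illustration of the last step described above, the following minimal Python sketch trains a one-hidden-layer back-propagation network to score each service’s near-future popularity from its recent request counts and then caches the top-scoring services at an edge node. The window length, network width, cache capacity, and synthetic training data are all illustrative assumptions and not the paper’s actual setup.

```python
import numpy as np

# Hypothetical sketch: a one-hidden-layer BP (back-propagation) network that
# scores each service's near-future popularity from its recent request history,
# then caches the top-k services at the edge. All sizes and the synthetic data
# below are illustrative assumptions, not the paper's exact configuration.

rng = np.random.default_rng(0)

N_SERVICES = 50   # number of candidate services (assumed)
WINDOW = 8        # length of per-service request-count history (assumed)
HIDDEN = 16       # hidden-layer width (assumed)
CACHE_SIZE = 5    # edge cache capacity in services (assumed)

# Synthetic training data: request-count windows -> next-slot popularity.
X = rng.poisson(lam=5.0, size=(500, WINDOW)).astype(float)
y = X[:, -3:].mean(axis=1, keepdims=True) + rng.normal(0.0, 0.5, (500, 1))
X = (X - X.mean()) / (X.std() + 1e-8)  # normalize inputs for stable training

# Network parameters.
W1 = rng.normal(0, 0.1, (WINDOW, HIDDEN)); b1 = np.zeros(HIDDEN)
W2 = rng.normal(0, 0.1, (HIDDEN, 1));      b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)      # hidden activation
    return h, h @ W2 + b2         # predicted popularity score

# Plain back-propagation with mean-squared error.
lr = 0.01
for epoch in range(200):
    h, pred = forward(X)
    err = pred - y                            # gradient of MSE w.r.t. pred
    grad_W2 = h.T @ err / len(X)
    grad_b2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)          # tanh derivative
    grad_W1 = X.T @ dh / len(X)
    grad_b1 = dh.mean(axis=0)
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

# Online cache selection: score each service's current window, keep the top-k.
cur = rng.poisson(lam=5.0, size=(N_SERVICES, WINDOW)).astype(float)
cur = (cur - cur.mean()) / (cur.std() + 1e-8)
_, scores = forward(cur)
cached = np.argsort(scores.ravel())[::-1][:CACHE_SIZE]
print("services selected for edge cache:", cached)
```

In the online setting described in the abstract, the same network would be retrained as new request statistics arrive; the training data here are synthetic only to keep the sketch self-contained.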
Keywords: sensor networks; internet of things; mobility sensor; mobile edge computing; service cache; mobility-aware; service response time

MDPI and ACS Style

Wei, H.; Luo, H.; Sun, Y. Mobility-Aware Service Caching in Mobile Edge Computing for Internet of Things. Sensors 2020, 20, 610.
