Open Access Article
Analysis of Entropy Generation in Mixed Convective Peristaltic Flow of Nanofluid
Entropy 2016, 18(10), 355; doi:10.3390/e18100355
Abstract
This article examines entropy generation in the peristaltic transport of a nanofluid in a channel with flexible walls. Single-walled carbon nanotubes (SWCNT) and multi-walled carbon nanotubes (MWCNT) with water as the base fluid are utilized in this study. Mixed convection is also considered in the present analysis, and the viscous dissipation effect is included. Moreover, slip conditions are imposed on both velocity and temperature at the boundaries. The analysis is carried out under the long-wavelength and small-Reynolds-number assumptions. A two-phase model for nanofluids is employed, and the nonlinear system of equations is solved for small Grashof number. Velocity and temperature are examined for different parameters via graphs, and streamlines are constructed to analyze trapping. Results show that the axial velocity and temperature of the nanofluid decrease when the nanoparticle volume fraction is enhanced. Moreover, the wall elastance parameter increases the axial velocity and temperature, whereas a decrease in both quantities is noticed for the damping coefficient. A decrease in entropy generation and the Bejan number is also noted for increasing values of the nanoparticle volume fraction. Full article

Open Access Review
Generalized Thermodynamic Optimization for Iron and Steel Production Processes: Theoretical Exploration and Application Cases
Entropy 2016, 18(10), 353; doi:10.3390/e18100353
Abstract
Combining branches of modern thermodynamics, including finite-time thermodynamics (entropy generation minimization), constructal theory and entransy theory, with metallurgical process engineering, this paper provides a new exploration of generalized thermodynamic optimization theory for iron and steel production processes. The theoretical core is to thermodynamically optimize the performance of elemental packages, working-procedure modules, functional subsystems, and the whole process of iron and steel production under real finite-resource and/or finite-size constraints with various irreversibilities, with the aims of saving energy, decreasing consumption, reducing emissions and increasing yield, and to achieve comprehensive coordination among the material flow, energy flow and environment of the hierarchical process systems. A series of application cases of the theory is reviewed. The theory provides a new thermodynamic angle of view on iron and steel production processes, and can also provide guidelines for other process industries. Full article

Open Access Article
Exergetic Analysis of a Novel Solar Cooling System for Combined Cycle Power Plants
Entropy 2016, 18(10), 356; doi:10.3390/e18100356
Abstract
This paper presents a detailed exergetic analysis of a novel high-temperature Solar Assisted Combined Cycle (SACC) power plant. The system includes a solar field consisting of innovative high-temperature flat-plate evacuated solar thermal collectors, a double-stage LiBr-H2O absorption chiller, pumps, heat exchangers, storage tanks, mixers, diverters, controllers and a simple single-pressure Combined Cycle (CC) power plant. Here, a high-temperature solar cooling system is coupled with a conventional combined cycle in order to pre-cool the gas-turbine inlet air, enhancing system efficiency and electrical capacity. The system is analyzed from an exergetic point of view on the basis of an energy-economic model presented in a recent work, whose main results show that the SACC exhibits higher electrical production and efficiency than the conventional CC. The system performance is evaluated by a dynamic simulation in which detailed models are implemented for all the components included in the system. In addition, for all the components and for the system as a whole, energy and exergy balances are implemented in order to calculate the magnitude of the irreversibilities within the system. Exergy analysis is used to assess exergy destructions and exergetic efficiencies, which quantify the magnitude of the irreversibilities in the system and identify their sources. Exergetic efficiencies and exergy destructions are dynamically calculated for one year of system operation, and the exergetic results are also integrated on weekly and yearly bases in order to evaluate the corresponding irreversibilities. The results showed that the components of the Joule cycle (combustor, turbine and compressor) are the major sources of irreversibilities. The overall exergetic efficiency of the system was around 48%. Average weekly solar collector exergetic efficiency ranged from 6.5% to 14.5%, increasing significantly during the summer season. Conversely, the absorption chiller exergy efficiency varies from 7.7% to 20.2%, being higher during the winter season. Combustor exergy efficiency is stably close to 68%, whereas the exergy efficiencies of the remaining components are higher than 80%. Full article

Open Access Article
A Langevin Canonical Approach to the Study of Quantum Stochastic Resonance in Chiral Molecules
Entropy 2016, 18(10), 354; doi:10.3390/e18100354
Abstract
A Langevin canonical framework for a chiral two-level system coupled to a bath of harmonic oscillators is used, within a coupling scheme different from the well-known spin-boson model, to study quantum stochastic resonance for chiral molecules. This process refers to the amplification of the response to an external periodic signal at a certain value of the noise strength, a cooperative effect of friction, noise, and periodic driving occurring in a bistable system. Furthermore, in this stochastic dynamics within the Markovian regime and with Ohmic friction, the competition between tunneling and the parity-violating energy difference present in this type of chiral system plays a fundamental role. This mechanism is finally proposed as a way to observe the so-far elusive parity-violating energy difference in chiral molecules. Full article

Open Access Article
Propositions for Confidence Interval in Systematic Sampling on Real Line
Entropy 2016, 18(10), 352; doi:10.3390/e18100352
Abstract
Systematic sampling is used as a method to obtain quantitative results from tissues and radiological images. Systematic sampling on a real line (R) is a very attractive method that practitioners in biomedical imaging frequently consult. For systematic sampling on R, the measurement function (MF) is obtained by slicing the three-dimensional object systematically at equidistant positions. The covariogram model currently used in variance approximation is tested for different measurement functions in a class to see how it performs in estimating the variance of systematic samples on R. An exact calculation method is proposed for the constant λ(q,N) of the confidence interval in systematic sampling, and the exact value of this constant is examined for the different measurement functions as well. As a result, it is observed from the simulation that the proposed MF should be used to check the performance of the variance approximation and the constant λ(q,N). Synthetic data can support the results of real data. Full article

Open Access Article
Influence of the Aqueous Environment on Protein Structure—A Plausible Hypothesis Concerning the Mechanism of Amyloidogenesis
Entropy 2016, 18(10), 351; doi:10.3390/e18100351
Abstract
The aqueous environment is a pervasive factor which, in many ways, determines the protein folding process and consequently the activity of proteins. Proteins are unable to perform their function unless immersed in water (membrane proteins are excluded from this statement). Tertiary conformational stabilization is dependent on the presence of internal force fields (nonbonding interactions between atoms), as well as an external force field generated by water. The hitherto unknown structuralization of water as the aqueous environment may be elucidated by analyzing its effects on protein structure and function. Our study is based on the fuzzy oil drop model, a mechanism which describes the formation of a hydrophobic core and attempts to explain the emergence of amyloid-like fibrils. A set of proteins which vary with respect to their fuzzy oil drop status (including titin, transthyretin and a prion protein) has been selected for in-depth analysis to suggest a plausible mechanism of amyloidogenesis. Full article

Open Access Article
Fuzzy Shannon Entropy: A Hybrid GIS-Based Landslide Susceptibility Mapping Method
Entropy 2016, 18(10), 343; doi:10.3390/e18100343
Abstract
Assessing Landslide Susceptibility Mapping (LSM) contributes to reducing the risk of living with landslides. Handling the vagueness associated with LSM is a challenging task. Here we show the application of hybrid GIS-based LSM. The hybrid approach embraces fuzzy membership functions (FMFs) in combination with Shannon entropy, a well-known information-theory-based method. Nine landslide-related criteria, along with an inventory of landslides containing 108 recent and historic landslide points, are used to prepare a susceptibility map. A random split into training (≈70%) and testing (≈30%) samples is used for training and validation of the LSM model. The study area, Izeh, is located in the Khuzestan province of Iran, a highly susceptible landslide zone. The performance of the hybrid method is evaluated using receiver operating characteristic (ROC) curves in combination with the area under the curve (AUC). With an AUC of 0.934, the proposed hybrid method is superior to a previous study of the same study area, which applied an extended fuzzy multi-criteria evaluation built on decision makers' subjective judgements to the same dataset and obtained an AUC of 0.894. Full article
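The Shannon-entropy weighting step that typically underlies such hybrid schemes can be sketched in a few lines. The criteria matrix and the classical entropy-weight formula below are illustrative assumptions, not the paper's exact formulation:

```python
import math

def entropy_weights(matrix):
    """Shannon entropy weighting of criteria columns (rows = mapped
    units, columns = landslide-related criteria). A criterion whose
    values are spread unevenly across units carries more information
    and therefore receives a larger weight."""
    m = len(matrix)                       # number of mapped units
    n = len(matrix[0])                    # number of criteria
    diversification = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]      # normalize the column
        e = -sum(q * math.log(q) for q in p if q > 0) / math.log(m)
        diversification.append(1.0 - e)   # 0 for a uniform column
    s = sum(diversification)
    return [d / s for d in diversification]
```

A perfectly uniform criterion gets weight zero, since it cannot discriminate between susceptible and non-susceptible units.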

Open Access Article
Recognition of Abnormal Uptake through 123I-mIBG Scintigraphy Entropy for Paediatric Neuroblastoma Identification
Entropy 2016, 18(10), 349; doi:10.3390/e18100349
Abstract
Whole-body 123I-Metaiodobenzylguanidine (mIBG) scintigraphy is used as the primary imaging modality to visualize neuroblastoma tumours and metastases because it is the most sensitive and specific radioactive tracer for staging the disease and evaluating the response to treatment. However, especially in paediatric neuroblastoma, information from mIBG scans is difficult to extract because of acquisition difficulties that produce low-definition images with poor contours, resolution and contrast. These problems limit physician assessment, and current oncological guidelines are based on qualitative, observer-dependent analysis. This makes it difficult to compare results taken at different moments of therapy or in different institutions. In this paper, we present a computerized method that processes an image and calculates a quantitative measurement, regarded as its entropy, suitable for the identification of abnormal uptake regions for which there is enough suspicion that they may be a tumour or metastatic site. This measurement can also be compared with future scintigraphies of the same patient. In total, 46 scintigraphies of 22 anonymous patients were tested; the procedure identified 96.7% of regions of abnormal uptake and showed a low overall false negative rate of 3.3%. This method provides assistance to physicians in diagnosing tumours and also allows the monitoring of patients' evolution. Full article

Open Access Article
Entropy for the Quantized Field in the Atom-Field Interaction: Initial Thermal Distribution
Entropy 2016, 18(10), 346; doi:10.3390/e18100346
Abstract
We study the entropy of a quantized field in interaction with a two-level atom (in a pure state) when the field is initially in a mixture of two number states. We then generalise the result for a thermal state; i.e., an (infinite) statistical mixture of number states. We show that for some specific interaction times, the atom passes its purity to the field and therefore the field entropy decreases from its initial value. Full article
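For a field state that is diagonal in the number basis, such as the mixtures considered here, the von Neumann entropy reduces to the Shannon entropy of the mixture weights; a minimal sketch (the weight vectors below are illustrative):

```python
import math

def von_neumann_entropy(weights):
    """Entropy S = -Tr(rho ln rho) of a field state diagonal in the
    number basis, rho = sum_n p_n |n><n|. For such a statistical
    mixture, S equals the Shannon entropy of the weights p_n."""
    return -sum(p * math.log(p) for p in weights if p > 0.0)
```

An equal mixture of two number states starts with entropy ln 2, while a pure state has entropy zero; the paper's effect is the decrease of this quantity below its initial value at specific interaction times.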

Open Access Article
Boltzmann Complexity: An Emergent Property of the Majorization Partial Order
Entropy 2016, 18(10), 347; doi:10.3390/e18100347
Abstract
Boltzmann macrostates, which are in 1:1 correspondence with the partitions of integers, are investigated. Integer partitions, unlike entropy, uniquely characterize Boltzmann states, but their use has been limited. Integer partitions are well known to be partially ordered by majorization. It is less well known that this partial order is fundamentally equivalent to the “mixedness” of the set of microstates that comprise each macrostate. Thus, integer partitions represent the fundamental property of the mixing character of Boltzmann states. The standard definition of incomparability in partial orders is applied to each individual Boltzmann macrostate: for each partition (or macrostate), we calculate the number C of other macrostates with which it is incomparable. We show that the value of C complements the value of the Boltzmann entropy, S, obtained in the usual way. Results for C and S are obtained for Boltzmann states composed of up to N=50 microstates, where there are 204,226 Boltzmann macrostates. We note that, unlike mixedness, neither C nor S uniquely characterizes macrostates. Plots of C vs. S are shown. The results are surprising and support the authors’ earlier suggestion that C be regarded as the complexity of the Boltzmann states. From this we propose that complexity may generally arise from incomparability in other systems as well. Full article
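The bookkeeping behind C can be sketched directly: enumerate the partitions of N, test majorization by comparing partial sums, and count incomparable pairs. The small-N enumeration below is an illustrative sketch of the construction, not the authors' code:

```python
def partitions(n, max_part=None):
    """Generate integer partitions of n as non-increasing tuples."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for k in range(min(n, max_part), 0, -1):
        for rest in partitions(n - k, k):
            yield (k,) + rest

def majorizes(p, q):
    """True if partition p majorizes q: every prefix sum of p is at
    least the corresponding prefix sum of q (both non-increasing)."""
    sp = sq = 0
    for i in range(max(len(p), len(q))):
        sp += p[i] if i < len(p) else 0
        sq += q[i] if i < len(q) else 0
        if sp < sq:
            return False
    return True

def incomparability(parts):
    """C for each partition: how many other partitions it is
    incomparable with under majorization."""
    return {p: sum(1 for q in parts if q != p
                   and not majorizes(p, q) and not majorizes(q, p))
            for p in parts}
```

For N = 6 there are 11 partitions, and the only incomparable pairs are {(4,1,1), (3,3)} and {(3,1,1,1), (2,2,2)}; the extremes (6) and (1,1,1,1,1,1) are comparable with everything, so C = 0 for them.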

Open Access Article
The Analytical Solution of Parabolic Volterra Integro-Differential Equations in the Infinite Domain
Entropy 2016, 18(10), 344; doi:10.3390/e18100344
Abstract
This article focuses on obtaining analytical solutions for d-dimensional, parabolic Volterra integro-differential equations with different types of frictional memory kernel. Based on Laplace transform and Fourier transform theories, the properties of the Fox-H function and convolution theorem, analytical solutions for the equations in the infinite domain are derived under three frictional memory kernel functions. The analytical solutions are expressed by infinite series, the generalized multi-parameter Mittag-Leffler function, the Fox-H function and the convolution form of the Fourier transform. In addition, graphical representations of the analytical solution under different parameters are given for one-dimensional parabolic Volterra integro-differential equations with a power-law memory kernel. It can be seen that the solution curves are subject to Gaussian decay at any given moment. Full article

Open Access Article
A Novel Operational Matrix of Caputo Fractional Derivatives of Fibonacci Polynomials: Spectral Solutions of Fractional Differential Equations
Entropy 2016, 18(10), 345; doi:10.3390/e18100345
Abstract
Herein, two numerical algorithms for solving some linear and nonlinear fractional-order differential equations are presented and analyzed. For this purpose, a novel operational matrix of fractional-order derivatives of Fibonacci polynomials is constructed and employed along with the tau and collocation spectral methods. The convergence and error of the suggested Fibonacci expansion are carefully investigated. Some numerical examples with comparisons are presented to demonstrate the efficiency, applicability and high accuracy of the proposed algorithms, which yield accurate semi-analytic polynomial solutions for both linear and nonlinear fractional differential equations. Full article
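The Fibonacci polynomials underlying such an operational matrix satisfy the simple recurrence F1(x) = 1, F2(x) = x, Fn(x) = x·Fn-1(x) + Fn-2(x). The coefficient-list sketch below illustrates only that recurrence; the operational matrix of Caputo derivatives itself is specific to the paper:

```python
def fibonacci_poly(n):
    """Coefficients (ascending powers of x) of the n-th Fibonacci
    polynomial: F1 = 1, F2 = x, Fn = x*F(n-1) + F(n-2)."""
    if n == 1:
        return [1]
    if n == 2:
        return [0, 1]
    a, b = [1], [0, 1]           # F1, F2 as coefficient lists
    for _ in range(n - 2):
        nxt = [0] + b            # multiply F(n-1) by x
        for i, c in enumerate(a):
            nxt[i] += c          # add F(n-2)
        a, b = b, nxt
    return b
```

As a sanity check, evaluating Fn at x = 1 recovers the ordinary Fibonacci numbers.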

Open Access Article
Application of Information Theory for an Entropic Gradient of Ecological Sites
Entropy 2016, 18(10), 340; doi:10.3390/e18100340
Abstract
The present study was carried out to compute straightforward formulations of information entropy for ecological sites and to arrange their locations along ordination axes using the values of those entropic measures. The data of plant communities taken from six sites in the Dedegül Mountain sub-district and the Sultan Mountain sub-district, located in the Beyşehir Watershed, were examined. Firstly, the entropic measures (i.e., marginal entropy, joint entropy, conditional entropy and mutual entropy) were computed for each of the sites. Next, principal component analysis (PCA) was applied to the data composed of the values of those entropic measures. The arrangement of the sites along the first component axis of the PCA was found to be meaningful from an ecological point of view, because it illustrates the climatic differences between the sub-districts. Full article
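The four entropic measures named above can all be derived from one two-way table; a minimal sketch, in which the toy abundance table and natural-log units are illustrative assumptions:

```python
import math

def entropies(table):
    """Marginal, joint, conditional and mutual entropy (nats) from a
    two-way table of counts, e.g. species x site abundances."""
    total = sum(sum(row) for row in table)
    pj = [[c / total for c in row] for row in table]
    px = [sum(row) for row in pj]                 # row marginals
    py = [sum(col) for col in zip(*pj)]           # column marginals
    h = lambda ps: -sum(p * math.log(p) for p in ps if p > 0)
    H_x, H_y = h(px), h(py)
    H_xy = h([p for row in pj for p in row])      # joint entropy
    H_y_given_x = H_xy - H_x                      # chain rule
    I_xy = H_x + H_y - H_xy                       # mutual entropy
    return H_x, H_y, H_xy, H_y_given_x, I_xy
```

An independent table gives zero mutual entropy; a diagonal table gives maximal mutual entropy and zero conditional entropy.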

Open Access Article
The Differential Entropy of the Joint Distribution of Eigenvalues of Random Density Matrices
Entropy 2016, 18(9), 342; doi:10.3390/e18090342
Abstract
We derive exactly the differential entropy of the joint distribution of eigenvalues of Wishart matrices. Based on this result, we calculate the differential entropy of the joint distribution of eigenvalues of random mixed quantum states, which is induced by taking the partial trace over the environment of Haar-distributed bipartite pure states. Then, we investigate the differential entropy of the joint distribution of diagonal entries of random mixed quantum states. Finally, we investigate the relative entropy between these two kinds of distributions. Full article
Open Access Article
Effects of Fatty Infiltration of the Liver on the Shannon Entropy of Ultrasound Backscattered Signals
Entropy 2016, 18(9), 341; doi:10.3390/e18090341
Abstract
This study explored the effects of fatty infiltration on the signal uncertainty of ultrasound backscattered echoes from the liver. Standard ultrasound examinations were performed on 107 volunteers. For each participant, raw ultrasound image data of the right lobe of liver were acquired using a clinical scanner equipped with a 3.5-MHz convex transducer. An algorithmic scheme was proposed for ultrasound B-mode and entropy imaging. Fatty liver stage was evaluated using a sonographic scoring system. Entropy values constructed using the ultrasound radiofrequency (RF) and uncompressed envelope signals (denoted by HR and HE, respectively) as a function of fatty liver stage were analyzed using the Pearson correlation coefficient. Data were expressed as the median and interquartile range (IQR). Receiver operating characteristic (ROC) curve analysis with 95% confidence intervals (CIs) was performed to obtain the area under the ROC curve (AUC). The brightness of the entropy image typically increased as the fatty stage varied from mild to severe. The median value of HR monotonically increased from 4.69 (IQR: 4.60–4.79) to 4.90 (IQR: 4.87–4.92) as the severity of fatty liver increased (r = 0.63, p < 0.0001). Concurrently, the median value of HE increased from 4.80 (IQR: 4.69–4.89) to 5.05 (IQR: 5.02–5.07) (r = 0.69, p < 0.0001). In particular, the AUCs obtained using HE (95% CI) were 0.93 (0.87–0.99), 0.88 (0.82–0.94), and 0.76 (0.65–0.87) for fatty stages ≥mild, ≥moderate, and ≥severe, respectively. The sensitivity, specificity, and accuracy were 93.33%, 83.11%, and 86.00%, respectively (≥mild). Fatty infiltration increases the uncertainty of backscattered signals from livers. Ultrasound entropy imaging has potential for the routine examination of fatty liver disease. Full article
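The core of entropy imaging, computing a Shannon entropy over a patch of backscattered amplitudes, can be sketched as follows. The histogram binning and single-patch scope below are illustrative simplifications; in the study the entropy is evaluated over a sliding window to build the parametric image:

```python
import math

def shannon_entropy(signal, bins=32):
    """Histogram-based Shannon entropy (bits) of a patch of
    backscattered amplitudes. A more disordered amplitude
    distribution yields a higher value, which is the quantity
    the study correlates with fatty-liver stage."""
    lo, hi = min(signal), max(signal)
    if hi == lo:
        return 0.0                        # constant patch: no uncertainty
    counts = [0] * bins
    for v in signal:
        i = min(int((v - lo) / (hi - lo) * bins), bins - 1)
        counts[i] += 1
    n = len(signal)
    return -sum(c / n * math.log2(c / n) for c in counts if c)
```

A constant patch scores zero, and a balanced two-level patch scores exactly one bit, which makes the scale easy to interpret.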

Open Access Editorial
Quantum Computation and Information: Multi-Particle Aspects
Entropy 2016, 18(9), 339; doi:10.3390/e18090339
Abstract
This editorial explains the scope of the special issue and provides a thematic introduction to the contributed papers. Full article
Open Access Article
The Constant Information Radar
Entropy 2016, 18(9), 338; doi:10.3390/e18090338
Abstract
The constant information radar, or CIR, is a tracking radar that modulates target revisit time by maintaining a fixed mutual information measure. For highly dynamic targets that deviate significantly from the path predicted by the tracking motion model, the CIR adjusts by illuminating the target more frequently than it would for well-modeled targets. If SNR is low, the radar delays revisit to the target until the state entropy overcomes noise uncertainty. As a result, we show that the information measure is highly dependent on target entropy and target measurement covariance. A constant information measure maintains a fixed spectral efficiency to support the RF convergence of radar and communications. The result is a radar implementing a novel target scheduling algorithm based on information instead of heuristic or ad hoc methods. The CIR mathematically ensures that spectral use is justified. Full article
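In the scalar linear-Gaussian case the information measure has a closed form, which makes the fixed-information revisit rule easy to sketch. The linear variance-growth model below is an illustrative assumption, not the paper's tracking model:

```python
import math

def mutual_information(P, R):
    """Scalar linear-Gaussian case: predicted state variance P and
    measurement noise variance R give I(x; z) = 0.5*ln(1 + P/R) nats,
    so more state uncertainty means a more informative measurement."""
    return 0.5 * math.log(1.0 + P / R)

def revisit_time(P0, q, R, I_target):
    """Time until a single measurement would yield I_target nats,
    assuming the predicted variance grows linearly, P(t) = P0 + q*t
    (a simple process-noise model used here for illustration).
    Low SNR (large R) pushes the revisit further out, as in the CIR."""
    P_needed = R * (math.exp(2.0 * I_target) - 1.0)
    return max(0.0, (P_needed - P0) / q)
```

A well-modeled target accumulates state uncertainty slowly (small q), so the fixed-information threshold is reached later and the radar revisits it less often.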

Open Access Article
Combined Forecasting of Streamflow Based on Cross Entropy
Entropy 2016, 18(9), 336; doi:10.3390/e18090336
Abstract
In this study, we developed a model of combined streamflow forecasting based on cross entropy to address the complexity of streamflow and the randomness of hydrological processes. First, we analyzed the streamflow data obtained from Wudaogou station on the Huifa River, the second tributary of the Songhua River, and found that the streamflow was characterized by fluctuations and periodicity and was closely related to rainfall. The proposed method involves selecting similar years based on the gray correlation degree. The forecasts produced by a time-series model (autoregressive integrated moving average), an improved grey forecasting model, and an artificial neural network model (a radial basis function) were used as the single forecasting models, and, from the viewpoint of the probability density, the method for determining the combination weights was improved by using the cross entropy model. The numerical results showed that, compared with the single forecasting models, the combined forecasting model improved stability, and its prediction accuracy was better than that of conventional combined forecasting models. Full article

Open Access Article
Entropy Minimizing Curves with Application to Flight Path Design and Clustering
Entropy 2016, 18(9), 337; doi:10.3390/e18090337
Abstract
Air traffic management (ATM) aims at providing companies with safe and ideally optimal aircraft trajectory planning. Air traffic controllers act on flight paths in such a way that no pair of aircraft come closer than the regulatory separation norms. With the increase of traffic, it is expected that the system will reach its limits in the near future: a paradigm change in ATM is planned with the introduction of trajectory-based operations. In this context, sets of well-separated flight paths are computed in advance, tremendously reducing the number of unsafe situations that must be dealt with by controllers. Unfortunately, the automated tools used to generate such plans generally issue trajectories that do not comply with operational practices or even flight dynamics. In this paper, a means of producing realistic air routes from the output of an automated trajectory design tool is investigated. For that purpose, the entropy of a system of curves is first defined, and a means of iteratively minimizing it is presented. The resulting curves form a route network that is suitable for use in a semi-automated ATM system with a human in the loop. The tool introduced in this work is quite versatile and may also be applied to the unsupervised classification of curves: an example is given for French traffic. Full article

Open Access Article
Sparse Trajectory Prediction Based on Multiple Entropy Measures
Entropy 2016, 18(9), 327; doi:10.3390/e18090327
Abstract
Trajectory prediction is an important problem with a large number of applications. A common approach to trajectory prediction is based on historical trajectories. However, existing techniques suffer from the “data sparsity problem”: the available historical trajectories are far from enough to cover all possible query trajectories. We propose the sparse trajectory prediction algorithm based on multiple entropy measures (STP-ME) to address the data sparsity problem. Firstly, the moving region is iteratively divided into a two-dimensional plane grid graph, and each trajectory is represented as a grid sequence with temporal information. Secondly, trajectory entropy is used to evaluate a trajectory's regularity, the L-Z entropy estimator is implemented to calculate trajectory entropy, and a new trajectory space is generated through trajectory synthesis. We define location entropy and time entropy to measure the popularity of locations and timeslots, respectively. Finally, a second-order Markov model that contains a temporal dimension is adopted to perform sparse trajectory prediction. The experiments show that, as the trip-completed percentage increases towards 90%, the coverage of the baseline algorithm decreases to almost 25%, while the STP-ME algorithm copes with this as expected, with only an unnoticeable drop in coverage, and can constantly answer almost 100% of query trajectories. The STP-ME algorithm improves the prediction accuracy by as much as 8%, 3%, and 4% compared to the baseline algorithm, the second-order Markov model (2-MM), and the sub-trajectory synthesis (SubSyn) algorithm, respectively. At the same time, the prediction time of the STP-ME algorithm is negligible (10 μs), greatly outperforming the baseline algorithm (100 ms). Full article
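One common Lempel-Ziv entropy-rate estimator for scoring the regularity of a symbol sequence (here, a trajectory encoded as grid-cell symbols) can be sketched as follows; the exact estimator and normalization used by STP-ME may differ:

```python
import math

def lz_entropy(seq):
    """Lempel-Ziv entropy-rate estimate (bits/symbol) of a symbol
    string. lam is the length of the shortest substring starting at
    position i that has not appeared earlier in the sequence; regular
    trajectories have long matches and therefore low estimates."""
    n = len(seq)
    total = 0
    for i in range(n):
        prefix = seq[:i]
        lam = 1
        while i + lam <= n and seq[i:i + lam] in prefix:
            lam += 1
        total += lam
    return n * math.log2(n) / total
```

A sequence of 16 distinct symbols scores the maximum 4 bits/symbol, while a constant sequence of the same length scores well under 1 bit/symbol, matching the intuition that repetitive movement is highly predictable.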