Entropy 2015, 17(5), 2642-2654; doi:10.3390/e17052642 - published 24 April 2015
Abstract: After collecting data from observations or experiments, the next step is to analyze the data and build an appropriate mathematical or stochastic model to describe them, so that further studies can be carried out with the help of the model. In this article, the input-output type mechanism is considered first, into which reaction, diffusion, reaction-diffusion, and production-destruction type physical situations fit. Next, techniques are described for producing thicker or thinner tails (power-law behavior) in stochastic models. Finally, the pathway idea is described, whereby one can switch among different functional forms of the probability density function through a parameter called the pathway parameter. The paper is a continuation of related solar neutrino research published previously in this journal.
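As a rough numerical illustration of the pathway idea, the sketch below assumes the scalar pathway kernel [1 - a(1-alpha)x^delta]^(1/(1-alpha)); this functional form and the parameter names are assumptions for illustration, not taken from the paper. As the pathway parameter alpha approaches 1, the kernel tends to the exponential form exp(-a x^delta), switching the density family.

```python
import math

def pathway_kernel(x, alpha, a=1.0, delta=1.0):
    """Unnormalized pathway kernel [1 - a(1-alpha)x^delta]^(1/(1-alpha)).

    alpha < 1 : finite support (type-1 beta form)
    alpha > 1 : power-law tail (type-2 beta form)
    alpha -> 1: collapses to exp(-a x^delta) (gamma/exponential form)
    """
    base = 1.0 - a * (1.0 - alpha) * x ** delta
    if base <= 0.0:          # outside the support when alpha < 1
        return 0.0
    return base ** (1.0 / (1.0 - alpha))

x = 0.7
# As the pathway parameter approaches 1, the kernel tends to exp(-a x^delta)
for alpha in (0.5, 0.9, 0.99, 0.999):
    print(alpha, pathway_kernel(x, alpha))
print("limit:", math.exp(-x))
```

Evaluating at a fixed x shows the kernel values converging to exp(-x) as alpha increases toward 1, which is the "switching" behavior the pathway parameter provides.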
Entropy 2015, 17(5), 2624-2641; doi:10.3390/e17052624 - published 24 April 2015
Abstract: Because of the importance of damage detection in manufacturing systems and other areas, many fault detection methods based on vibration signals have been developed. Little work, however, has been reported in the literature on using a recurrence plot method to analyze the vibration signal for damage detection. In this paper, we develop a recurrence plot based fault detection method by integrating it with statistical process control techniques. Recurrence plots of the vibration signals are first derived using the recurrence plot (RP) method. Five types of features are extracted from the recurrence plots to quantify the vibration signals' characteristics. Then a control chart, a multivariate statistical process control technique, is used to monitor these features. The control chart technique, however, assumes that the data follow a normal distribution. To avoid this assumption, an RP based bootstrap control chart is proposed in which the control chart parameters are estimated by bootstrapping. The performance of the proposed RP based bootstrap control chart is evaluated in a simulation study and compared with univariate bootstrap control charts based on individual recurrence plot features. A real case study of rolling element bearing fault detection demonstrates that the proposed fault detection method achieves very good performance.
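A minimal sketch of the bootstrap step, assuming simple percentile limits on resampled means of a single RP feature; the feature values, resampling scheme, and chart parameters below are hypothetical illustrations, not the paper's exact procedure or case-study data.

```python
import random

def bootstrap_control_limits(in_control, n_boot=2000, alpha=0.0027, seed=42):
    """Estimate control-chart limits from in-control feature values by
    bootstrapping, with no normality assumption: resample with replacement,
    record the resample means, and take the alpha/2 and 1 - alpha/2
    percentiles as the lower and upper control limits."""
    rng = random.Random(seed)
    n = len(in_control)
    means = sorted(
        sum(rng.choice(in_control) for _ in range(n)) / n
        for _ in range(n_boot)
    )
    lo = means[int(alpha / 2 * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical in-control RP feature values (e.g. recurrence rate per window)
features = [0.21, 0.19, 0.22, 0.20, 0.18, 0.23, 0.20, 0.21, 0.19, 0.22]
lcl, ucl = bootstrap_control_limits(features)
print("LCL, UCL:", lcl, ucl)
# A new window whose feature falls outside [LCL, UCL] signals a potential fault
```

Because the limits come from the empirical resampling distribution rather than a normal approximation, they remain valid for skewed or heavy-tailed feature distributions.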
Entropy 2015, 17(5), 2606-2623; doi:10.3390/e17052606 - published 23 April 2015
Abstract: In this paper, we model discrete time series as discrete Markov processes of arbitrary order and derive the approximate distribution of the Kullback-Leibler divergence between a known transition probability matrix and its sample estimate. We introduce two new information-theoretic measures: information memory loss and information codependence structure. The former measures the memory content within a Markov process and determines its optimal order. The latter assesses the codependence among Markov processes. Both measures are evaluated on toy examples and applied to high-frequency foreign exchange data, focusing on the 2008 financial crisis and the 2010/2011 Euro crisis.
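The divergence computation can be sketched as follows for a first-order chain: estimate the transition matrix from a state sequence by maximum likelihood, then compute a row-wise KL divergence against the known matrix. The toy sequence and the weighting by empirical state frequencies are illustrative assumptions, not the paper's exact estimator.

```python
import math

def estimate_transition_matrix(seq, n_states):
    """MLE of a first-order Markov transition matrix from a state sequence:
    normalized counts of observed transitions."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1
    P = []
    for row in counts:
        total = sum(row)
        P.append([c / total if total else 1.0 / n_states for c in row])
    return P

def kl_divergence(P, Q, weights):
    """Row-wise KL divergence: sum_i w_i * sum_j P_ij * log(P_ij / Q_ij)."""
    return sum(
        w * sum(p * math.log(p / q) for p, q in zip(prow, qrow) if p > 0)
        for w, prow, qrow in zip(weights, P, Q)
    )

# Known two-state chain P and a short observed state path (toy data)
P = [[0.9, 0.1], [0.2, 0.8]]
seq = [0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0, 1, 1, 0, 0, 0, 1, 1, 1, 0]
P_hat = estimate_transition_matrix(seq, 2)
# Weight the rows by the empirical state frequencies
freq = [seq.count(s) / len(seq) for s in range(2)]
print("KL(P_hat || P) =", kl_divergence(P_hat, P, freq))
```

As the sample path grows, P_hat converges to the true matrix and the divergence shrinks toward zero, which is what makes its approximate distribution useful for testing model fit.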
Entropy 2015, 17(5), 2590-2605; doi:10.3390/e17052590 - published 23 April 2015
Abstract: An agent-based financial time series model is constructed and investigated using a statistical physics system, the finite-range interacting voter system. The voter system originally describes the collective behavior of voters who constantly update their positions on a particular topic, and is a continuous-time Markov process. In the proposed model, the fluctuations of stock price changes are attributed to the interaction of market information among the traders and to certain similarities in investors' behavior. Further, the complexity of the return series of the financial model is studied by composite multiscale entropy analysis and recurrence analysis, in comparison with two real stock indexes, the Shanghai Stock Exchange Composite Index and the Hang Seng Index. The empirical research shows that, to some extent, the simulated data from the proposed model capture natural features of actual markets.
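The composite multiscale entropy analysis mentioned above can be sketched roughly as follows: coarse-grain the series at each scale, compute sample entropy of each shifted coarse-graining, and average. This is a simplified textbook-style implementation with illustrative parameters, not the authors' code.

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """SampEn: -ln(A/B), where B counts template-pair matches of length m and
    A those of length m + 1, within tolerance r (Chebyshev distance)."""
    def count_matches(mm):
        n = len(x) - mm
        c = 0
        for i in range(n):
            for j in range(i + 1, n):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c
    B, A = count_matches(m), count_matches(m + 1)
    return float("inf") if A == 0 or B == 0 else -math.log(A / B)

def composite_mse(x, scale, m=2, r=0.2):
    """Composite multiscale entropy at one scale: average SampEn over all
    `scale` shifted coarse-grainings of the series."""
    vals = []
    for shift in range(scale):
        cg = [sum(x[i:i + scale]) / scale
              for i in range(shift, len(x) - scale + 1, scale)]
        vals.append(sample_entropy(cg, m, r))
    return sum(vals) / scale

# Illustrative use on a pseudo-random "return" series
rng = random.Random(0)
returns = [rng.random() for _ in range(160)]
print("CMSE at scale 2:", composite_mse(returns, 2, m=2, r=0.3))
```

Averaging over all shifted coarse-grainings (rather than using only one, as in plain multiscale entropy) reduces the variance of the entropy estimate at large scales, which is the point of the composite variant.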
Entropy 2015, 17(5), 2573-2589; doi:10.3390/e17052573 - published 23 April 2015
Abstract: A systematic method of transferring information from coarser to finer resolution based on renormalization group (RG) transformations is introduced. It permits building informative priors at finer scales from posteriors at coarser scales since, under some conditions, RG transformations in the space of hyperparameters can be inverted. These priors are updated into posteriors by Maximum Entropy using renormalized data. The resulting inference method, backward RG (BRG) priors, is tested by simulating a functional magnetic resonance imaging (fMRI) experiment, and its results are compared with a Bayesian approach working at the finest available resolution. Using BRG priors, sources can be partially identified even at signal-to-noise ratios as low as ~-25 dB, a vast improvement over the single-step Bayesian approach. For low noise levels, the BRG prior is not an improvement over the single-scale Bayesian method. Analysis of the hyperparameter histograms can show how to distinguish whether the method is failing due to very high noise levels or whether identification of the sources is at least partially possible.
Entropy 2015, 17(5), 2556-2572; doi:10.3390/e17052556 - published 23 April 2015
Abstract: An accelerated degradation test (ADT) is regarded as an effective alternative to an accelerated life test in the sense that an ADT can provide more accurate information on product reliability, even when few or no failures may be expected before the end of a practical test period. In this paper, statistical methods for the optimal design of ADT plans are developed under the assumption that the degradation characteristic follows a gamma process (GP). The GP-based approach has the advantage that it can handle the frequently encountered situation in which degradation must always be nonnegative and strictly increasing over time. The optimal ADT plan is developed under a total experimental cost constraint by determining the optimal settings of variables such as the number of measurements, the measurement times, the test stress levels, and the number of units allocated to each stress level, so as to minimize the asymptotic variance of the maximum likelihood estimator of the q-th quantile of the lifetime distribution at the use condition. In addition, compromise plans are developed to provide a means of checking the adequacy of the assumed acceleration model. Finally, sensitivity analysis procedures for assessing the effects of uncertainty in the pre-estimates of unknown parameters are illustrated with an example.
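A gamma-process degradation path of the kind assumed here can be sketched as follows. Parameterizing the increments with shape proportional to elapsed time is a common GP convention; the parameter values, measurement schedule, and failure threshold below are hypothetical, not taken from the paper's example.

```python
import random

def simulate_gamma_process(alpha, beta, times, seed=1):
    """Simulate a gamma degradation process: the increment over
    [t_{k-1}, t_k] is an independent Gamma(shape=alpha*(t_k - t_{k-1}),
    scale=beta) variate, so the path is nonnegative and strictly
    increasing, as the GP model requires."""
    rng = random.Random(seed)
    path, x, prev = [], 0.0, 0.0
    for t in times:
        x += rng.gammavariate(alpha * (t - prev), beta)
        path.append(x)
        prev = t
    return path

# Hypothetical measurement schedule for one test unit
times = [1, 2, 3, 4, 5, 6, 7, 8]
path = simulate_gamma_process(alpha=2.0, beta=0.5, times=times)
print(path)

# Pseudo failure time: first measurement at which degradation crosses a
# hypothetical threshold
threshold = 5.0
failure_idx = next((i for i, x in enumerate(path) if x >= threshold), None)
print("first crossing at measurement index:", failure_idx)
```

Repeating this simulation over candidate settings of the measurement count, measurement times, and stress allocations is one way an ADT plan's estimator variance can be explored numerically before committing to a physical test.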