
Research
p. 1178-1190
Received: 23 December 2013 / Revised: 18 February 2014 / Accepted: 19 February 2014 / Published: 25 February 2014

Abstract: We show that a special entropic quantifier, called the statistical complexity, becomes maximal at the transition between super-Poisson and sub-Poisson regimes. This acquires important connotations given the fact that these regimes are usually associated with, respectively, classical and quantum processes.
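A back-of-the-envelope way to see the two regimes mentioned in this abstract is the Fano factor (variance-to-mean ratio of counts): F > 1 is super-Poisson, F < 1 is sub-Poisson. This sketch is illustrative only; it is not the statistical complexity quantifier the paper studies, and the distributions below are hypothetical stand-ins.

```python
import numpy as np

def fano_factor(counts):
    """Fano factor F = variance / mean of a counting record:
    F > 1 super-Poisson, F = 1 Poisson, F < 1 sub-Poisson."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean()

rng = np.random.default_rng(0)
poisson = rng.poisson(lam=5.0, size=100_000)           # F ~ 1
super_p = rng.negative_binomial(5, 0.5, size=100_000)  # overdispersed, F ~ 2
sub_p = rng.binomial(10, 0.5, size=100_000)            # underdispersed, F ~ 0.5
```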

p. 1191-1210
Received: 10 September 2013 / Revised: 13 December 2013 / Accepted: 5 February 2014 / Published: 25 February 2014

Abstract: Conventional thermodynamics, which is formulated for our world populated by radiation and matter, can be extended to describe the physical properties of antimatter in two mutually exclusive ways: CP-invariant or CPT-invariant. Here we refer to invariance of physical laws under charge (C), parity (P) and time reversal (T) transformations. While in quantum field theory CPT invariance is a theorem confirmed by experiments, these symmetry principles, applied to macroscopic phenomena or to the whole of the Universe, remain hypotheses. Since the two versions of thermodynamics differ only in their treatment of antimatter, but coincide in describing our world dominated by matter, a clear, experimentally justified choice between CP invariance and CPT invariance in the context of thermodynamics is not possible at present. This work investigates the comparative properties of the CP- and CPT-invariant extensions of thermodynamics (focusing on the latter, which is less conventional than the former) and examines conditions under which these extensions can be experimentally tested.

p. 1211-1242
Received: 30 October 2013 / Revised: 10 December 2013 / Accepted: 24 December 2013 / Published: 25 February 2014

Abstract: For a multiple-input channel, one may define different capacity regions according to the error criterion, the type of code, and the presence of feedback. In this paper, we aim to draw a complete picture of the relations among these different capacity regions. To this end, we first prove that the average-error-probability capacity region of a multiple-input channel can be achieved by a random code under the criterion of maximum error probability. Moreover, we show that for a non-deterministic multiple-input channel with feedback, the capacity regions are the same under the two error criteria. In addition, we discuss two special classes of channels to shed light on the relation between different capacity regions. In particular, to illustrate the role of feedback, we provide a class of MACs for which feedback may enlarge the maximum-error-probability capacity regions, but not the average-error-probability capacity regions. Besides, we present a class of MACs for which the maximum-error-probability capacity regions are strictly smaller than the average-error-probability capacity regions (the first example showing this was due to G. Dueck). Unlike G. Dueck's enlightening example, in which a deterministic MAC was considered, our example includes and further generalizes his by taking both deterministic and non-deterministic MACs into account. Finally, we extend our results for a discrete memoryless two-input channel to the compound MAC, the arbitrarily varying MAC, and the MAC with more than two inputs.

p. 1243-1271
Received: 16 January 2014 / Revised: 12 February 2014 / Accepted: 17 February 2014 / Published: 26 February 2014

Abstract: This paper addresses the problem of balancing statistical economic data, when data structure is arbitrary and both uncertainty estimates and a ranking of data quality are available. Using a Bayesian approach, the prior configuration is described as a multivariate random vector and the balanced posterior is obtained by application of relative entropy minimization. The paper shows that conventional data balancing methods, such as generalized least squares, weighted least squares and biproportional methods are particular cases of the general method described here. As a consequence, it is possible to determine the underlying assumptions and range of application of each traditional method. In particular, the popular biproportional method is found to assume that all source data has the same relative uncertainty. Finally, this paper proposes a simple linear iterative method that generalizes the biproportional method to the data balancing problem with arbitrary data structure, uncertainty estimates and multiple data quality levels.
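The biproportional method discussed above is essentially RAS scaling: alternately rescale the rows and columns of a table until it matches prescribed marginal totals. A minimal sketch (the table and target margins are hypothetical; the grand totals must agree):

```python
import numpy as np

def ras_balance(A, row_totals, col_totals, iters=200):
    """Biproportional (RAS) scaling: alternately rescale rows then columns
    until the matrix matches the prescribed marginal totals."""
    X = A.astype(float).copy()
    for _ in range(iters):
        X *= (row_totals / X.sum(axis=1))[:, None]  # match row totals
        X *= (col_totals / X.sum(axis=0))[None, :]  # match column totals
    return X

# Hypothetical unbalanced 2x2 table with target margins.
X = ras_balance(np.array([[1.0, 2.0], [3.0, 4.0]]),
                row_totals=np.array([4.0, 6.0]),
                col_totals=np.array([5.0, 5.0]))
```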

p. 1272-1286
Received: 29 September 2013 / Revised: 12 February 2014 / Accepted: 19 February 2014 / Published: 27 February 2014

Abstract: The use of transfer entropy has proven helpful in detecting the direction of dynamical driving in the interaction of two processes, X and Y. In this paper, we present a different normalization for the transfer entropy, which is capable of better detecting the direction of information transfer. This new normalized transfer entropy is applied to the detection of the direction of energy flux transfer in a synthetic model of fluid turbulence, namely the Gledzer-Ohkitani-Yamada shell model. This well-studied model reproduces fully developed turbulence in Fourier space, characterized by an energy cascade towards the small scales (large wavenumbers k), so that applying the information-theoretic analysis to its output tests the reliability of the analysis tool rather than exploring the model physics. As a result, the presence of a direct cascade along the scales in the shell model and the locality of the interactions in wavenumber space emerge as expected, indicating the validity of this data analysis tool. In this context, the normalized version of transfer entropy, which accounts for the difference in the intrinsic randomness of the interacting processes, performs better, avoiding the wrong conclusions to which the "traditional" transfer entropy would lead.
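For orientation, a minimal plug-in estimator of (unnormalized) transfer entropy for discretized series with history length one can be sketched as follows; this is a generic textbook form, not the normalization proposed in the paper, and the driver-response pair is synthetic.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(y -> x) in bits for discrete series,
    history length 1: how much y's past improves prediction of x."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / pairs_xy[(x0, y0)]           # p(x1 | x0, y0)
        p_self = pairs_xx[(x1, x0)] / singles[x0]  # p(x1 | x0)
        te += (c / n) * np.log2(p_joint / p_self)
    return te

# Synthetic driver-response pair: x copies y with a one-step delay,
# so information should flow from y to x and not the other way.
rng = np.random.default_rng(0)
y = list(rng.integers(0, 2, size=20000))
x = [0] + y[:-1]
```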

p. 1287-1314
Received: 13 January 2014 / Revised: 8 February 2014 / Accepted: 12 February 2014 / Published: 27 February 2014

Abstract: Some known results from statistical thermophysics as well as from hydrology are revisited from a different perspective, trying: (a) to unify the notion of entropy in thermodynamic and statistical/stochastic approaches of complex hydrological systems and (b) to show the power of entropy and the principle of maximum entropy in inference, both deductive and inductive. The capability for deductive reasoning is illustrated by deriving the law of phase change of water (Clausius-Clapeyron) from scratch by maximizing entropy in a formal probabilistic frame. However, such deductive reasoning cannot work in more complex hydrological systems with diverse elements, yet the entropy maximization framework can help in inductive inference, necessarily based on data. Several examples of this type are provided in an attempt to link statistical thermophysics with hydrology with a unifying view of entropy.

p. 1315-1330
Received: 30 December 2013 / Revised: 18 February 2014 / Accepted: 25 February 2014 / Published: 28 February 2014

Abstract: The nonverbal transmission of information between social animals is a primary driving force behind their actions and, therefore, an important quantity to measure in animal behavior studies. Despite its key role in social behavior, the flow of information has only been inferred by correlating the actions of individuals with a simplifying assumption of linearity. In this paper, we leverage information-theoretic tools to relax this assumption. To demonstrate the feasibility of our approach, we focus on a robotics-based experimental paradigm, which affords consistent and controllable delivery of visual stimuli to zebrafish. Specifically, we use a robotic arm to maneuver a life-sized replica of a zebrafish in a predetermined trajectory as it interacts with a focal subject in a test tank. We track the fish and the replica through time and use the resulting trajectory data to measure the transfer entropy between the replica and the focal subject, which, in turn, is used to quantify one-directional information flow from the robot to the fish. In agreement with our expectations, we find that the information flow from the replica to the zebrafish is significantly more than the other way around. Notably, such information is specifically related to the response of the fish to the replica, whereby we observe that the information flow is reduced significantly if the motion of the replica is randomly delayed in a surrogate dataset. In addition, comparison with a control experiment, where the replica is replaced by a conspecific, shows that the information flow toward the focal fish is significantly more for a robotic than a live stimulus. These findings support the reliability of using transfer entropy as a measure of information flow, while providing indirect evidence for the efficacy of a robotics-based platform in animal behavioral studies.

p. 1331-1348
Received: 6 August 2013 / Revised: 15 January 2014 / Accepted: 21 February 2014 / Published: 3 March 2014

Abstract: RNA is usually classified as either structured or unstructured; however, neither category is adequate in describing the diversity of secondary structures expected in biological systems. We describe this diversity within the ensemble of structures by using two different metrics: the average Shannon entropy and the ensemble defect. The average Shannon entropy is a measure of the structural diversity calculated from the base pair probability matrix. The ensemble defect, a tool in identifying optimal sequences for a given structure, is a measure of the average number of structural differences between a target structure and all the structures that make up the ensemble, scaled to the length of the sequence. In this paper, we show examples and discuss various uses of these metrics in both structured and unstructured RNA. Exploring how these two metrics describe RNA as an ensemble of different structures, as would be found in biological systems, will push the field beyond the standard "structured" and "unstructured" categorization.
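A sketch of the average Shannon entropy computed from a base pair probability matrix, under the common convention that each row's leftover probability mass is the chance of that nucleotide being unpaired; the toy matrices are hypothetical.

```python
import numpy as np

def average_shannon_entropy(bpp):
    """Average per-nucleotide Shannon entropy (bits) from an (N, N)
    base-pair probability matrix; row i holds p(i pairs with j), and the
    leftover probability mass is treated as 'unpaired'."""
    N = bpp.shape[0]
    total = 0.0
    for row in bpp:
        probs = np.append(row, max(1.0 - row.sum(), 0.0))  # add unpaired prob
        probs = probs[probs > 0]
        total -= np.sum(probs * np.log2(probs))
    return total / N

# A fully determined hairpin-like structure: every pairing certain -> entropy 0.
certain = np.zeros((4, 4))
certain[0, 3] = certain[3, 0] = certain[1, 2] = certain[2, 1] = 1.0

# Each nucleotide paired or unpaired with probability 1/2 -> 1 bit each.
diverse = np.zeros((4, 4))
diverse[0, 3] = diverse[3, 0] = diverse[1, 2] = diverse[2, 1] = 0.5
```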

p. 1365-1375
Received: 21 January 2014 / Revised: 25 February 2014 / Accepted: 26 February 2014 / Published: 4 March 2014

Abstract: Information theory provides conceptual methods to quantify the amount of information contained in a single random variable, as well as the amount contained in and shared among two or more variables. Although these concepts have been successfully applied in hydrology and other fields, the evaluation of these quantities is sensitive to the assumptions made when estimating probabilities; one example is the histogram bin size used to estimate probabilities when calculating information-theoretic quantities via frequency methods. The present research introduces a method to take the uncertainty arising from these parameters into account in the evaluation of the North Sea's water level network. The main idea is that the entropy of a random variable can be represented as a probability distribution of possible values, instead of a deterministic value. The method consists of solving multiple scenarios of a multi-objective optimization problem in which information content is maximized and redundancy is minimized. Results include a probabilistic analysis of the effect of the chosen parameters on the resulting family of Pareto fronts, providing additional criteria for the selection of the final set of monitoring points.
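The bin-size sensitivity described above is easy to reproduce: a plug-in entropy estimate from a frequency histogram grows as the binning is refined. A minimal sketch with a hypothetical sample:

```python
import numpy as np

def histogram_entropy(data, bins):
    """Plug-in Shannon entropy (bits) of a sample from a frequency histogram."""
    counts, _ = np.histogram(data, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(1)
sample = rng.normal(size=5000)
# The same data yields a different entropy estimate for each bin count.
estimates = {b: histogram_entropy(sample, b) for b in (8, 32, 128)}
```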

p. 1376-1395
Received: 3 December 2013 / Revised: 21 February 2014 / Accepted: 5 March 2014 / Published: 7 March 2014

Abstract: Conditional independence tests have received special attention lately in the machine learning and computational intelligence literature as an important indicator of the relationship among the variables used by their models. In the field of probabilistic graphical models, which includes Bayesian network models, conditional independence tests are especially important for the task of learning the probabilistic graphical model structure from data. In this paper, we propose the full Bayesian significance test for tests of conditional independence for discrete datasets. The full Bayesian significance test is a powerful Bayesian test for precise hypotheses, as an alternative to frequentist significance tests (characterized by the calculation of the p-value).

p. 1396-1413
Received: 20 December 2013 / Revised: 24 February 2014 / Accepted: 5 March 2014 / Published: 10 March 2014

Abstract: We present two examples of finite-alphabet, infinite excess entropy processes generated by stationary hidden Markov models (HMMs) with countable state sets. The first, simpler example is not ergodic, but the second is. These are the first explicit constructions of processes of this type.

p. 1414-1425
Received: 5 December 2013 / Revised: 16 December 2013 / Accepted: 27 February 2014 / Published: 10 March 2014

Abstract: We are going back to the roots of the original solar neutrino problem: the analysis of data from solar neutrino experiments. The application of standard deviation analysis (SDA) and diffusion entropy analysis (DEA) to the Super-Kamiokande I and II data reveals that they represent a non-Gaussian signal. The Hurst exponent is different from the scaling exponent of the probability density function, and both the Hurst exponent and scaling exponent of the probability density function of the Super-Kamiokande data deviate considerably from the value of 0.5, which indicates that the statistics of the underlying phenomenon is anomalous. To develop a road to the possible interpretation of this finding, we utilize Mathai’s pathway model and consider fractional reaction and fractional diffusion as possible explanations of the non-Gaussian content of the Super-Kamiokande data.

p. 1426-1461
Received: 12 December 2013 / Revised: 7 February 2014 / Accepted: 24 February 2014 / Published: 11 March 2014

Abstract: We treat the non-equilibrium evolution of an open one-particle statistical system, subject to a potential and to an external "heat bath" (hb) with negligible dissipation. For the classical equilibrium Boltzmann distribution, W_{c,eq}, a non-equilibrium three-term hierarchy for moments fulfills Hermiticity, which allows one to justify an approximate long-time thermalization. That gives partial dynamical support to Boltzmann's W_{c,eq}, out of the set of classical stationary distributions, W_{c,st}, also investigated here, for which neither Hermiticity nor that thermalization hold, in general. For closed classical many-particle systems without hb (by using W_{c,eq}), the long-time approximate thermalization for three-term hierarchies is justified and yields an approximate Lyapunov function and an arrow of time. The largest part of the work treats an open quantum one-particle system through the non-equilibrium Wigner function, W. W_{eq} for a repulsive finite square well is reported. W's (< 0 in various cases) are assumed to be quasi-definite functionals regarding their dependences on momentum (q). That yields orthogonal polynomials, H_{Q,n}(q), for W_{eq} (and for stationary W_{st}), non-equilibrium moments, W_{n}, of W and hierarchies. For the first excited state of the harmonic oscillator, its stationary W_{st} is a quasi-definite functional, and the orthogonal polynomials and three-term hierarchy are studied. In general, the non-equilibrium quantum hierarchies (associated with W_{eq}) for the W_{n}'s are not three-term ones. As an illustration, we outline a non-equilibrium four-term hierarchy and its solution in terms of generalized operator continued fractions. Such structures also allow one to formulate long-time approximations, but make it more difficult to justify thermalization. For large thermal and de Broglie wavelengths, the dominant W_{eq} and a non-equilibrium equation for W are reported: the non-equilibrium hierarchy could plausibly be a three-term one and possibly not far from Gaussian, and thermalization could possibly be justified.

p. 1462-1483
Received: 5 February 2014 / Revised: 20 February 2014 / Accepted: 28 February 2014 / Published: 12 March 2014

Abstract: The chemical composition of interfaces (free surfaces and grain boundaries) is generally described by the Langmuir-McLean segregation isotherm, controlled by the Gibbs energy of segregation. Various components of the Gibbs energy of segregation, the standard and the excess ones, as well as other thermodynamic state functions of interfacial segregation (enthalpy, entropy and volume) are derived and their physical meaning is elucidated. The importance of the thermodynamic state functions of grain boundary segregation, their dependence on volume solid solubility, mutual solute-solute interaction and the pressure effect in ferrous alloys is demonstrated.

p. 1484-1492
Received: 21 January 2014 / Revised: 4 February 2014 / Accepted: 5 March 2014 / Published: 14 March 2014

Abstract: Recently there has been much effort in the quantum information community to prove (or disprove) the existence of symmetric informationally complete (SIC) sets of quantum states in arbitrary finite dimension. This paper strengthens the urgency of this question by showing that if SIC-sets exist: (1) by a natural measure of orthonormality, they are as close to being an orthonormal basis for the space of density operators as possible; and (2) in prime dimensions, the standard construction for complete sets of mutually unbiased bases and Weyl-Heisenberg covariant SIC-sets are intimately related: The latter represent minimum uncertainty states for the former in the sense of Wootters and Sussman. Finally, we contribute to the question of existence by conjecturing a quadratic redundancy in the equations for Weyl-Heisenberg SIC-sets.
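The d = 2 case of the Weyl-Heisenberg construction mentioned above can be verified numerically: acting on a known qubit SIC fiducial with the shift (X) and clock (Z) operators yields four states whose pairwise squared overlaps all equal 1/(d + 1) = 1/3, the defining SIC property. A sketch:

```python
import numpy as np

# Standard qubit SIC fiducial (a tetrahedron vertex on the Bloch sphere).
s = 1 / np.sqrt(3)
fiducial = np.array([np.sqrt((1 + s) / 2),
                     np.exp(1j * np.pi / 4) * np.sqrt((1 - s) / 2)])

X = np.array([[0, 1], [1, 0]], dtype=complex)   # shift (Pauli X)
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # clock (Pauli Z)

# Orbit of the fiducial under the Weyl-Heisenberg (Pauli) group, d = 2.
states = [fiducial, X @ fiducial, Z @ fiducial, X @ Z @ fiducial]
overlaps = [abs(np.vdot(u, v)) ** 2
            for i, u in enumerate(states) for v in states[i + 1:]]
```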

p. 1493-1500
Received: 23 January 2014 / Revised: 21 February 2014 / Accepted: 12 March 2014 / Published: 17 March 2014

Abstract: Shannon’s source entropy formula is not appropriate to measure the uncertainty of non-stationary processes. In this paper, we propose a new entropy measure for non-stationary processes, which is greater than or equal to Shannon’s source entropy. The maximum entropy of the non-stationary process has been considered, and it can be used as a design guideline in cryptography.

p. 1501-1514
Received: 4 December 2013 / Revised: 15 January 2014 / Accepted: 10 March 2014 / Published: 18 March 2014

Abstract: In this paper, we consider the time averaged distribution of discrete time quantum walks on the glued trees. In order to analyze the walks on the glued trees, we consider a reduction to the walks on path graphs. Using a spectral analysis of the Jacobi matrices defined by the corresponding random walks on the path graphs, we have a spectral decomposition of the time evolution operator of the quantum walks. We find significant contributions of the eigenvalues, ±1, of the Jacobi matrices to the time averaged limit distribution of the quantum walks. As a consequence, we obtain the lower bounds of the time averaged distribution.

p. 1515-1546
Received: 9 November 2013 / Revised: 7 January 2014 / Accepted: 5 February 2014 / Published: 18 March 2014

Abstract: Some features of hydro- and thermo-dynamics, as applied to atmospheres and to stellar structures, are puzzling: (1) the suggestion, first made by Laplace, that our atmosphere has an adiabatic temperature distribution, is confirmed for the lower layers, but the explanation for this is very controversial; (2) the standard treatment of relativistic thermodynamics does not favor a systematic treatment of mixtures, such as the mixture of a perfect gas with radiation; (3) the concept of mass density in applications of general relativity to stellar structures is less than completely satisfactory; and (4) arguments in which a concept of energy and entropy play a role, in the context of hydro-thermodynamical systems and gravitation, are not always convincing. It is proposed that a formulation of thermodynamics as an action principle may be a suitable approach to adopt for a new investigation of these matters. This paper formulates the thermodynamics of ideal gases in a constant gravitational field in terms of the Gibbsean action principle. This approach, in the simplest cases, does not deviate from standard practice, but it lays the foundations for a more systematic approach to the various extensions, such as the incorporation of radiation, the consideration of mixtures and the integration with general relativity. We study the interaction between an ideal gas and the photon gas and the propagation of sound in a vertical, isothermal column. We determine the entropy that allows for the popular isothermal equilibrium and introduce the study of the associated adiabatic dynamics. This leads to the suggestion that the equilibrium of an ideal gas must be isentropic, in which case, the role of solar radiation would be merely to compensate for the loss of energy by radiation into the cosmos. An experiment with a centrifuge is proposed, to determine the influence of gravitation on the equilibrium distribution with a very high degree of precision.

p. 1547-1570
Received: 2 January 2014 / Revised: 17 February 2014 / Accepted: 12 March 2014 / Published: 19 March 2014

Abstract: The principal methods for the definition of thermodynamic entropy are discussed with special reference to those developed by Carathéodory, the Keenan School, Lieb and Yngvason, and the present authors. An improvement of the latter method is then presented. Seven basic axioms are employed: three Postulates, which are considered as having a quite general validity, and four Assumptions, which identify the domains of validity of the definitions of energy (Assumption 1) and entropy (Assumptions 2, 3, 4). The domain of validity of the present definition of entropy is not restricted to stable equilibrium states. For collections of simple systems, it coincides with that of the proof of existence and uniqueness of an entropy function which characterizes the relation of adiabatic accessibility proposed by Lieb and Yngvason. However, our treatment does not require the formation of scaled copies, so that it applies not only to collections of simple systems, but also to systems contained in electric or magnetic fields and to small and few-particle systems.

p. 1571-1585
Received: 8 November 2013 / Revised: 14 January 2014 / Accepted: 5 March 2014 / Published: 20 March 2014

Abstract: In many data envelopment analysis (DEA) applications, the analyst confronts the difficulty that the selected data set is not suitable for traditional DEA models because of their poor discrimination. This paper presents an approach using Shannon's entropy to improve the discrimination of traditional DEA models. In this approach, DEA efficiencies are first calculated for all possible variable subsets and analyzed using Shannon's entropy theory to calculate the degree of importance of each subset in the performance measurement; we then combine the obtained efficiencies and the degrees of importance to generate a comprehensive efficiency score (CES), which can markedly improve the discrimination of traditional DEA models. Finally, the proposed approach is applied to data sets from the prior DEA literature.
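One common way to realize an entropy-based "degree of importance" like the one described above is the usual entropy-weighting convention: subsets whose efficiencies barely discriminate among decision-making units (DMUs) have near-maximal entropy and hence near-zero weight. This is a generic sketch with a hypothetical efficiency matrix, not the authors' exact formulation.

```python
import numpy as np

def entropy_weights(eff):
    """eff: (n_DMUs, n_subsets) matrix of DEA efficiencies.
    Returns the entropy-based degree of importance of each subset:
    low-entropy (discriminating) columns receive high weight."""
    p = eff / eff.sum(axis=0)                     # normalize each column
    n = eff.shape[0]
    e = -(p * np.log(p)).sum(axis=0) / np.log(n)  # normalized Shannon entropy
    return (1 - e) / (1 - e).sum()                # degrees of importance

# Hypothetical: subset 1 discriminates the three DMUs, subset 2 does not.
eff = np.array([[1.0, 0.9],
                [0.8, 0.9],
                [0.6, 0.9]])
d = entropy_weights(eff)
ces = eff @ d  # comprehensive efficiency scores
```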

p. 1586-1631
Received: 9 January 2014 / Revised: 7 February 2014 / Accepted: 12 March 2014 / Published: 21 March 2014

Abstract: Recommendation systems are information-filtering systems that tailor information to users on the basis of knowledge about their preferences. The ability of these systems to profile users is what enables such intelligent functionality, but at the same time it is the source of serious privacy concerns. In this paper we investigate a privacy-enhancing technology that aims at hindering an attacker's efforts to accurately profile users based on the items they rate. Our approach capitalizes on the combination of two perturbative mechanisms: the forgery and the suppression of ratings. While this technique enhances user privacy to a certain extent, it inevitably comes at the cost of a loss in data utility, namely a degradation of the recommendation's accuracy. In short, it poses a trade-off between privacy and utility. The theoretical analysis of this trade-off is the object of this work. We measure privacy as the Kullback-Leibler divergence between the user's and the population's item distributions, and quantify utility as the proportion of ratings users consent to forge and eliminate. Equipped with these quantitative measures, we find a closed-form solution to the problem of optimal forgery and suppression of ratings, an optimization problem that includes, as a particular case, the maximization of the entropy of the perturbed profile. We characterize the optimal trade-off surface among privacy, forgery rate and suppression rate, and experimentally evaluate how our approach could contribute to privacy protection in a real-world recommendation system.
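The privacy measure named above is straightforward to compute; the item distributions below are hypothetical.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

population = np.array([0.25, 0.25, 0.25, 0.25])  # average item distribution
user = np.array([0.70, 0.10, 0.10, 0.10])        # an atypical profile
perturbed = np.array([0.40, 0.20, 0.20, 0.20])   # after forgery/suppression

risk = kl_divergence(user, population)             # high: profile stands out
risk_after = kl_divergence(perturbed, population)  # lower: closer to the crowd
```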

p. 1632-1651
Received: 31 January 2014 / Revised: 25 February 2014 / Accepted: 12 March 2014 / Published: 21 March 2014

Abstract: Neuroimage registration has an important role in clinical (both diagnostic and therapeutic) and research applications. In this article we describe the applicability of Tsallis entropy as a new cost function for neuroimage registration through a comparative analysis of the traditional approaches (correlation-based: the entropy correlation coefficient (ECC) and normalized cross-correlation (NCC); and mutual-information-based: mutual information using Shannon entropy (MIS) and normalized mutual information (NMI)) and the proposed one based on mutual information using Tsallis entropy (MIT). We created phantoms with known geometric transformations using Single Photon Emission Computed Tomography (SPECT) and Magnetic Resonance Imaging from three morphologically normal subjects. The simulated volumes were registered to the original ones using both the proposed and the traditional approaches. The comparative analysis of the relative error (RE) showed that MIT was more accurate in intra-modality registration, whereas for inter-modality registration, MIT presented the lowest RE for rotational transformations and ECC the lowest RE for translational transformations. In conclusion, we have shown that, with certain limitations, Tsallis entropy can serve as a better cost function for reliable neuroimage registration.
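For reference, the Tsallis entropy underlying the MIT cost function has the closed form below; this is a generic sketch of the entropy itself (with a hypothetical entropic index q), not of the authors' registration pipeline.

```python
import numpy as np

def tsallis_entropy(p, q=1.5):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1);
    recovers the Shannon entropy (in nats) in the limit q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))
```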

p. 1652-1686
Received: 15 November 2013 / Revised: 5 March 2014 / Accepted: 7 March 2014 / Published: 21 March 2014

Abstract: The time evolution during which macroscopic systems reach thermodynamic equilibrium states proceeds as a continuous sequence of contact structure preserving transformations maximizing the entropy. This viewpoint of mesoscopic thermodynamics and dynamics provides a unified setting for the classical equilibrium and nonequilibrium thermodynamics, kinetic theory, and statistical mechanics. One of the illustrations presented in the paper is a new version of extended nonequilibrium thermodynamics with fluxes as extra state variables.

p. 1687-1727
Received: 2 December 2013 / Revised: 25 January 2014 / Accepted: 27 February 2014 / Published: 21 March 2014

Abstract: The primordial confrontation underlying the existence of our Universe can be conceived as the battle between entropy and complexity. The law of ever-increasing entropy (the Boltzmann H-theorem) evokes an irreversible, one-directional evolution (or rather involution) going uniformly and monotonically from birth to death. Since the 19th century, this concept has been one of the cornerstones, and at the same time one of the puzzles, of statistical mechanics. On the other hand, there is the empirical experience in which one witnesses the emergence, growth and diversification of new self-organized objects with ever-increasing complexity. When modeling them in terms of simple discrete elements, one finds that the emergence of collective complex adaptive objects is a rather generic phenomenon governed by a new type of laws. These "emergence" laws, not connected directly with the fundamental laws of physical reality, nor acting "in addition" to them, but acting through them, were called "More is Different" by Phil Anderson and "das Maass" by Hegel, among others. Even though the emergence laws act through the intermediary of the fundamental laws that govern the individual elementary agents, it turns out that systems governed by very different fundamental laws (gravity, chemistry, biology, economics, social psychology) often end up with similar emergence laws and outcomes. In particular, the emergence of adaptive collective objects endows the system with a granular structure, which in turn causes specific macroscopic cycles of intermittent fluctuations.

p. 1743-1755
Received: 17 January 2014 / Revised: 10 March 2014 / Accepted: 18 March 2014 / Published: 24 March 2014

Abstract: Transfer entropy is a frequently employed measure of conditional co-dependence in non-parametric analysis of Granger causality. In this paper, we derive analytical expressions for the transfer entropy of the multivariate exponential, logistic, Pareto (types I–IV) and Burr distributions. The latter two fall into the class of fat-tailed distributions with power-law properties, used frequently in the biological, physical and actuarial sciences. We discover that the transfer entropy expressions for all four distributions are identical and depend merely on the multivariate distribution parameter and the number of distribution dimensions. Moreover, we find that in all four cases the transfer entropies are given by the same decreasing function of distribution dimensionality.

p. 1756-1807
Received: 30 December 2013 / Revised: 27 January 2014 / Accepted: 7 March 2014 / Published: 24 March 2014

| Cited by 13 | PDF Full-text (295 KB) | HTML Full-text | XML Full-text
Abstract: We present the state of the art in modern mathematical methods of exploiting the entropy principle in the thermomechanics of continuous media. Recent results and conceptual discussions of this topic in several well-known non-equilibrium theories (Classical irreversible thermodynamics (CIT), Rational thermodynamics (RT), Thermodynamics of irreversible processes (TIP), Extended irreversible thermodynamics (EIT), Rational extended thermodynamics (RET)) are also surveyed.

p. 1808-1818
Received: 12 February 2014 / Revised: 18 March 2014 / Accepted: 19 March 2014 / Published: 24 March 2014

| Cited by 2 | PDF Full-text (1317 KB) | HTML Full-text | XML Full-text
Abstract: Specific heat was systematically measured by the heat flow method in Ni_{50-x}Co_{x}Mn_{50-y}Al_{y} metamagnetic shape memory alloys near the martensitic transformation temperatures. The martensitic transformation and the ferromagnetic–paramagnetic transition of the parent phase were directly observed via the specific heat measurements. On the basis of the experimental results, the entropy change was estimated and found to show an abrupt decrease below the Curie temperature. The results are consistent with those of earlier studies on Ni-Co-Mn-Al alloys.
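The entropy change referred to in this abstract is conventionally estimated from specific heat data via ΔS = ∫ C(T)/T dT. A minimal numerical sketch using the trapezoidal rule (the numbers are illustrative only, not data from the paper):

```python
def entropy_change(temps, heat_capacity):
    """Entropy change dS = integral of C(T)/T dT, by the trapezoidal rule.
    temps in kelvin (ascending); heat_capacity in J/(mol K)."""
    integrand = [c / t for c, t in zip(heat_capacity, temps)]
    ds = 0.0
    for i in range(len(temps) - 1):
        ds += 0.5 * (integrand[i] + integrand[i + 1]) * (temps[i + 1] - temps[i])
    return ds

# Illustrative check: constant C = 25 J/(mol K) from 200 K to 400 K
# gives dS = C * ln(400/200) = 25 * ln 2 ~ 17.33 J/(mol K)
temps = [200 + i for i in range(201)]
cap = [25.0] * len(temps)
print(entropy_change(temps, cap))
```

With measured C(T) data the same integral, taken across the transformation region, yields the transformation entropy change discussed in the paper.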

p. 1819-1841
Received: 16 December 2013 / Revised: 18 March 2014 / Accepted: 19 March 2014 / Published: 24 March 2014

| PDF Full-text (438 KB) | HTML Full-text | XML Full-text
Abstract: This paper considers the four-node relay-eavesdropper channel, where a relay node helps the source send secret messages to the destination in the presence of a passive eavesdropper. For the discrete memoryless case, we propose a hybrid cooperative coding scheme that combines the partial decode-forward scheme, the noise-forward scheme and the random binning scheme. The key feature of the proposed hybrid cooperative scheme is that the relay integrates the explicit and implicit cooperation strategies by forwarding source messages and additional interference at the same time. The derived achievable secrecy rate shows that several existing schemes can be viewed as special cases of the proposed one. The achievable secrecy rate is then extended to the Gaussian channel using Gaussian codebooks, and the optimal power policy is identified in the high-power region. Both analysis and numerical results demonstrate that the proposed hybrid cooperative coding scheme outperforms comparable existing schemes, especially in the high-power region.
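For context, the baseline against which cooperative secrecy schemes are measured is the classical Gaussian wiretap secrecy capacity, C_s = [log2(1 + SNR_d) − log2(1 + SNR_e)]^+, i.e. the positive part of the main-channel rate minus the eavesdropper-channel rate. A minimal sketch of this textbook baseline (not the paper's hybrid relay scheme):

```python
from math import log2

def secrecy_rate(snr_main, snr_eve):
    """Secrecy capacity of a Gaussian wiretap channel, in bits per channel use:
    positive part of main-channel rate minus eavesdropper-channel rate."""
    return max(0.0, log2(1 + snr_main) - log2(1 + snr_eve))

print(secrecy_rate(15.0, 3.0))  # log2(16) - log2(4) = 2 bits per use
print(secrecy_rate(3.0, 15.0))  # 0.0: the eavesdropper's channel is better
```

Relay cooperation (decode-forward, noise-forward, or the paper's hybrid of both) aims to enlarge precisely this gap, either by boosting the destination's effective SNR or by degrading the eavesdropper's.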

Other
p. 1349-1364
Received: 6 January 2014 / Revised: 12 February 2014 / Accepted: 26 February 2014 / Published: 3 March 2014

| Cited by 1 | PDF Full-text (349 KB) | HTML Full-text | XML Full-text
Abstract: This paper demonstrates a robust maximum entropy approach to estimating flexible-form farm-level multi-input/multi-output production functions using minimally specified disaggregated data. Since our goal is to address policy questions, we emphasize the model’s ability to reproduce characteristics of the existing production system and to predict outcomes of policy changes at a disaggregate level. Measuring the distributional impacts of policy changes requires farm-level models estimated across a wide spectrum of farm sizes and types, which is often difficult with traditional econometric methods due to data limitations. We use a two-stage approach to generate observation-specific shadow values for incompletely priced inputs. We then use the shadow values and nominal input prices to estimate crop-specific production functions using generalized maximum entropy (GME), capturing the individual heterogeneity of the production environment while replicating observed inputs and outputs. The two-stage GME approach can be implemented with small data sets. We demonstrate the methodology in an empirical application to a small cross-section data set for Northern Rio Bravo, Mexico, estimating production functions for small family farms and moderate commercial farms. The estimates show considerable distributional differences resulting from policies that change water subsidies in the region or shift price supports to direct payments.
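In its simplest form, the GME idea referenced in this abstract reparameterizes an unknown quantity as a probability-weighted average over chosen support points and maximizes the entropy of those weights subject to a data constraint. The sketch below solves the one-constraint case by bisection on the dual variable; the support points, target value and all names are hypothetical illustrations, and the paper's two-stage, multi-equation estimator is far richer:

```python
from math import exp

def gme_weights(support, target, lo=-50.0, hi=50.0):
    """Maximum-entropy weights p_k over support points z_k subject to
    sum(p) = 1 and sum(p_k * z_k) = target.  The solution has the
    exponential form p_k ~ exp(lam * z_k); lam is found by bisection,
    since the implied mean is monotone increasing in lam."""
    def implied_mean(lam):
        w = [exp(lam * z) for z in support]
        s = sum(w)
        return sum(z * wi for z, wi in zip(support, w)) / s
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if implied_mean(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [exp(lam * z) for z in support]
    s = sum(w)
    return [wi / s for wi in w]

# Hypothetical example: a parameter bounded in [0, 1] on five support
# points, with weights whose mean matches an "observed" value of 0.3
z = [0.0, 0.25, 0.5, 0.75, 1.0]
p = gme_weights(z, 0.3)
print(sum(p), sum(pi * zi for pi, zi in zip(p, z)))
```

Among all weight vectors consistent with the data constraint, the entropy-maximizing one is the least committal, which is what makes the approach usable with the small samples emphasized in the paper.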

p. 1728-1742
Received: 20 December 2013 / Revised: 10 February 2014 / Accepted: 19 March 2014 / Published: 24 March 2014

| Cited by 3 | PDF Full-text (518 KB) | HTML Full-text | XML Full-text
Abstract: This study applies a new method and technology to measure the discharge in a lined canal in Taiwan. An Acoustic Digital Current Meter mounted on a measurement platform is used to measure velocities over the full cross-section in order to establish the measurement method. The proposed method primarily employs Chiu’s Equation, which is based on entropy, to establish a constant ratio between the maximum and mean velocities in an irrigation canal, and computes the maximum velocity from the observed velocity profile. Consequently, the mean velocity of the lined canal can be rapidly determined from the maximum velocity and the constant ratio. The cross-sectional area of the artificial irrigation canal can be calculated from the water stage. Finally, the discharge in the lined canal can be efficiently determined from the estimated mean velocity and the cross-sectional area. Using discharge and stage data collected in the Wan-Dan Canal, a stage-discharge relation is also developed for remote real-time monitoring and for estimating discharge from the pumping station. Overall, Chiu’s Equation is demonstrated to measure discharge in a lined canal reliably and accurately, and can serve as a reference for future calibration of a stage-discharge rating curve.
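Chiu’s Equation referred to in this abstract links mean and maximum velocity through an entropy parameter M: ū/u_max = e^M/(e^M − 1) − 1/M. A minimal sketch of the resulting discharge computation (the value of M and all numbers below are hypothetical, not calibrated to the Wan-Dan Canal):

```python
from math import exp

def chiu_ratio(m):
    """Chiu's entropy-based ratio of mean to maximum velocity,
    phi(M) = e^M / (e^M - 1) - 1/M, for entropy parameter M > 0."""
    return exp(m) / (exp(m) - 1.0) - 1.0 / m

def discharge(u_max, m, area):
    """Discharge Q = phi(M) * u_max * A for a channel cross-section,
    given the maximum velocity u_max (m/s) and area A (m^2)."""
    return chiu_ratio(m) * u_max * area

# Hypothetical numbers: M = 2.1, maximum velocity 1.2 m/s, area 8.5 m^2
q = discharge(1.2, 2.1, 8.5)
print(round(q, 3))
```

Once M is calibrated for a given canal, a single maximum-velocity observation plus the stage-derived area suffices to estimate discharge, which is what makes the method fast for routine monitoring.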
