Entropy, Volume 9, Issue 1 (March 2007) – 3 articles , Pages 1-41

Article
Second Law Analysis of a non-Newtonian Laminar Falling Liquid Film Along an Inclined Heated Plate
by Rama Subba Reddy Gorla and David M. Pratt
Entropy 2007, 9(1), 30-41; https://doi.org/10.3390/e9010030 - 30 Mar 2007
Cited by 16 | Viewed by 6518
Abstract
The second law analysis of heat transfer of a non-Newtonian, laminar falling liquid film along an inclined heated plate is investigated. The upper surface of the liquid film is considered free and adiabatic. Velocity and temperature profiles are obtained analytically and used to compute the entropy generation number (Ns), irreversibility ratio (Φ) and the Bejan number (Be) for several values of the viscous dissipation parameter (BrΩ-1), viscosity index (n) and the dimensionless axial distance (X). The Bejan number increases in the transverse direction and decreases as the viscous dissipation parameter (BrΩ-1) increases. The numerical results show that the Bejan number decreases as the viscous dissipation parameter (BrΩ-1), Peclet number (Pe) and the viscosity index (n) increase.
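The quantities named in this abstract are related in a standard way in second-law (entropy generation) analysis. As a reminder of the conventional definitions (these follow Bejan's usual notation, not necessarily the exact conventions of this paper):

```latex
% Conventional second-law analysis definitions (Bejan's notation, assumed):
% N_s splits into heat-transfer and fluid-friction contributions.
N_s = N_{\text{heat}} + N_{\text{fric}}, \qquad
\Phi = \frac{N_{\text{fric}}}{N_{\text{heat}}}, \qquad
\mathrm{Be} = \frac{N_{\text{heat}}}{N_{\text{heat}} + N_{\text{fric}}}
          = \frac{1}{1 + \Phi}
```

Under these definitions Be ranges from 0 (friction-dominated irreversibility) to 1 (heat-transfer-dominated), which is why a growing viscous dissipation parameter drives the Bejan number down.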
Other
An Adaption of the Jaynes Decision Algorithm
by Vlad Tarko
Entropy 2007, 9(1), 27-29; https://doi.org/10.3390/e9010027 - 31 Jan 2007
Viewed by 6593
Abstract
There are two types of decisions: given the estimated state of affairs, one decides to change oneself in a certain way (that is best suited for the given conditions); given what one is, one decides to change the state of affairs in a certain way (that is best suited for what one wants for oneself). Jaynes' approach to decision theory accounts only for the first type of decisions, the case when one is just an observer of the external world and the decision doesn't change the world. However, many decisions involve the wish to transform the external environment. To account for this we need to add an additional step in Jaynes' proposed algorithm.
Article
A Utility-Based Approach to Some Information Measures
by Craig Friedman, Jinggang Huang and Sven Sandow
Entropy 2007, 9(1), 1-26; https://doi.org/10.3390/e9010001 - 20 Jan 2007
Cited by 17 | Viewed by 8523
Abstract
We review a decision theoretic, i.e., utility-based, motivation for entropy and Kullback-Leibler relative entropy, the natural generalizations that follow, and various properties of these generalized quantities. We then consider these generalized quantities in an easily interpreted special case. We show that the resulting quantities share many of the properties of entropy and relative entropy, such as the data processing inequality and the second law of thermodynamics. We formulate an important statistical learning problem – probability estimation – in terms of a generalized relative entropy. The solution of this problem reflects general risk preferences via the utility function; moreover, the solution is optimal in a sense of robust absolute performance.
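The Kullback-Leibler relative entropy that this paper generalizes can be illustrated numerically. The following is a minimal sketch of the standard discrete definition, not code from the paper (which develops its utility-based generalization analytically):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(p || q) in nats.

    p, q: probability distributions over the same finite outcome set.
    Assumes q[i] > 0 wherever p[i] > 0 (absolute continuity);
    terms with p[i] == 0 contribute nothing, by convention 0 * log 0 = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# D(p || p) = 0, and D(p || q) >= 0 with equality iff p == q
# (Gibbs' inequality) -- the property underlying the data processing
# inequality mentioned in the abstract.
p = [0.5, 0.5]
q = [0.9, 0.1]
```

Note that D(p || q) is asymmetric in its arguments, which is one reason the utility-based reading (a gain in expected utility from betting with the better model) is a natural way to interpret it.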