Abstract
There is a well-known analogy between statistical and quantum mechanics. In statistical mechanics, Boltzmann realized that the probability for a system in thermal equilibrium to occupy a given state is proportional to \(\exp(-E/kT)\), where \(E\) is the energy of that state. In quantum mechanics, Feynman realized that the amplitude for a system to undergo a given history is proportional to \(\exp(-S/i\hbar)\), where \(S\) is the action of that history. In statistical mechanics, we can recover Boltzmann's formula by maximizing entropy subject to a constraint on the expected energy. This raises the question: what is the quantum mechanical analogue of entropy? We give a formula for this quantity, which we call "quantropy". We recover Feynman's formula from assuming that histories have complex amplitudes, that these amplitudes sum to one, and that the amplitudes give a stationary point of quantropy subject to a constraint on the expected action. Alternatively, we can assume the amplitudes sum to one and that they give a stationary point of a quantity that we call "free action", which is analogous to free energy in statistical mechanics. We compute the quantropy, expected action and free action for a free particle and draw some conclusions from the results.
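The statistical-mechanics side of the analogy can be checked numerically. Maximizing entropy at fixed expected energy is equivalent to minimizing the free energy \(F = \langle E \rangle - kT\,S\) over the probability simplex; the minimizer should coincide with Boltzmann's formula \(p_i \propto \exp(-E_i/kT)\). The sketch below, using hypothetical energy levels and units with \(kT = 1\), verifies this with a generic constrained optimizer (not the paper's method):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical energy levels (arbitrary units) and temperature kT = 1.
E = np.array([0.0, 1.0, 2.0, 3.0])
kT = 1.0

# Maximizing entropy S = -sum(p log p) at fixed <E> is equivalent to
# minimizing the free energy F = <E> - kT * S over probability vectors p.
def free_energy(p):
    p = np.clip(p, 1e-12, None)          # avoid log(0)
    entropy = -np.sum(p * np.log(p))
    return np.dot(p, E) - kT * entropy

# Constrain the probabilities to sum to one and lie in [0, 1].
cons = {"type": "eq", "fun": lambda p: np.sum(p) - 1.0}
res = minimize(free_energy, np.full(len(E), 1.0 / len(E)),
               bounds=[(0.0, 1.0)] * len(E), constraints=cons)

# Boltzmann's formula gives the same distribution in closed form.
boltzmann = np.exp(-E / kT)
boltzmann /= boltzmann.sum()
print(res.x, boltzmann)
```

On the quantum side the same variational structure survives, but with complex amplitudes in place of probabilities, so "stationary point" replaces "maximum"; a numerical check there would require complex arithmetic and is beyond this sketch.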
Baez, J.C.; Pollard, B.S. Quantropy. Entropy 2015, 17, 772-789.