Boltzmann was one of the founding geniuses of statistical thermodynamics, and yet the subtleties of probability still tripped him up.
From “Compendium of the foundations of classical statistical physics” by Jos Uffink:
He introduced the probability distribution as follows:
“Let φ(v)dv be the sum of all the instants of time during which the velocity of a disc in the course of a very long time lies between v and v + dv, and let N be the number of discs which on average are located in a unit surface area, then

Nφ(v)dv

is the number of discs per unit surface whose velocities lie between v and v + dv”
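Boltzmann’s definition can be read as a time average: follow a single disc for a very long time, and take φ(v)dv to be the fraction of that time its speed spends between v and v + dv. Here is a minimal numerical sketch of that reading — the collision dynamics (Gaussian velocity components resampled at each step) and the value N = 100 are illustrative assumptions, not Boltzmann’s hard-disc model:

```python
import random
import math

random.seed(0)

# Follow one disc over many "collisions"; phi(v) dv is estimated as the
# fraction of time steps during which its speed lies in [v, v + dv).
steps = 200_000
dv = 0.1
bins = [0.0] * 50  # time spent in each speed bin [k*dv, (k+1)*dv)

for _ in range(steps):
    vx, vy = random.gauss(0, 1), random.gauss(0, 1)  # 2-D disc velocity
    v = math.hypot(vx, vy)
    k = int(v / dv)
    if k < len(bins):
        bins[k] += 1

# Estimated density phi(v) at each bin; the time fractions sum to (nearly) 1.
phi = [b / (steps * dv) for b in bins]

# With N discs per unit area (a hypothetical figure), N*phi(v)*dv is then
# the number of discs per unit surface with speed in [v, v + dv).
N = 100
counts = [N * p * dv for p in phi]

print(sum(p * dv for p in phi))  # total time-fraction, close to 1
```

For these dynamics the estimated φ(v) approximates the two-dimensional Maxwell distribution v·exp(−v²/2), which peaks near v = 1; the subtlety Boltzmann wrestled with is precisely whether such a time-fraction can be identified with a probability over the ensemble of discs.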
I just discovered this treasure trove on the foundations and history of statistical mechanics:
Compendium of the foundations of classical statistical physics, by Jos Uffink (PDF)
Roughly speaking, classical statistical physics is the branch of theoretical physics that aims to account for the thermal behaviour of macroscopic bodies in terms of a classical mechanical model of their microscopic constituents, with the help of probabilistic assumptions. In the last century and a half, a fair number of approaches have been developed to meet this aim. This study of their foundations assesses their coherence and analyzes the motivations for their basic assumptions, and the interpretations of their central concepts. The most outstanding foundational problems are the explanation of time-asymmetry in thermal behaviour, the relative autonomy of thermal phenomena from their microscopic underpinning, and the meaning of probability.
A more or less historic survey is given of the work of Maxwell, Boltzmann and Gibbs in statistical physics, and the problems and objections to which their work gave rise. Next, we review some modern approaches to (i) equilibrium statistical mechanics, such as ergodic theory and the theory of the thermodynamic limit; and to (ii) non-equilibrium statistical mechanics as provided by Lanford’s work on the Boltzmann equation, the so-called Bogolyubov-Born-Green-Kirkwood-Yvon approach, and stochastic approaches such as ‘coarse-graining’ and the ‘open systems’ approach. In all cases, we focus on the subtle interplay between probabilistic assumptions, dynamical assumptions, initial conditions and other ingredients used in these approaches.
This will keep me busy.