Gibbs entropy / Partition Function / Free energy#
We alluded to the intuition that, in large systems, the product of the number of microstates and the Boltzmann factor is strongly peaked. The typical states of a large system lie on that peak, where the Helmholtz free energy \(F=E_{S}-T_{R} S_{S}\) is minimized. In fact, we can easily show that \(F\) is minimized even in small systems if we define the entropy as the Gibbs entropy

\[
S_{S}=-k_{B} \sum_{i} p_{i} \ln p_{i}\;.
\]
To that end, recall from our discussion of information entropy that the Boltzmann distribution is the maximum entropy distribution for a given mean energy \(\left\langle E_{S}\right\rangle\). Thus, the Boltzmann distribution maximizes the function

\[
S_{S}-\frac{\left\langle E_{S}\right\rangle}{T_{R}}=-\frac{F}{T_{R}}\;,
\]

where \(T_{R}^{-1}\) is a Lagrange multiplier, subject only to the normalization constraint \(\sum_i p_i=1\). Maximizing this function at fixed \(T_{R}\) is therefore equivalent to minimizing \(F\).
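This maximization can be checked numerically. Below is a minimal sketch (in units where \(k_B = 1\), with made-up energy levels): random perturbations of the Boltzmann distribution never increase the objective \(S - \langle E\rangle/T\).

```python
import numpy as np

# A quick numerical sketch (units k_B = 1): among all normalized distributions
# over a made-up 4-level system, the Boltzmann distribution maximizes
# S - <E>/T. Random perturbations of it never increase the objective.
rng = np.random.default_rng(1)
E = np.array([0.0, 0.5, 1.0, 2.0])   # hypothetical energy levels
T = 1.3

def objective(p):
    return -np.sum(p * np.log(p)) - np.sum(p * E) / T   # S - <E>/T

p_boltz = np.exp(-E / T)
p_boltz /= p_boltz.sum()
best = objective(p_boltz)            # equals ln Z, i.e. -F/T

for _ in range(1000):
    q = p_boltz * np.exp(0.1 * rng.normal(size=4))   # random perturbation...
    q /= q.sum()                                     # ...kept normalized
    assert objective(q) <= best + 1e-12              # never beats Boltzmann
print("Boltzmann distribution maximizes S - <E>/T")
```

Note that the maximal value of the objective is \(\ln Z\), consistent with \(F = -k_B T \ln Z\) derived below.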
So, let’s compute the Gibbs entropy:#
The Gibbs entropy depends on the probability distribution over energies,

\[
p\left(E_{S}\right)=\frac{\Omega_{S}\left(E_{S}\right) e^{-E_{S} / k_{B} T_{R}}}{Z}\;,
\]

which consists of the Boltzmann factor, the multiplicity of states \(\Omega_S(E_S)\), and the normalization factor

\[
Z=\sum_{E_{S}} \Omega_{S}\left(E_{S}\right) e^{-E_{S} / k_{B} T_{R}}\;,
\]

which is called the partition function.
The Gibbs entropy thus evaluates to

\[
S_{S}=-k_{B} \sum_{i} p_{i} \ln p_{i}=\frac{\left\langle E_{S}\right\rangle}{T_{R}}+k_{B} \ln Z\;,
\]

where the sum runs over microstates \(i\) with \(p_{i}=e^{-E_{i} / k_{B} T_{R}} / Z\), or

\[
F=\left\langle E_{S}\right\rangle-T_{R} S_{S}=-k_{B} T_{R} \ln Z\;.
\]
The fact that the free energy is given by the log of the partition function explains why the partition function is such an important object.
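A minimal numerical sketch (for a made-up two-level system, in units where \(k_B = 1\)) confirms that \(F = \langle E\rangle - T S\) with the Gibbs entropy coincides with \(-k_B T \ln Z\) even for the smallest of systems:

```python
import numpy as np

# Minimal sketch for a made-up two-level system (units k_B = 1): the Gibbs
# entropy makes F = <E> - T S coincide with -k_B T ln Z exactly, even though
# the system is as small as it gets.
E = np.array([0.0, 1.0])   # hypothetical level energies
T = 0.7

boltzmann = np.exp(-E / T)
Z = boltzmann.sum()        # partition function
p = boltzmann / Z          # Boltzmann distribution

S_gibbs = -np.sum(p * np.log(p))   # Gibbs entropy (k_B = 1)
E_mean = np.sum(p * E)             # internal energy <E>

F_thermo = E_mean - T * S_gibbs    # F = <E> - T S
F_partition = -T * np.log(Z)       # F = -k_B T ln Z

print(np.isclose(F_thermo, F_partition))  # True
```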
Quick Math Recap
We will next discuss some mathematical properties of the partition function. As a warmup, let’s recall the definition of the moment generating function

\[
\tilde{p}(t)=\left\langle e^{t x}\right\rangle=\sum_{x} p(x) e^{t x}\;,
\]

which satisfies

\[
\left.\frac{d^{n}}{d t^{n}} \tilde{p}(t)\right|_{t=0}=\left\langle x^{n}\right\rangle\;.
\]
The Taylor expansion of the cumulant generating function \(\ln \tilde p(t)\) yields the connected moments \(\langle x^n \rangle_c\),

\[
\ln \tilde{p}(t)=\sum_{n=1}^{\infty} \frac{t^{n}}{n!}\left\langle x^{n}\right\rangle_{c}\;,
\]

which are also called cumulants in probability theory. The first cumulants give the mean, variance, skewness and kurtosis. A Gaussian has an infinite series of non-vanishing moments, but only its first two cumulants are non-vanishing. That is why cumulants are generally more useful than moments.
If \(p(x)\) is normalized, \(\tilde{p}(0)=1\), so

\[
\left\langle x^{n}\right\rangle_{c}=\left.\frac{d^{n}}{d t^{n}} \ln \tilde{p}(t)\right|_{t=0}\;. \qquad (*)
\]
If \(p(x)\) is not normalized, with \(\sum_{x} p(x)=Z_{0}\), then

\[
\ln \tilde{p}(t)=\ln Z_{0}+\ln \frac{\tilde{p}(t)}{Z_{0}}\;,
\]

and the additive constant \(\ln Z_{0}\) drops out of every derivative \(\Rightarrow\) (*) still holds!
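This recap can be verified numerically. The sketch below (with made-up weights) computes the first two cumulants by finite differences of \(\ln \tilde p(t)\) at \(t=0\), and checks that the unnormalized weights give the same result as the normalized distribution:

```python
import numpy as np

# Numerical check of the recap above: the additive constant ln Z0 of an
# unnormalized distribution drops out of every derivative of ln p~(t),
# so the cumulants come out the same. Toy weights are made up for illustration.
x = np.array([0.0, 1.0, 2.0, 3.0])
w = np.array([0.5, 1.5, 1.0, 2.0])   # unnormalized weights, Z0 = w.sum() = 5
p = w / w.sum()                      # normalized counterpart

def log_mgf(weights, t):
    """Log of the (possibly unnormalized) moment generating function."""
    return np.log(np.sum(weights * np.exp(t * x)))

h = 1e-5  # step for central finite differences at t = 0

def first_two_cumulants(weights):
    c1 = (log_mgf(weights, h) - log_mgf(weights, -h)) / (2 * h)
    c2 = (log_mgf(weights, h) - 2 * log_mgf(weights, 0.0)
          + log_mgf(weights, -h)) / h**2
    return c1, c2

mean = np.sum(p * x)
var = np.sum(p * (x - mean) ** 2)

print(np.allclose(first_two_cumulants(w), first_two_cumulants(p), atol=1e-4))  # True
print(np.allclose(first_two_cumulants(w), (mean, var), atol=1e-4))             # True
```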
Properties of the Partition Function#
The first thing to note about the partition function

\[
Z(\beta)=\sum_{E_{S}} \Omega\left(E_{S}\right) e^{-\beta E_{S}}\;, \qquad \beta \equiv \frac{1}{k_{B} T_{R}}\;,
\]
is that it is the Laplace transform of \(\Omega\left(E_{S}\right)\). Since Laplace transforms can be inverted, \(Z(\beta)\) and \(\Omega(E)\) contain the same information. Likewise, their logarithms, the free energy \(F\) and the entropy \(S\), contain the same information.
Moreover, \(Z(\beta)\) can be viewed as the unnormalized moment generating function of the microcanonical distribution (with \(t=-\beta\)). Therefore, we can obtain the cumulants of the internal energy by taking derivatives of the log of \(Z\) (see Quick Math Recap),

\[
\left\langle E^{n}\right\rangle_{c}=(-1)^{n} \frac{\partial^{n}}{\partial \beta^{n}} \ln Z(\beta)\;.
\]

Check:

\[
-\frac{\partial}{\partial \beta} \ln Z=\frac{\sum_{E} E\, \Omega(E)\, e^{-\beta E}}{Z}=\langle E\rangle\;, \qquad \frac{\partial^{2}}{\partial \beta^{2}} \ln Z=\left\langle E^{2}\right\rangle-\langle E\rangle^{2}=\left\langle\delta E^{2}\right\rangle\;.
\]
Note:

\[
C_{x}=\frac{\partial\langle E\rangle}{\partial T}=\frac{1}{k_{B} T^{2}}\left\langle\delta E^{2}\right\rangle\;.
\]

This is our first encounter with a fluctuation-response relationship (also called a fluctuation-dissipation relationship): fluctuations quite generally probe the system’s potential to (linearly) respond to external forces.
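This relation is easy to verify numerically. The sketch below (with made-up energy levels, in units where \(k_B = 1\)) compares the response \(\partial\langle E\rangle/\partial T\), obtained by finite differences, with the equilibrium fluctuations \(\langle\delta E^2\rangle/T^2\):

```python
import numpy as np

# Sketch of the fluctuation-response relation for a made-up 5-level system
# (units k_B = 1): the heat capacity d<E>/dT equals <dE^2>/T^2.
E = np.array([0.0, 0.3, 0.9, 1.4, 2.0])   # hypothetical energy levels
T = 0.8

def mean_energy(temp):
    p = np.exp(-E / temp)
    p /= p.sum()
    return np.sum(p * E)

p = np.exp(-E / T)
p /= p.sum()
var_E = np.sum(p * E**2) - mean_energy(T) ** 2   # energy fluctuations <dE^2>

h = 1e-5
C = (mean_energy(T + h) - mean_energy(T - h)) / (2 * h)  # response d<E>/dT

print(np.isclose(C, var_E / T**2, atol=1e-6))  # True: fluctuation = response
```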
These relationships are super useful for experiments: We can measure how a polymer responds to a pulling force just by watching the polymer fluctuate, importantly without applying any force ourselves.
The ratio of the root-mean-square energy fluctuations to the mean internal energy quantifies the concentration of measure in the thermodynamic limit: \(\frac{\sqrt{\left\langle \delta E^{2}\right\rangle}}{\langle E\rangle}=\sqrt{k_{B} T^{2}} \frac{\sqrt{C_{x}}}{\langle E\rangle} \sim N^{-1 / 2}\), because both \(C_{x}\propto N\) and \(\langle E\rangle \propto N\) are extensive.
The concentration of measure implies that canonical and microcanonical ensembles are equivalent in the thermodynamic limit. (But they are not equivalent for small systems!)
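The \(N^{-1/2}\) scaling can be made concrete with a minimal sketch: \(N\) independent two-level subsystems (energies 0 and 1, units \(k_B = T = 1\)), for which mean and variance are simply \(N\) times the per-subsystem values.

```python
import numpy as np

# Sketch of concentration of measure: for N independent two-level subsystems
# (energies 0 and 1, k_B = T = 1), relative energy fluctuations shrink as N^{-1/2}.
T = 1.0
p1 = np.exp(-1.0 / T) / (1.0 + np.exp(-1.0 / T))  # occupation of the upper level
mean1, var1 = p1, p1 * (1.0 - p1)                  # per-subsystem mean and variance

for N in [10, 1000, 100_000]:
    rel = np.sqrt(N * var1) / (N * mean1)          # sqrt(<dE^2>) / <E>
    print(N, rel)                                  # drops 10x for each 100x in N
```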
Applications: N particles in a box#
(i) Maxwell–Boltzmann distribution
The distribution over the momentum \(\vec{p}_i\) of each particle factorizes, with the marginal

\[
f(\vec{p})=\left(2 \pi m k_{B} T\right)^{-3 / 2} e^{-\frac{\vec{p}^{\,2}}{2 m k_{B} T}}\;.
\]
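A quick sampling sketch (units \(m = k_B = T = 1\)): drawing each momentum component as an independent Gaussian of variance \(m k_B T\) reproduces the Maxwell–Boltzmann marginal and, in particular, the equipartition result \(\langle \text{KE}\rangle = \tfrac{3}{2} k_B T\).

```python
import numpy as np

# Sampling sketch of the Maxwell-Boltzmann marginal (units m = k_B = T = 1):
# each momentum component is an independent Gaussian with variance m k_B T,
# which implies the equipartition result <KE> = (3/2) k_B T.
rng = np.random.default_rng(0)
m, T = 1.0, 1.0
p = rng.normal(0.0, np.sqrt(m * T), size=(100_000, 3))  # momenta of 1e5 particles

print(np.allclose(p.var(axis=0), m * T, atol=0.05))  # <p_a^2> = m k_B T per axis
ke = (p**2).sum(axis=1) / (2 * m)
print(np.isclose(ke.mean(), 1.5 * T, atol=0.05))     # equipartition: <KE> = 3/2 k_B T
```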
Ideal gas
The partition function evaluates to

\[
Z=\frac{1}{N!} \frac{V^{N}}{\lambda^{3 N}}\;,
\]

where we introduced the characteristic length scale

\[
\lambda=\frac{h}{\sqrt{2 \pi m k_{B} T}}\;,
\]

the thermal de Broglie wavelength associated with Planck’s constant \(h\). Note that we introduced the factor \(1/N!\) to account for the fact that the \(N!\) configurations obtained by relabelling particle indices belong to the same microstate.
The free energy \(F=-k_\text{B}T \ln Z\) is obtained from the log of the partition function; using Stirling’s approximation \(\ln N! \approx N \ln N - N\),

\[
F=-k_{B} T N\left[\ln \left(\frac{V}{N \lambda^{3}}\right)+1\right]\;.
\]
Note that the free energy is extensive. If we hadn’t incorporated the combinatorial factor \(1/N!\), we would have obtained a super-extensive free energy (scaling as \(N\log N\)). This non-extensivity of the naive free energy (without \(N!\)) is called the Gibbs paradox.
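The Gibbs paradox is easy to see in numbers. In the sketch below (units \(k_B = T = 1\), lengths in units of \(\lambda\), made-up volume per particle \(v_0\)), the naive free energy per particle grows like \(\ln N\), while including the \(1/N!\) factor makes it approach the constant \(-(\ln v_0 + 1)\):

```python
import numpy as np
from math import lgamma  # lgamma(N + 1) = ln N!

# Sketch of the Gibbs paradox (units k_B = T = 1, lengths in units of lambda):
# at fixed volume per particle v0, the naive F/N grows like ln N, while
# including 1/N! makes it approach the constant -(ln v0 + 1).
v0 = 10.0   # hypothetical volume per particle, in units of lambda^3
for N in [10, 100, 1000, 10_000]:
    lnZ_naive = N * np.log(N * v0)        # ln(V^N / lambda^{3N}) with V = N v0
    lnZ = lnZ_naive - lgamma(N + 1)       # include the 1/N! factor
    print(N, -lnZ_naive / N, -lnZ / N)    # naive F/N diverges; corrected F/N -> const
```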
To obtain all the usual ideal gas laws from the free energy, we use

\[
dF=-S\, dT-p\, dV+\mu\, dN\;,
\]

so that

\[
p=-\frac{\partial F}{\partial V}=\frac{N k_{B} T}{V}\;, \qquad S=-\frac{\partial F}{\partial T}=N k_{B}\left[\ln \left(\frac{V}{N \lambda^{3}}\right)+\frac{5}{2}\right]\;,
\]

where the expression for the ideal gas entropy is called the Sackur–Tetrode formula. For this expression to be a good approximation, we have to require that the volume per particle is much larger than \(\lambda^3\),

\[
\frac{V}{N} \gg \lambda^{3}\;.
\]
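As a concrete order-of-magnitude check (my own example, in SI units with rounded constants): for helium gas at room temperature and atmospheric pressure, \(\lambda\) is about half an angstrom and the volume per particle exceeds \(\lambda^3\) by several orders of magnitude, so the classical treatment is safe.

```python
import numpy as np

# Sketch: check the classicality condition V/N >> lambda^3 for helium at room
# temperature and atmospheric pressure (SI units; constants rounded).
h_planck, kB = 6.626e-34, 1.381e-23            # Planck and Boltzmann constants
m, T, press = 4.0 * 1.661e-27, 300.0, 1.013e5  # He mass (kg), T (K), pressure (Pa)

lam = h_planck / np.sqrt(2 * np.pi * m * kB * T)  # thermal de Broglie wavelength
v_per_particle = kB * T / press                   # ideal gas law: V/N = k_B T / p

print(lam)                       # ~5e-11 m, about half an angstrom
print(v_per_particle / lam**3)   # ~3e5 >> 1: the classical treatment is valid
```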
Multiple particles trying to occupy the same “voxel” \(\lambda^3\) will interact quantum mechanically. The onset of quantum mechanical effects can be seen in a graph comparing the entropy of a classical, a Bose, and a Fermi gas (Figure from Wikipedia).