Statistical mechanics
Statistical mechanics is a branch of physics that combines the principles of statistics and mechanics to predict the behavior of systems containing large numbers of particles. It serves as a framework for explaining and deriving the thermodynamic properties of matter from the probabilistic distribution of particles and their dynamics. The main idea behind statistical mechanics is to use statistical methods to determine the macroscopic properties of a system from the microscopic details of its components: atoms, molecules, or other particles.
Introduction to microstates and macrostates
Distinguishing between microstates and macrostates is essential to understanding statistical mechanics.
A microstate refers to the specific detailed microscopic configuration of a system. If we consider a system of gas particles in a container, the microstate describes the exact position and velocity of each particle at a given time.
For example, consider a simple system with two particles, where each particle can occupy one of two boxes. The possible microscopic states can be listed as follows:
- Particle 1 in box A, particle 2 in box A.
- Particle 1 in box A, particle 2 in box B.
- Particle 1 in box B, particle 2 in box A.
- Particle 1 in box B, particle 2 in box B.
On the other hand, macrostates are defined by macroscopic properties such as temperature, pressure and volume, which can be measured without knowing the exact configuration of the particles. Thus a macrostate is a collection of many microstates. For the above system:
- One macrostate: both particles in box A (one microstate).
- Another macrostate: one particle in each box (two microstates).
- A third macrostate: both particles in box B (one microstate).
In statistical mechanics, we are interested in calculating the probability of finding a system in a given macroscopic state by counting the microscopic states that are consistent with it.
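To make the counting concrete, here is a minimal Python sketch (an illustration only, assuming distinguishable particles and equally likely microstates) that enumerates all microstates of the two-particle, two-box system and groups them into macrostates:

```python
from itertools import product
from collections import Counter

# Each of the two distinguishable particles can sit in box "A" or box "B".
particles = 2
microstates = list(product("AB", repeat=particles))  # all 2**2 = 4 microstates

# A macrostate here is characterized by how many particles are in box A.
macrostates = Counter(state.count("A") for state in microstates)

total = len(microstates)
for n_in_A, count in sorted(macrostates.items()):
    print(f"{n_in_A} particle(s) in box A: {count} microstate(s), "
          f"probability {count / total:.2f}")
```

The mixed macrostate (one particle in each box) is twice as likely as either pure macrostate because it is realized by two microstates.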
Boltzmann distribution
Ludwig Boltzmann introduced a revolutionary perspective by linking the macroscopic properties of a system to the statistics of its microscopic states. The Boltzmann distribution is a probability distribution that gives the probability of a system being in a particular state based on the energy of that state.
P(E) = \frac{e^{-E/kT}}{Z}
Here:
- P(E) is the probability of a state with energy E.
- k is the Boltzmann constant, approximately 1.38 \times 10^{-23} J/K.
- T is the absolute temperature.
- Z is the partition function, a normalization factor that ensures all probabilities sum to 1. It is obtained by summing over all possible states:

Z = \sum_i e^{-E_i/kT}
Entropy and the second law of thermodynamics
Entropy is a central concept in statistical mechanics, reflecting the degree of disorder or randomness in a system. It provides a bridge between the microscopic and macroscopic realms. The statistical definition of entropy is given by Boltzmann's formula:
S = k \ln W
where W is the number of microstates corresponding to a macrostate, and S is the entropy.
Statistical mechanics provides insight into the second law of thermodynamics, which states that the entropy of an isolated system tends to increase, driving the system toward equilibrium, the state in which entropy is maximal.
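As a small numerical illustration of Boltzmann's formula, consider a hypothetical system of N = 100 particles split between two boxes (a choice made for this sketch, not taken from the text); the macrostate with n particles in box A has W = C(N, n) microstates, so the even split carries the most entropy:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

N = 100  # hypothetical number of particles split between two boxes
for n in (0, 25, 50, 75, 100):
    W = math.comb(N, n)   # multiplicity: microstates with n particles in box A
    S = k * math.log(W)   # Boltzmann entropy S = k ln W
    print(f"n = {n:3d}: W = {W:.3e}, S = {S:.3e} J/K")
```

The multiplicity, and hence the entropy, peaks at the even split, which is why an isolated system overwhelmingly ends up near that macrostate.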
Canonical ensemble and partition function
In statistical mechanics, an ensemble is a collection of a large number of imaginary copies of a system, considered together in order to study its statistical properties. The canonical ensemble is an important type, representing a system in thermal equilibrium with a heat bath at a fixed temperature.
The partition function, denoted by Z, is central to obtaining thermodynamic quantities. It is the sum of the Boltzmann factors e^{-E_i/kT} over all possible states:

Z = \sum_{i} e^{-E_i/kT}
Many thermodynamic quantities can be derived from the partition function. For example, the Helmholtz free energy F is given by:

F = -kT \ln Z
The average energy \langle E \rangle is calculated as:

\langle E \rangle = -\frac{\partial \ln Z}{\partial \beta}, \qquad \beta = \frac{1}{kT}
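A minimal Python sketch, assuming a hypothetical two-level system with energies 0 and 1 in units where k = 1, shows both routes to \langle E \rangle: the direct probability-weighted average, and a numerical derivative of \ln Z with respect to \beta:

```python
import math

def thermo(energies, kT):
    """Return Z, Helmholtz free energy F, and average energy <E>
    for a discrete spectrum at temperature kT (same units as the energies)."""
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)
    F = -kT * math.log(Z)
    E_avg = sum(w * E for w, E in zip(weights, energies)) / Z
    return Z, F, E_avg

levels = [0.0, 1.0]  # hypothetical two-level system
Z, F, E_avg = thermo(levels, kT=1.0)
print(f"Z = {Z:.4f}, F = {F:.4f}, <E> = {E_avg:.4f}")

# Check <E> = -d(ln Z)/d(beta) with a central finite difference at beta = 1.
lnZ = lambda b: math.log(sum(math.exp(-b * E) for E in levels))
h = 1e-6
print(f"finite-difference <E> = {-(lnZ(1 + h) - lnZ(1 - h)) / (2 * h):.4f}")
```

Both routes give the same value, which is the point of the derivative formula: all the thermodynamics is encoded in \ln Z.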
Visual example: Understanding energy distribution
Consider a simple system with three energy states, with energies E_1 = 0, E_2 = 1, and E_3 = 2 (in arbitrary energy units). We will determine the probability distribution at a given temperature.
| E | e^{-E/kT} | Probability P(E) |
|---------|------------|------------------|
| E_1 = 0 | e^0 = 1 | 1/Z |
| E_2 = 1 | e^{-1/kT} | e^{-1/kT}/Z |
| E_3 = 2 | e^{-2/kT} | e^{-2/kT}/Z |
Here, Z = 1 + e^{-1/kT} + e^{-2/kT}.
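The short Python sketch below evaluates the table numerically, choosing units where kT = 1 (an illustrative assumption):

```python
import math

kT = 1.0  # illustrative choice of units
energies = [0.0, 1.0, 2.0]

Z = sum(math.exp(-E / kT) for E in energies)  # Z = 1 + e^{-1} + e^{-2}
for E in energies:
    print(f"E = {E}: P(E) = {math.exp(-E / kT) / Z:.4f}")
```

With kT = 1, Z ≈ 1.5032 and the probabilities come out to roughly 0.665, 0.245, and 0.090.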
Example: Distribution at different temperatures
Let us consider an example of how the energy distribution of a system changes with temperature.
Imagine we vary the temperature from low to high and calculate the probability of finding particles in the different energy states. Start with T = 1/k and increase it to see how the probabilities evolve:
T = 1/k: P(E_1) > P(E_2) > P(E_3)
T \to \infty: P(E_1) \approx P(E_2) \approx P(E_3)
As the temperature increases, the probabilities approach one another, reflecting an increase in entropy and disorder; they become equal only in the infinite-temperature limit.
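A short Python sweep over temperature, reusing the three-level spectrum above (still in units where k = 1), makes this equalization visible:

```python
import math

energies = [0.0, 1.0, 2.0]

# Sweep kT from low to high and watch the probabilities approach each other.
for kT in (0.5, 1.0, 2.0, 10.0, 100.0):
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)
    probs = ", ".join(f"{w / Z:.3f}" for w in weights)
    print(f"kT = {kT:6.1f}: P = [{probs}]")
```

At kT = 0.5 the ground state dominates; by kT = 100 the three probabilities are nearly equal at about 1/3 each.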
Thermodynamic potentials and stability
Statistical mechanics connects to thermodynamics through thermodynamic potentials – functions such as internal energy, Helmholtz free energy, Gibbs free energy, and enthalpy – which determine equilibrium and stability.
These potentials are interconnected and convert into one another via Legendre transformations, allowing scientists to choose the most convenient function based on which variables are held fixed. For example:
- The Helmholtz free energy F = U - TS is relevant for systems at constant volume and temperature.
- The Gibbs free energy G = H - TS fits systems at constant pressure and temperature.
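As a consistency check connecting the two definitions, the sketch below verifies for the three-state example that F = -kT \ln Z equals U - TS, taking U = \langle E \rangle and the Gibbs entropy S = -k \sum_i p_i \ln p_i (a standard result, though not derived in this article); units with k = T = 1 are an illustrative assumption:

```python
import math

kT = 1.0                     # units where k = 1 and T = 1
energies = [0.0, 1.0, 2.0]   # the three-state example from earlier

weights = [math.exp(-E / kT) for E in energies]
Z = sum(weights)
p = [w / Z for w in weights]

U = sum(pi * E for pi, E in zip(p, energies))  # internal energy <E>
S = -sum(pi * math.log(pi) for pi in p)        # Gibbs entropy (k = 1)
print(f"F from -kT ln Z: {-kT * math.log(Z):.6f}")
print(f"F from U - TS  : {U - 1.0 * S:.6f}")   # the two should agree
```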
Role of thermodynamic potentials in phase transitions
Phase transitions, such as the change from a liquid to a gas, are major applications of statistical mechanics. Gibbs free energy plays an important role in analyzing these transformations. At equilibrium, the chemical potentials of the different phases become equal, leading to specific criteria for transitions:
Use of Gibbs free energy:
\Delta G = G_2 - G_1 = (H_2 - H_1) - T(S_2 - S_1)
This determines the energy cost associated with the phase change and helps relate the latent heat to the transition temperature.
Visual example: Phase change
Consider the transition of water between phases; the condition on the Gibbs free energy predicts when the transition occurs. Equilibrium between liquid and gas means that the Gibbs energy per molecule is the same in both phases at the boiling point.
\Delta G = 0
This condition defines the phase boundary: the phase with the lower Gibbs energy is the stable one on either side of it.
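As a rough numerical sketch, solving \Delta G = \Delta H - T \Delta S = 0 for T recovers the boiling point; the values below are approximate textbook figures for water's vaporization at 1 atm (assumptions for illustration):

```python
# Estimate water's boiling point from Delta G = Delta H - T * Delta S = 0.
dH = 40_700.0   # J/mol, approximate enthalpy of vaporization
dS = 109.0      # J/(mol*K), approximate entropy of vaporization

T_boil = dH / dS  # temperature where Delta G changes sign
print(f"Estimated boiling point: {T_boil:.0f} K (~{T_boil - 273.15:.0f} degrees C)")
```

This lands within a degree or so of 373 K; below that temperature \Delta G > 0 and the liquid is stable, above it \Delta G < 0 and the gas is favored.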
Quantum statistical mechanics
The principles of statistical mechanics extend to quantum systems and introduce new statistics:
- Bose–Einstein statistics for bosons, particles that can occupy the same quantum state in any number. Examples include the photon and the helium-4 atom.
- Fermi–Dirac statistics for fermions, such as electrons and protons, which obey the Pauli exclusion principle: no two identical fermions can occupy the same quantum state.
The distribution function for bosons, called the Bose–Einstein distribution, is:
f_B(E) = \frac{1}{e^{(E-\mu)/kT} - 1}
while for fermions, the Fermi–Dirac distribution is:
f_F(E) = \frac{1}{e^{(E-\mu)/kT} + 1}
Here, \mu is the chemical potential. Both distributions are important in understanding phenomena such as quantum gases, superconductivity, and electron behavior in metals.
Visual example: Distribution function
Here is a comparison between classical and quantum statistical distributions. Classical Maxwell–Boltzmann statistics for distinguishable particles differs from the behavior of indistinguishable particles described by the quantum distributions:
Classical: f_{MB}(E) = e^{-E/kT}
Bose–Einstein: f_B(E)
Fermi–Dirac: f_F(E)
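To see how they differ, here is a minimal Python sketch evaluating all three occupation functions at a few energies, with \mu = 0 and kT = 1 chosen purely for illustration:

```python
import math

def f_MB(E, kT):        # classical Maxwell-Boltzmann factor
    return math.exp(-E / kT)

def f_BE(E, mu, kT):    # Bose-Einstein occupation (requires E > mu)
    return 1.0 / (math.exp((E - mu) / kT) - 1.0)

def f_FD(E, mu, kT):    # Fermi-Dirac occupation
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

kT, mu = 1.0, 0.0       # illustrative values only
for E in (0.5, 1.0, 2.0, 4.0):
    print(f"E = {E}: MB = {f_MB(E, kT):.3f}, "
          f"BE = {f_BE(E, mu, kT):.3f}, FD = {f_FD(E, mu, kT):.3f}")
# At large E all three nearly agree; at low E the quantum distributions diverge.
```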
These distributions serve as the basis for advances in modern physics and quantum mechanics.
Applications in modern physics
Statistical mechanics enhances our understanding of various physical phenomena and modern technologies. For example, it is important for:
- Condensed matter physics: Understanding the properties of solids, liquids, and exotic states of matter such as superfluids.
- Thermodynamic cycles: Designing efficient engines, refrigeration systems and solar cells.
- Quantum computing: Developing qubits and controlling quantum states.
- Astrophysics: Modeling the interiors of stars and the cosmic microwave background radiation.
Conclusion
Statistical mechanics serves as a bridge between the microscopic and macroscopic worlds, providing profound insights into the laws of nature. By statistically analyzing microscopic behavior, physicists can predict and derive macroscopic properties and understand the fundamental processes that govern a wide variety of physical systems. With real-world applications ranging from technology to cosmology, statistical mechanics remains vital to scientific advancement and practical innovation.