Entropy isn't just about heat and energy. It's a measure of disorder at the microscopic level, counting the ways particles can arrange themselves. The more ways, the higher the entropy.
This statistical view of entropy connects to the thermodynamic definition. It explains why heat flows from hot to cold and why some processes are irreversible. Understanding this helps predict how systems change over time.
Statistical Interpretation of Entropy
Microscopic interpretation of entropy
- Entropy measures the number of possible microscopic configurations or microstates of a system
- Microstate: a specific arrangement of the particles in a system; a macrostate is the overall state (e.g., given total energy) that many different microstates can realize
- Higher entropy corresponds to more microstates
- Boltzmann equation relates entropy to the number of microstates: $S = k_B \ln \Omega$
  - $S$: entropy of the system
  - $k_B$: Boltzmann constant ($1.38 \times 10^{-23}$ J/K)
  - $\Omega$: number of microstates available to the system
- The logarithm makes entropy additive: two independent systems have $\Omega = \Omega_1 \Omega_2$ combined microstates but $S = S_1 + S_2$ total entropy (see the numerical sketch after this list)
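To make the numbers concrete, here is a minimal Python sketch of the Boltzmann equation; the function name `boltzmann_entropy` and the choice of $N$ two-state spins with $\Omega = 2^N$ are illustrative assumptions, not part of the notes above.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega: float) -> float:
    """Boltzmann equation: S = k_B * ln(Omega)."""
    return K_B * math.log(omega)

print(boltzmann_entropy(15))  # a system with 15 microstates

# For N independent two-state particles (e.g., spins), Omega = 2^N.
# Use ln(2^N) = N * ln(2) so large N does not overflow a float.
N = 1_000_000
S = K_B * N * math.log(2)
print(f"S = {S:.3e} J/K for {N} two-state particles")
```

Note the additivity at work: doubling $N$ doubles $S$, even though it squares $\Omega$.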
Statistical vs thermodynamic entropy
- Thermodynamic definition of entropy, based on heat transfer in a reversible process: $dS = \frac{\delta Q_{rev}}{T}$
  - $dS$: change in entropy
  - $\delta Q_{rev}$: heat transferred in a reversible process ($\delta$ rather than $d$ because heat is path-dependent, not a state function)
  - $T$: absolute temperature
- Statistical definition of entropy (Boltzmann equation) consistent with the thermodynamic definition
- More microstates correspond to higher entropy
- Heat transfer to a system increases the number of accessible microstates, raising entropy (see the two-reservoir example after this list)
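To connect the thermodynamic definition to irreversibility, the sketch below applies $dS = \delta Q_{rev}/T$ to heat leaking from a hot reservoir to a cold one; the values of `Q`, `T_hot`, and `T_cold` are illustrative.

```python
# Entropy change when heat Q flows from a hot reservoir to a cold one.
# Each reservoir is large enough that its temperature stays fixed,
# so dS = dQ_rev / T integrates to simply Q / T for each reservoir.
Q = 100.0       # heat transferred, J (illustrative)
T_hot = 400.0   # K
T_cold = 300.0  # K

dS_hot = -Q / T_hot    # hot reservoir loses heat
dS_cold = +Q / T_cold  # cold reservoir gains heat
dS_total = dS_hot + dS_cold

print(f"dS_hot   = {dS_hot:+.4f} J/K")
print(f"dS_cold  = {dS_cold:+.4f} J/K")
print(f"dS_total = {dS_total:+.4f} J/K")  # > 0: the process is irreversible
```

The total is positive because the cold reservoir gains more entropy per joule ($1/T_{cold} > 1/T_{hot}$), which is why heat spontaneously flows from hot to cold but never the reverse.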
Entropy calculations for simple configurations
- Calculate entropy using the statistical approach by determining the number of microstates ($\Omega$) and applying the Boltzmann equation: $S = k_B \ln \Omega$
- For simple systems (ideal gas, Einstein solid), count the microstates with combinatorial methods; e.g., an Einstein solid of $N$ oscillators sharing $q$ energy quanta has $\Omega = \binom{q+N-1}{q}$
- Ideal gas: number of microstates related to possible arrangements of particles in available energy levels
- Substitute the microstate count into the Boltzmann equation to obtain the entropy (worked example after this list)
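A short worked example, assuming the standard Einstein-solid multiplicity $\Omega = \binom{q+N-1}{q}$ quoted above; the function name and the small values $N = 3$, $q = 4$ are illustrative.

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def einstein_solid_entropy(n_oscillators: int, q_quanta: int) -> float:
    """S = k_B * ln(Omega), where Omega = C(q + N - 1, q) counts the ways
    to distribute q indistinguishable energy quanta among N oscillators."""
    omega = comb(q_quanta + n_oscillators - 1, q_quanta)
    return K_B * log(omega)

# 3 oscillators sharing 4 quanta: Omega = C(6, 4) = 15 microstates
print(comb(4 + 3 - 1, 4))                       # 15
print(f"{einstein_solid_entropy(3, 4):.3e} J/K")
```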
Entropy and system microstates
- Fundamental postulate: in an isolated system at equilibrium, every accessible microstate is equally probable
- The probability of a macrostate is therefore proportional to the number of microstates that realize it: $P = \Omega_{\text{macrostate}} / \Omega_{\text{total}}$
- A macrostate with more microstates is more probable than one with fewer, and it is also the one with higher entropy
- Second law of thermodynamics: entropy of an isolated system always increases or remains constant
- The system naturally evolves towards the most probable macrostate, the one with the greatest number of microstates and hence the highest entropy (see the counting example after this list)
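To see why the most probable macrostate dominates, the sketch below counts microstates for $N$ particles that can each sit in the left or right half of a box; $N = 100$ and the sampled macrostates are illustrative choices.

```python
from math import comb

# N particles, each independently in the left or right half of a box.
# A macrostate "n particles on the left" is realized by C(N, n) of the
# 2^N equally probable microstates.
N = 100
total = 2 ** N

for n_left in (0, 25, 50):
    omega = comb(N, n_left)
    print(f"n_left = {n_left:3d}: Omega = {omega:.3e}, P = {omega / total:.3e}")

# The even split (n_left = 50) has ~1e29 microstates, while n_left = 0 has
# just one, so the system is overwhelmingly likely to be found near 50/50.
```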