Fundamentals of Thermodynamics

1. Macroscopic Description of Physical Systems

A physical system contains an enormous number of microscopic degrees of freedom—positions and momenta of atoms, electrons, vibrational modes, and internal electronic states. Tracking these directly is neither feasible nor necessary. Instead, thermodynamics provides a macroscopic description based on a small number of variables that are experimentally accessible and sufficient to predict equilibrium behavior.

These variables are called state variables. A state variable has the defining property that its value depends only on the present state of the system, not on how that state was reached.

Common state variables include internal energy \( U \), entropy \( S \), volume \( V \), pressure \( P \), temperature \( T \), and the amounts of chemical species \( N_i \). The central task of thermodynamics is to relate these quantities and determine how changes in one necessitate changes in the others.

2. Internal Energy

The internal energy \( U \) is the total microscopic energy contained in a system. It includes translational kinetic energy of molecules, rotational and vibrational energy, electronic excitation energy, chemical bond energy, and intermolecular interaction energy. Internal energy does not include macroscopic kinetic energy or potential energy due to external fields unless explicitly stated.

Internal energy has units of joules (J). In practice, the absolute value of \( U \) is rarely needed. What is measurable and relevant are changes in internal energy, denoted \( \Delta U \).

2.1 Measuring Changes in Internal Energy

For many systems, changes in internal energy are related to temperature changes through a material property called the heat capacity. The constant-volume heat capacity, denoted \( C_V \), is defined as

$$ C_V \equiv \left( \frac{\partial U}{\partial T} \right)_V $$

This definition states that \( C_V \) measures how much the internal energy increases when the temperature increases by one kelvin while the volume is held fixed. Heat capacity has units of joules per kelvin (J/K), or joules per mole per kelvin (J/(mol·K)) when expressed per mole.

2.2 Example: Heating a Gas at Constant Volume

Consider one mole of an ideal monatomic gas, such as argon, confined in a rigid container. For a monatomic ideal gas, the constant-volume heat capacity is

$$ C_V = \frac{3}{2} R $$

where \( R = 8.314 \,\mathrm{J/(mol\,K)} \) is the universal gas constant. Numerically, this gives \( C_V \approx 12.47 \,\mathrm{J/(mol\,K)} \).

If the temperature of the gas increases by \( \Delta T = 10 \,\mathrm{K} \), the corresponding change in internal energy is

$$ \Delta U = C_V \Delta T = (12.47 \,\mathrm{J/(mol\,K)})(10 \,\mathrm{K}) = 124.7 \,\mathrm{J} \approx 125 \,\mathrm{J}. $$

This result follows directly from the definition of heat capacity under constant-volume conditions; it is not a general law but a consequence of the constraints imposed on the system.
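The arithmetic of the worked example can be checked in a few lines; this is a minimal sketch, with variable names chosen for illustration:

```python
# Numeric check: one mole of a monatomic ideal gas heated by 10 K at
# constant volume, so Delta U = C_V * Delta T with C_V = (3/2) R.
R = 8.314            # J/(mol K), universal gas constant
C_V = 1.5 * R        # J/(mol K), monatomic ideal gas
dT = 10.0            # K, temperature increase
dU = C_V * dT        # J, change in internal energy
print(round(dU, 1))  # 124.7
```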

3. Temperature

Temperature is not simply “average energy.” It is a thermodynamic parameter that governs how energy is distributed among microscopic states. Formally, temperature is defined through the relationship

$$ \frac{1}{T} \equiv \left( \frac{\partial S}{\partial U} \right)_{V,N}. $$

This definition implies that temperature measures how rapidly the number of accessible microscopic configurations increases as energy is added to the system. Temperature is measured in kelvin (K) and determines both the direction of heat flow and the relative probability of occupying higher-energy states.
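The defining relation \( 1/T = (\partial S/\partial U)_{V,N} \) can be tested numerically. The sketch below assumes the monatomic-ideal-gas result that \( S(U) = \tfrac{3}{2} n R \ln U + \text{const} \) (the constant drops out of the derivative), which together with \( U = \tfrac{3}{2} n R T \) should recover the temperature:

```python
import math

R = 8.314              # J/(mol K)
n = 1.0                # mol
T = 300.0              # K
U = 1.5 * n * R * T    # J, internal energy of a monatomic ideal gas

def S(u):
    # Entropy up to a U-independent constant (assumed form for this gas);
    # the constant does not affect dS/dU.
    return 1.5 * n * R * math.log(u)

dU = 1e-3
dS_dU = (S(U + dU) - S(U - dU)) / (2 * dU)   # central finite difference
T_recovered = 1.0 / dS_dU                    # should reproduce T = 300 K
print(round(T_recovered, 2))  # 300.0
```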

4. Entropy

Entropy quantifies how energy is distributed among the microscopic configurations consistent with macroscopic constraints. From statistical mechanics, entropy is defined as

$$ S = k_B \ln \Omega, $$

where \( k_B = 1.38 \times 10^{-23} \,\mathrm{J/K} \) is Boltzmann’s constant and \( \Omega \) is the number of accessible microstates. Entropy has units of joules per kelvin (J/K).
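Boltzmann's formula can be made concrete with a toy model. The sketch below (an illustrative assumption, not a system discussed above) counts the ways to distribute \( n \) excitations among \( N \) sites, \( \Omega = \binom{N}{n} \), and shows that entropy peaks when the excitations are spread evenly:

```python
import math

kB = 1.380649e-23  # J/K, Boltzmann's constant (exact SI value)

def entropy(N, n):
    omega = math.comb(N, n)       # number of accessible microstates
    return kB * math.log(omega)   # S = kB ln(Omega)

# A single microstate (n = 0) gives Omega = 1 and hence S = 0;
# entropy is largest when excitations are spread most evenly (n = N/2).
S_even = entropy(100, 50)
S_skew = entropy(100, 5)
print(S_even > S_skew)  # True
```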

4.1 Operational Meaning of Entropy Change

Changes in entropy can be measured experimentally through reversible heat transfer. For a reversible process,

$$ dS = \frac{\delta Q_{\text{rev}}}{T}. $$

For heating at constant pressure, the entropy change can be written as

$$ \Delta S = \int_{T_1}^{T_2} \frac{C_P}{T} \, dT, $$

where \( C_P \) is the constant-pressure heat capacity.
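When \( C_P \) is approximately constant over the temperature range, the integral evaluates in closed form to \( \Delta S = C_P \ln(T_2/T_1) \). A sketch, using liquid water's molar heat capacity as an illustrative assumption:

```python
import math

# Entropy change for heating at constant pressure with C_P ~ constant:
# Delta S = integral of C_P/T dT = C_P * ln(T2/T1).
C_P = 75.3                # J/(mol K), approx. molar heat capacity of liquid water
T1, T2 = 298.15, 373.15   # K, roughly room temperature to boiling

dS = C_P * math.log(T2 / T1)   # J/(mol K)
print(round(dS, 1))  # 16.9
```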

5. The First Law in Differential Form

Energy conservation, combined with the identification of distinct work modes, leads to the fundamental thermodynamic identity

$$ dU = T\,dS - P\,dV + \sum_i \mu_i \, dN_i. $$

Each term represents a distinct channel through which energy can be exchanged: \( T\,dS \) is reversible heat transfer, \( -P\,dV \) is mechanical work of compression, and \( \mu_i\,dN_i \) is chemical work associated with changing the amount of species \( i \). This equation is not an independent postulate; it follows from energy conservation once every mode of energy exchange has been accounted for.

6. Chemical Potential

The chemical potential \( \mu_i \) of species \( i \) is defined as

$$ \mu_i \equiv \left( \frac{\partial U}{\partial N_i} \right)_{S,V,N_{j\neq i}}. $$

This definition states that the chemical potential is the energy added per mole of species \( i \) introduced into the system while entropy, volume, and the amounts of all other species are held fixed. Chemical potential has units of joules per mole (J/mol) and governs diffusion, chemical reactions, phase equilibrium, and electrochemical behavior.

6.1 Composition Dependence

For most systems, the chemical potential can be written as

$$ \mu_i = \mu_i^\circ(T,P) + RT \ln a_i, $$

where \( \mu_i^\circ \) is the standard chemical potential and \( a_i \) is the activity. The logarithmic dependence arises from the entropy of mixing.
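The logarithmic form implies that only activity *ratios* matter for chemical potential differences: \( \Delta\mu_i = RT \ln(a_i'/a_i) \). A sketch at room temperature, with the function name chosen for illustration:

```python
import math

R = 8.314      # J/(mol K)
T = 298.15     # K

def mu(mu0, a):
    # mu = mu0 + R T ln(a); mu0 is the standard chemical potential
    return mu0 + R * T * math.log(a)

# A tenfold increase in activity raises mu by R T ln(10), about 5.7 kJ/mol,
# independent of the standard-state value mu0.
delta_mu = mu(0.0, 1.0) - mu(0.0, 0.1)   # J/mol
print(round(delta_mu))  # 5708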

7. Gibbs Free Energy

Many physical and chemical processes occur under approximately constant temperature and pressure. Under these conditions, it is convenient to define the Gibbs free energy as

$$ G \equiv U - TS + PV. $$

Differentiating this expression gives \( dG = dU - T\,dS - S\,dT + P\,dV + V\,dP \); substituting the fundamental identity \( dU = T\,dS - P\,dV + \sum_i \mu_i\,dN_i \) cancels the \( T\,dS \) and \( P\,dV \) terms, leaving

$$ dG = -S\,dT + V\,dP + \sum_i \mu_i dN_i. $$

At constant temperature and pressure, this reduces to

$$ dG = \sum_i \mu_i dN_i, $$

which forms the basis for understanding chemical equilibrium, battery voltages, combustion energetics, and phase stability.

8. Electrical Work and Electrochemical Potential

When charged species move, electrical work must be accounted for. The electrical work associated with moving a charge \( dq \) through an electric potential \( \phi \) is given by

$$ \delta W_{\text{elec}} = \phi \, dq. $$

For one mole of ions with charge number \( z_i \), the transferred charge is \( q = z_i F \), where \( F = 96485 \,\mathrm{C/mol} \) is Faraday’s constant. This leads to the definition of the electrochemical potential:

$$ \tilde{\mu}_i = \mu_i + z_i F \phi. $$

This quantity determines whether charged species will move between regions of differing electrical potential.
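At constant \( T \) and \( P \), the electrical work obtainable from a cell reaction relates the Gibbs energy change to the equilibrium cell voltage through \( \Delta G = -nFE \). A sketch using the standard tabulated \( \Delta G \) for formation of liquid water (an assumed illustrative value, not derived above):

```python
# Relating Gibbs free energy to cell voltage via Delta G = -n F E.
F = 96485.0    # C/mol, Faraday's constant
dG = -237.0e3  # J/mol, standard Delta G for H2 + 1/2 O2 -> H2O(l) (tabulated)
n = 2          # electrons transferred per mole of H2

E = -dG / (n * F)   # V, equilibrium (open-circuit) cell voltage
print(round(E, 2))  # 1.23
```

This is the familiar ~1.23 V ideal voltage of a hydrogen fuel cell.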

9. Batteries, Combustion, and Explosions

At this stage, no new variables are required to describe batteries, flames, or explosions. The same thermodynamic quantities govern all three. Batteries are dominated by gradients in electrochemical potential and controlled charge transport. Combustion is dominated by thermal energy release and heat transport. Explosions are dominated by rapid pressure buildup and mechanical work. These regimes are not mutually exclusive, and systems can transition between them when constraints change.

A battery can explode if the mechanisms that regulate charge transport fail: internal short circuits, excessive current draw, or thermal runaway can cause electrochemical energy to couple into heat faster than it can be dissipated, raising temperature, accelerating side reactions, generating gas, and ultimately converting a controlled electrochemical process into rapid pressure-driven mechanical failure.

Conversely, many explosive compounds can function as propellants when their reaction rate is deliberately constrained. By formulating the material to burn via subsonic deflagration rather than detonation, and by allowing expanding gases to escape through a nozzle, chemical energy is released primarily as directed momentum rather than as a destructive shock. In physical terms, the key change in both cases is the hierarchy of timescales: when charge, heat, or pressure transport is artificially slowed or accelerated relative to reaction kinetics, energy is redirected into a different channel, causing the same underlying chemistry to manifest as electrical work, thrust, or catastrophic explosion.

The fundamental distinction between these regimes is not the nature of energy itself, but which transport process equilibrates slowest and therefore controls the dynamics of energy release.