Minimum entropy production principle
Christian Maes and Karel Netočný (2013), Scholarpedia, 8(7):9664. doi:10.4249/scholarpedia.9664
The MINimum Entropy Production principle (MINEP) is an approximate variational characterization of steady states for thermodynamically open systems maintained out of equilibrium. Originally formulated within the framework of linear irreversible thermodynamics (Prigogine 1947), it was extended to stochastic kinetics, e.g., for close-to-equilibrium systems described by a Master equation (Klein and Meijer 1954). The MINEP is consistent with, yet distinct from, other nonequilibrium variational principles such as the MAXimum entropy production principle (Paltridge 1979) or the Least dissipation principle (Onsager and Machlup 1953). Recent dynamical fluctuation theories provide a framework for their precise formulation, unification and systematic improvement.
Entropy production in irreversible thermodynamics
Irreversible thermodynamics considers systems consisting of several subsystems, each of them at thermal equilibrium although global thermodynamic equilibrium is broken. Within that framework the total entropy is defined as the sum of the equilibrium entropies of all subsystems, including the environment (usually represented in terms of boundary conditions) in case the system is not isolated. There can even be a continuum of subsystems, in which case we speak of local equilibrium with local temperature, pressure or chemical potential, and we introduce the entropy density.
Due to gradients of thermodynamic parameters (like temperature or chemical potential), called thermodynamic forces $F_\alpha$, there exist flows between the subsystems (like those of energy or mass), called thermodynamic fluxes $J_\alpha$. Forces and fluxes are naturally paired with each other so that the entropy production, measuring the rate of increase of the total entropy, has the form \[ \sigma = \sum_\alpha J_\alpha F_\alpha\,. \] The second law of thermodynamics requires $\sigma \geq 0$.
Linear irreversible thermodynamics assumes \[ J_\alpha = \sum_\gamma L_{\alpha\gamma} F_\gamma \] with linear response coefficients $L_{\alpha\gamma}$ satisfying the Onsager-Casimir reciprocity relations $L_{\alpha\gamma} = L^*_{\gamma\alpha}$. The star indicates time-reversal of the microscopic dynamics, e.g., it inverts the sign of velocities and magnetic fields. In "even" systems $L_{\alpha\gamma} = L^*_{\alpha\gamma}$ and then the matrix $[L_{\alpha\gamma}]$ is symmetric and positive-definite.
Minimum entropy production
MINEP states that whenever an even system ($L_{\alpha\gamma} = L_{\gamma\alpha}$) is driven out of equilibrium by applying time-independent constraints on the thermodynamic forces, then it approaches the steady state characterized by the minimum of the entropy production functional $\sigma = \sum_{\alpha\gamma} L_{\alpha\gamma} F_\alpha F_\gamma$.
In particular, if some but not all of the thermodynamic forces $F_1,\ldots,F_m$ are externally constrained then the steady values of the remaining forces $F_{m+1},\ldots,F_n$ are such that the corresponding currents vanish, \[ J_\alpha = 0\,,\qquad \alpha = m+1,\ldots,n\,. \]
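As an illustration, the following minimal Python sketch minimizes the quadratic entropy production over the unconstrained forces and checks that the conjugate fluxes vanish at the minimum; the matrix $L$ and the constrained force are arbitrary illustrative values, not taken from any particular physical model.

```python
# Hedged sketch: minimise sigma = F^T L F over the unconstrained forces and
# check that the conjugate fluxes J = L F vanish there.  The symmetric
# positive-definite matrix L and the constrained force F1 are arbitrary.
import numpy as np
from scipy.optimize import minimize

L = np.array([[2.0, 0.5, 0.3],
              [0.5, 1.5, 0.2],
              [0.3, 0.2, 1.0]])      # even system: L symmetric, positive definite

F1 = 1.0                             # externally constrained force (m = 1)

def sigma(F_free):                   # entropy production as a function of F2, F3
    F = np.concatenate(([F1], F_free))
    return F @ L @ F

res = minimize(sigma, x0=np.zeros(2))
F_min = np.concatenate(([F1], res.x))
J = L @ F_min                        # conjugate fluxes at the minimum

print("unconstrained fluxes J2, J3:", J[1:])   # both are (numerically) zero
```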
A dynamical form of the MINEP states that if the steady state is initially perturbed then the system returns to stationarity along a trajectory on which the entropy production is decreasing, i.e., that $\sigma(t)$ is a Lyapunov function.
Example A: Heat conduction
For pure heat conduction, \[ \sigma = \int_\Omega J_Q \cdot \nabla \bigl( \frac{1}{T} \bigr)\,d V\,,\qquad J_Q = \kappa\, \nabla \bigl( \frac{1}{T} \bigr) \] with $\nabla (1/T)$ the thermodynamic force and $J_Q$ the heat flux. The two are related by Fourier's law, where the thermal conductivity $\kappa > 0$ is assumed to be temperature-independent. Fixing the temperature profile on the boundary $\partial\Omega$, the entropy production is minimal for the bulk temperature profile satisfying the boundary condition and the stationary heat equation $\nabla \cdot J_Q = 0$.
The time-dependent heat equation $C(\partial T / \partial t) + \nabla \cdot J_Q = 0$ (with $C > 0$ the heat capacity) describes relaxation to the steady state. Along its solutions, $d\sigma / d t < 0$, i.e., the entropy production is monotone decreasing in time.
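Both statements can be checked in a minimal one-dimensional discretization; the grid, time step and boundary temperatures below are illustrative choices, not taken from the article.

```python
# Hedged sketch of Example A: the two boundary temperatures are held fixed,
# the bulk relaxes under C dT/dt = -dJ_Q/dx with J_Q = kappa d(1/T)/dx, and the
# entropy production sigma = sum_links kappa (d(1/T)/dx)^2 dx decreases in time.
import numpy as np

N = 50
dx, dt, kappa, C = 1.0/N, 5e-5, 1.0, 1.0

T = np.full(N + 1, 1.5)                 # arbitrary initial bulk profile
T[0], T[-1] = 1.0, 2.0                  # fixed boundary temperatures

def entropy_production(T):
    grad_inv_T = np.diff(1.0/T)/dx      # thermodynamic force on each link
    return kappa*np.sum(grad_inv_T**2)*dx

sigmas = [entropy_production(T)]
for _ in range(100000):
    J_Q = kappa*np.diff(1.0/T)/dx       # heat flux on the links (Fourier law)
    T[1:-1] -= (dt/C)*np.diff(J_Q)/dx   # C dT/dt + dJ_Q/dx = 0 in the bulk
    sigmas.append(entropy_production(T))

print("sigma(t) monotonically decreasing:", np.all(np.diff(sigmas) <= 1e-12))
# stationary profile: 1/T interpolates linearly between the boundary values
print("max deviation of 1/T from a linear profile:",
      np.max(np.abs(1.0/T - np.linspace(1.0, 0.5, N + 1))))
```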
Example B: Electrical networks
In a linear electrical network made only of resistors and batteries the entropy production is proportional to the total Joule heat dissipated in the resistors. The voltages across the resistors are constrained by Kirchhoff's second (voltage) law: along any closed loop their sum balances the sum of the electromotive forces. Under these constraints, the MINEP is equivalent to Kirchhoff's first (current) law: the sum of the currents entering every node is zero.
For example, two resistors with resistances $R_1$, $R_2$ in series with electromotive force $E$ yield the entropy production \[ \sigma = \frac{1}{T} \Bigl( \frac{U_1^2}{R_1} + \frac{U_2^2}{R_2} \Bigr) \] where the voltages are constrained by $U_1 + U_2 = E$. Thus, $\sigma$ takes its minimum for $U_1 / R_1 = U_2 / R_2$.
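A quick numerical check, with arbitrary illustrative values of $R_1$, $R_2$, $E$ and $T$:

```python
# Hedged sketch of Example B: minimise sigma(U1) = (U1^2/R1 + (E-U1)^2/R2)/T
# over U1 and compare with the Kirchhoff current E/(R1+R2) through the loop.
import numpy as np
from scipy.optimize import minimize_scalar

R1, R2, E, T = 2.0, 3.0, 10.0, 300.0
sigma = lambda U1: (U1**2/R1 + (E - U1)**2/R2)/T

U1 = minimize_scalar(sigma).x
print("U1/R1 =", U1/R1, "  U2/R2 =", (E - U1)/R2, "  E/(R1+R2) =", E/(R1 + R2))
```

All three printed values coincide: the minimum of the entropy production reproduces the current given by Kirchhoff's laws.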
The network above acquires a smooth dynamics when capacitors are added in parallel with the resistors. The steady state then remains unchanged, and the entropy production is monotonically decreasing in time as the network approaches that steady regime.
Breaking of MINEP
The MINEP is generally not valid for systems which are either non-linear or non-even.
- The steady voltages in Example B are temperature-independent and generally correspond to the minimum of the total Joule heat. Hence, the MINEP manifestly breaks down whenever the resistors dissipate at different temperatures. The temperature gradient provides another thermodynamic force which here does not couple to the electrical current; it enters, however, the linear response coefficients $L_i = 1 / (R_i T_i)$, yielding non-linear effects beyond the scope of the linear theory (Jaynes 1980). A similar situation occurs in Example A when the thermal conductivity $\kappa(T)$ is temperature-dependent.
- The principle can break down even at homogeneous temperature, e.g., for a resistance $R$ in series with an inductance $L$ and driven by electromotive force $E$: the steady current is $I = E / R$ and hence it obviously does not minimize the entropy production $\sigma = R I^2 / T$ as a function of the current (Landauer 1975). The point is that the current $I$, which here plays the role of thermodynamic force, is odd under time-reversal, violating the evenness condition.
Entropy production in stochastic kinetics
Stochastic kinetics describes fluctuating and thermodynamically open systems which interact with an environment consisting of one or several equilibrium reservoirs. Physical states of the system are described by probability distributions on configurations, and its dynamics by probability distributions on their time sequences (trajectories). Steady states generally have non-Boltzmann distributions and non-zero currents, unless the attached reservoirs are all at mutual equilibrium.
Local detailed balance
The hypothesis of local detailed balance identifies the inherent asymmetry of Markov transition rates with entropy changes. It is a stochastic counterpart of the time-reversal invariance property of microscopic Hamiltonian systems.
Local detailed balance states that for every single transition $x \rightarrow y$ between two configurations of the system, the antisymmetric component of the logarithm of the transition rate $k(x,y)$ corresponds to the increase of entropy in the environment, \[ k_B \log\frac{k(x,y)}{k(y,x)} = \Delta S_\text{env}(x \rightarrow y)\,. \] The environment may consist of several reservoirs not at mutual equilibrium; each transition is assisted by only one of them.
The entropy change in the reservoir is determined from the heat and the particle flow. E.g., for a heat reservoir at temperature $T$ involved in the transition $x\rightarrow y$, $\Delta S_\text{env}(x \rightarrow y) = \Delta Q(x \rightarrow y) / T$ with $\Delta Q$ the heat leaving the system, equal to the change of energy in the reservoir.
- If the system is coupled to a single heat reservoir and driven by conservative forces with potential $E(x)$ then $\Delta S_\text{env}(x \rightarrow y) = [E(x) - E(y)] / T$, yielding the usual (global) detailed balance condition of equilibrium dynamics,
\[ e^{-\beta E(x)} k(x,y) = e^{-\beta E(y)} k(y,x)\,,\qquad \beta = \frac{1}{k_B T}\,. \]
- The general global detailed balance condition requires that $\sum_{i=0}^n \Delta S_\text{env}(x_i \rightarrow x_{i+1}) = 0$ for any closed sequence of configurations $x_0 \rightarrow \ldots \rightarrow x_n \rightarrow x_{n+1} = x_0$. Equivalently, $\Delta S_\text{env}(x \rightarrow y) = \psi(y) - \psi(x)$ for some function $\psi$ on configurations. Global detailed balance gets violated under non-equilibrium conditions, e.g., (i) if the system is driven by non-conservative forces, or (ii) when different transitions are assisted by different reservoirs which are not mutually at equilibrium.
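For illustration, the sketch below constructs rates on a three-state ring that obey local detailed balance, with each edge assisted by its own heat bath, and evaluates the sum of $\Delta S_\text{env}$ along the cycle; the energies and inverse temperatures are arbitrary illustrative values. The cycle sum vanishes when all baths share the same temperature (global detailed balance) and is generically non-zero otherwise.

```python
# Hedged sketch: a 3-state ring with log(k(x,y)/k(y,x)) = beta_edge*(E(x)-E(y))
# (units with k_B = 1), i.e. local detailed balance with one bath per edge.
import numpy as np

E = np.array([0.0, 1.0, 0.5])                # configuration energies (arbitrary)
edges = [(0, 1), (1, 2), (2, 0)]             # allowed transitions forming a cycle

def rates(betas):
    k = np.zeros((3, 3))
    for (x, y), b in zip(edges, betas):
        k[x, y] = np.exp( b*(E[x] - E[y])/2)   # one simple choice with
        k[y, x] = np.exp(-b*(E[x] - E[y])/2)   # log(k(x,y)/k(y,x)) = b*(E(x)-E(y))
    return k

def cycle_entropy(betas):
    """Sum of Delta S_env along the cycle 0 -> 1 -> 2 -> 0."""
    return sum(b*(E[x] - E[y]) for (x, y), b in zip(edges, betas))

print("equal bath temperatures  :", cycle_entropy([1.0, 1.0, 1.0]))  # 0: detailed balance
print("unequal bath temperatures:", cycle_entropy([1.0, 2.0, 0.5]))  # nonzero: driven
```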
Entropy production
The time-evolving state of a Markov system is described by a probability distribution $\rho$ on configurations (= occupation statistics within the ensemble interpretation) and its entropy is identified with the Gibbs-Shannon entropy \[ S = -k_B \sum_x \rho(x) \log \rho(x)\,. \] The rate of entropy increase in the environment (or, from the point of view of the open system, the outgoing entropy flux) derives from the totality of local fluxes $j(x,y) = \rho(x) k(x,y) - \rho(y) k(y,x)$ along all transitions, \[ J_S = \frac{1}{2} \sum_{x,y} j(x,y)\, \Delta S_\text{env}(x \rightarrow y)\,, \] and the local fluxes determine the evolution through the Master equation \[ \frac{d\rho(x)}{d t} + \sum_y j(x,y) = 0\,. \]
The entropy production is the source term in the system's entropy balance equation \[ \frac{d S}{d t} + J_S = \sigma\,. \] That is now a functional on the space of probability distributions, \[ \sigma(\rho) = \sum_{x,y} \rho(x) k(x,y) \log \frac{\rho(x) k(x,y)}{\rho(y) k(y,x)}\,, \] and it retains the bilinear structure in fluxes and forces, with thermodynamic forces of the form $F(x,y) = \log \frac{\rho(x) k(x,y)}{\rho(y) k(y,x)}$ along each transition channel.
- The functional $\sigma(\rho)$ is non-negative, convex, and its extension to the domain of all (unnormalized) distributions is homogeneous, $\sigma(\lambda \rho) = \lambda\,\sigma(\rho)$.
- If a Markov system with finitely many configurations is irreducible (i.e., if any two configurations can be connected via a sequence of allowed transitions) then $\sigma(\rho)$ is strictly convex, with unique minimum at some $\rho_P > 0$ (called Prigogine distribution in the sequel).
The stationary distribution $\bar\rho$ is the (normalized) solution to the steady-state Master equation $\sum_y \bar j(x,y) = 0$ with $\bar j(x,y) = \bar\rho(x) k(x,y) - \bar\rho(y) k(y,x)$ the stationary currents.
- In finite irreducible systems the stationary distribution exists and it is unique.
- Moreover, if the condition of global detailed balance is satisfied then (i) $\bar\rho = \rho_P$, (ii) $\bar j(x,y) = 0$ and $\bar F(x,y) = 0$ for all transitions, and (iii) $\sigma(\bar\rho) = 0$. In this case the system is at thermodynamic equilibrium.
Stochastic MINEP
The Markov stochastic model fits the general framework of irreversible thermodynamics with $F(x,y)$ and $j(x,y)$ the conjugate forces and fluxes, both dependent on the probability distribution $\rho$. The linear relations between the forces and fluxes are (approximately) satisfied whenever the global detailed balance condition is only slightly violated. In this linear regime the MINEP holds true and the Prigogine distribution $\rho_P$ well approximates the stationary distribution $\bar\rho$ (Klein and Meijer 1954; Jiu-li et al. 1984).
Alternatively, in Schnakenberg's graph-theoretic representation the conjugated thermodynamic forces and fluxes are associated with independent cycles in the graph of allowed transitions (Schnakenberg 1976). Those forces, defined by summing $F(x,y)$ over all transitions $x \rightarrow y$ along each cycle, are $\rho-$independent and hence provide an explicit realization of external constraints for the stochastic MINEP.
For an analytic formulation let the transition rates $k^\lambda(x,y)$ be smoothly dependent on $\lambda$ around $\lambda = 0$ with $k^0(x,y)$ corresponding to an irreducible system satisfying the global detailed balance condition. Then, $\bar\rho^\lambda - \rho_P^\lambda = O(\lambda^2)$, i.e., the stationary distribution and the Prigogine distribution coincide up to linear order in $\lambda$.
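As a concrete illustration, the sketch below uses a three-state ring whose rates obey local detailed balance with an extra driving affinity $\lambda$ per edge; the energies and the values of $\lambda$ are arbitrary illustrative choices. It computes the stationary distribution from the Master equation and the Prigogine distribution by numerically minimizing $\sigma(\rho)$, and checks that their difference shrinks at order $\lambda^2$.

```python
# Hedged sketch of the stochastic MINEP on a driven 3-state ring.
import numpy as np
from scipy.optimize import minimize

E, beta = np.array([0.0, 1.0, 0.5]), 1.0     # arbitrary energies, k_B = 1
edges = [(0, 1), (1, 2), (2, 0)]

def rates(lam):
    k = np.zeros((3, 3))
    for x, y in edges:
        a = beta*(E[x] - E[y]) + lam         # log k(x,y)/k(y,x): driving lam per edge
        k[x, y], k[y, x] = np.exp(a/2), np.exp(-a/2)
    return k

def stationary(k):
    Q = k - np.diag(k.sum(axis=1))           # Master-equation generator
    A = np.vstack([Q.T, np.ones(3)])         # append normalisation
    return np.linalg.lstsq(A, np.array([0, 0, 0, 1.0]), rcond=None)[0]

def entropy_production(rho, k):
    s = 0.0
    for x in range(3):
        for y in range(3):
            if k[x, y] > 0:
                s += rho[x]*k[x, y]*np.log(rho[x]*k[x, y]/(rho[y]*k[y, x]))
    return s

def prigogine(k):                            # minimiser of sigma on the simplex
    res = minimize(entropy_production, x0=np.full(3, 1/3), args=(k,),
                   method='SLSQP', bounds=[(1e-9, 1)]*3,
                   constraints={'type': 'eq', 'fun': lambda p: p.sum() - 1},
                   options={'ftol': 1e-12, 'maxiter': 500})
    return res.x

for lam in (0.2, 0.1):
    k = rates(lam)
    gap = np.max(np.abs(stationary(k) - prigogine(k)))
    print(f"lambda = {lam}: max|rho_bar - rho_P| = {gap:.2e}")
# halving lambda reduces the gap by roughly a factor of four, i.e. O(lambda^2)
```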
Derivation from dynamical fluctuations
The empirical occupation measure $p_\tau^\omega$ is the probability-measure-valued functional on the system's trajectories $(\omega_t)_{t=0}^\infty$ such that $p_\tau^\omega(x)$ equals the fraction of time in the interval $[0,\tau]$ that the trajectory $\omega$ visits the configuration $x$.
Finite irreducible Markov systems are ergodic which means that $p_\tau^\omega \to \bar\rho$ in the limit $\tau \to \infty$, for almost every trajectory $\omega$.
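A minimal simulation sketch of this ergodic property (the transition rates below are arbitrary illustrative values):

```python
# Hedged sketch: the empirical occupation measure of a continuous-time Markov
# chain approaches the stationary distribution as the trajectory length grows.
import numpy as np

rng = np.random.default_rng(0)
k = np.array([[0.0, 2.0, 1.0],        # arbitrary irreducible transition rates
              [0.5, 0.0, 3.0],
              [1.5, 1.0, 0.0]])

def empirical_occupation(tau):
    t, x, occ = 0.0, 0, np.zeros(3)
    while t < tau:
        escape = k[x].sum()
        dwell = rng.exponential(1.0/escape)      # exponential waiting time
        occ[x] += min(dwell, tau - t)            # time spent in configuration x
        t += dwell
        x = rng.choice(3, p=k[x]/escape)         # jump to the next configuration
    return occ/tau

# stationary distribution from the Master equation, for comparison
Q = k - np.diag(k.sum(axis=1))
rho_bar = np.linalg.lstsq(np.vstack([Q.T, np.ones(3)]),
                          np.array([0, 0, 0, 1.0]), rcond=None)[0]

for tau in (10.0, 100.0, 1000.0):
    print(tau, np.max(np.abs(empirical_occupation(tau) - rho_bar)))
# the deviation typically shrinks like 1/sqrt(tau)
```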
According to the large-deviation principle for such Markov dynamics, the fluctuations of the empirical occupation measure have the exponential large-time asymptotics (Donsker and Varadhan 1975) \[ \text{Prob}(p_\tau^\omega = \rho) \asymp e^{-\tau\, \Phi(\rho)}\,,\qquad \tau \to \infty\,. \]
- The rate function $\Phi(\rho) \geq 0$ is strictly convex and $\Phi(\rho) = 0$ if and only if $\rho = \bar\rho$, the stationary distribution.
The minimization of $\Phi(\rho)$ is an exact stationary variational principle which, for systems only slightly violating the global detailed balance condition, reduces to minimizing the entropy production (Maes and Netočný 2007). It is an instance of general statistical relations between time-symmetric quantities (like the empirical occupation measure) and time-antisymmetric ones (like currents or entropy changes) that follow from the time-reversal symmetry of the underlying equilibrium dynamics.
Analytically, if the transition rates $k^\lambda(x,y)$ are a smooth deformation of a finite irreducible system with $k^0(x,y)$ satisfying the global detailed balance condition, then \[ \Phi^\lambda(\rho) = \frac{\sigma^\lambda(\rho) - \sigma^\lambda(\bar\rho^\lambda)}{4} + O(\lambda^2 d[\rho,\bar\rho^\lambda], d[\rho,\bar\rho^\lambda]^3) \] where the superscript $\lambda$ indicates correspondence to the transition rates $k^\lambda(x,y)$, and $d[\rho,\bar\rho^\lambda]$ is the distance between $\rho$ and the stationary distribution $\bar\rho^\lambda$. Whenever the higher-order corrections are negligible, the stationary distribution $\bar\rho^\lambda$ (i.e., the minimizer of $\Phi^\lambda$) coincides with the Prigogine distribution $\rho_P^\lambda$ (i.e., the minimizer of $\sigma^\lambda$).
Entropy production in diffusive systems
The dynamics of a Brownian particle driven by force $f$ and moving in a medium characterized by temperature $T$ and positive-definite diffusion and mobility matrices $D$ and $\chi$ is, in the high-friction limit, given by the Smoluchowski equation \[ \frac{\partial\rho}{\partial t} + \nabla \cdot J = 0\,,\qquad J = \chi f \rho - D\nabla\rho \] with $\rho$ the probability density and $J$ the current density. Their stationary values $\bar\rho$ and $\bar J$ solve the steady-state equation $\nabla \cdot \bar J = 0$.
- Local detailed balance corresponds to the Einstein relation $D = \chi k_B T$ between the diffusion and mobility matrices. (It is assumed throughout the sequel.)
- Global detailed balance is satisfied if (i) the Einstein relation is valid and (ii) the force $f$ is conservative, i.e., $f = -\nabla U$ for some (globally defined) potential $U$. Provided $Z = \int_\Omega e^{-\beta U}\,dV < \infty$, the steady state with $\bar\rho = e^{-\beta U}/Z$ and $\bar J = 0$ corresponds to thermodynamic equilibrium ($1/\beta = k_B T$).
- Violation of the global detailed balance, e.g., for a particle driven by a constant force along a circle, signals the presence of non-equilibrium constraints which prevent the particle from reaching thermodynamic equilibrium.
The heat flux balances the power of the driving force, yielding the entropy flux (i.e., the rate of entropy increase in the medium) $J_S = \beta \int_\Omega f \cdot J\,dV$. The entropy production term in the balance equation for the Gibbs-Shannon entropy $S = -\int \rho \log \rho\,dV$ is \[ \sigma = \int_\Omega J \cdot (\rho D)^{-1} J\,dV\,. \]
MINEP for diffusion
The MINEP can be applied whenever the non-conservative component $\varepsilon \tilde f$ of the force $f^\varepsilon = -\nabla U + \varepsilon \tilde f$ is small.
More precisely: As a functional on densities, $\sigma^\varepsilon(\rho)$ attains its minimum at $\rho_P^\varepsilon = \bar\rho^\varepsilon + O(\varepsilon^2)$, where $\bar\rho^\varepsilon$ is the stationary density for the particle driven by force $f^\varepsilon$; see the numerical sketch after the remarks below.
- The statement can be either (i) checked by a direct calculation, or (ii) verified by casting the model into the framework of irreversible thermodynamics with $X = \beta f - \nabla\log\rho$ the thermodynamic force conjugated to $J$, or (iii) proven to emerge from the general structure of dynamical fluctuations (Maes et al 2008).
- The assumption of high-friction limit leading to the Smoluchowski equation is crucial here: the MINEP is generally not valid for the diffusion in phase space modeled by the Kramers equation since that is a non-even system (the momentum is odd under time-reversal).
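The near-equilibrium statement can be illustrated numerically. The sketch below uses a finite-difference discretization of the ring $[0,2\pi)$ with $U = \cos x$ and a constant non-conservative force $\varepsilon$; the potential, the driving and the grid are illustrative choices, not taken from the article. The stationary density is obtained as the null vector of the discretized Smoluchowski operator, the Prigogine density by minimizing the discretized entropy production, and their gap closes as $\varepsilon \to 0$ (of order $\varepsilon^2$ in the continuum description).

```python
# Hedged sketch of MINEP for an overdamped diffusion on a ring.
import numpy as np
from scipy.optimize import minimize

N, L, beta, D = 40, 2*np.pi, 1.0, 1.0
dx = L/N
xm = dx*(np.arange(N) + 0.5)                 # link midpoints

def link_current(rho, f):
    rho_r = np.roll(rho, -1)                 # density at the right node of each link
    return beta*D*f*(rho + rho_r)/2 - D*(rho_r - rho)/dx

def sigma(rho, f):                           # discretised int J^2/(rho D) dx
    J, m = link_current(rho, f), (rho + np.roll(rho, -1))/2
    return np.sum(J**2/(m*D))*dx

def stationary(f):
    # assemble the linear map rho -> div J and return its normalised null vector
    M = np.array([(link_current(e, f) - np.roll(link_current(e, f), 1))/dx
                  for e in np.eye(N)]).T
    A = np.vstack([M, dx*np.ones(N)])
    return np.linalg.lstsq(A, np.r_[np.zeros(N), 1.0], rcond=None)[0]

for eps in (0.4, 0.2, 0.1):
    f = np.sin(xm) + eps                     # f = -U'(x) + eps with U = cos x
    rho_bar = stationary(f)
    res = minimize(sigma, x0=np.full(N, 1.0/L), args=(f,), method='SLSQP',
                   bounds=[(1e-9, None)]*N,
                   constraints={'type': 'eq', 'fun': lambda r: r.sum()*dx - 1},
                   options={'maxiter': 1000, 'ftol': 1e-14})
    print(f"eps = {eps}: max|rho_P - rho_bar| = {np.max(np.abs(res.x - rho_bar)):.2e}")
```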
There is also a modified MINEP for diffusions which remains exact at arbitrary distance from global detailed balance: write the driving force as $f = -\nabla U + \tilde f$ and regard the entropy production $\sigma(U)$ as a functional on the space of potentials $U$ (keeping $\rho$, $\tilde f$, $T$, and $D$ fixed). Then $\sigma(U)$ attains its minimum at the potential $U = U^*$ for which $\rho$ is stationary, i.e., \[ \nabla \cdot J^* = 0\,,\qquad J^* = \beta D (\tilde f - \nabla U^*)\rho - D\nabla\rho\,. \] This is a variational principle for the (inverse) stationary problem, with $U^*$ instead of $\rho$ as the variable.
- That follows from the convexity of $\sigma(U)$ and from the functional relation $\frac{\delta\sigma(U)}{\delta U} = 2\beta\,\nabla \cdot J$.
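A sketch of that functional relation, assuming symmetric $D$, the Einstein relation, and vanishing boundary terms in the integration by parts: a variation $U \rightarrow U + \delta U$ at fixed $\rho$ changes the current by $\delta J = -\beta D \rho\, \nabla \delta U$, so that \[ \delta\sigma = 2\int_\Omega J \cdot (\rho D)^{-1}\, \delta J\, dV = -2\beta \int_\Omega J \cdot \nabla \delta U\, dV = 2\beta \int_\Omega (\nabla \cdot J)\, \delta U\, dV\,, \] i.e., $\delta\sigma(U)/\delta U = 2\beta\, \nabla \cdot J$. By convexity, the minimum of $\sigma(U)$ is therefore attained where $\nabla \cdot J = 0$.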
The formalism and the statements above extend to other diffusive systems, e.g., to diffusive fields in fluctuating hydrodynamics, where $\rho$ becomes a density profile, the diffusion matrix $D(\rho)$ is $\rho$-dependent, and various types of boundary conditions can be imposed.
References
- Donsker, M. D. and Varadhan, S. R. S. 1975. Asymptotic evaluation of certain Markov process expectations for large time, I. Commun. Pure Appl. Math. 28: 1-47.
- de Groot, S. R. and Mazur, P. 1962. Non-Equilibrium Thermodynamics. Amsterdam, North-Holland.
- Eyink, G., Lebowitz, J. L. and Spohn, H. 1990. Macroscopic origin of hydrodynamic behavior: Entropy production and the steady state. In Campbell, D. (ed.) Chaos, Soviet-American Perspectives in Nonlinear Science, American Institute of Physics, New York, pp. 367–391.
- Jaynes, E. T. 1980. The minimum entropy production principle. Ann. Rev. Phys. Chem. 31: 579-601.
- Jiu-li, L., Van den Broeck, C. and Nicolis, G. 1984. Stability Criteria and Fluctuations around Nonequilibrium States. Z. Phys. B Condensed Matter 56: 165-170.
- Klein, M. J. and Meijer, P. H. E. 1954. Principle of minimum entropy production. Phys. Rev. 96: 250-255.
- Landauer, R. 1975. Inadequacy of entropy and entropy derivatives in characterizing the steady state. Phys. Rev. A 12: 636-638.
- Maes, C. and Netočný, K. 2007. Minimum entropy production principle from a dynamical fluctuation law. J. Math. Phys. 48: 053306.
- Maes, C., Netočný, K. and Wynants B. 2008. Steady state statistics of driven diffusions. Physica A 387: 2675–2689.
- Onsager, L. and Machlup S. 1953. Fluctuations and Irreversible Processes. Phys. Rev. 91: 1505-1512.
- Paltridge, G. W. 1979. Climate and thermodynamic systems of maximum dissipation. Nature 279: 630-631.
- Prigogine, I. 1947. Étude thermodynamique des phénomènes irréversibles. Desoer, Liège.
- Schnakenberg, J. 1976. Network theory of behavior of Master equation systems. Rev. Mod. Phys. 48: 571-585.
Internal references
- Tomasz Downarowicz (2007) Entropy. Scholarpedia, 2(11):3901.
- Andre Longtin (2010) Stochastic dynamical systems. Scholarpedia, 5(4):1619.
- James Meiss (2007) Hamiltonian systems. Scholarpedia, 2(8):1943.
See also
Fluctuations, Fluctuation theorem, Time's arrow and Boltzmann's entropy