Free entropy

{{Short description|Thermodynamic potential of entropy, analogous to the free energy}}

{{Thermodynamics|expanded=Potentials}}

A thermodynamic free entropy is an entropic thermodynamic potential analogous to the free energy. It is also known as a Massieu, Planck, or Massieu–Planck potential (or function), or (rarely) as free information. In statistical mechanics, free entropies frequently appear as the logarithm of a partition function. The Onsager reciprocal relations, in particular, are developed in terms of entropic potentials. In mathematics, free entropy means something quite different: it is a generalization of entropy defined in the subject of free probability.

A free entropy is generated by a Legendre transformation of the entropy. The different potentials correspond to different constraints to which the system may be subjected.
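
For a minimal illustration of the two statements above, the following sketch (assuming a hypothetical two-level system with energy levels 0 and \epsilon in the canonical ensemble, worked through with SymPy) verifies symbolically that the Massieu potential \Phi = S - U/T equals k \ln Z, the logarithm of the partition function:

<syntaxhighlight lang="python">
import sympy as sp

# Hypothetical two-level system with energies 0 and eps, in contact with a heat bath at temperature T
eps, T, k = sp.symbols('epsilon T k', positive=True)
beta = 1 / (k * T)

Z = 1 + sp.exp(-beta * eps)      # canonical partition function
A = -k * T * sp.log(Z)           # Helmholtz free energy, A = -kT ln Z
S = -sp.diff(A, T)               # entropy, S = -dA/dT
U = A + T * S                    # internal energy

Phi = S - U / T                  # Massieu potential (Helmholtz free entropy)

# Phi coincides with k ln Z = -A/T
print(sp.simplify(Phi - k * sp.log(Z)))   # -> 0
</syntaxhighlight>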

Examples

{{see also|List of thermodynamic properties}}

The most common examples are:

class="wikitable"
Name

|Function

|Alt. function

|Natural variables

Entropy

| dS = \frac {1}{T} dU + \frac {P}{T} dV - \sum_{i=1}^s \frac {\mu_i}{T} dN_i \,

|

|align="center"|~~~~~U,V,\{N_i\}\,

Massieu potential \ Helmholtz free entropy

|\Phi =S-\frac{1}{T} U

|= - \frac {A}{T}

|align="center"|~~~~~\frac {1}{T},V,\{N_i\}\,

Planck potential \ Gibbs free entropy

|\Xi=\Phi -\frac{P}{T} V

|= - \frac{G}{T}

|align="center"|~~~~~\frac{1}{T},\frac{P}{T},\{N_i\}\,

where

{{Col-begin}}

{{Col-break}}

::S is entropy

::\Phi is the Massieu potential{{cite web |author=Antoni Planes |author2=Eduard Vives |date=2000-10-24 |publisher=Universitat de Barcelona |url=http://www.ecm.ub.es/condensed/eduard/papers/massieu/node2.html |title=Entropic variables and Massieu-Planck functions |access-date=2007-09-18 |work=Entropic Formulation of Statistical Mechanics |archive-date=2008-10-11 |archive-url=https://web.archive.org/web/20081011011717/http://www.ecm.ub.es/condensed/eduard/papers/massieu/node2.html |url-status=dead }}{{cite journal |author=T. Wada |author2=A.M. Scarfone |date=December 2004 |title=Connections between Tsallis' formalisms employing the standard linear average energy and ones employing the normalized q-average energy |journal=Physics Letters A |volume=335 |issue=5–6 |pages=351–362 |doi=10.1016/j.physleta.2004.12.054 |arxiv=cond-mat/0410527|bibcode = 2005PhLA..335..351W |s2cid=17101164 }}

::\Xi is the Planck potential

::U is internal energy

{{Col-break}}

::T is temperature

::P is pressure

::V is volume

::A is Helmholtz free energy

{{Col-break}}

::G is Gibbs free energy

::N_i is the number of particles (or number of moles) composing the i-th chemical component

::\mu_i is the chemical potential of the i-th chemical component

::s is the total number of components

::i is the index labeling the chemical components.

{{Col-end}}

Note that the use of the terms "Massieu" and "Planck" for explicit Massieu–Planck potentials is somewhat obscure and ambiguous; in particular, "Planck potential" has alternative meanings. The most standard notation for an entropic potential is \psi, used by both Planck and Schrödinger. (Note that Gibbs used \psi to denote the free energy.) Free entropies were invented by the French engineer François Massieu in 1869 and actually predate Gibbs's free energy (1875).

Dependence of the potentials on the natural variables

=Entropy=

:S = S(U,V,\{N_i\})

By the definition of a total differential,

:d S = \frac {\partial S} {\partial U} d U + \frac {\partial S} {\partial V} d V + \sum_{i=1}^s \frac {\partial S} {\partial N_i} d N_i .

From the equations of state,

:d S = \frac{1}{T}dU+\frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i .

The differentials in the above equation are all of extensive variables and their coefficients are intensive, so the equation may be integrated (by Euler's theorem for homogeneous functions) to yield

:S = \frac{U}{T}+\frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right) + \textrm{constant}.
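
As a check on this integrated form, the following sketch (a hypothetical worked example assuming the entropy of a classical monatomic ideal gas in Sackur–Tetrode form, with the additive constant per particle kept as a free symbol c) recovers the relation symbolically, the integration constant vanishing for this extensive example:

<syntaxhighlight lang="python">
import sympy as sp

U, V, N, k, c = sp.symbols('U V N k c', positive=True)

# Entropy of a classical monatomic ideal gas (Sackur-Tetrode form; constants absorbed into c)
S = N * k * (sp.log(V / N) + sp.Rational(3, 2) * sp.log(U / N) + c)

# Equations of state in the entropy representation
inv_T     = sp.diff(S, U)      #  1/T  =  dS/dU
P_over_T  = sp.diff(S, V)      #  P/T  =  dS/dV
mu_over_T = -sp.diff(S, N)     #  mu/T = -dS/dN

# Euler-integrated form: S = U/T + P V/T - mu N/T
print(sp.simplify(U * inv_T + V * P_over_T - N * mu_over_T - S))   # -> 0
</syntaxhighlight>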

=Massieu potential / Helmholtz free entropy=

:\Phi = S - \frac {U}{T}

:\Phi = \frac{U}{T}+\frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right) - \frac {U}{T}

:\Phi = \frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right)

Starting over at the definition of \Phi and taking the total differential, we have via a Legendre transform (and the chain rule)

:d \Phi = d S - \frac {1} {T} dU - U d \frac {1} {T} ,

:d \Phi = \frac{1}{T}dU + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac {1} {T} dU - U d \frac {1} {T},

:d \Phi = - U d \frac {1} {T}+\frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i.

The above differentials are not all of extensive variables, so the equation may not be directly integrated. From d \Phi we see that

:\Phi = \Phi \left(\frac {1}{T},V, \{N_i\} \right) .
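
Continuing the hypothetical ideal-gas example, the sketch below writes \Phi in its natural variables and confirms the coefficients read off from d \Phi, namely \partial \Phi / \partial (1/T) = -U and \partial \Phi / \partial V = P/T:

<syntaxhighlight lang="python">
import sympy as sp

b, V, N, k, c = sp.symbols('b V N k c', positive=True)   # b = 1/T

# Monatomic ideal gas: U = (3/2) N k T = (3/2) N k / b
U = sp.Rational(3, 2) * N * k / b
S = N * k * (sp.log(V / N) + sp.Rational(3, 2) * sp.log(U / N) + c)

Phi = S - b * U     # Massieu potential Phi = S - U/T, expressed in (1/T, V, N)

# Coefficient of d(1/T):  dPhi/d(1/T) = -U
print(sp.simplify(sp.diff(Phi, b) + U))            # -> 0
# Coefficient of dV:      dPhi/dV = P/T = N k / V for the ideal gas
print(sp.simplify(sp.diff(Phi, V) - N * k / V))    # -> 0
</syntaxhighlight>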

If reciprocal variables are not desired,{{cite book |title=The Collected Papers of Peter J. W. Debye |publisher=Interscience Publishers, Inc. |place=New York, New York |year=1954}}{{rp|222}}

:d \Phi = d S - \frac {T d U - U d T} {T^2} ,

:d \Phi = d S - \frac {1} {T} d U + \frac {U} {T^2} d T ,

:d \Phi = \frac{1}{T}dU + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac {1} {T} d U + \frac {U} {T^2} d T,

:d \Phi = \frac {U} {T^2} d T + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i ,

:\Phi = \Phi(T,V,\{N_i\}) .
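
The same hypothetical ideal gas can be used to check the non-reciprocal form, i.e. that the coefficient of d T in d \Phi is U/T^2:

<syntaxhighlight lang="python">
import sympy as sp

T, V, N, k, c = sp.symbols('T V N k c', positive=True)

# Monatomic ideal gas: U = (3/2) N k T
U = sp.Rational(3, 2) * N * k * T
S = N * k * (sp.log(V / N) + sp.Rational(3, 2) * sp.log(U / N) + c)

Phi = S - U / T     # Massieu potential in (T, V, N)

# Coefficient of dT:  dPhi/dT = U/T^2
print(sp.simplify(sp.diff(Phi, T) - U / T**2))   # -> 0
</syntaxhighlight>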

=Planck potential / Gibbs free entropy=

:\Xi = \Phi -\frac{P V}{T}

:\Xi = \frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right) -\frac{P V}{T}

:\Xi = \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right)

Starting over at the definition of \Xi and taking the total differential, we have via a Legendre transform (and the chain rule)

:d \Xi = d \Phi - \frac{P}{T} d V - V d \frac{P}{T}

:d \Xi = - U d \frac {1} {T} + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac{P}{T} d V - V d \frac{P}{T}

:d \Xi = - U d \frac {1} {T} - V d \frac{P}{T} + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i.

The above differentials are not all of extensive variables, so the equation may not be directly integrated. From d \Xi we see that

:\Xi = \Xi \left(\frac {1}{T}, \frac {P}{T}, \{N_i\} \right) .
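
Again for the hypothetical monatomic ideal gas, the sketch below expresses \Xi in its natural variables and verifies the coefficients \partial \Xi / \partial (1/T) = -U and \partial \Xi / \partial (P/T) = -V:

<syntaxhighlight lang="python">
import sympy as sp

b, p, N, k, c = sp.symbols('b p N k c', positive=True)   # b = 1/T, p = P/T

# Monatomic ideal gas: U = (3/2) N k / b, and P V = N k T gives V = N k / p
U = sp.Rational(3, 2) * N * k / b
V = N * k / p
S = N * k * (sp.log(V / N) + sp.Rational(3, 2) * sp.log(U / N) + c)

Xi = S - b * U - p * V     # Planck potential Xi = Phi - (P/T) V, expressed in (1/T, P/T, N)

# Coefficients of d(1/T) and d(P/T):  -U and -V
print(sp.simplify(sp.diff(Xi, b) + U))   # -> 0
print(sp.simplify(sp.diff(Xi, p) + V))   # -> 0
</syntaxhighlight>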

If reciprocal variables are not desired,{{rp|222}}

:d \Xi = d \Phi - \frac{T (P d V + V d P) - P V d T}{T^2} ,

:d \Xi = d \Phi - \frac{P}{T} d V - \frac {V}{T} d P + \frac {P V}{T^2} d T ,

:d \Xi = \frac {U} {T^2} d T + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac{P}{T} d V - \frac {V}{T} d P + \frac {P V}{T^2} d T ,

:d \Xi = \frac {U + P V} {T^2} d T - \frac {V}{T} d P + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i ,

:\Xi = \Xi(T,P,\{N_i\}) .
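
Finally, a check of the non-reciprocal form for the same hypothetical ideal gas, confirming that the coefficients of d T and d P in d \Xi are (U + P V)/T^2 and -V/T respectively:

<syntaxhighlight lang="python">
import sympy as sp

T, P, N, k, c = sp.symbols('T P N k c', positive=True)

# Monatomic ideal gas: U = (3/2) N k T and V = N k T / P
U = sp.Rational(3, 2) * N * k * T
V = N * k * T / P
S = N * k * (sp.log(V / N) + sp.Rational(3, 2) * sp.log(U / N) + c)

Xi = S - U / T - P * V / T     # Planck potential in (T, P, N)

# Coefficients of dT and dP:  (U + P V)/T^2 and -V/T
print(sp.simplify(sp.diff(Xi, T) - (U + P * V) / T**2))   # -> 0
print(sp.simplify(sp.diff(Xi, P) + V / T))                # -> 0
</syntaxhighlight>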

References

Bibliography

  • {{cite journal |first=M.F. |last=Massieu |year=1869 |journal=Compt. Rend. |volume=69 |pages=858, 1057}}

  • {{cite book |first=Herbert B. |last=Callen |author-link=Herbert Callen |year=1985 |title=Thermodynamics and an Introduction to Thermostatistics |edition=2nd |publisher=John Wiley & Sons |location=New York |isbn=0-471-86256-8}}

Category:Thermodynamic entropy