
probability forms of entropy

Posted by peeterjoot on March 16, 2013


Question: Entropy as probability

[1] points out that entropy can be written as

\begin{aligned}S = - k_{\mathrm{B}} \sum_i P_i \ln P_i\end{aligned} \hspace{\stretch{1}}(1.0.1)

where

\begin{aligned}P_i = \frac{e^{-\beta E_i}}{Z}\end{aligned} \hspace{\stretch{1}}(1.0.2a)

\begin{aligned}Z = \sum_i e^{-\beta E_i}.\end{aligned} \hspace{\stretch{1}}(1.0.2b)

Show that this follows from the free energy F = U - T S = -k_{\mathrm{B}} T \ln Z.

Answer
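
The derivation below also uses the standard canonical-ensemble identity for the average energy, restated here for reference since the original takes it as given:

\begin{aligned}U = -\frac{\partial \ln Z}{\partial \beta} = \frac{1}{Z} \sum_i E_i e^{-\beta E_i}.\end{aligned}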

In terms of the free and average energies, we have

\begin{aligned}
\frac{S}{k_{\mathrm{B}}}
&= \frac{U - F}{k_{\mathrm{B}} T} \\
&= \beta \left( -\frac{\partial \ln Z}{\partial \beta} \right) - \beta \left( -k_{\mathrm{B}} T \ln Z \right) \\
&= \frac{\sum_i \beta E_i e^{-\beta E_i}}{Z} + \ln Z \\
&= -\sum_i P_i \ln e^{-\beta E_i} + \sum_i P_i \ln Z \\
&= -\sum_i P_i \ln \frac{e^{-\beta E_i}}{Z} \\
&= -\sum_i P_i \ln P_i.
\end{aligned} \hspace{\stretch{1}}(1.0.3)
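
As a quick numerical sanity check (my addition, not part of the original argument), this short Python snippet compares eq. (1.0.1) against S = (U - F)/T for a three-level spectrum; the energy values, the temperature, and the k_{\mathrm{B}} = 1 units are all assumptions chosen for illustration.

```python
import numpy as np

# Illustrative three-level spectrum (arbitrary assumed values).
E = np.array([0.0, 1.0, 2.5])
beta = 0.7                           # inverse temperature
kB = 1.0                             # units where k_B = 1, so T = 1/beta
T = 1.0 / (kB * beta)

Z = np.sum(np.exp(-beta * E))        # partition function (1.0.2b)
P = np.exp(-beta * E) / Z            # probabilities (1.0.2a)

U = np.sum(P * E)                    # average energy
F = -kB * T * np.log(Z)              # free energy F = -k_B T ln Z

S_prob = -kB * np.sum(P * np.log(P))  # entropy via eq. (1.0.1)
S_thermo = (U - F) / T                # entropy via S = (U - F)/T

assert np.isclose(S_prob, S_thermo)
print(S_prob, S_thermo)
```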

Question: Entropy in terms of grand partition probabilities ([2] problem 4.1)

Generalize the previous problem to the grand canonical scheme, where we have

\begin{aligned}P_{r, s} = \frac{e^{-\alpha N_r - \beta E_s}}{Z_{\mathrm{G}}}\end{aligned} \hspace{\stretch{1}}(1.0.4a)

\begin{aligned}Z_{\mathrm{G}} = \sum_{r,s} e^{-\alpha N_r - \beta E_s}\end{aligned} \hspace{\stretch{1}}(1.0.4b)

\begin{aligned}z = e^{-\alpha} = e^{\mu \beta}\end{aligned} \hspace{\stretch{1}}(1.0.4c)

\begin{aligned}q = \ln Z_{\mathrm{G}},\end{aligned} \hspace{\stretch{1}}(1.0.4d)

and show

\begin{aligned}S = - k_{\mathrm{B}} \sum_{r,s} P_{r,s} \ln P_{r,s}.\end{aligned} \hspace{\stretch{1}}(1.0.5)

Answer

With

\begin{aligned}\beta P V = q,\end{aligned} \hspace{\stretch{1}}(1.0.6)

the free energy takes the form (using the grand potential relation F - \mu N = -P V)

\begin{aligned}F = N \mu - P V = N \mu - q/\beta,\end{aligned} \hspace{\stretch{1}}(1.0.7)

so that the entropy (scaled by k_{\mathrm{B}}), together with the average energy U = -\partial q/\partial \beta and average particle number N = z \partial q/\partial z, leads us to the desired result

\begin{aligned}
\frac{S}{k_{\mathrm{B}}}
&= \beta U - N \mu \beta + q \\
&= -\beta \frac{\partial q}{\partial \beta} - z \mu \beta \frac{\partial q}{\partial z} + q \\
&= \frac{1}{Z_{\mathrm{G}}} \sum_{r, s} \left( -\beta (-E_s) - \mu \beta N_r \right) e^{-\alpha N_r - \beta E_s} + \ln Z_{\mathrm{G}} \\
&= \sum_{r, s} P_{r,s} \ln e^{ \alpha N_r + \beta E_s } + \left( \sum_{r, s} P_{r, s} \right) \ln Z_{\mathrm{G}} \\
&= -\sum_{r, s} P_{r, s} \ln \frac{e^{ -\alpha N_r - \beta E_s }}{Z_{\mathrm{G}}} \\
&= -\sum_{r, s} P_{r, s} \ln P_{r, s}.
\end{aligned} \hspace{\stretch{1}}(1.0.8)
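
The same spot check works in the grand canonical scheme. This sketch (again my addition, with an assumed toy spectrum and arbitrary values for \mu and \beta) compares eq. (1.0.5) against the first line of eq. (1.0.8).

```python
import numpy as np

# Toy grand canonical system (spectrum, mu, beta are illustrative assumptions).
N_vals = np.array([0.0, 1.0, 2.0])   # particle numbers N_r
E_vals = np.array([0.0, 0.8, 1.7])   # energies E_s
beta, mu, kB = 0.9, 0.3, 1.0         # units where k_B = 1
alpha = -mu * beta                   # z = e^{-alpha} = e^{mu beta}

# Boltzmann weights over the (r, s) grid.
N, E = np.meshgrid(N_vals, E_vals, indexing="ij")
w = np.exp(-alpha * N - beta * E)
ZG = w.sum()                         # grand partition function (1.0.4b)
P = w / ZG                           # probabilities (1.0.4a)
q = np.log(ZG)

U = np.sum(P * E)                    # average energy
N_avg = np.sum(P * N)                # average particle number

S_prob = -kB * np.sum(P * np.log(P))                # entropy via eq. (1.0.5)
S_thermo = kB * (beta * U - N_avg * mu * beta + q)  # first line of (1.0.8)

assert np.isclose(S_prob, S_thermo)
print(S_prob, S_thermo)
```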

References

[1] E.A. Jackson. Equilibrium Statistical Mechanics. Dover Publications, 2000.

[2] R.K. Pathria. Statistical Mechanics. Butterworth-Heinemann, Oxford, UK, 1996.
