Recall that in classical physics we say that if two systems are in thermal equilibrium, then their temperatures are equal.
We also saw that the average kinetic energy per particle of a system was related to the temperature via
⟨Ekin⟩ = (3/2) kB T
where kB is called Boltzmann's constant and has a value of kB = 1.381 × 10⁻²³ J/K. Recall that (3.8) is the condition of thermal equilibrium. Thus, we should be able to relate it to temperature. Noticing that the energy appears in the denominator of that relationship, we are led to the definition
1/τ = (∂σ/∂U)_N
We call τ the fundamental temperature. It is related to the normal temperature in kelvin by
τ = kB T
Notice that the fundamental temperature has units of energy. This allows us to relate our definition of entropy to the classical one
S = kB σ
where S is the classical entropy.
Assume that U1 > U2, and that an amount of energy ΔU is extracted from S1 and placed in S2. Then the total entropy change Δσ is
Δσ = (−∂σ1/∂U1 + ∂σ2/∂U2) ΔU = (1/τ2 − 1/τ1) ΔU
Since U1 > U2, we have τ1 > τ2, and so the quantity on the right is positive, thus showing that the total change of entropy is positive when energy flows from a hotter system to a cooler one.
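A minimal numeric sketch of this sign argument, with invented fundamental temperatures (Python used only as a calculator):

```python
# Energy dU leaves the hotter system (tau1) and enters the cooler one
# (tau2 < tau1); the total entropy change is (1/tau2 - 1/tau1) * dU.
# All values below are invented for illustration.

tau1 = 2.0e-21   # fundamental temperature of the hotter system, J
tau2 = 1.0e-21   # fundamental temperature of the cooler system, J
dU = 5.0e-22     # energy transferred, J

d_sigma = (1.0 / tau2 - 1.0 / tau1) * dU
print(d_sigma > 0)   # True: entropy increases for hot -> cold flow
```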
Relationship between temperature and entropy
How can we increase the entropy of a system? There are three ways. We can
1. Increase the number of particles, ΔN.
2. Increase the volume, ΔV.
3. Add energy to the system, ΔU (this energy must ultimately appear as heat).
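The three routes above can be summarized schematically by expanding σ = σ(U, V, N) with the chain rule (a sketch; this partial-derivative form is standard calculus, not a result derived in this section):

```latex
d\sigma = \left(\frac{\partial \sigma}{\partial U}\right)_{V,N} dU
        + \left(\frac{\partial \sigma}{\partial V}\right)_{U,N} dV
        + \left(\frac{\partial \sigma}{\partial N}\right)_{U,V} dN
```

With the definition 1/τ = (∂σ/∂U)_N, the coefficient of dU is just 1/τ.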
Now consider two systems that are brought into thermal contact. In general, and for all time, we must have
U = U1 + U2 = U1,0 + U2,0
The multiplicity function is then
g(U) = Σ_{U1} g1(U1) g2(U − U1)
and contains, as one of the accessible states, the original state g1(U1,0) g2(U2,0). Since there are other states also accessible now, we see that, in general, g(U) ≥ g1(U1,0) g2(U2,0). Recall that the entropy is defined by σ = ln g(U), so this conclusion leads to the law of increase of entropy:
σ(U) ≥ σ1(U1,0) + σ2(U2,0)
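A toy check of g(U) ≥ g(U1,0) g(U2,0): take two small model spin systems whose multiplicities are binomial coefficients (the systems and numbers here are invented for illustration, not taken from the text):

```python
from math import comb

# Two model systems of N1 and N2 spins; the multiplicity of a system
# with n_up spins pointing up is the binomial coefficient C(N, n_up).
N1, N2 = 10, 10
n1_0, n2_0 = 3, 7          # initial "up" counts (fixing U1,0 and U2,0)

g_initial = comb(N1, n1_0) * comb(N2, n2_0)

# After thermal contact only the total n1 + n2 is fixed; sum over all
# ways of splitting that total between the two systems.
n_total = n1_0 + n2_0
g_combined = sum(comb(N1, n1) * comb(N2, n_total - n1)
                 for n1 in range(0, n_total + 1)
                 if 0 <= n_total - n1 <= N2)

print(g_combined >= g_initial)   # True: more states are now accessible
```

The original configuration (n1 = 3, n2 = 7) is one term of the sum, so the combined multiplicity can never be smaller than the initial product.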
Suppose that δU is the uncertainty in U. We can look at the density of states in a given system. Let D(U) be the number of states per unit interval of energy. Then
g(U) = D(U) δU
σ(U) = ln D(U) + ln δU
In many cases, we find that the total number of states is proportional to 2^N. If the total energy range of the system is of order N times some average particle energy Δ, then D(U) ≈ 2^N/(NΔ). For this case, we see that
σ(U) = N ln 2 − ln N − ln Δ + ln δU
In most other cases, we find that the total number of states in a system is proportional to U^N δU. So the entropy can be written as
σ(U) = N ln U + ln δU (4.4)
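As a quick consistency check (a sketch using only the definitions already introduced), applying 1/τ = ∂σ/∂U to (4.4) and dropping the constant ln δU term gives:

```latex
\frac{1}{\tau} = \frac{\partial \sigma}{\partial U}
             = \frac{\partial}{\partial U}\,(N \ln U)
             = \frac{N}{U}
\qquad\Longrightarrow\qquad U = N\tau
```

so for this model the energy per particle is of order the fundamental temperature.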
Typically, the uncertainty δU will be of order one, so its logarithm is negligible next to terms proportional to N. Thus, we see that in both cases the first term, N ln U or N ln 2, will dominate the entropy.
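To see numerically why the extensive term wins, compare N ln 2 with ln N for a macroscopic particle number (the value of N is invented for illustration):

```python
import math

N = 1.0e22                     # illustrative macroscopic particle number

term_N = N * math.log(2.0)     # extensive term N ln 2, ~6.9e21
term_lnN = math.log(N)         # ln N, only ~51
print(term_N / term_lnN)       # ratio ~1e20: the first term dominates
```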
The Laws of Thermodynamics
We are now ready to state the laws of thermodynamics:
Zeroth law: If A is in thermal equilibrium with B, and B is in thermal equilibrium with C, then A is in thermal equilibrium with C. This becomes obvious by looking at it mathematically: thermal equilibrium means equal fundamental temperatures, so τA = τB and τB = τC together imply τA = τC.
First law: Heat is a form of energy. This is simply a statement of conservation of energy.
Second law: If a closed system is in a configuration that is not the equilibrium configuration, the most probable consequence will be that the entropy of the system will increase monotonically in successive instants of time. Another formulation of this law is the classical Kelvin‑Planck statement: "It is impossible for any cyclic process to occur whose sole effect is the extraction of heat from a reservoir and the performance of an equal amount of work."
Third law: The entropy of a system approaches a constant value as the temperature approaches zero.