Recall that in classical physics, if two systems are in thermal equilibrium, then their temperatures are equal:

*T*_{1} = *T*_{2}

We also saw that the average kinetic energy of a system is related to the temperature via

⟨*E*⟩ = (3/2) *k*_{B}*T*

where *k*_{B} is called Boltzmann’s constant and has a value of *k*_{B} = 1.381 × 10^{‑23} J/K. Recall that (3.8), the equality (∂σ_{1}/∂*U*_{1})_{N} = (∂σ_{2}/∂*U*_{2})_{N}, is the condition of thermal equilibrium. Thus, we should be able to relate it to temperature. Noticing that the energy appears in the denominator of the relationship, we are led to the definition

1/τ = (∂σ/∂*U*)_{N} (4.1)

We call τ the **fundamental temperature**. It is related to the ordinary temperature in kelvin by

τ = *k*_{B}*T*

Notice that the fundamental temperature has units of energy. This allows us to relate our definition of entropy to the classical one

*S* = *k*_{B}σ

where *S* is the classical entropy.
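These relations are easy to check numerically. Below is a minimal sketch with hypothetical example values (room temperature, and an arbitrary dimensionless entropy σ) showing that τ carries units of energy while *S* = *k*_{B}σ carries units of J/K:

```python
# Sketch (hypothetical example values): relating T, the fundamental
# temperature tau = k_B * T, and classical entropy S = k_B * sigma.
k_B = 1.381e-23  # Boltzmann's constant, J/K

T = 300.0                # ordinary temperature in K (example value)
avg_KE = 1.5 * k_B * T   # average kinetic energy, <E> = (3/2) k_B T
tau = k_B * T            # fundamental temperature, carries units of energy (J)

sigma = 1.0e23           # dimensionless entropy sigma = ln g (example value)
S = k_B * sigma          # classical entropy, J/K

print(f"<E> = {avg_KE:.4e} J, tau = {tau:.4e} J, S = {S:.3f} J/K")
```

Note that for a macroscopic sample σ is of order 10^{23}, so *S* comes out of order 1 J/K even though *k*_{B} is tiny.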

Example:

Assume that *U*_{1} > *U*_{2}, and that an amount of energy Δ*U* is extracted from system 1 and placed in system 2. Then the total entropy change Δσ is

Δσ = (−Δ*U*)(1/τ_{1}) + (Δ*U*)(1/τ_{2}) = Δ*U*(1/τ_{2} − 1/τ_{1}) (4.2)

Since *U*_{1} > *U*_{2}, we have τ_{1} > τ_{2}, and so the quantity on the right is positive, showing that the total change of entropy is positive when energy flows from a hotter system to a cooler one.
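As a quick numerical check of (4.2), the sketch below uses hypothetical values of τ_{1} > τ_{2} and a small transferred energy Δ*U*, and confirms that the total entropy change is positive:

```python
# Sketch (hypothetical values): entropy change when energy dU flows from a
# hotter system (tau1) to a cooler one (tau2), per Eq. (4.2).
tau1 = 5.0e-21   # fundamental temperature of system 1, J (example value)
tau2 = 4.0e-21   # fundamental temperature of system 2, J (example value)
dU = 1.0e-22     # energy transferred from system 1 to system 2, J

d_sigma = dU * (1.0 / tau2 - 1.0 / tau1)   # Eq. (4.2)

print(f"d_sigma = {d_sigma:.4f}")
assert d_sigma > 0  # energy flowing hot -> cold increases total entropy
```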

## Relationship between temperature and entropy

How can we increase the entropy of a system? There are three ways. We can

1. Increase the number of particles, Δ*N*.

2. Increase the volume, Δ*V*.

3. Add energy to the system, Δ*U* (this energy must ultimately appear as heat).

Now consider two systems that are brought into thermal contact. In general, and for all time, we must have

*U* = *U*_{1} + *U*_{2} = *U*_{1,0} + *U*_{2,0}

The multiplicity function is then

*g*(*U*) = Σ_{*U*_{1} ≤ *U*} *g*_{1}(*U*_{1}) *g*_{2}(*U* − *U*_{1})

and contains, as one of the accessible states, the original state *g*_{1}(*U*_{1,0}) *g*_{2}(*U*_{2,0}). Since there are other states also accessible now, we see that, in general, *g*(*U*) ≥ *g*_{1}(*U*_{1,0}) *g*_{2}(*U*_{2,0}). Recall that the entropy is defined by σ = ln *g*(*U*), so this conclusion leads to the **law of increase of entropy**

σ_{final} ≥ σ_{initial} (4.3)
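The convolution argument can be made concrete with a toy model. The sketch below (example sizes and excitation numbers, not from the text) takes two small systems whose multiplicities are binomial coefficients, sums over all ways of sharing the conserved total excitation, and verifies σ_{final} ≥ σ_{initial}:

```python
# Sketch: two model spin-like systems whose multiplicities are binomial
# coefficients. Bringing them into thermal contact convolves the
# multiplicities over all energy splittings, so sigma_final >= sigma_initial.
from math import comb, log

N1, N2 = 10, 10        # particles in each system (example sizes)
s1_0, s2_0 = 2, 7      # initial excitation numbers (hypothetical)

# Multiplicity before contact: product of the two fixed multiplicities.
g_initial = comb(N1, s1_0) * comb(N2, s2_0)

# After contact, only the total s = s1 + s2 is fixed; sum over splittings.
s_total = s1_0 + s2_0
g_final = sum(comb(N1, s1) * comb(N2, s_total - s1)
              for s1 in range(s_total + 1)
              if 0 <= s_total - s1 <= N2)

sigma_initial = log(g_initial)
sigma_final = log(g_final)
assert sigma_final >= sigma_initial   # law of increase of entropy, Eq. (4.3)
print(g_initial, g_final)
```

The sum includes the original product as one term, which is exactly why *g*(*U*) can never be smaller than *g*_{1}(*U*_{1,0}) *g*_{2}(*U*_{2,0}).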

Suppose that δ*U* is the uncertainty in *U*. We can look at the density of states in a given system. Let *D*(*U*) be the number of states per unit interval of energy. Then

*g*(*U*) = *D*(*U*) δ*U*

and

σ(*U*) = ln *D*(*U*) + ln δ*U*

In many cases, we find that the total number of states is proportional to 2^{N}. If the total energy is of order *N* times some average particle energy Δ, then *D*(*U*) ≈ 2^{N}/(*N*Δ). For this case, we see that

σ(*U*) = *N* ln(2) − ln(*N*) − ln(Δ) + ln(δ*U*)

In most other cases, we find that the total number of states in a system is proportional to *U*^{N} δ*U*. So the entropy can be written as

σ(*U*) = *N* ln(*U*) + ln(δ*U*) (4.4)

Typically, the uncertainty δ*U* will be of order 1 or less, so ln(δ*U*) is negligible compared with a term of order *N*. Thus, we see that in both cases, the first term, *N* ln(*U*) or *N* ln(2), will dominate the entropy.
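To see how completely the leading term dominates, the sketch below plugs in illustrative numbers (a macroscopic *N*, and arbitrary values of *U* and δ*U*) and compares the size of each term in (4.4):

```python
# Sketch (illustrative numbers): for macroscopic N, the N*ln(U) term in
# sigma(U) = N ln(U) + ln(dU) overwhelms any logarithmic correction.
from math import log

N = 1e22          # number of particles (macroscopic scale, example value)
U = 100.0         # total energy in some convenient unit (hypothetical)
dU = 1e-6         # uncertainty in U (hypothetical)

leading = N * log(U)     # N ln U: of order 10^22
corr_N = log(N)          # ln N: of order 50
corr_dU = abs(log(dU))   # |ln dU|: of order 14

print(f"N ln U = {leading:.3e}, ln N = {corr_N:.1f}, |ln dU| = {corr_dU:.1f}")
assert leading > 1e18 * (corr_N + corr_dU)  # ~20 orders of magnitude larger
```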

## The Three Laws of Thermodynamics

We are now ready to state the **three laws of thermodynamics:**

**Zeroth law**

If *A* is in thermal equilibrium with *B* and *B* is in thermal equilibrium with *C*, then *A* is in thermal equilibrium with *C*. This becomes obvious by looking at it mathematically:

τ_{A} = τ_{B} and τ_{B} = τ_{C} together imply τ_{A} = τ_{C}

**First law**

Heat is a form of energy. This is simply a statement of conservation of energy.

**Second law**

If a closed system is in a configuration that is not the equilibrium configuration, the most probable consequence will be that the entropy of the system will increase monotonically in successive instants of time. Another formulation of this law is the classical Kelvin‑Planck formulation: “It is impossible for any cyclic process to occur whose sole effect is the extraction of heat from a reservoir and the performance of an equal amount of work.”

**Third law**

The entropy of a system approaches a constant value as the temperature approaches zero.