THE UNIVERSITY OF BRITISH COLUMBIA

PHYSICS 455

Lecture # 3 :

Fri. 10 Jan. 1997

Equilibrium


I. Ensembles

So far we have been talking about an isolated system with a fixed number of particles N and a fixed total energy U. The collection of all possible fully specified microscopic quantum states of such a system is called its microcanonical ensemble (µCE for short). In my notational convention this can be written { | ⟩ }, where the curly brackets denote ``the set of all...'' The number of different | ⟩'s in the set is the multiplicity g, and the natural logarithm of g is the entropy σ = ln g. This is about all the µCE is good for, except cocktail party conversation (mystify your friends!) and defining the temperature:
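To make the counting concrete, here is a minimal Python sketch (mine, not part of the original notes) for the system we will meet again below: N spin-1/2 particles in a magnetic field. Fixing the number of ``up'' spins fixes U, so the µCE multiplicity g is just a binomial coefficient and σ = ln g; the function names and the numbers 20 and 15 are arbitrary illustrative choices.

    from math import comb, log

    def multiplicity(N, n_up):
        """Number of microstates g of N spin-1/2 particles with exactly n_up
        spins up.  Fixing n_up fixes the total energy U in a magnetic field,
        so this set of states is the microcanonical ensemble for that U."""
        return comb(N, n_up)

    def entropy(N, n_up):
        """Dimensionless entropy sigma = ln g."""
        return log(multiplicity(N, n_up))

    # Arbitrary illustrative numbers: 20 spins, 15 of them up.
    print(multiplicity(20, 15))   # g = 15504
    print(entropy(20, 15))        # sigma = ln(15504), about 9.65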

II. Temperature

In most circumstances, adding some more energy dU to a system will change its entropy - usually with more U there will be more ways to distribute it, so we normally expect the entropy to increase when we add some energy. So what happens when we allow two µCE's to exchange U (but not N) through a ``diathermal wall'' (one that passes U but not N)? The two systems taken together are still an isolated system, so any dU1 added to system 1 must be taken away from system 2: dU2 = - dU1. A particular division of U between U1 and U2 is called a configuration, and the most probable configuration (MPC) is the one for which the combined multiplicity of the two systems, gtot = g1 g2, is a maximum; a maximum implies an extremum, which implies dgtot/dU1 = 0. Thus the criterion for the MPC is

0 = (∂g1/∂U1) g2 + g1 (∂g2/∂U1)

or, since U2 = U - U1 implies ∂g2/∂U1 = - ∂g2/∂U2,

(1/g1) ∂g1/∂U1 = (1/g2) ∂g2/∂U2

which (since σ = ln g) is the same as

∂σ1/∂U1 = ∂σ2/∂U2

That is, the MPC will be the one for which both systems have the same ∂σ/∂U. We intuitively expect that two systems will stop exchanging heat (except for small fluctuations) when they are at the same temperature, so we should expect the derivative of the entropy with respect to the energy to be equivalent to temperature.
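Here is a toy numerical check of that claim (my own sketch, not from the notes): two spin-1/2 systems share a fixed total number of ``up'' spins, which stands in for sharing U, since in a common field the energy of each system is proportional to its spin excess. Enumerating every configuration and maximizing g1 g2 picks out the MPC, and at that configuration finite-difference estimates of the two entropy derivatives agree, as the argument above requires. The sizes N1 = 40, N2 = 60 and the total of 70 up spins are arbitrary.

    from math import comb, log

    # Two spin-1/2 systems sharing a fixed total number of "up" spins, n_tot.
    # In a common field B the energy of each system is proportional to its
    # spin excess, so sharing n plays the role of sharing U.
    N1, N2, n_tot = 40, 60, 70

    # Enumerate every configuration (division of n_tot between the two systems)
    # and pick the one with the largest combined multiplicity g1*g2.
    configs = [(comb(N1, n1) * comb(N2, n_tot - n1), n1)
               for n1 in range(max(0, n_tot - N2), min(N1, n_tot) + 1)]
    g_max, n1_star = max(configs)
    n2_star = n_tot - n1_star
    print("most probable configuration:", n1_star, n2_star)   # 28 and 42 here

    # At the MPC the two entropy derivatives (finite-differenced in n) agree:
    ds1 = log(comb(N1, n1_star + 1)) - log(comb(N1, n1_star))
    ds2 = log(comb(N2, n2_star + 1)) - log(comb(N2, n2_star))
    print(ds1, ds2)   # about -0.88 and -0.87: equal to within the discreteness of n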

Unfortunately this derivative has the opposite implication from what we expect of temperature: the system with the higher ∂σ/∂U will tend to absorb heat from the system with the lower ∂σ/∂U. Because we have already gotten used to the idea that heat will flow spontaneously from regions of high T to regions of low T, we associate our derivative with the inverse temperature

1/τ = ∂σ/∂U

where (since the entropy σ here is a pure number) the natural units for the temperature τ are energy units. If you really want to convert the entropy and temperature into conventional units, use

S = kB σ and τ = kB T

where kB = 1.381 × 10⁻²³ J/K is the Boltzmann constant.
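The conversion is trivial, but for completeness here is a one-line-per-quantity sketch (illustrative only) using the value of kB quoted above:

    k_B = 1.381e-23   # J/K, the Boltzmann constant quoted above

    def S_conventional(sigma):
        """Conventional entropy S (J/K) from the dimensionless entropy sigma."""
        return k_B * sigma

    def T_conventional(tau):
        """Conventional temperature T (K) from the fundamental temperature tau (J)."""
        return tau / k_B

    # e.g. a fundamental temperature tau of about 4.14e-21 J is room temperature:
    print(T_conventional(4.14e-21))   # about 300 K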

The graphs above show the behaviour of the entropy, inverse temperature and temperature for a system of N spin-1/2 particles in a magnetic field B. In most realistic circumstances such a system is in thermal equilibrium with the rest of the world (usually referred to as ``the lattice'' in deference to the fact that most spins are found in crystalline solids) at a fairly high positive temperature - i.e., just to the left of the centre of the diagram, corresponding to a modest excess of ``up'' spins. However, it is possible in some cases to change the magnetic field more rapidly than the spin system can equilibrate thermally, leaving the option of simply reversing the direction of B and therefore the sign of U and therefore the sign of τ - yes, one can, without too much trouble, produce negative temperatures in the laboratory!

Does this mean we can produce systems colder than absolute zero? Not at all. The inverse temperature 1/τ is a smooth and monotonic function of U over the whole range of possible energies of the system, and it is just our insistence on adapting the theory to fit our conventions about ``high'' and ``low'' temperature that produces this nasty divergence. A negative infinite temperature is slightly hotter than a positive infinite temperature, and the hottest temperature you can get is negative zero! If you need to predict the actual behaviour of thermal systems, it is best to go back to the definition of thermal equilibrium.
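To see numerically that 1/τ really is smooth and monotonic across the sign change, here is a toy sketch (arbitrary N, m and B; not part of the notes) for N spin-1/2 moments: a finite-difference estimate of 1/τ = ∂σ/∂U is large and positive near the lowest energy, passes through zero at U = 0, and is negative for U > 0, the population-inverted region described above.

    from math import comb, log

    # N spin-1/2 moments of magnetic moment m in a field B (arbitrary units).
    N, m, B = 100, 1.0, 1.0

    def sigma(n_up):
        """Dimensionless entropy: sigma = ln g with g = C(N, n_up)."""
        return log(comb(N, n_up))

    def U(n_up):
        """Energy in the field: U = -(n_up - n_down) m B = -(2 n_up - N) m B."""
        return -(2 * n_up - N) * m * B

    # Central-difference estimate of 1/tau = d(sigma)/dU across the whole range:
    for n_up in (95, 75, 50, 25, 5):
        inv_tau = (sigma(n_up - 1) - sigma(n_up + 1)) / (U(n_up - 1) - U(n_up + 1))
        print(f"U = {U(n_up):+6.0f}   1/tau = {inv_tau:+.3f}")
    # Output runs from 1/tau of about +1.43 at U = -90 (mostly aligned spins),
    # through 0 at U = 0, to about -1.43 at U = +90 (population inversion):
    # 1/tau is smooth and monotonic in U even though T itself "diverges" midway.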

Fortunately the exotic features of a spin system are characteristic only of closed systems with an upper limit on their energy, which we shall rarely encounter elsewhere in physics.

III. Analogues of Thermal Equilibrium

It seems to me that one can use the notion of thermal equilibrium to draw qualitative conclusions about any sort of system involving a lot of apparently random redistributions - for instance, economic systems, where wealth (the analogue of energy) is constantly being redistributed for reasons that may seem very logical to the participants in the process but are apt to look pretty random to the economist studying the ``big picture.'' This is enough like our viewpoint in Stat Mech that I assigned a somewhat frivolous essay on the subject as your first assignment. Now let me stick my neck out a bit:

Insofar as an ideal socialist economy should redistribute any newly acquired (or created) wealth equally between all N members of society, there is only one microstate of the system and g = 1 no matter how much wealth the society possesses. By contrast, an ideal laissez-faire capitalist economy has all N members of society constantly striving to get a larger piece of the available U (wealth) and any new wealth added to the system will drastically increase the number of different ways it can be redistributed.

By our definitions, then, σs = 0 and σc is an increasing function of U, where the subscripts stand for the socialist and capitalist systems. Then ∂σ/∂U is always zero for the socialist system and larger than zero for the capitalist system, implying an infinite economic ``temperature'' for the former and a finite one for the latter. What happens when a high-temperature system is put in thermal contact with a low-temperature one? Hmm.
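For what it's worth, the counting behind that claim can be made explicit. The sketch below is a deliberately crude toy: the ``stars and bars'' count of ways to split U indivisible units of wealth among N people is my own assumption, not anything from the lecture, but it shows σs staying at zero while σc grows with U.

    from math import comb, log

    def g_socialist(U, N):
        """Equal-share economy: every unit of wealth is split evenly, so there
        is exactly one allowed distribution, whatever U is."""
        return 1

    def g_capitalist(U, N):
        """Toy free-market economy: U indivisible units of wealth may end up
        with anyone, so count the ways to split U among N people
        ("stars and bars"); this counting is an assumption of the toy model."""
        return comb(U + N - 1, N - 1)

    N = 10
    for U in (10, 20, 40):
        print(U, log(g_socialist(U, N)), round(log(g_capitalist(U, N)), 2))
    # sigma_s stays 0 no matter how much wealth is added (d(sigma)/dU = 0, i.e.
    # infinite economic "temperature"), while sigma_c keeps growing with U
    # (finite "temperature"), as claimed above.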

I propose that the most helpful thing the United States could possibly have done for Castro's Cuba was to try to isolate it economically. In the natural, random course of trade with Yankee money-grubbers, all the wealth in Cuba would have long ago trickled out into the USA and gotten lost in all the disorder, had the USA only welcomed Cuban traders with open arms. Not for any particular reason, of course, any more than there is a reason for heat to flow from a hot system to a cold one; there are just a lot more ways to be random when the temperatures are equal.

Enough sophistry. You decide for yourselves whether you think economic theory could benefit from the paradigms of Stat Mech. For now, let's get back to energy, entropy, temperature and physics.