

Entropy

Entropy has distinct meanings in thermodynamics and information theory. This article discusses the thermodynamic entropy; there is a separate article on information entropy. In fact, the two types of entropy are closely related, and their relationship reveals deep connections between thermodynamics and information theory.

The thermodynamic entropy S is a measure of the amount of energy in a system which cannot be used to do work. It is also a measure of the disorder present in a system.


Thermodynamic definition of entropy

The concept of entropy was introduced in 1865 by Rudolf Clausius. He defined the change in entropy of a thermodynamic system, during a reversible process in which an amount of heat δQ is applied at constant absolute temperature T, as

<math>\delta S = \frac{\delta Q}{T}</math>

Clausius gave the quantity S the name "entropy", from the Greek word τροπή, "transformation". Note that this equation involves only a change in entropy, so the entropy itself is only defined up to an additive constant. Later, we will discuss an alternative definition which uniquely defines the entropy.
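As a concrete illustration, consider melting ice at its normal melting point, which is very nearly a reversible process at the constant temperature T = 273 K. Using the approximate latent heat of fusion of water, about 334 kJ per kilogram, the entropy gained by one kilogram of melting ice is roughly

<math>\delta S = \frac{\delta Q}{T} \approx \frac{3.34 \times 10^5\ \mathrm{J}}{273\ \mathrm{K}} \approx 1.2 \times 10^3\ \mathrm{J/K}</math>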

Entropy change in heat engines

Clausius' identification of S as a significant quantity was motivated by the study of reversible and irreversible thermodynamic transformations. In the next few sections, we will examine the steps leading to this identification, and its consequences for thermodynamics.

A thermodynamic transformation is a change in a system's thermodynamic properties, such as its temperature and volume. A transformation is said to be reversible if, at each successive step of the transformation, the system is infinitesimally close to equilibrium; otherwise, the transformation is said to be irreversible. As an example, consider a gas enclosed in a piston chamber, whose volume may be changed by moving the piston. A reversible volume change is one that takes place so slowly that the density of the gas always remains homogeneous. An irreversible volume change is one that takes place so quickly that pressure waves are created within the gas, putting it in a state of disequilibrium. Reversible processes are sometimes referred to as quasi-static processes.

A heat engine is a thermodynamic system that can undergo a sequence of transformations which ultimately return it to its original state. This sequence is called a cycle. During some transformations, the engine may exchange heat with large systems known as heat reservoirs, which have a fixed temperature and can absorb or provide an arbitrary amount of heat. The net result of a cycle is (i) work done by the system (which may be negative, in which case work is done on the system), and (ii) heat transferred between the heat reservoirs. By the conservation of energy, the net heat lost by the reservoirs during a cycle is exactly equal to the work done by the engine.

If every transformation in the cycle is reversible, the cycle is reversible. This means that it can be run in reverse, i.e. the heat transfers occur in the opposite direction and the amount of work done switches sign. The simplest reversible cycle is a Carnot cycle, which exchanges heat with two heat reservoirs.

In thermodynamics, absolute temperature is defined in the following way. Suppose we have two heat reservoirs. If a Carnot cycle absorbs an amount of heat Q from the first reservoir and delivers an amount of heat Q′ to the second, then the respective temperatures T and T′ are given by

<math>\frac{Q}{T} = \frac{Q'}{T'}</math>
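For example, a Carnot cycle operating between reservoirs at T = 400 K and T′ = 300 K (the numbers are chosen purely for illustration) that absorbs Q = 1000 J from the hotter reservoir must deliver

<math>Q' = Q\,\frac{T'}{T} = 1000\ \mathrm{J} \times \frac{300\ \mathrm{K}}{400\ \mathrm{K}} = 750\ \mathrm{J}</math>

to the colder reservoir; the remaining 250 J is delivered as work.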

Now consider a cycle of an arbitrary heat engine, during which the system exchanges heats Q1, Q2, ..., QN with a sequence of N heat reservoirs that have temperatures T1, ..., TN. We take each Q to be positive if it represents heat absorbed by the system, and negative if it represents heat lost by the system. We will show that

<math>\sum_{i=1}^N \frac{Q_i}{T_i} \le 0</math>

where the equality sign holds if the cycle is reversible.

To prove this, we introduce an additional heat reservoir at some arbitrary temperature T0, as well as N Carnot cycles that have the following property: the j-th such cycle operates between the T0 reservoir and the Tj reservoir, transferring heat Qj to the latter. From the above definition of temperature, this means that the heat extracted by the j-th cycle from the T0 reservoir is

<math>Q_{0,j} = T_0 \frac{Q_j}{T_j}</math>

We now consider one cycle of our arbitrary heat engine, accompanied by one cycle of each of the N Carnot cycles. At the end of this process, each of the reservoirs T1, ..., TN experiences no net change, since any heat extracted from it by the heat engine is replaced by the corresponding Carnot cycle. The net result is (i) an unspecified amount of work done by the heat engine, and (ii) a total amount of heat extracted from the T0 reservoir, equal to

<math>Q_0 = \sum_{j=1}^N Q_{0,j} = T_0 \sum_{j=1}^N \frac{Q_j}{T_j}</math>

If this quantity were positive, the net effect of the process would be to convert heat drawn from a single reservoir entirely into work; in other words, it would function as a perpetual motion machine of the second kind. The second law of thermodynamics states that this is impossible, so

<math>\sum_{i=1}^N \frac{Q_i}{T_i} \le 0</math>

as claimed. It is easy to show that the equality holds if the engine is reversible, by repeating the above argument for the reverse cycle.
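As a simple consequence of this inequality, consider an engine that absorbs heat Q1 from a single hot reservoir at temperature T1 and rejects heat Q2 to a single cold reservoir at temperature T2 (in the sign convention above, the exchanged heats are Q1 and −Q2). The inequality gives Q1/T1 − Q2/T2 ≤ 0, so the work W = Q1 − Q2 obtained per cycle satisfies

<math>\frac{W}{Q_1} = 1 - \frac{Q_2}{Q_1} \le 1 - \frac{T_2}{T_1}</math>

which is the Carnot bound on the efficiency of any heat engine operating between these two temperatures.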

We have used Tj to refer to the temperature of each heat reservoir with which the system comes into contact, not the temperature of the system itself. However, if the cycle is reversible, the system is always infinitesimally close to equilibrium, so its temperature must equal that of any reservoir with which it is in contact. If the cycle is not reversible, then because heat always flows from higher temperatures to lower temperatures,

<math>\frac{Q_j}{T_j} \le \frac{Q_j}{T}</math>

where T is the temperature of the system while it is in thermal contact with the heat reservoir.

The above arguments generalise to a cycle that consists of a continuous sequence of transformations. We obtain

<math>\oint \frac{dQ}{T} \le 0</math>

where the integral is taken over the entire cycle, and T is the temperature of the reservoir with which the system exchanges the heat dQ at each step; for a reversible cycle, this is also the temperature of the system itself. As before, the equality holds if the cycle is reversible.

Entropy as a state function

We can now deduce an important fact about the entropy change during any thermodynamic transformation, not just a cycle. First, consider a reversible transformation that brings a system from an equilibrium state A to another equilibrium state B. If we follow this with any reversible transformation that returns the system to state A, our above result says that the integral of dQ/T over the complete cycle is zero. Since the return transformation can be chosen arbitrarily, the value of the integral for the first transformation must depend only on the initial and final states A and B, not on the particular reversible path taken between them.

This allows us to define the entropy of any equilibrium state of a system. Choose a reference state R and call its entropy SR. The entropy of any equilibrium state X is then obtained by integrating along any reversible transformation that takes R to X:

<math>S_X = S_R + \int_R^X \frac{dQ}{T}</math>

Since the integral is independent of the particular reversible transformation chosen, this definition is unambiguous; the entropy is still determined only up to the additive constant SR.
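As an example, suppose the system consists of n moles of an ideal gas, the reference state R is the gas at temperature T and volume V0, and X is the gas at the same temperature but volume V. Along a reversible isothermal expansion the internal energy of an ideal gas does not change, so the heat absorbed equals the work done, dQ = (nRT/V′) dV′, and

<math>S_X = S_R + \int_{V_0}^{V} \frac{1}{T} \frac{nRT}{V'} \, dV' = S_R + nR \ln\frac{V}{V_0}</math>

(in this formula R denotes the gas constant, not the reference state).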

We now consider irreversible transformations. It is straightforward to show, by closing the transformation with a reversible path back to the initial state and applying the above inequality for cycles, that the entropy change during any transformation between two equilibrium states is

<math>\Delta S \ge \int \frac{dQ}{T}</math>

where the equality holds if the transformation is reversible.

Notice that if dQ = 0, then ΔS ≥ 0. The second law of thermodynamics is sometimes stated as this result: the total entropy of a thermally isolated system can never decrease.
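A simple illustration: suppose a small quantity of heat Q leaks from a hot body at temperature TH to a cold body at temperature TC inside a thermally isolated enclosure, and that both bodies are large enough that their temperatures remain essentially unchanged. The hot body loses entropy Q/TH while the cold body gains entropy Q/TC, so the total change is

<math>\Delta S = \frac{Q}{T_C} - \frac{Q}{T_H} > 0 \qquad (T_H > T_C)</math>

in accordance with the statement above: the spontaneous flow of heat from hot to cold increases the total entropy of the isolated system.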

Suppose a system is thermally isolated but remains in mechanical contact with the environment. If it is not in mechanical equilibrium with the environment, it will do work on the environment, or vice versa. For example, consider a gas enclosed in a piston chamber whose walls are perfect thermal insulators. If the pressure of the gas differs from the pressure applied to the piston, it will expand or contract, and work will be done. Our above result indicates that the entropy of the system will increase during this process (it could in principle remain constant, but this is unlikely.) Typically, there exists a maximum amount of entropy the system may possess under the circumstances. This entropy corresponds to a state of stable equilibrium, since a transformation to any other equilibrium state would cause the entropy to decrease, which is forbidden. Once the system reaches this maximum-entropy state, no more work may be done.

Statistical definition of entropy: Boltzmann's Principle

In 1877, Boltzmann realised that the entropy of a system may be related to the number of possible "microstates" (microscopic states) consistent with its thermodynamic properties. Consider, for example, an ideal gas in a container. A microstate is specified by the positions and momenta of every constituent atom. Consistency requires us to consider only those microstates for which (i) the positions of all the particles lie within the volume of the container, (ii) the kinetic energies of the atoms sum to the total energy of the gas, and so forth. Boltzmann then postulated that

<math>S = k \ln \Omega</math>

where k is known as Boltzmann's constant and Ω is the number of consistent microstates. This postulate, which is known as Boltzmann's principle, may be regarded as the foundation of statistical mechanics, which describes thermodynamic systems in terms of the statistical behaviour of their constituents. It relates a microscopic property of the system (Ω) to one of its thermodynamic properties (S).

Under Boltzmann's definition, the entropy is clearly a function of state. Furthermore, since Ω is a natural number (1, 2, 3, ...), the entropy must be non-negative; it is zero precisely when Ω = 1, since ln 1 = 0.

Entropy as a measure of disorder

We can view Ω as a measure of the disorder in a system. This is reasonable because what we think of as "ordered" systems tend to have very few configurational possibilities, and "disordered" systems have very many. Consider, for example, a set of 10 coins, each of which is either heads up or tails up. The most "ordered" macroscopic states are 10 heads or 10 tails; in either case, there is exactly one configuration that can produce the result. In contrast, the most "disordered" state consists of 5 heads and 5 tails, and there are C(10,5) = 252 ways to produce this result (see combinatorics).
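The counting in this example is easy to reproduce. The following short Python sketch tabulates the number of configurations Ω for each possible number of heads among the 10 coins, together with the corresponding entropy k ln Ω from Boltzmann's formula:

    import math

    k = 1.380649e-23   # Boltzmann's constant in J/K
    N = 10             # number of coins

    for heads in range(N + 1):
        omega = math.comb(N, heads)   # number of configurations with this many heads
        S = k * math.log(omega)       # Boltzmann's formula S = k ln(omega)
        print(heads, omega, S)

The fully ordered macrostates (0 or 10 heads) have Ω = 1 and hence zero entropy, while the most disordered macrostate (5 heads) has Ω = 252 and the largest entropy.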

Under the statistical definition of entropy, the second law of thermodynamics states that the disorder in an isolated system tends to increase. This can be understood using our coin example. Suppose that we start off with 10 heads, and re-flip one coin at random every minute. If we examine the system after a long time has passed, it is possible that we will still see 10 heads, or even 10 tails, but that is not very likely; it is far more probable that we will see approximately as many heads as tails.
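This tendency is easy to see in a small simulation. The sketch below, which is purely illustrative, starts from 10 heads and repeatedly re-flips one randomly chosen coin; after many steps the number of heads is almost always close to 5, and a return to all heads is rare:

    import random

    N = 10
    coins = ['H'] * N                    # start with 10 heads
    for minute in range(1000):           # re-flip one randomly chosen coin per step
        i = random.randrange(N)
        coins[i] = random.choice('HT')   # the re-flipped coin lands heads or tails at random

    print(coins.count('H'), "heads after 1000 re-flips")

Only 1 of the 2^10 = 1024 equally likely configurations is all heads, whereas 252 of them have exactly 5 heads, so a roughly even mixture is by far the most probable outcome.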

Since its discovery, the idea that disorder tends to increase has been the focus of a great deal of thought, some of it confused. A chief point of confusion is the fact that the result ΔS ≥ 0 applies only to isolated systems; notably, the Earth is not an isolated system because it is constantly receiving energy in the form of sunlight. Nevertheless, it has been pointed out that the universe may be considered an isolated system, so that its total disorder should be constantly increasing. It has been speculated that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy, so that no more work can be extracted from any source.

Counting of microstates

In classical statistical mechanics, the number of microstates is actually infinite, since the properties of classical systems are continuous. For example, a microstate of a classical ideal gas is specified by the positions and momenta of all the atoms, which range continuously over the real numbers. Therefore, a method of "classifying" the microstates must be invented if we are to define Ω. In the case of the ideal gas, we count two states of an atom as the "same" state if their positions and momenta are within δx and δp of each other. Since the values of δx and δp can be chosen quite arbitrarily, the entropy is not uniquely defined; it is in fact defined only up to an additive constant, as before. This grouping of microstates is called coarse graining, and has its counterpart in the choice of basis states in quantum mechanics.

This ambiguity is partly resolved by quantum mechanics. The quantum state of a system can be expressed as a superposition of basis states, which are typically chosen to be eigenstates of the unperturbed Hamiltonian. In quantum statistical mechanics, Ω refers to the number of basis states consistent with the thermodynamic properties. Since this set of basis states is countable, we can define Ω.

However, the choice of the set of basis states is still somewhat arbitrary; it corresponds to the choice of how the microstates are coarse grained into distinct macrostates, just as in the classical case.

This leads to Nernst's theorem, sometimes referred to as the third law of thermodynamics, which states that the entropy of a system at zero absolute temperature is a well-defined constant. This is due to the fact that a system at zero temperature exists in its ground state, so that its entropy is determined by the degeneracy of the ground state. Many systems, such as crystal lattices, have a unique ground state, and therefore have zero entropy at absolute zero (since ln(1) = 0).
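In terms of Boltzmann's principle, a ground state with degeneracy g gives a zero-temperature entropy

<math>S(T=0) = k \ln g</math>

which vanishes when the ground state is unique (g = 1), but remains non-zero for systems such as ice, discussed below, in which a very large number of configurations remain accessible even at the lowest temperatures.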

Measuring entropy

In real experiments, it is quite difficult to measure the entropy of a system. The techniques for doing so are based on the thermodynamic definition of the entropy, and require extremely careful calorimetry.

For simplicity, we will examine a mechanical system, whose thermodynamic state may be specified by its volume V and pressure P. In order to measure the entropy of a specific state, we must first measure the heat capacity at constant volume and at constant pressure (denoted CV and CP respectively), for a successive set of states intermediate between a reference state and the desired state. The heat capacities are related to the entropy S and the temperature T by

<math>C_X = T \left(\frac{\partial S}{\partial T}\right)_X</math>

where the X subscript refers to either constant volume or constant pressure; the relation follows from dS = dQ/T together with the fact that dQ = CX dT when X is held fixed. This may be integrated numerically to obtain a change in entropy:

<math>\Delta S = \int \frac{C_X}{T} dT</math>

We can thus obtain the entropy of any state (P,V) with respect to a reference state (P0,V0). The exact formula depends on our choice of intermediate states. For example, if the reference state has the same pressure as the final state,

<math> S(P,V) = S(P, V_0) + \int^{T(P,V)}_{T(P,V_0)} \frac{C_P(P,V(T,P))}{T} dT </math>

In addition, if the path between the reference and final states lies across any first order phase transition, the latent heat associated with the transition must be taken into account.
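As a sketch of how the integration described above might be carried out in practice (the heat-capacity values below are invented purely for illustration), one can tabulate the measured heat capacity at a series of temperatures and integrate CP/T numerically, for example with the trapezoidal rule:

    # Hypothetical measurements: temperatures (K) and heat capacity C_P (J/K)
    # at points between the reference state and the state of interest.
    T_vals  = [100.0, 150.0, 200.0, 250.0, 300.0]
    CP_vals = [20.0, 24.0, 27.0, 29.0, 30.0]

    # Trapezoidal integration of C_P / T with respect to T gives the entropy change.
    delta_S = 0.0
    for j in range(len(T_vals) - 1):
        avg = 0.5 * (CP_vals[j] / T_vals[j] + CP_vals[j + 1] / T_vals[j + 1])
        delta_S += avg * (T_vals[j + 1] - T_vals[j])

    print("Delta S =", delta_S, "J/K")

In a real measurement the temperature grid would be much finer, and, as noted above, any latent heats of first-order transitions crossed along the path would have to be added separately.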

The entropy of the reference state must be determined independently. Ideally, one chooses a reference state at an extremely high temperature, at which the system exists as a gas. The entropy in such a state would be that of a classical ideal gas plus contributions from molecular rotations and vibrations, which may be determined spectroscopically. Choosing a low temperature reference state is sometimes problematic since the entropy at low temperatures may behave in unexpected ways. For instance, a calculation of the entropy of ice using a low-temperature reference state, and assuming no entropy at zero temperature, falls short of the value obtained with a high-temperature reference state by 3.41 J/(K·mol). This is due to the fact that the molecular crystal lattice of ice exhibits geometrical frustration, and thus possesses a non-vanishing "zero-point" entropy at arbitrarily low temperatures.





