
$$I = -\langle \ln p_i \rangle = -\sum_i p_i \ln p_i \qquad (4.2)$$

In eqn. 4.2, we use the shorthand notation $\langle \text{quantity} \rangle$ for the average value of a quantity. As we already mentioned, a calculation of the statistical entropy of the normal distribution shows that, up to an additive constant, it equals the natural logarithm of the standard deviation of the probability distribution.
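As a small worked illustration of eqn. 4.2 (ours, not part of the original text): for a system with W equally probable microstates,

$$p_i = \frac{1}{W} \quad\Rightarrow\quad I = -\sum_{i=1}^{W} \frac{1}{W}\,\ln\frac{1}{W} = \ln W$$

so the lacking information grows logarithmically with the number of microstates. We return to this form below, where it connects to Boltzmann's S = k ln W.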

The material presented above summarizes everything that we need to know about information theory for the purpose of this book.

4.4. Statistical thermodynamics.

Consider a macroscopic system. It can exist in a large number of microstates that comply with the macroscopic information we have available. We assume the probability density function known, as well as the energy $E_i$ of each microstate. We straightforwardly obtain the average energy by summing the energies of the microstates weighted according to their probability. This average is the value of the energy we most likely observe macroscopically.
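A minimal numerical sketch of this averaging, together with the lacking information of eqn. 4.2 (our illustration; the three microstates, their energies and their probabilities are made up):

import math

# Hypothetical microstates: energies (arbitrary units) and probabilities.
energies = [1.0, 2.0, 3.0]
probabilities = [0.5, 0.3, 0.2]  # sums to one

# Macroscopic energy: probability-weighted average over the microstates.
E = sum(p * e for p, e in zip(probabilities, energies))

# Lacking information, eqn. 4.2, in natural units (nats).
I = -sum(p * math.log(p) for p in probabilities)

print(E)  # 1.7
print(I)  # approximately 1.03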

To make contact with thermodynamics, we simply denote that average as the macroscopic energy E.

The problem of statistical thermodynamics is now to find the probability distribution over the microstates with energies $E_i$ that is consistent with the macroscopic information we have about the system (Hill (1960), Andrews (1975)). In statistical thermodynamics, we assume that the probability of a microstate depends uniquely on the energy of that microstate. This is a basic postulate and rests on the circumstance that the energy of a microstate is its only distinctive feature from the perspective of the macroscopic description. To assume anything other than sole dependence on the energy of the microstate presumes the availability of information we do not have, and this introduces undue bias. We do not present the mathematical detail but only cite the major results.

The literature cited provides ample access to the mathematics; Roels (2010) provides a summary.
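We record the central result for reference, since the later equations use the constant β it contains; in the standard treatments cited above it takes the form of the Boltzmann distribution (a sketch, not the author's derivation):

$$p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_j e^{-\beta E_j}$$

where the partition function Z normalizes the probabilities and β is the constant whose nature is discussed below (eqn. 4.7).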

The statistical analysis reveals that the change of the macroscopic energy of a system has two contributions (a compact statement of the decomposition follows the list):

• The first one represents the change of the macroscopic energy due to performing work on the system. We discuss this as the starting point for defining energy in Chapter 3.

• The second one represents a change in the information the observer has about the probabilities of the microstates; it stands for the effect of changes in the probability density function. We identify this second term as the contribution due to “information work” performed on the system. It represents the change in the information we have about the system’s microstate.
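In symbols (our sketch, using the averaging notation of eqn. 4.2): differentiating $E = \langle E_i \rangle = \sum_i p_i E_i$ gives

$$dE = \sum_i p_i\, dE_i + \sum_i E_i\, dp_i$$

The first sum is the work term (the energy levels shift while the probabilities stay put); the second sum is the information-work term (the probability density function changes).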

In Chapter 3, eqn. 3.1, we formulated an expression for the change of the macroscopic energy. It contains a contribution due to work performed on the system and a term due to heat exchanged with the environment. For convenience, we reproduce eqn. 3.1 here:

$$\frac{dE}{dt} = P + \Phi_Q \qquad (3.1)$$

The first term at the right-hand side of this equation corresponds to the first bullet point above; it represents the exchange of energy with the environment due to work performed on the collective microstates of the system.

The second bullet point above thus has to represent the exchange of heat. It corresponds to changes in the information we have about the microstate of the system, as expressed in changes of the probability density function. Heat is a statistical rather than a mechanical concept: it emerges as a direct consequence of the information that is lacking in a macroscopic approach to reality.

The analysis leads, using eqn. 4.2, straightforwardly to the following expression for the exchange of heat between the system and the environment:

$$\Phi_Q = -\frac{1}{\beta} \sum_i (\ln p_i)\, \frac{dp_i}{dt} \qquad (4.3)$$

We discuss the nature of the constant β later.

Combination of eqns. 4.3 and 4.2 leads to the following equality:

$$\Phi_Q = \frac{1}{\beta}\, \frac{dI}{dt} \qquad (4.4)$$

Eqn. 4.4 relates the heat flow to the rate of change of the lacking information in our reduced-information macroscopic picture of reality. It again highlights the statistical nature of heat.
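The step from eqns. 4.3 and 4.2 to eqn. 4.4 is short (our reconstruction of the omitted detail): differentiating eqn. 4.2 and using the fact that the probabilities sum to one, so that $\sum_i dp_i/dt = 0$, gives

$$\frac{dI}{dt} = -\sum_i \left(\ln p_i + 1\right) \frac{dp_i}{dt} = -\sum_i (\ln p_i)\, \frac{dp_i}{dt} = \beta\, \Phi_Q$$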

As the next step, we return to eqn. 3.7, the expression for the change of the system’s entropy:

$$\Phi_Q = T\, \frac{dS}{dt} \qquad (3.7)$$

Combining eqns. 3.7 and 4.4 results in:

$$T\, \frac{dS}{dt} = \frac{1}{\beta}\, \frac{dI}{dt} \qquad (4.5)$$

If we integrate both sides of eqn. 4.5, starting from a reference state to which we attribute zero entropy and a zero amount of lacking information, we obtain the following relation (taking T, and hence β, constant along the path of integration):

$$T S = \frac{1}{\beta}\, I \qquad (4.6)$$

This expression confirms the proportionality between entropy and lack of information.

A fundamental result of statistical thermodynamics shows that the constant β in eqn. 4.6 follows as:

$$\beta = \frac{1}{kT} \qquad (4.7)$$

where k is the Boltzmann constant.

Combining eqns. 4.7 and 4.6 leads to the famous equation that originates from the work of Boltzmann; it relates entropy and lacking information by the relation S = kI.
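Spelled out, the substitution runs:

$$T S = \frac{1}{\beta}\, I = kT\, I \;\Rightarrow\; S = k\, I$$

For W equally probable microstates, where I = ln W (see the example following eqn. 4.2), this reduces to Boltzmann's celebrated S = k ln W.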

The literature (see e.g. Brillouin (1962)) debates the nature of the factor β. kT equals the energy cost of one natural unit (nat) of information (eqn. 4.2 uses the natural logarithm, so one bit costs kT ln 2); at least, it puts a minimum on the energy cost of obtaining information beyond the macroscopic information. Careful analysis of e.g. the concept of the “Maxwell demon”, a genie assumed able to avoid paying the cost of information and hence to escape the constraints due to the second law, leads to the conclusion that there exists no way to avoid paying this minimum cost. This effectively exorcises the demon (Brillouin (1962)).
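To give this minimum cost a concrete scale (our numerical illustration, in the spirit of Landauer's later formulation of the bound): at room temperature, with T ≈ 300 K and k ≈ 1.38 × 10⁻²³ J/K,

$$kT \ln 2 \approx 1.38\times 10^{-23} \times 300 \times 0.693 \approx 2.9\times 10^{-21}\ \text{J per bit}$$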

To close this section we remind the reader again of the nature of the concept of statistical entropy that underlies the entropy state function as used in thermodynamics. The statistical entropy, a term we frequently use in this work, measures the information that our model of a system lacks if we want to pinpoint the microstate, i.e. the description representing the microscopic details of the system. The statistical entropy provides a quantification of the lack of information.


4.5. Conclusion.

In summary, we reach the following conclusions in this chapter. Eqn. 4.6 presents the statistical interpretation of entropy as introduced in macroscopic thermodynamics. It ties entropy to the limited information a macroscopic description provides about the exact microstate the system is in. This limited information results in the distinction between macroscopic intrinsic energy and free energy as introduced in Chapter 3:

$$G = E - T S \qquad (4.8)$$

Combining eqns. 4.8, 4.7 and 4.6 results in the statistical interpretation of free energy:

$$G = E - kT\, I \qquad (4.9)$$

The rationale behind eqn. 4.9 develops as follows. Because we have a reduced-information picture of reality, a restriction exists on the useful work that can result from a given quantity of potentially available energy. We pay a penalty for that lack of information; this penalty equals the product of the amount of information that is lacking and the value, in energy units, of that information. An in-depth study of the nature of that value (Brillouin (1962)) shows that we cannot avoid paying the penalty. Even if we use the cleverest way of gathering additional information about the exact microstate of the system, in order to free up part of the additional “true energy” of the system, we pay an energy cost that at best equals, and in general exceeds, the value of that information in energy units.
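For completeness, the combination referred to above, spelled out (our rendering of the substitution):

$$G = E - TS \;\overset{(4.6)}{=}\; E - \frac{1}{\beta}\, I \;\overset{(4.7)}{=}\; E - kT\, I$$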

By combining this interpretation with the second law, we see that this reduced information also sets the direction that natural processes take. Natural processes proceed in the direction of decreasing information or increasing entropy, at least in isolated systems. Alternatively phrased, if we combine the system and its environment (the environment being the rest of the universe outside the system) the information naturally decreases and entropy increases in any possible process. This means that the statistical meaning of the second law is very logical indeed. In any natural process, i.e. if there is no further interaction between the system and the environment, the system will evolve in the direction of the state that is the most likely one in view of the limited information we have; assuming anything else rests on unfounded bias.

CHAPTER 5. A GENERAL THEORY OF TRANSFORMATIONS AND ECONOMIC TRANSACTIONS.

5.1. Introduction.

In Chapter 3, we introduce macroscopic thermodynamics. Here we investigate the extension of this theory to include economic transactions. In this way, we arrive at a theory to describe transactions and other processes that appear in our economic system. We henceforth use the name Economic Value Theory, abbreviated EVT, to indicate this theory. In addition, we attempt to merge thermodynamics and EVT into one body of theory that encompasses both economic processes and processes in the purely physical sense: is it possible to combine thermodynamics and EVT in one theory? We introduced this theory in earlier work (Roels (2010)), where we termed it Value Transaction Theory, but we argue that the term Economic Value Theory better reflects the nature of the theory.

In fact, classical thermodynamics hardly deserves a name in which dynamics appears. Thermodynamics is, in its classical sense, an equilibrium theory; hence, there is nothing dynamic about it. It considers infinitely slow reversible changes, and that is not very dynamic at all. We consider dynamic processes in irreversible thermodynamics, a subject that we introduce in this chapter. Classical thermodynamics thus has an unfortunate name: it is a static theory, and it is not limited to processes involving only thermal effects, as the adjective “thermo” leads one to believe.

To investigate the formulation of an extended theory we return to the basic concepts of thermodynamics, i.e. energy, entropy and temperature and the laws of thermodynamics. In addition, we return to the heat engine that spawned thermodynamics following Carnot’s analysis of that contraption 185 years ago. In the heat engine, we find the inspiration to generalize our analysis.
