
Page 1:

Information and Thermodynamic Entropy

John D. Norton
Department of History and Philosophy of Science
Center for Philosophy of Science
University of Pittsburgh

1

Pitt-Tsinghua Summer School for Philosophy of Science
Institute of Science, Technology and Society, Tsinghua University
Center for Philosophy of Science, University of Pittsburgh
At Tsinghua University, Beijing, June 27 - July 1, 2011

Page 2:

Philosophy and Physics

Information (ideas and concepts) = Entropy (heat, work, thermodynamics)

And why not?

Mass = Energy
Particles = Waves
Geometry = Gravity…
Time = Money

2

Page 3:

This Talk

Background

Maxwell’s demon and the molecular challenge to the second law of thermodynamics.

Exorcism by principle: Szilard’s Principle, Landauer’s Principle

3

Foreground

Failed proofs of Landauer’s Principle: thermalization, compression of phase space, information entropy, indirect proof

The standard inventory of processes in the thermodynamics of computation neglects fluctuations.

Page 4:

Fluctuations and Maxwell’s demon

4

Page 5:

The original conception

J. C. Maxwell in a letter to P. G. Tait, 11th December 1867

“…the hot system has got hotter and the cold system colder and yet no work has been done, only the intelligence of a very observant and neat-fingered being has been employed.”

Divided chamber with a kinetic gas.

Demon operates door intelligently

“[T]he 2nd law of thermodynamics has the same degree of truth as the statement that if you throw a tumblerful of water into the sea you cannot get the same tumblerful of water out again.”

5

Page 6:

Maxwell’s demon lives in the details of Brownian motion and other fluctuations.

“…we see under our eyes now motion transformed into heat by friction, now heat changed inversely into motion, and that without loss since the movement lasts forever. That is the contrary of the principle of Carnot.”

Poincaré, 1907

Could these momentary, miniature violations of the second law be accumulated to large-scale violations?

Gouy (1888) and Svedberg (1907) designed mini-machines for that purpose.

6

“One can almost see Maxwell’s demon at work.”

Poincaré, 1905

Page 7:

Szilard’s One-Molecule Engine

7

Page 8:

Simplest case of fluctuations

Many molecules

A few molecules

8

One molecule

Can a demon exploit these fluctuations?

Page 9:

The One-Molecule Engine

Szilard 1929

Initial state: a partition is inserted to trap the molecule on one side.

The gas then undergoes a reversible, isothermal expansion to its original state. Work kT ln 2 is gained in raising the weight. It comes from the heat kT ln 2 drawn from the heat bath.

Net effect of the completed cycle:

Heat kT ln 2 is drawn from the heat bath and fully converted to work.

The total entropy of the universe decreases by k ln 2.

The Second Law of Thermodynamics is violated.
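A short calculation behind these figures, added here as a sketch; it uses the one-molecule ideal gas law P = kT/V, which appears later in the talk but is not written on this slide.

```latex
% Reversible isothermal expansion of the one-molecule gas from V/2 to V.
\[
  W \;=\; \int_{V/2}^{V} P\,dV' \;=\; \int_{V/2}^{V} \frac{kT}{V'}\,dV' \;=\; kT\ln 2 ,
  \qquad
  Q \;=\; W \;=\; kT\ln 2 \quad \text{(isothermal, so the internal energy is unchanged)} .
\]
\[
  \Delta S_{\text{bath}} \;=\; -\frac{Q}{T} \;=\; -k\ln 2 ,
  \qquad
  \Delta S_{\text{gas}} \;=\; 0 \ \text{over the completed cycle}
  \quad\Longrightarrow\quad
  \Delta S_{\text{universe}} \;=\; -k\ln 2 .
\]
```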


Page 11:

Exorcism by principle

11

Page 12:

Szilard’s Principle versus Landauer’s Principle

12

Szilard’s Principle: Acquisition of one bit of information creates k ln 2 of thermodynamic entropy. (Von Neumann 1932, Brillouin 1951+…)
Proof: By “working backwards.” By suggestive thought experiments (e.g. Brillouin’s torch).

Landauer’s Principle: Erasure of one bit of information creates k ln 2 of thermodynamic entropy. (Landauer 1961, Bennett 1987+…)
Szilard’s principle is false. The real entropy cost is incurred only when the naturalized demon erases its memory of the position of the molecule.
Proof: …???...

Page 13:

Failed proofs of Landauer’s Principle

13

Page 14:

Direct Proofs that model the erasure processes in the memory device directly.

14

1. Thermalization
An inefficiently designed erasure procedure creates entropy. No demonstration that all must.

2. Phase Volume Compression (aka “many to one argument”)
Erasure need not compress phase volume but only rearrange it.

3. Information-theoretic Entropy “p ln p”
Associates entropy with our uncertainty over which memory cell is occupied. Wrong sort of entropy; no connection to heat.

See: "Eaters of the Lotus: Landauer's Principle and the Return of Maxwell's Demon." Studies in History and Philosophy of Modern Physics, 36 (2005), pp. 375-411.

Page 15:

4. Indirect Proof: General Strategy

15

An arbitrary erasure process is coupled to a process known to reduce entropy.

Assume the second law of thermodynamics holds on average, so that total entropy must increase on average. Since the coupled process reduces entropy by a known amount, the erasure must create at least that much entropy on average.

Page 16:

4. An Indirect Proof

16

Ladyman et al., “The connection between logical and thermodynamic irreversibility,” 2007.

A one-molecule gas is coupled to a one-molecule memory. The steps: insert the partition into the gas chamber; dissipationlessly detect the gas state (L or R); shift the memory cell to match; carry out an isothermal, reversible expansion of the gas, which reduces the entropy of the heat bath by k ln 2; finally, perform any erasure on the memory.

Assume the second law of thermodynamics holds on average. Then the erasure must create entropy k ln 2 on average.

The original proof was given only in terms of quantities of heat passed among components.
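A minimal entropy ledger for one cycle of this coupled protocol, stated as a sketch under the slide's own assumptions (detection and the cell shift pass no heat, and both gas and memory return to their initial states over the cycle):

```latex
% Entropy bookkeeping for one cycle of the coupled gas-memory protocol.
\begin{align*}
  \Delta S_{\text{bath}} &= -\frac{kT\ln 2}{T} = -k\ln 2
    &&\text{(reversible isothermal expansion of the gas)}\\
  \Delta S_{\text{gas}} &= 0, \qquad \Delta S_{\text{memory}} = 0
    &&\text{(the cycle returns both to their initial states)}\\
  0 &\le \Delta S_{\text{total}} = \Delta S_{\text{erasure}} - k\ln 2
    &&\text{(second law, on average)}\\
  &\Longrightarrow\; \Delta S_{\text{erasure}} \ge k\ln 2 .
\end{align*}
```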

Page 17:

4. An Indirect Proof

17

Fails

Inventory of admissible processes allows:

Processes that violate the second law of thermodynamics, even in its statistical form.

Processes that erase dissipationlessly (without passing heat to surroundings) in violation of Landauer’s principle.

See: “Waiting for Landauer,” Studies in History and Philosophy of Modern Physics, forthcoming.

Page 18:

Dissipationless Erasure

18


First method.

1. Dissipationlessly detect memory state.

2. If R, shift to L.

Second method.

1. Dissipationlessly detect memory state.

2. If R, remove and reinsert the partition and go to 1. Else, halt.
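A small Monte Carlo sketch of the second method, not from the slides: it assumes, as the protocol does, that each reinsertion of the partition traps the molecule on either side with probability 1/2 and that detection passes no heat.

```python
import random

def erase_by_reinsertion(state, rng=random):
    """Second erasure method: detect the memory state; while it reads R,
    remove and reinsert the partition (re-trapping the molecule on a random
    side) and detect again.  Halts with the memory reading L."""
    reinsertions = 0
    while state == "R":                  # dissipationless detection
        state = rng.choice(["L", "R"])   # remove and reinsert the partition
        reinsertions += 1
    return reinsertions

# Expected number of reinsertions, starting from random data:
# half the runs need none; the rest follow a geometric law with mean 2.
trials = [erase_by_reinsertion(random.choice(["L", "R"])) for _ in range(100_000)]
print(sum(trials) / len(trials))   # ~1.0
```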

Page 19:

The Importance of Fluctuations

19

Page 20:

Marian Smoluchowski, 1912

20

Exorcism of Maxwell’s demon by fluctuations.

The best known of many examples.

Trapdoor hinged so that fast molecules moving from left to right swing it open and pass, but not vice versa.

The second law holds on average only over time. Machines that try to accumulate fluctuations are disrupted fatally by them.

BUT

The trapdoor must be very light so a molecule can swing it open.

AND

The trapdoor has its own thermal energy of kT/2 per degree of freedom.

SO

The trapdoor will flap about wildly and let molecules pass in both directions.

Page 21:

Fluctuations disrupt Reversible Expansion and Compression

21

Page 22:

The Intended Process

22

Infinitely slow expansion converts heat to work in the raising of the mass.

Mass M of the piston is continually adjusted so that its weight remains in perfect balance with the mean gas pressure P = kT/V.

Equilibrium height is h_eq = kT/Mg.
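A short force balance behind the equilibrium height, added as a sketch; the piston area A and the relation V = Ah are assumptions not written on the slide.

```latex
% Weight of the piston balances the mean pressure of the one-molecule gas.
% A is the piston area (not named on the slide); the gas volume is V = A h.
\[
  Mg \;=\; P A \;=\; \frac{kT}{V}\,A \;=\; \frac{kT}{A h}\,A \;=\; \frac{kT}{h}
  \quad\Longrightarrow\quad
  h_{\mathrm{eq}} \;=\; \frac{kT}{Mg} .
\]
```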

Page 23:

The massive piston…

23

…is very light, since it must be supported by collisions with a single molecule. It has mean thermal energy kT/2 and will fluctuate in position.

Probability density for the piston at height h:

p(h) = (Mg/kT) exp(-Mgh/kT)

Mean height = kT/Mg = h_eq

Standard deviation = kT/Mg = h_eq
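A quick numerical check, not from the slides, that for this exponential density the spread of the piston's height is as large as its mean; kT/Mg is set to 1 in arbitrary units.

```python
import random
import statistics

# Piston height density from the slide: p(h) = (Mg/kT) exp(-Mgh/kT),
# an exponential distribution with scale kT/Mg.  Take kT/Mg = 1.
h_eq = 1.0
samples = [random.expovariate(1.0 / h_eq) for _ in range(200_000)]

print("mean height        :", round(statistics.mean(samples), 3))   # ~1.0 = h_eq
print("standard deviation :", round(statistics.pstdev(samples), 3)) # ~1.0 = h_eq
# The fluctuations are as large as the mean: the piston has no sharp height,
# so the intended infinitely slow expansion is swamped by thermal motion.
```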

Page 24:

What Happens.

24

Fluctuations obliterate the intended infinitely slow expansion.

This analysis is approximate. The exact analysis replaces the gravitational field with a piston energy = 2kT ln(height).

Page 25:

Fluctuations disrupt Measurement and Detection

25

Page 26:

Bennett’s Machine for Dissipationless Measurement…

Measurement apparatus, designed by the author to fit the Szilard engine, determines which half of the cylinder the molecule is trapped in without doing appreciable work. A slightly modified Szilard engine sits near the top of the apparatus (1) within a boat-shaped frame; a second pair of pistons has replaced part of the cylinder wall. Below the frame is a key, whose position on a locking pin indicates the state of the machine's memory. At the start of the measurement the memory is in a neutral state, and the partition has been lowered so that the molecule is trapped in one side of the apparatus. To begin the measurement (2) the key is moved up so that it disengages from the locking pin and engages a "keel" at the bottom of the frame. Then the frame is pressed down (3). The piston in the half of the cylinder containing no molecule is able to descend completely, but the piston in the other half cannot, because of the pressure of the molecule. As a result the frame tilts and the keel pushes the key to one side. The key, in its new position, is moved down to engage the locking pin (4), and the frame is allowed to move back up (5), undoing any work that was done in compressing the molecule when the frame was pressed down. The key's position indicates which half of the cylinder the molecule is in, but the work required for the operation can be made negligible. To reverse the operation one would do the steps in reverse order.

Charles H. Bennett, “Demons, Engines and the Second Law,” Scientific American 257(5):108-116 (November, 1987).

26

…is fatally disrupted by fluctuations that leave the keel rocking wildly.

FAILS

Page 27:

A Measurement Scheme Using Ferromagnets

27

Charles H. Bennett, “The Thermodynamics of Computation—A Review,” Int. J. Theor. Phys. 21 (1982), pp. 905-40.


Page 29:

A General Model of Detection

29

First step: the detector is coupled with the target system.

The process is isothermal, thermodynamically reversible:

• It proceeds infinitely slowly.

• The driver is in equilibrium with the detector.

The process intended:

The coupling is an isothermal, reversible compression of the detector phase space.

Page 30:

A General “No-Go” Result

30

Page 31:

Fluctuations Disrupt All Reversible, Isothermal Processes at Molecular Scales

31

[Figure: the intended process advances from stage λ = 1 to stage λ = 2; the actual process fluctuates between the stages.]

Page 32:

Einstein-Tolman Analysis of Fluctuations

32

The total system of gas-piston (or of target-detector-driver) is canonically distributed over its phase space:

p(x, λ) = (1/Z) exp(-E(x, λ)/kT)

The different stages λ of the process correspond to different subvolumes of the phase space. The probability that the system is in stage λ is proportional to the partition integral over that subvolume:

Z(λ) = ∫ exp(-E(x, λ)/kT) dx,    p(λ) proportional to Z(λ)

The free energy of stage λ is F(λ) = -kT ln Z(λ), so the probability of a fluctuation to stage λ is

p(λ) proportional to exp(-F(λ)/kT)

and for any two stages

p(2)/p(1) = exp(-[F(2) - F(1)]/kT)
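The step from the canonical distribution to the fluctuation formula, written out as a sketch in the notation above; Z with no argument is the partition function of the whole system.

```latex
% Marginalize the canonical distribution over the subvolume for stage lambda.
\begin{align*}
  p(\lambda) &= \int_{\text{stage }\lambda} \frac{1}{Z}\,e^{-E(x,\lambda)/kT}\,dx
              \;=\; \frac{Z(\lambda)}{Z}, \\
  F(\lambda) &= -kT\ln Z(\lambda)
  \quad\Longrightarrow\quad
  p(\lambda) = \frac{e^{-F(\lambda)/kT}}{Z},
  \qquad
  \frac{p(2)}{p(1)} = e^{-[F(2)-F(1)]/kT} .
\end{align*}
```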

Page 33:

Equilibrium implies uniform probability over the stages λ

33

Condition for equilibrium: ∂F/∂λ = 0, so F(λ) = constant.

Hence the probability distribution over λ is p(λ) = constant, and in particular p(1) = p(2), since p(2)/p(1) = exp(-[F(2) - F(1)]/kT) = 1.

[Figure: expected versus actual time evolution over the phase space.]

Page 34:

One-Molecule Gas/Piston System

34

Overlap of subvolumes corresponding to stages h = 0.5H, h = 0.75H, h = H, h = 1.25H.

Slice through phase space.

Page 35:

Fluctuations Obliterate Reversible Detection

35

[Figures: what we expected versus what actually happens.]

Page 36:

What it takes to overcome fluctuations

36

Enforcing even a small probability gradient

p(2)/p(1) = exp(-[F(2) - F(1)]/kT) > exp(3) = 20

requires a disequilibrium

F(1) > F(2) + 3kT

which creates entropy:

S(2) - S(1) - (E(2) - E(1))/T = 3k

This exceeds the entropy k ln 2 = 0.69k tracked by Landauer’s Principle!

There is no problem for macroscopic reversible processes: a disequilibrium of F(1) - F(2) = 25kT, the mean thermal energy of just ten oxygen molecules, already gives p(2)/p(1) = exp(25) = 7.2 x 10^10.
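The arithmetic behind the numbers just quoted, as a quick check (not from the slides):

```python
import math

# Entropy tracked by Landauer's principle, in units of k.
k_ln2 = math.log(2)

print("exp(3)   =", round(math.exp(3), 1))      # ~20.1: gradient from a 3kT disequilibrium
print("3 vs ln2 =", 3, "vs", round(k_ln2, 3))   # 3k of created entropy dwarfs k ln 2 = 0.693k
print("exp(25)  =", f"{math.exp(25):.2e}")      # ~7.2e+10: gradient from a 25kT disequilibrium
```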

Page 37:

More Woes

37

Page 38:

Dissipationless Insertion of Partition?

38

With a conservative Hamiltonian, the partition will bounce back.

Arrest partition with a spring-loaded pin?

No friction-based device is allowed to secure the partition.

The pin will bounce back.

Feynman, ratchet and pawl.

Page 39:

In Sum… We are selectively ignoring fluctuations.

39

Dissipationless detection disrupted by fluctuations.

Reversible, isothermal expansion and contraction does not complete, due to the thermal motions of the piston.

Inserted partition bounces off the wall unless held by… what? Friction?? A spring-loaded pin??...

Need to demonstrate that each of these processes is admissible. None is primitive.

Inventory assembled inconsistently. It concentrates on fluctuations when convenient; it ignores them when not.

Page 40:

Conclusions

40

Page 41:

Why should we believe that…

41

…the second law obtains even statistically when we deal with tiny systems in which fluctuations dominate?

…the reason for the supposed failure of a Maxwell demon is localizable into some single information-theoretic process? (Detection? Erasure?)

Page 42:

Conclusions

42

Is a Maxwell demon possible?

The best analysis is the Smoluchowski fluctuation exorcism of 1912. It is not a proof but a plausibility argument against the demon.

Efforts to prove Landauer’s Principle have failed.

…even those that presume a form of the second law. It is still speculation and now looks dubious.

Thermodynamics of computation has incoherent foundations.

The standard inventory of processes admits composite processes that violate the second law and erase without dissipation.

It selectively considers and ignores fluctuation phenomena according to the result sought.

Its inventory of processes is assembled inconsistently.

Page 43:

43

http://www.pitt.edu/~jdnorton/lectures/Tsinghua/Tsinghua.html

Page 44:

44

Finis

Page 45:

45

Appendix

Page 46:

A dilemma for information theoretic exorcisms

46

Page 47:

EITHER the total system IS canonically thermal (sound horn), OR the total system is NOT canonically thermal (profound horn).

47

Earman and Norton, 1998, 1999, “Exorcist XIV…”

Total system = gas + demon + all surroundings.
Canonically thermal = obeys your favorite version of the second law.

Sound horn: Deduce the principles (Szilard’s, Landauer’s) from the second law by working backwards. The demon’s failure is then assured by our decision to consider only systems that it cannot breach.

Profound horn: “…the real reason Maxwell’s demon cannot violate the second law …uncovered only recently… energy requirements of computers.” (Bennett, 1987.) The principles would need independent justifications, which are not delivered. (…and cannot be? Zhang and Zhang pressure demon.)

Sound and profound: cannot have both!

Do information theoretic ideas reveal why the demon must fail?

Page 48:

1.

48

Page 49:

1. Thermalization

49

Initial data: L or R.

First, an irreversible expansion (“thermalization”). Entropy is created in this ill-advised, dissipative step!

Then a reversible isothermal compression passes heat kT ln 2 to the heat bath. The data are reset to L; entropy k ln 2 is created in the heat bath.

The proof shows only that an inefficiently designed erasure procedure creates entropy. No demonstration that all must.

Mustn’t we thermalize so the procedure works with arbitrary data? No demonstration that thermalization is the only way to make the procedure robust.
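The heat passed to the bath in the compression step, written out as a sketch with the one-molecule gas law P = kT/V used elsewhere in the talk:

```latex
% Reversible isothermal compression of the thermalized one-molecule memory
% from the full volume V back to the reset cell of volume V/2.
\[
  W_{\text{by gas}} \;=\; \int_{V}^{V/2} \frac{kT}{V'}\,dV' \;=\; -kT\ln 2 ,
  \qquad
  Q_{\text{to bath}} \;=\; kT\ln 2 ,
  \qquad
  \Delta S_{\text{bath}} \;=\; \frac{Q_{\text{to bath}}}{T} \;=\; k\ln 2 .
\]
```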

Page 50:

2.

50

Page 51:

2. Phase Volume Compression (aka “many to one argument”)

51

Boltzmann statistical mechanics: thermodynamic entropy = k ln (accessible phase volume).

“Random” data occupies twice the phase volume of reset data, so erasure halves the phase volume.

Erasure therefore reduces the entropy of the memory by k ln 2, and entropy k ln 2 must be created in the surroundings to conserve phase volume.
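The same bookkeeping in one line; V here stands for the phase volume of a single memory cell, a symbol not used on the slide.

```latex
% Phase-volume ("many to one") bookkeeping for erasing one bit.
\[
  S_{\text{random}} = k\ln(2V), \qquad S_{\text{reset}} = k\ln V
  \quad\Longrightarrow\quad
  \Delta S_{\text{memory}} = k\ln V - k\ln(2V) = -k\ln 2 .
\]
```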

Page 52:

2. Phase Volume Compression (aka “many to one argument”)

52

“Random” data DOES NOT occupy twice the phase volume of reset data. It occupies the same phase volume. The argument confuses random data with thermalized data.

FAILS

Page 53:

A Ruinous Sense of “Reversible”

53

Do random data and thermalized data have the same entropy because they are connected by a reversible, adiabatic process (insertion and removal of the partition)???

No. Under this sense of “reversible,” entropy ceases to be a state function.

[Figure: random data (S = 0) and thermalized data (S = k ln 2), connected by removal and insertion of the partition.]

Page 54:

3.

54

Page 55:

3. Information-theoretic Entropy “p ln p”

55

Information entropy: S_inf = -k Σ_i P_i ln P_i

“Random” data: P_L = P_R = 1/2, so S_inf = k ln 2.

Reset data: P_L = 1, P_R = 0, so S_inf = 0.

Hence erasure reduces the entropy of the memory by k ln 2, which must appear in the surroundings.

But… in this case, information entropy does NOT equal thermodynamic entropy.

Thermodynamic entropy is attached to a probability only in special cases. Not this one.
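The two values quoted above, written out from the definition (with the usual convention 0 ln 0 = 0):

```latex
% Information entropy of the two memory states.
\[
  S_{\text{inf}}^{\text{random}} = -k\left(\tfrac12\ln\tfrac12 + \tfrac12\ln\tfrac12\right) = k\ln 2 ,
  \qquad
  S_{\text{inf}}^{\text{reset}} = -k\left(1\cdot\ln 1 + 0\cdot\ln 0\right) = 0 .
\]
```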

Page 56:

What it takes…

56

Information entropy (“p ln p”) DOES equal thermodynamic entropy (Clausius dS = dQ_rev/T) IF:

the system is distributed canonically over its phase space,

p(x) = exp(-E(x)/kT) / Z, where Z normalizes,

AND

all regions of phase space of non-zero E(x) are accessible to the system over time.

For details of the proof and the importance of the accessibility condition, see Norton, “Eaters of the Lotus,” 2005.

Accessibility condition FAILS for “random data” since only half of phase space is accessible.
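A sketch of why the canonical condition matters: for a canonical distribution the “p ln p” quantity collapses to the thermodynamic combination (⟨E⟩ - F)/T, with F = -kT ln Z; the accessibility condition is what licenses reading this as the Clausius entropy (see Norton, “Eaters of the Lotus,” 2005, for the full argument).

```latex
% For a canonical p(x) = exp(-E(x)/kT)/Z, the Gibbs/information entropy
% reduces to the thermodynamic combination (<E> - F)/T.
\begin{align*}
  -k\!\int p(x)\,\ln p(x)\,dx
    &= -k\!\int p(x)\left[-\frac{E(x)}{kT} - \ln Z\right] dx \\
    &= \frac{\langle E \rangle}{T} + k\ln Z
     \;=\; \frac{\langle E \rangle - F}{T},
  \qquad F = -kT\ln Z .
\end{align*}
```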

Page 57:

4.

57

Page 58:

4. An Indirect Proof

58

As before, a one-molecule gas is coupled to a one-molecule memory: insert the partition; dissipationlessly detect the gas state; shift the memory cell to match; carry out an isothermal, reversible expansion of the gas, which reduces the entropy of the heat bath by k ln 2.

Then, as the final step, erase the memory dissipationlessly: detect the memory state and, if R, shift to L.

The final step is a dissipationless erasure built out of processes routinely admitted in this literature.

Net effect is a reduction of the entropy of the heat bath. The second law is violated, even in statistical form. (Earman and Norton, 1999, “no-erasure” demon.)

Fails

Page 59:

“…the same bit cannot be both the control and the target of a controlled operation…”

59

Every negative feedback control device acts on its own control bit. (Thermostat, regulator.)

The Most Beautiful Machine, 2003. Trunk, prosthesis, compressor, pneumatic cylinder; 13.4 x 35.4 x 35.2 in.

“…the observers are supposed to push the ON button. After a while the lid of the trunk opens, a hand comes out and turns off the machine. The trunk closes - that's it!”

http://www.kugelbahn.ch/sesam_e.htm

Page 60:

Marian Smoluchowski, 1912

The second law holds on average only over time. Machines that try to accumulate fluctuations are disrupted fatally by them.

The best known of many examples.

Trapdoor hinged so that fast molecules moving from left to right swing it open and pass, but not vice versa.

BUT

The trapdoor must be very light so a molecule can swing it open.

AND

The trapdoor has its own thermal energy of kT/2 per degree of freedom.

SO

The trapdoor will flap about wildly and let molecules pass in both directions.

60

Exorcism of Maxwell’s demon by fluctuations.

Page 61:

The standard inventory of processes

61

Page 62:

We may…

62

Exploit the fluctuations of a single molecule in a chamber at will.

Insert and remove a partition

Perform reversible, isothermal expansions and contractions

Inventory read from steps in Ladyman et al. proofs.

Page 63:

We may…

63

Detect the location of the molecule without dissipation. ??

Shift between equal entropy states without dissipation. ?

Trigger new processes according to the location detected.

[Figure: the one-molecule gas and the one-molecule memory, each with cells L and R.]