
REVIEWS

Ian Hacking, The Emergence of Probability (London & New York, Cambridge University Press, 1975). 209 pp. $15.95.

Ian Hacking's delightful new book is a tour de force of historical scholarship. It is a great deal more, too, and although we are alerted to the juxtaposition in ancient Indian literature of the concepts of gaming and sampling, and to Paracelsus' claim that the best herb against internal pricking is the thistle, and although Hacking has played with ancient dice in the Cairo Museum of Antiquities and examined the handwriting on the manuscript of the Port Royal Logic in the Bibliothèque Nationale, the point of these exercises is not merely historical and antiquarian, but rather to shed light on present-day conceptions of probability. Hacking takes us on a fascinating tour, not only among the 17th century giants of probability theory, such as Pascal, Huygens, Leibniz, and Bernoulli, but among their predecessors and descendants. Hacking, of course, has his own views about the interpretation of probability, and, inevitably and appropriately, this colors his treatment of historical figures. I, too, have my views about probability, and that will no doubt color my treatment of his treatment. But it is a tribute to Hacking's scholarship, as well as to his exposition, that it is easy to see what parts of his interpretations are arguable. This is a fortunate thing for me, for I am no historian, and could not take Hacking to task for quoting only from page 158, when there is a very clear passage that he ignores on page 247 which suggests the very opposite interpretation. That both author and critic can depend on the citations provided by the author is yet another indication of the quality of the small book in question.

1. That there is a dual aspect to probability is acknowledged by everybody. People use expressions in the probability family to communicate both facts about frequencies and warrants regarding beliefs. These two aspects Hacking calls the 'aleatory' and the 'epistemic'. It is true that in modern times many writers have suggested that we have here two distinct concepts. That the suggestion is correct is not so clear. Hacking puzzles over the fact that "the different kinds of probability are less well understood, and so less easily distinguished, than weight and inertial mass" (p. 13). And this is so, despite the fact that this duality has been a perennial element in the part of the history of probability with which we are concerned, recurring again and again. "Philosophers seem singularly unable to put asunder the aleatory and the epistemological side of probability. This suggests that we are in the grip of darker powers than are admitted into the positivist ontology" (p. 15).

Alternatively it might be construed as suggesting that we have here not two concepts, but one; and that the perennial efforts to put asunder these two aspects of probability have met with no more success than would meet the effort to sever the north from the south pole of a magnet - and for much the same reasons. Hacking straightforwardly puts his cards on the table: he claims that there is a space of possible theories of probability, and that this has remained more or less constant from 1660 on; the aleatory and the epistemological interpretations of probability are presumably to be found at the antipodes of this space. I shall put my cards on the table, too. It seems to me that the very evidence that Hacking cites indicates but a single concept of probability, toward which the 17th century authors more or less gropingly directed their attention. They were not successful in finding it, although it seems to me that they came pretty close, and it is this failure that has led analysts in the 19th and 20th centuries to suspect that there were two ideas, rather than one, at issue.

The thesis that the two concepts of probability have, since the 17th century, been there to be distinguished, had someone only been clear enough about his ideas, is not a thesis of the book. Hacking does find a variety of theories of probability, and within these theories finds two incommensurable aspects of probability, the aleatory and the epistemological. I shall argue that these aspects of probability are not incommensurable. The two concepts of probability that Hacking sees as emerging from the 17th century can be construed as the result of a twentieth century mistake; we might claim that what was attempting to emerge in the seventeenth century was a single, unified concept of probability, having both aleatory and epistemological aspects.

2. One of the most fascinating and informative parts of the book deals with the precursors of the 17th century writers on probability. Probability, up through the Renaissance, pertained to opinion, as distinct from knowledge. Knowledge is knowledge of universal, necessary truths. The method of knowledge is demonstration. Opinion concerns belief, and belief concerns the contingent and accidental. Knowledge and opinion are not on the same continuum, but have different objects and employ different methods. The method by which opinion is established is reflection or argument. Probability, for a long time, concerned the source of an opinion: that is, the authority, or book, or writer, who vouched for it. In somewhat the same vein, probability was applied to testimony. Probability is an attribute of opinion, and that which is probable is that which is supported by testimony and authority.

At the same time, Science, the proper object of knowledge, concerned universal and necessary generalizations, known by demonstration. There is no call for probability here. A demonstration that doesn't quite come off doesn't leave you with an opinion; it leaves you with a failed attempt at demonstration. What Hacking calls the high sciences (astronomy, mechanics, geometry) were at this time committed to the medieval standards; opinion, probable or otherwise, had no place in them. In addition to the higher sciences, however, there were the low sciences: medicine, alchemy, and the like, which could not plausibly aspire to demonstrative certainty. There opinion, and, in consequence, probability, played a large role. And there Hacking traces the gradual development of the epistemic aspect of probability. One finds in the low sciences a doctrine of signs, where a sign is some observable phenomenon that testifies to the presence of something unobservable. Thus sweat around the nose is a sign of death, mice a sign of coming plague, etc. These signs may be thought of as belonging to a language of nature, thus suggesting the evidence of the testimony of things, in the language of things, paralleling the evidence of the testimony of men, in human language. (The distinction between the evidence of testimony and the evidence of things, between external evidence and internal evidence, is to be found in the Port Royal Logic.)

But not all signs are to be trusted equally; Fracastoro (1546) says "Some signs are almost always, others are often to be trusted", and that these are the signs with probability. Hacking finds here "the old notion of probability as testimony conjoined with that of frequency" (p. 43) but he makes surprisingly little of it. A few pages later he records Hobbes (1640) as casually accepting the connection between natural signs and the frequency of their correctness. Now this is surely remarkable. We have been following out the conceptual precursors of epistemological probability that contributed to the foundations of the work done in the 17th century. There is no suggestion that any use of probability to characterize opinion, in this tradition, is quantitative: you may pit one authority against another, and determine your belief correspondingly, but there is nothing quantitative here, as the Jesuit probabilists made perfectly clear. When you turn to the examination of the language of nature, however, you suddenly do find a quantitative measure of the reliability of her signs: the frequency with which they turn out to be reliable. Furthermore, these frequencies satisfy what we now call the calculus of probability just as well as the possible outcomes on the roll of a die. But it is not surprising that there did not develop a mathematical probability here, for as Hacking points out (later and in another context) there is no easy and obvious way to combine the import of different signs. Nevertheless, we seem to be on the verge of a quantitative notion of epistemic probability.

3. Hacking drops this strand of the background of probability here, and turns to the development of the probability calculus. The kind of puzzle that initiated the development of this mathematics is the following: Two men are playing at a game of chance. The winner is the first person to obtain a score of N. A has m points and B has n points when the game is interrupted. In what proportion is it fair to divide the prize of the game?
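The solution that eventually emerged - division of the stakes in proportion to each player's chance of winning had play continued - is easy to state computationally. The following is a minimal sketch for a game of pure chance with evenly matched players; the function names and the 50-50 assumption for each round are illustrative choices, not anything stated in the review.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def p_a_wins(a: int, b: int) -> float:
    """Chance that player A reaches the target first, given that A still
    needs `a` points and B needs `b`, each round being a fair 50-50."""
    if a == 0:
        return 1.0          # A already has enough points
    if b == 0:
        return 0.0          # B already has enough points
    return 0.5 * p_a_wins(a - 1, b) + 0.5 * p_a_wins(a, b - 1)

def fair_division(N: int, m: int, n: int, prize: float = 1.0):
    """Divide the prize of an interrupted first-to-N game, in which A has
    m points and B has n, in proportion to each player's chance of winning."""
    p = p_a_wins(N - m, N - n)
    return p * prize, (1 - p) * prize

# A first-to-3 game interrupted at 2-1 gives the familiar 3:1 split.
print(fair_division(3, 2, 1))
```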

It was an interrupted game problem of just this sort, posed by the Chevalier de Méré, that got Pascal interested in probability computations. In due course, as a result, Huygens, Fermat, and Leibniz all became involved with probability. Some of these games were games with dice and cards. Galileo, who solved some of these problems concerning relative chances, is alleged to have been concerned with relative frequencies. Galileo, it is claimed, "... is explicitly concerned with the relative frequencies of different outcomes" (p. 53). But he solves the problems simply by counting cases. If we suppose that counting cases is appropriate because it is known that each case occurs with equal frequency, that premise is not, so far as I can tell, stated in the argument. Nevertheless, it may be reasonable to take it as implicit; Cardano, in 1550, quoted by Hacking, mentions explicitly the possibility of dishonest dice, and that the wagers should reflect this bias. Where there is a clear notion of dishonest dice, there must be a clear notion of honest ones - that is, dice that fall on each side with the same frequency. Galileo notes that some numbers are "more easily and frequently made than others" (p. 52), but the context suggests that he may only be counting cases. We shouldn't get carried away with Galileo's language: he said 'frequency', but he might have meant 'probability'; there is little uniformity of diction now, and there was less in the 17th century. But even if he is talking of frequency in this context, we are hardly warranted in attributing a frequency theory of probability to him because he talks of the frequency with which a die lands in a certain way!
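What 'counting cases' amounts to can be seen in the three-dice problem usually attributed to Galileo (the review does not say which problem he solved, so this is only an illustration): the sums 9 and 10 can each be written as six unordered partitions, yet 10 is 'more easily and frequently made' because it arises from more of the equally easy ordered cases.

```python
from itertools import product

# Enumerate all 6^3 = 216 equally easy cases for three dice and count
# how many of them yield each of the two contested totals.
cases = list(product(range(1, 7), repeat=3))
for total in (9, 10):
    hits = sum(1 for c in cases if sum(c) == total)
    print(total, hits, hits / len(cases))   # 9 -> 25/216, 10 -> 27/216
```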


Where does all this get us with our problem of the unfinished game? Hacking sees it as part of a developing understanding of aleatory probability, and even sees Cardano as adopting a propensity view of chance (p. 56). But if A and B are interrupted in their game of dice, long run frequencies or propensities just don't seem relevant to the question of how to divide the prize. They are concerned with that one game, not any frequency in future games. And this is even more plain in the specific case that Hacking cites from Pacioli (1494) (p. 50), which concerns teams playing ball - hardly the same sort of thing as a dice game. If any sort of probability is relevant to the decision as to how to divide the prize in the ball game, it is epistemic probability. What could be more natural, even in the framework of Renaissance medicine and alchemy, than to see the score at the time of interruption as a sign indicating the winner, though only with a certain degree of probability? And is it not almost equally natural, apart from anachronistic preconceptions, to see the interrupted dice game in the same light? Should we not regard the score as a sign of the unknown outcome of the game, which, did we but know it, or have the time to reveal it by continuing the play, would settle the distribution of the stakes unequivocally? But these signs are signs with probability, and, like signs with probability in medicine, their epistemological strength is measured by frequencies. I agree with Hacking that the derivation that solves the problem (when it exists: clearly no such derivation can solve a problem involving skill, such as a ball game) is a derivation involving frequencies. It is a derivation of frequencies of winning under those circumstances from assumed frequencies for the various 'cases', and this derived frequency is the frequency characterizing the epistemic reliability of the sign.

4. Hacking is fascinated by Pascal's wager. He finds three arguments in Pascal's discussion, and finds all of them perfectly valid. I found Hacking's interpretation both persuasive and enlightening. Before reading Hacking's account, I confess I sided with Voltaire and found the argument puerile. But the intrinsic interest of the argument is not theological, as Hacking points out. It is an argument from expectation. Since expectation provides an important connection between decisions and chances, it is worth quoting Hacking's gloss at length.

We would express the argument in terms of some idea of subjective or personal probability, saying, for example, that no matter how slender our degree of belief in the existence of God, it is not 0. Pascal does not speak of a quantitative measure of degree of belief. He is saying that we are in the same epistemological position as someone who is gambling about a coin whose aleatory properties are unknown. His judgment relies on an alleged isomorphism between the structure of a decision problem when objective physical chances are known to exist, and a decision problem in which there are no objective physical chances. (p. 70)

There are a number of curious things about this passage. Heretofore, Hacking has attempted to interpret the writings he cites in the framework of the ideas that were there at the time. Here he says how 'we' would express 'the argument'. Thus expressed, would it be Pascal's argument? We have no reason to think so. The probability that God exists is, no doubt, an epistemic probability, related (historically) to the probabilities provided by testimony and evidence. But there was surely, at this time, no suggestion that these probabilities, vague as they might be, were simply 'subjective' or 'personal', without any sort of objective force. Surely Pascal is not talking about the slenderness of our belief, but about the weight of the evidence for God's existence. Be as skeptical as you please, he is saying, the weight of the evidence, internal and external, is not 0. He does not speak of a quantitative measure: again, the problem of combining the evidence of a number of signs precludes quantitative measures, even when we have good frequencies indicating the reliability of individual signs.

Pascal is saying just what Hacking takes him to be saying in the next sentence; but again in the last sentence we find ourselves indulging in two hundred years of time travel. 'Objective physical chances' represents a notion that, I suspect, only came into being with quantum mechanics. The contrast between objective physical chances and mere irregularity in a series is one that would be unintelligible to the 17th century. What, furthermore, do objective physical chances have to do with our decision problem? Suppose what were at issue were a game of heads or tails "at the other end of an infinite distance", as Pascal puts it (p. 66). What counts for our decision is not the aleatory properties of the coin, but our epistemological position. And as Hacking appears to agree, we are in the same epistemological position with respect to the existence of God. This is not a matter of an 'alleged isomorphism'; it is a matter of identity.

The argument, as expounded by Hacking, seems perfectly clear and intelligible, and to reveal unequivocally the roots in opinion and in gambling which Hacking has been so competently uncovering. Hacking's gloss, on the other hand, seems to obscure this perfectly clear 17th century argument by overlaying it with 20th century confusions for the sake of showing that a contemporary alleged distinction was already emerging in the 17th century.

5. Hacking finds the "first occasion on which some probabilistic expression with epistemic overtones was systematically used to denote something measurable" to be in the last pages of the Port Royal Logic, and he regards this as the first emergence of a significant concept of epistemic probability (p. 73). In this, obviously, I disagree. I think that we have quantitative epistemic probability clearly present in the discussion of the unfinished games; I think we have found it even in the medical writings cited by Hacking himself. We surely have it in the use of expectations in Pascal's wager, even on Hacking's gloss. We have found it both in the epistemic background of the development of probability, and in the aleatory background. It may well be, as Hacking remarks, that it is in the Logic that we find the first occasion where probability, so called, is measured; though even here we must except such imprecise associations of epistemic probability with frequency as that of Fracastoro, also cited by Hacking (p. 43).

Leibniz, of course, did write explicitly about probability, and explicitly took numerical probability as an epistemic notion. "So he takes the doctrine of chance not to be about physical characteristics of gambling set-ups but about our knowledge of those set-ups" (p. 89). He went to Paris and found, courtesy of Huygens, just the mathematics needed. Hacking writes "... when we look at Huygens, we shall find his book is entirely about games of chances and has few epistemic overtones. The word 'probability' does not occur. Leibniz, ... could call it 'an elegant example of reasoning about degrees of probability'" (p. 90). Why not say that Huygens was performing computations on frequencies? Why not say that Leibniz saw that those computations provided the mathematics for probability? From what Hacking says, neither would object.

In his illuminating discussion of equipossibility, Hacking points out that 'possibility' was subject to the same dual interpretation as probability. One thinks of 'possibility' as a matter of what the world might allow; and one also thinks of 'possibility' as reflecting what we know of the world. In this connection, Hacking quotes Leibniz to the effect that "our judgment of probability 'in the mind' is proportional to what we believe to be the facility or propensity in things". This is clearly epistemic. Hacking writes: "Leibniz was probably confused and he almost certainly vacillated in his conception of probability." My own view of the matter is that there is neither confusion nor vacillation here, but that Leibniz had a relatively clear and precise - and indeed correct - notion of probability. Probability represents precisely the epistemological counterpart of what we know or believe about frequencies or propensities. (Note that it is mainly in the self-conscious 20th century that people, out of obvious desperation, start trying to identify probabilities and frequencies.)

6. The critical link between frequencies (or propensities) and decisions is provided by the notion of expectation. Before we examine what Hacking has to say about this link, let us observe that the link between epistemic probability and decision has already appeared. It appeared in the interrupted game problem - each player receives his expected share of the prize - that is, the product of the prize and his (epistemic) probability of winning it. This applies to the ball game as well as to dicing. It appeared in Pascal's wager, whether or not we reject Hacking's aleatoric gloss. I would conjecture it could be found in medieval and renaissance medical writings.

Huygens wanted to compute the fair price for a gamble - its expectation. He takes for granted the fair price of a lottery ticket: z/n, when the prize is z and the number of tickets is n. His justification for taking the expected value of a gamble as the feature that should enter into our decisions lies in showing the equivalence of an arbitrary gamble with the appropriate lottery. This is a perfectly sound justification of the use of expectation - as opposed to the dreadful infinitely-long-run argument that Hacking mentions with appropriate disdain (p. 94). It is also an argument which is perfectly neutral between long run frequencies and epistemic probabilities, though it seems a bit more intelligible construed in the latter way. Expectation, as a guide to decision, is a guide applicable to particular cases. Long run frequencies, or proclivities, don't have any obvious bearing on particular cases, even when multiplied by utilities.

It is interesting that Hacking, discussing Leibniz, said that Huygens' book had "few epistemic overtones". Discussing Huygens, he says that "Huygens is to some extent neutral between aleatory and epistemic approaches to probability, although he leans to the former". And finally he mentions that Huygens argues that you should pay five coins for the privilege of choosing a hand when I hold three coins in one hand and seven in the other. "But Huygens is simply untroubled by such modern sophistication" as would demand that the coins get into my hands by a random mechanism. (p. 97)


Poor Huygens! But it seems to me that Huygens is perfectly clear and perfectly intelligible in what he says; and that, when it comes right down to it, there is more than an 'overtone' of epistemic concern here.

At the same time that Pascal, Huygens, Fermat, Leibniz, and the other mathematical greats of the 17th century were developing the probability calculus and combinatorial mathematics, demographers in England and on the continent were beginning to accumulate death and birth records. This was valuable for a number of reasons, not the least of which was providing a reasonable basis for the value of annuities. Hacking takes the use of this data (since it is a matter of frequencies) to reflect an aleatory notion of probability rather than an epistemic one. But this is a confusion. Surely there is no way of construing a death from smallpox as a matter of 'objective physical chance' - the notion he saddled Pascal with in order to give his wager an aleatory basis. Nor, given the frequency of death at given ages, is there any way to use this value to compute the worth of an annuity "on each occasion of use" - as Hacking demands in a later chapter. (It does not apply, for example, to a woman in childbirth.) Death is surely no more a matter of chance than the hand I choose to hold seven coins in.

7. The discrepancy between the view that is suggested to me by the history that Hacking cites and the view that he is attempting to uphold comes out most clearly in the chapter on Bernoulli. Bernoulli writes, as quoted by Hacking: "Probability is degree of certainty and differs from absolute certainty as the part differs from the whole" (p. 145). But: "There is no need to foist a single probability idea on Bernoulli. Indeed it is in consequence of his work that the distinction between aleatory and epistemic concepts of probability became more important." We shall see.

The point at issue concerns the interpretation of Bernoulli's famous limit theorem. Hacking is perfectly accurate in his initial statement of the problem:

Bernoulli plainly wants to estimate an unknown parameter p. His favorite example is the proportion of white pebbles in an urn. An estimator is a function F from data to possible parameter values... Bernoulli uses an interval estimator which maps given data onto a set of possible values of p, 'bounded by two limits'. (p. 156)

Let s_n be the outcome of n trials. F(s_n) is thus the value of the estimator F at the data point s_n. Hacking asks, "when is F a good estimator?" He answers that two desiderata present themselves: F should usually give the right answer; and F should be credible on each occasion of use. Hacking argues that the prospects of finding such an estimator are bleak. It is not hard to see why: Hacking says that "The best analyses of what makes for 'credibility on each occasion of use' yield desiderata that are incompatible with the desideratum of being usually right." This is a fancy way of putting it. If F* is an estimator with the property of being usually right, and its value for the observation r is the interval I, then in the special case that I know that the value of p is not in I, the best value of the credibility is 0; on this occasion of use F* is not credible.

But it is patently absurd to demand that both these desiderata be satisfied. What we could reasonably demand is that on any occasion of use, the estimator we use on that occasion be credible, thus allowing that on different occasions we may find different estimators appropriate. There is no reason whatever, if we can satisfy this desideratum, to seek some estimator we can always use and which will usually be right.

Let us consider a Bernoullian argument in somewhat more detail. Suppose we have the classical urn of pebbles, of which an unknown fraction p are white. Corresponding to Hacking's formula (2) (p. 158), we have: the frequency of samples of 10,000 in which a ratio r occurs within limits of the form p ± k√(p(1-p)/n) (with n = 10,000 and k the constant appropriate to the 0.999 level) is at least 0.999. Since p lies between 0 and 1, this means that r lies between p - 0.016 and p + 0.016 with frequency of at least 0.999. This, of course, is not Bernoulli's (or anybody else's) practical certainty - it is a straightforward frequency. It is still a frequency when we express the event in question as that of the interval (r - 0.016, r + 0.016) covering the parameter p.

We still haven't got Hacking's formula (3), p. 158, and in fact we haven't got any probability formula at all. We have a straight-out frequency statement: the frequency with which the interval (r - 0.016, r + 0.016) will cover p will be at least 0.999.
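A small simulation makes the frequency reading concrete. The sample size and the 0.016 half-width come from the text; the particular p, the number of trials, and the function name are illustrative choices of mine.

```python
import random

def coverage(p: float, n: int = 10_000, trials: int = 2_000,
             half_width: float = 0.016) -> float:
    """Fraction of samples of size n whose observed ratio r yields an
    interval (r - half_width, r + half_width) covering the true p."""
    covered = 0
    for _ in range(trials):
        r = sum(random.random() < p for _ in range(n)) / n
        covered += (r - half_width < p < r + half_width)
    return covered / trials

# For p = 0.3 the observed coverage comes out at roughly 0.999 or better:
# the straight-out frequency statement, read off a simulation.
print(coverage(0.3))
```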

Now we draw a particular sample, exhibiting a frequency r* of white pebbles. And now, after we have drawn the sample, we consider cases. Case I: We know the value of p. The epistemic probability - the credibility - that r* - 0.016 ≤ p ≤ r* + 0.016 is 0 or 1, according as we know that it is true or false. (We know one or the other, since by assumption we know both r* and p.) Case II: This is presumably the case about which Bernoulli is talking, in which we know nothing about the origin of the urn, nothing about its contents other than what is revealed by our sample. In that case, relative to that body of knowledge, the epistemic probability that r* - 0.016 ≤ p ≤ r* + 0.016 is at least 0.999. Since this is Bernoulli's favorite number for practical certainty, this allows us to say that, relative to what we know, we can be practically certain that r* - 0.016 ≤ p ≤ r* + 0.016. Case III: We do know something about the origin of the urn and its pebbles, or we have at any rate some prior knowledge about it; then we can use this prior knowledge to give us a prior distribution for p, from which, by Bayes' theorem, we can compute a posterior distribution for p.
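A sketch of what Case III comes to, on a discrete grid of candidate proportions; the triangular prior and the scaled-down sample of 100 draws are hypothetical choices of mine, not anything in Bernoulli or in the review.

```python
from math import comb

def posterior_on_grid(prior, k, n, grid):
    """Bayes' theorem on a grid: posterior(p) is proportional to
    prior(p) * C(n, k) * p^k * (1 - p)^(n - k)."""
    unnorm = [prior(p) * comb(n, k) * p**k * (1 - p)**(n - k) for p in grid]
    total = sum(unnorm)
    return [u / total for u in unnorm]

def prior(p):
    # Prior knowledge taken to favour middling proportions (illustrative).
    return 1 - abs(2 * p - 1)

grid = [i / 100 for i in range(1, 100)]
post = posterior_on_grid(prior, k=31, n=100, grid=grid)
mode = grid[max(range(len(post)), key=post.__getitem__)]
print(mode)   # lands near the observed ratio 0.31, nudged slightly by the prior
```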

The requirement that Bernoulli's estimator "will be credible on every occasion of use" is a red herring. All that is required for Bernoulli's program to go through on this occasion is that the estimator be credible on this occasion of use. What we require is moral certainty, that is, high epistemic probability, which obviously is relative to my entire body of knowledge and not just the description of the sample drawn (much less, that description reduced to a single real number). Furthermore, this treatment allows us to take Bernoulli seriously when he says, in the passage first quoted, that probability is degree of certainty, that is, an epistemic concept. There are objections to this way of looking at the matter (which I take to be Bernoulli's), but I find them uncompelling.

The confusion in this problem no doubt stems from the proclivity of some authors - perhaps Bernoulli among them - to construe the problem as one of measuring the probability that a certain probability is in a certain interval. The second 'probability' in this context cannot be directly construed as epistemic. And in the precise statement of the problem - even the one given by Hacking - it isn't even called a probability. It is called the proportion of white balls in the urn. The confusion arises because if we know the proportion of white balls in the urn, then the probability that a ball drawn at random will be white would be exactly that fraction. Knowing the proportion, we would use the indefinite article and say (epistemically): the probability that a ball drawn from that urn will be white is p. Not knowing the proportion, as in the case at hand, we might still use the same locution: the probability that a ball drawn from that urn is white is p, but I don't know what p is. Strictly speaking, if we take probability, as I am suggesting, as purely epistemic, this is nonsense. But it is easy enough to see what is intended: namely, if I knew what the proportion was, then I would take that to be the probability that a ball drawn from the urn is white. Since I don't know the proportion, one might go on, there is no number that I take to be my epistemic probability.


8. Leibniz, Pascal, Huygens, and Bernoulli all seem to have had the same idea of probability in mind. It was, as Hacking says, a dual notion, involving both epistemic and aleatory elements. But I think the confusions that Hacking finds in their writings are in good part confusions engendered by the rather desperate analyses of the 20th century. Leibniz seems to have been most explicit about probability. As Hacking writes: "Leibniz had learned from the law that probability is a relation between hypotheses and evidence" (p. 139). That is, Leibniz took probability as epistemic. "But he learned from the doctrine of chances that probabilities are ... physical propensities" (p. 139). For this, I see no evidence whatever. Of course, in the case of gaming devices, one bases one's beliefs on frequencies. Similarly in demography, where it is hard to know how the 'stable frequencies' - which change continually - are to be interpreted. Similarly in sampling theory, where the 'stable frequencies' have nothing to do with physical propensities at all, but are simply the proportions in fixed classes. Similarly in the old theory of signs, where the probability of the prognosis of death will reflect the frequency (roughly) with which death has followed upon those signs. Hacking continues: "On the one hand we have degrees of makeability in re, which we may gloss as tendencies to produce stable frequencies" (p. 139). Better, we may gloss these degrees as the stable frequencies themselves, in some cases, and as proportions in others.

These are the basis of probabilities in mente ... For example if r asserts only that in some chance set-up the objective tendency is to produce outcome E on repeated trials with stable relative frequency f, then the probability of the hypothesis that E occurs on the next trial, relative to this data r, is surely f. Leibniz appears to be inclined to say that this local piece of reasoning has general application. (p. 139)

Me too. And so I find the space of possible probability theories bequeathed us by the 17th century much narrower than Hacking does. The evidence marshalled in Hacking's book is interesting enough in its own right: the book can be recommended to anyone interested in History, or Philosophy, or Probability, or Gambling, or Theology. It contributes to the understanding of all of these topics and more. The evidence is furthermore strong evidence for the duality of probability. There is no argument about that. But whether it should be interpreted as evidence that there is a single still emerging notion of probability possessed of both these aspects, or whether it should be interpreted as evidence that (at least) two distinct notions of probability had already begun to emerge in the 17th century seems to me arguable. I have tried to suggest some of the arguments which might lead one to put the former interpretation on Hacking's data rather than the latter.

Department of Philosophy University of Rochester

HENRY E. KYBURG, Jr.



Ian Hacking, Why Does Language Matter to Philosophy? (Cambridge University Press, 1975). VIII + 200 pp.

"Why should the study of language matter more to philosophy than to, say, zoology?" (p. 4) asks Professor Hacking, and we are anxious to hear his answer. For the question is as fascinating as it is weighty: it lurks behind nearly everything that goes on in analytical philosophy, yet still awaits a convincing reply. Hacking, moreover, is as qualified as any to provide one; a capable and original analytic philosopher himself, who has made, his mark in dealing with issues equally difficult.

Unfortunately, he hardly speaks to the question raised in the title. What he does, instead, is try to answer another one: how and when did language thus matter? Yet even here, what he offers is a rather capricious selection of 'case studies', loaded in advance to support his own prejudice; a potpourri of sometimes interesting tid-bits of history, which by themselves do not even fit into the grand design of history he hopes to establish by their means.

The book is divided into three parts: 'The heyday of ideas' (Hobbes, Port Royal, Berkeley), 'The heyday of meanings' (Chomsky, Russell, Wittgenstein, Ayer, Malcolm), and 'The heyday of sentences' (Feyerabend, Davidson). Hacking's theory, which is supposed to justify this rather weird classification, is the following. Philosophers in the age of ideas were primarily concerned with the 'mental discourse' of the Cartesian ego conducted in ideas, not words, so to them language did not matter after all. Accordingly they did not have a theory of meaning as we understand it now. The second period is marked by an increasing concern for public language, a carrier of intersubjective 'meaning', somehow or other tied to the physical world. Finally, in the third period, the realm of sentences emerges as an autonomous 'third world', a detached and impersonal depository of culture.

There is something rather Hegelian in this view, and, as in many things Hegelian, something fascinating and suggestive. Unfortunately, both for Hegel and Hacking, fascination is not a mark of truth, and suggestion is no proof.

Hacking argues that what the British Empiricists say about the signification of words does not quite fit into Alston's slots for theories of meaning (pp. 18 ff.). The worse for Alston, one would like to say. For that matter, does Grice's or Katz's theory fit? What Hacking concludes is that, for instance, "Locke did not have a theory of meaning" (p. 52). It is amusing to notice that Alston himself (in the very work Hacking quotes, Philosophy of Language, 1964) takes Locke as the paradigm of the ideational theory of meaning. What does Hacking suggest? That Alston did not understand Locke - or his own theory?

On the other end of his schema, what makes Hacking so sure that meanings are out and sentences are in; what gift of prophecy enables him to say that Feyerabend and Davidson, rather than Grice, Putnam and the generative semanticists, are riding the Wave of the Future? Meanings die hard, even in Hacking's own mouth: "Davidson resuscitates meaning by administering the kiss of death" (p. 179); "With work like this [Grice's] in the offing it would be foolish to go on arguing from examples that meaning is dead" (p. 181). Yet that is exactly what he does on the remaining few pages: plot the march of history from 'idea-lism' (via 'meaning-itis', if I may add my own joke) to 'lingua-lism' - from Berkeley to Marshall McLuhan (pp. 182-186).

The question Hacking asks in the title, which he does not answer, and the question he actually answers in a dogmatic fashion, evoke yet another: why, and for whom, did he write this book? Who are the 'students' who, as the dustjacket says, will be provided "with a stimulating broad survey of problems in the theory of meaning and the development of philosophy, particularly in this century"? Surely not the beginners, since he discusses about a dozen figures from Hobbes to Davidson in less than 200 small pages, but omits such admittedly crucial figures as Austin, Quine and Strawson, because "the philosophical magazines are full of excellent argument, pro and con, concerning these great systems of our time" (p. 9). And certainly not the experts, who need not be told what a valid argument is (p. 82), and who will not be amused by the rather cavalier, if not folksy, explanations of certain key notions he discusses in the book. Consider, for instance, the following description of the rationalist view of innate ideas: "Infants are born with a tendency to pick out mother [sic!], hunger, colors, triangles, and even shapes as peculiar as combs, each at definite stages of maturation.... The rationalist suggests ... a natural faculty to sort things into mothers and triangles..." (p. 60). Who among the rationalists would ever dream of putting mother on the pedestal reserved for the triangle and like (clear and) distinguished things? Nor will the experts be pleased by such obvious errors in history as the one he commits concerning Berkeley. "[Berkeley] acknowledges plenty of ideas of which we can form no images, God and the will, for example" (p. 40). Now what does Berkeley say? "So far as I can see, the words will, soul, spirit, do not stand for different ideas, or, in truth, for any idea at all, but for something which is very different from ideas, and which, being an agent, cannot be like unto, or represented by, any idea whatsoever" (Principles of Human Knowledge, sec. 27). It seems to me that the book is written for 'the curious layman'. If so, it further seems to me that even he is being ill served.

Why, indeed, does language matter, in a peculiar and unique way, to philosophy? Not merely because, as someone said, "I gotta use words if I am to talk to you", for this reason applies even in the sciences, as their professors are beginning to find out in dealing with the growing mass of functional mutes in their clientele. The only place where Hacking comes near to a solution is at the very beginning of the book. There he quotes Francis Bacon:

"Although we think we govern our words, ...certain it is that words, as a

Tartar's bow, do shoot back upon the understanding of the wisest, and

mightily entangle and pervert the judgement" (p. 5), and Locke: "But we see

that though it be proper to say, There is one matter of all bodies,, one cannot

say, There is one body of all matters: we familiarly say one body is bigger than another; but it sounds harsh (and I think never used) to say' one matter

is bigger than another" (p. 6). But then he goes on to dismiss these points as

"minor ways"... Philosophy, unlike the sciences, has no observations to find facts, no

experiments to locate errors. Yet it offers proofs and corrects errors on an

intersubjective level. What it talks about are certain notions, concepts and

ideas forming the very matrix of human thought. And one way, perhaps the

most important one, in which these notions, concepts and ideas are publicly

accessible is via the restrictions imposed upon our discourse on any subject

whatever: what is proper to say . . . . what we cannot say .... what sounds harsh,

...what is never used... Not to speak, of course, of the tendency of words to "shoot back upon the understanding ...and pervert the judgement". That is why language matters to philosophy: it did matter to Aristotle, Locke and

Wittgenstein, and always will. This is but a hint, albeit a very familiar one; a suggestion that calls for

illustration, discussion and qualification. Hacking's book, unfortunately,

brushes it aside without a second look.

Department of Philosophy University of California, San Diego

ZENO VENDLER



James Martin, Principles of Data-Base Management (Prentice-Hall, 1976), 340 pp., price: $25.00.

It isn't often that a flowchart outlining the natural successions in which an author's books might be read is provided on the inside panels of his works. But far more striking is the fact that in at least one case such a flowchart is necessary. Due to the recent proliferation of James Martin's books on Computer Systems, Telecommunications and Data-Base Management and Organization, it is only with the flowchart that the reader can take full advantage of Martin's expertise in these fields.

The book under review is one of the more modest examples of the fruits of Martin's labor. While neither particularly original nor detailed (this does not apply to Martin's other works), it fills a decided need as an interdisciplinary sourcebook for those in Management, Economics, Information and Computer Sciences, etc. who fall victim to the widespread misunderstandings concerning Data-Base technology, utility and implications. It also makes an attempt to clarify some muddles concerning the principles, such as they are, of the amorphous collages known as Management Information Systems. In general, it is to the credit of Martin's book that he maintained a broad rather than penetrating spirit throughout, for in this way it avoided the tangles of detail which beset the more standard and detailed contributions on the subject from other authors.

Principles of Data-Base Management divides into 4 parts. The first, entitled 'Why Data-Base?', is a more than adequate account of the advantages of using data within a Data-Base environment. Such standard examples as reductions in data-redundancy and retrieval inconsistencies, greater shared-use potential, ease of maintenance of Data-Base use standards, security and data-integrity are all covered in ample detail. Part 2 covers 'Data Organization', and gives an excellent introductory account of how the information in a Data-Base environment is logically organized. The reader should be forewarned, however, that in the chapter dealing with Relational Data-Base modelling (e.g. Ch. 9), only Codd's First Normal Form is covered. I think that this is a serious shortcoming, for First (and Second, for that matter) Normal Form derives its importance from the fact that it is an intermediate construction from non-flat files to Third Normal Form (see the brief sketch following this paragraph). If space was important, discussion of the first 2 normal forms should have been jettisoned in favor of Third Normal Form, which is the only normal form reasonably free of anomalies and hence having any real-world application. If page count was not a problem, discussions of at least all of the first 3 normal forms should have been included. In any event, omission of the more important Third Normal Form seems to be poorly motivated. Part 3 deals with the types of software currently available which implement the models introduced in Part 2. The fourth part, of perhaps the greatest interest to persons in Management, outlines types of management considerations which ultimately have to be brought to bear on Data-Base systems. One of the strong points of this section is the repeated contrasts between Data-Base systems and Management Information Systems, terms assumed to be synonymous by the uninitiated.
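For readers to whom the normal forms are unfamiliar, here is a brief sketch (my illustration, not Martin's) of the point at issue: a flat, First-Normal-Form relation that carries a transitive dependency repeats facts and invites update anomalies, and Third Normal Form removes them by decomposition.

```python
# A 1NF (flat) relation with a transitive dependency part -> supplier -> city:
# the supplier's city is repeated with every part, so changing it means
# touching many rows and risking inconsistency.
orders_1nf = [
    {"part": "bolt",   "supplier": "Acme", "supplier_city": "Leeds"},
    {"part": "nut",    "supplier": "Acme", "supplier_city": "Leeds"},
    {"part": "washer", "supplier": "Bray", "supplier_city": "Cork"},
]

# The 3NF decomposition stores each fact exactly once.
orders_3nf = [
    {"part": "bolt",   "supplier": "Acme"},
    {"part": "nut",    "supplier": "Acme"},
    {"part": "washer", "supplier": "Bray"},
]
supplier_city = {"Acme": "Leeds", "Bray": "Cork"}

# Updating a supplier's city is now a single, anomaly-free change.
supplier_city["Acme"] = "York"
```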

Though reasonably free from the usual first printing typographical errors, this book does contain some rather bothersome mistakes in referencing, probably because so much of the first three chapters is borrowed from Martin's earlier book, Computer Data-Base Organization.

For example, a non-existent 'Fig. 7.8' is referred to on p. 183, and an incorrect reference is made to Fig. 4.4 on p. 188. However, none of these problems looms large enough to discourage the patient reader. As mentioned before, this work is intended for those in the above-mentioned fields who want to develop a feeling for Data-Base management and technology without saturating themselves with esoterica. This book is clearly not intended for specialists, but rather for those who want to have some idea of what Data-Base systems are all about. Unlike so many other sourcebooks of an introductory and interdisciplinary character, this book manages to avoid confusing the arcane with the obvious, mistaking detail for accuracy, and in general missing the audience for which it was intended.

University of Nebraska - Lincoln

H.L. BERGHEL