
MATH20302 Propositional Logic

Mike Prest
School of Mathematics
Alan Turing Building
Room 1.120
[email protected]

April 10, 2015

Page 2: MATH20302 Propositional Logic - School of Mathematicsmprest/MATH20302.pdf · MATH20302 Propositional Logic Mike Prest School of Mathematics Alan Turing Building ... tion (s_t), of

Contents

I Propositional Logic

1 Propositional languages
  1.1 Propositional terms
  1.2 Valuations
  1.3 Beth trees
  1.4 Normal forms
  1.5 Adequate sets of connectives
  1.6 Interpolation

2 Deductive systems
  2.1 A Hilbert-style system for propositional logic
    2.1.1 Soundness
    2.1.2 Completeness
  2.2 A natural deduction system for propositional logic

II Predicate Logic

3 A brief introduction to predicate logic: languages and structures
  3.1 Predicate languages
  3.2 The basic language
  3.3 Enriching the language
  3.4 L-structures
  3.5 Some basic examples
  3.6 Definable Sets

If you come across any typos or errors, here or in the examples/solutions, please let me know of them.


Introduction: The domain of logic

By logic I mean either propositional logic (the logic of combining statements) or first-order predicate logic (a logic which can be used for constructing statements). This course is mostly about the former; we will, however, spend some time on predicate logic in the later part of the course. In any case, propositional logic is a part of predicate logic so we must begin with it. Predicate Logic is dealt with thoroughly in the 3rd/4th-year course by that title; other natural follow-on courses from this one are Model Theory and Non-Standard Logics.

Propositional logic can be seen as expressing the most basic “laws of thought” which are used not just in mathematics but also in everyday discourse. Predicate logic, which can also be thought of as “the logic of quantifiers”, is strong enough to express essentially all formal mathematical argument.

Most of the examples that we will use are taken from mathematics but we do use natural language examples to illustrate some of the basic ideas. The natural language examples will be rather “bare”, reflecting the fact that these formal languages can capture only a small part of the meanings and nuances of ordinary language. There are logics which capture more of natural language (modality, uncertainty, etc.) though these have had little impact within mathematics itself (as opposed to within philosophy and computer science), because predicate logic is already enough for expressing the results of mathematical thinking.[1]

[1] One should be clear on the distinction between the formal expression of mathematics (which is as precise and as formal as one wishes it to be) and the process of mathematical thinking and informal communication of mathematics (which uses mental imagery and all the usual devices of human communication).


Part I

Propositional Logic


Chapter 1

Propositional languages

1.1 Propositional terms

Propositional logic is the logic of combining already formed statements. It begins with careful and completely unambiguous descriptions of how to use the “propositional connectives” which are “and”, “or”, “not”, “implies”. But first we should be clear on what is meant by a “statement” (the words “assertion” and “proposition” will be used interchangeably with “statement”).

The distinguishing feature of a statement is that it is either true or false. “The moon is made of cheese” is a (false) statement and “1 + 1 = 2” is a (true, essentially by definition) statement. Fortunately, in order to deal with the logic of statements, we do not need to know whether a given statement is true or false: it might not be immediately obvious whether “113456 × 65421 = 880459536” is true or false but certainly it is a statement. A more interesting example is “There are infinitely many prime pairs.” where by a prime pair we mean a pair, p, p+2, of numbers, two apart, where both are prime (for instance 3 and 5 form a prime pair, as do 17 and 19 but not 19 and 21). It is a remarkable fact that, to date, no-one has been able to determine whether this statement is true or false. Yet it is surely[1] either false (after some large enough number there are no more prime pairs) or true (given any prime pair there is always a larger prime pair somewhere out there).

On the other hand, the following are not statements.
“Is 7 a prime number?”
“Add 1 and 1.”

The first is a question, the second a command. What about

“x is a prime number.”:

is this a statement? The answer is, “It depends.”: if the context is such that x already has been given a value then it will be a statement (since then either x is a prime number or it is not) but otherwise, if no value (or other sufficient information) has been assigned to x then it is not a statement.

Here’s a silly example (where we can’t tell whether something is a statement or not). Set x = 7 if there are infinitely many prime pairs but leave the value of x unassigned if there are not. Is “x is a prime number” a statement? Answer: (to date) we can’t tell! But this example is silly (the context we have set up is highly artificial) and quite off the path of what we will be doing.

[1] There are some issues there but they are more philosophical than mathematical.

When we discuss mathematical properties of, for instance, numbers, we use variables, x, y to stand for these numbers. This allows us to make general assertions. So we can say “for all integers x, y we have x + y = y + x” instead of listing all the examples of this assertion: ..., “0 + 1 = 1 + 0”, “1 + 1 = 1 + 1”, ..., “2 + 5 = 5 + 2”, ... (not that we could list all the assertions covered by this general assertion, since there are infinitely many of them). In the same way, although we will use particular statements as examples, most of the time we use variables p, q, r, s, t to stand for statements in order that we may make general assertions.[2]

[2] You might notice that in this paragraph I assigned different uses to the words “assertion” and “statement” (although earlier I said that I would use these interchangeably). This is because I was making statements about statements. That can be confusing, so I used “assertion” for the first (more general, “meta”, “higher”) type of use and “statement” for the second type of use. In logic we make statements about statements (and even statements about statements which are themselves statements about statements ...).

As indicated already, propositional logic is the logic of “and”, “or”, “not”, “implies” (as well as “iff” and other combinations of the connectives). The words in quotes are propositional connectives: they operate on propositions (and propositional variables) to give new propositions.

Initially we define these connectives somewhat informally in order to emphasise their intuitive meaning. Then we give their exact definition after we have been more precise about the context and have introduced the idea of (truth) valuation.

First, notation: we write ∧ for “and”; ∨ for “or”; ¬ for “not”; → for “implies” and ↔ for “iff”. So if p is the proposition “the moon is made of cheese” and q is the proposition “mice like cheese” then p ∧ q, p ∨ q, ¬p, p → q, p ↔ q respectively may be read as “the moon is made of cheese and mice like cheese”, “the moon is made of cheese or mice like cheese”, “the moon is not made of cheese”, “if the moon is made of cheese then mice like cheese” and “the moon is made of cheese iff mice like cheese”.

A crucial observation is that the truth value (true or false) of a statement obtained by using these connectives only depends on the truth values of the “component propositions”. Check through the examples given to see if you agree (you might have some doubts about the last two examples: we could discuss these). For another example, you may not know whether or not the following are true statements: “the third homology group of the torus is trivial”, “every artinian unital ring is noetherian” but you know that the combined statement “the third homology group of the torus is trivial and every artinian unital ring is noetherian” is true exactly if each of the separate statements is true. That is why it makes sense to apply these propositional connectives to propositional variables as well as to propositions.

So now the formal definition.

We start with a collection, p, q, r, p0, p1, ... of symbols which we call propositional variables. Then we define, by induction, the propositional terms by the following clauses:

(0) every propositional variable is a propositional term;
(i) if s and t are propositional terms then so are: s ∧ t, s ∨ t, ¬s, s → t, s ↔ t;
(ii) that’s it (more formally, there are no propositional terms other than those which are so by virtue of the first two clauses).

The terms seen in (i) are respectively called the conjunction (s ∧ t) and disjunction (s ∨ t) of s and t, ¬s is the negation of s, s → t is an implication and s ↔ t a biimplication.

Remark: Following the usual convention in mathematics we will use symbols such as p, q, respectively s, t, not just for individual propositional variables, respectively propositional terms, but also as variables ranging over propositional variables, resp. propositional terms (as we did just above).

The definition above is an inductive one, with (0) being the base case and (i) the induction step(s), but it’s a more complicated inductive structure than that which uses the natural numbers as “indexing structure”. For there are many base cases (any propositional variable), not just one (0 in ordinary induction) and there are (as given) five types of inductive step, not just one (“add 1” in ordinary induction).

Example 1.1.1. Start with propositional variables p, q, r; these are propositional terms by clause (0) and then, by clause (i), so are p ∧ p, p ∧ q, ¬q, q → p for instance. Then, by clause (i) again, (p ∧ p) ∧ p, (p ∧ q) → ¬r, (q → p) → (q → p) are further propositional terms. Further applications of clause (i) allow us to build up more and more complicated propositional terms. So you can see that these little clauses have large consequences. The last clause (ii) simply says that every propositional term has to be built up in this way.
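To make the inductive clauses concrete, here is a small illustrative sketch in Python; the tuple encoding and the helper names are mine, not notation from the course.

# Propositional terms as nested tuples: clause (0) gives variables, clause (i)
# builds compound terms. This encoding is for illustration only.

def var(p):     return ('var', p)
def neg(s):     return ('not', s)
def conj(s, t): return ('and', s, t)
def disj(s, t): return ('or', s, t)
def imp(s, t):  return ('imp', s, t)
def iff(s, t):  return ('iff', s, t)

def show(term):
    """Fully parenthesised string form, mirroring the formal definition."""
    kind = term[0]
    if kind == 'var':
        return term[1]
    if kind == 'not':
        return '(¬' + show(term[1]) + ')'
    symbol = {'and': '∧', 'or': '∨', 'imp': '→', 'iff': '↔'}[kind]
    return '(' + show(term[1]) + symbol + show(term[2]) + ')'

p, q, r = var('p'), var('q'), var('r')
print(show(imp(conj(p, q), neg(r))))   # ((p∧q)→(¬r)), built by repeated uses of clause (i)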

Notice how we have to use parentheses to write propositional terms. This is just like the use in arithmetic and algebra: without parentheses the expression (−3 + 5) × 4 would read −3 + 5 × 4 and the latter is ambiguous. At least, it would be if we had not become used to the hierarchy of arithmetical symbols by which (unary) − binds more closely than × and ÷, and those bind more closely than + and −. Of course parentheses are still needed but such a hierarchy reduces the number needed and leads to easier readability. A similar hierarchy is used for propositional terms, by which ¬ binds more closely than ∧ and ∨, which bind more closely than → and ↔ (at least those are my conventions, but they are not universal). Therefore ¬p ∧ q → r means ((¬p) ∧ q) → r rather than ¬(p ∧ q) → r or (¬p) ∧ (q → r) or ¬(p ∧ (q → r)) (at least it does to me; if in doubt, put in more parentheses).

You will recall that in order to prove results about things which are defined by induction (on the natural numbers) it is often necessary to use proof by induction. The same is true here: one deals with the base case (propositional variables) then the inductive steps. In this case there are five different types of inductive step but we’ll see later that some of the propositional connectives can be defined in terms of the others. For instance using ∧ and ¬ (or using → and ¬) we can define all the others. Having made that observation, we then need only prove the inductive steps for ∧ and ¬ (or for → and ¬).

Proofs of assertions about propositional terms which follow their inductive construction are often called “proofs by induction on complexity (of terms)”.

If we wish to be more precise about the set of propositional variables that we are using then we will let L (“L” for “language”) denote the set of propositional variables. We also introduce notation for the set of propositional terms built up from these, namely, set S0L = L and, having inductively (on n) defined the set SnL we define Sn+1L to be the set of all propositional terms which may be built from SnL using a single propositional connective and, so as to make this process cumulative, we also include SnL in Sn+1L. More formally:

Sn+1L = SnL ∪ {(s ∧ t), (s ∨ t), (¬s), (s → t), (s ↔ t) : s, t ∈ SnL}.

We also set SL = ⋃_{n≥0} SnL to be the union of all these - the set of all propositional terms (sometimes called sentences, hence the “S” in “SL”) which can be built up from the chosen base set, L = S0L, of propositional variables.[3]

Notice that we place parentheses around all the propositional terms we build; we discussed already that leaving these out could give rise to ambiguity in reading them: was “s ∧ t ∨ u” - a term in S2L - built up by applying ∧ to s, t ∨ u ∈ S1L or by applying ∨ to s ∧ t, u ∈ S1L, that is, should it be read as s ∧ (t ∨ u) or as (s ∧ t) ∨ u? In practice we can omit some pairs of parentheses without losing unique readability but, formally, those pairs are there.

In fact, although intuitively it might at first seem obvious that if we look at a propositional term in SL, then we can figure out how it was constructed - that is, there is a unique way of reading it - a bit more thought reveals that there is an issue: how do we detect the “last” connective in its construction? Clearly, if we can do that then we can proceed inductively to reconstruct its “construction tree”. We have been precise in setting things up so we should be able to prove unique readability (if it is true - which it is, as we show now).

Theorem 1.1.2. Let s ∈ SL be any propositional term. Then exactly one of the following is the case:
(a) s is a propositional variable;
(b) s has the form (t ∧ u) for some t, u ∈ SL;
(c) s has the form (t ∨ u) for some t, u ∈ SL;
(d) s has the form (¬t) for some t ∈ SL;
(e) s has the form (t → u) for some t, u ∈ SL;
(f) s has the form (t ↔ u) for some t, u ∈ SL.

Proof. Every propositional term s does have at least one of the listed forms: because s ∈ SL it must be that s ∈ SnL for some n and then, just by the definitions of S0L and Sn+1L, s does have such a form. We have to show that it has a unique such form. For this we introduce two lemmas and the following definitions: if s ∈ SL then by l(s) we denote the number of left parentheses, “(”, occurring in s and by r(s) we denote the number of right parentheses, “)”, occurring in s (for purposes of this definition we count all the parentheses that should be there).

Lemma 1.1.3. For every propositional term s we have l(s) = r(s).

Proof. This is an example of a proof by induction on complexity/construction of terms.

[3] A word about notation: I will tend to use p, q, r for propositional variables, s, t, u for propositional terms (which might or might not be propositional variables) and v for valuations (see later). That rather squeezes that part of the alphabet so I will sometimes use other parts and/or the Greek alphabet for propositional variables and terms.


If s ∈ S0L then l(s) = 0 = r(s) so the result is true if s ∈ S0L.

For the induction step, suppose that for every s ∈ SnL we have l(s) = r(s). Let s ∈ Sn+1L; then either there is t ∈ SnL such that s = (¬t) or there are t, u ∈ SnL such that s = (t ∧ u) or (t ∨ u) or (t → u) or (t ↔ u). Since t, u ∈ SnL, we have l(t) = r(t) and l(u) = r(u) by the inductive assumption. In the first case, s = (¬t), it follows that l(s) = 1 + l(t) = 1 + r(t) = r(s), as required. In the second case, s = (t ∧ u), we have, on counting parentheses, l(s) = 1 + l(t) + l(u) and r(s) = r(t) + r(u) + 1, and so l(s) = r(s), as required. The other three cases are similar and so we see that in all cases, l(s) = r(s). Thus the inductive step is proved and so is the lemma. □

Digression on proof by induction on complexity. At the start of the proof of 1.1.3 above I said that the proof would be by induction on complexity of terms but you might have felt that the proof was shaped as a proof by induction on the natural numbers N = {0, 1, 2, . . . }. That’s true; we used the sets SnL to structure the proof, and the proof by induction on complexity of terms was reflected in the various subcases that were considered when going from SnL to Sn+1L. But the proof could have been given without reference to the sets SnL. The argument - the various subcases - would be essentially the same; the hitherto missing ingredient is the statement of the appropriate Principle of Induction. Recall that, for N, that takes the form “Given a statement P(n), depending on n ∈ N, if P(0) is true and if from P(n) we can prove P(n+1), then P(n) is true for every n ∈ N.”[4] The corresponding statement for our “construction tree” for propositional terms is: “Given a statement P(s), depending on s ∈ SL, if P(p) is true for every propositional variable p and if, whenever P(s) and P(t) are true so are P(s ∧ t), P(s ∨ t), P(¬s), P(s → t) and P(s ↔ t), then P(s) is true for every s ∈ SL.”

[4] I follow the convention that 0 is a natural number; not followed by everyone but standard in mathematical logic.

Before the next lemma, notice that every propositional term can be thought of simply as a string of symbols which, individually, are either: propositional variables (p, q etc.), connectives (∧, ∨, ¬, →, ↔), or parentheses (left, right). Then the statement that s, as a string, is, for instance, xyz will mean that x, y, z are strings and, if we place them next to each other in the given order, then we get s. For instance if s′ is ¬¬(s ∧ (t ∨ u)) then we could write s′ as xyz where x, y, z are the strings x = ¬, y = ¬(s, z = ∧(t ∨ u)); we could even write s′ = xyzw with x, y, z as before and w the empty string (which we allow). We define the length, lng(x), of any string x to be the number of occurrences of symbols in it. We extend the notations l(x) and r(x) to count the numbers of left parentheses, right parentheses in any string x. If the string x has the form yz then we say that y is a left subword of x, a proper subword if z ≠ ∅; similarly z is a right subword of x, proper if y is not the empty string. (We will use the terms “string” and “word” interchangeably.)

Proposition 1.1.4. For every propositional term s, either s is a propositional variable or there is just one way of writing s in either of the forms s = (¬t) for some propositional term t or s = (t ∗ u) for some propositional terms t, u where ∗ is one of the binary propositional connectives.

Proof. We can suppose that s is not a propositional variable. Note that if s has the form (¬t) then the leftmost symbols of s are (¬, whereas if s has the form (t ∗ u) then its leftmost symbols are (( or (p where p is a propositional variable, so we can treat these two cases entirely separately.

In the first case, s = (¬t), this is the only possible way of writing it in this form because t is determined by s. Therefore, since, as we observed above, it cannot be written in the form (t ∗ u), there is no other way of writing s as a propositional term.

In the second case, we argue by contradiction and suppose that we can write s = (t ∗ u) = (t′ ∗′ u′) where t, u, t′, u′ are propositional terms and ∗, ∗′ are propositional connectives and, for the contradiction, that these are not identical ways of writing s, hence that either t is a proper left subword of t′ or t′ is a proper left subword of t. A contradiction will follow immediately once we have proved the following lemma. □

Lemma 1.1.5. If s is a propositional term and if s′ is a proper left subword of s then either s′ = ∅ or l(s′) − r(s′) > 0; in particular s′ is not a propositional term.

Similarly, if s′′ is a proper right subword of s then either s′′ = ∅ or r(s′′) − l(s′′) > 0, and s′′ is not a propositional term.

Proof. We know that s has the form (¬t) or (t ∗ u).

In the first case, s′ has one of the forms ∅, ( or (¬t′ where t′ is a left subword (possibly empty) of t. By induction on lengths of propositional terms we can assume that t′ = ∅ or l(t′) − r(t′) ≥ 0 (“>” if t′ is a proper left subword of t, “=” by 1.1.3 in the case t′ = t) and so, in each case, it follows that l(s′) − r(s′) > 0.

In the second case, s′ has one of the forms ∅, (, (t′ where t′ is a left subword of t, or (t ∗ u′ where u′ is a left subword of u. Again by induction on lengths of propositional terms we can assume that l(t′) − r(t′) ≥ 0 and l(u′) − r(u′) ≥ 0; checking each case, it follows that l(s′) − r(s′) > 0.

By 1.1.3 we deduce that s′ is not a propositional term.

Similarly for the assertion about right subwords. □
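The parenthesis-counting argument also suggests a mechanical way of recovering the “last” connective of a term. The following sketch is my own illustration (not part of the notes): it scans a fully parenthesised term and returns the one connective that occurs at parenthesis depth 1.

CONNECTIVES = set('¬∧∨→↔')

def main_connective(s):
    """Index and symbol of the main connective of a fully parenthesised
    compound term. By Lemma 1.1.5 no proper prefix of a bracketed subterm
    balances its parentheses, so exactly one connective is met at depth 1."""
    depth = 0
    for i, ch in enumerate(s):
        if ch == '(':
            depth += 1
        elif ch == ')':
            depth -= 1
        elif ch in CONNECTIVES and depth == 1:
            return i, ch
    raise ValueError('not a compound term')

print(main_connective('((p∧q)→(¬r))'))   # finds the →, not the ∧ or the ¬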

1.2 Valuations

Now for the key idea of a (truth) valuation. Fix some set L = S0L of propositional variables, and hence the corresponding set SL of propositional terms. A valuation on the set of propositional terms is a function v : SL → {T,F} to the 2-element set[5] {T,F} which satisfies the following conditions.[6]

For all propositional terms s, t we have
v(s ∧ t) = T iff v(s) = T and v(t) = T;
v(s ∨ t) = T iff v(s) = T or v(t) = T;
v(¬s) = T iff v(s) = F;
v(s → t) = T iff v(s) = F or v(t) = T;
v(s ↔ t) = T iff the values of v(s) and v(t) are the same: v(s) = v(t).

[5] Really, the two-element boolean algebra.
[6] Of course, T represents “true” and F “false”. Often the 2-element set {1, 0} is used instead, normally with 1 representing “true” and 0 representing “false”.


There’s quite a lot to say about this definition. We start with a key point. Namely, because all propositional terms are built up from the propositional variables using the propositional connectives, any valuation is completely determined by its values on the propositional variables (this, see 1.2.1(b) below, is the formal statement of the point we made (the “crucial observation”) when discussing mice, cheese and homology groups).

For instance if v(p) = v(q) = T and v(r) = F then we have, since v is a valuation, v(p ∨ r) = T and hence v(¬(p ∨ r)) = F. Similarly, for any propositional term, t, built from p, q and r, the value v(t) is determined by the above choices of v(p), v(q), v(r). That does actually need proof. There is the, rather obvious and easily proved by induction, point that this process works (in the sense that it gives a value), but there’s also the more subtle point that if there were more than one way of building up a propositional term then, conceivably, one construction route might lead to the valuation T and the other to F. But we have seen already in 1.1.4 that this does not, in fact, happen: every propositional term has a unique “construction tree”. Therefore if v0 is a function from the set, S0L, of propositional variables to the set {F,T} then this extends to a unique valuation v on SL. In particular, if there are n propositional variables there will be 2^n valuations on the propositional terms built from them. We state this formally.

Proposition 1.2.1. Let L be a set of propositional variables.
(a) If v0 : L → {F,T} is any function then there is a valuation v : SL → {F,T} on propositional terms in L such that v(p) = v0(p) for every p ∈ L.
(b) If v and w are valuations on SL and if v(p) = w(p) for all p ∈ L then v = w (so the valuation in part (a) is unique).
(c) If t is a propositional term and if v and w are valuations which agree on all propositional variables occurring in t then v(t) = w(t).

The proof of part (c), which is a slight strengthening of (b), is left as an exercise. In order to prove it we could prove the following statement first (by induction on complexity of terms):
if L′ ⊆ L are sets of propositional variables then for every n, SnL′ ⊆ SnL; furthermore, if v′ is a valuation on SL′ and v is a valuation on SL such that v(p) = v′(p) for every p ∈ L′ then v(t) = v′(t) for every t ∈ SL′.
From that, part (c) follows easily (take L′ to be the set of propositional variables actually occurring in t). (You might have noticed that I didn’t actually define what I mean by a propositional variable occurring in a propositional term; I hope the meaning is clear but it is easy to give a definition by, what else, induction on complexity of terms.)
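To see 1.2.1(a) in action, here is a small sketch of mine (reusing the illustrative tuple encoding from the earlier sketch, not anything in the notes) of how an assignment v0 on the propositional variables extends, clause by clause, to every term:

def evaluate(term, v0):
    """Extend v0 : variables -> {True, False} to any term, following the
    defining clauses of a valuation (True plays the role of T, False of F)."""
    kind = term[0]
    if kind == 'var':
        return v0[term[1]]
    if kind == 'not':
        return not evaluate(term[1], v0)
    a, b = evaluate(term[1], v0), evaluate(term[2], v0)
    if kind == 'and':
        return a and b
    if kind == 'or':
        return a or b
    if kind == 'imp':
        return (not a) or b
    if kind == 'iff':
        return a == b
    raise ValueError(kind)

# v(p) = v(q) = T and v(r) = F force v(¬(p ∨ r)) = F, as in the text.
v0 = {'p': True, 'q': True, 'r': False}
print(evaluate(('not', ('or', ('var', 'p'), ('var', 'r'))), v0))   # False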

Truth tables are tables showing evaluation of valuations on propositional terms. They can also be used to show the effect of the propositional connectives on truth values. Note that “or” is used in the inclusive sense (“one or the other or both”) rather than the exclusive sense (“one or the other but not both”).

p  q  p ∧ q
T  T    T
T  F    F
F  T    F
F  F    F

p  q  p ∨ q
T  T    T
T  F    T
F  T    T
F  F    F

p  q  p → q
T  T    T
T  F    F
F  T    T
F  F    T

p  ¬p
T   F
F   T

p  q  p ↔ q
T  T    T
T  F    F
F  T    F
F  F    T

You might feel that the truth table for → does not capture what you consider to be the meaning of “implies” but, if we are to regard it as a function on truth values (whatever the material connection or lack thereof between its “input” propositions) then the definition given is surely the right one. Or just regard p → q as an abbreviation for ¬p ∨ q, “(not-p) or q”, since they have the same truth tables. The following example might make the reading of p → q as meaning ¬p ∨ q reasonable: let p be “n = 1” and let q be “(n − 1)(n − 2) = 0”, so p → q reads “n = 1 implies (n − 1)(n − 2) = 0” or “If n = 1 then (n − 1)(n − 2) = 0” and then consider setting n = 1, 2, 3, . . . in turn and think about the truth values of p, q and p → q.

You will have seen examples of truth tables in the first year Sets, Numbers and Functions course. Recall that they can be used to determine whether a propositional term t is a tautology, meaning that v(t) = T for every valuation v. The “opposite” notion is that v(t) = F for every valuation v; in that case we say that t is unsatisfiable (also called “a contradiction” though that’s not good terminology to use when we’ll be drawing the distinction between syntax and semantics). Notice that the use of truth tables implicitly assumes part (c) of 1.2.1.

We say that two propositional terms, s and t, are logically equivalent, and write s ≡ t, if v(s) = v(t) for every valuation v. It is equivalent to say that s ↔ t is a tautology. Let’s prove that.

Suppose s ≡ t so, if v is any valuation, then v(s) = v(t) so, from the definition of valuation, v(s ↔ t) = T. This is so for every valuation so, by the definition of tautology, s ↔ t is a tautology. For the converse, suppose that s ↔ t is a tautology and let v be any valuation. Then v(s ↔ t) = T and so (again, by the definition of valuation) v(s) = v(t). Thus, by definition of equivalence, s and t are logically equivalent. We see that the proof was just an easy exercise from the definitions.

Now for the semantic notion of entailment; we contrast “semantics” (“meaning” or, at least, notions of being true and false) with “syntax” (construction and manipulation of strings of symbols). If S is a set of propositional terms and t is a propositional term then we write S |= t if for every valuation v with v(S) = T, by which we mean v(s) = T for every s ∈ S, we have v(t) = T: “whenever S is true so is t”.

Extending the above notions we say that a set S of propositional terms is tautologous if v(S) = T for every valuation v and S is unsatisfiable if for every valuation v there is some s ∈ S with v(s) = F - in other words, if no valuation makes all the terms in S true. We also say that S is satisfiable if there is at least one valuation v with v(S) = T. So note: tautologous means every valuation makes all terms in S true; satisfiable means that some valuation makes all terms in S true; unsatisfiable means that no valuation makes all terms in S true.

Lemma 1.2.2. Let S be a set of propositional terms and let t, t′, u be propositional terms.
(a) S |= t iff S ∪ {¬t} is unsatisfiable.
(b) S ∪ {t} |= u iff S |= t → u.
(c) S ∪ {t, t′} |= u iff S ∪ {t ∧ t′} |= u.

Proof. These are all simple consequences of the definitions. Before we begin, we introduce a standard and slightly shorter notation: instead of writing S ∪ {t1, . . . , tk} |= u we write S, t1, . . . , tk |= u.

(a) S ∪ {¬t} is unsatisfiable iff
for all valuations v, we have v(s) = F for some s ∈ S or v(¬t) = F iff
for all valuations v, if v(s) = T for all s ∈ S then v(¬t) = F iff
for all valuations v, if v(s) = T for all s ∈ S then v(t) = T iff
S |= t.

(b) S ∪ {t} |= u iff
for every valuation v, if v(s) = T for all s ∈ S and v(t) = T then v(u) = T iff
for every valuation v with v(s) = T for all s ∈ S, if v(t) = T then v(u) = T iff
for every valuation v with v(s) = T for all s ∈ S, v(t → u) = T (by the truth table for “→”) iff
S |= t → u.

(c) S ∪ {t ∧ t′} |= u iff
for every valuation v with v(s) = T for all s ∈ S and v(t ∧ t′) = T we have v(u) = T iff
(by the truth table for ∧) for every valuation v with v(s) = T for all s ∈ S and v(t) = T and v(t′) = T, we have v(u) = T iff
S ∪ {t, t′} |= u. □

We can use truth tables to determine whether or not S |= u (assuming S is a finite (and, in practice, not very large) set) but this can take a long time: if there are n propositional variables appearing then we need to compute a truth table with 2^n rows. The next section describes a method which sometimes is more efficient.
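That brute-force check is easy to mechanise. The following sketch is my own illustration (using the same tuple encoding as the earlier sketches, not the notes’ notation); it decides whether S |= t by running through all 2^n valuations on the variables that occur.

from itertools import product

def evaluate(term, v0):
    kind = term[0]
    if kind == 'var':  return v0[term[1]]
    if kind == 'not':  return not evaluate(term[1], v0)
    a, b = evaluate(term[1], v0), evaluate(term[2], v0)
    return {'and': a and b, 'or': a or b, 'imp': (not a) or b, 'iff': a == b}[kind]

def variables(term, acc=None):
    """Collect the propositional variables occurring in a term."""
    acc = set() if acc is None else acc
    if term[0] == 'var':
        acc.add(term[1])
    else:
        for sub in term[1:]:
            variables(sub, acc)
    return acc

def entails(S, t):
    """True iff every valuation making everything in S true makes t true."""
    vs = sorted(set().union(variables(t), *(variables(s) for s in S)))
    for row in product([True, False], repeat=len(vs)):
        v0 = dict(zip(vs, row))
        if all(evaluate(s, v0) for s in S) and not evaluate(t, v0):
            return False               # a counter-valuation: S true but t false
    return True

p, q, r = ('var', 'p'), ('var', 'q'), ('var', 'r')
print(entails([('imp', p, q), ('imp', q, r)], ('imp', p, r)))   # True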

1.3 Beth trees

Beth trees provide a method, often more efficient than, and perhaps more interesting than, truth tables, of testing whether a collection of propositional terms is satisfiable or not (and, if it is satisfiable, of giving a valuation demonstrating this). Note that this includes testing whether a propositional term is a tautology, whether one term implies another, whether S |= t, et cetera.

The input to the method consists of two sets S, T of propositional terms; to distinguish between these we will write the typical input as S|T. The output will, if we carry the method to its conclusion (which for some purposes will be more than we need to do), be all valuations with v(S) = T and v(T) = F. So if the output is nonempty then we know that S ∪ {¬t : t ∈ T} is satisfiable. For instance, t is a tautology if the output from the pair ∅|{t} is empty (which often will be easier than checking whether the output of {t}|∅ is all valuations).

The actual computation has the form of a tree (as usual in mathematics, trees grow downwards) and, at each node of the tree, there will be a pair of the form S′|T′. A node (of a fully or partially-computed Beth tree) is terminal if it has no node beneath it. A node is a leaf if all the propositional terms at it are propositional variables. Directly underneath each non-terminal node is either a branch segment with another node at its end, or two branch segments with a node at the end of each. A key feature of the tree is that if a node lies under another then the lower one contains fewer propositional connectives. That means that if the initial data contains k propositional connectives then no branch can contain more than k+1 nodes. And that means that the computation of the tree will terminate.

Before we describe how to compute such trees, here, in order to anchor ideas, is an example.

Example 1.3.1. We determine whether or not ¬p, (p ∧ q) → r |= ¬r → (q → p). We will build a tree beginning with the input ¬p, (p ∧ q) → r | ¬r → (q → p) since there will be a valuation satisfying this condition exactly if ¬p, (p ∧ q) → r |= ¬r → (q → p) does not hold.

¬p, (p ∧ q) → r | ¬r → (q → p)
 └─ (p ∧ q) → r | p, ¬r → (q → p)
     └─ ¬r, (p ∧ q) → r | p, q → p
         └─ ¬r, q, (p ∧ q) → r | p, p
             └─ q, (p ∧ q) → r | r, p
                 ├─ q | r, p, p ∧ q
                 │    ├─ q | q, r, p
                 │    └─ q | r, p
                 └─ q, r | r, p

The property (∗) below implies that a valuation v satisfies the input conditions (making both ¬p and (p ∧ q) → r true but making ¬r → (q → p) false) iff it satisfies at least one of the leaves. But we can see immediately that the only leaf satisfied by any valuation is q | r, p, which is satisfied by the valuation v with v(q) = T, v(r) = F, v(p) = F. So there is a valuation making both ¬p and (p ∧ q) → r true but making ¬r → (q → p) false. That is, ¬r → (q → p) does not follow from ¬p and (p ∧ q) → r.

We will list the allowable rules for generating the nodes directly under a given node. To make sense of these, we first explain the idea. The property that we want is the following:

(∗) If, at any stage of the construction of the tree with initial node S|T, the currently terminal nodes are S1|T1, ..., Sk|Tk then, for every valuation v, we have v(S) = T and v(T) = F iff v(Si) = T and v(Ti) = F for some i.


For this section, when I write v(T) = F I mean v(t) = F for every t ∈ T. This is a convenient, but bad (because easily misinterpreted), notation.

In order for this property to hold it is enough to have the following two:

(∗1) if a node S′|T′ is immediately followed by a single node S1|T1 then, for every valuation v we have v(S′) = T and v(T′) = F iff v(S1) = T and v(T1) = F;

(∗2) if a node S′|T′ is immediately followed by the nodes S1|T1 and S2|T2 then, for every valuation v we have: v(S′) = T and v(T′) = F iff [v(S1) = T and v(T1) = F] or [v(S2) = T and v(T2) = F].

(The fact that these are enough can be proved by an inductive argument.)

In the pair S|T you can think of the left hand side as the “positive” statements and those on the right as the “negative” ones. Each rule involves either moving one term between the positive and negative sides (with appropriate change to the term) or splitting one pair into two. Here are the allowable rules.

S, ¬t | T
 └─ S | t, T

S | ¬t, T
 └─ S, t | T

S, s ∧ t | T
 └─ S, s, t | T

S | s ∧ t, T
 ├─ S | s, T
 └─ S | t, T

S, s ∨ t | T
 ├─ S, s | T
 └─ S, t | T

S | s ∨ t, T
 └─ S | s, t, T

S, s → t | T
 ├─ S | s, T
 └─ S, t | T

S | s → t, T
 └─ S, s | t, T

In lectures we will explain a few of these but you should think through why each one is valid (that is, satisfies (∗1) or (∗2), as appropriate). You should also note that they cover all the cases - together they allow a single pair to be input and will output a tree where every terminal node is a leaf. When constructing a Beth tree there may well be some nodes where there is a choice as to which rule to apply but no choice of applicable rule is wrong (though some choices might lead to a shorter computation).
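For readers who like to experiment, here is a small sketch, entirely my own and not the notes’ algorithm, of the rules as a recursive procedure (reusing the tuple encoding of the earlier sketches). It covers ¬, ∧, ∨ and →, the connectives listed above, and returns the leaves of a fully computed tree.

def expand(S, T):
    """Apply the rules to the node S|T until every term is a variable; return
    the leaves as pairs (variables on the left, variables on the right)."""
    S, T = list(S), list(T)
    for i, s in enumerate(S):                     # a compound term on the left?
        if s[0] != 'var':
            rest = S[:i] + S[i+1:]
            if s[0] == 'not':
                return expand(rest, T + [s[1]])
            if s[0] == 'and':
                return expand(rest + [s[1], s[2]], T)
            if s[0] == 'or':
                return expand(rest + [s[1]], T) + expand(rest + [s[2]], T)
            if s[0] == 'imp':
                return expand(rest, T + [s[1]]) + expand(rest + [s[2]], T)
    for i, t in enumerate(T):                     # a compound term on the right?
        if t[0] != 'var':
            rest = T[:i] + T[i+1:]
            if t[0] == 'not':
                return expand(S + [t[1]], rest)
            if t[0] == 'and':
                return expand(S, rest + [t[1]]) + expand(S, rest + [t[2]])
            if t[0] == 'or':
                return expand(S, rest + [t[1], t[2]])
            if t[0] == 'imp':
                return expand(S + [t[1]], rest + [t[2]])
    return [({s[1] for s in S}, {t[1] for t in T})]

def open_leaves(S, T):
    """Leaves with no variable on both sides; these are the satisfiable ones."""
    return [leaf for leaf in expand(S, T) if not leaf[0] & leaf[1]]

p, q, r = ('var', 'p'), ('var', 'q'), ('var', 'r')
print(open_leaves([('not', p), ('imp', ('and', p, q), r)],
                  [('imp', ('not', r), ('imp', q, p))]))
# every open leaf corresponds to the valuation q = T, p = F, r = F of Example 1.3.1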

Example 1.3.2. We use Beth trees to show that p ∧ q → p is a tautology. We already suggested that it might be easier to do the equivalent thing of showing that ¬(p ∧ q → p) is unsatisfiable; here’s the computation for that.

∅ | p ∧ q → p
 └─ p ∧ q | p
     └─ p, q | p

- and clearly no valuation can make both p, q true but make p false; we conclude that p ∧ q → p is a tautology.

For comparison here is the direct check that p ∧ q → p is a tautology.

p ∧ q → p | ∅
 ├─ ∅ | p ∧ q
 │    ├─ ∅ | p
 │    └─ ∅ | q
 └─ p | ∅

Now note that every valuation satisfies the condition expressed by at least one of the leaves, so p ∧ q → p is indeed a tautology.

1.4 Normal forms

First, we look at some more basic properties of logical equivalence where, recall, two propositional terms s, t are said to be logically equivalent, s ≡ t, if v(s) = v(t) for every valuation v (and by 1.2.1(c) it is enough to check for valuations on just the propositional variables actually occurring in s or t).

Lemma 1.4.1. If s, t are propositional terms then:
(i) s ≡ t iff
(ii) s |= t and t |= s iff
(iii) |= s ↔ t iff
(iv) s ↔ t is a tautology.

Proof. All this is immediate from the definitions. For instance, to prove (iv)⇒(i) let v be any valuation; then, assuming (iv), v(s ↔ t) = T and, by definition of valuation, we see this can happen only if v(s) = v(t), as required. □

Note that this is an equivalence relation on SL; that is, it is reflexive (s ≡ s), symmetric (s ≡ t implies t ≡ s) and transitive (s ≡ t and t ≡ u together imply s ≡ u).

Here are some, easily checked, basic logical equivalences. For any propositional terms s, t, u:

s ∧ t ≡ t ∧ s;
s ∨ t ≡ t ∨ s;
¬(s ∧ t) ≡ ¬s ∨ ¬t;
¬(s ∨ t) ≡ ¬s ∧ ¬t;
¬¬s ≡ s;
s → t ≡ ¬s ∨ t;
(s ∧ t) ∧ u ≡ s ∧ (t ∧ u), so we can write s ∧ t ∧ u without ambiguity;
(s ∨ t) ∨ u ≡ s ∨ (t ∨ u), so we can write s ∨ t ∨ u without ambiguity;
(s ∧ t) ∨ u ≡ (s ∨ u) ∧ (t ∨ u);
(s ∨ t) ∧ u ≡ (s ∧ u) ∨ (t ∧ u);
s ∧ s ≡ s;
s ∨ s ≡ s.

Proposition 1.4.2. Suppose that s, s′, t, t′ are propositional terms with s ≡ s′ and t ≡ t′. Then:
(i) ¬s ≡ ¬s′;
(ii) s ∧ t ≡ s′ ∧ t′;
(iii) s ∨ t ≡ s′ ∨ t′;
(iv) s → t ≡ s′ → t′.

Proof. To prove (ii): suppose v(s ∧ t) = T. Then by the truth table for ∧, both v(s) = T and v(t) = T; so v(s′) = T and v(t′) = T and hence v(s′ ∧ t′) = T. The converse implication holds by the same argument with the roles of s, t and s′, t′ exchanged. The other parts are equally easy. □

We introduce notations for multiple conjunctions and disjunctions; they are completely analogous to the use of Σ for repeated +. Given propositional terms s1, . . . , sn we define ∧_{i=1}^n si by induction: ∧_{i=1}^1 si = s1 and ∧_{i=1}^{k+1} si = (∧_{i=1}^k si) ∧ s_{k+1}. Similarly we define ∨_{i=1}^n si. Because of associativity and commutativity of ∧, respectively of ∨, if we permute the terms in such a repeated conjunction or disjunction, then we obtain an equivalent propositional term. Indeed, we have the following (the proofs of which are left as exercises).

Proposition 1.4.3. If s1, . . . , sn are propositional terms and v is a valuation then:
(i) v(∧_{i=1}^n si) = T iff v(si) = T for all i = 1, . . . , n;
(ii) v(∨_{i=1}^n si) = T iff v(si) = T for some i ∈ {1, . . . , n};
(iii) ∧_{i=1}^n si ≡ ¬∨_{i=1}^n ¬si;
(iv) ∨_{i=1}^n si ≡ ¬∧_{i=1}^n ¬si.

Proposition 1.4.4. Suppose that s1, . . . , sn and t1, . . . , tm are sequences of propositional terms such that {s1, . . . , sn} = {t1, . . . , tm} (thus the sequences differ only in the order of their terms and possible repetitions). Then ∨_{i=1}^n si ≡ ∨_{j=1}^m tj and ∧_{i=1}^n si ≡ ∧_{j=1}^m tj.

If S = {s1, . . . , sn} is a finite set of propositional terms then we write ∨S for ∨_{i=1}^n si and ∧S for ∧_{i=1}^n si. What if S = ∅? Since, roughly, the more conjuncts there are in ∧S the harder it is to be true, it makes some sense to define ∧∅ to be any tautology (i.e. always true). Dually we define ∨∅ to be any unsatisfiable term (so false under every valuation). (Because we are only interested in the truth values of ∨∅ and ∧∅ it doesn’t matter which tautology and which contradiction are chosen.)

A little more terminology: given a set L of propositional variables, we refer to any propositional variable p, or any negation, ¬p, of a propositional variable as a literal.

We are going to show that every propositional term is equivalent to one which is in a special form (indeed, there are two special forms: disjunctive and conjunctive).


A propositional term is in disjunctive normal form if it has the form ∨_{i=1}^n ∧_{j=1}^{mi} gij where each gij is a literal.

Proposition 1.4.5. (Disjunctive Normal Form Theorem) If t ∈ SL then there is a propositional term s ∈ SL which is in disjunctive normal form and such that s ≡ t. If {p1, . . . , pk} are the propositional variables appearing in t then we may suppose that s has the form ∨_{i=1}^n ∧_{j=1}^{mi} gij with each mi ≤ k and with n ≤ 2^k.

Proof. Let v1, . . . , vn be the distinct valuations v on {p1, . . . , pk} such that v(t) = T. For each i = 1, . . . , n and j = 1, . . . , k, set gij = pj if vi(pj) = T and gij = ¬pj if vi(pj) = F.

Note that vi(∧_{j=1}^k gij) = T and that if v′ ≠ vi is any other valuation on {p1, . . . , pk} then v′(∧_{j=1}^k gij) = F. It follows that if w is any valuation on {p1, . . . , pk} then w(∨_{i=1}^n ∧_{j=1}^k gij) = T iff w is one of v1, . . . , vn. Therefore for any valuation v, v(∨_{i=1}^n ∧_{j=1}^k gij) = v(t), so t and ∨_{i=1}^n ∧_{j=1}^k gij are logically equivalent, as required.

For the final statement, note that there are 2^k distinct valuations on {p1, . . . , pk}. □

The proof shows how to go about actually constructing an equivalent propositional term in disjunctive normal form, using either truth tables or, the proof slightly modified, Beth trees.

Example 1.4.6. Consider the propositional term t = (p ∧ q) → (¬p ∨ r). If we construct its truth table then we find 7 rows/valuations on {p, q, r} which make it true. For each of these we form the corresponding “gij”. For instance, the valuation v1(p) = v1(q) = v1(r) = T is one of those making t true and the corresponding term is p ∧ q ∧ r. Another row where t is true is that where p is true and q and r are false, so the corresponding term is p ∧ ¬q ∧ ¬r. Et cetera, giving the disjunctive normal form term

(p∧q∧r) ∨ (p∧¬q∧r) ∨ (p∧¬q∧¬r) ∨ (¬p∧q∧r) ∨ (¬p∧q∧¬r) ∨ (¬p∧¬q∧r) ∨ (¬p∧¬q∧¬r)

equivalent to t.

Normal forms are, however, not unique and you might note that, for example, the last four disjuncts can be replaced by the logically equivalent term ¬p. From this point of view, Beth trees are more efficient, as we can illustrate with this example. If we construct a Beth tree starting with p ∧ q | ¬p ∨ r then very quickly we reach the single leaf p, q | r, which corresponds to the single valuation making (p ∧ q) → (¬p ∨ r) false. That corresponds to the term p ∧ q ∧ ¬r, so t is equivalent to the negation of this, namely ¬(p ∧ q ∧ ¬r), which is equivalent to ¬p ∨ ¬q ∨ r - a much simpler disjunctive normal form.

You might instead construct a Beth tree starting from (p ∧ q) → (¬p ∨ r) | ∅. Taking an obvious sequence of steps leads to a completed tree with the leaves ∅ | p, ∅ | q and r | ∅. These correspond to the (conjunctions of) literals: ¬p, ¬q, r respectively. Therefore this also leads to the form ¬p ∨ ¬q ∨ r.
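The truth-table construction in the proof of 1.4.5 is entirely mechanical; here is a sketch of it (mine, with the same illustrative tuple encoding as before) which reproduces the seven-disjunct normal form found above.

from itertools import product

def evaluate(term, v0):
    kind = term[0]
    if kind == 'var':  return v0[term[1]]
    if kind == 'not':  return not evaluate(term[1], v0)
    a, b = evaluate(term[1], v0), evaluate(term[2], v0)
    return {'and': a and b, 'or': a or b, 'imp': (not a) or b, 'iff': a == b}[kind]

def dnf(term, variables):
    """One disjunct per valuation making the term true, as in the proof of 1.4.5."""
    disjuncts = []
    for row in product([True, False], repeat=len(variables)):
        v0 = dict(zip(variables, row))
        if evaluate(term, v0):
            literals = [p if v0[p] else '¬' + p for p in variables]
            disjuncts.append('(' + '∧'.join(literals) + ')')
    return '∨'.join(disjuncts) if disjuncts else '<unsatisfiable>'

p, q, r = ('var', 'p'), ('var', 'q'), ('var', 'r')
t = ('imp', ('and', p, q), ('or', ('not', p), r))
print(dnf(t, ['p', 'q', 'r']))   # the seven disjuncts of Example 1.4.6, in the same order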

The dual form is as follows: A propositional term is in conjunctive normal form if it has the form ∧_{i=1}^n ∨_{j=1}^{mi} gij where each gij is a literal.


Proposition 1.4.7. If t ∈ SL then there is a propositional term s ∈ SL which is in conjunctive normal form and such that s ≡ t. If {p1, . . . , pk} are the propositional variables appearing in t then we may suppose that s has the form ∧_{i=1}^n ∨_{j=1}^{mi} gij with each mi ≤ k and with n ≤ 2^k.

Proof. The term t is logically equivalent to ¬¬t and, by 1.4.5, ¬t is equivalent to some term of the form ∨_{i=1}^n ∧_{j=1}^{mi} gij. So t is equivalent to ¬∨_{i=1}^n ∧_{j=1}^{mi} gij which, using De Morgan’s laws (the third and fourth on the list of identities after 1.4.1), is in turn equivalent to ∧_{i=1}^n ∨_{j=1}^{mi} ¬gij. Since each ¬gij is a literal (at least once we cancel double negations), the result follows. □

1.5 Adequate sets of connectives

The proof of 1.4.5 actually shows that every truth table on a set, p1, . . . , pk, of propositional variables can be generated from them by using the propositional connectives ∧, ∨, ¬. More precisely, every propositional term t in p1, . . . , pk defines a function, evaluation-at-t, from the set Val_{p1,...,pk} of valuations v on p1, . . . , pk, to {T,F}. Conversely, given any function e : Val_{p1,...,pk} → {T,F}, one may construct, using ∧, ∨ and ¬, a propositional term t such that e is just evaluation at t. If we change the set of propositional connectives that we are “allowed to use” then we can ask the same question. For instance, using just ∧ and ∨ can we construct every truth table/build a term inducing any given evaluation e? What if we use ¬ and →? And other such questions (the five we have introduced are not the only possible connectives, indeed not even the only ones which occur in nature, or at least in Computer Science, where one also sees NAND=Sheffer stroke, NOR, XOR).[7]

[7] We won’t formulate the general question because then we would have to give a general definition of “(n-ary) propositional connective” and would be hard-pressed to distinguish these from propositional terms.

We say that a set, S, of propositional connectives is adequate if for every propositional term t (in any number of propositional variables) there is a term t′ constructed using just the connectives in S such that t and t′ are “logically equivalent”.[8] By “logically equivalent” we mean that they “have the same truth tables” or, a bit more precisely, they define the same function from Val_{p1,...,pn} to {T,F}.

[8] Notice that if S includes some “new” propositional connectives then we have to extend our definitions of “propositional term” etc. to allow these. That’s why I used quotation marks just then.

Example 1.5.1. The set {∧,¬} is adequate.

We have already commented that {∧,∨,¬} is adequate so we need only note that s ∨ t ≡ ¬(¬s ∧ ¬t).

Example 1.5.2. The NAND gate/operator or Sheffer stroke is a binary (i.e. has two inputs) propositional connective whose effect is as shown in the truth table below.

p  q  p|q
T  T   F
T  F   T
F  T   T
F  F   T


You can see from this that p|q is logically equivalent to ¬(p ∧ q), hence the name “NAND”.

If we take our set of connectives to be just S = {|} then we have to re-define “propositional term” by saying: every propositional variable is a propositional term; if s, t are propositional terms then so is s|t. We can refer to these as “(propositional) terms built using | (only)” and can write S|(L) for the set of such terms.

It is easy to show that {|} is adequate. All we have to do is to show that, given propositional variables p, q we can find terms using | only which are equivalent to ¬p and p ∧ q - because we know that {¬,∧} is an adequate set of connectives. Indeed, it is easy to check that p|p is equivalent to ¬p and hence that (p|q)|(p|q) is equivalent to p ∧ q.
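These two equivalences are quick to verify mechanically; here is a minimal check of mine (not from the notes), exhausting the four rows of the truth table.

def nand(a, b):
    """The Sheffer stroke on truth values: p|q is false exactly when both are true."""
    return not (a and b)

rows = [(a, b) for a in (True, False) for b in (True, False)]
print(all(nand(a, a) == (not a) for a, _ in rows))                      # p|p ≡ ¬p
print(all(nand(nand(a, b), nand(a, b)) == (a and b) for a, b in rows))  # (p|q)|(p|q) ≡ p∧q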

Showing that a given set S of connectives is not adequate can take more thought: how can one show that some propositional terms do not have equivalents built only using connectives from S?

Example 1.5.3. One might feel that, intuitively, ∧ and ∨ together are not adequate since they are both “positive”. How can one turn that intuition into a proof? One would like to show, for instance, that no term built only using ∧ and ∨ can be logically equivalent to ¬p but, even using only the single propositional variable p, there are infinitely many propositional terms to check. That might suggest trying some sort of inductive proof. But a proof of what statement?

What we can do is to prove by induction on complexity/length of a term that: if t is any term built only from ∧ and ∨ then for every valuation v such that all the propositional variables appearing in t are assigned the value T by v, we also have v(t) = T. Once that is done, we can deduce, in particular, that no term built only using ∧ and ∨ can be equivalent to ¬p.
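The inductive invariant can be seen directly in code; a small sketch of my own (illustration only, same tuple encoding as before):

def all_true_value(term):
    """Value of a term built only from ∧ and ∨ when every variable is assigned T;
    the induction of Example 1.5.3 says this is always True."""
    kind = term[0]
    if kind == 'var':
        return True                       # base case: variables get T
    left, right = all_true_value(term[1]), all_true_value(term[2])
    if kind == 'and':
        return left and right             # T ∧ T is T
    if kind == 'or':
        return left or right              # T ∨ T is T
    raise ValueError('only ∧ and ∨ are allowed here')

p, q = ('var', 'p'), ('var', 'q')
print(all_true_value(('or', ('and', p, q), p)))   # True, whereas ¬p would need F here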

1.6 Interpolation

Suppose that s and t are propositional terms and that s |= t, equivalently s → t is a tautology. This could be for the trivial reasons that either s is always false (unsatisfiable) or that t is always true (a tautology). But if that’s not the case then the interpolation theorem guarantees that there is some propositional term u which involves only the propositional variables appearing in both s and t such that s |= u and u |= t. Such a u is referred to as an interpolant between s and t.

Theorem 1.6.1. (Interpolation Theorem) Suppose that s ∈ S(L1) and t ∈ S(L2) are such that s |= t. Then either s is unsatisfiable or t is a tautology or there is u ∈ S(L3), where L3 = L1 ∩ L2, such that s |= u and u |= t.

Proof. We suppose that s is satisfiable and that t is not a tautology; we must produce a suitable u.

Since s is satisfiable there is some valuation v1 on L1 such that v1(s) = T and since t is not a tautology there is some valuation v2 on L2 such that v2(t) = F.

First we show that L3 ≠ ∅. If this were not so, that is, if L1 and L2 had no propositional variables in common, then we could define a valuation v3 on S(L1 ∪ L2) by setting, for p ∈ L1 ∪ L2, v3(p) = v1(p) if p ∈ L1 and v3(p) = v2(p) if p ∈ L2. Then, by 1.2.1(c), we would have v3(s) = v1(s) = T and v3(t) = v2(t) = F, which contradicts the assumption that s |= t.

Now choose, by 1.4.5, a formula of S(L1) in disjunctive normal form which is equivalent to s, say ∨_{i=1}^n (∧_{j=1}^{li} gij ∧ ∧_{k=1}^{mi} hik) where we have separated out the literals into two groups: the gij - those belonging to S(L1) \ S(L3); the hik - those belonging to S(L3). (We allow that some of these conjuncts might be empty.) We can assume that each disjunct is satisfiable (we can drop any which are not). Define u to be ∨_{i=1}^n ∧_{k=1}^{mi} hik. Clearly u ∈ S(L3) and, if v is a valuation on S(L1) then, if v(s) = T, it must be that v(∧_{j=1}^{li} gij ∧ ∧_{k=1}^{mi} hik) = T for some i (by 1.4.3) and hence[9] v(∧_{k=1}^{mi} hik) = T and hence v(u) = T. Thus s |= u and we have just seen that u is satisfiable.

It remains to prove that u |= t. So let v be a valuation on S(L2) such that v(u) = T. Then there must be some i_0 such that v(∧_{k=1}^{m_{i_0}} h_{i_0,k}) = T. We define a valuation w on S(L1 ∪ L2) by setting, for p ∈ L1 ∪ L2:

w(p) = v(p) if p ∈ L2;
w(p) = T if p = g_{i_0,k} for some k;
w(p) = F if ¬p = g_{i_0,k} for some k;
w(p) = T, say, if p ∈ L1 \ L2 and is not already assigned a value, that is, if p does not occur in the i_0th disjunct.

Note that w(∧_{j=1}^{l_{i_0}} g_{i_0,j} ∧ ∧_{k=1}^{m_{i_0}} h_{i_0,k}) = T by construction and hence w(s) = T. But we assumed that s |= t and so w(t) = T. But w and v agree on all propositional variables in L2; hence v(t) = T.

We conclude that u |= t, which was what had remained to be proved. □

[9] You might reasonably ask what happens if, for this value of i, there are no hik conjuncts. That could happen but there must be at least one such value of i (that is, with v making the ith conjunct true) such that there is an hik. Otherwise, arguing as before, we could adjust the valuation v, keeping the same values on propositional variables in S(L1) \ S(L3) but adjusting it on those belonging to S(L3), so as to make t false, while keeping s true, contradicting that s |= t.

The proof gives an effective procedure for computing interpolants.
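As a sketch of that procedure (my own code, using the truth-table normal form of 1.4.5 rather than an arbitrary disjunctive normal form, and the same illustrative tuple encoding as earlier): take the satisfying valuations of s on L1, keep from each only the literals over L3 = L1 ∩ L2, and disjoin the results.

from itertools import product

def evaluate(term, v0):
    kind = term[0]
    if kind == 'var':  return v0[term[1]]
    if kind == 'not':  return not evaluate(term[1], v0)
    a, b = evaluate(term[1], v0), evaluate(term[2], v0)
    return {'and': a and b, 'or': a or b, 'imp': (not a) or b, 'iff': a == b}[kind]

def interpolant(s, L1, L3):
    """Disjunction, over valuations on L1 making s true, of the conjunctions of
    the L3-literals they determine (the h-part of each disjunct in the proof)."""
    seen, disjuncts = set(), []
    for row in product([True, False], repeat=len(L1)):
        v0 = dict(zip(L1, row))
        if evaluate(s, v0):
            h = tuple((p, v0[p]) for p in L3)
            if h not in seen:
                seen.add(h)
                disjuncts.append('(' + '∧'.join(p if val else '¬' + p
                                                for p, val in h) + ')')
    return '∨'.join(disjuncts)

p, r, s_var = ('var', 'p'), ('var', 'r'), ('var', 's')
left = ('and', ('imp', p, ('and', ('not', r), s_var)),
               ('or', p, ('and', r, ('not', s_var))))
print(interpolant(left, ['p', 'r', 's'], ['r', 's']))
# (¬r∧s)∨(r∧¬s), which matches Example 1.6.2 up to the order of the literals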

Example 1.6.2. Given that (p → (¬r ∧ s)) ∧ (p ∨ (r ∧ ¬s)) |= ((s → r) → t) ∨ (¬t → (r ∧ ¬s)), how do we find an interpolant involving r and s only? (Note that, in the notation of the proof, L1 = {p, r, s}, L2 = {r, s, t}, L3 = {r, s}.)

We find a term in disjunctive normal form which is logically equivalent to (p → (¬r ∧ s)) ∧ (p ∨ (r ∧ ¬s)); one such is (p ∧ s ∧ ¬r) ∨ (¬p ∧ r ∧ ¬s). Following the procedure in the proof, we obtain the interpolant u which is (s ∧ ¬r) ∨ (r ∧ ¬s).


Chapter 2

Deductive systems

“The design of the following treatise is to investigate the fundamental laws of those operations of the mind by which reasoning is performed; to give expression to them in the symbolical language of a Calculus, and upon this foundation to establish the science of Logic and construct its method; to make that method itself the basis of a general method for the application of the mathematical doctrine of Probabilities; and, finally, to collect from the various elements of truth brought to view in the course of these inquiries some probable intimations concerning the nature and constitution of the human mind.”

Thus begins Chapter 1 of George Boole’s “An Investigation of the Laws of Thought (on which are founded the Mathematical Theories of Logic and Probabilities)” (1854).

We have already seen the “symbolical language” (though not the way Boole wrote it) and what Boole meant by a Calculus (or Algebra). Now we discuss proof/deductive systems further.

Given a propositional term, we may test whether or not it is a tautology by, for example, constructing its truth table. This is regarded as a “semantic” test because it is in terms of valuations. The test is recursive in the sense that we have a procedure which, after a finite amount of time, is guaranteed to tell us whether or not the term is a tautology.

More generally, suppose that S is a finite set of propositional terms and that t is a propositional term. Recall that we write S |= t to mean that every valuation which makes everything in S true also makes t true. Checking whether or not this is true is also a recursive procedure.

In the case of predicate logic, however, it turns out that there is no corre-sponding algorithm for determining whether or not a propositional term (“sen-tence” in that context) is a tautology or whether the truth of a finite set ofpropositions implies the truth of another proposition.1 The best we can dois to produce a method of “generating” all tautologies or, more generally ofstarting with a set, S, of sentences/statements/propositions which we treat asaxioms and then generating all consequences of those axioms. Such a method of

1In fact the set of tautologies of predicate logic is “recursively enumerable” but not recur-sive. Saying that the set is recursively enumerable means that there is an algorithm whichwill output only tautologies and such that any tautology eventually will be output; but wecan’t predict when.


Such a method of generating consequences (and, of course, avoiding anything which is not a consequence) is a propositional or predicate calculus. In the following sections we will describe two such calculi for propositional logic. Of course, for classical propositional logic no such calculus is necessary because we have methods such as truth tables or Beth trees. But these calculi will serve as models of calculi for logics where there is no analogue of those recursive methods. It is also the case that these calculi do correspond to “Laws of Thought” in the sense that their axioms and rules of inference capture steps in reasoning that we use in practice.

The calculi that we will see here are considerably simpler than those for predicate logic but the main concepts and issues ((in)consistency, soundness, completeness, compactness, how one might prove completeness) are all present already in this simpler context, which therefore provides a good opportunity to understand these fundamental issues.

2.1 A Hilbert-style system for propositional logic

Our (Hilbert-style) calculus will consist of certain axioms and one rule of deduction (or rule of inference). There are infinitely many axioms, being all the propositional terms of one of the forms:

(i) s → (t → s)
(ii) (r → (s → t)) → ((r → s) → (r → t))
(iii) ¬¬s → s
(iv) (¬s → ¬t) → (t → s),

where r, s and t may be any propositional terms.
Thus, for instance, the following is an axiom: (p ∧ ¬r) → ((s ∨ t) → (p ∧ ¬r)).

We refer to (i)-(iv) as axiom schemas.

The single rule of deduction, modus ponens, says that, from s and s → t we may deduce t.

Then we define the notion of entailment or logical implication, written `, within this calculus. Let S be a set (not necessarily finite) of propositional terms and let s, t be propositional terms.

(i) If t is an axiom then S ` t (“logical axiom” LA)
(ii) If s ∈ S then S ` s (“non-logical axiom” NLA)
(iii) If S ` s and S ` s → t then S ` t (“modus ponens” MP)
(iv) That’s it (like the corresponding clause in the definition of propositional term).

We read S ` t as “S entails t” or “S logically implies (within this particular calculus) t”.

This definition is, like various definitions we have seen before, an inductive one: it allows chains of entailments. Here is an example of a deduction of p → r from S = {p → q, q → r}.

1. S ` q → r                                    NLA
2. S ` (q → r) → (p → (q → r))                  LA(i)
3. S ` p → (q → r)                              MP 1,2
4. S ` (p → (q → r)) → ((p → q) → (p → r))      LA(ii)
5. S ` (p → q) → (p → r)                        MP 3,4
6. S ` p → q                                    NLA
7. S ` p → r                                    MP 5,6


(The line numbers and right-hand entries are there to help any reader follow/check the deduction.)
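The checking just mentioned can itself be mechanised. The sketch below (my own illustration, not part of the notes) represents propositional terms as nested tuples and verifies a claimed deduction line by line: logical axioms are recognised by matching the four schemas, non-logical axioms by membership of S, and each application of modus ponens by inspecting the two earlier lines it cites. The final lines re-check the deduction of p → r displayed above.

def is_imp(f): return isinstance(f, tuple) and f[0] == '->'
def is_neg(f): return isinstance(f, tuple) and f[0] == 'not'

def is_axiom(f):
    """Is f an instance of one of the axiom schemas (i)-(iv)?"""
    if not is_imp(f):
        return False
    a, b = f[1], f[2]
    if is_imp(b) and b[2] == a:                                   # (i)   s -> (t -> s)
        return True
    if (is_imp(a) and is_imp(a[2]) and is_imp(b) and is_imp(b[1]) and is_imp(b[2])
            and a[1] == b[1][1] == b[2][1] and a[2][1] == b[1][2] and a[2][2] == b[2][2]):
        return True                                               # (ii)
    if is_neg(a) and is_neg(a[1]) and a[1][1] == b:               # (iii) not not s -> s
        return True
    if (is_imp(a) and is_neg(a[1]) and is_neg(a[2]) and is_imp(b)
            and a[1][1] == b[2] and a[2][1] == b[1]):             # (iv)  (not s -> not t) -> (t -> s)
        return True
    return False

def check(S, proof):
    """proof is a list of (formula, justification) pairs; a justification is 'LA', 'NLA' or
    ('MP', i, j), meaning that line j is (line i) -> (this line), with i, j earlier indices."""
    for n, (f, just) in enumerate(proof):
        if just == 'LA':
            assert is_axiom(f)
        elif just == 'NLA':
            assert f in S
        else:
            _, i, j = just
            assert i < n and j < n and proof[j][0] == ('->', proof[i][0], f)
    return True

imp = lambda a, b: ('->', a, b)
S = [imp('p', 'q'), imp('q', 'r')]
proof = [
    (imp('q', 'r'), 'NLA'),
    (imp(imp('q', 'r'), imp('p', imp('q', 'r'))), 'LA'),
    (imp('p', imp('q', 'r')), ('MP', 0, 1)),
    (imp(imp('p', imp('q', 'r')), imp(imp('p', 'q'), imp('p', 'r'))), 'LA'),
    (imp(imp('p', 'q'), imp('p', 'r')), ('MP', 2, 3)),
    (imp('p', 'q'), 'NLA'),
    (imp('p', 'r'), ('MP', 5, 4)),
]
print(check(S, proof))   # True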

Note that if S ⊆ T and if S ` s then T ` s, because any deduction (such as that above) of s from S may be changed into a deduction of s from T simply by replacing every occurrence of “S” by “T”. A point about notation: if we write something like “S ` t” in a mathematical assertion (as opposed to this being a line of a formal deduction) you should read this as saying “There is a deduction of t from S.”.

Warning: it can be surprisingly difficult to find deductions, even of simple things, in this calculus. The (“Gentzen-style/Natural Deduction”) calculus that we will use later allows deductions to be found more easily. Our main interest, however, is not in the details of the calculus but in the fact that there is a calculus for which one can prove a completeness theorem (2.1.12).

For any such deductive calculus there are two central issues: soundness and completeness. We say that a deductive calculus is sound if we cannot deduce things that we should not be able to deduce using it, equivalently if we cannot deduce contradictions by using it. That is, if S ` t then S |= t. And we say that a deductive calculus is complete if it is strong enough to deduce all consequences, that is, if S |= t implies S ` t.

So soundness is “If we can deduce t from S then, whenever S is true, t is true.” and completeness is “If t is true whenever S is true then there will be a deduction of t from S.”.

In the remainder of this section we will give a proof of soundness (this is the easier part) and completeness for the calculus above.

2.1.1 Soundness

Suppose that S is a set of propositional terms and that t is a propositional term. We have to show that if S ` t is true then so is S |= t. Suppose then that S ` t and let v be a valuation with v(S) = T. We must show that v(t) = T.

In outline, the proof is this. The fact that S ` t means that there is a deduction of t from S. Any such deduction is given by a sequence of (logical and non-logical) axioms and applications of modus ponens. If we show that v assigns “T” to every axiom and that modus ponens preserves “T” then every consequence of a deduction will be “T”. More precisely, we argue as follows (“by induction on line number”).

If r is a logical axiom then (go back and check that all those axioms are actually tautologies!) r is a tautology, so certainly v(r) = T. If r is a non-logical axiom then r ∈ S so, by assumption on v, we have v(r) = T. Suppose now that we have an application of MP in the deduction of t. That application has the form (perhaps with intervening lines and with the first two lines occurring in the opposite order)

S ` r
S ` r → r′
S ` r′

for some propositional terms r, r′. We may assume inductively (inducting on the length of the deduction) that v(r) = T and that v(r → r′) = T. Then,


from the list of conditions for v to be a valuation, it follows that v(r′) = T, as required.

On the very last line of the deduction we have
S ` t
so our argument shows that v(t) = T, and we conclude that the calculus is sound.
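The two facts this argument rests on (that every instance of the axiom schemas is a tautology, and that modus ponens preserves truth under any valuation) are easy to confirm mechanically. Here is a quick sanity check of my own; it is enough to check the schemas on propositional variables, since the truth value of a compound instance depends only on the truth values of the substituted terms.

from itertools import product

imp = lambda a, b: (not a) or b

schemas = [
    lambda r, s, t: imp(s, imp(t, s)),                                   # (i)
    lambda r, s, t: imp(imp(r, imp(s, t)), imp(imp(r, s), imp(r, t))),   # (ii)
    lambda r, s, t: imp(not (not s), s),                                 # (iii)
    lambda r, s, t: imp(imp(not s, not t), imp(t, s)),                   # (iv)
]

# every schema is true under all eight valuations of r, s, t ...
assert all(f(r, s, t) for f in schemas for r, s, t in product([True, False], repeat=3))
# ... and modus ponens preserves truth: whenever s and s -> t are true, so is t
assert all(t for s, t in product([True, False], repeat=2) if s and imp(s, t))
print("axiom schemas are tautologies; MP preserves truth")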

2.1.2 Completeness

Our first step is to prove the Deduction Theorem, which allows us to move terms in and out of the set of non-logical axioms.

Theorem 2.1.1. (Deduction Theorem) Let S be a set of propositional terms and let s and t be propositional terms. Then S ` (s → t) iff S ∪ {s} ` t.

Proof. Both directions of the proof are really instructions on how to transform a deduction of one into a deduction of the other.

From a deduction showing that S ` (s → t) we may obtain a deduction of t from S ∪ {s} by first replacing each occurrence of S (to the left of “`”) by an occurrence of S ∪ {s} and noting that this is still a valid deduction, then adding two more lines at the end, namely

S ∪ {s} ` s        NLA
S ∪ {s} ` t        MP (line above and line before that).

Note that this does give a deduction of t from S ∪ {s}.

For the converse, suppose that there is a deduction of t from S ∪ {s}. This deduction is a sequence of lines

S ∪ {s} ` ti  for i = 1, . . . , n, where tn = t.

We will replace each of these lines by some new lines.

If ti is a logical axiom or member of S then we replace the i-th line by

S ` ti                         LA or NLA
S ` ti → (s → ti)              LA(i)
S ` s → ti                     MP

If ti is s then we replace the i-th line by lines constituting a deduction of s → s from S (the proof of 2.1.2 below but with “S” to the left of each “`”).

If the i-th line is obtained by an application of modus ponens then there are line numbers j, k < i such that tk is tj → ti. In our transformed deduction there will be corresponding (also earlier) lines reading

S ` s → tj    and
S ` s → (tj → ti)

so we replace the old i-th line by the lines

S ` (s → (tj → ti)) → ((s → tj) → (s → ti))     Ax(ii)
S ` (s → tj) → (s → ti)                          MP (line above and one of the earlier ones)
S ` s → ti                                       MP (line above and one of the earlier ones).

What we end up with is a (valid - you should check that you see this) deduction with last line

S ` s → tn,

as required (recall that tn is t). (It’s worthwhile applying the process described to an example just to clarify how this works.) □
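Taking up that suggestion, here is a sketch of my own (with invented names, not part of the notes) of the converse direction as a program: it takes a deduction of t from S ∪ {s}, encoded as a list of (formula, justification) pairs as in the earlier checking sketch, and produces the transformed deduction of s → t from S, following exactly the three cases of the proof. A justification ('MP', i, j) means that line j is (line i) → (current line).

def imp(a, b):
    return ('->', a, b)

def self_imp_proof(s):
    """The five-line deduction of s -> s from Lemma 2.1.2 below, with local MP indices."""
    ss = imp(s, s)
    return [
        (imp(imp(s, imp(ss, s)), imp(imp(s, ss), ss)), 'LA'),   # Ax(ii)
        (imp(s, imp(ss, s)), 'LA'),                             # Ax(i)
        (imp(imp(s, ss), ss), ('MP', 1, 0)),
        (imp(s, ss), 'LA'),                                     # Ax(i)
        (ss, ('MP', 3, 2)),
    ]

def deduction_theorem(proof, s):
    """Transform a deduction of t from S ∪ {s} into one of s -> t from S."""
    new = []      # the transformed deduction
    loc = {}      # old line index  ->  index in `new` of the line  S |- s -> t_i
    for i, (ti, just) in enumerate(proof):
        if ti == s:                                   # case: t_i is s itself
            base = len(new)
            for f, j in self_imp_proof(s):
                if isinstance(j, tuple):
                    j = ('MP', j[1] + base, j[2] + base)
                new.append((f, j))
        elif just in ('LA', 'NLA'):                   # case: logical axiom or member of S
            new.append((ti, just))
            new.append((imp(ti, imp(s, ti)), 'LA'))                        # Ax(i)
            new.append((imp(s, ti), ('MP', len(new) - 2, len(new) - 1)))
        else:                                         # case: modus ponens from lines j, k
            _, j, k = just
            tj = proof[j][0]
            new.append((imp(imp(s, imp(tj, ti)), imp(imp(s, tj), imp(s, ti))), 'LA'))  # Ax(ii)
            new.append((imp(imp(s, tj), imp(s, ti)), ('MP', loc[k], len(new) - 1)))
            new.append((imp(s, ti), ('MP', loc[j], len(new) - 1)))
        loc[i] = len(new) - 1
    return new

# e.g. transform the three-line deduction showing {p -> q} ∪ {p} |- q into one showing {p -> q} |- p -> q
proof = [(('->', 'p', 'q'), 'NLA'), ('p', 'NLA'), ('q', ('MP', 1, 0))]
print(deduction_theorem(proof, 'p')[-1][0])   # ('->', 'p', 'q')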

Next, some lemmas, the first of which was used in the proof above.


Lemma 2.1.2. For every propositional term s there is a deduction (independent of s) with last line ` s → s and hence for every set S of propositional terms there is a deduction with last line S ` s → s.

Proof. Here’s the deduction.
1. ` (s → ((s → s) → s)) → ((s → (s → s)) → (s → s))    Ax(ii)
2. ` s → ((s → s) → s)                                   Ax(i)
3. ` (s → (s → s)) → (s → s)                             MP(1,2)
4. ` s → (s → s)                                         Ax(i)
5. ` s → s                                               MP(3,4)
To obtain the second statement just put “S” to the left of each “`” and note that the deduction is still valid. □

We’ll abbreviate the statement of the following lemmas as in the statement of the Deduction Theorem. Throughout, s and t are any propositional terms.

Lemma 2.1.3. ` s → (¬s → t)

Proof. The first part of the proof is just to write down a deduction which takes us close to the end. Then there are two applications of the Deduction Theorem. We’ve actually incorporated those uses, labelled DT, into the deduction itself, as a derived rule of deduction. An alternative would be to stop the deduction at the line “7. {s, ¬s} ` t   MP(5,6)” and then say “Therefore {s, ¬s} ` t. By the Deduction Theorem it follows that {s} ` ¬s → t and then, by the Deduction Theorem again, ` s → (¬s → t) follows.”

1. {s, ¬s} ` ¬s → (¬t → ¬s)              Ax(i)
2. {s, ¬s} ` ¬s                          NLA
3. {s, ¬s} ` ¬t → ¬s                     MP(1,2)
4. {s, ¬s} ` (¬t → ¬s) → (s → t)         Ax(iv)
5. {s, ¬s} ` s → t                       MP(3,4)
6. {s, ¬s} ` s                           NLA
7. {s, ¬s} ` t                           MP(5,6)
8. {s} ` ¬s → t                          DT
9. ` s → (¬s → t)                        DT  □

In the next proof we use more derived rules of deduction.

Lemma 2.1.4. ` (s → ¬s) → ¬s

Proof.
1. {s → ¬s} ` ¬¬s → s                                    Ax(iii)
2. {s → ¬s, ¬¬s} ` s                                     DT
3. {s → ¬s, ¬¬s} ` s → ¬s                                NLA
4. {s → ¬s, ¬¬s} ` ¬s                                    MP(2,3)
5. {s → ¬s, ¬¬s} ` s → (¬s → ¬(s → s))                   Lemma 2.1.3
6. {s → ¬s, ¬¬s} ` ¬s → ¬(s → s)                         MP(2,5)
7. {s → ¬s, ¬¬s} ` ¬(s → s)                              MP(4,6)
8. {s → ¬s} ` ¬¬s → ¬(s → s)                             DT
9. {s → ¬s} ` (¬¬s → ¬(s → s)) → ((s → s) → ¬s)          Ax(iv)
10. {s → ¬s} ` (s → s) → ¬s                              MP(8,9)
11. {s → ¬s} ` s → s                                     Lemma 2.1.2
12. {s → ¬s} ` ¬s                                        MP(10,11)


13. ` (s → ¬s) → ¬s                                      DT  □

Lemma 2.1.5. ` s → ¬¬s

Proof.
` ¬¬¬s → ¬s                          Ax(iii)
` (¬¬¬s → ¬s) → (s → ¬¬s)            Ax(iv)
` s → ¬¬s                            MP  □

Lemma 2.1.6. ` ¬s → (s → t)

Proof. Exercise! □

Lemma 2.1.7. ` s → (¬t → ¬(s → t))

Proof. Exercise! □

Now, define a set S of (propositional) terms to be consistent if there is some term t such that there is no deduction of t from S. Accordingly, say that a set S is inconsistent if for every term t one has S ` t. You might reasonably have expected the definition of S being consistent to be that no contradiction can be deduced from S. But the definition just given is marginally more useful and is equivalent to the definition just suggested (this follows once we have proved 2.1.12 but is already illustrated by the next lemma).

Lemma 2.1.8. The set S of terms is inconsistent iff for some term s we have S ` ¬(s → s).

Proof. The direction “⇒” is immediate from the definition.
For the other direction, we suppose that there is some term s such that S ` ¬(s → s). It must be shown that for every term t we have S ` t. Here is the proof.

1. S ` s → s                                       Lemma 2.1.2
2. S ` (s → s) → ¬¬(s → s)                         Lemma 2.1.5
3. S ` ¬¬(s → s)                                   MP(1,2)
4. S ` ¬¬(s → s) → (¬t → ¬¬(s → s))                Ax(i)
5. S ` ¬t → ¬¬(s → s)                              MP(3,4)
6. S ` (¬t → ¬¬(s → s)) → (¬(s → s) → t)           Ax(iv)
7. S ` ¬(s → s) → t                                MP(5,6)
8. S ` ¬(s → s)                                    by assumption
9. S ` t                                           MP(7,8)  □

Lemma 2.1.9. Let S be a set of terms and let s be a term. Then S ∪ {s} is inconsistent iff S ` ¬s.

Proof. Suppose first that S ∪ {s} is inconsistent. Then, by definition, S ∪ {s} ` ¬s. So, by the Deduction Theorem, we have S ` s → ¬s. Since also ` (s → ¬s) → ¬s (2.1.4) and hence S ` (s → ¬s) → ¬s, we can apply modus ponens to obtain S ` ¬s.

For the converse, suppose that S ` ¬s and let t be any term. It must be shown that S ∪ {s} ` t. We have S ∪ {s} ` s and also, by 2.1.3, S ∪ {s} ` s → (¬s → t).


So, by modus ponens, S ∪ {s} ` ¬s → t follows. Since S ` ¬s, also S ∪ {s} ` ¬s, so another application of modus ponens gives S ∪ {s} ` t. This shows that S ∪ {s} is inconsistent, as required. □

Lemma 2.1.10. Suppose that S is a set of terms and that s is a term. If both S ` s and S ` ¬s then S is inconsistent.

Proof. For every term t we have S ` s → (¬s → t) (by 2.1.3). Since also S ` s and S ` ¬s, two applications of modus ponens give us S ` t (for every t), so S is inconsistent. □

The next lemma is an expression of the finite character of the notion of deduction.

Lemma 2.1.11. Suppose that S is a set of terms and that s is a term such that S ` s. Then there is a finite subset, S′, of S such that S′ ` s.

Proof. Any derivation (of s from S) has only a finite number of lines and hence uses only a finite number of non-logical axioms. Let S′ be the, finite, set of all those actually used. Replace S by S′ throughout the deduction to obtain a valid deduction, showing that S′ ` s. □

In the proof of the next theorem we make use of the observation that all the propositional connectives may be defined using just ¬ and → (that is, together, these two are adequate in the sense of Section 1.5) and so, in order to check that a function v from the set of propositional terms to {T, F} is a valuation, it is enough to check the defining clauses for ¬ and → only.

Theorem 2.1.12. (Completeness Theorem for Propositional Logic, version 1) Suppose that S is a consistent set of propositional terms. Then there is a valuation v such that v(S) = T.

Proof. Let Γ = {T : T is a consistent set of propositional terms and T ⊇ S} be the set of all sets of terms which contain S and are still consistent. We begin by showing, using Zorn’s lemma2 (see 2.1.16 below for this), that

Γ has a maximal element.

Let ∆ be a subset of Γ which is totally ordered by inclusion. Let T = ⋃∆ be the union of all the sets in ∆. It has to be shown that T ∈ Γ and the only possibly non-obvious point is that T is consistent. If it were not then, choosing any term s, there would be a deduction T ` ¬(s → s). By 2.1.11 there would be a finite subset T′ of T with T′ ` ¬(s → s). Since ∆ is totally ordered and since T′ is finite there would be some T0 ∈ ∆ such that T0 ⊇ T′. But then we would have T0 ` ¬(s → s). By 2.1.8 it would follow that T0 is inconsistent, contradicting the fact that T0 ∈ ∆ ⊆ Γ.

This shows that every totally ordered subset of Γ has an upper bound in Γ and so Zorn’s Lemma gives the existence of a maximal element, T say, of Γ. That is, T is a maximal consistent set of terms containing S.

2 Chances are you haven’t seen this before. It is needed in the general case but if we assume that L is countable then there’s a simpler proof of existence of a maximal element, and that’s the one I’ll give in the lectures.


What we will do, and this is a key step in the proof, is define the valuation v by v(r) = T if r ∈ T and v(r) = F if r ∉ T, but various things have to be proved in order to show that this really does give a valuation.

First, we show that T is “deductively closed” in the sense that

(*1) if T ` r then r ∈ T.

Suppose, for a contradiction, that we had T ` r but r ∉ T. Then, by maximality of T, the set T ∪ {r} would have to be inconsistent and hence, by 2.1.9, T ` ¬r. By 2.1.3, T ` r → (¬r → t) for any term t, so two applications of modus ponens give T ` t. Since t was arbitrary, that shows inconsistency of T - contradiction. Therefore (*1) is proved.

Next we show that T is “complete” in the sense that

(*2) for every term t either t ∈ T or ¬t ∈ T.

For, suppose that t ∉ T. Then, by maximality of T, the set T ∪ {t} is inconsistent so, by 2.1.9, T ` ¬t. Therefore, by (*1), ¬t ∈ T.

Next we show that

(*3) s → t ∈ T iff ¬s ∈ T or t ∈ T.

For the direction “⇐” suppose first that ¬s ∈ T. Then, by 2.1.6 and (*1), s → t ∈ T. On the other hand if t ∈ T then s → t ∈ T by Axiom (i) and (*1). For the converse, “⇒”, if we have neither ¬s nor t in T then, by (*2), both s and ¬t are in T. Then, by 2.1.7 and (*1), we have ¬(s → t) ∈ T and so, by consistency of T, s → t ∉ T, as required.

Now define the (purported) valuation v by v(t) = T iff t ∈ T. Since S ⊆ T certainly v |= S, so it remains to check that v really is a valuation. First, if v(t) = T then t ∈ T so (consistency of T) ¬t ∉ T so v(¬t) = F. Conversely, if v(t) = F then t ∉ T so ((*2)) ¬t ∈ T so v(¬t) = T. That deals with the ¬ clause in the definition of valuation. The → clause is direct from (*3) which, in terms of v, becomes: v(s → t) = T iff v(¬s) = T or v(t) = T, that is (by what we just showed), iff v(s) = F or v(t) = T, as required. □
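For a countable language the maximal consistent set can be built step by step, as mentioned in footnote 2: enumerate the terms t1, t2, . . . and add each one whose addition keeps the set consistent. The sketch below (entirely my own, with invented names) illustrates this on a toy scale. A program cannot test consistency directly, so satisfiability over the finitely many relevant valuations is used as a stand-in (by soundness and completeness this comes to the same thing), and the enumeration is cut off at terms of small depth in the variables p and q.

from itertools import product

def holds(f, v):
    """Evaluate a term (a variable name, ('not', a) or ('->', a, b)) under the valuation v."""
    if isinstance(f, str):
        return v[f]
    if f[0] == 'not':
        return not holds(f[1], v)
    return (not holds(f[1], v)) or holds(f[2], v)

VARS = ('p', 'q')

def satisfiable(terms):
    return any(all(holds(f, dict(zip(VARS, vals))) for f in terms)
               for vals in product([True, False], repeat=len(VARS)))

def consequence(T, t):
    return all(holds(t, dict(zip(VARS, vals)))
               for vals in product([True, False], repeat=len(VARS))
               if all(holds(f, dict(zip(VARS, vals))) for f in T))

# a (finite) enumeration: all terms of depth at most 2 over p, q
terms = ['p', 'q']
for _ in range(2):
    terms = terms + [('not', a) for a in terms] + [('->', a, b) for a in terms for b in terms]

S = [('->', 'p', 'q')]
T = list(S)
for t in terms:
    if satisfiable(T + [t]):     # "T ∪ {t} is consistent": keep t
        T.append(t)

# T extends S and is "maximal" along the enumeration: every enumerated term is settled by T
print(all(consequence(T, t) or consequence(T, ('not', t)) for t in terms))   # True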

Theorem 2.1.13. (Completeness Theorem for Propositional Logic, version 2) Let S be a set of propositional terms and let t be a propositional term. Then S ` t iff S |= t.

Proof. The direction “⇒” is the Soundness Theorem. For the converse, suppose that there is no deduction of t from S. Then there is no deduction of ¬¬t from S either (for, by Axiom (iii) and modus ponens, a deduction of ¬¬t would give one of t). It then follows from 2.1.9 that S ∪ {¬t} is consistent so, by the first version of the Completeness Theorem, there is a valuation v such that v(S) = T and v(¬t) = T, so certainly we cannot have v(t) = T. Therefore S |= t fails, as required. □

Theorem 2.1.14. (Compactness Theorem for Propositional Logic, version 1) Let S be a set of propositional terms. There is a valuation v such that v(S) = T iff for every finite subset S′ of S there is a valuation v′ with v′(S′) = T.


Proof. One direction is immediate: if v(S) = T then certainly v(S′) = T for any (finite) subset S′ of S. For the converse suppose, for a contradiction, that there is no v with v(S) = T. Then, by the Completeness Theorem (version 1), S is inconsistent. Choose any term s. Then, by definition of inconsistent, S ` ¬(s → s). So, by 2.1.11, there is a finite subset, S′, of S with S′ ` ¬(s → s). By 2.1.8, S′ is inconsistent. So by Soundness there is no valuation v′ with v′(S′) = T, as required. □

Theorem 2.1.15. (Compactness Theorem for Propositional Logic, version 2) Let S be a set of propositional terms and let t be a propositional term. Then S |= t iff there is some finite subset S′ of S such that S′ |= t.

Proof. Exercise. □

Theorem 2.1.16. (Zorn’s Lemma)3 Suppose that (P, ≤) is a partially ordered set such that every chain has an upper bound, that is, if {ai}i∈I ⊆ P is totally ordered (for all i, j either ai ≤ aj or aj ≤ ai) then there is some a ∈ P with a ≥ ai for all i ∈ I. Then there is at least one maximal element in P (i.e. an element with nothing in P strictly above it).

This is a consequence of, in fact is equivalent to, the Axiom of Choice from set theory.

2.2 A natural deduction system for propositional logic

The calculus that we describe in this section has no logical axioms as such but it has many rules of deduction and it allows much more “natural” proofs. We define, by induction, a relation S ∣∣ t where S is any set of propositional terms and t is any propositional term. It will turn out to be equivalent to the relation S ` t because one can prove the Completeness Theorem also for this calculus.

A sequent is a line of the form S ∣∣ t where S is a (finite) set of propositional terms and t is a propositional term. We write s1, . . . , sn ∣∣ t instead of {s1, . . . , sn} ∣∣ t and we can write ∣∣ t if S is empty. Certain sequents are called theorems and they are defined inductively by the following rules.

(Ax) Every sequent of the form S, t ∣∣ t is a theorem (these sequents play the role of non-logical axioms in the Hilbert-style calculus).
(→I) If S, s ∣∣ t is a theorem then so is S ∣∣ s → t.
(→E) If S ∣∣ s → t and S ∣∣ s are theorems then so is S ∣∣ t.
(¬I) If S, s ∣∣ t and S, s ∣∣ ¬t are theorems then so is S ∣∣ ¬s.
(¬¬) If S ∣∣ ¬¬t is a theorem then so is S ∣∣ t.

The theorems/rules of deduction in this calculus are usually written using a less linear notation, as follows.

(Ax)   S, t ∣∣ t

        S, s ∣∣ t
(→I)  -------------
        S ∣∣ s → t

3 Included only for completeness of exposition.


        S ∣∣ s → t    S ∣∣ s
(→E)  ------------------------
              S ∣∣ t

        S, s ∣∣ t    S, s ∣∣ ¬t
(¬I)  ---------------------------
              S ∣∣ ¬s

        S ∣∣ ¬¬t
(¬¬)  -----------
        S ∣∣ t

This is a minimal list, corresponding to writing every propositional term up to equivalence using only {→, ¬} (which, recall, is an adequate set of propositional connectives). Of course, there are also rules involving ∨ and ∧, as follows.

S ∣∣ s    T ∣∣ t              S, s, t ∣∣ u
------------------          ----------------
 S ∪ T ∣∣ s ∧ t               S, s ∧ t ∣∣ u

S ∣∣ s ∧ t          S ∣∣ s ∧ t
-----------         -----------
  S ∣∣ s               S ∣∣ t

  S ∣∣ s               S ∣∣ s
------------        ------------
S ∣∣ s ∨ t           S ∣∣ t ∨ s

S, s ∣∣ u    T, t ∣∣ u          S ` t    S1 ⊇ S
-----------------------        -----------------
  S ∪ T, s ∨ t ∣∣ u                  S1 ` t

As before one may introduce derived rules, for example, Proof by Contradiction which says:

If S, ¬s ∣∣ t and S, ¬s ∣∣ ¬t are theorems then so is S ∣∣ s.

This may be expressed by

S, ¬s ∣∣ t    S, ¬s ∣∣ ¬t
--------------------------
         S ∣∣ s

This rule can be derived from those above as follows: if S, ¬s ∣∣ t and S, ¬s ∣∣ ¬t are theorems then so is S ∣∣ ¬¬s (by (¬I)) and hence so is S ∣∣ s (by (¬¬)).

Here’s the same argument written using the 2-dimensional notation.

S, ¬s ∣∣ t    S, ¬s ∣∣ ¬t
--------------------------  by (¬I)
        S ∣∣ ¬¬s
--------------------------  by (¬¬)
         S ∣∣ s

As in the earlier-described calculus, a sequence of theorems is called a (valid) deduction. You should, of course, check that you agree that the above all are “valid rules of deduction”.
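One way to carry out such a check by machine is to implement the calculus as a tiny kernel in which a theorem is a value that can only be produced by the rule functions; the derived rule is then literally a composition of the primitive ones. The sketch below is my own illustration (with invented names, covering only the {→, ¬} rules above); terms are nested tuples such as ('->', 'p', 'q') and a theorem is a pair (set of premises, conclusion).

def ax(S, t):                        # (Ax)   S, t |- t
    return (frozenset(S) | {t}, t)

def imp_i(thm, s):                   # (->I)  from S, s |- t conclude S |- s -> t
    S, t = thm
    return (S - {s}, ('->', s, t))

def imp_e(thm1, thm2):               # (->E)  from S |- s -> t and S |- s conclude S |- t
    (S1, f), (S2, s) = thm1, thm2
    assert S1 == S2 and f[0] == '->' and f[1] == s
    return (S1, f[2])

def neg_i(thm1, thm2, s):            # (¬I)   from S, s |- t and S, s |- ¬t conclude S |- ¬s
    (S1, t1), (S2, t2) = thm1, thm2
    assert S1 == S2 and s in S1 and t2 == ('not', t1)
    return (S1 - {s}, ('not', s))

def dne(thm):                        # (¬¬)   from S |- ¬¬t conclude S |- t
    S, f = thm
    assert f[0] == 'not' and f[1][0] == 'not'
    return (S, f[1][1])

def pbc(thm1, thm2, s):              # derived Proof by Contradiction, composed exactly as above
    return dne(neg_i(thm1, thm2, ('not', s)))

# e.g. a deduction of the theorem |- ¬¬p -> p:
thm = dne(ax([], ('not', ('not', 'p'))))      # ¬¬p |- p
print(imp_i(thm, ('not', ('not', 'p'))))      # (frozenset(), ('->', ('not', ('not', 'p')), 'p'))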

If S is a (possibly infinite) set of propositional terms and t is any propositional term then we will write S ` t if there is a proof of t from S in this calculus, more formally, if there is a finite subset S′ of S such that S′ ∣∣ t is a theorem. You can read “S ` t” as “there is a deduction of t from S”. Of course, we already have such a notation and terminology from the previous section but ignore that earlier deductive system for the moment.

Some further, easily derived, properties of the relation ` are:


          S, ¬s ` t    S, ¬s ` ¬t
(PbC)   ----------------------------
                  S ` s

          S ` t
(Fin)   ---------   for some finite subset S′ ⊆ S
          S′ ` t

          S, φ ` t    S ` φ
(Cut)   ---------------------
               S ` t

One can prove soundness and completeness for this calculus. Recall what the issues are.

• Is the calculus sound? That is, does the calculus generate only tautologies; more generally, if S ∣∣ t then is it true that S |= t?

• Is the calculus complete? That is, does the calculus generate all tautologies; more generally, does S |= t imply that there is a proof in this calculus of S ∣∣ t?

The answer to each question is “yes”. The proof of soundness involves checking that each rule of deduction preserves tautologies (compare the analogous point in the Hilbert-style calculus). The proof of completeness is entirely analogous to that for the Hilbert-style calculus (though notice that the Deduction Theorem is already built into this natural deduction calculus). In particular, one makes the same definition for a set of terms to be (in)consistent and the heart of the proof is: given a consistent set S of terms, build a valuation which gives all elements of S the value T.


Part II

Predicate Logic


Chapter 3

A brief introduction to predicate logic: languages and structures

3.1 Predicate languages

As we said in the introduction, propositional logic is about combining already-formed statements into more complex ones, whereas predicate logic allows us to formulate mathematical (and other) statements. Predicate logic is founded on the standard view in pure mathematics that the main objects of study are sets-with-structure. The statements that we can form in predicate logic will be statements about sets-with-structure. So first we need to be able to talk, in this logic, about elements of sets; that is reflected in predicate languages having variables, x, y, ..., which range over the elements of a given set. We also have the universal quantifier ∀ (“for all”) and the existential quantifier ∃ (“there is”) which prefix variables - so a formula in this language can begin ∀x∃y . . . (“for all x there is a y such that ...”). Of course, at “...” we want to be able to insert something about x and y and that’s where the “structure” in “sets-with-structure” comes in. This will take a bit of explaining because the predicate language that we set up depends on the exact type of “structure” that we want to deal with.

In brief, using a pick-and-mix approach, we set up a predicate language by choosing a certain collection of symbols which can stand for constants (specific and fixed elements of structures), for functions and for relations. Here are some examples.

Example 3.1.1. One piece of structure that is always there in a set-with-structure is equality - so we will (in this course) always have the relation = which expresses equality between elements of a set. This is a binary (= 2-ary) relation, meaning that it relates pairs of elements.

Example 3.1.2. Part of the structure on a set might be an ordering - for example the integers or reals with the usual ordering. If so then we would also include a binary relation symbol, different from equality, say ≤, in our language.

Example 3.1.3. Continuing with the examples Z, R, we might want to express


the arithmetic operations of addition, multiplication and taking-the-negative in our language: so we would add two binary function symbols (i.e. function symbols taking two arguments), + and ×, and also a unary (= 1-ary) function symbol − (that’s meant to be used for the function a ↦ −a, not the binary function subtraction). We might also add symbols 0 and 1 as constants.

Example 3.1.4. Functions with more than two arguments are pretty common; for instance we might want to have some polynomial functions (or, if we were dealing with C, perhaps some analytic functions) built into our language, say a 3-ary function symbol F with which we could express the function given by F(x, y, z) = x² + y² + xz + 1.

Example 3.1.5. Relation symbols with more than two arguments are not so common but here’s an example. Take the real line and define the relation B(x, y, z) to mean “y lies (strictly) between x and z”.

To recap: in the case of propositional logic there was essentially just one language (at least once we had chosen a set of propositional variables): in the case of predicate logic there are many, in the sense that when defining any such language one has to make a choice from certain possible ingredients. There is, however, a basic language which contains none of these extra ingredients and we’ll introduce that first. Actually even for the basic language there is a choice: whether or not to include a symbol for equality. The choice between inclusion or exclusion of equality rather depends on the types of application one has in mind but for talking about sets-with-structure it’s certainly natural to include a symbol “=” for equality.

3.2 The basic language

The basic (first-order, finitary, with equality) language L0 has the following:
(i) all the propositional connectives ∧, ∨, ¬, →, ↔
(ii) countably many variables x, y, u, v, v0, v1, ...
(iii) the existential quantifier ∃
(iv) the universal quantifier ∀
(v) a symbol for equality =

Then we go on to define “terms” and “formulas”. Both of these, in different ways, generalise the notion of “propositional term”, so remember that the word “term” in predicate logic has a different meaning from that in propositional logic.

Formulas and free variables

A term of L0 is nothing other than a variable (you’ll see what “term” really means when we discuss languages with constant or function symbols). The free variable of such a term x (say) is just the variable, x, itself: fv(x) = {x}.

An atomic formula of L0 is an expression of the form s = t where s and t are terms. The set of free variables of the atomic formula s = t is given by fv(s = t) = fv(s) ∪ fv(t).

The following clauses define what it means to be a formula of L0 (and, alongside, we define what are the free variables of any formula):

(0) every atomic formula is a formula;
(i) if φ is a formula then so is ¬φ, fv(¬φ) = fv(φ);


(ii) if φ and ψ are formulas then so are φ ∧ ψ, φ ∨ ψ, φ → ψ and φ ↔ ψ, and fv(φ ∧ ψ) = fv(φ ∨ ψ) = fv(φ → ψ) = fv(φ ↔ ψ) = fv(φ) ∪ fv(ψ);
(iii) if φ is a formula and x is any variable then ∃xφ and ∀xφ are formulas, and fv(∃xφ) = fv(∀xφ) = fv(φ) \ {x};
((iv) plus the usual “that’s it” clause)

A sentence is a formula σ with no free variables (i.e. fv(σ) = ∅).

Just as with propositional logic we do not need all the above, because we may define some symbols in terms of the others. For instance, ∧ and ¬, alternatively → and ¬, suffice for the propositional connectives. Also each of the quantifiers may be defined in terms of the other using negation: ∀xφ is logically equivalent to ¬∃x¬φ (and ∃x is equivalent to ¬∀x¬) so we may (and in inductive proofs surely would, just to reduce the number of cases in the induction step) drop reference to ∀ in the last clause of the definition.

We also remark that we follow natural usage in writing, for instance, x ≠ y rather than ¬(x = y).
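As a concrete illustration (my own encoding, not part of the notes), the inductive definition above translates directly into a short recursive program: variables are strings, atomic formulas are triples ('=', s, t), and the remaining clauses mirror (i)-(iii).

def fv(phi):
    """The set of free variables of a formula of the basic language L0."""
    op = phi[0]
    if op == '=':                                  # atomic formula s = t (both terms are variables)
        return {phi[1], phi[2]}
    if op == 'not':
        return fv(phi[1])
    if op in ('and', 'or', '->', '<->'):
        return fv(phi[1]) | fv(phi[2])
    if op in ('exists', 'forall'):                 # e.g. ('exists', 'x', subformula)
        return fv(phi[2]) - {phi[1]}
    raise ValueError('not a formula')

# ∃x (x = y) has just the free variable y; ∀x ∀y (x = y -> y = x) is a sentence
print(fv(('exists', 'x', ('=', 'x', 'y'))))                                             # {'y'}
print(fv(('forall', 'x', ('forall', 'y', ('->', ('=', 'x', 'y'), ('=', 'y', 'x'))))))   # set()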

If φ is a formula then it is so by virtue of the above definition, so it has a “construction tree” and we refer to any formula occurring in this tree as a subformula of φ. We also use this term to refer to a corresponding substring of φ. Remember that any formula is literally a string of symbols (usually we mean in the abstract rather than a particular physical realisation) and so we can also refer to an occurrence of a particular (abstract) symbol in a formula.

As well as defining the set of free variables of a formula we need to define the notion of free occurrence of a variable. To do that, if x is a variable then:

(i) every occurrence of x in any atomic formula is free;
(ii) the free occurrences of x in ¬φ are just the free occurrences of x in its subformula φ;
(iii) the free occurrences of x in φ ∧ ψ are just the free occurrences of x in φ together with the free occurrences of x in ψ;
(iv) there are no free occurrences of x in ∃xφ.

In a formula of the form Qxφ we refer to φ as the scope of the quantifier Q (∃ or ∀). Any occurrence of x in Qxφ which is a free occurrence of x in φ (the latter regarded as a subformula of Qxφ) is said to be bound by that initial occurrence of the quantifier Qx. So a quantifier Qx binds the free occurrences of x within its scope.

A comment on use of variables when you are constructing formulas. Note that bound variables are “dummy variables”: the formulas ∃x f(x) = y and ∃z f(z) = y are, intuitively, equivalent. A formula with nested occurrences of the same variable being bound can be confusing to read: ∃x(∀x(f(x) = x) → f(x) = x) could be written less confusingly as ∃x(∀y(f(y) = y) → f(x) = x). Of course these are not the same formula but one can prove that they are logically equivalent and the second is preferable.

Another informal notation that we will sometimes use is to “collapse repeated quantifiers”, for example to write ∀x, y (x = y → y = x) instead of ∀x∀y(x = y → y = x). Sometimes the abbreviations ∃!, ∃≤n, ∃=n are useful.


3.3 Enriching the language

The language L0 described above has little expressive power: there’s really not much that we can say using it; the following list just about exhausts the kinds of things that can be said.

∀x(x = x);
∀x∀y(x = y → y = x);
∀x∀y∀z(x = y ∧ y = z → x = z);
∃x∃y∃z(x ≠ y ∧ y ≠ z ∧ x ≠ z ∧ ∀w(w = x ∨ w = y ∨ w = z));
∃x(x ≠ x).

We are now going to give the formal definitions of the possible extra ingredients for a language but, since this is just a brief introduction to predicate logic, these definitions are included just so that you have precise definitions to refer to in case you have a question that is not answered by the perhaps less formal exposition that I will give in lectures. That exposition will focus on a limited class of examples and on actually making sense of the meanings of various formulas in specific examples. So what follows is just for reference.

As we discussed earlier, precisely what we should add to the language L0 depends on the type of structures whose properties we wish to capture within our formal language. We therefore suppose that we have, at our disposal, the following kinds of symbols with which we may enrich the language:

• n-ary function symbols such as f (= f(x1, . . . , xn));
(since an operation is simply a function regarded in a slightly different way, we don’t need to introduce operation symbols as well as function symbols, but we do use “operation notation” where appropriate, writing, for instance, x + y rather than +(x, y))

• n-ary relation symbols such as R (= R(x1, . . . , xn))
(1-ary relation symbols, such as P (= P(x)), are also termed (1-ary) predicate symbols);

• constant symbols such as c.

Formulas of an enriched language. Suppose that L is the language L0 enriched by as many function, relation and constant symbols as we require (the signature of L is a term used when referring to these extra symbols). Exactly what is in L will depend on our purpose: in particular, L need not have function and relation and constant symbols, although I will, for the sake of a uniform treatment, write as if all kinds are represented. If S is the set of “extra” symbols we have added then we will write L = L0 ∨ S. (It is notationally convenient to regard L as being, formally, the set of all formulas of L, so then writing, for example, φ ∈ L makes literal sense. Thus the “∨” should be understood as some sort of “join”, not union of sets.)

The terms of L, and their free variables, are defined inductively by:

(i) each variable x is a term, fv(x) = {x};
(ii) each constant symbol c is a term, fv(c) = ∅;
(iii) if f is an n-ary function symbol and if t1, . . . , tn are terms, then f(t1, . . . , tn) is a term, fv(f(t1, . . . , tn)) = fv(t1) ∪ · · · ∪ fv(tn).

The atomic formulas of L (and their free variables) are defined as follows:


(i) if s, t are terms then s = t is an atomic formula, fv(s = t) = fv(s) ∪ fv(t);
(ii) if R is an n-ary relation symbol and if t1, . . . , tn are terms, then R(t1, . . . , tn) is an atomic formula, fv(R(t1, . . . , tn)) = fv(t1) ∪ · · · ∪ fv(tn).

The formulas of L (and their free variables) are defined as follows:

(0) every atomic formula is a formula;
(i) if φ is a formula then so is ¬φ, fv(¬φ) = fv(φ);
(ii) if φ and ψ are formulas then so are φ ∧ ψ, φ ∨ ψ, φ → ψ and φ ↔ ψ, and fv(φ ∧ ψ) = fv(φ ∨ ψ) = fv(φ → ψ) = fv(φ ↔ ψ) = fv(φ) ∪ fv(ψ);
(iii) if φ is a formula and x is any variable then ∃xφ and ∀xφ are formulas, and fv(∃xφ) = fv(∀xφ) = fv(φ) \ {x}.

A sentence of L is a formula σ of L with no free variables (i.e. fv(σ) = ∅).

Since formulas were constructed by induction we prove things about them by induction (“on complexity”) and, just as in the case of propositional terms, the issue of unique readability raises its head. Such inductive proofs will be valid only provided we know that there is basically just one way to construct any given formula (for two routes would give two paths through the induction and hence, conceivably, different answers). Unique readability does hold for formulas, and also for terms. Both proofs are done by induction (on complexity) and are not difficult.

3.4 L-structures

Suppose that L is a language of the sort discussed above.

Formulas and sentences do not take on meaning until they are interpreted in a particular structure. Roughly, having fixed a language, a structure for that language provides: a set for the variables to range over (so, if M is the set then “∀x” will mean “for all x in M”); an element of that set for each constant symbol to name (so each constant symbol c of the language will name a particular, fixed element of M); for each function symbol of the language an actual function (of the correct arity) on that set; for each relation symbol of the language an actual relation (of the correct arity) on that set. Here’s the precise definition.

An L-structure M (or structure for the language L) is a non-empty set M, called the domain or underlying set of M (we write M = |M|), together with an interpretation in M of each of the function, relation and constant symbols of L. By an interpretation of one of these symbols we mean the following (and we also insist that the symbol “=” for equality be interpreted as actual equality between elements of M):

(i) if f is an n-ary function symbol, then the interpretation of f in M, which is denoted f^M, must be a function from M^n to M;

(ii) if R is an n-ary relation symbol, then the interpretation of R in M, which is denoted R^M, must be a subset of M^n (in particular, the interpretation of a 1-ary predicate symbol is a subset of M);

(iii) if c is a constant symbol, then the interpretation of c in M, which is denoted c^M, must be an element of M.

If no confusion should arise from doing so, the superscript “M” may be dropped (thus the same symbol “f” is used for the function symbol and for the particular interpretation of this symbol in a given L-structure).
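To fix ideas, here is a small sketch of an L-structure as a plain Python object. The language, the symbol names and the particular structure (the set {0, 1, 2, 3} with addition modulo 4, the usual order and the constant 0) are all my own choices for illustration.

# an L-structure for a language with one binary function symbol +, one constant symbol 0
# and one binary relation symbol <= (besides equality)
M = {
    'domain': {0, 1, 2, 3},
    'functions': {'+': lambda a, b: (a + b) % 4},                                  # a function from M² to M
    'relations': {'<=': {(a, b) for a in range(4) for b in range(4) if a <= b}},   # a subset of M²
    'constants': {'0': 0},                                                         # an element of M
}
print(M['functions']['+'](3, 2))   # 1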


3.5 Some basic examples

The basic language

An L0-structure is simply a set, so L0-structures have rather limited value as illustrations of definitions and results.

In lectures we will give a variety of examples, concentrating on languages L which contain just one extra binary relation symbol R.

Directed graphs. An L = L0 ∨ {R(−, −)}-structure M consists of a set M together with an interpretation of the binary relation symbol R as a particular subset, R^M, of M × M. That is, an L-structure consists of a set together with a specified binary relation on that set.

Given such a structure, its directed graph, or digraph for short, has for its vertices the elements of M and has an arrow going from vertex a to vertex b iff (a, b) ∈ R^M. This gives an often useful graphical way of picturing or even defining a relation R^M (note that the digraph of a relation specifies the relation completely).

Certain types of binary relation are of particular importance in that they occur frequently in mathematics (and elsewhere).

Posets. A partially ordered set (poset for short) consists of a set P and a binary relation on it, usually written ≤, which satisfies:

for all a ∈ P, a ≤ a (≤ is reflexive);
for all a, b, c ∈ P, a ≤ b and b ≤ c implies a ≤ c (≤ is transitive);
for all a, b ∈ P, if a ≤ b and b ≤ a then a = b (≤ is weakly antisymmetric).

The Hasse diagram of a poset is a diagrammatic means of representing a poset. It is obtained by connecting a point on the plane representing an element a of the poset to each of its immediate successors (if there are any) by a line which goes upwards from that point. We say that b is an immediate successor of a if a < b (i.e. a ≤ b and a ≠ b) and if a ≤ c ≤ b implies a = c or c = b; we also then say that a is an immediate predecessor of b.
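The immediate-successor relation is easy to compute for a finite poset, which is all one needs to draw its Hasse diagram. The sketch below uses an example of my own: the divisors of 12 ordered by divisibility.

P = {1, 2, 3, 4, 6, 12}
leq = {(a, b) for a in P for b in P if b % a == 0}        # a <= b  iff  a divides b

def immediate_successors(P, leq):
    """The pairs (a, b) with b an immediate successor of a: the edges of the Hasse diagram."""
    return {(a, b) for (a, b) in leq if a != b
            and not any((a, c) in leq and (c, b) in leq and c not in (a, b) for c in P)}

print(sorted(immediate_successors(P, leq)))
# [(1, 2), (1, 3), (2, 4), (2, 6), (3, 6), (4, 12), (6, 12)]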

Equivalence relations. An equivalence relation, ≡, on a set X is a binary relation which satisfies:

for all a ∈ X, a ≡ a (≡ is reflexive);
for all a, b ∈ X, a ≡ b implies b ≡ a (≡ is symmetric);
for all a, b, c ∈ X, a ≡ b and b ≡ c implies a ≡ c (≡ is transitive).

The (≡-)equivalence class of an element a ∈ X is denoted [a]≡, a/≡ or similar, and is {b ∈ X : b ≡ a}. The key point is that equivalence classes are equal or disjoint: if a, b ∈ X then either [a] = [b] or [a] ∩ [b] = ∅. Thus the distinct ≡-equivalence classes partition X into disjoint subsets.
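The resulting partition can be computed directly for a finite example; the one below (my own) is congruence modulo 3 on {0, ..., 8}.

X = set(range(9))
equiv = {(a, b) for a in X for b in X if (a - b) % 3 == 0}          # congruence mod 3

classes = {frozenset(b for b in X if (a, b) in equiv) for a in X}   # the set of equivalence classes
print(sorted(sorted(c) for c in classes))
# [[0, 3, 6], [1, 4, 7], [2, 5, 8]]  -- pairwise disjoint, and together they cover X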


3.6 Definable Sets

If φ is a formula of a predicate language L and φ has just the one free variable, x say (in which case we write φ(x) to show the free variable explicitly), then we can look at the “solution set” of φ in any particular L-structure M. This solution set is written as φ(M) and it’s a subset of the underlying set M of M, being the set of all elements a ∈ M such that, if each free occurrence of x in φ is replaced by a, then the result (a “formula with parameter a”) is true in M.

That is: φ(M) = {a ∈ M : φ(a) is true}, where φ(a) means the expression we get when we substitute each free occurrence of x by a.

I will give examples in lectures but it’s something that you’ll already have seen in less formal mathematical contexts, as is illustrated by the following examples.

Suppose that our structure is the real line R with its usual arithmetic (+, ×, 0, 1) and order (≤) structure (I’ll use the same notation, R, for the structure and for the underlying set). Take the formula φ, or φ(x), to be 0 ≤ x ≤ 1. Then the solution set φ(R) = {a ∈ R : 0 ≤ a ≤ 1} - the closed interval with endpoints 0 and 1.

Suppose, again with the reals R as the structure, that our formula, with free variable x, let’s call it ψ this time, is x × x = 1 + 1. Then the solution set ψ(R) = {a ∈ R : a × a = 1 + 1} = {a ∈ R : a² = 2} = {−√2, √2}.

For yet another example, again using the reals, take the formula, say θ, with free variable x (so we can write θ(x)) to be ∃y (y × y = x). Then the solution set θ(R) = {a ∈ R : ∃b ∈ R, b² = a} = R≥0 - the set of non-negative reals (since these are exactly the elements which are the square of some real number).

(The solution set for a formula with more than one free variable can be defined in a similar, and probably obvious, way, but we’ll concentrate on examples with one free variable.)
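Over an infinite structure such as R the solution set has to be worked out by hand, but over a finite structure it can be computed by exhaustive search. The sketch below (my own encoding and example, not from the notes) evaluates formulas of a language with one binary relation symbol R in a finite structure and returns φ(M); the structure chosen is {0, ..., 4} with R interpreted as the successor relation.

M = {'domain': set(range(5)),
     'R': {(a, a + 1) for a in range(4)}}          # the interpretation R^M

def holds(phi, M, env):
    """Truth of a formula at an assignment env of (some of) its variables.  Formulas are
    ('R', x, y), ('=', x, y), ('not', p), ('and', p, q) or ('exists', x, p)."""
    op = phi[0]
    if op == 'R':
        return (env[phi[1]], env[phi[2]]) in M['R']
    if op == '=':
        return env[phi[1]] == env[phi[2]]
    if op == 'not':
        return not holds(phi[1], M, env)
    if op == 'and':
        return holds(phi[1], M, env) and holds(phi[2], M, env)
    if op == 'exists':
        return any(holds(phi[2], M, dict(env, **{phi[1]: a})) for a in M['domain'])
    raise ValueError('not a formula')

def solution_set(phi, x, M):
    """phi(M) = { a in the domain : phi is true when x is interpreted as a }."""
    return {a for a in M['domain'] if holds(phi, M, {x: a})}

# θ(x) = ∃y R(y, x), "x has an R-predecessor"; only 0 fails it
theta = ('exists', 'y', ('R', 'y', 'x'))
print(solution_set(theta, 'x', M))                 # {1, 2, 3, 4}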
