CS 641 Lecture                                   -*- Outline -*-

* An algebraic presentation of the Boolean operators, following Cohen

This follows chapter 2 of Cohen's book "Programming in the 1990s",
Springer-Verlag, 1990.  The ideas are originally due to Dijkstra.
This presentation has the advantage that, being more algebraic, in
practice you reason by equivalence more often than by antisymmetry
proofs.

** The equivalence (<==>)

Cohen makes the following postulates for the equivalence.
------------------------------------------
POSTULATES FOR EQUIVALENCE

for all Boolean predicates P, Q, R

<==> is equality for Booleans:
     |- (P <==> Q) = (P = Q)
<==> is symmetric:
     |- (P <==> Q) <==> (Q <==> P)
<==> is associative:
     |- ((P <==> Q) <==> R) <==> (P <==> (Q <==> R))
<==> has unit true:
     |- (P <==> true) <==> P
------------------------------------------
The first postulate is also assumed (see p. 111) by Back and von
Wright.  It allows us to substitute equals for equals when we have an
equivalence.  It should really be formulated as a definition

 ^ (x <==> y) = (x = y)

where (<==>) : Bool -> Bool -> Bool.

From this definition, symmetry of equivalence follows trivially.

Lemma (<==> is symmetric): (P <==> Q) = (Q <==> P)
Proof:
    |- P <==> Q
    =    { definition of <==> }
       P = Q
    =    { = is symmetric }
       Q = P
    =    { definition of <==> }
       Q <==> P
QED

Corollary. |- (P <==> Q) <==> (Q <==> P)
Proof: Use the definition of <==> and the above lemma.  QED

Now we prove associativity for the equivalence, using two lemmas.

Lemma. ((P <==> Q) <==> R) ==> (P <==> (Q <==> R))
Proof: We assume (P <==> Q) <==> R in the calculation below.
    (P <==> Q) <==> R
    |- P
    <==> { ==> antisymmetric }
         * P
           ==> { discharge }
               * [ P ]
                 Q <==> R
                 <==> { <==> T rule, to get the shape of the main assumption }
                    ("T" <==> Q) <==> R
                 <==> { by local assumption, T <==> P }
                    (P <==> Q) <==> R
                 <==> { by assumption, ((P <==> Q) <==> R) <==> T }
                    T
               .
               Q <==> R
         * Q <==> R
           ==> { discharge }
               * [ Q <==> R ]
                 T
                 <==> { by assumption, T <==> (P <==> Q) <==> R }
                    (P <==> Q) <==> "R"
                 <==> { substitute equals for equals, we are assuming Q <==> R }
                    (P <==> Q) <==> Q
                 ==> { case analysis }
                     * [ Q ]
                       (P <==> "Q") <==> "Q"
                       <==> { by assumption, Q <==> T, twice }
                          (P <==> T) <==> T
                       <==> { <==> T rule, twice }
                          P
                     * [ !Q ]
                       (P <==> "Q") <==> "Q"
                       <==> { by assumption (and <==> F rule), Q <==> F, twice }
                          (P <==> F) <==> F
                       <==> { <==> F rule }
                          !("P <==> F")
                       <==> { <==> F rule }
                          !(!P)
                       <==> { double negation (ex. 6.4) }
                          P
                     .
                     P
               .
               P
         .
       (Q <==> R)
QED

Lemma. (P <==> (Q <==> R)) ==> ((P <==> Q) <==> R)
Proof: We assume P <==> (Q <==> R) in the calculation below.
    P <==> (Q <==> R)
    |- (P <==> Q)
    <==> { ==> antisymmetric }
         * R
           ==> { discharge }
               * [ R ]
                 P <==> "Q"
                 <==> { <==> T rule }
                    P <==> (Q <==> "T")
                 <==> { by local assumption, T <==> R }
                    P <==> (Q <==> R)
                 <==> { by assumption, (P <==> (Q <==> R)) <==> T }
                    T
               .
               P <==> Q
         * P <==> Q
           ==> { discharge }
               * [ P <==> Q ]
                 T
                 ==> { by assumption }
                    P <==> (Q <==> R)
                 <==> { substitute equals for equals, we are assuming P <==> Q }
                    Q <==> (Q <==> R)
                 ==> { case analysis }
                     * [ Q ]
                       "Q" <==> ("Q" <==> R)
                       <==> { by assumption, Q <==> T, twice }
                          T <==> (T <==> R)
                       <==> { <==> T rule, twice }
                          R
                     * [ !Q ]
                       "Q" <==> ("Q" <==> R)
                       <==> { by assumption, Q <==> F, twice }
                          F <==> (F <==> R)
                       <==> { <==> F rule }
                          !("F <==> R")
                       <==> { <==> F rule }
                          !(!R)
                       <==> { double negation (ex. 6.4) }
                          R
                     .
                     R
               .
               R
         .
       R
QED

Corollary (<==> is associative):
    ((P <==> Q) <==> R) <==> (P <==> (Q <==> R))
Proof: This follows by antisymmetry of ==> and the preceding two
lemmas.  QED

Some more basic properties of equivalence.

Lemma (<==> has unit T): (P <==> T) <==> P
Proof:
    |- ("P <==> T") <==> P
    <==> { <==> is symmetric }
       (T <==> P) <==> P
    <==> { <==> is symmetric }
       P <==> (T <==> P)
    <==> { <==> T rule }
       T
QED

Corollary (<==> is reflexive): (P <==> P) <==> T
Proof: This follows by use of symmetry.  QED

Now we have the lemma of mutual implication.
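Before proving it, note that since <==> is, by the first postulate,
just equality on Booleans, all of the postulates above can be checked
exhaustively over the two truth values.  Here is a small Python sketch
(the names iff and valid are my own illustrative choices, not Cohen's):

```python
from itertools import product

def iff(p, q):
    # <==> is equality for Booleans (the first postulate)
    return p == q

def valid(f):
    # A formula is a theorem iff it holds under every assignment
    # of truth values to its variables.
    n = f.__code__.co_argcount
    return all(f(*vs) for vs in product([False, True], repeat=n))

# <==> is symmetric
assert valid(lambda p, q: iff(iff(p, q), iff(q, p)))
# <==> is associative
assert valid(lambda p, q, r: iff(iff(iff(p, q), r), iff(p, iff(q, r))))
# <==> has unit true
assert valid(lambda p: iff(iff(p, True), p))
# corollary: <==> is reflexive
assert valid(lambda p: iff(iff(p, p), True))
```

The same valid helper can be pointed at any of the laws in these
notes; it does not replace the calculational proofs, but it is a quick
way to catch a mis-stated law.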
Lemma (mutual implication). (A <==> B) <==> (A ==> B) /\ (B ==> A)
Proof:
    |- (A ==> B) /\ (B ==> A)
    <==> { ==> antisymmetric }
         * (A ==> B) /\ (B ==> A)
           ==> { ==> antisymmetric }
               * (A ==> B) /\ (B ==> A)
                 ==> { /\ elimination }
                    (A ==> B)
               * (A ==> B) /\ (B ==> A)
                 ==> { /\ elimination }
                    (B ==> A)
               .
               A <==> B
         * A <==> B
           ==> { /\ introduction }
               * A <==> B
                 ==> { discharge }
                     * [ A <==> B ]
                       ("A" ==> B)
                       <==> { assumption }
                          (B ==> B)
                       <==> { ==> reflexive }
                          T
                     .
                     (A ==> B)
               * A <==> B
                 ==> { discharge }
                     * [ A <==> B ]
                       (B ==> "A")
                       <==> { assumption }
                          (B ==> B)
                       <==> { ==> reflexive }
                          T
                     .
                     (B ==> A)
               .
               (A ==> B) /\ (B ==> A)
         .
       A <==> B
QED

Note: From now on we omit all appeals to symmetry and associativity
for <==>.

** disjunction (\/)

Cohen makes the following postulates for disjunction.
------------------------------------------
POSTULATES FOR DISJUNCTION

disjunction is associative and symmetric

disjunction is idempotent:
     |- P \/ P <==> P
disjunction distributes over equivalence:
     |- P \/ (Q <==> R) <==> P \/ Q <==> P \/ R
------------------------------------------
We start by proving these as lemmas.  We use a lemma to capture the
common reasoning behind the two (symmetric) parts of the proof of
symmetry.

Lemma. |- P \/ Q ==> Q \/ P
Proof:
    |- P \/ Q
    ==> { case analysis }
        * [ P ]
          "P" \/ Q
          <==> { by assumption, P <==> T }
             T \/ Q
          <==> { T \/ rule }
             T
          <==> { by assumption, T <==> P }
             P
          ==> { \/ introduction (the second rule on p. 112) }
             Q \/ P
        * [ !P ]
          "P" \/ Q
          <==> { by assumption, P <==> F }
             F \/ Q
          <==> { F \/ rule }
             Q
          ==> { \/ introduction (the first rule on p. 112) }
             Q \/ P
        .
        Q \/ P
QED

Lemma (disjunction is symmetric). |- P \/ Q <==> Q \/ P
Proof.
    |- P \/ Q
    <==> { ==> antisymmetric }
         * P \/ Q
           ==> { lemma above }
              Q \/ P
         * Q \/ P
           ==> { lemma above, with P,Q := Q,P }
              P \/ Q
         .
       Q \/ P
QED

Lemma (disjunction is associative). |- (P \/ Q) \/ R <==> P \/ (Q \/ R)
Proof.
    |- (P \/ Q) \/ R
    <==> { ==> antisymmetric }
         * (P \/ Q) \/ R
           ==> { case analysis }
               * [ R ]
                 (P \/ Q) \/ "R"
                 <==> { by assumption, R <==> T }
                    (P \/ Q) \/ T
                 <==> { T \/ rule, symmetry }
                    T
                 <==> { by assumption, T <==> R }
                    R
                 ==> { \/ introduction }
                    Q \/ R
                 ==> { \/ introduction }
                    P \/ (Q \/ R)
               * [ !R ]
                 (P \/ Q) \/ "R"
                 <==> { by assumption, R <==> F }
                    (P \/ Q) \/ F
                 <==> { F \/ rule, symmetry }
                    P \/ "Q"
                 <==> { F \/ rule }
                    P \/ (Q \/ "F")
                 <==> { by assumption, F <==> R }
                    P \/ (Q \/ R)
               .
               P \/ (Q \/ R)
         * P \/ (Q \/ R)
           ==> { case analysis }
               * [ P ]
                 "P" \/ (Q \/ R)
                 <==> { by assumption, P <==> T }
                    T \/ (Q \/ R)
                 <==> { T \/ rule, symmetry }
                    T
                 <==> { by assumption, T <==> P }
                    P
                 ==> { \/ introduction }
                    P \/ Q
                 ==> { \/ introduction }
                    (P \/ Q) \/ R
               * [ !P ]
                 "P" \/ (Q \/ R)
                 <==> { by assumption, P <==> F }
                    "F \/ (Q \/ R)"
                 <==> { F \/ rule }
                    "Q \/ R"
                 <==> { F \/ rule }
                    ("F" \/ Q) \/ R
                 <==> { by assumption, F <==> P }
                    (P \/ Q) \/ R
               .
               (P \/ Q) \/ R
         .
       P \/ (Q \/ R)
QED

Lemma (disjunction is idempotent): |- P \/ P <==> P
Proof.
    |- P \/ P
    <==> { ==> antisymmetric }
         * P \/ P
           ==> { \/ elimination }
               * P
                 ==> { ==> reflexive }
                    P
               * P
                 ==> { ==> reflexive }
                    P
               .
               P
         * P
           ==> { \/ introduction }
              P \/ P
         .
       P
QED

Lemma (disjunction distributes over equivalence):
    |- P \/ (Q <==> R) <==> P \/ Q <==> P \/ R
Proof.
    |- P \/ (Q <==> R)
    <==> { case analysis }
        * [ P ]
          "P" \/ (Q <==> R)
          <==> { by assumption, P <==> T }
             T \/ (Q <==> R)
          <==> { T \/ rule }
             T
          <==> { <==> is reflexive }
             "T" <==> "T"
          <==> { T \/ rule, twice }
             "T" \/ Q <==> "T" \/ R
          <==> { by assumption, T <==> P, twice }
             P \/ Q <==> P \/ R
        * [ !P ]
          "P" \/ (Q <==> R)
          <==> { by assumption, P <==> F }
             F \/ (Q <==> R)
          <==> { F \/ rule }
             "Q" <==> "R"
          <==> { F \/ rule, twice }
             "F" \/ Q <==> "F" \/ R
          <==> { by assumption, F <==> P, twice }
             P \/ Q <==> P \/ R
        .
        P \/ Q <==> P \/ R
QED

Here's an interesting lemma not in the Back and von Wright book, and a
proof using these axioms.  (From Cohen, p. 29.)

Lemma (disjunction distributes over itself):
    |- X \/ (Y \/ Z) <==> (X \/ Y) \/ (X \/ Z)
Proof: We start with the more complex side.
    |- (X \/ Y) \/ (X \/ Z)
    <==> { associativity of \/ }
       X \/ Y \/ X \/ Z
    <==> { symmetry of \/ }
       "X \/ X" \/ Y \/ Z
    <==> { disjunction is idempotent }
       X \/ Y \/ Z
    <==> { associativity of \/ }
       X \/ (Y \/ Z)
QED

** conjunction (/\)

Cohen makes the following postulate for conjunction.
------------------------------------------
POSTULATE FOR CONJUNCTION

The "Golden Rule":
     |- P /\ Q <==> P <==> Q <==> P \/ Q
------------------------------------------
This makes sense, because <==> is associative.  For example, you can
parse it as a definition of conjunction:

 ^ P /\ Q = ((P <==> Q) <==> P \/ Q).

Or think about what this means in terms of lattices.

You may recall that we started trying to prove this in class, and
wound up trying to prove some lemmas about it.  I now think the most
convincing proof of this involves a case analysis...

Proof: We calculate as follows.
    |- P <==> Q <==> P \/ Q
    <==> { ==> antisymmetric }
         * P <==> Q <==> P \/ Q
           ==> { case analysis }
               * [ P ]
                 "P" <==> Q <==> "P" \/ Q
                 <==> { by assumption, P <==> T, twice }
                    T <==> Q <==> T \/ Q
                 <==> { T <==> rule }
                    Q <==> "T \/ Q"
                 <==> { T \/ rule }
                    Q <==> "T"
                 <==> { T <==> rule }
                    Q
                 ==> { /\ introduction, with the local assumption P }
                    P /\ Q
               * [ !P ]
                 P <==> Q <==> "P" \/ Q
                 <==> { by assumption, P <==> F }
                    P <==> Q <==> "F \/ Q"
                 <==> { F \/ rule }
                    P <==> "Q <==> Q"
                 <==> { <==> is reflexive }
                    P <==> T
                 <==> { T <==> rule }
                    P
                 ==> { by assumption P <==> F, and the F ==> rule }
                    P /\ Q
               .
               P /\ Q
         * P /\ Q
           ==> { case analysis }
               * [ P ]
                 "P" /\ Q
                 <==> { by assumption, P <==> T }
                    T /\ Q
                 <==> { T /\ rule }
                    Q
                 <==> { T <==> rule }
                    "T" <==> Q
                 <==> { <==> is reflexive }
                    P <==> P <==> Q
                 <==> { symmetry }
                    P <==> Q <==> "P"
                 ==> { \/ introduction, focusing lemma }
                    P <==> Q <==> P \/ Q
               * [ !P ]
                 "P" /\ Q
                 <==> { by assumption, P <==> F }
                    F /\ Q
                 <==> { F /\ rule }
                    F
                 <==> { by assumption, F <==> P }
                    P
                 <==> { T <==> rule (and symmetry), to get an equivalence }
                    P <==> "T"
                 <==> { <==> is reflexive, to get closer to the desired shape }
                    P <==> Q <==> Q
                 <==> { F \/ rule, to introduce a disjunct }
                    P <==> Q <==> "F" \/ Q
                 <==> { by assumption, F <==> P }
                    P <==> Q <==> P \/ Q
               .
               P <==> Q <==> P \/ Q
         .
       P /\ Q
QED

Now we can prove various other properties of conjunction.  Note that
we are now starting to use proofs of equivalences that do not appeal
to antisymmetry; those arguments are now buried in the proofs of the
lemmas above.

Lemma (conjunction is symmetric): |- P /\ Q <==> Q /\ P
Proof: This is the kind of thing where it would suffice to say, "this
is a corollary of the golden rule and symmetry of equivalence and
disjunction."  But here it is...
    |- P /\ Q
    <==> { golden rule }
       "P <==> Q" <==> P \/ Q
    <==> { <==> is symmetric }
       Q <==> P <==> P \/ Q
    <==> { \/ is symmetric }
       Q <==> P <==> Q \/ P
    <==> { golden rule }
       Q /\ P
QED

Lemma (conjunction is associative). |- (P /\ Q) /\ R <==> P /\ (Q /\ R)
Proof.
    |- (P /\ Q) /\ R
    <==> { golden rule }
       ("P /\ Q") <==> R <==> ("P /\ Q") \/ R
    <==> { golden rule, twice }
       (P <==> Q <==> P \/ Q) <==> R <==> "(P <==> Q <==> P \/ Q) \/ R"
    <==> { disjunction distributes over equivalence }
       P <==> Q <==> P \/ Q <==> R <==> P \/ R <==> Q \/ R <==> P \/ Q \/ R
    <==> { <==> is associative }
       P <==> Q <==> R <==> Q \/ R <==> "P \/ Q <==> P \/ R <==> P \/ Q \/ R"
    <==> { disjunction distributes over equivalence }
       P <==> ("Q <==> R <==> Q \/ R") <==> P \/ ("Q <==> R <==> Q \/ R")
    <==> { golden rule, twice }
       P <==> (Q /\ R) <==> P \/ (Q /\ R)
    <==> { golden rule }
       P /\ (Q /\ R)
QED

Lemma (conjunction is idempotent). |- X /\ X <==> X
Proof. This has a nice proof using the golden rule (Cohen, p. 31),
which avoids the use of antisymmetry.
    |- X /\ X
    <==> { golden rule }
       X <==> X <==> X \/ X
    <==> { \/ is idempotent }
       X <==> "X <==> X"
    <==> { <==> is reflexive }
       X <==> T
    <==> { T <==> rule }
       X
QED

The following is an immediate corollary.

Corollary (conjunction distributes over itself).
    |- X /\ (Y /\ Z) <==> (X /\ Y) /\ (X /\ Z)

Note that conjunction does not distribute over equivalence.  Instead,
there is the following law.

Lemma. |- X /\ (Y <==> Z) <==> X /\ Y <==> X /\ Z <==> X
I leave the proof of this to you (see Cohen p. 32).

Here are the "absorption laws":
 (i)  |- X \/ (X /\ Y) <==> X
 (ii) |- X /\ (X \/ Y) <==> X
Let's prove the second of these (Cohen p. 33); I leave the other to
you.
    |- X /\ (X \/ Y)
    <==> { golden rule, with P,Q := X, X \/ Y }
       X <==> X \/ Y <==> "X \/ X" \/ Y
    <==> { idempotence of \/ }
       X <==> "X \/ Y <==> X \/ Y"
    <==> { <==> is reflexive }
       X <==> T
    <==> { T <==> rule }
       X
QED

** implication (==>)

Cohen makes the following postulate for implication.
------------------------------------------
POSTULATE FOR IMPLICATION

Correspondence between \/ and ==> :
     |- P ==> Q <==> P \/ Q <==> Q
------------------------------------------
We can prove this as a lemma (note the connection to correspondence,
as on p. 34 of Back and von Wright).
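Before doing so, here is an exhaustive check of the golden rule and
some of the conjunction laws above, in the spirit of the earlier
truth-table sketch.  (A sketch only; the encodings iff, lor, and land
of <==>, \/, and /\, and the valid helper, are my own names.)

```python
from itertools import product

def iff(p, q):  return p == q        # <==>
def lor(p, q):  return p or q        # \/
def land(p, q): return p and q       # /\

def valid(f):
    # A formula is a theorem iff it holds under every assignment.
    n = f.__code__.co_argcount
    return all(f(*vs) for vs in product([False, True], repeat=n))

# the golden rule: P /\ Q <==> P <==> Q <==> P \/ Q
# (parenthesized left-to-right; any parse works, since <==> is associative)
assert valid(lambda p, q:
             iff(iff(iff(land(p, q), p), q), lor(p, q)))
# absorption: X /\ (X \/ Y) <==> X
assert valid(lambda x, y: iff(land(x, lor(x, y)), x))
# conjunction relates to equivalence only via the four-term law:
# X /\ (Y <==> Z) <==> X /\ Y <==> X /\ Z <==> X
assert valid(lambda x, y, z:
             iff(iff(iff(land(x, iff(y, z)), land(x, y)), land(x, z)), x))
```

Note that the plain three-term distribution X /\ (Y <==> Z) <==>
(X /\ Y <==> X /\ Z) fails this check (take X false), which is exactly
why the four-term form is needed.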
Lemma (correspondence between \/ and ==>). |- P ==> Q <==> P \/ Q <==> Q
Proof.
    |- P ==> Q
    <==> { case analysis }
        * [ P ]
          "P" ==> Q
          <==> { by assumption, P <==> T }
             T ==> Q
          <==> { T ==> rule }
             Q
          <==> { T <==> rule, to get a <==> in the formula }
             "T" <==> Q
          <==> { T \/ rule, to get an \/ in the formula }
             "T" \/ Q <==> Q
          <==> { by assumption, T <==> P }
             P \/ Q <==> Q
        * [ !P ]
          "P" ==> Q
          <==> { by assumption, P <==> F }
             F ==> Q
          <==> { F ==> rule }
             T
          <==> { <==> is reflexive, to get <==> and Q's in the formula }
             "Q" <==> Q
          <==> { F \/ rule, to get an \/ in the formula }
             "F" \/ Q <==> Q
          <==> { by assumption, F <==> P }
             P \/ Q <==> Q
        .
        P \/ Q <==> Q
QED

The dual of the above correspondence is another useful
characterization of implication.

Lemma (correspondence between /\ and ==>). |- X ==> Y <==> X /\ Y <==> X
Proof (Cohen p. 35).
    |- X /\ Y <==> X
    <==> { golden rule (divided as (P /\ Q <==> P) <==> (Q <==> P \/ Q)) }
       Y <==> X \/ Y
    <==> { correspondence between \/ and ==> }
       X ==> Y
QED

Here are some other properties of implication you may have fun proving
for yourself (see Cohen pp. 35-36).  The names are my own.

Lemma (currying). |- X ==> (Y ==> Z) <==> X /\ Y ==> Z
The above is also called uncurrying when used from right to left.

Lemma (modus ponens as a formula). |- X /\ (X ==> Y) <==> X /\ Y

Lemma (transitivity as a formula). |- (X ==> Y) /\ (Y ==> Z) ==> (X ==> Z)

Lemma (implication cases). |- (X ==> Y) \/ (Y ==> Z)

We've already shown mutual implication above.

Lemma. |- X <==> Y <==> X \/ Y ==> X /\ Y

** consequence (<==)

Unlike Back and von Wright, Cohen gives a slightly different
definition of the consequence operator:
------------------------------------------
CONSEQUENCE (REVERSE IMPLICATION)

 ^ P <== Q = P \/ Q <==> P
------------------------------------------
But it's a trivial matter of using the definitions to show the
following, which is equivalent to Back and von Wright's definition.

Corollary (consequence formula). |- (X <== Y) <==> (Y ==> X)

** negation (!)
Cohen gives the following postulates for negation.
------------------------------------------
POSTULATES FOR NEGATION

Definition of negation:
     |- !(P <==> Q) <==> !P <==> Q
Excluded middle:
     |- P \/ !P
Definition of F:
     |- F <==> !T
------------------------------------------
The law of excluded middle is called "not exhaustion" in Back and von
Wright.  The "definition of F" is a lemma in homework problem 6.4.
This leaves us with the first postulate, which we state as a lemma.

Lemma (definition of negation). |- !(P <==> Q) <==> !P <==> Q
Proof.
    |- !(P <==> Q)
    <==> { <==> F rule }
       (P <==> Q) <==> F
    <==> { <==> is associative and symmetric }
       (P <==> F) <==> Q
    <==> { <==> F rule }
       !P <==> Q
QED

Here are some lemmas that are not in Back and von Wright.  We leave
most of the proofs to you (see Cohen pp. 37-40).

Lemma (transfer of negation). |- !X <==> Y <==> X <==> !Y
Proof.
    |- !X <==> Y
    <==> { definition of negation }
       !(X <==> Y)
    <==> { definition of negation }
       X <==> !Y
QED

Lemma (DeMorgan's laws).
 (i)  |- !X /\ !Y <==> !(X \/ Y)
 (ii) |- !X \/ !Y <==> !(X /\ Y)

Lemma (complement rules).
 (i)  |- X \/ (!X /\ Y) <==> X \/ Y
 (ii) |- X /\ (!X \/ Y) <==> X /\ Y

Lemma (shuffle, a.k.a. shunting). |- (X /\ Y) ==> Z <==> (X ==> !Y \/ Z)

Lemma. |- X <==> !X <==> F

** discrepancy (<=!=>)

This relation is the complement of equivalence.  It's defined as
follows.
------------------------------------------
THE DISCREPANCY (LOGICAL INEQUALITY)

Definition of discrepancy:
 ^ P <=!=> Q = !(P <==> Q)
------------------------------------------
From this we can prove the following lemmas, which Cohen gives as
additional postulates.

Lemma (<=!=> is symmetric). |- P <=!=> Q <==> Q <=!=> P
Proof. This is immediate from the definition and symmetry of <==>.
    |- P <=!=> Q
    <==> { definition of discrepancy }
       !("P <==> Q")
    <==> { <==> is symmetric }
       !(Q <==> P)
    <==> { definition of discrepancy }
       Q <=!=> P
QED

Lemma (discrepancy associates with equivalence).
    |- P <==> (Q <=!=> R) <==> (P <==> Q) <=!=> R
Proof.
    |- P <==> (Q <=!=> R)
    <==> { definition of discrepancy }
       P <==> !(Q <==> R)
    <==> { transfer of negation lemma (above) }
       !P <==> (Q <==> R)
    <==> { <==> is associative }
       "!P <==> Q" <==> R
    <==> { definition of negation }
       !(P <==> Q) <==> R
    <==> { transfer of negation lemma (above) }
       (P <==> Q) <==> !R
    <==> { definition of discrepancy }
       (P <==> Q) <=!=> R
QED

Lemma (<=!=> is associative).
    |- (P <=!=> Q) <=!=> R <==> P <=!=> (Q <=!=> R)
Proof.
    |- (P <=!=> Q) <=!=> R
    <==> { definition of discrepancy }
       !("(P <=!=> Q) <==> R")
    <==> { definition of discrepancy }
       !(!(P <==> Q) <==> R)
    <==> { transfer of negation lemma (above) }
       !("(P <==> Q) <==> !R")
    <==> { associativity of equivalence }
       !(P <==> ("Q <==> !R"))
    <==> { transfer of negation lemma (above) }
       !(P <==> ("!Q <==> R"))
    <==> { definition of negation }
       !(P <==> (Q <=!=> R))
    <==> { definition of discrepancy }
       P <=!=> (Q <=!=> R)
QED

Now we have the analogs of the <==> T and <==> F rules.  We leave the
proofs to you.

Lemma (<=!=> F rule). |- X <=!=> F <==> X

Lemma (<=!=> T rule). |- X <=!=> T <==> !X
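As with the earlier operators, the negation and discrepancy laws can
be checked by truth table.  A closing Python sketch (iff, xor, and
valid are my illustrative names; xor encodes <=!=>, since the
discrepancy is exclusive-or):

```python
from itertools import product

def iff(p, q): return p == q         # <==>
def xor(p, q): return p != q         # <=!=>, i.e. !(P <==> Q)

def valid(f):
    # A formula is a theorem iff it holds under every assignment.
    n = f.__code__.co_argcount
    return all(f(*vs) for vs in product([False, True], repeat=n))

# definition of negation: !(P <==> Q) <==> !P <==> Q
assert valid(lambda p, q: iff(iff(not iff(p, q), not p), q))
# discrepancy associates with equivalence:
# P <==> (Q <=!=> R) <==> (P <==> Q) <=!=> R
assert valid(lambda p, q, r: iff(iff(p, xor(q, r)), xor(iff(p, q), r)))
# the <=!=> F and <=!=> T rules
assert valid(lambda x: iff(xor(x, False), x))
assert valid(lambda x: iff(xor(x, True), not x))
```

The checks are no substitute for the calculational proofs above, but
they make the mixed-associativity laws, which are easy to mis-state,
cheap to double-check.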