

Inference Rules of Natural Deduction

Normal human reasoning is generally a train of thought moving linearly from the premises to the conclusion. This natural process is mimicked by the "Natural" Deduction method of Propositional Logic (also called Propositional Calculus, abbreviated PC). This method of PC is what is used in mathematical proofs.

A Natural Deduction proof in PC is a sequence of wffs beginning with one or more wffs as premises; fresh premises may be added at any point in the course of a proof. New wffs are generated by applying "rules" to any wff or group of wffs that has already occurred in the sequence. This means a Natural Deduction system has two aspects: a set of rules and a method for applying the rules. Thus it is similar to the procedure of deriving theorems in mathematics. So we turn the situation around and say that all mathematical proofs are instances of Natural Deduction.

Natural Deduction uses two kinds of rules: rules of inference and rules of replacement. (The term "rule of inference" is often used to cover both types.) The former are tautologous conditionals (that is, valid wffs of the form A ⊃ B; in other words, strings of the form A ⊃ B which evaluate to T for all truth value assignments to the propositional variables), and the latter are tautologous biconditionals (that is, valid wffs of the form A ≡ B). The former apply only to entire lines of a proof, while the latter apply to components within a line as well as to the whole line. Any tautologous conditional or biconditional could serve as a valid rule of inference; but these are infinite in number. Instead, a small number of tautologies is chosen that suffices to transform other wffs. Here is a popular set, known as Copi's set of rules [1].

Table 1. Inference Rules

Modus Ponens (MP)
p ⊃ q
p
∴ q

Modus Tollens (MT)
p ⊃ q
~q
∴ ~p

Hypothetical Syllogism (HS)
p ⊃ q
q ⊃ r
∴ p ⊃ r

Disjunctive Syllogism (DS)
p ∨ q
~p
∴ q

Constructive Dilemma (CD)
(p ⊃ q) . (r ⊃ s)
p ∨ r
∴ q ∨ s

Destructive Dilemma (DD)
(p ⊃ q) . (r ⊃ s)
~q ∨ ~s
∴ ~p ∨ ~r

Simplification (Simp.)
p . q
∴ p

Conjunction (Conj.)
p
q
∴ p . q

Addition (Add.)
p
∴ p ∨ q

Table 2. Replacement Rules

De Morgan's Theorem (DM)
~(p . q) ≡ (~p ∨ ~q)
~(p ∨ q) ≡ (~p . ~q)

Commutation (Com.)
(p ∨ q) ≡ (q ∨ p)
(p . q) ≡ (q . p)

Association (Assoc.)
[(p ∨ q) ∨ r] ≡ [p ∨ (q ∨ r)]
[p . (q . r)] ≡ [(p . q) . r]

Distribution (Dist.)
[p . (q ∨ r)] ≡ [(p . q) ∨ (p . r)]
[p ∨ (q . r)] ≡ [(p ∨ q) . (p ∨ r)]

Double Negation (DN)
p ≡ ~~p

Material Implication (M. Imp.)
(p ⊃ q) ≡ (~p ∨ q)

Transposition (Trans.)
(p ⊃ q) ≡ (~q ⊃ ~p)

Material Equivalence (M. Equiv.)
(p ≡ q) ≡ [(p ⊃ q) . (q ⊃ p)]
(p ≡ q) ≡ [(p . q) ∨ (~p . ~q)]

Law of Exportation
[(p . q) ⊃ r] ≡ [p ⊃ (q ⊃ r)]

Note, again, that both the inference rules and the rules of replacement are tautologies. Replacing one side of a tautologous biconditional by the other inside a wff changes the wff but does not change its truth value; this is what a rule of replacement does.
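The claim that a rule is a tautology can be checked mechanically by evaluating it on every assignment of truth values. Here is a small Python sketch (the function names are our own illustration, not from the text), applied to De Morgan's Theorem:

```python
# Brute-force tautology check: a wff is a tautology exactly when it
# evaluates to True under every assignment to its variables.
from itertools import product

def implies(a, b):
    # The conditional is false only when the antecedent is true
    # and the consequent is false.
    return (not a) or b

def is_tautology(wff, num_vars):
    # Try all 2**num_vars rows of the truth table.
    return all(wff(*row) for row in product([True, False], repeat=num_vars))

# De Morgan's Theorem as a biconditional: ~(p . q) is equivalent to (~p v ~q).
de_morgan = lambda p, q: (not (p and q)) == ((not p) or (not q))
print(is_tautology(de_morgan, 2))  # True
```

Any of the rules in Tables 1 and 2 could be passed to the same checker; only a tautology returns True in every row.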

The inference rules in Table 1 operate at once on one or more of the previous wffs in the deduction sequence and produce a new wff. The symbol A ⊃ B is called a conditional; A is the antecedent (premise), and B is the consequent (conclusion). All but two of the rules in Table 1 (Addition and Simplification) are Syllogisms. A Syllogism is an argument whose conclusion is supported by two premises. A typical form is: All A is C; all B is A; therefore all B is C. In a Syllogism, one premise (the major premise) contains the term that is the predicate of the conclusion. The predicate of the conclusion in our example is 'is C'. So the major premise is 'All A is C'. The other premise (the minor premise) contains the term that is the subject of the conclusion. Let us look at the rules of Table 1.

1. Modus Ponens (MP): The original Latin name of the rule is Modus Ponendo Ponens, which means the method (modus) that, by affirming (ponendo) the antecedent, affirms (ponens) the consequent. If p ⊃ q is true, and p is true, then q is true. A shorter way of saying this is 'if p ⊃ q and p, then q.' A way of writing this is:
p ⊃ q
p
∴ q
Now let us consider an 'instance' of this wff by substituting declarative sentences to stand for p and q. For instance, let p stand for 'the sun is shining' and q for 'Mary is at the beach'. Then the instance of MP is:

If the sun is shining, then Mary is at the beach.
The sun is shining.
Therefore Mary is at the beach.

So why is this a tautology? To see it, let us rewrite 'if p ⊃ q, and p, then q'. It becomes [(p ⊃ q) . p] ⊃ q. Look at its truth table.

p    q    p ⊃ q    (p ⊃ q) . p    [(p ⊃ q) . p] ⊃ q
T    T      T           T                T
T    F      F           F                T
F    T      T           F                T
F    F      T           F                T

Since all entries in the last column are T's, we have a tautology. That is, regardless of the assignment of truth values to its variables, the wff [(p ⊃ q) . p] ⊃ q always yields a T value.
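A truth table like this can also be generated by machine. The Python sketch below (our own illustration, not part of the text) prints each row, writing T for true and F for false:

```python
from itertools import product

def implies(a, b):
    # p > q is false only in the row where p is True and q is False.
    return (not a) or b

def t(v):
    # Display True/False as T/F, as in the truth table.
    return "T" if v else "F"

print("p  q  p>q  (p>q).p  [(p>q).p]>q")
for p, q in product([True, False], repeat=2):
    cond = implies(p, q)   # p > q
    conj = cond and p      # (p > q) . p
    mp = implies(conj, q)  # [(p > q) . p] > q
    print(f"{t(p)}  {t(q)}  {t(cond)}    {t(conj)}        {t(mp)}")
```

The last column prints T in all four rows, confirming that MP is a tautology.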

2. Modus Tollens (MT): If p ⊃ q is true, and ~q is true, then ~p is true. The Latin name is Modus Tollendo Tollens, which means that by denying (tollendo) the consequent, we deny (tollens) the antecedent of the conditional.

If the sun is shining, then Mary is at the beach.
Mary is not at the beach.
Therefore the sun is not shining.
As above, a truth table would show that the wff representing MT, namely [(p ⊃ q) . ~q] ⊃ ~p, is a tautology.

3. Hypothetical Syllogism (HS): From p ⊃ q and q ⊃ r, infer p ⊃ r. Also known as "chain reasoning", a hypothetical syllogism is not limited to two premises. Any number of premises can be used, provided that each one preserves the chain: the consequent of each conditional must be the antecedent of the next.

If the sun is shining, then Mary is at the beach.
If Mary is at the beach, then she is swimming.
If she is swimming, then she will be tired tonight.
Therefore, if the sun is shining, Mary will be tired tonight.
As above, a truth table would show that [(p ⊃ q) . (q ⊃ r)] ⊃ (p ⊃ r) is a tautology.

4. Disjunctive Syllogism (DS): From p ∨ q and ~p, infer q.

Either the sun is shining or it is raining.
The sun is not shining.
Therefore it is raining.
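Each of rules 2-4 (MT, HS, DS) can likewise be confirmed by exhausting its truth table, rather than writing the tables out by hand. A Python sketch (the names are our own illustration):

```python
from itertools import product

def implies(a, b):
    return (not a) or b

def is_tautology(wff, n):
    # Evaluate the wff on all 2**n assignments to its n variables.
    return all(wff(*row) for row in product([True, False], repeat=n))

# Modus Tollens: [(p > q) . ~q] > ~p
mt = lambda p, q: implies(implies(p, q) and (not q), not p)
# Hypothetical Syllogism: [(p > q) . (q > r)] > (p > r)
hs = lambda p, q, r: implies(implies(p, q) and implies(q, r), implies(p, r))
# Disjunctive Syllogism: [(p v q) . ~p] > q
ds = lambda p, q: implies((p or q) and (not p), q)

print(is_tautology(mt, 2), is_tautology(hs, 3), is_tautology(ds, 2))  # True True True
```

Note that HS requires 8 rows (three variables) while MT and DS require only 4.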

5. Conjunction (Conj.): From p and q, infer p . q. This rule will be familiar by now, because it is implicit in the logical operation of joining two true atomic statements and writing them as a compound statement, linked by the conjunction sign. All it says is that if two independent statements are true, then the conjoined statement is true.

6. Simplification (Simp.): From p . q, infer p. This is the reverse of Conjunction. If a conjoined statement is true, then each of its atomic statements is true. But this rule as stated (from p . q, infer p) seems to permit only the left conjunct to be inferred. However, note the Commutation rule in Table 2: p . q ≡ q . p. So from p . q we can also infer q, which is the right conjunct.

7. Addition (Add.): From p, infer p ∨ q. For a disjunction to be true, only one of the disjuncts needs to be true. So, if you know that p is true, you can add any other statement whatsoever to it by means of the disjunctive logical operator, and the resulting compound statement will be true.

8. Constructive Dilemma (CD): From (p ⊃ q) and (r ⊃ s) and p ∨ r, infer q ∨ s.

If John moves to Alaska, then he will freeze this winter;
but if he moves to Miami, then he will burn up next summer.
Either he will move to Alaska or Miami.
Therefore, he will either freeze this winter or burn up next summer.

Colloquially, we call a 'dilemma' a predicament where we have to choose between alternatives neither of which yields a pleasant outcome. In a dilemmatic argument, the two hypotheticals together constitute the major premise. These hypotheticals are called the 'horns'; they give the impression of presenting us with a predicament. The minor premise is an (inclusive, and/or-type) disjunction; it is said to 'take the dilemma by its horns'.

Examples of Deductive Arguments

Example 1. Prove that a contradiction implies any proposition B.
(We have seen the convention that False can imply either True or False. That is, False can imply anything.) We use the fact that, since ~(p . ~p) is a tautology in PC, its negation (p . ~p) is a contradiction.

(1) A . ~A [Premise]
(2) A [1, Simplification]
(3) ~A [1, Commutation, Simplification]
(4) ~A ∨ B [3, Addition]
(5) B [2, 4, Disjunctive Syllogism]
In the last step, we use steps (2) and (4) to infer B by Disjunctive Syllogism. How? (4) says ~A ∨ B is true, and (2) says A is true; then by DS, B must be true. The conclusion is that (A . ~A) implies B, for any B. Notice the column of numbers on the left and the column of square brackets on the right. The numbers denote the numerical order of the steps. The square brackets are the justifications for each step, identifying which previous lines in the proof are used as well as which rule. You may observe that in the two-column proofs in geometry in school, the right column contains the justifications. That is Natural Deduction!
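The conclusion of Example 1 can also be confirmed directly by truth table: the single wff (A . ~A) ⊃ B is a tautology, because its antecedent is false in every row. A quick Python check (our own sketch, not from the text):

```python
from itertools import product

def implies(a, b):
    return (not a) or b

# (A . ~A) > B: the antecedent A . ~A is False in every row,
# so the conditional comes out True in every row.
ok = all(implies(a and (not a), b) for a, b in product([True, False], repeat=2))
print(ok)  # True
```

This is the truth-table counterpart of the five-step deduction above: both establish that a contradiction implies any proposition B.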

Example 2. Prove Modus Tollens.
If A ⊃ B is true, and ~B is true, then ~A is true.
(1) A ⊃ B [Premise]
(2) ~B [Premise]
(3) ~A ∨ B [1, Material Implication]  (See Replacement Rules in Table 2 for Material Implication.)
(4) ~A [2, 3, DS]

So, we have derived MT from two rules in our lists (Material Implication and DS). This means that MT could be excluded from a minimal list of rules. True! The reason for keeping MT in the list is convenience: it often saves a few steps in a proof. Using Material Implication and Disjunctive Syllogism we can also prove MP. This means we could eliminate both MP and MT from our list. On the other hand, we can use MP and Material Implication to prove DS. And so on!
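The derivation in Example 2 leans on Material Implication being a tautologous biconditional: (p ⊃ q) and (~p ∨ q) agree in every row, so one may replace the other anywhere in a proof. A short Python sketch (illustrative names of our own) confirms the row-by-row agreement:

```python
from itertools import product

def implies(a, b):
    return (not a) or b

# Material Implication: (p > q) and (~p v q) have the same truth value
# in every row of the truth table, so either may replace the other.
for p, q in product([True, False], repeat=2):
    assert implies(p, q) == ((not p) or q)
print("Material Implication verified in all 4 rows")
```

The same row-by-row test works for any rule in Table 2, since each is a biconditional between two equivalent forms.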

Example 3. Prove Constructive Dilemma
(A ⊃ B) . (C ⊃ D)
A ∨ C
∴ B ∨ D

(1) (A ⊃ B) . (C ⊃ D) [Premise]
(2) A ∨ C [Premise]
(3) C ⊃ D [1, Simplification]
(4) ~A ⊃ C [2, DN, Material Implication]
(5) ~A ⊃ D [3, 4, HS]
(6) ~D ⊃ A [5, Transposition, DN]
(7) A ⊃ B [1, Simplification]
(8) ~D ⊃ B [6, 7, HS]
(9) D ∨ B [8, Material Implication, DN]
(10) B ∨ D [9, Commutation]

Notice in this example that some steps involved the use of more than one rule.
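Instead of a step-by-step derivation, Constructive Dilemma can also be verified wholesale by checking that the corresponding conditional is a tautology; with four variables there are 16 rows to examine. A Python sketch (our own illustration):

```python
from itertools import product

def implies(a, b):
    return (not a) or b

# Constructive Dilemma as one wff:
# [(A > B) . (C > D) . (A v C)] > (B v D)
cd = all(implies(implies(a, b) and implies(c, d) and (a or c), b or d)
         for a, b, c, d in product([True, False], repeat=4))
print(cd)  # True
```

The truth table settles the matter in one sweep, but the ten-line deduction shows how the conclusion actually follows from the premises by the rules.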

The theory of proof is not difficult. What is difficult is coming up with the chain of steps to reach the conclusion.

[1] I. M. Copi, Symbolic Logic, 5th Edition. New York: Macmillan, 1979.

--  G.R.T.





Copyright © 2001-2007 MathPath
Last updated - January 20, 2007