Natural Deduction: Technique of writing logical and mathematical proofs in a
format that is close to ordinary ('natural') reasoning, in that the proofs rely
on rules of reasoning that refer only to the grammatical form (and not to the
meaning or interpretation) of formulas.
There are many systems of Natural Deduction (ND), a technique that was first
conceived of and developed independently by Gerhard Gentzen and by Stanislaw
Jaskowski in the 1930s.
Until then, the standard technique of proof in formal logic was axiomatic: one
had axioms plus some rules, usually modus ponens and substitution, and the
proved theorems were those formulas that could be obtained from the axioms by
these rules.
One important difference from the then-standard axiomatic style of proof was
that ND assumed far more rules and few or no axioms, with the result that many
proofs were easier to find and to read. In part this is due to the format, and
in part to a key idea of ND: that of subordinate proofs. These are proofs
within proofs, which make it possible to compose the proof of a formula from
the proofs of several other formulas.
In the following sections I give a number of rules for propositional logic,
predicate logic, and set theory and will be mostly following Cartwright,
Seebohm, and Tennant.
1. Preliminaries
A proof in ND is supposed to have a specific format: it is a sequence of numbered lines of formulas,
each of which has a justification in terms of an assumed rule of inference.
Thus, the general format is
(1) F1 J1
...
(i) Fi Ji
...
(n) Fn Jn
where at each line (i) there is a formula Fi followed by a justification of
the formula Ji, and the justification either is (a) that Fi is an assumption or
else (b) refers to earlier lines in the proof and to some assumed rule of
inference or (c) names a theorem of which the formula is an instance.
To start with case (a): the simplest proof consists of a single assumption,
which has its own rule, AI = Assumption Introduction, which looks like this:
AI  Assumption Introduction

    n    A_n    AI
Assumption Introduction says that line n consists of the assumption A_n, whose
justification is simply "AI".
All in all, what the rule amounts to is the freedom to assume what one
pleases, as long as the assumption is clearly justified as such, and what is
assumed is a well-formed formula. To repeat:
A proof is supposed to have a certain format:
- it consists of numbered lines, with one formula per line
- every line has a justification
- every justification, if not AI, refers to previous lines in the proof
Thus schematically it looks like this:
    1    A_1    J1
    ...
    n    A_n    Jn
    ...
    --------------------
    p    A_p    Jp,1,n
Here the line numbers, which function as names of the formulas that
immediately follow them, are on the left, followed by schematic indications of
formulas and of their justifications. In the given schema, at line p the
formula A_p is supposed to be justified by the earlier lines 1 and n.
Also, that this is a rule is indicated by the horizontal line: From lines 1
and n, regardless of whatever other lines there are in the proof, line p may be
inferred on the strength of the rule Jp.
Another simple one-line rule that is often needed in ND proofs is Repetition:
Rep  Repetition

    n    A_n    Jn
    ...
    --------------------
    p    A_n    Rep,n
Repetition says that line p follows from line n by Rep, which simply repeats
an earlier line of the proof.
So far, apart from AI and Rep, no rules of inference have been given. We now
proceed to give them.
2. Propositional Logic
There are quite a few different systems of ND for Classical Propositional
Logic; rules of inference for it are given in this section. The format and
rules presented here were chosen because they are clear and simple.
II  Implication Introduction

    p      A       AI
           ...
    q      B
    --------------------
    r    A => B    II,p,q
Implication Introduction says that one can infer at line r that A=>B (read
as: "A implies B") if one already has a subordinate proof of B (by some rules)
from the assumption A.
Note that at this point we have the rules that enable us to prove that
Theorem: A=>A
Namely: We can assume A by AI, then repeat that by Rep, and thus have a
subordinate proof of A from A, for which reason we can use II to infer A=>A.
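This first theorem can be cross-checked in Lean 4 (an illustration added here; Lean's arrow `→` plays the role of `=>`, an assumed correspondence rather than part of the original system):

```lean
-- A => A: assume A (AI), return the assumption (Rep); II discharges it
theorem self_implication (A : Prop) : A → A :=
  fun a => a
```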
Note the conventions used in the writing of this rule:
- each lower line is a later stage in the proof, and therefore later lines
must have higher line numbers
- at line p the assumption of A is made
- at line q, possibly with many intermediate formulas between p and q, B is
concluded
- at line r, A=>B is concluded by II, on the strength of lines p and q
There is one important qualification on the lines that are part of a
subordinate proof:
 Rule for subordinate proofs: NO line in a subordinate proof may
occur as a justification OUTSIDE the subordinate proof itself
The reason is that the lines in a subordinate proof may and often do depend
on the assumption that started the subordinate proof, and this assumption
usually is not in force anymore later on in the proof.
IE  Implication Elimination

    p    A => B
    q    A
    --------------------
    r    B         IE,p,q
Implication Elimination is the ND version of the classical rule of Modus
Ponens, which is used as the main or only rule in axiomatic proofs. It says
that if one has that A=>B and one also has that A, then from these two one can
infer B.
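In Lean 4 terms (again an assumed correspondence, used here only for illustration), IE is simply function application:

```lean
-- IE / Modus Ponens: apply the proof of A => B to the proof of A
example (A B : Prop) (hab : A → B) (ha : A) : B := hab ha
```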
It is now already possible to prove theorems about implication, like this
Theorem: (A=>B)=>((B=>C)=>(A=>C))
Here is the proof:
    1        (A=>B)                     AI
    2          (B=>C)                   AI
    3            A                      AI
    4            B                      IE,1,3
    5            C                      IE,2,4
    6          A=>C                     II,3,5
    7        (B=>C)=>(A=>C)             II,2,6
    8    (A=>B)=>((B=>C)=>(A=>C))       II,1,7
Note how this proof can be read from the page, as it were: 'Suppose
(1) A implies B, and suppose (2) B implies C. Now suppose (3) A. Then
from 1 and 3 it follows that B, and then from B and 2 that C. So from 3
and C it follows that A implies C. Therefore, from 2, (B=>C)=>(A=>C),
and finally, from 1, the theorem.'
This sort of proof also illustrates the name 'Natural Deduction'.
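For comparison, the same theorem can be written as a Lean 4 term, where each nested `fun` corresponds to one of the nested subordinate proofs (a sketch under the assumed reading of `=>` as Lean's `→`):

```lean
-- (A=>B) => ((B=>C) => (A=>C)): three assumptions, then IE twice
theorem imp_chain (A B C : Prop) : (A → B) → ((B → C) → (A → C)) :=
  fun hab => fun hbc => fun a => hbc (hab a)
```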
NI  Negation Introduction

    p      A       AI
           ...
    q      B
    r      ~B
    --------------------
    r+1  ~A        NI,p,q,r
Negation Introduction involves a subordinate proof: to prove ~A one assumes A
in a subordinate proof and derives a contradiction from it (which usually will
depend on earlier assumptions or conclusions in the proof), and one concludes
that since A entails a contradiction, A must be false.
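A Lean 4 sketch of the NI pattern, with the contradiction supplied by two hypothetical derivations `h` and `hn` (names chosen here for illustration):

```lean
-- NI: the assumption A yields both B and ~B, so ~A follows
example (A B : Prop) (h : A → B) (hn : A → ¬B) : ¬A :=
  fun a => hn a (h a)
```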
NE  Negation Elimination

    p    ~~A
    --------------------
    q    A         NE,p
Negation Elimination allows the removal of a pair of negations.
Of course, one or both of these negations may have come from NI, or AI or
some other rules.
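In Lean 4 this rule needs classical logic (double-negation elimination is not constructively valid), so a sketch uses `Classical.byContradiction`:

```lean
-- NE: remove a double negation (classical principle)
example (A : Prop) (h : ¬¬A) : A :=
  Classical.byContradiction h
```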
CI  Conjunction Introduction

    p    A
    q    B
    --------------------
    r    A&B       CI,p,q
Conjunction Introduction allows the introduction of a conjunction of two
statements on condition that each statement has been proved already.
CE  Conjunction Elimination (left)

    p    A&B
    --------------------
    r    A         CE,p
Conjunction Elimination allows the inference of one conjunct if one has the
conjunction.
The present form gives the rule to infer the left conjunct, and the next rule
does the same for the right conjunct:
CE  Conjunction Elimination (right)

    p    A&B
    --------------------
    r    B         CE,p
Conjunction Elimination allows the inference of one conjunct if one has the
conjunction.
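The three conjunction rules correspond to `And.intro`, `And.left` and `And.right` in Lean 4 (an assumed mapping, for illustration):

```lean
-- CI packs two proofs; CE (left/right) unpacks them
example (A B : Prop) (ha : A) (hb : B) : A ∧ B := And.intro ha hb
example (A B : Prop) (h : A ∧ B) : A := h.left
example (A B : Prop) (h : A ∧ B) : B := h.right
```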
At this point we have the rules to prove
Theorem: ~(A&~A)
Here is the proof:
    1      (A&~A)     AI
    2      A          CE,1
    3      ~A         CE,1
    --------------------
    4    ~(A&~A)      NI,1,2,3
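The same theorem as a Lean 4 term (a sketch; the two projections play the roles of the two CE steps):

```lean
-- ~(A & ~A): the conjuncts of the assumption refute each other
theorem no_contradiction (A : Prop) : ¬(A ∧ ¬A) :=
  fun h => h.right h.left
```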
Next, we formulate the rules for disjunction. To introduce a disjunction we
have two rules:
DI  Disjunction Introduction (left)

    p    A
    --------------------
    r    AVB       DI,p
Disjunction Introduction allows the inference of a disjunction if one has one
disjunct.
DI  Disjunction Introduction (right)

    p    B
    --------------------
    r    AVB       DI,p
Disjunction Introduction allows the inference of a disjunction if one has one
disjunct.
At this point we can infer theorems like

Theorem: A=>(AVB)          (using AI, DI, II)
Theorem: (A&B)=>(AVB)      (using AI, CE, DI, II)
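The first of these theorems, sketched in Lean 4 (`Or.inl` playing the role of DI on the left):

```lean
-- A => (A V B): assume A, then DI on the left disjunct
theorem left_disjunct (A B : Prop) : A → A ∨ B :=
  fun a => Or.inl a
```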
The rule for eliminating a disjunction is more complicated and involves the
idea of subordinate proof:
DE  Disjunction Elimination

    p    AVB
    q      A       AI
           ...
    r      C
    s      B       AI
           ...
    t      C
    --------------------
    t+1  C         DE,p,q,r,s,t
Disjunction Elimination allows the inference of a conclusion C from a
disjunction AVB if one has subordinate proofs of C from A and of C from B.
This is also known as 'proof by cases': To prove that 'it is cold' from the
premiss 'it freezes or it snows' one may prove that if it freezes then it is
cold, and if it snows then it is cold.
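Proof by cases corresponds to `Or.elim` in Lean 4 (an assumed correspondence, shown for illustration):

```lean
-- DE: C follows from A V B given proofs of C from A and of C from B
example (A B C : Prop) (h : A ∨ B) (hac : A → C) (hbc : B → C) : C :=
  Or.elim h hac hbc
```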
At this point we have the rules to prove a theorem like
Theorem: ~(AVB) => ~A&~B
Here is the proof:
    1      ~(AVB)             AI
    2        A                AI
    3        (AVB)            DI,2
    4        ~(AVB)           Rep,1
    5      ~A                 NI,2,3,4
    6        B                AI
    7        (AVB)            DI,6
    8        ~(AVB)           Rep,1
    9      ~B                 NI,6,7,8
    10     ~A&~B              CI,5,9
    --------------------------------
    11   ~(AVB) => ~A&~B      II,1,10
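This direction of De Morgan's law can be checked as a Lean 4 term (a sketch; `Or.inl`/`Or.inr` are the two DI steps, and the outer `fun` is the discharged assumption of line 1):

```lean
-- ~(A V B) => ~A & ~B, following the proof above
theorem neg_over_or (A B : Prop) : ¬(A ∨ B) → ¬A ∧ ¬B :=
  fun h => And.intro (fun a => h (Or.inl a)) (fun b => h (Or.inr b))
```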
Now we have formulated the rules for most of the standard logical
connectives, namely 'not', 'or', 'and' and 'implies'. The one remaining
connective is 'if and only if', abbreviated 'iff', which is in effect a
conjunction of two implications. The rule corresponds to this:
EI  Equivalence Introduction

    p      A       AI
           ...
    q      B
    r      B       AI
           ...
    s      A
    --------------------
    s+1  A iff B   EI,p,q,r,s
Equivalence Introduction allows the introduction of A iff B from subordinate
proofs of B from A and of A from B.
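In Lean 4 the corresponding step is `Iff.intro`, which takes the two converse implications (an assumed mapping, for illustration):

```lean
-- EI: an equivalence from subordinate proofs in both directions
example (A B : Prop) (hab : A → B) (hba : B → A) : A ↔ B :=
  Iff.intro hab hba
```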
There are again, as for 'or' and 'and', two rules for eliminating either part
of an equivalence:
EE  Equivalence Elimination (left)

    p    A iff B
    q    A
    --------------------
    r    B         EE,p,q
Equivalence Elimination allows the inference of B from A iff B if one also
has A.
EE  Equivalence Elimination (right)

    p    A iff B
    q    B
    --------------------
    r    A         EE,p,q
Equivalence Elimination allows the inference of A from A iff B if one also
has B.
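Both elimination directions can be sketched in Lean 4 with `Iff.mp` and `Iff.mpr`:

```lean
-- Equivalence elimination in both directions
example (A B : Prop) (h : A ↔ B) (ha : A) : B := h.mp ha
example (A B : Prop) (h : A ↔ B) (hb : B) : A := h.mpr hb
```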
The above rules taken together are sufficient to prove all theorems of
standard propositional logic. Showing that this is so takes some cleverness.
One way of doing it is to show that the rules allow the derivation of the
axioms of some standard axiomatic system of propositional logic: once these
are derived, one can prove with the given set of ND rules for PL everything
provable in the standard axiomatic system, since the latter usually relies
only on the rule of Modus Ponens, which is the same as the above rule of
Implication Elimination.
The point of the present section was merely to give the rules. It remains to
be said that each of the rules except AI is valid in the customary sense in
terms of the standard truth-table definitions of the logical connectives:
given the standard truth-table definitions of 'not', 'and', 'or', 'implies'
and 'if and only if', one can prove for each rule of inference given above
that if its assumptions are true, then so is its conclusion.
3. Predicate Logic
Predicate Logic extends Propositional Logic by refining the analysis of
statements. In Propositional Logic the smallest unit, so to speak, is a
statement, which cannot be further analysed in PL. In Predicate Logic,
statements are supposed to be made up of subjects and predicates. The subjects
are supposed to be like common or individual names, and the predicates are
supposed to be what a statement is apart from the terms in it.
Thus, if in 'Romeo loves Juliet' 'Romeo' and 'Juliet' are supposed to be
subject terms, 'loves' is supposed to be a predicate term.
For more, see Logical Notation and Relations. Here I take a subject-predicate
analysis of statements for granted, and also the introduction of subject terms
and predicate terms, and the notion of subject variables.
Given that, the logical terms of Classical Predicate Logic are the
quantifiers, namely the expressions 'for all' and its equivalents or near
synonyms ('for each', 'for every') and 'for some' and its equivalents or near
synonyms ('there exists', 'there is'), and the notion of identity, which allows
one to assert that two terms are identical.
Thus, the formulas of predicate logic comprise such forms as
- for every x, for every y: if x is smaller than y then y is larger than x
- for every x, for every y, for every z: if x=y and y=z then x=z
- for some x, for some y: x is smaller than y
and very many more.
In this section I give NDrules for equality and for the quantifiers, and
start with the former.
R=  Reflexivity of Identity

    p    t1=t1     R=
Reflexivity of identity allows the assertion that a term is identical to
itself.
This is a rule that does not require any other assumptions and thus is like
an axiom: Everything (named by a term) is identical to itself.
S=  Symmetry of Identity

    p    t1=t2
    --------------------
    r    t2=t1     S=,p
Symmetry of identity allows the inference of t2=t1 from a line with t1=t2.
T=  Transitivity of Identity

    p    t1=t2
    q    t2=t3
    --------------------
    r    t1=t3     T=,p,q
Transitivity of identity allows the inference of t1=t3 from two lines to the
effect that t1=t2 and t2=t3.
=  Substitution of Identity

    p    Ft1
    q    t1=t2
    --------------------
    r    Ft2       =,p,q
Substitution of identity allows the inference of Ft2 from two lines to the
effect that Ft1 and t1=t2.
≠  Non-identity Introduction

    p    Ft1
    q    ~Ft2
    --------------------
    r    t1≠t2     ≠,p,q
Non-identity introduction allows the inference that two terms are not
identical from two lines to the effect that one term has, and the other term
lacks, some predicate.
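The identity rules correspond to standard equality operations in Lean 4 (`rfl`, `Eq.symm`, `Eq.trans`, and substitution via `▸`); the examples below over `Nat` are illustrations under that assumed mapping:

```lean
-- R=, S=, T=, substitution of identity, and non-identity introduction
example (t1 : Nat) : t1 = t1 := rfl
example (t1 t2 : Nat) (h : t1 = t2) : t2 = t1 := h.symm
example (t1 t2 t3 : Nat) (h1 : t1 = t2) (h2 : t2 = t3) : t1 = t3 := h1.trans h2
example (F : Nat → Prop) (t1 t2 : Nat) (hf : F t1) (h : t1 = t2) : F t2 := h ▸ hf
example (F : Nat → Prop) (t1 t2 : Nat) (hf : F t1) (hn : ¬F t2) : t1 ≠ t2 :=
  fun he => hn (he ▸ hf)
```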
QAI  For All Introduction

    p    Ft        [*]
    --------------------
    r    (x): Fx   QAI,p
For All Introduction allows the inference of (x): Fx from the premiss Ft,
provided that the following restrictions [*] hold on t:
[1] t does not occur in any non-eliminated assumption used in the proof
[2] if t occurs in F at line p, then t is everywhere uniformly substituted by
x in line r.
The reason for the restrictions is that without them the inference may be
invalid: if t occurs in a non-eliminated assumption or in a proved line, then
at those places some condition may be asserted of t that is not compatible
with the conclusion that Fx holds for every x.
With these restrictions, which intuitively amount to the requirement that Ft
can be proved without any special assumptions about t, the idea of the proof
is that if one can prove of an arbitrary t that it satisfies the predicate F
(as for example "Gt V ~Gt", where "F" pans out as "G_ V ~G_", writing an
underscore "_" where the term "t" fits), then one can prove that F holds for
every x.
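The "Gt V ~Gt" illustration can be rendered in Lean 4, where QAI is a `fun` over an arbitrary term and classical logic supplies the instance (this rendering is an assumption of the sketch):

```lean
-- QAI: G t V ~G t is provable for arbitrary t, hence for all x
theorem all_em (G : Nat → Prop) : ∀ x, G x ∨ ¬G x :=
  fun t => Classical.em (G t)
```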
QAE  For All Elimination

    p    (x): Fx
    --------------------
    r    Ft        QAE,p
For All Elimination allows the inference of Ft from (x):Fx.
This has no restrictions, and the idea of the proof is simply that if it is
true for every x that Fx, then it must be true for any arbitrary t one can
mention, including terms that already occur in the proof.
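In Lean 4, QAE is simply the application of a universal statement to a term (the term `7` below is an arbitrary illustration):

```lean
-- QAE: from (x): Fx infer Ft for any term t
example (F : Nat → Prop) (h : ∀ x, F x) : F 7 := h 7
```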
QEI  For Some Introduction

    p    Ft
    --------------------
    r    (Ex): Fx  QEI,p
For some introduction allows the inference of (Ex):Fx from Ft.
There are no restrictions, and the idea of the proof is simply that if it is
true for t that it is F, then it must be true for some variable x replacing t in
Ft that it is F: If Bucephalos is a horse, then some x is a horse.
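The Bucephalos-style step can be sketched in Lean 4, where QEI supplies a witness with the anonymous constructor:

```lean
-- QEI: from "1 > 0" infer that some x is greater than 0
example : ∃ x : Nat, x > 0 := ⟨1, Nat.succ_pos 0⟩
```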
QEE  For Some Elimination

    p    (Ex): Fx
    q      Ft      AI [*]
           ...
    r      C
    --------------------
    r+1  C         QEE,p,q,r
For Some Elimination allows the inference of C from (Ex):Fx together with a
subordinate proof of C from the assumption Ft, provided that the following
restrictions [*] hold on t:
[1] t does not occur in any non-eliminated assumption used in the proof
[2] t does not occur in C
[3] t does not occur in (Ex):Fx
The reason for the restrictions is that without them the inference may be
invalid: if t occurs in a non-eliminated assumption or in a proved line, then
at those places some condition may be asserted of t that is not compatible
with the conclusion C.
With these restrictions, intuitively speaking, the idea of the proof is that
if a conclusion C that does not mention t follows from Ft (as restricted),
then nothing in the derivation depended on which witness t was chosen, so C
already follows from (Ex):Fx itself.
This may look somewhat abstruse, but allows one to argue e.g. as follows:
    p    Some number is greater than 0.
    q      1 is greater than 0.              AI
    r      There is a positive number.       QEI,q
    ------------------------------------------
    r+1  There is a positive number.         QEE,p,q,r
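The same argument can be sketched in Lean 4 with `Exists.elim`; note that the conclusion does not mention the witness `t`, as the restrictions require:

```lean
-- QEE: from "some n > 0" and a derivation of C from an arbitrary witness, infer C
example (h : ∃ n : Nat, n > 0) : ∃ m : Nat, m > 0 :=
  h.elim fun t ht => ⟨t, ht⟩
```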
