First published Tue Jul 2, 1996; substantive revision Fri Aug 18, 2017

Inconsistent Mathematics began historically with foundational considerations. Set-theoretic paradoxes noted by Russell and others led to attempts to produce a consistent set theory as a foundation for mathematics. But, as is well known, set theories such as ZF, NBG and the like were in various ways ad hoc. Hence, a number of people including da Costa (1974), Brady (1971, 1989), Priest, Routley, & Norman (1989, pp. 152, 498), considered it preferable to retain the full power of the natural comprehension principle (every predicate determines a set), and tolerate a degree of inconsistency in set theory. Brady, in particular, has extended, streamlined and simplified these results on naive set theory in his book (2006); for a clear account see also Restall’s review (2007).

These constructions require, of course, that one dispense at least with the logical principle ex contradictione quodlibet (ECQ) (from a contradiction every proposition may be deduced, also recently called explosion), as well as any principle which leads to it, such as disjunctive syllogism (DS) (from A-or-B and not-A deduce B). ECQ trivialises any inconsistent theory (triviality = every sentence is provable), which makes it useless for mathematical calculation. But considerable debate (Burgess 1981, Mortensen 1983) made it clear that dispensing with ECQ and DS was not so counter-intuitive, especially when a plausible story emerged about the special conditions under which they continue to hold.

It should also be noted that Brady’s construction of naive set theory opens the door to a revival of Frege-Russell logicism, which was widely held, even by Frege himself, to have been badly damaged by the Russell Paradox. If the Russell Contradiction does not spread, then there is no obvious reason why one should not take the view that naive set theory provides an adequate foundation for mathematics, and that naive set theory is deducible from logic via the naive comprehension schema. The only change needed is a move to an inconsistency-tolerant logic. Even more radically, Weber, in related papers (2010), (2012), has taken the inconsistency to be a positive virtue, since it enables us to settle several questions that were left open by Cantor, namely, that the well-ordering theorem and the axiom of choice are provable, and that the Continuum Hypothesis is false (2012, 284). Some of these come out provably both true and false; here Weber is concerned to advance proofs of the classical recapture, the project of showing that traditional results remain true (2010, 72). This is invigorating new ground. Weber also showed something essential to this project, namely, that Cantor’s Theorem continues to hold; that is, it does not depend on overly-strong logical principles which are contested by paraconsistentists. Retaining Cantor’s Theorem is important in Weber’s view, since different orders of infinity remain available in inconsistent set theory.

In addition, mathematics has a metalanguage, for talking about mathematics itself. This includes the concepts: (i) names for mathematical statements and other parts of syntax, (ii) self-reference, (iii) proof and (iv) truth. Gödel’s contribution to the philosophy of mathematics was to show that the first three of these can be rigorously expressed in arithmetical theories, albeit in theories which are either inconsistent or incomplete. The possibility of a well-structured example of the former of these two alternatives, inconsistency, was not taken seriously, again because of belief in ECQ. Natural languages, moreover, seem to have their own truth predicate. Combined with self-reference this produces the Liar paradox, “This sentence is false”, an inconsistency. Priest (1987) and Priest, Routley and Norman (1989, p. 154) argued that the Liar had to be regarded as a statement both true and false, a true contradiction. This represents another argument for studying inconsistent theories, namely the claim that some contradictions are true, also known as dialetheism. Kripke (1975) proposed instead to model a truth predicate differently, in a consistent incomplete theory. We see below that incompleteness and inconsistency are closely related.

But these remarks have been about foundations, and mathematics is not its foundations. Hence there is a further independent motive, to see what mathematical structure remains wherever the constraint of consistency is relaxed. But it would be wrong to regard this as in any way a repudiation of the structures studied in classical mathematics: inconsistent structures represent an addition to known structures.

Robert K. Meyer (1976) seems to have been the first to think of an inconsistent arithmetical theory. At this point, he was more interested in the fate of a consistent theory, his relevant arithmetic R#. This amounts to the axioms for Peano arithmetic, with a base of the quantified relevant logic RQ, and Meyer hoped that the weaker base of relevant logic would allow more models. He was right. There proved to be a whole class of inconsistent arithmetical theories; see Meyer and Mortensen (1984), for example. In a parallel with the above remarks on rehabilitating logicism, Meyer argued that these arithmetical theories provide the basis for a revived Hilbert Program. Hilbert’s program was the project of rigorously formalising mathematics and proving its consistency by simple finitary/inductive procedures. It was widely held to have been seriously damaged by Gödel’s Second Incompleteness Theorem, according to which the consistency of arithmetic was unprovable within arithmetic itself. But a consequence of Meyer’s construction was that within his arithmetic R# it was demonstrable by finitary means that whatever contradictions there might happen to be, they could not adversely affect any numerical calculations. Hence Hilbert’s goal of conclusively demonstrating that mathematics is trouble-free proves largely achievable as long as inconsistency-tolerant logics are used.

The arithmetical models used by Meyer and Mortensen later proved to allow inconsistent representation of the truth predicate. They also permit representation of structures beyond natural number arithmetic, such as rings and fields, including their order properties. Axiomatisations were also provided. Recently, the finite inconsistent arithmetical collapse models, a strictly larger class than those studied by Meyer and Mortensen, have been completely characterised by Graham Priest. Collapse models are obtained from classical models by collapsing the domain down to congruence classes generated by various congruence relations. When members of the same congruence class are identified, the theories produced are inconsistent. For example, Meyer’s initial construction collapsed the integers under the congruence modulo 2. This puts 0 and 2 in the same congruence class, and in a suitable three-valued logic, both 0=2 and not-(0=2) hold. Priest showed that these models take a certain general form, see Priest (1997) and (2000). Strictly speaking, Priest went a little too far in including “clique models”. This was corrected by Paris and Pathmanathan (2006), and extended into the infinite by Paris and Sirokfskich (2008). Even more recently, Tedder (2015) obtained axiomatisations for the class of finite collapse models with a different background logic, Avron’s A3.
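The mod-2 collapse can be illustrated with a small sketch. The code below is a toy rendering (the encoding of truth values and the function names are mine, not Priest’s or Meyer’s formal constructions), using the three values of a paraconsistent logic such as Priest’s LP: true only, false only, and both. Identifying members of the same congruence class makes the identity 0 = 2 come out both true and false.

```python
# A minimal sketch (not the full collapse-model construction) of the
# congruence-modulo-2 collapse, using three truth values:
# T (true only), F (false only), B (both true and false).

T, B, F = "T", "B", "F"
DESIGNATED = {T, B}          # a sentence "holds" if its value is T or B

def neg(v):
    """Negation in a three-valued paraconsistent logic: B stays B."""
    return {T: F, F: T, B: B}[v]

def identity(a, b, modulus=2):
    """Value of the atomic sentence a = b in the collapsed model."""
    if a == b:
        return T                      # classically true
    if a % modulus == b % modulus:
        return B                      # identified by the collapse: true AND false
    return F

v = identity(0, 2)
print(v)                              # B
print(v in DESIGNATED)                # True: 0 = 2 holds
print(neg(v) in DESIGNATED)           # True: not-(0 = 2) also holds
print(identity(0, 1) in DESIGNATED)   # False: 0 = 1 simply fails
```

Because both 0 = 2 and its negation receive a designated value, the theory is inconsistent; but sentences such as 0 = 1 remain plainly false, so the theory is not trivial.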

One could hardly ignore the examples of analysis and its special case, the calculus. For a model-theoretic approach to these see Mortensen (1990, 1995).

Now Meyer’s original approach to the natural numbers, that is R#, was axiomatic rather than model-theoretic. The axiomatic approach has also been taken in analysis by McKubre-Jordens and Weber (2012). In axiomatising analysis with a base of paraconsistent logic, their paper pushes Meyer’s approach to arithmetic via R# a long way further. These same authors (forthcoming) rework the theory of integration as it was in Archimedes’ hands, which employs the method of exhaustion, using paraconsistent reasoning. This gives a result “up to inconsistency”, which means that one is able to prove “Classical result or contradiction”. The classical result can then be seen to be recapturable by the classical move disjunctive syllogism applied to the classically-false (inconsistent) second disjunct.

It is certainly important and worthy to pursue this direction, but a mild caution is entered here: the axiomatic project is a bit different from inconsistent mathematics. As noted earlier, Meyer in this phase was consistentist – he sought a consistent theory with an inconsistency-tolerant logic. With similar motivation, he was also concerned to try to settle what he called “the gamma problem”, which was essentially the question of whether the axiomatic theory R# could be shown to contain classical Peano arithmetic as a sub-theory. If this were so, then his proof of nontriviality for R# would immediately yield a new proof of the negation consistency of classical Peano arithmetic! Note that this would not be contrary to Gödel’s Second Theorem, since presumably the proof of the gamma result would not be confined to finitary techniques. (In the case of Meyer’s theory, it turned out not to be so.)

There have proved to be many places throughout analysis where there are distinctive inconsistent insights. The examples in the remainder of this section are drawn from Mortensen (1995). For example: (1) Robinson’s (1974) non-standard analysis was based on infinitesimals, quantities smaller than any positive real number, as well as their reciprocals, the infinite numbers. This has an inconsistent version, which has some advantages for calculation in being able to discard higher-order infinitesimals. Interestingly, the theory of differentiation turned out to have these advantages, while the theory of integration did not. A similar result, using a different background logic, was obtained by da Costa (2000). (2) Another place to find applications of inconsistency in analysis is topology, where one readily observes the practice of cutting and pasting spaces being described as “identification” of one boundary with another. One can show that this can be described in an inconsistent theory in which the two boundaries are both identical and not identical, and it can be further argued that this is the most natural description of the practice. (3) Yet another application is the class of inconsistent continuous functions. Not all functions which are classically discontinuous are amenable to inconsistent treatment; but some are, for example f(x)=0 for all x<0 and f(x)=1 for all x≥0. The inconsistent extension replaces the first < by ≤, and has distinctive structural properties. These inconsistent functions may well have some application in dynamic systems in which there are discontinuous jumps, such as quantum measurement systems. Differentiating such functions leads to the delta functions, applied by Dirac to the study of quantum measurement. (4) Next, there is the well-known case of inconsistent systems of linear equations, such as the system (i) x+y=1, plus (ii) x+y=2. Such systems can potentially arise within the context of automated control.
Little work has been done classically to solve such systems, but it can be shown that there are well-behaved solutions within inconsistent vector spaces. (5) Finally, one can note a further application in topology and dynamics. Given a supposition which seems to be conceivable, namely that whatever happens or is true, happens or is true on an open set of (spacetime) points, one has that the logic of dynamically possible paths is open set logic, that is to say intuitionist logic, which supports incomplete theories par excellence. This is because the natural account of the negation of a proposition in such a space says that it holds on the largest open set contained in the Boolean complement of the set of points on which the original proposition held, which is in general smaller than the Boolean complement. However, specifying a topological space by its closed sets is every bit as reasonable as specifying it by its open sets. Yet the logic of closed sets is known to be paraconsistent, i.e., supports inconsistent nontrivial theories; see Goodman (1981), for example. Thus given the (alternative) supposition which also seems to be conceivable, namely that whatever is true is true on a closed set of points, one has that inconsistent theories may well hold. This is because the natural account of the negation of a proposition, namely that it holds on the smallest closed set containing the Boolean negation of the proposition, means that on the overlapping boundary both the proposition and its negation hold. Thus dynamical theories determine their own logic of possible propositions, and corresponding theories which may be inconsistent, and are certainly as natural as their incomplete counterparts.
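The contrast between the two negations can be made concrete on a tiny finite space. The sketch below (the three-point space and all names are illustrative assumptions, not drawn from the literature) uses a “line” X = {0, 1, 2} in which 1 is the boundary between the regions {0} and {2}: open-set negation (interior of the complement) leaves the boundary point satisfying neither p nor not-p, while closed-set negation (closure of the complement) makes both hold there.

```python
# A toy illustration (assumptions mine) of open-set vs closed-set negation,
# on a three-point "line" X = {0, 1, 2} with 1 the boundary point.

X = frozenset({0, 1, 2})
OPENS = [frozenset(s) for s in [(), (0,), (2,), (0, 2), (0, 1, 2)]]
CLOSEDS = [X - U for U in OPENS]          # X, {1,2}, {0,1}, {1}, {}

def interior(S):
    """Largest open set contained in S."""
    return max((U for U in OPENS if U <= S), key=len)

def closure(S):
    """Smallest closed set containing S."""
    return min((C for C in CLOSEDS if S <= C), key=len)

# Open-set (intuitionist) negation: interior of the Boolean complement.
p_open = frozenset({0})
not_p_open = interior(X - p_open)
print(sorted(not_p_open))                 # [2]
print(sorted(p_open | not_p_open))        # [0, 2]: point 1 satisfies neither -> incomplete

# Closed-set (paraconsistent) negation: closure of the Boolean complement.
p_closed = frozenset({0, 1})
not_p_closed = closure(X - p_closed)
print(sorted(not_p_closed))               # [1, 2]
print(sorted(p_closed & not_p_closed))    # [1]: on the boundary both p and not-p hold
```

The same space thus yields an incomplete theory when propositions live on open sets and an inconsistent one when they live on closed sets, which is the duality the text describes.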

On closed set logic and boundaries as a natural setting for contradictory theories, see Mortensen (2003, 2010). Weber and Cotnoir (2015) also explore the inconsistency of boundaries, arising from the incompatibility of the three principles (i) there are boundaries, (ii) space is topologically connected, and (iii) discrete entities can be in contact (i.e., no space between them). This is a very interesting problem, as all three are plausible; in particular there do seem to be boundaries in our world. An initially surprising feature of this account is that boundaries come out as “empty”; after all, null entities are contrary to the spirit of mereology. But this is not so shocking: it turns out that they are empty only in the sense that they have members inconsistently.

Category theory throws light on many mathematical structures. It has certainly been proposed as an alternative foundation for mathematics. Such generality inevitably runs into problems similar to those of comprehension in set theory; see, e.g., Hatcher 1982 (pp. 255–260). Hence there is the same possible application of inconsistent solutions. There is also an important collection of categorial structures, the toposes, which support open set logic in exact parallel to the way sets support Boolean logic. This has been taken by many to be a vindication of the foundational point of view of mathematical intuitionism. However, it can be proved that toposes support closed set logic as readily as they support open set logic, to date the only category-theoretic semantics for a paraconsistent logic. This should not be viewed as an objection to intuitionism, however, so much as an argument that inconsistent theories are equally reasonable as items of mathematical study. See Mortensen (1995, Chap. 11, co-authored with Lavers). This position has now been taken up, extended and ably defended by Estrada-Gonzales (2010, 2015a, 2015b). The same author (2016) undertakes to provide a category-theoretic description of trivial theories, with the aim of showing that triviality is not such an uninteresting feature for mathematical theories to have. The present author remains unconvinced, since a trivial theory is surely useless for mathematical calculation; but the ingenuity of the arguments must be conceded.

Duality between incompleteness/intuitionism and inconsistency/paraconsistency has at least two aspects. First there is the above topological (open/closed) duality. Second there is Routley * duality. The Routley Star * of a set of sentences S is defined as S* =df {A: ~A is not in S}. Discovered by the Routleys (1972) as a semantical tool for relevant logics, the * operation dualises between inconsistent and incomplete theories of the large natural class of de Morgan logics. Both kinds of duality interact as well, where the * gives distinctive duality and invariance theorems for open set and closed set arithmetical theories. On the basis of these results, it is fair to argue that both kinds of mathematics, intuitionist and paraconsistent, are equally reasonable.
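The definition S* = {A: ~A is not in S} is simple enough to compute directly on a toy language. The sketch below (the encoding of sentences as atom/negation pairs is my own illustrative choice) shows the dualising effect: an inconsistent theory containing both p and ~p is sent to a theory containing neither, i.e., an incomplete one, and applying * twice returns the original theory.

```python
# A small sketch of the Routley star on a propositional language of
# literals: a sentence is encoded as a pair (atom, negated?).

ATOMS = ["p", "q"]
SENTENCES = [(a, n) for a in ATOMS for n in (False, True)]

def negate(s):
    atom, negated = s
    return (atom, not negated)

def star(S):
    """S* = {A : ~A is not in S}."""
    return {A for A in SENTENCES if negate(A) not in S}

def show(name, S):
    pretty = sorted(("~" if n else "") + a for a, n in S)
    print(name, "=", pretty)

S = {("p", False), ("p", True), ("q", False)}   # inconsistent: contains p and ~p
show("S", S)                                    # S = ['p', 'q', '~p']
show("S*", star(S))                             # S* = ['q']: neither p nor ~p -- incomplete
show("S**", star(star(S)))                      # S** = S: the star is involutive here
```

Inconsistency with respect to p in S (both p and ~p present) becomes incompleteness with respect to p in S* (neither present), which is exactly the duality between inconsistent and incomplete theories the text describes.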

A very recent development is the application to explaining the phenomenon of inconsistent pictures. The best known of these are perhaps M. C. Escher’s masterpieces Belvedere, Waterfall and Ascending and Descending. In fact the tradition goes back millennia to Pompeii. Escher seems to have derived many of his intuitions from the Swedish artist Oscar Reutersvärd, who began his inconsistent work in 1934. Escher also actively collaborated with the English mathematician Roger Penrose. There have been several attempts to describe the mathematical structure of inconsistent pictures using classical consistent mathematics, by theorists such as Cowan, Francis and Penrose. As argued in Mortensen (1997), however, no consistent mathematical theory can capture the sense that one is seeing an impossible thing. Only an inconsistent theory can capture the content of that perception. This amounts to an appeal to a cognitive justification of paraconsistency. One can then proceed to display inconsistent theories which are candidates for such inconsistent contents. There is an analogy with classical mathematics on this point: projective geometry is a classical consistent mathematical theory which is interesting because we are creatures with an eye, since it explains why it is that things look the way they do in perspective.

Inconsistent geometrical studies are further developed in Mortensen (2002a), where category theory is applied to give a general description of the relationships between the various theories and their consistent cut-downs and incomplete duals. For an informal account which highlights the difference between visual “paradoxes” and the philosophically more common paradoxes of language, such as the Liar, see Mortensen (2002b).

More recently, inconsistent mathematical descriptions have been obtained for several classes of inconsistent figures, exemplified by Escher’s Cube (found in his print Belvedere), the Reutersvärd-Penrose triangle, and others. See Mortensen (2010).

Recently, an alternative technique for dealing generally with contradictions has emerged. Brown and Priest (2004) have proposed a technique they call “Chunk and Permeate”, in which reasoning from inconsistent premisses proceeds by separating the assumptions into consistent theories (chunks), deriving appropriate consequences, then passing (permeating) those consequences to a different chunk for further consequences to be derived. They suggest that Newton’s original reasoning in taking derivatives in the calculus was of this form. This is an interesting and novel approach, though it must meet the objection that to believe a conclusion obtained on this basis, one should believe all the premisses equally; and so an argument of the more common form, appealing to all the premisses without fragmenting them, should be eventually forthcoming. The objection is thus that Chunk and Permeate is part of the context of discovery rather than the context of justification.

Recently, Benham et al. (2014) have extended these methods to the Dirac delta function. This broadens the class of applications, and so strengthens the technique. However, it also becomes clear there that there is a close parallel between (one large class of) Chunk and Permeate applications, and (consistent) non-standard analysis: wherever Chunk and Permeate takes a derivative by shifting chunks to one where infinitesimals are zero, non-standard analysis takes a derivative by defining derivatives to be “standard parts only”. Of course, equivalence between these two techniques does not show which is explanatorily deeper. Developments are to be awaited with interest.

To conclude: quite a bit of philosophical material sympathetic to the cause of inconsistent mathematics has appeared lately. Colyvan (2000) addresses the issue that inconsistent mathematical theories imply inconsistent mathematical objects as their subject-matters. He also takes up the important task of providing an account of how inconsistent mathematics can have a branch which is applied mathematics. Priest (2013), like Colyvan, notes that inconsistent mathematics adds to the platonist mix. Berto (2007) usefully surveys paradoxes and foundational issues, and sets out some of the arithmetical results that bear on important philosophical issues like the Incompleteness Theorems. Van Bendegem (2014) pursues the interesting motivation that change is always a state of anomaly, so that always changing implies always anomalous. Examples include infinitesimals, complex numbers and infinity. Caution should be adopted over thinking that inconsistency is always anomalous, however, if only because it is simply more material for mathematical study.

It should be emphasised again that these structures do not in any way challenge or repudiate existing mathematics, but rather extend our conception of what is mathematically possible. This, in turn, sharpens the issue of Mathematical Pluralism; see e.g., Davies (2005), Hellman and Bell (2006), or Priest (2013). Various authors have different versions of mathematical pluralism, but the common idea is that incompatible mathematical theories can be equally true. The case for mathematical pluralism rests on the observation that there are different mathematical “universes” in which different, indeed incompatible, mathematical theorems or laws hold. Well-known examples are the incompatibility between classical mathematics and intuitionist mathematics, and the incompatibility between ZF-like universes of sets respectively with, and without, the Axiom of Choice. It seems absurd to say that ZF with Choice is true mathematics and ZF without Choice is false mathematics, if they are both legitimate examples of mathematically well-behaved theories.

The primary question for the philosophy of mathematics is surely what mathematics is. Duality operations like topological duality or Routley * reinforce the point that incomplete/inconsistent duals are equally reasonable as examples of mathematics. From this point of view, disputes about which of intuitionist or classical or inconsistent mathematics to accept seem pointless; they are all part of the subject matter of mathematics. This point is made effectively by Shapiro (2014; in contrast see his 2002). Shapiro’s distinctive position has other ingredients: mathematics as the science of structure, and mathematical pluralism implying logical pluralism (on logical pluralism see also Beall and Restall 2006); but we do not take these up here.

For what it is worth, the present writer thinks that some version of mathematical pluralism is obviously true, if one takes mathematics to be firstly about mathematical theories allowing for inconsistency, and only secondly about the objects internal to those theories. There is, of course, no problem about incompatible theories, as structures of propositions, co-existing. The primacy of theories fits, too, with the natural observation that the epistemology of mathematics is deductive proof. It is only if one takes as a starting point the primacy of the mathematical object as the truth-maker of theories, that one has to worry about how their objects manage to co-exist.