Prospectus for Equideductive Topology
Paul Taylor
July 2009 (revised February 2011)
This document concerns the way in which I see
equideductive logic
as (necessarily) the way forward for Abstract Stone Duality.
It is addressed to those who already have some familiarity with
previous work in this programme, in order to invite comments
on these ideas and how they should be pursued and promoted.
It also explains why certain avenues are either cut off
or dependent on investigating other topics first,
leading to a summary of the work to be done and its necessary order.
1 The current state of ASD
1.1 Two views of what "ASD" means
The original conception of ASD (and the reason for its name) was
that the idea of Marshall Stone's duality between algebra and
geometry could be captured by a category with an object Σ
for which the adjunction Σ^(−) ⊣ Σ^(−)
is monadic.
The inspiration came from the Lindenbaum-Tarski-Paré theorem
for set theory and elementary toposes,
and the hope was that this could eventually be applied to other subjects
besides general topology.
As research has developed over the years towards applications
in general topology and basic real analysis,
the programme has increasingly become associated with
its leading topological model, the category of locally compact spaces.
Most of this document acquiesces to the perception that
ASD has hitherto been about locally compact spaces.
According to this view, equideductive topology is then
straightforwardly a generalisation to all sober topological spaces.
However, this was, as I have said, not the idea, so
Section 5.4 reviews the ways in which the monadic
principle and other stronger categorical ideas may perhaps
be realised.
1.2 Applications and computation
ASD became more prominent in 2005 when, together with
Andrej Bauer, I presented
two papers at CCA
about the construction of the Dedekind reals, the Heine-Borel theorem
and the intermediate value theorem.
These papers appeared in refereed journals in August 2009 and August 2010.
From this work emerged some complete axioms
for the real line in the
form of a language based on quantifiers and a syntax for Dedekind cuts.
This calculus was clearly asking for a computational interpretation,
of which Andrej provided a
prototype.
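The computational content of a Dedekind cut can be sketched in a few lines: represent a real by its lower-cut predicate and recover rational approximations by interval bisection. This is only an illustration of the idea, under my own naming; it is neither the ASD calculus nor Andrej's prototype:

```python
from fractions import Fraction

def below_sqrt2(q):
    """Lower cut of sqrt(2): the rationals q with q < 0 or q^2 < 2."""
    return q < 0 or q * q < 2

def approximate(lower, lo, hi, n):
    """Bisect [lo, hi] n times, maintaining lower(lo) and not lower(hi),
    so the midpoint approximates the real that the cut describes."""
    for _ in range(n):
        mid = (lo + hi) / 2
        if lower(mid):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# 40 bisections of [0, 2] pin sqrt(2) down to within 2**-40.
x = approximate(below_sqrt2, Fraction(0), Fraction(2), 40)
```

The point of the exercise is that the cut predicate alone suffices to drive the computation; no prior representation of the real number is needed.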
Given the success in this direction,
and the considerable moral support that the CCA community gave me,
it is reasonable to ask why we have not vigorously pursued these applications.
For example, you might have expected me to have written about
differentiation and integration and not just continuous functions,
and Andrej to have released a working calculator.
With regard to the "textbook" development of analysis and general
topology, the reason is that those topics that could have been
developed within the existing framework would have produced
results that are very similar to the standard accounts.
Given that ASD completely rejects the "standard" set-theoretic
foundations of analysis, this outcome would actually have been
an indicator that it is sufficiently powerful for the job.
However, other people would inevitably have compared it solely
with the classical textbook accounts, and therefore unfairly.
Also, it is not clear how this line of investigation would have
found problems that genuinely challenge the strengths and merits
of the ideas of ASD.
(On the other hand, the investigation of the Heine-Borel and
intermediate value theorems raised many more points than either
I or others had imagined that they would.)
1.3 Shortcomings of the existing calculus
The chief reason for not building higher was that
it was clear to me that I could not do so without first
digging deeper.
Most obviously, a calculus that could only handle locally compact
spaces would have very limited applications, and in particular would
never be able to handle functional analysis or measure theory.
Besides this,
the calculus based on the monadic principle
was technically very difficult to use.
In particular, the construction of a new type must begin
with a guess for a new term, called a nucleus,
and any paper using this had to devote a great deal of space
to the proof that this term satisfies certain awkward equations.
The rule was therefore that only one nucleus could be introduced
in each paper.
Even I found this very taxing, and nobody else would have had the
patience or confidence to apply this technique.
In fact, the situation was even worse: computably based locally
compact spaces could not be studied on their own.
This is because the underlying recursion
necessarily required the use of equalisers (or equational hypotheses)
as well as powers of Σ.
Also, one could not speak of compact subspaces of non-Hausdorff spaces,
because these need not be locally compact.
For these reasons, the applications of ASD were developed using
an ad hoc extension that allowed equational hypotheses
in the contexts.
I also made tentative use of the notation
{x:A | ∀φ. □φ ⇒ φx}
for the compact subspace defined by a necessity operator □,
but only as a name for a construction that could be justified
independently, for example as a
closed subspace.
This modus operandi was justified by the fact that the
textbook categories of topological spaces and locales have
the required equalisers in the whole category,
whilst also having a subcategory of locally compact spaces
that has powers of Σ.
1.4 Implementation in Coq
During the summer of 2008,
Andrej Bauer,
Danko Ilik and
Matija Pretnar
worked on the translation of the part of the ASD calculus for R
into Coq.
It was natural to interpret at least the order, conjunction,
λ-abstraction and application on Σ by the
implication and connectives of Coq.
In this, the order-judgement-rule hierarchy of the existing
ASD calculus with its ad hoc addition of equational hypotheses
meant that implication could only be nested to depth three.
On my advice, they made no attempt to implement the
monadic calculus for defining new types.
Also, the theoretical justification within ASD for identifying the
proof-theoretic connectives of Coq with the lattice connectives
in Σ was not clear.
However, empirical investigations both on paper and within Coq
suggested that such an identification was the content of the
Euclidean principle, and that the emerging equideductive logic
was the natural form into which to translate proof rules of
a commonly used form,
including those that had been used to define the ASD language.
2 Equideductive categories
2.1 Reformulating equilogical spaces
Amongst my many previous attempts to extend ASD beyond local
compactness (see Section 5.4), the concrete ones often began
from Dana Scott's
equilogical spaces,
whilst the abstract ones
suggested that iterated equalisers and exponentials might be
normalised using partial equivalence relations.
I also often found myself wanting to use the notation
E = {x:X | ∀y:Y. f x y = g x y} ↣ X ⇉ Σ^Y
for the equaliser of two maps targeted at an exponential.
I had not been keen on equilogical spaces themselves because of
their reliance on (the logic of) the underlying sets of the spaces.
However, I repeatedly came back to the normalisation as PERs,
which involved an exactness property that is similar to the one
that I had been looking for, although weaker.
This meant that any construction of the kind that I wanted would
have to be some direct variation on equilogical spaces,
without using my monadic completion.
In other words, I needed to study equilogical spaces myself seriously.
In many of my attempted constructions the objects came in two parts,
rather than being represented by a single partial equivalence relation.
The quotient part may be encoded as a subspace in the exponential,
and subspaces are captured by the logic with ∀, ⇒ and =.
I sketched the construction of a CCC with equalisers,
identifying its exactness property,
and gave a summary in three lectures.
2.2 The definition of an equideductive category
While I was filling in the details of the construction of the CCC S,
the subcategory Q of objects with no quotient part became more and more
important.
For example, if I knew that Q had coproducts,
the construction of S would be less complicated,
as would its characterisation using the exactness property.
It would also be simpler if (abstract) sobriety
could be introduced as early as possible.
I therefore split the notes into two parts
and concentrated on the subcategory Q.
It is axiomatised as follows: it has
- all finite limits;
- a pointed object ∗:1→Σ in Q; and
- a full subcategory A ⊂ Q of urtypes; such that
- A ⊂ Q is closed under products;
- Σ and all powers Σ^A for A ∈ A exist in Q
and belong to A;
- Q has partial products based on the object Σ;
- Σ is injective with respect to all of the maps in the class M;
and
- there are enough injectives: every Q-object X
is the source of some M-map X → Σ^A with A ∈ A; where
- M is the class of monos defined by the partial products
and intersections; and
- all objects Γ ∈ Q respect the universal properties mentioned.
The desired simplifications are justified
because it can be shown from these axioms that
- all urtypes are sober (in my abstract sense);
- the adjunction Σ^(−) ⊣ Σ^(−) on A
is monadic, as in ASD;
- the product of any epi with any object is epi,
and epis form a factorisation system with the class M; and
- Q has stable disjoint coproducts (it is extensive).
2.3 Examples of equideductive categories
- Sets (any elementary topos);
- sober topological spaces in the textbook sense;
- countably based sober topological spaces;
- not locales;
- the term model of the syntax; and
- domain- and proof-theoretic interpretations of the language of
proofs in the logic.
Locales fail because products do not preserve epis.
This can be seen in terms of the "linear algebra" of sup-lattices,
whose tensor products do not preserve monos.
The abstract CCC that is constructed from sober topological spaces
is the subcategory of Scott's equilogical spaces that is generated
by exponentials and equalisers. This leaves out some of the less
topological equivalence relations that one might put on the points
of a topological space.
Starting instead from countably based sober topological spaces,
we obtain a similar subcategory of
Alex Simpson's category QCB (quotients of countably based spaces).
Equideductive logic is therefore compatible both with Simpson's
own work on this category and Matthias Schröder's application
of it to Klaus Weihrauch's TTE.
2.4 Affine varieties
Postscript (Feb 2011): I don't think this topic will go anywhere.
The category of affine varieties is the formal opposite of that
of rings,
by which I mean here commutative associative algebras with a unit
in the symmetric monoidal closed category of
locally convex topological vector spaces,
in which the hom-objects carry the weak* topology.
Using topological vector spaces, the existence of the partial products
needed for equideductive logic depends on the Hahn-Banach theorem.
This would fail in the same way as it does for locales if we tried
to generalise to affine varieties over a ring: the modules have
to be flat.
For a categorist, the easiest way to handle topological vector
spaces is by using
Mike Barr's Chu construction
based on discrete vector spaces.
In this, my notion of affine variety is encoded as (M,R),
where R is a ring (discrete commutative algebra over K=R or
C) and M is an R-module,
whose elements we think of as the "continuous" linear maps R→ K.
I haven't worked out the details of the representation,
but the morphisms (M,R)→(N,S) are pairs f:S→ R and g:M→ N
satisfying a Chu-like condition.
Not having any background in algebraic geometry,
I do not intend to elaborate this application significantly myself.
However, I would like to spell out the categorical details of
the construction to the extent that it remains analogous to
the topological case and therefore provides a justification of
the underlying structure.
I would then like to encourage a better qualified person to
develop a synthetic account of algebraic geometry
analogous to mine for general topology.
The same construction would also be of interest as a model of full
linear logic, in particular perhaps to
those
who are using these
methods from theoretical computer science to study quantum mechanics.
2.5 The Yoneda embedding
All of the previous constructions of cartesian closed extensions of
topological spaces and locales, and also schemes, lie within the
(possibly illegitimate) category of presheaves over the original
category.
Briefly, an equideductive category is one that lies nicely
within its category of presheaves. In particular, any equaliser
of a pair of maps between a
representable and the exponential of a representable is again
representable. This happens for topological spaces and affine
varieties but (probably) not for locales.
In contrast to my previous comments on this
subject, this means that the categories of sober topological spaces
and of affine varieties have natural boundaries, even when one
would like to have a CCC available.
The point is that they are not only simpler to use but also closed
under all of the "simpler" type constructions,
that is, apart from those that are necessarily more complicated.
This may be seen as an explanation of the utility of (sober)
topological spaces and the relative lack of it for locales.
The role of my construction alongside the Yoneda embedding
is to identify the smallest CCC of presheaves,
ie the category generated by equalisers and exponentials.
The limits, exponentials, coproducts and epis (but not the coequalisers)
from an equideductive category agree with those in the CCC.
Arguably, it may be advisable to present these ideas in a purely
categorical way, without the logic, in order to make them accessible
to other mathematicians who use category theory but not λ-calculi,
such as algebraic geometers.
2.6 Another Chu construction
Postscript (Feb 2011): one construction of the enclosing CCC
is as a coreflective subcategory of a Chu-like category;
this suggests a categorical explanation of some of the more
unpleasant logical formulae that arise with equilogical spaces.
In another representation of S, the objects are (X,A),
where X,A ∈ Q, but A is an algebra for the Σ²-monad.
Morphisms are again Chu-like pairs.
I envisage applying this to the case of affine varieties and comparing
it with the notion of scheme in algebraic geometry.
If my notion is sufficiently similar, it would have the advantage
of being represented in a substantially simpler way,
with no need for Grothendieck toposes.
The exponential Σ^(X,A) is (A,B),
in which the old algebra becomes the new space,
which is what I had envisaged for a generalisation to
other Stone-like dualities between algebra and geometry.
Some time ago, I had guessed that B would be the free
algebra Σ²X on X, but this is not correct.
Unfortunately, the function-spaces in my CCC involve very complicated
equideductive expressions.
This is also a problem with other constructions of the CCC,
and mine may actually be a little simpler,
but I feel that some axiom is missing.
2.7 The analogy with complex numbers
We may think of locally compact spaces, which admit powers of Σ
but not equalisers, as like the positive real numbers, which have
square roots but not subtraction.
Sober topological spaces and my category Q gain equalisers at
the cost of exponentials, just as the whole real line gains subtraction
but loses square roots.
The opposite of the Kleisli category over Q plays the role of
the purely imaginary numbers.
Finally, the whole CCC, with both equalisers and exponentials, is like
the whole complex plane, with both subtraction and square roots.
The utility of the complex numbers, from the point of view of the
arithmetic of the reals, is that we may invoke them temporarily
to facilitate computation (for example to find the real roots of
a cubic equation when there are three of them, or to do Fourier
analysis of an audio signal), but this is a conservative
extension, because the same result could have been obtained without
them, albeit less easily.
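The cubic example can be made concrete: Cardano's formula for t³ + pt + q = 0 with three real roots (the casus irreducibilis) forces a detour through complex arithmetic, whose imaginary parts cancel at the end. A sketch using Python's cmath (the function name and the test cubic are my own choices):

```python
import cmath

def real_roots_depressed_cubic(p, q):
    """Real roots of t^3 + p t + q = 0 with three real roots,
    found by Cardano's formula via a detour through C."""
    disc = (q / 2) ** 2 + (p / 3) ** 3           # negative: casus irreducibilis
    u = (-q / 2 + cmath.sqrt(disc)) ** (1 / 3)   # principal complex cube root
    omega = cmath.exp(2j * cmath.pi / 3)         # primitive cube root of unity
    roots = []
    for k in range(3):
        uk = u * omega ** k
        t = uk - p / (3 * uk)                    # imaginary parts cancel here
        roots.append(t.real)
    return sorted(roots)

# t^3 - 7t + 6 = (t - 1)(t - 2)(t + 3): three real roots, yet the
# formula cannot avoid complex intermediate values.
roots = real_roots_depressed_cubic(-7.0, 6.0)
```

The complex numbers are invoked only temporarily: the inputs and the three results are real, exactly as in the conservativity argument above.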
Likewise, we invoke more general kinds of topological space,
including the cartesian closed extension, as ways to facilitate
computations with real or integer results.
However, with the available cartesian closed extensions,
I feel that we are in the situation of knowing how to manipulate real
numbers and imaginary numbers, and how to add them together
to get complex numbers, but not how to multiply complex numbers.
3 Equideductive logic
3.1 What sort of logic do we expect?
The appealing feature of the fragment of
the ASD language for elementary real analysis
is the extent to which it resembles
familiar use of the language of predicate calculus.
The aim of this programme is therefore to find some more expressive
language that is as close as possible to ordinary mathematical usage.
In particular it should allow the definition of subspaces using the same
idiom as that for subsets.
(It is almost impossible to get mainstream mathematicians to engage
in any discussion of foundations, but my impression of what they mean
when they say that they use set theory as foundations is that they
use the subset-forming notation {x ∈ X|p(x)}.)
The deal, therefore, that this language would offer ordinary mathematicians
is that its types would automatically carry "the right" topology
and its terms would automatically be continuous (and, in principle, computable)
so long as certain restrictions on the use of the symbolism
are obeyed.
Postscript (Feb 2011): ASD gave "the right" topology on R
because of the monadic principle. However, this is not a consequence of
equideductive logic. Whilst it can be expressed and is consistent, this
expression is not a natural one. Some stronger theory of the kind about
which I speculate below is therefore still needed.
3.2 Equideductive logic
Equideductive logic is like the familiar predicate calculus:
it has an underlying object language,
together with predicates over the object urtypes.
The object language is the sober λ-calculus,
whose urtypes are formed using N, × and Σ^(−).
(Semantically, these are locally compact spaces.)
The terms are formed using the usual connectives associated with these
urtypes, together with focus, but with the restriction that the body
of a λ-term must be of some urtype Σ^X.
The predicates are generated from T and equality using &,
⇒ and ∀, subject to the variable-binding rule
that all free variables that occur on the left of ⇒ must be
bound by the next ∀.
The rules of the underlying sober λ-calculus must be re-expressed
in this logic, in particular using the extensionality rule
(λx.φx) = (λx.ψx)   iff   ∀x.(φx = ψx).
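The variable-binding restriction on predicates can be caricatured as a syntax check. The following toy sketch is one possible reading of the rule, with terms simplified to bare variable names; all the names here are mine, not part of the calculus:

```python
from dataclasses import dataclass

# Toy syntax for equideductive predicates: T, equations between
# variables, conjunction &, and the combined form "forall xs. ante => body".

@dataclass(frozen=True)
class Truth:                 # T
    pass

@dataclass(frozen=True)
class Eq:                    # s = t (terms simplified to variable names)
    lhs: str
    rhs: str

@dataclass(frozen=True)
class And:                   # p & q
    left: object
    right: object

@dataclass(frozen=True)
class ForallImp:             # forall xs. ante => body
    xs: tuple
    ante: object
    body: object

def free(p, bound=frozenset()):
    """Free variables of a predicate."""
    if isinstance(p, Truth):
        return frozenset()
    if isinstance(p, Eq):
        return frozenset({p.lhs, p.rhs}) - bound
    if isinstance(p, And):
        return free(p.left, bound) | free(p.right, bound)
    b = bound | set(p.xs)
    return free(p.ante, b) | free(p.body, b)

def well_formed(p):
    """One reading of the binding rule: every variable free in an
    antecedent must be bound by the quantifier immediately before =>."""
    if isinstance(p, (Truth, Eq)):
        return True
    if isinstance(p, And):
        return well_formed(p.left) and well_formed(p.right)
    return (free(p.ante) <= set(p.xs)
            and well_formed(p.ante) and well_formed(p.body))
```

For example, ∀y.(y = y) ⇒ T passes the check, whereas a bare implication with x free on the left, (x = x) ⇒ T, does not.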
Having defined the predicate calculus on urtypes,
we may introduce types that are formed using the
"comprehension" (subset-forming) notation from set theory.
Semantically, these are sober topological spaces.
The types are the objects of a category, in which the morphisms
are tracked by terms between injective urtypes.
Postscript (Feb 2011): there is now a draft paper on
Equideductive Categories and their Logic.
The exponential urtypes in the object language allow us to use an old idea
from higher order logic to define an existential quantifier.
This agrees with the epis in the category, which have to be
preserved by products, but not pullbacks.
In the logic, this quantifier only obeys a weak Frobenius (distributive)
law in which the extra conjunct must have distinct variables;
substitution and cut are not allowed.
Postscript (Feb 2011): there is now a draft paper on the
Existential Quantifier in Equideductive Logic.
3.3 The Euclidean principle and the quantifiers
As the tentative investigations with Coq in Section 1.4
suggested, in topology (and set theory, but not algebraic geometry)
we need an axiom to identify equideductive implication with
the lattice order on Σ.
Similarly, the conjunction & should agree with ∧ in Σ.
More fundamentally, any term φx:Σ is identified
with the predicate p(x)=(φx=T).
In equideductive logic, the quantifiers may range over any type,
yielding a predicate.
In ASD, on the other hand, ∀ may only range over a compact
space and ∃ over an overt one, giving a term.
This difference has a beautiful explanation in equideductive logic: a
compact or overt space is exactly one for which the equideductive
quantifier is tracked by a term of higher type.
The language that results from this has three levels:
- the arithmetic of N and R;
- terms of type Σ or Σ^X, which serve both as
continuous functions and as open or computable subspaces; and
- equideductive predicates, which serve as general subspaces.
As in ASD, the quantifiers that range over compact or overt subspaces
stay within the second level, but those that violate this rule,
together with implication, pass to the outermost level.
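In a toy computational reading, a finite space is both compact and overt, so both quantifiers over it are tracked by terms of higher type: they take an open φ : X → Σ (modelled here by bool) and land back in Σ. A sketch with names of my own choosing:

```python
# A finite space is both compact and overt, so both quantifiers over it
# are tracked by terms of higher type Sigma^(Sigma^X): each consumes an
# "open" phi : X -> Sigma (modelled as bool) and returns a truth value,
# staying within the second level of the language.
X = range(5)                               # a 5-point discrete space

def exists_X(phi):                         # overtness: exists as a term
    return any(phi(x) for x in X)

def forall_X(phi):                         # compactness: forall as a term
    return all(phi(x) for x in X)
```

By contrast, a quantifier over an infinite non-compact space such as N has no such term of higher type, and the quantified statement must pass to the outermost, predicate level.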
Postscript (Feb 2011): there is now a draft paper on
Equideductive Topology.
3.4 A calculus for the CCC
My version of the equilogical space construction produces a CCC
(with general exponentials YX, not just ΣX)
that also has equalisers, and a certain exactness property.
All of these features may be embodied in a new type theory
(proper or internal language).
In particular, terms of equaliser type are formed using admit.
There is another operation called Admit,
whose introduction rule captures the exactness property.
However, this is rather complicated,
the more so because the existing models require side conditions on the types.
Nevertheless, it is a step towards an abstract axiomatisation
instead of working directly with the construction of a model.
3.5 A robust proto-logic
The basic system described above provides both a CCC with equalisers
and a nice subcategory. The latter already exhibits many of the
features of the category of sober topological spaces, in particular
the special role of locally compact spaces within it.
The formalism is a sub-system of existing type theories,
and in particular readily and naturally translatable into Coq.
Although there are many properties that one might desire but that this
structure does not have, the language is sufficiently expressive to
allow these extra properties to be formulated in it.
Altogether, the considerations in this document lead to the
conclusion that, whatever the future directions of development
of ASD may be, equideductive logic is the only formalism in which
this can be done.
Moreover, any additional axioms that may be proposed for ASD
will be built on top of equideductive logic,
instead of requiring another foundational revolution.
4 Well-pointed developments
One possible way forward would be to focus on the models provided
by sober topological spaces or countably based such spaces.
Since this is consistent with, indeed very close to, the orthodox theory,
it has the advantages that it should be more readily acceptable to
a wider mathematical audience
(like A lambda calculus for real analysis)
and that the applications agenda should be more clearly laid out.
4.1 Replacing the monadic λ-calculus
It appears that the urtypes within any equideductive category
are closed under Σ-subspaces,
and likewise the equideductive category within the CCC.
The monadic λ-calculus (at least, with the view of ASD
that it was about locally compact spaces) is therefore translatable
into equideductive logic.
However, a review of the applications that have been made of
this so far in the programme would be advisable.
Postscript (Feb 2011): sobriety is built into equideductive
logic, but not the monadic property, which has to be added.
You may then ask whether the effort that was put into justifying
the nucleus for the Dedekind reals
was wasted, given that the subspace of Dedekind cuts may now
be constructed much more easily.
In fact, this Lemma will be needed to prove a different Theorem,
namely that the interval [0,1] so constructed is compact.
4.2 The underlying set functor
An earlier partially successful extension of ASD postulated an
underlying set functor,
that is, a right adjoint to the inclusion of the full subcategory
of overt discrete spaces.
This has the effect of making this subcategory into an elementary topos,
so that the abstractly axiomatised category can be compared with
the traditional ones built from set theory.
However, this work came to a halt because of the lack of suitable models
beyond the locally compact case.
In the light of equideductive logic, this idea could now be
reformulated as an abstract characterisation of the textbook category
of sober topological spaces over an elementary topos.
The symbolic language that embodies this characterisation would
combine those for ASD, equideductive logic and the underlying set
functor. In particular, there would be an axiom to say that the
counit |X|→ X from the "underlying set" of an object X is epi.
The original objective of this investigation was to measure the
precise difference between the computable theory of ASD
and textbook general topology with the strength of set theory.
That is, to show what aspects of the traditional theory have to
be sacrificed in the computable setting.
4.3 Topology with the Phoa principle
The "middle tier" of ASD exploits the precise lattice duality
between open and closed ideas that exists in the theory that
includes the lattice structure on Σ and the Phoa principle,
but not Scott continuity.
Quite apart from the replacement of the monadic λ-calculus
with equideductive topology as the foundations of ASD, a new account
is needed of this part.
This is because the existing account consists of the very
first published paper,
another that was written at the same
time but fell foul of the referees,
and numerous ad hoc treatments distributed throughout the
other papers.
4.4 Towards a textbook
I would like to stress first that I do not intend to write
a textbook at this stage.
The ideas still need to be tried out in pieces in journal papers
before they are ready to be put together in a textbook.
It is also still by no means clear what topics from general topology
should be covered.
The greatest impediment to a clear treatment of ASD has always
been the need to explain it in terms of traditional topology, so
I envisage a textbook being written "in the target language".
It would introduce the axioms progressively throughout the book,
starting with those for the (natural and) real numbers.
Since the cartesian closed category is not only a technically
conservative extension but also a relatively minor modification
of the equideductive formulation of sober topological spaces,
it could be introduced in a similar way to the complex numbers
(§2.7).
That is, as a temporary diversion into a stronger theory,
which is then eliminated when we obtain the base-type results
of the computation.
The real line and other types would be introduced in the first
instance as primitive. Their "constructions" from N
could be given later in the form of proofs within the language
that these primitive types are isomorphic to others that
are constructed.
Likewise, the demonstration that sober topological spaces over
a topos provide a model could be presented together with the
underlying set axiom.
4.5 Machine implementation
This could be built on the preparatory work that Matija Pretnar
did in 2008. Topological constructions would be done on top of this.
In particular, the computational implementations
of the Dedekind reals have not necessarily followed
the theoretical constructions.
There shouldn't be such a disparity.
The programming should feed back into the mathematics,
in particular to give a theoretical construction
that is the same as the implementation.
Such an implementation would not, however, include the underlying set
functor or the well-pointedness axiom, because these are not computable.
It would therefore be consistent with the "pointless" work below.
5 "Pointless" topology
5.1 What are locales for?
Locale theory is famous for its treatment of general topology without
the axiom of choice, and for the existence of locales without enough
points. However, apart from a single piece of work by
Alex Simpson
on the locale of random numbers, I know of no programme of research that
actually exploits ill-pointed locales for applications in other parts
of mathematics.
The potential "pointless" applications that I have in mind would be
to probability, measure and distribution theory, and to the Fourier
duality.
However, since I now know that locale theory and equideductive logic
are incompatible, if the latter is going to be able to contribute
to pointless applications, this will have to be by finding models
from other sources besides locale theory and traditional topology.
On the other hand, affine varieties are similar to locales but
do satisfy equideductive logic, so should provide another useful
testing ground for additional axioms.
5.2 Realisability models
Besides its reliance on the logic of the underlying sets of spaces,
I am uncomfortable with equilogical spaces because of the way in
which they use equality on the left hand side of an implication.
Consider a physical example. If a key and a lock are manufactured
together then intensionally one fits the other, and the lock
operates. But in practice, the key need only fit the lock within a
certain non-zero tolerance.
We would expect, therefore, that, if some conclusion is asserted
to follow from an equation, then in fact it would still happen
within some tolerance. Or, if it is to follow from equality
of two infinite sequences, then in fact only finitely many terms
need agree.
The simplicity of the equideductive calculus allows us to regard
an implication between equations as asking for a proof.
Like other computational objects, this proof cannot in fact
demand the strict validity of its antecedent, but only some
finite part of it.
Since this predicate calculus is so simple, there are many proof-theoretic
ideas that we may apply to it.
In particular, proofs of & and ∀⇒
may be encoded using pairing and λ-abstraction
(realisability).
This suggests that they may be interpreted in the object language.
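The pairing and λ-abstraction encoding can be shown in a few lines of BHK-style realisability. This is a toy sketch, with names of my own invention, not a claim about the eventual interpretation:

```python
# BHK-style realisability for the equideductive connectives, as a toy:
# a proof of  p & q  is a pair of proofs, and a proof of
# forall x. p(x) => q(x)  is a function sending x and a proof of p(x)
# to a proof of q(x).

def and_intro(proof_p, proof_q):
    return (proof_p, proof_q)        # pairing realises &

def and_elim_left(proof_pq):
    return proof_pq[0]               # projection realises & elimination

# A realiser for  forall x. (p(x) & q(x)) => (q(x) & p(x)):
def swap(x, proof):
    return (proof[1], proof[0])
```

Because the realisers are themselves λ-terms, they are candidates for interpretation in the object language, which is what the paragraph above suggests.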
The work that I, amongst others, did in the 1980s and 90s on
domain-theoretic models of polymorphism should play some role here.
On the other hand, I would now do domain theory with ASD,
so there should be some direct translation of the language of
proofs of the predicates into the underlying λ-calculus.
It seems to me that it is necessary to identify proof-theoretic
Scott continuity as the first step towards developing
pointless interpretations and extensions of equideductive logic.
I have in mind a compact hypothesis principle,
that if (∀x:A. p(x) ⇒ φx y) ⇒ ψy
is provable then there is some compact subspace
K ⊂ {x:A | p(x)} with necessity operator □
such that □φ ⇒ ψ.
5.3 Higher recursion theory
The various cartesian closed extensions of topological spaces have in
the past been used to investigate the semantics of higher recursion
theory.
One might therefore expect new models of equideductive logic that
are themselves based on proof-theoretic ideas to throw new light
on the topologies of the hierarchies
N, N^N, N^(N^N), ...
and R, R^R, R^(R^R), ...
5.4 Richer categorical properties
As I said at the beginning of this document, the original idea of
ASD was a hypothesis on the structure of a category.
This hypothesis was successful in the case of elementary real analysis,
where it provided a recursive model with the Heine-Borel
theorem, contrary to received wisdom.
The hope was that the same categorical intuition would yield
further examples of "the right" topology, for example
in higher-type recursion theory.
Over many years, I have attempted to find models of this hypothesis
based on locales or equilogical spaces, and also to demonstrate
consistency of the hypothesis by abstract proof theoretic methods.
However, this search has been fruitless.
The setting of equideductive logic makes it substantially clearer
what needs to be done, either semantically or syntactically,
to satisfy the original conjecture, if this is possible.
Indeed it helps to formulate what that conjecture ought to be.
However, it does not solve the problem.
We find ourselves working with equideductive predicates that become
more and more complicated.
This is, in the analogy, like knowing how to add but not to multiply
complex numbers.
It is clear that some simplifying axiom (analogous to distributivity
of multiplication over addition) is needed.
I have some guesses what it may be, but have limited faith in them,
and they are useless without some proof of consistency.
These axioms are plainly never going to emerge from purely semantic
ingredients (locales, equilogical spaces à la Scott,
presheaves, ...). So the only way forward is to investigate the
proof-theoretic models first.
6 Plan of work
As a result of the arguments that I have set out in this document,
it seems to be necessary to work on topics in the following order.
Equideductive categories, examples and logic:
- The construction of the CCC, taking equideductive logic and
the properties of equideductive categories as given,
the exactness property and the characterisation of such CCCs,
the type theory that captures the exactness property,
the equivalent Chu construction and the formulation of the
Euclidean principle.
- Equideductive categories as those that "lie nicely" within
their Yoneda embedding, and the example of affine varieties.
- Equideductive logic.
- Review of the uses of the monadic λ-calculus in ASD,
and its replacement with equideductive logic.
Well-pointed equideductive topology:
- New account of topology with the Phoa principle,
based on this, including
general recursion.
- Compact and overt subspaces, proper and overt maps
in equideductive topology.
- Characterisation of the category of sober topological
spaces over an elementary topos using the underlying set
functor and a well-pointedness axiom.
"Pointless" equideductive topology:
- Domain-theoretic models and realisability structures
for proof-relevant equideductive logic.
- Applications to higher-type recursion theory and
categorical exactness.
- Applications to functional analysis.
This is
www.PaulTaylor.EU/ASD/equdcl/prsp2009.html
and it was derived from ASD/prsp2009.tex
which was last modified on 20 February 2011.