Topology and Logic Asymmetric Topology in Computer Science

Steve Matthews Department of Computer Science University of Warwick Coventry, CV4 7AL, UK [email protected] with Michael Bukatin (Nokia Corporation, Boston) and Ralph Kopperman (CCNY)

AMS Fall Central Sectional Meeting University of Akron, Akron Ohio October 20th-21st, 2012

Acknowledgments

- May I thank the organisers of the Special Session on A Survey of Lattice-Theory Mathematics and its Applications for their kind invitation to speak today.

- On the occasion of his 70th birthday may I acknowledge the outstanding contribution and role model of Professor Ralph Kopperman in the progress of asymmetric topology.

Abstract Let us take topology to be the study of mathematical properties preserved by continuous deformation, and logic to be the study of truth through deductive reasoning in a formal language. As continuity depends on the infinitary notion of limit, and deductions are finite, there seems to be little commonality between these subjects. But today’s pervasive influence of computers in mathematics necessitates that topologists understand how continuous deformation is to be programmed, and conversely that computer programmers have more access to topology to model their computations. Today’s talk will survey how topology & logic have been steadily growing closer during the past century, and then present our own research upon developing a middle way for topology and logic.


Contents

- Metric spaces
- Topological spaces
- Logic via Topology (Scott)
- Topology via Logic (Vickers)
- Many valued truth logic
- Non zero self distance
- Asymmetry for metric spaces
- Partial information
- Negation as failure
- Failure takes time
- No such thing as a free lunch
- Discrete partial versus fuzzy metrics
- Conclusions and further work
- Further reading

Photo: © David Kopperman

Metric Spaces

Definition (1)
A metric space (Fréchet, 1906) is a tuple (X, d : X × X → [0, ∞)) such that:
- d(x, x) = 0
- d(x, y) = 0 ⇒ x = y
- d(x, y) = d(y, x) (symmetry)
- d(x, z) ≤ d(x, y) + d(y, z) (triangle inequality)

Maurice Fréchet (1878-1973)

Example (1)
| · − · | : (−∞, +∞)² → [0, ∞) where |x − y| = x − y if y ≤ x, and y − x otherwise.
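As a quick sanity check (a Python sketch of our own, not from the slides), we can test the four axioms of Definition (1) for the absolute value metric of Example (1) on a small sample of reals:

```python
import itertools

def d(x, y):
    # The usual metric on the reals: |x - y|, written as in Example (1)
    return x - y if y <= x else y - x

# Spot-check the four metric axioms on a small sample of reals.
sample = [-2.5, -1.0, 0.0, 0.5, 3.0]
for x, y, z in itertools.product(sample, repeat=3):
    assert d(x, x) == 0                      # self-distance is zero
    assert (d(x, y) != 0) or (x == y)        # indistancy implies equality
    assert d(x, y) == d(y, x)                # symmetry
    assert d(x, z) <= d(x, y) + d(y, z)      # triangle inequality
```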

Metric Spaces

- As with most mathematics there is an unquestioned philosophical identification in the theory of metric spaces between ontology and epistemology. That is, all that exists in a metric space is presumed knowable (by examining distances), and all that is knowable (by distance) is presumed to exist.

- Hence in the time of Fréchet it was eminently sensible to axiomatize self distance by d(x, x) = 0 and d(x, y) = 0 ⇒ x = y.

- The past century has taught us that some problems are undecidable (i.e. there are more truths than there are sound proofs), and more recently that fallacies (i.e. believed but unsound reasoning), such as might arise in large scale data processing, can pass us by unnoticed.

- How well prepared are metric and topological spaces for the information age?

Metric Spaces

The ideal of logic for ascertaining perfect truth in Star Trek’s portrayal of the 23rd century is a wonderful caricature of how we appreciate undecidable problems and fallacies in the information age of the 20th-21st centuries. Captain Kirk arbitrates between his closest confidants, the totally logical Mr Spock and the passionate Dr McCoy. Mr Spock’s reasoning is infallible but as a result incomplete. Dr McCoy’s reasoning is emotional, daring, wider reaching, but as a result fallible. Just as Kirk needs Spock & McCoy to explore the galaxy, so we need to bring together the infallibility of logical reasoning with the abstract expressiveness of (say) topology.

Topological Spaces

Definition (2)
A topological space is a pair (X, τ ⊆ 2^X) such that:
- {} ∈ τ and X ∈ τ
- ∀A, B ∈ τ . A ∩ B ∈ τ (closure under finite intersections)
- ∀Ω ⊆ τ . ⋃Ω ∈ τ (closure under arbitrary unions)

Each A ∈ τ is termed an open set of the topology τ, and each complement X − A a closed set.

Definition (3)
For each topological space (X, τ) a basis of τ is a set Ω ⊆ τ such that each member of τ is a union of members of Ω.

Example (2) The finite open intervals (x, y ) are a basis for the usual topology on (−∞, +∞) .

Topological Spaces

Definition (4)
For each metric space (X, d), a ∈ X, and ε > 0 the open ball B_ε(a) = {x ∈ X : d(x, a) < ε}.

Lemma (4)
For each metric space (X, d) the open balls form the basis for a topology τ_d over X. Ontology ≡ epistemology in the sense of Hausdorff separation (i.e. T2):
a ≠ b ⇒ ∃ε, δ . B_ε(a) ∩ B_δ(b) = {}
(distinct points can be separated by disjoint neighbourhoods)

Example (3) Let d be the metric on the set {F , T } of truth values false & true such that d(F , T ) = 1 . Then τd = 2{F ,T } , and {{F }, {T }} is a basis for τd .

Topological Spaces

- Approximation in point set topology by neighbourhoods is approximation of a totally known limit point by totally known open sets.

- So, is not an open set such as an open ball B_ε(a) an approximation for each and every point x ∈ B_ε(a)? The answer would be YES! if the ontology and epistemology of naive set theory were consistent & synonymous.

- Russell’s paradox of 1901 argues that if we could define R = {x | x ∉ x} then R ∈ R ⇔ R ∉ R.

- This paradox (i.e. self contradiction), known to Russell et al., led to our understanding of incompleteness in logic and computability theory.

Russell in 1916

Topological Spaces

- And so Russell’s paradox of 1901 was subsequently addressed through restriction to consistent truths (e.g. typed sets) and exclusion of known contradictions.

- Later, large scale information processing such as second world war codebreaking introduced the possibility of contradictions unknowable & unavoidable in practice although knowable in theory. Topological spaces thus need the following development.

- Neighbourhood Approximation (i.e. open sets) ⇒ Consistent Approximation (e.g. T0 separable) ⇒ Inconsistent Approximation (Topology with mistakes)

For some researchers topology has always been & will always be the study of Hausdorff separable neighbourhoods. Now we argue a case for the further generalisation of topology through to inconsistency.

Topological Spaces

Definition (5)
A topological space (X, τ ⊆ 2^X) is T0 separable if
a ≠ b ⇒ ∃O ∈ τ . (a ∈ O ∧ b ∉ O) ∨ (b ∈ O ∧ a ∉ O)
(distinct points can be separated by a neighbourhood)

Lemma (5)
T2 ⇒ T0

Inspired by Dana Scott in the 1960s, asymmetric T0 separable topology became meaningful.

Definition (6)
The information ordering (in the study of consistent approximation) is
a ⊑ b ⇔ (∀O ∈ τ . a ∈ O ⇒ b ∈ O)
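The information ordering of Definition (6) can be computed directly from a finite topology. A small Python sketch (our own illustration, not from the slides), using Sierpiński space as the classic T0-but-not-T2 example:

```python
def specialization_order(points, opens):
    """a ⊑ b iff every open set containing a also contains b (Definition 6)."""
    return {(a, b) for a in points for b in points
            if all(b in O for O in opens if a in O)}

# Sierpiński space: X = {0, 1}, opens = {}, {1}, {0, 1}.
X = {0, 1}
tau = [set(), {1}, {0, 1}]
order = specialization_order(X, tau)

# 0 ⊑ 1 (every open containing 0 also contains 1) but 1 ⋢ 0:
# the ordering is asymmetric, as T0 separation allows.
assert (0, 1) in order
assert (1, 0) not in order
```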

Topological Spaces

- a ⊑ b is read as compute up from a to b. This is a model for consistent approximation. No mistakes are possible, and there is no way to undo a computation. E.g. there is no way to return from b down to a.

- This model is consistent with the 1901 time of paradoxes, where contradictions could be managed by exclusion. It was just viable in the 1970s, where fallible programmers of mainframe computers could be made responsible for managing their mistakes. Now try telling an iPod user that they can never make a mistake and there is no such thing as an undo option in their favourite app.

- Sadly consistent approximation does not scale up to meet today’s demands for inconsistency.

Topological Spaces

Hausdorff separability was taken for granted until the 20th century story of incompleteness, Bletchley Park code breaking, and Computer Science developed.

- ∀x . x = x (identity)
- ∀x, y . x = y ⇒ y = x (symmetry)
- ∀x, y, z . x = y ∧ y = z ⇒ x = z (transitivity)

Machine Room, Hut 6, 1943

T2 separability is consistent with a two valued logic of truth (false and true). As a means of mathematically describing many structures in our natural 3D world, metric spaces remain a powerful tool. However, increasingly within our everyday world, people need to be as aware of how such structures are to be effectively represented within a computer as of their inherent nature.

Many Valued Truth Logic Adding Unknown and Contradiction to Truth Valued Logic

      ⊤
     / \
    F   T
     \ /
      ⊥

⊤ (pronounced top in lattice theory and both in four valued logic) introduces the possibility of an overdetermined mathematical value. F, T (pronounced false, true) typify two well defined consistent distinct values in a mathematical theory. ⊥ (pronounced bottom in lattice theory and neither in four valued logic) introduces the possibility of an underdetermined mathematical value.

Many Valued Truth Logic Adding Unknown and Contradiction to Truth Valued Logic

      ⊤
     / \
    F   T
     \ /
      ⊥

⊤: as computing power and apps grow, so does the scope for humans to enter, and expect computers to (consistently) process, their inconsistent data. F, T: the traditional idealised realm of mathematics in which all values, axioms, theorems, etc. are presumed to be logically consistent. ⊥: perhaps the wait is finite, perhaps infinite; we just cannot know how long, if ever, it will take to compute the desired mathematical value.

Many Valued Truth Logic Adding Unknown and Contradiction to Truth Valued Logic

      ⊤
     / \
    F   T
     \ /
      ⊥

We assume that information is partially ordered.
- ⊥ ⊏ F ⊏ ⊤ and ⊥ ⊏ T ⊏ ⊤
- F ⋢ T and T ⋢ F (two valued logic is unchanged)
- x ⊑ y ⇒ f(x) ⊑ f(y) (functions are monotonic)

We envisage a vertical structure of information processing orthogonal to a given horizontal mathematical structure.

Many Valued Truth Logic Adding Unknown and Contradiction to Truth Valued Logic

Definition (7)
A partially ordered set (poset) is a relation (X, ⊑ ⊆ X × X) such that:
- x ⊑ x (reflexivity)
- x ⊑ y ∧ y ⊑ x ⇒ x = y (antisymmetry)
- x ⊑ y ∧ y ⊑ z ⇒ x ⊑ z (transitivity)

Definition (8)
For each poset (X, ⊑), (X, ⊏) is the relation such that x ⊏ y ⇔ x ⊑ y ∧ x ≠ y.

Example (4)
The real numbers (−∞, ∞) are partially ordered by the relation x ⊑ y iff x ≥ y.

Many Valued Truth Logic Adding Unknown and Contradiction to Truth Valued Logic

Definition (9)
A lattice is a partially ordered set in which each pair of points x, y has a unique greatest lower bound (aka infimum or meet) denoted x ⊓ y, and a unique least upper bound (aka supremum or join) denoted x ⊔ y.

Example (5)
The powerset of a set is a lattice when partially ordered by set inclusion. Infimum is set intersection, and supremum is set union.

Example (6)
The extension of two valued truth logic from {F, T} to {F, ⊥, ⊤, T} where F ⊓ T = ⊥ and F ⊔ T = ⊤ is a lattice.
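Meet and join in the four-valued lattice of Example (6) can be computed mechanically from the information ordering. A small Python sketch (our own check, not from the slides):

```python
# Four-valued information lattice: ⊥ below F and T, F and T below ⊤.
BOT, F, T, TOP = "⊥", "F", "T", "⊤"
VALS = (BOT, F, T, TOP)
leq = {(x, x) for x in VALS} | {(BOT, F), (BOT, T), (BOT, TOP), (F, TOP), (T, TOP)}

def size(z):
    # number of elements below z; used to pick the greatest/least bound
    return sum((w, z) in leq for w in VALS)

def meet(x, y):
    # greatest lower bound: the largest z with z ⊑ x and z ⊑ y
    return max((z for z in VALS if (z, x) in leq and (z, y) in leq), key=size)

def join(x, y):
    # least upper bound: the smallest z with x ⊑ z and y ⊑ z
    return min((z for z in VALS if (x, z) in leq and (y, z) in leq), key=size)

assert meet(F, T) == BOT and join(F, T) == TOP   # as in Example (6)
```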

Many Valued Truth Logic Adding Unknown and Contradiction to Truth Valued Logic

Definition (10) A two argument function (in infix notation) op is symmetric if x op y = y op x for all x and y .

Definition (11) A two argument function (in infix notation) op is associative if x op (y op z) = (x op y ) op z for all x, y , & z .

Example (7)
In a distributive lattice, meet & join exist, are symmetric, are associative, and x ⊔ (y ⊓ z) = (x ⊔ y) ⊓ (x ⊔ z) and x ⊓ (y ⊔ z) = (x ⊓ y) ⊔ (x ⊓ z).

Definition (12)
A function f over a lattice is distributive if f(x ⊓ y) = (f x) ⊓ (f y) and f(x ⊔ y) = (f x) ⊔ (f y).

Many Valued Truth Logic Adding Unknown and Contradiction to Truth Valued Logic

Many valued truth logic has an important role in the joint progress of mathematics and computer science. Before the age of computing, mathematics had to find its own way (for us metric spaces). When computing gathered pace in the 1960s it was from the presumption that mathematics was always to be computed bottom-up upon a single machine architecture from the consistent nothing of ⊥. Now, in today’s world of parallel network based computing, there is an additional increasingly demanding necessity that pre-computed possibly inconsistent information may arrive top-down from other sources (machine or human) to be reconciled consistently with a mathematical model below.

Truth table for negation:

P  : ⊥ T F ⊤
¬P : ⊥ F T ⊤

Negation is monotonic and distributive.

Many Valued Truth Logic Adding Unknown and Contradiction to Truth Valued Logic

Truth table for sequential and (computing left-to-right):

P ∧ Q  Q=⊥  Q=F  Q=T  Q=⊤
P=⊥     ⊥    ⊥    ⊥    ⊥
P=F     F    F    F    F
P=T     ⊥    F    T    ⊤
P=⊤     ⊥    F    ⊤    ⊤

Sequential and is monotonic, not symmetric as ⊥ ∧ F ≠ F ∧ ⊥, and ⊥ ∧ Q = ⊥ for each Q.

Truth table for parallel and (Belnap logic):

P ∧ Q  Q=⊥  Q=F  Q=T  Q=⊤
P=⊥     ⊥    F    ⊥    F
P=F     F    F    F    F
P=T     ⊥    F    T    ⊤
P=⊤     F    F    ⊤    ⊤

Parallel and is monotonic, symmetric, distributive, and above sequential and.
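Both connectives are easily implemented and their stated properties checked mechanically. A Python sketch (our own, matching the tables above as reconstructed):

```python
BOT, F, T, TOP = "⊥", "F", "T", "⊤"
VALS = (BOT, F, T, TOP)

def seq_and(p, q):
    # Sequential and: evaluate p first; only consult q when p contains truth.
    if p == BOT:
        return BOT
    if p == F:
        return F
    if p == T:
        return q
    return {BOT: BOT, F: F, T: TOP, TOP: TOP}[q]   # p == ⊤ (both)

def par_and(p, q):
    # Belnap's parallel and: F wins as soon as either side contains F.
    false_in = p in (F, TOP) or q in (F, TOP)
    true_in = p in (T, TOP) and q in (T, TOP)
    if false_in and true_in:
        return TOP
    if false_in:
        return F
    return T if true_in else BOT

assert seq_and(BOT, F) == BOT and seq_and(F, BOT) == F   # not symmetric
assert all(par_and(p, q) == par_and(q, p) for p in VALS for q in VALS)
```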

Many Valued Truth Logic Adding Unknown and Contradiction to Truth Valued Logic

- And so we see that the structure of two valued truth logic {F, T} can be generalised in a way that makes it relevant to concepts which become both obvious and necessary in the age of computing.

- It is, in effect, sufficient to use our intuition to extend two valued truth logic. Sadly it is more challenging to see how to extend more sophisticated mathematical structures such as metric spaces.

- The insight here is to appreciate that the metric concept of self distance could meaningfully (in each of mathematics and CS) be non-zero as well as zero.

Non zero self distance Partial metric spaces

Definition (13)
A partial metric space (Matthews, 1992) is a tuple (X, p : X × X → [0, ∞)) such that:
- p(x, x) ≤ p(x, y) (small self-distance)
- p(x, x) = p(y, y) = p(x, y) ⇒ x = y (indistancy implies equality)
- p(x, y) = p(y, x) (symmetry)
- p(x, z) ≤ p(x, y) + p(y, z) − p(y, y) (triangularity)

Definition (14)
An open ball B_ε(a) = {x ∈ X : p(x, a) < ε}.

Lemma (5)
The open balls form the basis for a topology. This is asymmetric in the sense that there may be x, y such that y ∈ cl{x} ∧ x ∉ cl{y} (i.e. T0 separation).
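A standard illustrative partial metric (our own example, not on this slide) is p(x, y) = max(x, y) on [0, ∞), whose self-distance p(x, x) = x is non-zero unless x = 0. A Python sketch verifying the four axioms of Definition (13) on a sample:

```python
import itertools

def p(x, y):
    # max is a partial metric on [0, ∞): self-distance p(x, x) = x
    return max(x, y)

sample = [0.0, 0.25, 1.0, 2.0, 3.5]
for x, y, z in itertools.product(sample, repeat=3):
    assert p(x, x) <= p(x, y)                              # small self-distance
    assert not (p(x, x) == p(y, y) == p(x, y)) or x == y   # indistancy ⇒ equality
    assert p(x, y) == p(y, x)                              # symmetry
    assert p(x, z) <= p(x, y) + p(y, z) - p(y, y)          # triangularity
```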

Non zero self distance Partial metric spaces

Definition (15)
For each partial metric space (X, p) the partial ordering is x ⊑_p y ⇔ p(x, x) = p(x, y).

Lemma (6) A metric space is precisely a partial metric space for which each self distance is 0 . In such a space the partial ordering is equality. Since the introduction of partial metric spaces other equivalent or alternative formulations have been suggested.

Non zero self distance Partial metric spaces

Definition (16) A weighted metric space is a tuple (X , d, | · | : X → (−∞, ∞)) such that (X , d) is a metric space.

Lemma (7)
If (X, d, | · |) is a weighted metric space then
p(x, y) = (d(x, y) + |x| + |y|) / 2
is a partial metric such that p(x, x) = |x| for each x ∈ X. If (X, p) is a partial metric space and
dp(x, y) = 2 × p(x, y) − p(x, x) − p(y, y), |x|p = p(x, x)
then (X, dp, | · |p) is a weighted metric space.

Non zero self distance Partial metric spaces

After introducing non zero self distance to metric spaces the question of negative distance arises. Suppose we now allow distance to be negative. Then we can introduce the following partial metric for many valued truth logic.

p(⊤, ⊤) = −1
p(F, F) = p(F, ⊤) = p(T, T) = p(T, ⊤) = 0
p(F, T) = 1/2
p(⊥, ⊥) = p(⊥, F) = p(⊥, ⊤) = p(⊥, T) = 1

Note: we choose this particular partial metric in order that the induced weighted metric space has the familiar metric dp(F, T) = 1.
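The partial metric axioms (other than non-negativity, which we have just dropped) still hold for this table, as does dp(F, T) = 1. A Python check of our own:

```python
BOT, F, T, TOP = "⊥", "F", "T", "⊤"
VALS = (BOT, F, T, TOP)

# the distance table from the slide, given on unordered pairs
dist = {(TOP, TOP): -1.0,
        (F, F): 0.0, (F, TOP): 0.0, (T, T): 0.0, (T, TOP): 0.0,
        (F, T): 0.5,
        (BOT, BOT): 1.0, (BOT, F): 1.0, (BOT, TOP): 1.0, (BOT, T): 1.0}

def p(x, y):
    return dist.get((x, y), dist.get((y, x)))

for x in VALS:
    for y in VALS:
        assert p(x, x) <= p(x, y)                              # small self-distance
        for z in VALS:
            assert p(x, z) <= p(x, y) + p(y, z) - p(y, y)      # triangularity

# the induced weighted metric gives the familiar d(F, T) = 1
assert 2 * p(F, T) - p(F, F) - p(T, T) == 1.0
```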

Non zero self distance Partial metric spaces

- A partial metric space (X, p) generalises Fréchet’s landmark notion of metric space (X, d) through generalising the notion of self-distance.

- The initial motivation was provided by computer science requiring the notion of partial ordering & T0 separability for computable functions. However, this was too weak in comparison to pre-CS mathematics, which could afford the luxury of assuming its topological spaces to be Hausdorff separable (i.e. T2).

- While it is true that CS motivated non zero self-distance and partial metric spaces, could there be other interesting applications & interpretations of this research?

Non zero self distance Partial metric spaces

Definition (17)
A based metric space is a tuple (X, d, φ ∈ X) such that (X, d) is a metric space. That is, a based metric space is a metric space with an arbitrarily chosen base point. By defining |x| = d(x, φ) we make a weighted metric space (X, d, | · |) in which φ = ⊤ and |φ| = 0. Alternatively, assuming d to be bounded above by some a, define |x| = a − d(x, φ) to make a weighted metric space (X, d, | · |) in which φ = ⊥ and |φ| = a.

A potential use of based metric spaces in CS is in computer games, where both a 3D space and the view of that space from any point in space has to be modelled.

Partial Information Scott Domain Theory

- x ⊑_p y iff p(x, x) = p(x, y) models the computing notion that the data content of x can be increased to that of y.

- Very relevant to classical problems of recursion in logic, computability theory, and programming language design of the 1960s. But these problems are now resolved, and language design today faces very different challenges.

- The what truth of x ⊑_p y is in general insufficient to determine the how algorithm used to compute from x to y.

- With the benefit of hindsight it is easy to argue that the asymmetric topology of domain theory could not progress beyond its roots of modelling the what of computation.

Partial Information Scott Domain Theory

      ⊤
     / \
    F   T
     \ /
      ⊥

In the least fixed point domain theory tradition of Kleene, Tarski, & Scott: ¬T = F, ¬F = T, ¬⊥ = ⊥, ¬⊤ = ⊤, and ⊔_{n≥0} ¬ⁿ(⊥) can be used to define the ideal meaning for ψ recursively defined by ψ = ¬ψ.
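The least fixed point iteration is trivial to replay mechanically; a Python sketch of our own showing that ¬ⁿ(⊥) never leaves ⊥, so the ideal meaning of ψ = ¬ψ is ⊥:

```python
BOT, F, T, TOP = "⊥", "F", "T", "⊤"

# four-valued negation: ¬T = F, ¬F = T, ¬⊥ = ⊥, ¬⊤ = ⊤
neg = {T: F, F: T, BOT: BOT, TOP: TOP}

# Kleene/Tarski/Scott iteration from the bottom element:
x = BOT
for _ in range(10):
    x = neg[x]        # ¬ⁿ(⊥) stays at ⊥ for every n
assert x == BOT       # so ⊔ₙ ¬ⁿ(⊥) = ⊥: the ideal meaning of ψ = ¬ψ
```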

Partial Information Scott Domain Theory

      ⊤
     / \
    F   T
     \ /
      ⊥

BUT! ψ ≠ ⊥ in the real world of computing, as any attempt to compute the value of ψ would in some finite time time out (whatever this may mean) with an error message such as Control Stack Overflow. Thus while the above least fixed point theory is fundamental to the nature of computing it is far from being the whole story.

Partial Information Scott Domain Theory

In the functional programming language Haskell we can write the definition for negation in two valued logic as follows (hiding the Prelude’s own Bool and not so that the definitions compile as given).

import Prelude hiding (Bool(..), not)

data Bool = True | False

not :: Bool -> Bool
not True  = False
not False = True

-- any attempt to evaluate psi loops forever
psi :: Bool
psi = not psi

From many artistic works it is clear that recursion (and computable loops in general) can only be finitely known. What then is the cost of knowledge?

Jeff Crouse "Recursion shirt"

Partial Information Scott Domain Theory

- Meanwhile, at Warwick’s Centre for Discrete Mathematics & its Applications (DIMAP), algorithms have truly flourished as a vigorous form of combinatorics, a how branch of totally defined mathematics for the study of countable data structures.

DIMAP logo

- For example, a finite directed graph is studied to determine its shortest path.

- Like it or not, the reality is that DIMAP has no need for asymmetric topology and partial information.

- However, this need not imply the demise of domain theory. Rather, we ask: how can domain theory reach out to contemporary computing concepts?

Partial Information Scott Domain Theory

Number 6: Where am I? Number 2: In the Village. Number 6: What do you want? Number 2: We want information. Number 6: Whose side are you on? Number 2: That would be telling. We want information... information... information. Number 6: You won’t get it. Number 2: By hook or by crook, we will. Number 6: Who are you? Number 2: The new Number 2. Number 6: Who is Number 1? Number 2: You are Number 6. Number 6: I am not a number, I am a free man! Number 2: [Sinister laughing] ... (www.netreach.net/~sixofone)

The Prisoner (1967)

Patrick McGoohan (1928-2009)

Partial Information Static domain theory

Number 6: Where am I? Number 2: In DIMAP. Number 6: What do you want? Number 2: We want algorithms. Number 6: Whose side are you on? Number 2: That would be telling. We want algorithms... algorithms... algorithms. Number 6: You won’t get it. Number 2: By hook or by crook, we will. Number 6: Who are you? Number 2: The new Number 2. Number 6: Who is Number 1? Number 2: You are Number 6. Number 6: I am not a number, I am a discrete domain theorist! Number 2: [Sinister laughing] ...

The Prisoner (1967)

www.sixofone.co.uk

Partial Information Scott Domain Theory

- As a mathematics undergraduate of metric & topological spaces in 1976-7 at Imperial College I became a Prisoner of T2 costless separation.

- Trying to become a free mind of PhD repute in Computer Science 1980-3 at Warwick, I was still a Prisoner of the misconception that the cost of producing each x in domain theory could be incorporated within the emerging notion of non zero self distance for metric spaces.

- In the 1960s dynamic, interactive, adaptive, or intelligent systems were few, while in today’s world of the web and cloud computing there are few that are not.

- And so, how can the concepts of non zero self distance & asymmetric topology of partial metric spaces avoid the same fate as domain theory?

Partial Information Scott Domain Theory

- Introducing the notion of cost to computing a partial metric distance p(x, y) is an important initial step in developing a future theory of metric spaces that may be termed dynamic, interactive, adaptive, or intelligent.

- For example, in classical logic we want to retain the double negative elimination theorem ¬¬A ↔ A while in addition asserting that ¬¬A is more costly to compute than A.

- Applying this idea to our earlier definition ψ = ¬ψ would mean that we could retain the ideal Kleene, Tarski, & Scott domain theory meaning ⊔_{n≥0} ¬ⁿ(⊥) of least fixed points and use cost as a criterion for defining an error Control Stack Overflow.

Negation as failure Non-monotonic reasoning

"I went to Stanford to study for a PhD in Mathematics, but my real interest was Logic. I was still looking to find the truth, and I was sure that Logic would be the key to finding it. My best course was axiomatic set theory with Dana Scott. He gave us lots of theorems to prove as homework. At first my marks were not very impressive. But Dana Scott marked the coursework himself, and wrote lots of comments. My marks improved significantly as a result."

Dana Scott (photo by B. Otrecht, 1969)

from Robert Kowalski: A Short Story of My Life and Work April 2002

Bob Kowalski

Negation as failure Non-monotonic reasoning

- Negation as failure (NAF) is a non-monotonic inference rule in logic programming used to derive a truth for a formula not p from failure to derive the truth of p.

- not p may be different from ¬p depending upon the completeness of the inference algorithm and that of the formal logic system assumed.

- "Although it is in general not complete, its chief advantage is the efficiency of its implementation. Using it the deductive retrieval of information can be regarded as a computation" (Negation as Failure, Keith L. Clark, Logic and Databases 1978, pp. 113-141).

Keith L. Clark

Negation as failure Non-monotonic reasoning

- My sincerest apologies to Professor Kowalski, who taught me an inspiring introduction to logic programming in Prolog at Imperial College 1979-80. Grounded in the costless idealism of complete logic and T2 separation of metric spaces, I totally misunderstood the practical necessity and wisdom of NAF in computing.

- Looking back, may I now offer the following thoughts upon NAF. (1) There is an implicit reasonable assumption that programmers use intelligent algorithms that for all intents & purposes progress computationally. Thus complete logic is a fine ideal, but is not a practical necessity in computing. (2) Failure can be monotonic.

Failure takes time Monotonic reasoning

Although ¬¬⊥ = ⊥ in domain theory (as ¬¬A ↔ A in logic) we require a model for which the cost of computing ¬⊥ is discernibly greater than that of computing ⊥. Consider the following sequence p_0, p_1, . . . of partial metrics.

p_n(T, T) = 2^−n
p_n(F, F) = 2^−n
p_n(F, T) = 2^−n + 2^−1
p_n(T, ⊥) = 2^−n + 1
p_n(F, ⊥) = 2^−n + 1
p_n(⊥, ⊥) = 2^−n + 1

[Figure: the triangle on F, T, and ⊥, drawn with dotted edges for the p_n distances, labelled p_n(⊥, ⊥).]

p is monotonic in the sense that each p_n is a partial metric having the usual domain theory ordering ⊥ ⊑ F ∧ ⊥ ⊑ T, and ∀x, y ∀0 ≤ n < m . p_n(x, y) > p_m(x, y).
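These claims can be verified exhaustively for small n. A Python sketch of our own:

```python
import itertools

BOT, F, T = "⊥", "F", "T"
VALS = (BOT, F, T)

def p(n, x, y):
    # pₙ on {⊥, F, T}: the constant part of the slide's table, plus 2^(-n)
    base = {(T, T): 0.0, (F, F): 0.0, (F, T): 0.5,
            (T, BOT): 1.0, (F, BOT): 1.0, (BOT, BOT): 1.0}
    return 2.0 ** -n + base.get((x, y), base.get((y, x)))

for n in range(4):
    for x, y, z in itertools.product(VALS, repeat=3):
        assert p(n, x, x) <= p(n, x, y)                        # small self-distance
        assert p(n, x, z) <= p(n, x, y) + p(n, y, z) - p(n, y, y)  # triangularity
        assert p(n, x, y) > p(n + 1, x, y)      # distances strictly decrease with n
    # the usual domain ordering: ⊥ ⊑ F and ⊥ ⊑ T, since p(⊥, ⊥) = p(⊥, F) = p(⊥, T)
    assert p(n, BOT, BOT) == p(n, BOT, F) == p(n, BOT, T)
```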

Failure takes time Monotonic reasoning

- Let our cost function | · | : X → (ω → [0, ∞)) be built from a family of distances a_n^m where,
∀n ≥ 0 . a_n^0 > a_n^1 > a_n^2 > . . . > a_{n+1}^0
∀m, n ≥ 0 . a_n^m = a_{n+1}^0 + (a_n^0 − a_{n+1}^0) × 2^−m

Then {(X, p_m) | m ≥ 0} is a discrete partial metric space.

Failure takes time Into the abyss

- Suppose x0 ⊑ x1 ⊑ x2 ⊑ . . . to be a domain theory chain of approximations. Then we have their (data) content weights |xi| = p(xi, xi) for each i ≥ 0.

- The computational abyss observed by Wadge can pass by unnoticed in domain theory. Suppose that for either a finite or infinite period an xi (and hence |xi|) remains constant as i increases. Then this has no significance in domain theory, but may or may not be a computational catastrophe.

- That is, when computational resources (including human intelligence) just happen to be sufficient, no one notices or cares about what we may term Wadge’s abyss; but when they are not, a catastrophe may well arise.

- With discrete partial metric spaces we can now do more to express Wadge’s abyss within domain theory.

Failure takes time Into the abyss

- Suppose that xi ⊑ xi+1 for some i in our domain theory chain, and that xi = xi+1. Then there is no data increase from xi to xi+1, but it is reasonable to assume a computational pause of whatever kind does take place (which may or may not terminate).

- Suppose our domain theory chain to be definable by a discrete partial metric space (X, p : X × X → {a0, . . . }). Then there exists j ≥ 0 such that |xi| = aj.

- Now assume the pause from xi to xi+1 to be of length m > 0. Then the cost of producing the value xi+1 can be defined as a_j^m.

Failure takes time Into the abyss

- Interestingly the partial metric notion of size |xi| = p(xi, xi) (which is fixed for all time) now becomes dynamic. From xi it is rarely predictable what resources (affordably finite, unaffordably finite, or infinite) are needed to compute xi+1.

- What we can do with a discrete partial metric is to dynamically keep track of a chain of values,
a_j^0 = a_j^{m0} > a_j^{m1} > a_j^{m2} > . . .
as long as is affordably finite and until (if ever) the value xi+1 is computed.

- Thus Wadge’s abyss is observable by means of keeping track of a chain (i.e. x) of chains, bringing domain theory a step closer to the complexity of algorithms.

Failure takes time Into the abyss

- So far in our work we have omitted Wadge’s earlier notion of complete object in Lucid, the Dataflow Programming Language. In the sense of partial metrics a completely defined (data content) object is precisely one whose self-distance is zero. It has proved to be a very useful assumption for contraction mapping style fixed point theorems over partial metrics.

- Our work today upon Wadge’s abyss uses the range [0, ∞) for counting both increasing data content and increasing cost. In our cost-conscious research "one that cannot be further completed" may turn out to be an unaffordable computation.

- We have yet to consider how contraction mapping theorems might be formed for discrete partial metric spaces.

Failure takes time Into the abyss

Dana Scott’s inspired work has given us the T0 topology to model partial information. But a computable function could still be an impossible object, as depicted in Escher’s Waterfall, which appears to produce an unending flow of water; equivalent to the paradox of the Penrose Triangle impossible object.

The supposed paradox disappears in discrete partial metric spaces, where (dynamic) cost can be composed with (static) data content.

Waterfall, lithograph by M.C. Escher (1961)

Failure takes time Hiatons

- Ashcroft & Wadge (1977) introduced a programming language called Lucid in which each input or output is a discrete (i.e. finite or countably infinite) sequence of data values. In domain theory terms,
⟨⟩ ⊏ ⟨a⟩ ⊏ ⟨a, b⟩ ⊏ ⟨a, b, c⟩ ⊏ ⟨a, b, c, d⟩ ⊏ . . .

- Let p(x, y) = 2^−n where n is the largest integer such that ∀i < n . xi = yi. Then p is a partial metric inducing the above ordering.

- An unavoidable implication here is that each data value in a sequence takes the same amount of time to input (resp. output).

Failure takes time Hiatons

- Wadge & Ashcroft (1985) tried to add a pause to Lucid.

- For example, the following would-be ’sequence’ tries to add pauses termed hiatons to static domain theory.
⟨∗, 2, 3, ∗, 5, ∗, 7, ∗, ∗, ∗, 11, . . . ⟩

- But ∗ is neither a well defined data value (such as 0) nor is say ⟨∗, 2⟩ a partial value comparable to ⟨2⟩ in domain theory. So, what is a hiaton? Frustratingly the temporal intuition of a pause appears to be sound, but will not fit into domain theory.

Failure takes time Hiatons

Time | Data       | Hiatons                | Discrete Partial Metric
0    | ⟨⟩         | ⟨⟩                     | 2^−1 + (2^−0 − 2^−1) × 2^−0
1    | ⟨1⟩        | ⟨1⟩                    | 2^−2 + (2^−1 − 2^−2) × 2^−0
2    | ⟨1⟩        | ⟨1, ∗⟩                 | 2^−2 + (2^−1 − 2^−2) × 2^−1
3    | ⟨1, 2⟩     | ⟨1, ∗, 2⟩              | 2^−3 + (2^−2 − 2^−3) × 2^−0
4    | ⟨1, 2⟩     | ⟨1, ∗, 2, ∗⟩           | 2^−3 + (2^−2 − 2^−3) × 2^−1
5    | ⟨1, 2, 3⟩  | ⟨1, ∗, 2, ∗, 3⟩        | 2^−4 + (2^−3 − 2^−4) × 2^−0
6    | ⟨1, 2, 3⟩  | ⟨1, ∗, 2, ∗, 3, ∗⟩     | 2^−4 + (2^−3 − 2^−4) × 2^−1
7    | ⟨1, 2, 3⟩  | ⟨1, ∗, 2, ∗, 3, ∗, ∗⟩  | 2^−4 + (2^−3 − 2^−4) × 2^−2
...  | ...        | ...                    | ...

(each Data entry is ⊑ the next)

As Escher’s Waterfall paradox is resolved by separating time from data so we resolve the ambiguity of each hiaton by making it a time only concept.

Failure takes time Hiatons

The above example of a data sequence, an associated domain theory, and hiatons are all observable from a single discrete partial metric.

Data          | Discrete Partial Metric
⟨⟩            | 2^−1 + (2^−0 − 2^−1) × 2^−0
⊑ ⟨1⟩         | 2^−2 + (2^−1 − 2^−2) × 2^−1
⊑ ⟨1, 2⟩      | 2^−3 + (2^−2 − 2^−3) × 2^−1
⊑ ⟨1, 2, 3⟩   | 2^−4 + (2^−3 − 2^−4) × 2^−2
...           | ...

Note though that this discrete partial metric is discovered non-deterministically at the run time of computation; it is not in general knowable in advance from the properties of the data alone. Note how the ontology of data is consistent with, but (in general) less than, our epistemology of computation.
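The discrete partial metric can be replayed from a timed trace of values and hiatons. A Python sketch of our own, using the base sequence aₙ = 2^−n as above:

```python
def discrete_pm(n, m):
    # n = number of data values output so far, m = hiatons since the last one
    a = lambda k: 2.0 ** -k
    return a(n + 1) + (a(n) - a(n + 1)) * 2.0 ** -m

# Replay the trace above: each tick either outputs a value or a hiaton '*'.
trace = [1, '*', 2, '*', 3, '*', '*']
n = m = 0
costs = []
for tick in trace:
    if tick == '*':
        m += 1            # a pause: descend within the current interval
    else:
        n, m = n + 1, 0   # a data value arrives: jump to the next interval
    costs.append(discrete_pm(n, m))

# the observed discrete distances strictly decrease at every tick
assert costs == sorted(costs, reverse=True)
```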

Failure takes time Hiatons

The embedding retraction relationship from (a suitable) partial metric to a discrete partial metric works as follows.

a_n → a_n^0 = a_n
a_n^m → ⌈a_n^m⌉ = a_n

a_n is (as with a partial metric) a predictable distance for partially defined values, but one that can now dynamically change at the run time of computation to pass through the observable discrete distances a_n^{m0}, a_n^{m1}, . . . > a_{n+1}.

No such thing as a free lunch Moore’s Law ?

Colossus (1944) was invented to decipher encrypted messages too complex for humans to manage by hand. Ironically computers soon became too powerful to be valued by humans. It is as if we are forever going round in circles, the functionalities of computers sooner or later are overtaken by the reality of their cost. Why let ourselves become trapped in this pointless infinite loop? Why not ensure our mathematics is always in one-to-one correspondence with its cost? The traditional answer is Moore’s Law, which states that the number of transistors which can be placed inexpensively upon an integrated circuit doubles every two years. According to Moore’s Law there is a free lunch in computing! Really?

"In 1994, a team led by Tony Sale (right) began a reconstruction of a Colossus computer at Bletchley Park" (Wikipedia)

Steve Jobs launches the iPad tablet (2010).

No such thing as a free lunch Moore’s Law ?

Even if Moore’s Law accurately describes the reality of hardware development there appears to be little correlation with human demand upon computers, which oscillates between seeing machines as a free lunch and asking the impossible. Now it’s official! "The current programme of information and communications technology (ICT) study in England’s schools will be scrapped from September, the education secretary has announced. It will be replaced by an ’open source’ curriculum in computer science and programming designed with the help of universities and industry." http://www.bbc.co.uk/news/education-16493929 (11/1/12)

The same report contains the following quote from Ian Livingstone (an advisor to the UK government’s education secretary) "Children are being forced to learn how to use applications, rather than to make them. They are becoming slaves to the user interface and are totally bored by it, ..."


The UK has a challenging economic imperative to rescue its fading heritage as a leading creative nation in the world of science and engineering. The supposed post-Thatcherite position is that each person should pay their own way through education, and that the profit motive of capitalism should drive one's interests (academic fees are typically GBP 9,000 per year from 2012-13). Are we, dear friends, experts in logic and mathematics, really to believe that our research can be determined by a single economic theory?


"Freegans are people who employ alternative strategies for living based on limited participation in the conventional economy and minimal consumption of resources. Freegans embrace community, generosity, social concern, freedom, cooperation, and sharing in opposition to a society based on materialism, moral apathy, competition, conformity, and greed. ... Freeganism is a total boycott of an economic system where the profit motive has eclipsed ethical considerations and where massively complex systems of productions ensure that all the products we buy will have detrimental impacts most of which we may never even consider." (freegan.info)

integral-options.blogspot.com/2007/01/scavenging-to-liveupdated.html

"Freegans rummage through trash bags outside Jefferson Market on 6th Ave" (posted by T.M. Meinch on The Trowel)

Discrete Partial versus Fuzzy Metrics

I A longstanding and interesting question: in what sense can we say that a fuzzy metric is synonymous with a partial metric?

I There is (as I understand it) no presumption that fuzzy distance is necessarily computable distance. That is, fuzzy distance could be ontologically sound but epistemologically incomplete.

I Thus a discrete partial metric strengthens our notion of fuzzy distance where fuzzy is primarily an epistemological concept, but weakens it where fuzzy is primarily an ontological concept.

I If fuzzy can ignore the cost of epistemology then there seems to be little connection between fuzzy metrics and partial metrics.
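As background to the comparison above, the partial metric axioms (from the Bukatin, Kopperman, Matthews & Pajoohesh reference listed below) can be checked mechanically on the canonical sequence example. A minimal sketch; the helper names are mine:

```python
from itertools import product

def p(x: tuple, y: tuple) -> float:
    """Partial metric on finite sequences:
    2^-(length of the longest common prefix of x and y)."""
    k = 0
    while k < len(x) and k < len(y) and x[k] == y[k]:
        k += 1
    return 2.0 ** -k

# The four partial metric axioms, checked over a small universe.
universe = [(), (1,), (1, 2), (1, 2, 3), (1, 3), (2,)]
for x, y, z in product(universe, repeat=3):
    assert p(x, x) <= p(x, y)                      # self-distances are small
    assert p(x, y) == p(y, x)                      # symmetry
    if p(x, x) == p(x, y) == p(y, y):
        assert x == y                              # equal distances force equality
    assert p(x, z) <= p(x, y) + p(y, z) - p(y, y)  # sharpened triangle inequality
```

Note that p(x, x) = 2^-len(x) is non-zero for every finite sequence x, which is exactly the non-zero self-distance discussed earlier in the talk.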


I This brings us back to where we began: cost is axiomatic in our studies.

I Cost is consistent with, but stronger than, computability.

I Our approach is to derive cost from domain theory, in contrast to complexity theory, which is traditionally defined in combinatorics, totally separate from any denotational partial model.

I Partial metrics are hard to justify in either the total world of metric topology or in the combinatorics of complexity theory; but where reconciling the two is axiomatic, they have yet to be explored in detail.

Conclusions and further work: Topology and logic

I Computer Science has become an overly specialised, selfish, highly stressful career path. I resign! Now I am a Prisoner in research and teaching.

I We need more compassion for creative time on discrete domain theory, not publish-or-perish mania.

I Computability, topology, partiality, and complexity theory can all be reconciled. How?

I Are discrete partial metric spaces a way forward? Or is Wadge's infinitesimal logic better?

Episode 1 of The Prisoner. Arrival. The man angrily resigns.



"It has long been my personal view that the separation of practical and theoretical work is artificial and injurious. Much of the practical work done in computing, both in software and in hardware design, is unsound and clumsy because the people who do it have not any clear understanding of the fundamental design principles of their work. Most of the abstract mathematical and theoretical work is sterile because it has no point of contact with real computing. One of the central aims of the Programming Research Group as a teaching and research group has Christopher Strachey (1916-75) been to set up an atmosphere in which Founder of the Oxford Programming Research Group (1965) this separation cannot happen."

Further reading

I E.A. Ashcroft & W.W. Wadge. Lucid, a Nonprocedural Language with Iteration, Communications of the ACM, Vol. 20, No. 7, pp. 519-526, July 1977.

I M. Bukatin, R. Kopperman, S. Matthews & H. Pajoohesh. Partial Metric Spaces, American Mathematical Monthly, Vol. 116, No. 8, pp. 708-718, October 2009.

I W. Byers. How Mathematicians Think: Using Ambiguity, Contradiction, and Paradox to Create Mathematics, Princeton University Press, 2007 (paperback, 2010).

I Martin Campbell-Kelly. Christopher Strachey, 1916-75: A Biographical Note, Annals of the History of Computing, Vol. 7, No. 1, January 1985.

I R.D. Kopperman. Asymmetry and duality in topology, Topology and its Applications, Vol. 66, pp. 1-39, 1995.

I Bob Kowalski. www.doc.ic.ac.uk/~rak/

I Dana S. Scott. Outline of a Mathematical Theory of Computation, Technical Monograph PRG-2, Oxford University Computing Laboratory, Oxford, England, November 1970.

I S. Vickers. Topology Via Logic, Cambridge Tracts in Theoretical Computer Science 5, Cambridge University Press, 1989.

I W.W. Wadge & E.A. Ashcroft. Lucid, the Dataflow Programming Language, APIC Studies in Data Processing No. 22, Academic Press, 1985.