What is so “natural” about Gentzen’s “Natural Deduction”? It is that our brains are conditioned by the Environment In Which We Happened To Evolve, and that environment is governed by what we call the Law of Causality, which has been captured in the East by the Buddha’s principle of “Dependent Origination” and, even more precisely, by Modus Ponens in the West.

When this exists, that comes to be; with the arising of this, that arises. When this does not exist, that does not come to be; with the cessation of this, that ceases.

Our intuition is that if we know that “A causes B” and that “A” is present, then “B” emerges or occurs (weather patterns, seasons, agricultural observations). The “knowledge” that “A” causes “B” comes first from repeated observations, then becomes intuitive, and finally someone establishes the actual causality (captures and experimentally verifies the relationship) using the scientific method.

Modus Ponens is exactly a proper generalization of this recurring observation into an abstract rule of inference – if we know that “A implies B” and that “A” is true, then we can conclude that “B” is true.
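Under the Curry-Howard reading, this rule is not just analogous to computation – it *is* function application. A minimal sketch (the name `modusPonens` is mine, for illustration): a proof of “A implies B” is a function from A to B, and applying it to a proof of A yields a proof of B.

```haskell
-- Modus Ponens as function application (illustrative name):
-- given "A implies B" (a function a -> b) and "A" (a value of type a),
-- we obtain "B" (a value of type b).
modusPonens :: (a -> b) -> a -> b
modusPonens aImpliesB a = aImpliesB a
```

For example, `modusPonens (+ 1) 41` is just `(+ 1) 41`, i.e. `42` – the “cause” present, the “effect” produced.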

How universal is this abstract rule (or principle)? Well, it is universal enough to be used in all of mathematics. This rule of inference is so general and fundamental that it is almost taken for granted – our intuition accepts it as correct, and it feels right regardless of race or age.

It seems that humans have captured the causality principle itself in its minimal (just a single “premise”, or necessary requirement), essential and abstract form.

That premise corresponds to another universal notion – “necessary and sufficient” – and this is the underlying fundamental principle behind every chemical reaction, and even every physical process. Once the conditions are met (and the constraints are satisfied), a process or a reaction “spontaneously” occurs.

It is only “natural” that mathematicians have captured and properly abstracted out this universal notion. Non-bullshit math is just a collection (and an abstract science) of such properly captured abstractions.

Back to Gentzen.

Each proof rule consists of zero or more formulas written above a line, called the premises, and a single formula written below the line, called the conclusion.

The interpretation of a rule is that when all the premises hold, then the conclusion follows.
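In Gentzen’s standard notation (this is the textbook implication-elimination rule, and its paired introduction rule, not something added here), Modus Ponens is exactly such a rule – premises above the line, conclusion below:

```latex
\frac{A \to B \qquad A}{B}\;(\to E)
\qquad\qquad
\frac{\begin{array}{c}[A]\\ \vdots\\ B\end{array}}{A \to B}\;(\to I)
```

The left rule eliminates an arrow; the right rule introduces one by discharging the assumption $A$.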

The above is a direct translation of the causality principle into the language of logic. The “premises” are the “causes” and the “conclusion” is the “effect” – “multiple (or partial) causation”, as with everything in the Universe.

It is not a random coincidence that the symbol of an “arrow” is used to denote an implication – it is, intuitively, a “single step of causality”. The arrow is a “natural” visual symbol for the causality principle itself.

Now you can see what chained (connected) arrows mean in a function composition and what directed, acyclic graphs could capture. The whole Universe can be viewed as one such graph, if you will, and its fringe is what we call “now”. OK, ok.

So, composition of functions as the way to express every possible computation is another properly captured universal notion. Composition is defined as nesting of function calls, and this, in turn, captures the notion of an order within a process (when order is meaningful).

Note that the order can be arbitrary (as in which part is added to which “first”) or necessary (as in chemical reactions or biological “pathways”). Again, a function composition establishes an implicit order: the returned value (a “conclusion”) becomes the next argument (a “premise”). No time here. Only the notion of “necessary and sufficient”.
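A sketch of composition as implicit ordering (`step1` and `step2` are hypothetical example steps, not from the text): the value returned by `step1` becomes the argument of `step2`, with no reference to time – only to what each step requires.

```haskell
-- Composition as nesting: the "conclusion" of one step
-- is the "premise" of the next.
step1 :: Int -> Int
step1 = (+ 1)

step2 :: Int -> Int
step2 = (* 2)

process :: Int -> Int
process = step2 . step1   -- process x = step2 (step1 x)
```

So `process 3` is `step2 (step1 3)`, i.e. `(3 + 1) * 2 = 8`.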

Where did the “introduction/elimination pairs” come from? Well, this is another intuitive capturing.

There are basically two kinds of processes in the universe (haha). Some processes “seemingly spontaneously” occur because of (are caused by) what is “already out there” (in the environment or in a particular locality). We call them “physical processes”, meaning it is “just the working of the laws of physics”. The other kind is biological processes – the evolved, continuous processes caused by what we call Biological Evolution.

It is quite difficult to accurately capture the fundamental difference in words, but basically there are some “stable” molecular “expressions” which are being evaluated by the “runtime” of physical laws in a particular locality. The principal difference is that some molecular arrangements are not “random”, but are “selected” or used by the evolutionary processes.

Having warm (just right, not too hot, not too cold) water on the surface of a planet is necessary, but not sufficient. “Elements” of a particular kind are required, along with other necessary conditions (and constraints), so that energetically (and structurally) stable-enough “molecular structures” could emerge. This is the “physical part”.

The process of biological evolution “selects” and uses some of these emergent molecular structures, so they become the building blocks (of building blocks of building blocks).

There is a catch – when something is selected by evolution, it is because it is being used somewhere. It is that simple and that fundamental. The selection is an introduction. The use is an elimination. The mind of an external observer captured and generalized this principle intuitively, without a clear understanding.

One more time: whatever is non-random (there is no such thing as randomness – it is a mathematical concept; everything is caused by everything else) and is being selected by the processes of biological evolution has its uses somewhere. “Selection” is introduction (a lambda abstraction, literally), elimination is the use (“application”). Yes, this is why the Lambda Calculus is the way it is.
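The pair is visible in any functional language (the name `double` here is just an illustration): a lambda abstraction introduces a function, and application eliminates it.

```haskell
-- Introduction/elimination for the arrow, literally:
double :: Int -> Int
double = \x -> x + x   -- introduction: a lambda abstraction names a structure

result :: Int
result = double 21     -- elimination: the use ("application")
```

The abstraction only earns its keep through the application – introduction without use is dead weight, which is the essay’s point about selection and utility.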

But how? Well, the primary use of a human language is to describe and communicate some aspects of the shared environment to each other. The proper use of a language “naturally”, or even necessarily, creates and “introduces” concepts (in the mind of an observer) and then uses them in the language. Utility.

Notice that there is no “purpose” or any other abstract humanitarian bullshit, just utility – these structures are being used somewhere [else].

Math and logic can be traced back to What Is – all the properly captured abstract notions. This is how the “double-barreled” discoveries Wadler writes about (each made once by a logician and once by a theoretical computer scientist) came to be.

  • function abstraction, function application (+ the bindings in the environment)
  • ADT constructors, structural pattern-matching (de-structuring)

Sum is OR, Product is AND, a single-argument function is an implication (=>); plus nesting, parameterization, recursion.
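Both bullet points – and the sum/product reading – can be shown in a few lines of standard Haskell (the `Shape` type is a hypothetical example, not from the text). Constructors are the introduction forms; structural pattern matching (de-structuring) is the elimination.

```haskell
-- Sum is OR: a Shape is a Circle OR a Rect.
-- Product is AND: Rect carries a width AND a height.
data Shape
  = Circle Double          -- introductions (constructors)
  | Rect Double Double

area :: Shape -> Double
area (Circle r) = pi * r * r   -- eliminations (pattern matching)
area (Rect w h) = w * h
```

A single-argument function `a -> b` is the implication A => B of the earlier sections: introduced by lambda abstraction, eliminated by application.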