Some hard problems will never be solved by “brainstorms” (more like bullshit-storms) or any amount of babbling by Chuds. They require trained minds like those of C.A.R. Hoare or Leslie Lamport, or Philip Wadler or Martin Odersky – the outliers and top performers. Ideally, they have to be mathematically mature, and preferably on the compiler side (so they also understand the important implementation aspects).
The “direction” is actually clear and well understood (a small miracle): it lies in restricting overly general concepts and formalisms with additional constraints (so that they actually match the environmental constraints), or simply in typing.
The same principle has been successfully applied (intuitively, back then) to restrict the way too general Lambda Calculus (to avoid “paradoxes”), and before that, again intuitively, to restrict the too general formalism of Set Theory by requiring one to specify which set (a particular number system) one is working within.
What is typing? (yes, yes, I know!) Conceptually, it is just “tagging” values with (type) tags, and then using a relation such as “has this particular tag”. This is how one partitions the universe of values into non-overlapping partitions (subsets). The early [Common] Lisp guys got it just right back then.
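A minimal sketch of this “tagging” view in Rust (the `Value` type and its variants are illustrative, not from the original): a tagged union partitions all its values into disjoint subsets, and `match`/predicates are exactly the “has this particular tag” relation.

```rust
// A "tag" partitions the universe of values into disjoint subsets.
// A Rust enum is such a tagged union: every value carries its tag,
// and we dispatch on it -- much like Lisp's runtime type tags.
#[derive(Debug, PartialEq)]
enum Value {
    Int(i64),
    Text(String),
    Bool(bool),
}

// The relation "has this particular tag", expressed as a predicate.
fn is_int(v: &Value) -> bool {
    matches!(v, Value::Int(_))
}

fn main() {
    let vs = vec![Value::Int(42), Value::Text("hi".into()), Value::Bool(true)];
    // Partition by tag: each value lands in exactly one bucket.
    let ints: Vec<&Value> = vs.iter().filter(|v| is_int(v)).collect();
    assert_eq!(ints.len(), 1);
}
```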
Since then typing has been understood (at least by some of the less bone-headed) not as rigid classification but as duck-typing – having particular “traits” (or “flavours”) or a set of “attributes”, where these attributes are not just symbols but type-signatures (required interfaces).
This is nothing new; most of the mathematical “structures” (Monoids, Groups) have been defined in this way – by a set of operations. This is what Barbara Liskov specialized for the field of CS as Abstract Data Types.
Since then we have generalized this notion into abstract interfaces, and, in Rust, into a clear syntactic distinction between abstractions and their actual implementations (the impl blocks).
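The Rust separation can be sketched as follows (the `Shape` trait and the concrete types are illustrative): the trait states the abstraction once, and each `impl` block supplies an implementation behind that barrier.

```rust
// The abstraction: a formally stated interface, defined once.
trait Shape {
    fn area(&self) -> f64;
}

// Implementations live in separate `impl` blocks -- the syntactic
// abstraction barrier between "what" and "how".
struct Circle { r: f64 }
struct Square { side: f64 }

impl Shape for Circle {
    fn area(&self) -> f64 { std::f64::consts::PI * self.r * self.r }
}

impl Shape for Square {
    fn area(&self) -> f64 { self.side * self.side }
}

fn main() {
    let shapes: Vec<Box<dyn Shape>> = vec![
        Box::new(Circle { r: 1.0 }),
        Box::new(Square { side: 2.0 }),
    ];
    // Callers see only the abstraction, never the implementations.
    let total: f64 = shapes.iter().map(|s| s.area()).sum();
    assert!(total > 4.0);
}
```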
The clear distinction between an abstraction itself (formally defined) and its implementations is the most important notion (realization). They must be clearly separated by a sort of mental abstraction barrier.
We create (properly generalize) so-called Concepts of the Mind from repeated observations, then we define mathematical abstractions from these concepts of the mind, and study and specify their properties. After that we partition (bucket-sort) these abstractions, using relations on common properties, into classes and what not.
The point is to “stay in mathematics” most of the time and to consider representations and implementations as an actual construction process.
So, we have just arrived at the principle stated by Leslie Lamport (and some others): mathematics is how one models the world and formally defines the concepts. Let’s see how it may work.
At the level of modeling (concrete mathematics) it is well understood that set theory can be considered the foundation of mathematics – all math could be built from it. The systems of logic show us how to construct valid new theorems based on previously proven results (more about this later).
At the level of programming (computing) the Lambda Calculus is the foundation, and everything computable could be computed within this system (formalism).
Both of these are way too general and have to be restricted with at least some typing to be actually useful. So, the question becomes: what is the appropriate typing?
The last significant advance was the realization that type-classes and traits (set-union and set-intersection, not proper subsets) are the most fundamental operations, and that this is the proper meaning of the “composition over inheritance” meme.
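In Rust this set-theoretic reading is literal syntax (the trait names below are illustrative): a bound `T: A + B` places `T` in the intersection of the set of types implementing `A` and the set implementing `B`, with no inheritance hierarchy involved.

```rust
trait Nameable { fn name(&self) -> String; }
trait Countable { fn count(&self) -> usize; }

struct Inventory { items: Vec<String> }

impl Nameable for Inventory {
    fn name(&self) -> String { "inventory".to_string() }
}
impl Countable for Inventory {
    fn count(&self) -> usize { self.items.len() }
}

// `T: Nameable + Countable` is set-intersection: T must lie in the
// set of Nameable types AND in the set of Countable types.
// Traits compose; nothing is inherited.
fn describe<T: Nameable + Countable>(t: &T) -> String {
    format!("{} holds {} items", t.name(), t.count())
}

fn main() {
    let inv = Inventory { items: vec!["a".into(), "b".into()] };
    assert_eq!(describe(&inv), "inventory holds 2 items");
}
```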
The most recent innovative and successful application of this principle – of necessary restrictions, and of a type-system having to match reality – was the formalization of the behavior of references and box-like imperative variables in Rust.
What is the next step? It seems to me that the concept of a Future or an Event is too general to be actually useful, and that a restrictive typing (partitioning) is necessary (and probably sufficient) to solve the current fucking horrendous mess.
I am not the first one to arrive at this, of course. Developing an adequate mathematical theory, and then an algebra, and even a small specialized calculus, is the only way to actually successfully tackle complexity. Leslie Lamport did this with TLA+; Martin Odersky did the DOT calculus and is now working on algebraic effects based on the notion of a capability.
I would argue that to solve concurrency we have to type the events and specialize the way too general futures (and events) into highly specialized partitions, each of which would have its own mini-theory and mini-algebra. The set-theoretic formalism will be good enough.
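A sketch of such partitioning in Rust (the event kinds and names here are purely illustrative assumptions): instead of one catch-all Event, each disjoint partition gets its own type and its own handler, and the compiler forbids mixing them.

```rust
// Instead of one over-general `Event`, partition events into
// disjoint, specialized types, each with its own small algebra.
#[derive(Debug, PartialEq)]
enum IoEvent { Readable(u32), Writable(u32) } // fd-based readiness

#[derive(Debug, PartialEq)]
enum Signal { Alarm, Interrupt }              // non-ignorable interrupts

// Each partition has its own handler; incompatible event kinds
// cannot be lumped together by construction.
fn handle_io(e: &IoEvent) -> &'static str {
    match e {
        IoEvent::Readable(_) => "read",
        IoEvent::Writable(_) => "write",
    }
}

fn handle_signal(s: &Signal) -> &'static str {
    match s {
        Signal::Alarm => "alarm",
        Signal::Interrupt => "interrupt",
    }
}

fn main() {
    assert_eq!(handle_io(&IoEvent::Readable(3)), "read");
    assert_eq!(handle_signal(&Signal::Alarm), "alarm");
}
```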
Okay, how? Well, just like everything else – begin with actual reality, with What Is (the way the hardware people do).
The key is the notion of composability, which is as central to mathematics as Set Theory itself – without proper and well-understood function composition there would be no advanced mathematics. The Category theorists got this (and only this) right – they have captured the universal principle of composition (and nesting). Or was it the Group theorists, formalizing the too abstract notions of relations and functions?
Anyway, everything has to compose. This is the ideal. Any model of streams (of events) which does not compose is just wrong.
How do we compose? Well, we have to form a Monoid over a composition operation (or operations). It is that simple. On the implementation side it has to be a Monad (again, nothing new here).
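A minimal sketch of the Monoid side in Rust (the trait and the choice of Vec-as-stream are illustrative): an associative combine with an identity, here instantiated for ordered sequences, where composition is concatenation.

```rust
// A Monoid: an associative binary operation plus an identity element.
trait Monoid {
    fn empty() -> Self;
    fn combine(self, other: Self) -> Self;
}

// Ordered sequences (a toy model of event streams) compose by
// concatenation, which is associative, with the empty stream as identity.
impl<T> Monoid for Vec<T> {
    fn empty() -> Self { Vec::new() }
    fn combine(mut self, mut other: Self) -> Self {
        self.append(&mut other);
        self
    }
}

fn main() {
    let a = vec![1, 2];
    let b = vec![3];
    let c = vec![4, 5];
    // Associativity: (a . b) . c == a . (b . c)
    let left = a.clone().combine(b.clone()).combine(c.clone());
    let right = a.combine(b.combine(c));
    assert_eq!(left, right);
    // Identity: x . empty == x
    assert_eq!(vec![9].combine(Vec::<i32>::empty()), vec![9]);
}
```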
What is the problem then? The problem is that we cannot sort out (bucket-sort) the values, so nothing actually composes, due to the leaking/exposure of incompatible implementation details which have to be hidden behind an abstraction barrier (which is what a Monad accomplishes).
Can we do better? Absolutely! This is where the notions of duck-typing and of composable traits will be useful. We have to form a monad over composition based on traits, not just a single set of values.
Traits have to be layered, to form specialized layers of DSLs, thus achieving vertical partitioning between layers of implementations. Layered DSLs are an old idea. The new idea is that they have to be based on traits, which can be traced back to proper generalizations of the fundamental mathematical notions (which, in turn, properly capture certain aspects of What Is).
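One way to sketch trait layering in Rust (the layer names and the loopback type are illustrative assumptions): each layer is a trait building only on the layer below it via a supertrait bound, so the DSL layers stay vertically partitioned.

```rust
// Layered DSLs as layered traits: each layer is a trait that builds
// only on the layer directly below it.
trait Transport {                // bottom layer: raw bytes
    fn send_bytes(&mut self, bytes: &[u8]);
}

trait Messaging: Transport {     // upper layer: framed messages
    fn send_message(&mut self, msg: &str) {
        let framed = format!("{}\n", msg);
        self.send_bytes(framed.as_bytes()); // only uses the layer below
    }
}

// A trivial transport that just records what was sent.
struct Loopback { log: Vec<u8> }

impl Transport for Loopback {
    fn send_bytes(&mut self, bytes: &[u8]) {
        self.log.extend_from_slice(bytes);
    }
}
impl Messaging for Loopback {}   // inherits the default layered behavior

fn main() {
    let mut lo = Loopback { log: Vec::new() };
    lo.send_message("hello");
    assert_eq!(lo.log, b"hello\n".to_vec());
}
```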
Now what? Well, re-read The Architecture of Complexity by Simon. There are universal patterns – hierarchical structures and layered systems everywhere. Another universal pattern, which was not mentioned there, is message passing (as universal as it gets, being used everywhere in biology) and the modern notion of packet-based communication.
The central notion, however, is still that of an ordered sequence and of a stream. Both of these are (must be) composable by definition.
In lazy pure functional languages like Haskell, the only way to ensure (establish) a particular order of evaluation is by explicit nesting of function calls. Function composition is just nesting, both in the implementation of (.) and of the Kleisli composition (<=<). This is the very beginning of so-called “structured concurrency”.
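The same “composition is nesting” point can be sketched in Rust (a hand-rolled analogue of Haskell's (.), not a standard library function): the order of evaluation is fixed purely by how the calls nest.

```rust
// Function composition is just nesting: compose(f, g)(x) == f(g(x)).
// A minimal Rust analogue of Haskell's (.), sketched with closures.
fn compose<A, B, C>(
    f: impl Fn(B) -> C,
    g: impl Fn(A) -> B,
) -> impl Fn(A) -> C {
    move |x| f(g(x))
}

fn main() {
    let inc = |x: i32| x + 1;
    let double = |x: i32| x * 2;
    // (double . inc)(3) = double(inc(3)) = 8 -- the evaluation
    // order is established by the nesting alone.
    let h = compose(double, inc);
    assert_eq!(h(3), 8);
}
```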
Yes, we just have to nest function calls so that everything composes. For that we had better have Monoids, and to actually form them we need a hierarchy of traits, with a Monoid at each level (over the composition operations).
What is the problem then? Over-generalization of the concepts of a Future or an Event. They don’t compose in principle. Partition (specialize) them properly, and everything will suddenly “click”. What an idiotic idea it was to lump “timers” and “buffers” together under a single concept, given that we should not have any time whatsoever (it should be just another “interrupt” – a particular kind of event which cannot be ignored – like the signal of an alarm clock, which literally abstracts out the time).
The remaining challenge is to come up with a proper partitioning of the events, but it looks like the hardware people have done this a thousand times. Their packet-streams compose, and so do their sequences.
Let some HN Chuds read this.