Look, ma, another top HN post in the “heavily policed safe space for mediocrity” – https://news.ycombinator.com/item?id=45074248. They are babbling about some “philosophy of programming”, or at least about how they think it should be.
Well, even before Atlas Shrugged there were “two kinds of philosophers” – those who could see (and explain) things as they [really] are, and those who mastered the art of sophistry and abstract bullshitting and can explain things as they want them to be. The latter, by the way, call this ability “creativity” (the word-marker of a midwit), and the whole shitshow – a “philosophy”.
The proper philosophy, however, tries to answer just one question: What Is? – and, for each sub-question about this or that particular aspect of “What Is” – Why are things the way they are?
There is a fundamental principle which unites and justifies the whole endeavor: most things are the way they are because this is the only way they can be – a “fixed point” has been reached by endless evolutionary trial-and-error, which selects the most stable (and, most of the time, most energy-efficient) intermediate forms among a vast number of possible ones. This is how Nature works (Is), and this is how things are in the human (social) world as well.
This, in turn, captures the abstract (but non-bullshit) notions of perfection (nothing more to take away or optimize) and quality (a measure of how close to perfection a thing is), and of elegance and even of beauty (which is the absence of non-uniformity and ugliness – of deformities and asymmetry in general).
In biology, perfection is the state of being optimal for a given purpose (subject to the “physical” constraints of the environment), and quality is the degree of closeness to this optimal state. The optimal state is the one which is most stable and most energy-efficient for the given purpose. Unsurprisingly, these are also the hallmarks of excellence in human engineering and design.
And this, by the way, justifies and explains the “elegance is not optional” [rare, non-bullshit] CS maxim.
Again, mere juggling of abstractions and sophisticated handwaving is not “philosophy”, but sophistry and socially constructed bullshit.
To understand the world (and thus to master programming, which is the only way) one has to master mathematics (the study of properly captured and abstracted-out generalized notions – “objects” – and their properties) and biology (the study of stable-enough-to-reproduce, vastly complex systems which spontaneously emerged from the continuous “shuffling” of atoms in a particularly constrained physical environment).
Since DNA and the composition of the 20 base amino acids were discovered, and the underlying principles intuitively understood, smart people have realized that not all arrangements of atoms (or amino acids) are “equal”. Moreover, some mathematicians, but mostly computer scientists, intuitively realized that there are “universal” arrangements, such as sequences, trees, DAGs and lookup tables.
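To make this concrete, here is a minimal Haskell sketch of those four “universal” arrangements as plain type declarations (the names Seq, Table and Dag are illustrative, not from any particular library):

```haskell
-- A sequence: the ordinary cons-list, the simplest linear arrangement.
data Seq a = SNil | SCons a (Seq a)

-- A (binary) tree: hierarchical, nested partitioning.
data Tree a = Leaf | Node (Tree a) a (Tree a)

-- A lookup table, in its simplest form: an association list of pairs.
type Table k v = [(k, v)]

-- A DAG, as an adjacency table mapping node ids to successor ids.
type Dag = Table Int [Int]
```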
The early LISP tradition and then the MIT Scheme school were the first attempts to capture this intuition and the uniformity of the underlying structures as the essence of programming. The LISP culture focused mostly on the “forms” of the data and on the uniform structure which underlies both code and data, even to the point of claiming that they are the same forms (just as, in biology, the enzymes and the structural proteins they make and transform are built from the same chained amino acids).
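A toy sketch of that LISP insight (in Haskell, for uniformity with the later examples; the evaluator is hypothetical and deliberately minimal): one uniform form, the S-expression, is data to the reader and code to the evaluator:

```haskell
-- One uniform form for both code and data.
data SExpr = Atom String | List [SExpr]
  deriving Show

-- As a value, this is just data; to the evaluator below, it is code.
program :: SExpr
program = List [Atom "+", Atom "1", Atom "2"]

-- A toy evaluator: the same form, now treated as code.
eval :: SExpr -> Int
eval (Atom s)               = read s
eval (List (Atom "+" : xs)) = sum (map eval xs)
eval _                      = error "unknown form"

main :: IO ()
main = print (eval program)   -- 3
```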
The early FP tradition (ML, Miranda, Haskell) was focused on the mathematical properties of functions and their composition, and on the “purity” of functions, and less on the data structures, postulating that “lists” are good enough for everything – to the extent of developing a special syntax for constructing and manipulating them.
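That special list syntax, and what it desugars to (a Haskell sketch; the function names are illustrative):

```haskell
-- The bracket syntax is sugar for chained cons cells.
xs :: [Int]
xs = [1, 2, 3]               -- sugar for 1 : 2 : 3 : []

-- Comprehension syntax for constructing and manipulating lists.
evensSquared :: [Int] -> [Int]
evensSquared ys = [y * y | y <- ys, even y]

main :: IO ()
main = print (evensSquared xs)   -- [4]
```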
The beginning of the non-bullshit “philosophy” of programming is the realization that just a few carefully selected “forms” are “enough for everything”. It is “the same” (in principle) in math, logic, biology and CS.
Why is this so? (an intermediate non-bullshit philosophy, if you will). Because the “forms” are “fixed points” not just of the evolutionary process, but “captures” of the underlying universal notions which shaped the Universe itself (and no, it is not “numbers” or even “lines”) – and they are stable and energy-efficient. The “forms” are the “optimal” arrangements of atoms (or amino acids) for a given purpose, and thus the most likely to re-emerge and persist.
Both Nature’s biology and human mathematics and CS rely on and try to capture these “forms” and their properties, and use them to build more complex structures (organisms or programs) which are also stable and “energy efficient”. This is the very beginning of the non-bullshit “philosophy” of programming.
There is a lot more to it, of course, but none of it is abstract bullshit or socially accepted handwaving. Everything is grounded in, and can be (and must be) traced back to, “What Is” and “Why things are the way they are”.
At a higher level, above “the primitives”, there are what we call “patterns”, which re-emerge again and again at all levels. It is the intuitive understanding of these recurring patterns which gave rise to the “SICP” sub-culture – a bit of a premature ejaculation, if you ask me, but in the right direction. The more sane and systematic approach was that of Barbara Liskov, with her “data abstraction” and “abstract data types”, which are better, proper “captures” of the universal patterns – but the popular culture has chosen SICP.
The universal principles captured so far: “stable intermediate forms at all levels”, “the hierarchical, layered structure of complexity”, and “the necessary (required) one-to-one correspondence between the inherent layered structure of complexity and the layered structure of the code which tries to properly capture and mathematically model it”.
This has been intuitively understood, and “emerged” in the structure of the standard libraries of the classic programming languages, such as early LISPs (even before Common Lisp), SML (even before OCaml) and R5RS Scheme (which, again, neglected modules and proper ADTs due to way too much excitement). All the universal data forms (sequences, trees, DAGs, hash tables) are there, and the standard libraries are layered to reflect the inherent layered structure of complexity of the universal problems they solve. This is, of course, not a random coincidence.
Another level up – Abstract Data Types, [proper] Abstraction Barriers and Abstract Interfaces. The underlying universal notions are those of “partitioning” and of “nesting” – of cell membranes, “tubes”, whole organs, specialized tissues.
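A minimal sketch of such an abstraction barrier in Haskell (the Stack module and its operations are illustrative): the module exports the type abstractly, without its constructor, so clients can never see – or break – the representation:

```haskell
-- The export list is the abstraction barrier: Stack is exported
-- without its constructor, so the representation stays hidden.
module Stack (Stack, empty, push, pop) where

newtype Stack a = Stack [a]   -- the hidden representation

empty :: Stack a
empty = Stack []

push :: a -> Stack a -> Stack a
push x (Stack xs) = Stack (x : xs)

pop :: Stack a -> Maybe (a, Stack a)
pop (Stack [])       = Nothing
pop (Stack (x : xs)) = Just (x, Stack xs)
```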
At the level of sub-systems and systems, the universal notions of “separation of concerns”, “encapsulation”, “modularity”, “hierarchical layering” and the necessary “signaling” and communication “emerge”. There are essentially two kinds of signaling – electrical, via “wires”, and chemical, via “structured messages” (or “packets”). The former is fast, but requires a lot of energy and is not very flexible. The latter is slow, but very energy-efficient and very flexible.
It is not a random coincidence that the Internet is packet-based message-passing. The smartest guys behind Erlang intuitively understood these aspects – proper isolation (partitioning) and structured message-passing, without any hidden imperative state.
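A toy sketch of that style in Haskell (channels standing in for Erlang mailboxes; the Msg protocol is hypothetical): an isolated worker which shares no state with its clients and communicates only via structured messages:

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.Chan (Chan, newChan, readChan, writeChan)

-- A structured "packet": the request carries its own reply channel.
data Msg = Ping (Chan String)

-- An isolated worker: no shared mutable state, only message-passing.
worker :: Chan Msg -> IO ()
worker inbox = do
  Ping replyTo <- readChan inbox
  writeChan replyTo "pong"
  worker inbox                  -- loop; all "state" is in the messages

main :: IO ()
main = do
  inbox <- newChan
  _ <- forkIO (worker inbox)
  reply <- newChan
  writeChan inbox (Ping reply)
  readChan reply >>= putStrLn   -- prints "pong"
```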
Notice that even as we go higher in the abstraction hierarchies, everything can still be traced back to What Is – and this is what constitutes a non-bullshit “philosophy” of programming.
Lots of seemingly difficult questions can be answered in this way, without any abstract handwaving and sophistry. Why is immutable data necessary? Because it is the only way to have a stable intermediate form which can be shared without unintended side-effects. Why are pure functions necessary? Because they are the only way to have a stable intermediate form of computation which can be reasoned about mathematically, and thus composed into more complex forms.
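Both answers in a few lines of Haskell (all names illustrative):

```haskell
-- Immutable data can be shared freely: "modifying" it produces a
-- new value, and every other holder of `base` is unaffected.
base :: [Int]
base = [1, 2, 3]

extended :: [Int]
extended = 0 : base           -- shares base's cells, changes nothing

-- Pure functions are stable forms of computation: they can be
-- reasoned about algebraically and composed into larger forms.
double, inc :: Int -> Int
double = (* 2)
inc    = (+ 1)

pipeline :: [Int] -> [Int]
pipeline = map (double . inc) -- composition: a more complex stable form

main :: IO ()
main = print (pipeline base, base)   -- ([4,6,8],[1,2,3])
```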
Why must “imperative shared state” be avoided (and, where it cannot be, clearly separated behind an impenetrable abstraction barrier, like that of a Monad)? Because the real world is inherently stateful, and this is the only way to capture that statefulness while keeping it properly isolated and encapsulated, preventing unintended side-effects.
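A minimal sketch of such a barrier, using the State monad from the mtl package (the counter is illustrative): the state is threaded explicitly inside the monad and never leaks out as shared mutable storage:

```haskell
import Control.Monad.State (State, get, put, runState)

-- The "state" exists only inside the State monad's barrier;
-- outside of runState there is nothing mutable to share.
tick :: State Int Int
tick = do
  n <- get
  put (n + 1)
  return n

main :: IO ()
main = print (runState (tick >> tick) 0)   -- (1,2)
```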
Why is “proper, total process isolation” necessary? Because the real world is inherently concurrent, and total isolation is the only way to capture that concurrency without unintended interference between the concurrent parts.
The languages based on discrete mathematics, researched by math majors (and sometimes physicists), have just a few carefully selected “means” to properly capture the complexity of the real world in layered, partitioned systems which mimic the underlying structure of complexity, using just a few “universal forms” (sequences, trees, DAGs, hash tables) and a few “universal operations” (composition, recursion, higher-order functions, pattern-matching).
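All four “universal operations” in one short Haskell sketch (the tree and its fold are illustrative): pattern-matching and recursion define a fold over a tree, and higher-order functions plus composition build concrete traversals from it:

```haskell
data Tree a = Leaf | Node (Tree a) a (Tree a)

-- Pattern-matching and recursion, packaged as a higher-order fold.
foldTree :: (b -> a -> b -> b) -> b -> Tree a -> b
foldTree _ z Leaf         = z
foldTree f z (Node l x r) = f (foldTree f z l) x (foldTree f z r)

-- Higher-order use: concrete operations fall out of the fold.
sumTree :: Tree Int -> Int
sumTree = foldTree (\l x r -> l + x + r) 0

-- Composition: combine the pieces into a larger stable form.
doubledSum :: Tree Int -> Int
doubledSum = (* 2) . sumTree

main :: IO ()
main = do
  let t = Node (Node Leaf 1 Leaf) 2 (Node Leaf 3 Leaf)
  print (sumTree t, doubledSum t)   -- (6,12)
```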
At a higher level, where the patterns emerge, the principles outlined by Michael Jackson and Barbara Liskov – abstraction by parameterization, abstraction barriers, abstract data types, separation of concerns, modularity, hierarchical layering, proper signaling and communication – are the “means” to structure the solution.
Parameterization is universal – it applies not just to functions, but also to algebraic data types (generic types) and to whole modules (functors).
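A sketch of those three levels in Haskell, where a type class stands in for an ML-style functor (all names illustrative):

```haskell
-- Parameterized over a value: an ordinary higher-order function.
twice :: (a -> a) -> a -> a
twice f = f . f

-- Parameterized over types: a generic algebraic data type.
data Pair a b = Pair a b

-- Parameterized over a whole interface: a type class here plays
-- the role an ML functor argument (a module signature) would.
class Container f where
  cmap :: (a -> b) -> f a -> f b

instance Container [] where
  cmap = map

main :: IO ()
main = print (twice (+ 1) 0, cmap show [1 :: Int, 2])   -- (2,["1","2"])
```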
Abstraction barriers and abstract data types are universal – they are the only way to properly capture the inherently stateful nature of the real world, but they must be implemented (and passed around) as immutable, and thus stateless, structured data bindings, to prevent unintended side-effects.
Separation of concerns, modularity and hierarchical layering are universal, and re-emerge at all levels of complexity.
At the highest and most abstract level, Algebraic Data Types and composable, high-level, stateless Abstract Interfaces (iterators and other declarative abstractions) are the “high-level building blocks”. This is the essence of the proper “philosophy” of programming; everything else is just socially constructed bullshit.
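A sketch of such a composable, stateless, declarative pipeline in Haskell (the data and names are illustrative) – no index variables, no mutation, just composed transformations over a sequence:

```haskell
import Data.Char (toUpper)

-- A declarative "iterator" pipeline: each stage is a stateless,
-- composable transformation; no loops, no mutable cursors.
pipeline :: [String] -> [String]
pipeline = map (map toUpper) . filter ((> 3) . length) . take 10

main :: IO ()
main = mapM_ putStrLn (pipeline ["ant", "bison", "cat", "dingo"])
-- BISON
-- DINGO
```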
Now a few examples: WhatsApp, an Erlang server-side application built by just 10 or so smart guys. Clojure, which was bold enough to actually rely on immutable data structures (instead of crappy imperative loops). And, especially, Haskell, which is still an executable system of higher-order pure logic (System F Omega), augmented with advanced types and properly captured abstractions built on top of the universal forms and operations.
Everything can be traced back to “What Is” and “Why things are the way they are”, and nothing here is abstract handwaving or sophistry.
And, of course, the whole “AI” bullshit (which operates at the level of mere syntactic forms, without any underlying understanding whatsoever) is just a desperate attempt to avoid the hard work of properly capturing the complexity of the real world in layered, partitioned systems which mimic the underlying structure of complexity, using just a few “universal forms” (sequences, trees, DAGs, hash tables) and a few “universal operations” (composition, recursion, higher-order functions, pattern-matching).
And yes, “cognitive load” builds up as the direct consequence of unnecessary, redundant abstractions; of mismatches in the one-to-one correspondence between the layers and individual concepts of the problem domain and their representation, at the highest possible level of abstraction, in the code; and of using any form of stateful imperative crap. All this – besides the fact that imperative crap cannot be cognitively scaled in principle (it will inevitably collapse under its own weight, just like any large-enough abstract bullshitting) – was well understood all the way back in the Golden Age of Programming (the 1970s, 1980s and early 1990s).
Yes, it is possible to build a very large, stable system, such as Google Chrome, in C++ (a whole bullshit ideology actually implemented as vastly complex, inherently buggy imperative code), but it requires billions of paid man-hours across decades of effort.