There are lots of literal degens (undeveloped, unrefined, lacking any sense of beauty and elegance) who keep parroting the stupid mantra “syntax does not matter” or “syntax is not important”, or “it is a matter of individual preference”. Idiots.

There is a universal human notion which we call [careful and even obsessive] “attention to detail”. In certain cultures this trait is very prominent, and one can tell. Every neurologically normal person could name that particular country. Other countries had either brief periods of such “enlightenment” in their past, which can still be observed in traditional architecture and art forms, or have some occupations in which attention to detail is paramount, like the long-lost German automobile engineering of the 70s and 80s.

Such cultures, unsurprisingly, are considered the most civilized and refined, and exhibit all the hallmarks of a civilized, peaceful and advanced society.

The same considerations apply to intellectual endeavors which can be seen as forms of art, such as mathematics and math-based functional programming.

But how do such details evolve, come into existence and become parts of a tradition? The answer is almost universal – by endless trial and error, together with feedback loops from a community (environment).

This is how the current mathematical notation evolved. What most people just brush away as unimportant is actually a very deep and profound human achievement. It has [partial] consistency (infix), uniformity (ratios), orthogonality (ratios again) and even universality (use of parentheses). One cannot easily come up with such a combination of complementary features; it can only be gradually evolved, or, indeed, discovered.

So, how do we come up with good, almost universal (like the current mathematical notation) syntactic forms for programming? Well, we just do again (without shame) what the ancient sages did – they took the most used and universally agreed-upon parts of the traditional mathematical notation, and augmented it (mostly making it too verbose).

What happened next is that a whole committee of math majors, some of whom had developed their own petty functional languages, designed the almost perfect (well, mostly right) language called Haskell. In Haskell the most “universal” syntactic forms have been captured and “perfected” (minimized). All we have to do is, well, just do this again.

There is an important point to realize. If we look at the whole set of ancient traditions, which includes ISWIM, APL, Algol, CLU, Prolog, Erlang, SML, Miranda, Common Lisp, Scheme and even SHEN, the C programming language tradition is not just the canonical case of the “Worse Is Better” social pattern, it is an obvious enshittification. As cultured persons we won’t even consider C++.

So, what universal mathematical notions have the Haskell syntactic forms captured just right? Notice that by “just right” we mean that there is literally nothing more to talk about, like using parentheses for grouping and implicit nesting.

High-level principles

The notion of overloading is to have more than one meaning associated with a word (a term) depending on an implicit context.

Overloading of symbols is hard to comprehend, so use specialized embedded DSLs as distinct “contexts”.

This is the reason why toplevel type-signatures shall be on their own line, written in their own embedded DSL.

In general, the mathematical notation is “flat”, so putting toplevel annotations on a previous line is a major innovation to unclutter the syntactic forms.

Symbols shall be used systematically, according to the “level of universality”, which just means how accurately they capture the associated notion. The classic example is the use of mere juxtaposition for function application, which “naturally” generalizes to currying (multi-argument functions). Leaving the parentheses alone was a major win of de-cluttering (and un-overloading).
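A minimal Haskell sketch of this point (the function names are illustrative, not from any library):

```haskell
-- Function application is bare juxtaposition: no parentheses, no commas.
add :: Int -> Int -> Int
add x y = x + y

-- Juxtaposition generalizes naturally to currying:
-- `add 1` is itself a function, awaiting the second argument.
increment :: Int -> Int
increment = add 1

main :: IO ()
main = print (increment 41)  -- prints 42
```

Parentheses here are left to do only one job – grouping – instead of being overloaded as application syntax.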

Similarly, the most minimal syntax shall be used for pattern bindings, as Erlang and SML do – patterns are uniformly everywhere [a binding occurs]. Function clauses defined by pattern-matching simply follow. The same syntactic forms shall be reused in complex expressions (case, match, etc.), one being a specialization of the other (function clauses being rewritten as case expressions).
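A small Haskell sketch of clauses being a specialization of a case expression (`len` is an illustrative name):

```haskell
-- Clauses defined by pattern matching, one per shape of the argument...
len :: [a] -> Int
len []       = 0
len (_ : xs) = 1 + len xs

-- ...are just sugar for a single case expression on the same patterns.
len' :: [a] -> Int
len' ys = case ys of
  []       -> 0
  (_ : xs) -> 1 + len' xs
```

The two definitions are interchangeable; the clause form merely reuses the same pattern syntax at the top level.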

Scala 3, with its systematic attempts to de-clutter the syntax by using juxtaposition for single-argument methods, is a sign that its authors have good taste.

The symbols universally agreed upon

The symbols and the properly captured notions

  • = equality, being the same “thing”
  • -> transformation and a distinct step, a mapping. In expressions with clauses it captures the notion of an implicit partial function, an implicit lambda.
  • | partition, a sum-type and a clause (a partial function). The UNIX use of it as a pipelining operator is weird.
  • , a product [type] in Haskell, instead of * as in math, SML and OCaml
  • () nesting, explicit ordering
  • . (the dot) as the function composition operator. Shines when used for chaining. Implemented as the [more universal notion of] nesting of function calls.
  • ! a command, a distinct linguistic form, an explicit “mark”
  • ? a question, a distinct linguistic form which demands an answer, an explicit “mark”
  • ' a quote and a prime, as in x'

Combinations

  • <= and >=
  • := assignment, nothing to talk about
  • == an equality predicate, same
  • != with the implicit notion of ! being a logical negation (NOT); /= is maybe a more direct “rendering”
  • x.y member accessors. This is so pervasive that nothing can be done.
  • \x a specially quoted character. This is just a stroke of luck.

The fact that algebraic sum-types and the partial functions (clauses) defined on each tag (data-constructor) match perfectly is more than just a random coincidence; it intuitively captures a universal pattern.
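A sketch of this correspondence in Haskell (the `Shape` type is an illustrative example):

```haskell
-- A sum type: each | introduces a tag (data constructor).
data Shape
  = Circle Double
  | Rect Double Double

-- One clause per tag: the partial functions defined on each
-- constructor together form a total function on the whole sum.
area :: Shape -> Double
area (Circle r) = pi * r * r
area (Rect w h) = w * h
```

The `|` that partitions the type and the clauses that partition the function mirror each other exactly.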

Haskell’s use of (,) for both the type- and the data-constructor, and of (->) as an infix, partially-applicable type-constructor (with “sections”) is a stroke of genius in a zealous pursuit of uniformity.
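A small sketch of this uniformity (the names `pairValue` and `double` are illustrative):

```haskell
-- (,) is both a type constructor and a data constructor:
pairValue :: (,) Int Bool     -- the same type as (Int, Bool)
pairValue = (,) 1 True        -- the same value as (1, True)

-- (->) is an ordinary, partially-applicable type constructor:
-- `(->) Int` is a type-level function awaiting a result type.
double :: (->) Int Int        -- the same type as Int -> Int
double = (* 2)                -- an operator section, partially applying (*)
```

The same prefix/infix machinery works uniformly at the value level and at the type level.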

Maybe, just maybe, the (*) syntactic form for a product would be more consistent, but the comma captures our notion of enumeration.

Scheme did the “marks” just right – ? for a predicate, and ! for a command (a procedure). These are just naming conventions, but they shall be enforced in the syntax.

Using ? as a chaining monadic operator is a wrong idea, while having syntactic sugar for such implicit operators is a great one.

LISP programmers tried to introduce the (! xxx) and (? xxx) forms, but they are strikingly inferior.

Controversial

  • x <- <block> a common (very general) idiom in many languages, including Haskell; it usually introduces a new variable (or a binding)
  • x -> <block> a standard idiom for receiving from a channel, similar to the one in Go.

The notions of a “source” and a “sink” are universal, and there are the corresponding “endpoints” (of a channel), so they had better be properly captured in the syntax.

The do-notation in Haskell and for-comprehensions of Scala3 indicate the right direction.

In Haskell, within a so-called do-block, the syntactic form a <- e denotes “receiving a value from a given (particular) context”. This is an intuitive attempt to capture a very general, universal notion.

This is a very serious topic. It seems that we should have a standardized syntactic sugar which desugars into a particular nesting of monadic combinators (lifting and composition), the way Haskell does impure code and F# does async stuff.
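A sketch of such desugaring, using the Maybe context so the effect is observable (`safeDiv` and `calc` are illustrative names):

```haskell
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

-- A do-block in the Maybe context...
calc :: Int -> Maybe Int
calc n = do
  a <- safeDiv 100 n
  b <- safeDiv a 2
  pure (a + b)

-- ...desugars into a particular nesting of monadic combinators:
calc' :: Int -> Maybe Int
calc' n = safeDiv 100 n >>= \a -> safeDiv a 2 >>= \b -> pure (a + b)
```

The sugar hides the plumbing (here: short-circuiting on Nothing) while the combinators carry the actual semantics.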

This is the right way to properly design and implement specialized embedded DSLs.

But there is more. The imperative operations of “send” and “receive” can be mapped to monadic operations too. The generalized notion of lifting (into a given context) is “isomorphic” to sending (whatever that means), and taking a value from the evaluation of “an action within a given context” is isomorphic to “receive”.

Erlang, being way ahead of its time, does a structural pattern-matching on receive as a fundamental language construct (idiom). Defining all the clauses explicitly (and declaratively, like in a case-analysis) even turns it into a proper expression, similar to match or case. It is, indeed, a match expression.

What they overlooked is the “underlying” monadic structure. For “sending” it is just “lifting” and “forgetting about it”, which is just like putChar :: Char -> IO (). The corresponding getChar :: IO Char shall not be a function but a full-blown pattern-matching expression, as in Erlang.

The proper design would be to have an “imperative-style” syntactic sugar of -> for receive (into a block) that desugars into a proper pattern-matching expression, still within an implicit monadic context.

The <- form introduces a new variable binding for the value taken from a given monadic context, which can be a “receive” from a channel – just a particular specialization – as it is in Haskell’s do-notation.
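Haskell already hints at the Erlang-style combination: the left side of <- may itself be a pattern, so the “received” value is matched structurally (the `firstTwo` example is illustrative, relying on the standard MonadFail behavior of Maybe):

```haskell
-- In a do-block, the left side of <- may be a full pattern;
-- a failed match in the Maybe context yields Nothing.
firstTwo :: [Int] -> Maybe (Int, Int)
firstTwo xs = do
  (a : b : _) <- Just xs   -- pattern-match on the value taken from the context
  pure (a, b)
```

This is receive-as-pattern-matching in miniature: binding and case analysis happen in one syntactic form, inside a monadic context.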

OK, I have already gone a bit too far.

Obsessive attention to the minutest details in an art form

Trying to move from more “universal” (general) to specific

Parentheses

Using parentheses to override the default operator precedence, with a notion of an implicit nesting, as in function composition.

\((2+3)*4\) is not the same as \(2+3*4\)

This means that the expression inside the parentheses is nested [within] the outer expression and has to be evaluated first.

This is the traditional way to establish an explicit order of evaluation (overriding the order defined by operator precedence).

LISP championed the use of parentheses to explicitly define nested structures, which is obviously a related and just more general notion.

(+)
(+ 2 3)
(let ((a 1)) (+ a a))

Apostrophe

\[x' x''\]

This is where the art and elegance shine – the use of a single quote ' (till the end of a parenthesized expression).

'()

Common Lisp uses a “back-quote” for a special kind of quotation within a macro, together with a “matching” comma , as an “unquote” (an inverse operation).

This is a manifestation of minimalism and subtlety, which are defining characteristics of an art form.

Backslash in LaTeX

This is another stroke of genius – writing \dot or \hat or \tilde as a mnemonic mapping to a Unicode symbol (with a particular numerical code point) is fantastic.

Notice that the form \dot\ (with a closing delimiter) loses all the elegance and appeal. The “till the end of a literal” rule is good enough. Again, the same goes for the LISP quoting.

Compare this approach to the sheer stupidity of what Microsoft does – memorizing meaningless numbers and then pressing Alt-x. It is exactly such small details that distinguish an art from crap.

Traditional mathematical notation

The traditional symbol to denote equality (and occasional reasoning): \[=\] Notice that this denotes a statement of fact, not an overloaded predicate.

“maps to” (an arrow) \[->\]

\[+ - * / < <= > >=\] There is nothing more to talk about. These symbols capture the underlying universal mathematical notions perfectly. Intuitively, one sees “less-than-or-equal” as being captured in these symbols.

Haskell

juxtaposition for function application

parentheses for explicit nesting (for infix-to-prefix transforms and, naturally, for sections)

(.) f g = \x -> f (g x)

the minimalist lambda notation

\x -> x + 1

universal bindings

f = \x -> x + 1

The use of the equality symbol for defining a binding (within the current environment)

universal pattern-matching

currying and partial application

\x -> \y -> x + y

There is already a lot going on here, a lot of “right things” – nested closures – x is captured, implicit lexical scoping

sections (the syntax for partially applied infix binary operators)
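A quick sketch of sections (the names `halve` and `fromTen` are illustrative):

```haskell
-- A section is an infix operator with one side supplied;
-- the result is a function of the missing side.
halve :: Double -> Double
halve = (/ 2)        -- right section: \x -> x / 2

fromTen :: Double -> Double
fromTen = (10 -)     -- left section: \x -> 10 - x
```

Note the asymmetry matters for non-commutative operators: `(/ 2)` and `(2 /)` are different functions.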

syntactic sugar for multiple-argument functions, which desugars to nested lambdas: (+) x y = x + y

At a higher level, Haskell got right the type annotations on a separate line, using a specialized DSL; this is what Scala should do to clear up the mess.

Erlang universal pattern-matching

SML function clauses