So I compiled that zed “editor” thing everyone is talking about, and I carefully watched the shilling video. Conclusion? Everything is fucked up beyond repair.
Why, yes, I understand: it is a project trying to compete with Cursor, but with its own half-assed “vscode” written in Rust slop (as a wrapper around WebRTC, lmao).
It compiles hundreds of crates into something that looks like vscode, sacrificing any safety guarantees (simply because the underlying C++ code is inherently unsafe and imperative, even with -fno-exceptions -fno-rtti). “Written in Rust” is thus just a “marketing” meme.
Why, I understand that re-using half of Chromium – “not reinventing the wheel” (meme) – is a “smart” decision (yes, Chromium has already implemented all the unicode- and rendering-related stuff in a crappy imperative OO fashion, utilizing millions of man-hours). There is no better strategy for pushing out something “viable” as fast as possible.
The primary cause of [our] suffering is still Ignorance. Modern “vape-coders” have no idea how to design a system, or why. The “ancient sages” of the LISP, ML and UNIX systems had some decent mathematical background (sometimes not deep enough, which is why we have 0x0 for NULL, for False and for the End-Of-ASCII-String marker, and why even fucking Java has nulls and aliasing bugs).
Those well-educated people, constrained by the severe hardware limitations of the 70s and 80s, designed systems so amazingly good that they are worth studying even today, 50 years later.
Modern degens just dismiss them as “boomer technology”, as if something becomes less brilliant, less “done right” and less amazing just because it is “old” (is Penicillin old? a V8 engine?).
Okay, the old system designers, the best ones, were heavily influenced not just by the findings of the abstract algebra of the time, but also by the “explosion” in the field of molecular biology after the Watson and Crick discovery. Modern degens are just ignorant of everything that isn’t currently trending on social media.
By virtue of thinking about the most complex system that has ever existed (whose complexity is so staggering that we cannot even comprehend it fully), they gained an intuitive knowledge of what kind of structure a complex system should have, and on which universal principles it should be built. The “structure” is, of course, a hierarchy of distinct layers of non-leaking “black-box” (or cell-like) abstractions, and the main principles are the (closely related) separation of concerns and “do just one thing, but do it well” (which captures the universal notion of specialization in cell Biology).
Sussman (the great wizard) even wrote his last book about the application of these principles.
At the highest level of abstraction there are the [properly captured] universal notions of “nesting” (which is how composition “works”) and of an “abstraction barrier”, which underlies any abstract “interface” (and corresponds to a cell or organelle membrane), etc.
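To make the “barrier” concrete, here is a minimal sketch in plain Rust (the Rat type and its helpers are mine, in the SICP style, not from any crate): the representation of a rational number is hidden behind a membrane of a constructor and selectors, so code above the barrier can never depend on it.

```rust
// An abstraction barrier: the representation of a rational number
// is private to the module; clients see only the constructor and
// the selectors, never the two i64 fields inside.
mod rat {
    pub struct Rat { n: i64, d: i64 } // private fields: the "membrane"

    fn gcd(a: i64, b: i64) -> i64 {
        if b == 0 { a.abs() } else { gcd(b, a % b) }
    }

    // Normalization is a concern handled exactly once, inside the barrier.
    pub fn make(n: i64, d: i64) -> Rat {
        let g = gcd(n, d);
        Rat { n: n / g, d: d / g }
    }

    pub fn numer(r: &Rat) -> i64 { r.n }
    pub fn denom(r: &Rat) -> i64 { r.d }

    // Code above the barrier speaks only the interface.
    pub fn add(a: &Rat, b: &Rat) -> Rat {
        make(numer(a) * denom(b) + numer(b) * denom(a),
             denom(a) * denom(b))
    }
}

fn main() {
    let r = rat::add(&rat::make(1, 2), &rat::make(1, 3));
    println!("{}/{}", rat::numer(&r), rat::denom(&r)); // prints "5/6"
}
```

If the representation changes (say, to a normalized struct with an invariant field), nothing outside the module moves – that is the whole point of the membrane.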
So they made UNIX, Emacs, and then (much later) X11, which ignorant millennials used to shit on in low-iq “technical” forums. Another example is TeX (and LaTeX), which was designed from first principles (of the nested, layered structure of a written language) and thus has almost zero “technical debt” (the kind of idiotic and ignorant design decisions that Ethereum or PHP were built upon).
Their text editors were good at just editing text (later – structured/nested text); the concerns of how that text will be displayed were separated out and delegated to terminals, which, later, in turn delegated the concerns about fonts and rendering to the corresponding libraries of the layered X11 stack. Even today we could re-implement font rendering with freetype (and now harfbuzz and what not), and all the X11 applications would get all the benefits.
The concern of “which fonts to use” has been uniformly delegated to the Xresources subsystem, along with a name-based (table-like) uniform configuration framework. So many things have been done right that no one even noticed (as with a perfectly engineered Lexus LS400 or any similar marvel).
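For the record, this is roughly what that uniform, name-based delegation looks like in an ~/.Xresources file (the resource names are the ones xterm and Emacs actually read; the values here are just illustrative):

```
! Fonts are chosen once, in one place; every client that follows
! the convention picks them up uniformly, by name.
XTerm*faceName: DejaVu Sans Mono
XTerm*faceSize: 11
Emacs.font: DejaVu Sans Mono-11
```

No application ships its own font-picker dialog; they all query the same table.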
Here is something to realize (again). A hardware terminal, which implements communication and rendering of streams of ASCII characters (as strings), is still the gold standard and is still good enough for editing text files (yes, yes, I know, Unicode and shieet, but pay attention to what I am saying). If, unlike Emacs or [Neo]Vi[m], your editor is not fully functional in a terminal (or over ssh), you have ruined something very rare and precious, without even realizing it.
Try to understand that the modern “protocols”, such as the Language Server Protocol and now MCP, are (properly) expression-level ones, where expressions are still [unfortunately unicode] strings. No notions of “fonts” or “rendering” are “Out There”.
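To see that nothing but strings is “Out There”, here is a sketch of the LSP wire framing in plain Rust (the frame helper is mine, but the framing itself is what the protocol specifies: a Content-Length header counting the bytes of a JSON-RPC body, sent over stdio):

```rust
// A Language Server Protocol message is nothing but a string:
// a Content-Length header plus a JSON-RPC body. No fonts, no
// rendering -- just expressions serialized as text.
fn frame(body: &str) -> String {
    // LSP's Content-Length counts the bytes of the body.
    format!("Content-Length: {}\r\n\r\n{}", body.len(), body)
}

fn main() {
    let msg = frame(r#"{"jsonrpc":"2.0","id":1,"method":"shutdown"}"#);
    print!("{}", msg);
}
```

Everything an editor “knows” about your code arrives through frames like this one; the rendering concern lives entirely on the editor's side of the barrier.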
As for expressions, it has to be realized that they come in a few distinct forms – sum-types, product-types, function-types and special [syntactic] forms – which are the basis not just of a proper, sound typing discipline, but of proper syntax highlighting (not of the imperative keywords or imperative looping and indexing constructs). Microsoft failed to build a proper protocol, an engineering marvel, because they were concerned with the imperative crap first and foremost, so they missed the unification. MCP is the same kind of imperative, syntax-level (not underlying structural semantics) crap.
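A sketch of what a structure-first view buys you, in plain Rust (the toy Expr and the role names are mine): the few forms are captured as one nested ADT, and “highlighting” is keyed on the structural role of a node, not on keyword matching:

```rust
// The few forms expressions actually come in, as one nested ADT.
#[allow(dead_code)]
enum Expr {
    Lit(i64),                            // leaf
    Pair(Box<Expr>, Box<Expr>),          // product type: both parts present
    InL(Box<Expr>),                      // sum type: exactly one
    InR(Box<Expr>),                      //   alternative is present
    Lam(String, Box<Expr>),              // function type: an abstraction
    If(Box<Expr>, Box<Expr>, Box<Expr>), // special [syntactic] form
}

// "Highlighting" by structural role, not by keyword.
fn role(e: &Expr) -> &'static str {
    match e {
        Expr::Lit(_) => "literal",
        Expr::Pair(_, _) => "product",
        Expr::InL(_) | Expr::InR(_) => "sum",
        Expr::Lam(_, _) => "function",
        Expr::If(_, _, _) => "special-form",
    }
}

fn main() {
    let e = Expr::Pair(Box::new(Expr::Lit(1)), Box::new(Expr::Lit(2)));
    println!("{}", role(&e)); // prints "product"
}
```

A protocol built on such a structure would carry the semantics of the code, and every tool downstream (highlighter, type-checker, completer) would consume the same ADT instead of re-parsing strings.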
Anyway, the Emacs sages got it (partially) right by putting the structure of the LISP code in the spotlight and supporting it as a “first class” notion (abstraction). This explains the “miracle” of Emacs as it is. But they missed the universal structure of the underlying (yes, nested) Algebraic Data Types, which only mathematicians could actually “see”, and which has to be the basis of the “protocols”.
Notice that none of the hundreds of crates are required.
The moral is an old one – once you realize which level of abstraction is the proper, “highest” one, you can grow a language (yes, yes) around these appropriate abstractions (proper ADTs), and thus reduce your code size by an order of magnitude (not even joking) by avoiding unnecessary crossings of abstraction barriers and by not mixing irrelevant and inappropriate low-level “stuff” (especially too-low-level types) into the high-level code.
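A minimal sketch of such a “grown” embedded DSL in plain Rust (a toy, Wadler-style document type; all the names are mine): the high-level program is written entirely above the barrier, in the grown vocabulary, and exactly one small interpreter crosses it:

```rust
// A toy embedded DSL "grown" around a single ADT.
enum Doc {
    Text(String),
    Line,                    // newline plus the current indentation
    Nest(usize, Box<Doc>),   // increase indentation underneath
    Cat(Box<Doc>, Box<Doc>), // composition by nesting
}

// The "grown" vocabulary, so high-level code stays high-level.
fn text(s: &str) -> Doc { Doc::Text(s.to_string()) }
fn cat(a: Doc, b: Doc) -> Doc { Doc::Cat(Box::new(a), Box::new(b)) }
fn nest(i: usize, d: Doc) -> Doc { Doc::Nest(i, Box::new(d)) }

// The single interpreter: the only place the representation matters.
fn render(indent: usize, d: &Doc) -> String {
    match d {
        Doc::Text(s) => s.clone(),
        Doc::Line => format!("\n{}", " ".repeat(indent)),
        Doc::Nest(i, d) => render(indent + i, d),
        Doc::Cat(a, b) => render(indent, a) + &render(indent, b),
    }
}

fn main() {
    // High-level code: no strings of spaces, no newline bookkeeping.
    let prog = cat(text("let"), nest(2, cat(Doc::Line, text("x = 1"))));
    println!("{}", render(0, &prog));
}
```

Add a new rendering target (terminal, GUI, whatever) and only the interpreter grows; the high-level programs, written in the DSL, do not change at all.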
This is not just an opinion (you know), this is the only right way (the right way is always a [local] optimum). Identifying the right notions, abstracting them out and, most important, staying within the proper abstraction barriers is what the “whole” of Biology has spontaneously evolved “on”.
The old, world-heritage systems (notably UNIX, vi, Emacs, TeX/LaTeX, X11, MIT Scheme, SML/NJ) were intuitively built upon such notions (even if these were never articulated this clearly).
It still can be done in just a “few lines” of code (like Edwin), and your zed is, sorry, just vape-coded [in Rust!] bullshit.
Yes, Unicode and the necessary font (and thus rendering) support have endlessly complicated everything, but still: it can be done with clear abstraction barriers and the related separation of concerns, just as it has already been done with Neovim on an X11 terminal “stack” (and it can be done much better by staying “high-level” within the proper abstraction barriers, using an appropriate “grown” embedded DSL which mimics biology).
Here I should go on about how no LLM or “AI” could ever arrive at this kind of understanding (in principle) – not merely because none of this kind of reasoning was in its training set, but because they operate at the level of syntactic “tokens”, not at the level of underlying meaning, let alone the whole observation -> generalization -> abstraction -> naming “pipeline” which produces the principles and the meaning – but I have already done this so many times that it has become a little boring, so I have to stop.