LLM-generated Rust code

Rust code is the ultimate evidence of the principal inability of a probability-based generating algorithm (sampling from a “learned” probability distribution over “tokens”) to come up with something that passes the type checker, except for the most trivial cases. The “causality” is that generating complex syntactic forms without an actual, proper understanding of the principles and heuristics behind them is, well, “problematic”. The running example is the subtle “already borrowed” panics, where the issue is with the underlying semantics while the syntax is “correct”. The problem is that a recursive function has to drop all its borrows before a recursive call, and this constraint cannot be expressed in the syntax except by redundant bindings which will automatically be dropped at the end of the scope. ...
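A minimal sketch of this failure mode (the RefCell “log” and the function names are my own hypothetical illustration, not code from the post): holding a RefCell borrow across the recursive call compiles fine but panics at runtime with “already borrowed”; confining the borrow to its own scope, so the binding is dropped before the recursion, is exactly the “redundant binding” fix alluded to above.

```rust
use std::cell::RefCell;

// Compiles, but panics at runtime with "already borrowed: BorrowMutError":
// the RefMut is still alive across the recursive call, so the nested
// borrow_mut() on the same cell fails.
fn count_down_bad(n: u32, log: &RefCell<Vec<u32>>) {
    let mut entries = log.borrow_mut(); // borrow taken here...
    entries.push(n);
    if n > 0 {
        count_down_bad(n - 1, log); // ...and still held here: panic
    }
}

// The fix: a "redundant" inner scope (or an explicit drop()) releases the
// borrow before the recursive call.
fn count_down_ok(n: u32, log: &RefCell<Vec<u32>>) {
    {
        let mut entries = log.borrow_mut();
        entries.push(n);
    } // the binding is dropped here, ending the borrow
    if n > 0 {
        count_down_ok(n - 1, log);
    }
}

fn main() {
    let log = RefCell::new(Vec::new());
    count_down_ok(3, &log);
    println!("{:?}", log.borrow()); // [3, 2, 1, 0]
    // count_down_bad(3, &log);     // would panic at runtime
}
```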

June 25, 2025 · <lngnmn2@yahoo.com>

Software In The Era of AI

https://www.youtube.com/watch?v=LCEmiRjPEtQ and, of course, the No.1 spot on the Chuddie safe space https://news.ycombinator.com/item?id=44314423. Karpathy is shilling “Cursor” and other cloud-based metered AI services (which have to pay back their debts). He probably has an interest in it and in some other meme AI startups. Nothing to see here. We should someday learn which marketing “genius” came up with this “winning strategy” – to meter every single token (byte) and try to sell this to corporations. Corporations do not want to be metered like that; they want to meter normies, the way cellular operators do, and they never use any normie plans themselves. ...

June 19, 2025 · <lngnmn2@yahoo.com>

Yes, it is time to scream and panic

There is something to be actually realized as soon as possible (which I recently experienced directly) – the conditional-probability-based “glorified autocomplete” can generate some small pieces of code (with intelligent prompting) that are simply above (in intrinsic quality) 90% of all the related crap that could be found on Github, which easily extrapolates to 90% of the people in existence who identify themselves as “programmers”. Period. The code will be better than what high-dopamine, “laser-focused” (on Adderall or just amphetamines) but not-so-well-educated-on-fundamentals zealots could produce. Better than what academics, inexperienced in the realities of actual open-source software ecosystems, could even hope to write, no matter how good their theoretical understanding is. With very specific and properly constrained prompting, which takes into account the major results of the last 50-or-60 years of PL and CS research, especially in the field of math-based Functional Programming and Immutable Data Structures, one can get almost optimal (approaching perfection) small pieces of code, provided you understand what theoretically-limited perfection actually looks like in this particular case, and why. ...

June 12, 2025 · <lngnmn2@yahoo.com>

Just a Packaged Slop

DESCRIPTION: Removing the veil of Maya to see things as they really are. What to do when you have discovered that something is wrong with the world? Nothing; this happens all the time. Everything is wrong with C++, yet everyone uses it; everything is wrong with packaged food, especially the toxic crap Nestlé produces, yet everyone keeps buying it. Nothing can be done. Here is what is wrong with your “AI” and “LLMs”. ...

April 30, 2025 · <lngnmn2@yahoo.com>

Large Ladyboy Models

Classy Andrej is making shilling videos from Thailand (he leaked his location in the video) targeting normies (the previous set of videos was partly filmed in Japan; Andrej is living a truly digital nomad’s life). https://www.youtube.com/watch?v=EWvNQjAaOHw Why would he shill? Well, he and guys like him made a lot of promises, not to us (who tf cares), but to the money guys: that this particular technology will completely transform the world, and that they are the very top guys in the field, so money shall be given to them (to the affiliated companies and entities). ...

March 9, 2025 · <lngnmn2@yahoo.com>

Coding with LLMs

DESCRIPTION: Idiots, idiots everywhere. Now I can accurately summarize what coding using LLMs actually is in just a few sentences. Recall how people usually describe a code-maintenance job: we have this code to run, while the original developers are gone and left us no design documentation. This hypothetical situation is exactly what you get when an LLM has finished spewing out the slop: you now have some code, very cheap, even for free, but it is not yours, the underlying understanding (of the whys) is not in your head, and the original developer is already gone. Disappeared. ...

February 24, 2025 · <lngnmn2@yahoo.com>

Grok3

Well, I’ve watched it. There are a few things to realize. The code it generated for the simulation task ran without an issue. It is, however, incomprehensible without an understanding of all the details (like any other code). This is probably because they fed in a lot of very similar internal code during the training phase. The gibberish from the “thinking” phase might be helpful, or it may be equally cryptic. ...

February 18, 2025 · <lngnmn2@yahoo.com>

AI Slop

slop (noun). Cambridge Dictionary: food that is more liquid than it should be and is therefore unpleasant; liquid or wet food waste, especially when it is fed to animals. Oxford Learner’s Dictionary: waste food, sometimes fed to animals; liquid or partly liquid waste, for example urine or dirty water from baths. There is also a very related term, “goyslop”, from internet sewers (losers are always looking for someone to blame and hate [instead of themselves]). ...

February 16, 2025 · <lngnmn2@yahoo.com>

Deepseek In Action

Let’s do it again, because why tf not, especially given the magnitude of the current mass hysteria about this AI meme (it is literally everywhere, and even on Slashdot, which is the last bastion of sanity, there are 4 articles in a row with “AI” in the title). “What are the roles of type-classes in Haskell and traits in other languages?” This is a supposedly naive and uninformed question I asked Deepseek R1 14b. Its output will be given below verbatim. ...
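For context, a minimal sketch of what the question is about (my own illustration in Rust, not the model’s output): a trait, like a Haskell type class, names an interface that generic code can be constrained by, and the compiler picks the implementation per type (ad-hoc polymorphism).

```rust
// A trait playing the role of a Haskell type class: it declares an
// interface, and generic functions can require it as a bound.
trait Monoid {
    fn empty() -> Self;                    // like `mempty`
    fn combine(self, other: Self) -> Self; // like `mappend` / `<>`
}

impl Monoid for String {
    fn empty() -> Self { String::new() }
    fn combine(mut self, other: Self) -> Self { self.push_str(&other); self }
}

impl Monoid for i32 {
    fn empty() -> Self { 0 }
    fn combine(self, other: Self) -> Self { self + other }
}

// Generic over any type with a Monoid "instance"; the compiler resolves
// which empty/combine to call at each use site, as with type classes.
fn mconcat<T: Monoid>(items: Vec<T>) -> T {
    items.into_iter().fold(T::empty(), T::combine)
}

fn main() {
    println!("{}", mconcat(vec![1, 2, 3]));                          // 6
    println!("{}", mconcat(vec!["a".to_string(), "b".to_string()])); // "ab"
}
```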

February 15, 2025 · <lngnmn2@yahoo.com>

Reasoning LLMs

AUTHOR: <lngnmn2@yahoo.com> When I was a kid they told me not to stare at the sun. I had this vision that brain structures are sort of like trees, while the “branches” are just like paths through our yard after fresh snow. Some of them remain thin, as if someone just walked across absentmindedly; some get broadened by heavy re-use. Who would plow through fresh snow when one could simply follow a path that is already there? ...

February 11, 2025 · Ln Gnmn