DeepSeek R1

DESCRIPTION: Memes and mirrors. Nowadays things are moving way too fast. It is not controlled trial-and-error; it is literally throwing everything at the wall to see what sticks. It started with that meme "Attention Is All You Need", when they came up with an "architecture" that stuck. That "attention" and "multi-head attention" turned out to be just a few additional layers of a particular kind. No one can explain the actual mechanism of how exactly, or even why, the layers are the way they are (abstract bullshit aside)....
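For what it is worth, those "few additional layers" really are just a handful of matrix multiplications and a softmax. A minimal sketch of scaled dot-product attention and multi-head attention in plain numpy (the weight matrices `Wq`, `Wk`, `Wv`, `Wo` stand in for learned projections; names and shapes here are illustrative, following the paper's formulation, not any particular library's API):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)
    return softmax(scores) @ V

def multi_head_attention(X, Wq, Wk, Wv, Wo, n_heads):
    # X: (seq, d_model); Wq/Wk/Wv/Wo: (d_model, d_model) learned projections
    seq, d_model = X.shape
    d_head = d_model // n_heads

    def split(M):
        # (seq, d_model) -> (n_heads, seq, d_head)
        return M.reshape(seq, n_heads, d_head).transpose(1, 0, 2)

    Q, K, V = split(X @ Wq), split(X @ Wk), split(X @ Wv)
    heads = attention(Q, K, V)                        # (n_heads, seq, d_head)
    concat = heads.transpose(1, 0, 2).reshape(seq, d_model)
    return concat @ Wo                                # (seq, d_model)
```

That is the whole "mechanism" at the level of linear algebra: projections, a similarity score, a weighted average, another projection. Why this particular arrangement works so well is exactly the part no one can cleanly explain.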

January 26, 2025 · <lngnmn2@yahoo.com>

Self-awareness

Another day, another piece of bullshit from some Chud. https://thewaltersfile.substack.com/p/bootstrapping-self-awareness-in-gpt Self-awareness, and awareness in general, is not at the language level (or the information level). Animals obviously have awareness, but not "language-level awareness", because their brains lack any language areas. The series of mutations and subsequent developments that led to human language is unique to humans. All other animals use just "voices": pitch, volume, distinct cries, etc....

November 20, 2023 · <lngnmn2@yahoo.com>