This is a specific kind of mass gaslighting and sunk cost fallacy: “Think of all the money we spent on cleaning, don’t you believe it’s clean?” /s
I guess Germans do need to be particularly good at this, given the enormous compound words their language can produce.
On the other hand, when listening to American YouTubers read something onscreen, it seems like they use some internal rainbow table to look up word prefixes and then just autocomplete the rest of the word based on probability.
I say this because, while reading, they often substitute words with ones that sound similar but are not semantically close to what is written.
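To spell the analogy out, here is a minimal sketch of that kind of prefix-based guessing; the vocabulary and frequency counts are made up for illustration:

```python
# Toy word-frequency table; the numbers are invented for illustration.
WORD_FREQ = {
    "conversation": 120, "conservation": 40, "conservative": 60,
    "particular": 90, "participate": 50, "particle": 30,
}

def autocomplete(prefix: str) -> str | None:
    """Return the most frequent word starting with `prefix` --
    roughly the 'scan the first few letters, guess the rest' behavior."""
    candidates = {w: f for w, f in WORD_FREQ.items() if w.startswith(prefix)}
    if not candidates:
        return None
    return max(candidates, key=candidates.get)

# A reader who only scans "conse..." lands on the most probable match,
# which may not be the word actually on screen:
print(autocomplete("conse"))  # -> "conservative" (not "conservation")
```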
Some people have great trouble splitting words into their component parts, as if their internal GPT just stores everything as a single token like "redneck", so they never split it semantically or conceptually into red+neck.
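A minimal sketch of that single-token idea, using a made-up vocabulary and a greedy longest-match rule (real tokenizers like BPE are more involved):

```python
# Toy greedy longest-match tokenizer; the vocabulary is invented.
VOCAB = {"red", "neck", "redneck", "sun", "flower", "sunflower"}

def tokenize(word: str) -> list[str]:
    """Greedily match the longest vocabulary entry at each position."""
    tokens, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):  # try longest match first
            if word[i:j] in VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])  # unknown character passes through as-is
            i += 1
    return tokens

print(tokenize("redneck"))  # ['redneck'] -- stored whole, never split
VOCAB.discard("redneck")
print(tokenize("redneck"))  # ['red', 'neck'] -- decomposed only when forced
```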
Then the mistake was made many years ago, when the code was released under GPLv2 with no obligation to contribute changes back.