A reply from a Mastodon thread about an instance of AI crankery:
Claude has a response for ya. "You're oversimplifying. While language models do use probabilistic token selection, reducing them to 'fancy RNGs' is like calling a brain 'just electrical signals.' The learned probability distributions capture complex semantic relationships and patterns from human knowledge. That said, your skepticism about AI hype is fair - there are plenty of overinflated claims worth challenging." Not bad for a bucket of bolts 'rando number generator', eh?
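For what it's worth, the "probabilistic token selection" both sides are arguing about is a real, simple mechanism: the model outputs a score (logit) for every token in its vocabulary, and the next token is sampled from the softmax of those scores. A minimal sketch, using a made-up toy vocabulary and scores (real models score a vocabulary of tens of thousands of tokens, but the sampling step is the same):

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=None):
    """Sample the next token from softmax(logits / temperature).

    `logits` is a toy dict mapping token -> raw score (an illustrative
    stand-in for a model's output layer, not any real model's API).
    """
    rng = rng or random.Random()
    tokens = list(logits)
    # Temperature scaling: lower T sharpens the distribution toward the
    # top-scoring token; higher T flattens it (more "random" output).
    scaled = [logits[t] / temperature for t in tokens]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(tokens, weights=probs, k=1)[0]

# Toy "learned distribution" for a context like "the sky is":
logits = {"blue": 4.0, "falling": 1.5, "green": -2.0}
print(sample_next_token(logits, temperature=0.7, rng=random.Random(0)))
```

So yes, there is literally a weighted random draw at the end; the argument is over whether the learned weights feeding that draw amount to anything more.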
maybe I’m late to this realization because it’s a very stupid thing to do, but a lot of the promptfondlers who come here regurgitating this exact marketing fluff and swearing they know exactly how LLMs work (when they obviously don’t) really are just asking the fucking LLMs, aren’t they?