this post was submitted on 26 Dec 2024
Technology
From a cursory glance, it seems at least quite close to the entropy-related definition of a bit, also known as a shannon.
If it's not redefining the term, then I'm using it the way the paper defines it.
Because just understanding words well enough to respond to them, ignoring all the sub-processes that are also part of "thought" and that directly shape both your internal narration and your actual behavior, takes more than 10 bits of information to manage. (And yes, I understand that each word isn't actually equally likely, as my rough estimate assumed when I provided a number, but words also require your brain to handle far more context than just the information-theoretic "information" of the word itself.)
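To put a rough number on that, here's a quick sketch of the arithmetic (the vocabulary size and the Zipfian toy distribution are my illustrative assumptions, not figures from the paper):

```python
import math

# Illustrative assumption: a working vocabulary of ~50,000 words.
vocab_size = 50_000

# If every word were equally likely, picking one word carries
# log2(N) shannons (bits) of information.
uniform_bits = math.log2(vocab_size)
print(f"uniform:  {uniform_bits:.1f} bits/word")  # ~15.6

# Real word frequencies are highly skewed (roughly Zipfian),
# which lowers the entropy H = -sum(p * log2(p)).
weights = [1 / rank for rank in range(1, vocab_size + 1)]
total = sum(weights)
probs = [w / total for w in weights]
zipf_bits = -sum(p * math.log2(p) for p in probs)
print(f"zipfian:  {zipf_bits:.1f} bits/word")
```

Even with the skewed toy distribution, a single word choice lands around 10 bits on its own, before any of the surrounding context your brain also has to process.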