this post was submitted on 26 Dec 2024
Thanks to @[email protected] for the links!

Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Here’s a link to a preprint: https://arxiv.org/abs/2408.10234

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago) (1 children)

> You are confusing input with throughput.

No I'm not, I read that part. Input is, for instance, hearing a sound wave, which the brain can process at amazing speed, separating a multitude of simultaneous sounds and translating them into meaningful information, be it music, speech, or a noise that shouldn't be there. It's true that this part is easier to measure, as we can do something similar on computers, although not nearly as well: we can determine not only the content of sounds, but also extrapolate from it in real time. The sound may only be about 2x22k bits, but the processing required is way higher. And that's even more obviously way, way, way above 10 bits per second.
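For scale, that back-of-envelope arithmetic can be sketched like this (the ~22 kHz figure and 1 bit per sample are just the comment's rough assumptions, not measured values):

```python
# Rough comparison of raw auditory input rate vs. the paper's ~10 bit/s figure.
# Assumes two ears at ~22 kHz with 1 bit per sample (the "2x22k bit" estimate above).
sample_rate_hz = 22_000    # rough upper bound of human hearing bandwidth
ears = 2
bits_per_sample = 1        # deliberately minimal estimate

input_rate = ears * sample_rate_hz * bits_per_sample  # bits per second
print(input_rate)        # 44000
print(input_rate / 10)   # ratio to a 10 bit/s throughput claim: 4400.0
```

Even with this minimal 1-bit-per-sample assumption, the raw input rate comes out thousands of times above the paper's throughput figure, which is the gap the comment is pointing at.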

This is a very complex function that requires loads of processing, and it can distinguish with microsecond precision when a sound reaches each ear to determine direction.
The same is the case with vision, which, although not at all the resolution we think it is, also requires massive processing to interpret into something meaningful.

Now the weird thing is, why in the world do they think consciousness, which is even MORE complex, should operate at a lower speed? That idea is outright moronic!!!

Edit:

Changed nanosecond to microsecond.

[–] [email protected] 4 points 1 week ago (1 children)

As I suggested to someone else, without any of us actually reading the paper, and I know I do not have the requisite knowledge to understand it if I did, dismissing it with words like "moronic" is not warranted. And as I also suggested, I don't think such a word can generally be applied to Caltech studies. They have a pretty solid reputation as far as I know.

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago) (2 children)

I'm not fucking reading a paper with such ridiculous claims. I gave it a chance, but it simply isn't worth it. And I understand their claims and argumentation perfectly; they simply don't have a clue about the things they make claims about.
I've been investigating and researching these issues for 40 years with an approach based on scientific evidence, so please piss off with your claims of me not understanding it.

[–] [email protected] 4 points 1 week ago (1 children)

Without evaluating the data or methodology, I would say that the chance you gave it was not a fair one. Especially since you decided to label it "moronic." That's quite a claim.

[–] [email protected] 1 points 1 week ago* (last edited 1 week ago) (1 children)

It's 100% moronic, they use terminology that clearly isn't fit for the task.

[–] [email protected] 2 points 1 week ago (1 children)

"100% moronic" is an even bolder claim for someone who has not evaluated any of the claims in the paper.

One might even say that calling scientific claims "100%" false is a not especially scientific approach.

[–] [email protected] 1 points 1 week ago (1 children)

If the conclusion is moronic, there's a pretty good chance the thinking behind it is too.
They did get the thing about thinking about one thing at a time right though. But that doesn't change the error of the conclusion.

[–] [email protected] 2 points 1 week ago (1 children)

Again, I would say using the "100%" in science when evaluating something is not a very good term to use. I think you know that.

[–] [email protected] 2 points 1 week ago (1 children)

Yeah OK that's technically correct.

[–] [email protected] 1 points 1 week ago (1 children)

It's also been pointed out that they are using 'bit' in a way people here are not thinking they are using it: https://lemmy.world/comment/14152865

[–] [email protected] 1 points 1 week ago* (last edited 1 week ago) (1 children)

Oh boy.

> Base 2 gives the unit of bits

Which is exactly what bit means.

> base 10 gives units of “dits”

Which is not bits, but the equivalent: one digit in base 10.

This just shows the normal interpretation of bits.

If it's used as units of information, you need to specify it as bits of information. Which is NOT A FREAKING QUANTIZED unit!

And it just shows the complete uselessness of this piece of crap paper.
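For what it's worth, Shannon's "bits of information" are indeed a real-valued measure rather than quantized storage bits; a minimal sketch (the probabilities are arbitrary examples):

```python
import math

# Shannon entropy in bits of information: H = -sum(p * log2(p)).
# Unlike a storage bit, this quantity is real-valued and need not be an integer.
def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy_bits([0.9, 0.1]))  # biased coin: ~0.469 bits (non-integer)
```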

[–] [email protected] 1 points 1 week ago (1 children)

I’m interested in what you mean. Could you ELI5 why bits of information can’t be used here?

[–] [email protected] 2 points 1 week ago* (last edited 1 week ago) (1 children)

I suppose it can, but just calling it bits is extremely misleading. It's like saying something takes 10 seconds, but only if you are traveling at 90% of the speed of light.
It's such extremely poor terminology, and maybe the article is at fault and not the study, but it is presented in a way that is moronic.

Using this thermodynamics definition is not generally relevant to how thought processes work.
And using a word to mean something different than it usually does BEFORE pointing that out is very poor terminology.
And in this case it made them look like idiots.

It's really too bad, because if they had simply stated that we can only handle about 10 concepts per second, that would have been an entirely different matter, which I actually agree is probably right. But that's not bad IMO; that's actually quite impressive! The exact contrary of what the headline indicates.

[–] [email protected] 2 points 1 week ago (1 children)

I get your argument now. Do note that this entropy is about information theory and not thermodynamics, so I concur that the Techspot article is at fault here.

[–] [email protected] 2 points 1 week ago* (last edited 1 week ago) (1 children)

> I get your argument now.

Thanks. ;)

> Do note that this entropy is about information theory and not thermodynamics

https://en.wikipedia.org/wiki/Information_theory

> A key measure in information theory is entropy.

Meaning it's based on thermodynamics.

And incidentally I disagree with both. Information theory assumes the universe is a closed system, which is a requirement for thermodynamics to work, which AFAIK is not a proven fact regarding the universe, and unlikely IMO.

The 2nd law of thermodynamics (entropy) is not a law but a statistical likelihood; the early universe does not comply with it, and the existence of life is also a contradiction of it.

I have no idea how these ideas are so popular outside their scope?

[–] [email protected] 2 points 1 week ago (1 children)

Information theory is an accepted field. The entropy in information theory is analogous to and named after entropy in thermodynamics, but it's not actually thermodynamics; it's its own field of study. I know this because of all the debate around that correcthorsebatterystaple xkcd.

[–] [email protected] 0 points 1 week ago (1 children)

I'm not sure if you are making a joke, or also making a point. But boy, that XKCD is spot on. 😋 👍
I think thermodynamics works within its field, but it's so widely abused outside that field that I've become sick of hearing about it from people who just parrot it.
I have not seen anything useful from information theory, mostly just nonsense about information not being able to get lost in black holes, and exaggerated interpretations about entropy.
So my interest in information theory is near zero, because I discarded it as rubbish decades ago.

[–] [email protected] 2 points 1 week ago (1 children)

For one, password security theory that actually works (instead of just "use a special character") is based on information theory and its concept of entropy.

[–] [email protected] 0 points 1 week ago

OK, I don't think information theory is actually needed for that; just a bit of above-average intelligence, apparently.
Yes, it's true some use the term entropy instead of just the statistical number of combinations, and obviously forcing a special character, instead of just having it as an option, makes the number of possibilities lower, decreasing the uncertainty, which they then choose to call entropy. Which, counter-intuitively IMO, is called increased entropy.

[–] [email protected] 2 points 1 week ago (1 children)

What is your realm of research? How have you represented abstract thought by digital storage instead of information content?

[–] [email protected] 2 points 1 week ago* (last edited 1 week ago)

Mostly philosophical, but since I'm also a programmer, I've always had the quantized elements in mind too.

In the year 2000 I estimated human-level or general/strong AI by about 2035. I remember because it was during a very interesting philosophy debate at Copenhagen University, where to my surprise there were also a number of physics majors.
That's supposed to be an actually conscious AI. I suppose the chances of being correct were slim at the time, but now it does seem more likely than ever.