this post was submitted on 17 Dec 2024
569 points (92.7% liked)

submitted 1 day ago* (last edited 20 hours ago) by [email protected] to c/[email protected]
 
[–] [email protected] 7 points 1 day ago (6 children)

I gave it a math problem to illustrate this and it got it wrong

If it can’t do that, imagine adding nuance

[–] [email protected] 11 points 1 day ago (4 children)

Well, math is not really a language problem, so it's understandable that LLMs struggle with it more.

[–] [email protected] 11 points 1 day ago (3 children)

But it means it’s not “thinking” as the public perceives ai

[–] [email protected] 5 points 1 day ago (2 children)

Hmm, yeah, AI never really did think. I can't argue with that.

It's really strange, if I mentally zoom out a bit, that we have machines that are better at language-based reasoning than at logic-based tasks (like math or coding).

[–] [email protected] 1 points 17 hours ago

Not really true though. Computers are still better at math. They're even pretty good at coding, if you count compiling high-level code into assembly as coding.

But in this case we built a language machine to respond to language with more language. Of course it's not going to do great at other stuff.
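To illustrate the contrast: a conventional program evaluates arithmetic exactly and deterministically, whereas an LLM only predicts likely next tokens, so multi-digit multiplication is a classic failure case for it. A minimal sketch (the specific numbers are just an arbitrary example):

```python
# Exact arbitrary-precision integer arithmetic -- Python ints never
# overflow or round, so the result is guaranteed correct.
a = 48_327
b = 91_604
product = a * b
print(product)  # 4426946508
```

An LLM asked the same question in plain English frequently produces a plausible-looking but wrong digit string, because it is modeling text, not executing a multiplication algorithm.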
