this post was submitted on 23 Nov 2024
233 points (87.9% liked)

Technology


I'm usually the one saying "AI is already as good as it's gonna get, for a long while."

This article, in contrast, quotes the folks building the next generation of AI, and they're saying the same thing.

[–] [email protected] 73 points 1 day ago (12 children)

It's absurd that some of the larger LLMs now use hundreds of billions of parameters (e.g. llama3.1 with 405B).

This doesn't really seem like a smart use of resources if you need several of the largest GPUs available to even run one conversation.
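A quick back-of-envelope calculation shows why a 405B-parameter model needs several of the largest GPUs. This sketch assumes fp16 weights (2 bytes per parameter) and 80 GB of memory per GPU (an H100-class card), and ignores KV-cache and activation overhead, so the real number is even higher:

```python
# Rough VRAM estimate just for holding the weights of a 405B model.
# Assumptions: fp16 (2 bytes/param), 80 GB per GPU; KV-cache and
# activations would add to this, so this is a lower bound.
PARAMS = 405e9
BYTES_PER_PARAM = 2          # fp16
GPU_MEMORY_GB = 80

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
gpus_needed = -(-weights_gb // GPU_MEMORY_GB)  # ceiling division

print(f"weights: {weights_gb:.0f} GB -> at least {gpus_needed:.0f} GPUs")
# -> weights: 810 GB -> at least 11 GPUs
```

Even before serving a single conversation, the weights alone won't fit on fewer than about a dozen top-end cards at half precision.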

[–] [email protected] 21 points 1 day ago (9 children)

I wonder how many GPUs my brain is

[–] [email protected] 9 points 23 hours ago (2 children)

I don't think your brain can be reasonably compared with an LLM, just like it can't be compared with a calculator.

[–] GetOffMyLan 17 points 22 hours ago (1 children)

LLMs are based on neural networks, which are a massively simplified model of how our brain works. So you kind of can, as long as you keep in mind that they are orders of magnitude simpler.
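The "massively simplified" unit underlying those networks can be sketched in a few lines: a weighted sum of inputs passed through a nonlinearity. The specific weights, inputs, and sigmoid activation here are made up for illustration; a real LLM stacks billions of such parameters:

```python
import math

# A single artificial neuron: weighted sum of inputs plus a bias,
# squashed through a nonlinearity (sigmoid here, purely illustrative).
def neuron(inputs, weights, bias):
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

out = neuron([0.5, -1.0, 2.0], [0.8, 0.2, 0.1], bias=0.0)
print(out)  # a value between 0 and 1
```

A biological neuron has dynamics (spiking, timing, neuromodulation) that this scalar function completely ignores, which is the sense in which the comparison holds only very loosely.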

[–] [email protected] 1 points 1 hour ago

At some point it becomes so “simplified” it’s arguably just not the same thing, even conceptually.
