[–] [email protected] 3 points 6 months ago (2 children)

It's only easier to verify a solution than to come up with one when you can trust and understand the algorithms that produce it. Thermodynamics simulation software is orders of magnitude faster than hand calculations, but you know what the software is doing. Its creators aren't saying "we don't actually know how it works".

In the case of an LLM, I have to verify everything with no trust whatsoever, and that takes longer than just doing it myself, especially because the LLM is writing something for me rather than doing complex math.

[–] [email protected] 1 points 6 months ago* (last edited 6 months ago)

If a solution is correct then a solution is correct. If a correct solution was generated randomly, that doesn't make it any less correct. It just means you may not always get correct solutions from the generating process, which is why they're checked afterwards.
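
A minimal sketch of that generate-then-verify idea, using a toy factoring problem of my own (the example and function names are illustrative, not from the thread): guessing a factorization is unreliable, but checking a guess is a single multiplication.

```python
import random

def verify(n, a, b):
    # Checking a candidate factorization is a single multiplication.
    return a > 1 and b > 1 and a * b == n

def random_candidate(n):
    # Producing a candidate is the unreliable part: just guess a divisor.
    a = random.randint(2, n - 1)
    return a, n // a

n = 221  # = 13 * 17
for _ in range(10_000):
    a, b = random_candidate(n)
    if verify(n, a, b):
        # A randomly generated solution that passes the check is still correct.
        print(f"{n} = {a} * {b}")
        break
```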

[–] [email protected] 0 points 6 months ago (1 children)

Except that when you're doing calculations, a calculator can substitute the given answer back into the equation and confirm the values match... which is exactly my point about calculators not being a good example. And the case of a quantum computer wasn't addressed.
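
To make that substitution check concrete (a hypothetical toy example, not anything from the original analogy): verifying a claimed root only requires plugging it back into the equation, while finding the root in the first place is the harder problem.

```python
import math

def check_root(f, x, tol=1e-9):
    # Verifying a claimed solution: substitute it back and see if the equation holds.
    return math.isclose(f(x), 0.0, abs_tol=tol)

# Claimed roots of x^2 - 2x - 3 = 0 (actual roots: 3 and -1)
f = lambda x: x**2 - 2*x - 3
print(check_root(f, 3.0))   # True
print(check_root(f, -1.0))  # True
print(check_root(f, 2.0))   # False: finding roots is harder than checking them
```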

I agree that LLMs have many issues, are being used for bad purposes, are overhyped, and we've yet to see if the issues are solvable - but I think the analogy is twisting the truth, and I think the current state of LLMs being bad is not a license to make disingenuous comparisons.

[–] [email protected] 1 points 6 months ago

It remains to be seen, then.