this post was submitted on 03 Jun 2024
1471 points (97.9% liked)

[–] [email protected] 1 points 5 months ago (1 children)

LLMs are excellent at consuming web data.

[–] [email protected] 6 points 5 months ago (1 children)

Not if you want to ensure the validity of the compiled coupons/discounts. A custom algorithm would be best, but data standardization would be the main issue regardless of how you process it.
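
A rough sketch of what "standardized" could mean here, just to show the shape of the problem; the field names are illustrative, not anything the thread specifies:

```python
# Force every scraped deal into one record shape before comparing,
# deduplicating, or validating anything. Purely a sketch.
from dataclasses import dataclass
from datetime import date

@dataclass
class Coupon:
    merchant: str
    code: str
    discount_pct: float | None  # percent off, if the source states one
    expires: date | None        # None when the source doesn't say
    source_url: str             # where it was scraped from, for re-checking
```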

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago) (1 children)

What does validity mean in this case? A function-calling LLM can follow links and take actions. I'm not saying it's not "work" to develop your personal bot framework, but this is all doable from a home PC with a self-hosted LLM.

Edit: and of course you'll need non-LLM code to handle parts of the processing; not discounting that.
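
A minimal sketch of that kind of follow-links-and-act loop, assuming a local server exposing an OpenAI-compatible chat API with tool calling (Ollama and llama.cpp's server can do this for some models); the endpoint, model name, and fetch_page tool are placeholders, not a specific setup anyone in the thread described:

```python
import json
import requests
from openai import OpenAI

# Point the OpenAI client at a self-hosted, OpenAI-compatible endpoint.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="unused")

def fetch_page(url: str) -> str:
    """Tool the model can call to pull a page it wants to read."""
    return requests.get(url, timeout=10).text[:8000]  # truncate to fit context

tools = [{
    "type": "function",
    "function": {
        "name": "fetch_page",
        "description": "Download the text of a web page",
        "parameters": {
            "type": "object",
            "properties": {"url": {"type": "string"}},
            "required": ["url"],
        },
    },
}]

messages = [{"role": "user",
             "content": "Find current coupon codes on https://example.com/deals"}]

for _ in range(5):  # cap the number of tool-use rounds
    reply = client.chat.completions.create(
        model="llama3.1", messages=messages, tools=tools
    ).choices[0].message
    if not reply.tool_calls:
        print(reply.content)  # model is done acting, print its answer
        break
    messages.append(reply)
    for call in reply.tool_calls:
        args = json.loads(call.function.arguments)
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": fetch_page(**args),  # feed the page back to the model
        })
```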

[–] [email protected] 1 points 5 months ago* (last edited 5 months ago)

The LLM doesn't do that though; it's the software built around it that does, which is what I'm saying. It's definitely possible to do, but the bulk of the work wouldn't be the task of the LLM.

Edit: forgot to address validity. By that I mean keeping a standard format and ensuring that the output is actually true given the input. It's not impossible, but it's something that requires careful data curation and a really good system prompt.
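
A rough sketch of that kind of check, assuming the system prompt asks the model for JSON with a fixed set of fields; the field names and the "code must appear in the scraped page" rule are just illustrative:

```python
import json

REQUIRED_FIELDS = {"merchant", "code", "discount", "expires"}

def validate(llm_output: str, source_text: str) -> dict | None:
    """Return the parsed coupon if it matches the standard format and is
    grounded in the page it was extracted from, otherwise None."""
    try:
        coupon = json.loads(llm_output)
    except json.JSONDecodeError:
        return None                        # not the standard format at all
    if set(coupon) != REQUIRED_FIELDS:
        return None                        # missing or extra fields
    if coupon["code"] not in source_text:
        return None                        # code doesn't appear in the scraped page
    return coupon                          # passes format and grounding checks
```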