this post was submitted on 16 Aug 2023
1259 points (94.1% liked)
Technology
you are viewing a single comment's thread
AI is its own worst enemy. If you can't identify AI output, that means AIs are going to train on AI-generated content, which really hurts the model.
It's literally in everyone's best interest, including AI itself, to start building some kind of identification into all output.
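A toy sketch of what "identification inherent to all output" could mean in the simplest case: tag generated text with an invisible marker so a scraper can filter it out of training data. The zero-width-space scheme below is purely hypothetical (and trivially stripped); real proposals bias the token sampling itself rather than inserting characters.

```python
# Hypothetical sketch: tag model output with an invisible marker so a
# training-data scraper can detect and skip it. Not a real watermark —
# a determined actor can strip it in one line.

MARKER = "\u200b\u200b"  # two zero-width spaces as a crude tag

def tag_output(text: str) -> str:
    """Prepend the invisible marker to generated text."""
    return MARKER + text

def is_tagged(text: str) -> bool:
    """Check whether text carries the marker."""
    return text.startswith(MARKER)

sample = tag_output("Some generated paragraph.")
print(is_tagged(sample))            # True
print(is_tagged("Human-written."))  # False
```

The marker is invisible when rendered, which is the whole point: humans see normal text, filters see a flag.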
Those studies are flawed. By definition, once you can no longer tell the difference, the effect on training is nil.
It's more like successive generations of inbreeding. Unless the AI content is perfect, meaning it exactly mirrors the diversity of human content, the drivel will amplify over time.
Given the Chinchilla scaling laws, nobody in their right mind trains models by shotgun-ingesting all available data anymore. Gains now come from data quality more than volume.
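For context on the Chinchilla point: the compute-optimal recipe works out to roughly 20 training tokens per model parameter, which is why "all the data" stops being the goal long before data runs out. The 20x ratio below is the commonly cited approximation, not an exact constant.

```python
# Back-of-envelope Chinchilla-style token budget: roughly 20 training
# tokens per parameter is the commonly cited compute-optimal ratio.

TOKENS_PER_PARAM = 20  # approximate ratio

def optimal_tokens(n_params: float) -> float:
    """Rough compute-optimal training-token budget for a model size."""
    return TOKENS_PER_PARAM * n_params

# e.g. a 70B-parameter model wants on the order of 1.4T tokens
print(f"{optimal_tokens(70e9):.2e}")  # 1.40e+12
```

Once the token budget is fixed by model size, the only lever left is which tokens you pick, which is the quality-over-volume point.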