this post was submitted on 28 Sep 2024
630 points (99.4% liked)
Science Memes
you are viewing a single comment's thread
view the rest of the comments
Just use the LLM to make the books that the LLM then uses. What could go wrong?
Someone's probably already coined the term, but I'm going to call it LLM inbreeding.
I suggested this term in academic circles, as a joke.
I also suggested "hallucinations" ~3-6 years ago, only to find out it had ALSO been suggested in the 1970s.
Inbreeding, lol
The real term is synthetic data
but it amounts to about the same thing
sort of
There was a research article applying this '70s computer science concept to LLMs. It was published in Nature and hit major news outlets. Basically, they kept training GPT on its own output for a few generations until the model degraded terribly (the phenomenon got dubbed "model collapse"). Sounded obvious to me, but seeing it happen on the open web is painful nonetheless...
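For anyone curious, here's a minimal sketch of that feedback loop. It's a deliberately toy setup, not the paper's actual method: a 1-D Gaussian stands in for the language model and numpy for the training pipeline, but the loop (fit a model, sample from it, refit on the samples) has the same shape.

```python
# Toy illustration of "model collapse": repeatedly fit a generative
# model on samples drawn from the previous generation's model, and
# the distribution quietly degrades. A 1-D Gaussian is an assumed
# stand-in for an LLM; the real experiment used a language model,
# but the feedback loop is structurally the same.
import numpy as np

rng = np.random.default_rng(seed=0)

# Generation 0: "human" data from the true distribution N(0, 1).
data = rng.normal(loc=0.0, scale=1.0, size=200)

for generation in range(15):
    # "Training": fit the model by estimating the mean and standard
    # deviation from the current corpus.
    mu, sigma = data.mean(), data.std()
    print(f"gen {generation:2d}: mu = {mu:+.3f}, sigma = {sigma:.3f}")

    # "Synthetic data": the next corpus is sampled entirely from the
    # fitted model. Finite-sample estimation error now compounds, so
    # the tails thin out and the variance drifts over the generations.
    data = rng.normal(loc=mu, scale=sigma, size=200)
```

Run it and sigma wanders away from 1.0: rare values stop being sampled, so the next fit never sees them. The LLM version is reportedly the same story, with low-probability tokens disappearing first.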
Soylent AI? Auto-infocannibalism
It's quite similar to another situation known as data incest
It can only go right because corporations must be punished for trying to replace people with machines.