this post was submitted on 26 Aug 2024
49 points (96.2% liked)
Fuck AI
1281 readers
"We did it, Patrick! We made a technological breakthrough!"
A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.
founded 8 months ago
you are viewing a single comment's thread
view the rest of the comments
Completely irrelevant. The title and the posted article are about unintentionally training LLM text-generation models on the prior output of other AI models. Not having enough training data for other kinds of models is a different problem entirely, and not what the article is about.
Nobody is going to "trawl the web for new data to train their next models" (to quote the article) for a model trying to cure diseases.