The "glue on pizza" thing wasn't a result of the AI's training, the AI was working fine. It was the search result that gave it a goofy answer to summarize.
The problem here is that it seems people don't really understand what goes into training an LLM or how the training data is used.
I think they were referencing the "glue on pizza" thing and stuff like that.