Yeah, I’ve gotten a couple of ‘omg U dumb, ur wrong’ type responses when I mention this. But it’s not just my idea; this has been widely discussed.
Here, you can read this research paper:
https://arxiv.org/abs/2305.17493v2
What will happen to GPT-{n} once LLMs contribute much of the language found online? We find that use of model-generated content in training causes irreversible defects in the resulting models, where tails of the original content distribution disappear.
or this consumer article:
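If it helps to see the effect the abstract describes, here’s a tiny toy simulation (just numpy; the sample size and generation count are numbers I picked, and refitting a Gaussian on its own samples is only a simplified stand-in for retraining an LLM, not the paper’s actual experiments). Each “generation” fits a Gaussian to samples drawn from the previous generation’s fit, and the fitted spread tends to shrink, i.e. the tails of the original distribution disappear:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sketch of "model collapse": each generation fits a Gaussian to a
# finite sample drawn from the previous generation's fitted Gaussian.
# Estimation error compounds across generations, and the fitted sigma
# tends to drift toward zero, so the original tails vanish.
# (Illustrative assumption-laden sketch, not the paper's experiments.)

n_samples = 50        # samples per generation (assumed, kept small so the effect shows)
n_generations = 200   # number of refitting rounds (assumed)

mu, sigma = 0.0, 1.0  # generation 0 = the "real" data: standard normal

for gen in range(n_generations + 1):
    if gen % 25 == 0:
        print(f"gen {gen:3d}: mu={mu:+.3f}  sigma={sigma:.3f}")
    data = rng.normal(mu, sigma, n_samples)  # sample from the current model
    mu, sigma = data.mean(), data.std()      # refit the model on its own output
```

Run it and you’ll typically see sigma decay over the generations while mu wanders off, which is the “irreversible defects / disappearing tails” picture in miniature.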
Okay fair enough.
I acted like a smug prick, but you’re absolutely right.