AI models collapse when trained on recursively generated data

Stephen Downes

Knowledge, Learning, Community

I think this is well known and well established, but here's an official source: "We find that indiscriminate use of model-generated content in training causes irreversible defects in the resulting models, in which tails of the original content distribution disappear. We refer to this effect as 'model collapse'." It's a bit like photocopying the same image over and over again - eventually you just end up with static.
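The photocopy analogy can be sketched numerically. The toy below (my illustration, not the paper's experiment) repeatedly fits a Gaussian to its own samples and then resamples from that fit - a stand-in for training each model generation on the previous generation's output. The spread, and with it the tails of the distribution, shrinks generation after generation.

```python
import random
import statistics

def fit_and_resample(data, n):
    """Fit a Gaussian to the data, then draw n fresh samples from the fit.

    This mimics training a model on the previous model's output:
    each generation can only reproduce what the fit captured.
    """
    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(0)
n = 100  # samples per "generation" (arbitrary choice for the sketch)

# Generation 0: real data from a standard normal distribution.
data = [random.gauss(0.0, 1.0) for _ in range(n)]
initial_std = statistics.pstdev(data)

# Each generation is trained only on the previous generation's samples.
for generation in range(2000):
    data = fit_and_resample(data, n)

final_std = statistics.pstdev(data)
print(f"spread at generation 0:    {initial_std:.3f}")
print(f"spread at generation 2000: {final_std:.3f}")
```

Because the fitted spread is, on average, slightly smaller than the true spread, the small losses compound: after enough generations the distribution has contracted toward its mean and the rare tail values have vanished, which is the "static" in the photocopy analogy.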



Stephen Downes, Casselman, Canada
stephen@downes.ca

Copyright 2024
Last Updated: Aug 31, 2024 7:05 p.m.

Creative Commons License.
