On Thursday 08 June 2023 at 10:42:49, Didier Kryn wrote:
> On 07/06/2023 at 23:28, Simon wrote:
> >
> > "In our latest paper, we show that using model-generated content in
> > training causes irreversible defects. The tails of the original content
> > distribution disappear. Within a few generations, text becomes garbage,
> > as Gaussian distributions converge and may even become delta functions.
> > We call this effect model collapse."
>
> This reminds me of a movie I watched long ago whose name I don't
> remember. A guy (an adult) was cloned exactly as in Unix fork(): the
> two clones shared the same past and only their futures differed
> (well, their presents, as the movie went on). But the clones were not
> all made from the original guy; some were made from previous clones,
> and each time a small defect appeared. After a few levels of cloning,
> the result was badly deficient. I enjoyed that movie very much.
"Multiplicity" with Michael Keaton.
> Actually, AI can become like the top of the class: able to repeat
> everything it has learned and nothing else. That's a chance for us,
> because it gives real humans an opportunity to develop skills to
> detect it.
>
> -- Didier
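
The Gaussian-to-delta bit in the quoted paper is easy to reproduce:
repeatedly fit a Gaussian to samples drawn from the previous fit, and
the fitted variance drifts towards zero, i.e. a delta function. A toy
simulation in C (the sample size and generation count are my own
choices, not the paper's; compile with cc collapse.c -lm):

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    /* one standard-normal variate via Box-Muller */
    static double gauss(void)
    {
        double u1 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
        double u2 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
        return sqrt(-2.0 * log(u1)) * cos(2.0 * M_PI * u2);
    }

    int main(void)
    {
        double mu = 0.0, sigma = 1.0;  /* generation 0: the "real" data */
        int n = 20;                    /* samples per generation */

        for (int gen = 0; gen <= 200; gen++) {
            if (gen % 40 == 0)
                printf("gen %3d: mu=% .4f sigma=%.4f\n", gen, mu, sigma);

            /* train generation gen+1 only on generation gen's output */
            double sum = 0.0, sumsq = 0.0;
            for (int i = 0; i < n; i++) {
                double x = mu + sigma * gauss();
                sum += x;
                sumsq += x * x;
            }
            mu = sum / n;
            double var = (sumsq - n * mu * mu) / (n - 1);
            sigma = (var > 0.0) ? sqrt(var) : 0.0;
        }
        return 0;
    }

On average the fitted sigma shrinks a little every generation, so in
the long run the distribution collapses to a spike: the tails go
first, then everything.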
Antony.
--
"Once you have a panic, things tend to become rather undefined."
- murble
Please reply to the list;
please *don't* CC me.