-
“a hypothetical scenario in which a machine learning system trained on its own output becomes unable to function properly or make meaningful predictions”
-
Via Ted Byfield: “If you’ve wondered what AI-bots are ~thinking while they generate an image, here you go.” Reverse-engineering the training samples that Stable Diffusion et al. are combining for a given text query, by searching the laion5B or laion_400m datasets.
(tags: ai clips laion ml stable-diffusion text2image)