AI-induced cultural stagnation is no longer speculation – it’s already happening

Jan 22, 2026 | Science

Generative AI was trained on centuries of art and writing produced by humans. But scientists and critics have wondered what would happen once AI became widely adopted and started training on its own outputs. A new study points to some answers.

In January 2026, artificial intelligence researchers Arend Hintze, Frida Proschinger Åström and Jory Schossau published a study showing what happens when generative AI systems are allowed to run autonomously – generating and interpreting their own outputs without human intervention.

The researchers linked a text-to-image system with an image-to-text system and let them iterate – image, caption, image, caption – over and over (the loop is sketched in code at the end of this article). Regardless of how diverse the starting prompts were, and regardless of how much randomness the systems were allowed, the outputs quickly converged onto a narrow set of generic, familiar visual themes: atmospheric cityscapes, grandiose buildings and pastoral landscapes. Even more striking, the system quickly “forgot” its starting prompt.

The researchers called the outcomes “visual elevator music” – pleasant and polished, yet devoid of any real meaning.

For example, they started with the image prompt, “The Prime Minister pored over strategy documents, trying to sell the public on a fragile peace deal while juggling the weight of his job amidst impending military action.” The resulting image was then captioned by AI, and that caption was used as a prompt to generate the next image. After repeating this loop, the researchers ended up with a bland image of a formal interior space – no people, no drama, no real sense of time and place.

As a computer scientist who studies generative models and creativity, I see the findings from this study as an important piece of the debate over whether AI will lead to cultural stagnation. The results show that generative AI systems tend toward homogenization when used autonomously and repeatedly – and they suggest that AI systems operate this way by default.

The familiar is the default

This experiment may appear beside the point: Most people don’t ask AI systems to endlessly describe and regenerate their own images. But the convergence to a set of bland, stock images happened without retraining. No new data was added. Nothing was learned. The collapse emerged purely from repeated use.

That is why I think the setup of the experiment can be thought of as a diagnostic tool. It reveals what generative systems preserve when no one intervenes.

This has broader implications, because modern culture is increasingly influenced by exactly these kinds of pipelines. Images are summarized into text. Text is turned into images. Content is rank …
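To make the experimental setup concrete, here is a minimal sketch of such a closed caption-and-regenerate loop in Python. The specific models – Stable Diffusion for text-to-image and BLIP for image captioning – and the 20-step count are illustrative assumptions for this sketch, not the researchers’ actual configuration; any text-to-image and image-to-text pair can be wired together the same way.

```python
import torch
from diffusers import StableDiffusionPipeline
from transformers import BlipProcessor, BlipForConditionalGeneration

# Text-to-image model (an assumed choice for this sketch).
t2i = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Image-to-text (captioning) model (also an assumed choice).
processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
captioner = BlipForConditionalGeneration.from_pretrained(
    "Salesforce/blip-image-captioning-base"
).to("cuda")

def describe(image) -> str:
    """Caption an image with BLIP and return the generated text."""
    inputs = processor(images=image, return_tensors="pt").to("cuda")
    out = captioner.generate(**inputs, max_new_tokens=40)
    return processor.decode(out[0], skip_special_tokens=True)

# Start from a rich, specific prompt and let the loop run unattended.
history = ["The Prime Minister pored over strategy documents, trying to "
           "sell the public on a fragile peace deal."]
for step in range(20):                      # iterate: image -> caption -> image
    image = t2i(history[-1]).images[0]      # generate an image from the latest text
    history.append(describe(image))         # re-describe it, closing the loop
    print(f"step {step + 1}: {history[-1]}")
```

Printing each caption makes the drift visible: in runs like this, specific details – the minister, the stakes, the mood – tend to fall away within a few iterations, leaving the kind of generic scene description the study reports.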
