Photo by Andre Hunter on Unsplash

AI goes MAD?

What is MAD?

MAD, or Model Autophagy Disorder, is the term coined by scientists at Rice and Stanford universities for the erosion of output quality in AI models trained on AI-generated content. These scientists considered the proliferation of AI-generated content, both text and art, on the internet and realized it could create a scenario where future AI models no longer train only on original human work. The paper linked above is not yet peer reviewed, but its assessments raise an interesting point. In the study, the scientists trained AI models by feeding them only AI-generated content. They found that, with each cycle, the models degraded in quality and diversity. As the author of the article pointed out:

“It’s also been generally true that the more data you feed a model, the better that model gets. As such, AI builders are always hungry for more training material—and in an age of an increasingly AI-filled web, that data scraping will get more and more precarious.”

Harrison, Maggie. “AI Loses its Mind After Being Trained on AI-Generated Data.” Futurism. 12 July 2023.

In other words, without enough original human content in the model's training data set, the output degrades over time.
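To get an intuition for why this happens, here is a toy sketch of a self-consuming training loop. It is not the paper's actual method, just an illustrative analogy: each "generation" fits a simple statistical model to its data, then the next generation trains only on samples from that model (with a mild bias toward the most typical outputs, like cherry-picking the most plausible generations). The spread of the data, a stand-in for diversity, collapses generation after generation.

```python
import random
import statistics

def fit(data):
    # "Train" a toy model: estimate the mean and spread of the data
    return statistics.mean(data), statistics.stdev(data)

def generate(mu, sigma, n, rng):
    # "Generate" content by sampling the fitted model, keeping only the
    # middle-ranked samples -- a bias toward typical, plausible outputs
    samples = sorted(rng.gauss(mu, sigma) for _ in range(2 * n))
    return samples[n // 2 : n // 2 + n]

rng = random.Random(0)
data = [rng.gauss(0.0, 1.0) for _ in range(200)]  # generation 0: "human" data

spreads = []
for generation in range(6):
    mu, sigma = fit(data)
    spreads.append(sigma)
    data = generate(mu, sigma, 200, rng)  # next model sees only AI output

# Diversity collapses: each generation's spread is a fraction of the last
print([round(s, 3) for s in spreads])
```

Running this prints a steadily shrinking list of spreads: without fresh, varied "human" data entering the loop, the model converges on an ever-narrower sliver of its original range.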

What does this mean for humanity?

As I mentioned before, this is just one study about MAD AI, and it is not yet peer reviewed. But to me, it offers both a glimmer of hope and a warning to humanity. The hope is that this finding underscores the importance of original human work; human input seems to be the key to the efficacy of AI. We should place higher value on human work, not lower. The warning is that if we put all our eggs in the AI-generation basket, eventually the output will have us crying out for original human work. It is always important to adapt and learn new skills so as not to be left behind by the progression of technology. However, it is also important to maintain the skills we built AI upon. If we lose the foundational blocks, the whole building crumbles.

How does this fit into science fiction, Cate?

I am glad you asked. In my made-up society, in which we are past all this AI uncertainty, people revere human creativity. We value original human work more highly than AI-generated work and regard creators as special and important safeguards who keep the system in balance. This has the interesting effect of elevating creators to levels of nobility. How does this affect the rest of society? Do the valuations of soft and hard disciplines flip? Does the definition of creativity expand to include clever coding or mathematical equations?

What effects do you foresee from AI’s MAD condition? Comment below with your ideas!