How generative models could go wrong
A big problem is that they are black boxes
In 1960 Norbert Wiener published a prescient essay. In it, the father of cybernetics worried about a world in which “machines learn” and “develop unforeseen strategies at rates that baffle their programmers.” Such strategies, he thought, might involve actions that those programmers did not “really desire” and were instead “merely colourful imitation[s] of it.” Wiener illustrated his point with the German poet Goethe’s fable “The Sorcerer’s Apprentice”, in which a trainee magician enchants a broom to fetch water to fill his master’s bath. But the trainee cannot stop the broom once its task is complete. Lacking the common sense to know when to stop, the broom brings so much water that it floods the room.
This article appeared in the Science & technology section of the print edition under the headline “How generative models could go wrong”

From the April 22nd 2023 edition