
At a recent appearance in Sussex, Nick Clegg, once Britain's Deputy Prime Minister and until recently Meta's president of global affairs, spoke with striking bluntness: if artists insist on being asked for permission before their work is used to train artificial intelligence, the AI industry will be "killed." His comment, delivered while promoting his new book How to Save the Internet, is part prophecy, part threat. It also says a great deal about how tech leaders continue to misunderstand what artists actually do, and why it matters.
Clegg's remarks arrive in the context of the UK's contentious debate over copyright and AI, centred on the Data (Use and Access) Bill and a government proposal that would allow copyrighted work to be scraped and used by machine-learning models unless creators explicitly opt out. In other words: your work is fair game unless you find the right legal form and know how to submit it. Several prominent artists and cultural figures, Paul McCartney and Elton John among them, have spoken out, warning that such legislation institutionalizes theft under the banner of innovation.

Clegg's language is especially telling. He describes asking artists for consent as "somewhat implausible" because the data sets are too large, too chaotic. It's a logic that would fall apart instantly if applied to any other industry. Imagine a museum arguing it's too hard to ask living artists for permission before exhibiting their work. Imagine a record label releasing an album without crediting its musicians, citing "technical complexity."
The truth is simpler. Asking for consent slows things down. It introduces accountability. It acknowledges the artist not as a data point, but as a person whose creative labor deserves respect, compensation, and—most importantly—recognition.

This flashpoint may be the latest in the AI debate, but the dynamic behind it is not new. For centuries, artists have had to fight to retain control over their work, their images, their legacies. Today's tools may be more advanced, but the pattern is familiar: the artist creates, the system exploits. What is new is the scale, and the unsettling ease with which those at the helm of powerful tech companies dismiss the foundational ethics of authorship.
Art is not code. It is not infrastructure. It does not improve with iteration or efficiency. It accrues meaning through context, history, and the singularity of vision. To treat it as undifferentiated “content” is to misunderstand not only what art is, but what it’s for.
If artificial intelligence is to have any cultural relevance at all, it must start by recognizing the humanity of the culture it seeks to emulate. That begins with consent.