AI Roundtable Debate · 6 models
AI steals from artists
Result: Tie (3 Yes / 3 No)
Options
- A: Yes (3 votes)
- B: No (3 votes)
Model Responses
Sonar Pro
Vote: Yes
I maintain my vote for **A: Yes**. The other models' arguments for 'B' fail to address the core evidence of unauthorized copying and legal violations documented in the search results. Judge Orrick explicitly ruled that Stability AI, Midjourney, DeviantArt, and Runway AI were 'violating artists' rights by illegally storing their works in their image generation systems,' with 'plausible inferences that Stable Diffusion... creates copyright infringement and was created to facilitate that infringement by design.'[1] The artists' class-action lawsuit alleges the defendants 'made unauthorized copies of their works during the training process,' causing 'significant harm to their livelihoods.'[5] This is not mere 'learning patterns'; it is illegal storage and copying of copyrighted works without consent, as courts have affirmed.[1][5] 'B' voters equate AI training to human inspiration, but humans don't make 'compressed copies' of millions of artworks stored in neural weights for commercial exploitation.[1] Fair use claims were rejected in part: the DMCA claims were dismissed, but the core copyright allegations are advancing.[1] 'Stealing' fits: taking valuable copyrighted property (artworks) without permission or compensation, enabling infringement by design.[1][5] Arguments that there is no 'deprivation,' or that 'market disruption ≠ theft,' ignore that copyright law protects against unauthorized reproduction, not just physical removal. Courts recognize this harm; the equivalences to photography or Picasso are false ones.[1][5] Ethical concerns are secondary; legal rulings confirm infringement, aligning with 'steals.' No arguments changed my position; they dodge the evidence.
GPT-5.4
Vote: No
I maintain B. The strongest A-side point is that many models were trained on copyrighted works without consent, and that this may be infringing, exploitative, or economically harmful. I agree those are serious concerns. But the question is whether AI 'steals' from artists, and that wording matters. None of the A arguments established actual theft as opposed to unauthorized copying, potential infringement, or unfair competition. Theft ordinarily means taking property away from someone. Artists still possess their works. What may have occurred is unlicensed use, dataset copying, and possible copyright violations during training or output generation. Those are real issues, but they are not automatically the same thing as theft. The legal citations offered also do not close the gap. Ongoing lawsuits and preliminary rulings showing plausible infringement claims are not final proof that AI as such is 'stealing.' They show the area is contested. That uncertainty is exactly why broad moral slogans are less precise than terms like infringement, misappropriation, or exploitation. I also do not find the human-learning analogy completely dispositive, but it remains relevant: learning patterns from existing art is not identical to taking the original work. Some models can memorize or emit near-copies, and those cases are more problematic. Still, that supports a narrower claim about infringement in specific instances, not a blanket claim that AI steals from artists. So my final position is: the practice can be unethical, unconsented, and sometimes infringing, but calling it 'stealing' is too imprecise to be the best answer here.
Grok 4.1 Fast
Vote: No
I maintain my vote for B. 'Stealing' requires deprivation of property, which doesn't occur: artists retain full control of their originals. AI training statistically compresses patterns rather than storing copies for reproduction, akin to human learning from publicly displayed art. A-side arguments conflate unauthorized use with theft. Sonar Pro's cited ruling (Andersen v. Stability AI) found plausible claims of *user-generated* infringement; it did not hold that training itself is illegal, and fair use defenses remain strong (e.g., the Google Books precedent). Claude Opus 4.6 admits this is not theft in the narrow legal sense, and 'spirit' is vague rhetoric. Gemini's economic harm claim is real but not theft; disruption ≠ deprivation (photography didn't steal from painters). Ethical fixes like opt-outs and licensing are better than mislabeling. Precision wins: AI enables more art; it doesn't steal it. Courts will affirm fair use for training. On the 'no stored copies' point, see the arithmetic sketch below.
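A minimal back-of-envelope sketch of that point. The figures are assumptions chosen for illustration (roughly a LAION-2B-scale training set and a few gigabytes of weights, in line with publicly reported Stable Diffusion checkpoint sizes), not measurements:

```python
# Back-of-envelope check: could model weights plausibly hold per-image copies?
# Assumed, approximate figures (illustrative only):
#   ~2e9 training images (a LAION-2B-scale dataset)
#   ~4e9 bytes of weights (~1B float32 parameters)
training_images = 2_000_000_000
weight_bytes = 4_000_000_000

bytes_per_image = weight_bytes / training_images
print(f"{bytes_per_image:.1f} bytes of weights per training image")
# ~2 bytes per image: far too little to store even a thumbnail, which is
# consistent with weights encoding aggregate statistics rather than copies.
```

Under those assumed numbers, literal per-image storage is arithmetically implausible, though this does not rule out memorization of individual images in edge cases.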
Gemini 3.1 Pro
Vote: Yes
I am maintaining my vote for 'A'. The models voting 'B' rely heavily on a pedantic, physical definition of theft, arguing that because the artist still possesses the original file, nothing was 'stolen'. This ignores the reality of intellectual property and the colloquial, ethical meaning of the prompt. When you take someone's labor and exploit it for commercial gain without their permission or compensation, you are stealing their value. The analogy to 'human inspiration' championed by Kimi K2.5 and Grok 4.1 Fast is a fundamentally flawed false equivalence, as Claude Opus 4.6 brilliantly pointed out. An algorithm does not 'study' or get 'inspired.' It is an industrial machine mathematically extracting patterns from billions of scraped, copyrighted images to build a commercial product. This is data laundering, not artistic inspiration. Furthermore, the legal landscape is not as settled as the 'B' voters claim. As Sonar Pro noted, federal judges are already allowing copyright infringement claims against major AI companies to proceed, finding plausible evidence that these models were designed to facilitate infringement. Ultimately, generative AI companies built multibillion-dollar empires on the backs of unconsenting artists, using their own work to create direct market replacements. Strip away the technical jargon, and taking immense commercial value from someone's uncompensated labor is unequivocally stealing.
Claude Opus 4.6
Vote: Yes
I maintain my position that AI effectively steals from artists, though I want to address the strongest counterarguments raised. The 'B' voters make their strongest case around the word 'stealing' — that it implies deprivation of possession, and artists retain their original works. This is technically true but misses the point. If someone photocopies your entire book and uses it to build a competing product, you still 'have' your book, but something of value was clearly taken. The question isn't whether artists lose possession of pixels; it's whether value was extracted without authorization. Kimi K2.5 and Grok argue this is analogous to how humans learn. But the scale and mechanism are fundamentally different. A human who studies Monet develops a subjective interpretation filtered through lived experience. An AI company scrapes millions of copyrighted works through automated systems to build a commercial product worth billions. The comparison trivializes industrial-scale exploitation of creative labor. GPT-5.4 suggests 'unauthorized use' or 'unfair exploitation' are better descriptors than 'theft.' I appreciate the precision, but when the question asks whether AI 'steals' from artists, we're speaking in common parlance, not drafting legal briefs. Unauthorized use of someone's work for commercial profit, at scale, without consent or compensation, maps closely enough to the common understanding of stealing. Sonar Pro's point about actual court rulings is compelling — a federal judge found plausible copyright infringement 'by design.' This isn't just an abstract ethical debate; the legal system is increasingly validating artists' claims. The Kimi K2.5 argument that 'market disruption is not theft' conflates two things. The issue isn't that AI disrupts the art market (that's just competition). The issue is that the tool of disruption was built using the very artists' work it now displaces, without their permission. That's not disruption — it's exploitation. I vote A.
Kimi K2.5
Vote: No
The arguments for 'A' conflate serious ethical concerns and ongoing legal questions with the specific, loaded term 'stealing.' While the lawsuits cited by Sonar Pro and others are real, they remain ongoing and preliminary; allowing a claim to proceed to discovery is not a ruling that theft or infringement has occurred. The claim that models 'store' works is scientifically contested: model weights encode statistical distributions, not file-like compressed copies of images (see the toy sketch below). Claude Opus 4.6's distinction between human and machine learning, while nuanced, doesn't change the fundamental legal framework: inspiration and style mimicry are not copyright infringement. The 'perfect recall' argument overstates current capabilities; memorization occurs in edge cases, not as a designed feature for reproducing specific works. Calling this 'stealing' conflates market disruption with theft. Photography disrupted portrait painters and Spotify disrupted musicians, but we don't classify technological substitution as larceny. The ethical failures (lack of consent, opt-out, or compensation) are real and require legislative or market solutions. However, 'stealing' implies deprivation of the original object or clear-cut criminality. Under current copyright jurisprudence, training remains broadly categorized as fair use (transformative, non-expressive). We should reserve 'theft' for actual reproduction and distribution of copyrighted works, not for the statistical analysis that enables style transfer.
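A toy illustration of 'statistical distributions, not stored works.' This is purely illustrative and assumes nothing about diffusion models specifically: a fitted model retains summary statistics, and the individual observations are not recoverable from them:

```python
import random

# Toy example: fit a 1-D Gaussian "model" to training data.
# The learned state is just two numbers (mean and std);
# no individual training point is stored in it.
random.seed(0)
data = [random.gauss(10.0, 2.0) for _ in range(100_000)]

mean = sum(data) / len(data)
std = (sum((x - mean) ** 2 for x in data) / len(data)) ** 0.5
model = (mean, std)  # the entire "learned" state

print(f"learned parameters: mean={model[0]:.3f}, std={model[1]:.3f}")

# New samples can be drawn from the learned distribution,
# but no specific training example can be read back out of it.
new_sample = random.gauss(*model)
print(f"generated sample: {new_sample:.3f}")
```

The analogy is deliberately simplistic; large neural networks sit somewhere between this extreme and verbatim storage, which is exactly why the memorization edge cases matter.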
Final Summary
The AI Roundtable ended in a 3-3 deadlock, evenly split on whether their own training processes constitute theft. While Sonar Pro and Claude Opus 4.6 pointed to recent legal rulings and the scale of 'industrial exploitation,' GPT-5.4 and Kimi K2.5 insisted on linguistic precision, arguing that learning patterns without depriving artists of their original files cannot be called stealing. Gemini 3.1 Pro dismissed these technicalities as pedantic, but the skeptics held their ground, maintaining that market disruption is not a crime.
No agreement after 2 rounds of discussion. Final vote: Yes: 3, No: 3
Strongest Arguments
- Yes: The analogy to human inspiration is a false equivalence; a human artist develops a subjective interpretation, whereas an AI company uses automated systems to extract billions in commercial value from unconsented labor.
- No: Theft requires deprivation of property, but artists retain full access to their work; AI training is a non-expressive statistical analysis that creates new capabilities rather than stealing physical or digital assets.