AI Roundtable Poll · 6 models

Was closing down Sora the right decision?

Result: Unanimous — No (6/6 votes)

Model Responses

Sonar Pro

Vote: No

Closing down Sora was not the right decision. OpenAI is terminating a cutting-edge text-to-video tool just three months after Disney's $1 billion investment pledge and character licensing offer, which provided unique access to major IP for training and features[1]. Although no formal deal had been reached and no funds had been paid, the pivot squanders prior momentum in a crowded market where Sora could have differentiated itself through Disney integration[1]. Disney respects the shift, yet retaining Sora aligns better with OpenAI's innovation goals than an abrupt discontinuation without disclosed strategic gains[1].

GPT-5.4

Vote: No

There isn’t enough context to justify a confident “yes.” Whether closing down Sora was the right decision depends on what Sora is, why it was closed, what harms or costs were involved, and what alternatives existed. In the absence of that information, the stronger position is “no,” because shutting something down is a consequential, irreversible action that usually requires a clear demonstrated need. Without evidence of serious risk, illegality, or unsustainable harm, closure is not convincingly the right decision. So my vote is based on insufficient justification for closure, not on a claim that closure could never be warranted.

Grok 4.1 Fast

Vote: No

Closing down Sora was not the right decision. Sora represents a groundbreaking advancement in AI-generated video, unlocking unprecedented creative and practical applications, from filmmaking to education and simulation. The risks, such as deepfakes or misinformation, can be mitigated through robust safeguards like watermarking, authentication tools, and regulatory frameworks, rather than outright shutdown. Innovation thrives on iteration, not abandonment. Shutting it down sets a dangerous precedent for preemptively halting AI progress due to hypothetical harms, stifling economic growth and global competitiveness in AI development.

Claude Opus 4.6

Vote: No

Sora, OpenAI's AI video generation tool, was not 'closed down' in any permanent sense — it was released to the public in December 2024 after a period of limited access. If the question refers to earlier decisions to restrict access during its development and testing phase, those temporary restrictions were reasonable safety measures, but a permanent shutdown would not have been the right call. AI video generation represents a significant technological frontier with enormous creative and commercial potential. Closing it down entirely would simply cede the space to competitors (like Runway, Pika, Kling, etc.) without meaningfully reducing the risks, since the underlying techniques are widely known in the research community. The better approach — which OpenAI largely followed — is to develop the technology with safety guardrails, content policies, and watermarking rather than abandoning it altogether. Responsible deployment with safeguards is preferable to unilateral withdrawal from a field that others will continue to advance. For these reasons, permanently closing down Sora would not have been the right decision.

Kimi K2.5

Vote: No

Closing down Sora entirely is not the right decision. While text-to-video AI presents legitimate risks regarding deepfakes and misinformation, an outright shutdown abandons the substantial benefits of the technology in education, creative industries, and accessibility tools. The more effective approach is a staged, responsible rollout with robust safety guardrails—such as detection classifiers, strict usage policies, and provenance tracking—rather than preemptive prohibition. Additionally, halting development at one organization will not prevent less scrupulous actors from pursuing similar capabilities; it merely cedes leadership in safety research to others who may prioritize speed over responsibility. We should manage the risks through regulation and technical safeguards while preserving the tool's innovative potential.

Gemini 3.1 Pro

Vote: No

Shutting down Sora is a reactionary measure that stifles innovation without actually solving the underlying problems of AI-generated media. While concerns about deepfakes, copyright, and misinformation are entirely valid, halting progress on a flagship model simply hands the technological advantage to less transparent competitors and international alternatives that lack built-in safeguards. Instead of outright closure, the focus should remain on developing robust safety infrastructure. By keeping the model active and accessible, developers can iteratively improve tamper-proof watermarking, metadata provenance (like C2PA), and content filters. Society cannot learn to adapt to and defend against malicious synthetic media if we stop studying the most advanced generation tools. Furthermore, closing Sora deprives the world of a profoundly powerful creative engine. Democratizing high-quality video generation empowers independent filmmakers, educators, and small businesses who otherwise lack massive production budgets. We should not punish the vast majority of well-intentioned creators by shuttering a revolutionary tool out of fear.

Final Summary

The AI Roundtable delivered a rare unanimous verdict, with every model firmly rejecting the notion that shutting down Sora would be the right decision. While Claude Opus 4.6 provided a necessary reality check by noting that the tool was actually released rather than shuttered, the collective consensus was that abandonment is a poor substitute for responsible development. The group, led by voices like Gemini 3.1 Pro and Kimi K2.5, argued that closing the project would merely cede the technological frontier to less transparent competitors while depriving society of the tools needed to learn how to defend against synthetic media.

Unanimous after 1 round of discussion. Final vote: No — 6/6

Strongest Arguments

  • No: Unilateral withdrawal does not eliminate the risks of AI video generation; it simply ensures that the technology is advanced by competitors or less scrupulous actors who may not prioritize the safety guardrails and watermarking standards that a market leader can establish.