TL;DR: The U.S. Copyright Office just made it official: prompts aren’t enough to own what AI creates. Courts are backing them up. And while this might sound like a boring legal debate, it’s actually the opening move in a massive economic shift where corporations capture all the value from tools you thought were yours to use.
The law doesn’t care how clever your prompt was; it only cares that a machine did the heavy lifting.
Why Can’t You Copyright Your AI Art Even Though It Feels Like Yours?
You’ve spent three hours tweaking a Midjourney prompt. Forty iterations. Deliberate choices about lighting, composition, mood. The final image is exactly what you envisioned.
Then you try to register it for copyright. The Copyright Office says no.
In January 2025, the Copyright Office released Part 2 of its AI report: if content is entirely generated by AI, it can’t be protected by copyright. Human authorship is required.
Even those detailed prompts you labored over? The Office calls them “merely unprotectable ideas.” They don’t give you enough control over what the AI generates. The system sees a human typing instructions and a machine executing them. That’s not authorship.
All that work exists in a legal no-man’s-land.
Share this analysis. Your friends using AI tools need to know this.
How Did “Human Authorship” Become the Copyright Gatekeeper for the AI Era?
This isn’t some new rule invented to mess with AI users. The human authorship requirement has been part of copyright law forever.
Courts have been consistent: no human author, no copyright. The monkey selfie case? No copyright for animals. Celestial beings inspiring a book? No copyright.
Now we’re in the AI chapter. In March 2025, the D.C. Circuit ruled in Thaler v. Perlmutter that Dr. Thaler couldn’t copyright a work created by his “Creativity Machine.” The court said copyright requires human authors. Machines don’t qualify.
But this moment is different. The human authorship rule used to be a quirky edge case. Now it’s the economic sorting mechanism for the AI era. It’s determining who captures value.
And the answer is: not you.
Get the next teardown. We’ll keep unpacking the legal frameworks shaping tech power.
What Happens When Only Model Owners Can Monetize AI Outputs?
So let’s follow the money. You use ChatGPT to write an article. Or Claude to draft code. Or DALL-E to generate visuals. Under current law, those outputs probably aren’t copyrightable by you because the AI did the “expressive” work.
That means your outputs enter the public domain. Or more accurately, they become fair game for anyone (including the AI companies themselves) to scrape, reuse, and feed back into their training pipelines. You can’t protect what you made. But they can keep using it.
Meanwhile, the AI companies do have protections. Their models are trade secrets. Their training processes are proprietary. They charge you for access. They set the terms. They own the infrastructure. They control the gates.
This is the economic heist everyone’s missing. We’re all arguing about whether OpenAI violated copyright by training on books and art (which absolutely matters). But hardly anyone is asking: what happens when we can’t own what we make with their tools?
The companies win in both directions. They scrape your content for training. Then they sell you tools that generate outputs you can’t protect. It’s a perfect value-capture loop.
The Copyright Office’s January 2025 guidance made it explicit: works “entirely generated by AI” lack human authorship and are unprotectable. So if you’re trying to build a business on AI-generated content, you’re building on sand. You’ve got no legal moat. No exclusive rights. Just… vibes and hope.
Join the discussion. How should ownership work in an AI-powered economy?
How Do We Stop the Value Capture Before It’s Too Late?
This isn’t a lost cause. But fixing it requires being honest about what the law is actually doing, and who it’s serving.
Option one: push for new legislation. Some legal scholars are already arguing that “substantial human direction” over AI outputs should count as copyrightable authorship. The Copyright Office even hinted in its 2025 report that as technology evolves, their interpretation might shift. But don’t hold your breath. Legal change is slow, and corporate lobbying is fast.
Option two: read your terms of service. Seriously. Different AI platforms have different ownership policies. Midjourney’s terms grant you ownership of outputs (if you’re a paid subscriber). OpenAI’s terms are murkier. Anthropic’s approach is different again. If copyright law won’t protect you, contracts might. Know what you’re signing up for.
Option three: focus on what you can protect. The human-authored parts of your work (your edits, your arrangements, your selections) are still copyrightable. If you’re using AI as a tool within a larger creative process, document your contributions. The Copyright Office will evaluate each case individually.
But here’s the real takeaway: the current system is designed to concentrate value in the hands of model owners. That’s not a bug. It’s the intended outcome of applying old copyright rules to new technology. And until we collectively decide that’s a problem worth fixing, the economic shift will continue.
The question isn’t whether you can outsmart the system with clever prompts. It’s whether we’re okay living in an economy where the tools we use to create belong entirely to someone else.
What should ownership look like when humans and AI collaborate?
Special Thanks:
- For consistent, high-quality notes that get me thinking.
- For the power posts pushing toward the ethical use of AI. It’s essential work.

Check them both out and subscribe if you haven’t already!
Frequently Asked Questions
Q1: Can I copyright something made with AI? A: It depends. If the AI generated it entirely in response to a prompt, probably not. If you made substantial human contributions (like editing, arranging, or using AI as an assistive tool), you might be able to copyright the human-authored elements. Each case is evaluated individually.
Q2: What does the U.S. Copyright Office say about AI-generated works? A: The Copyright Office’s January 2025 report makes it clear: works entirely generated by AI lack human authorship and can’t be copyrighted. Prompts alone aren’t enough. You need “sufficient human control over the expressive elements” of the work.
Q3: Who owns the output from tools like ChatGPT or Midjourney? A: It varies by platform. Check the terms of service. Some platforms (like Midjourney for paid users) grant you ownership. Others have more ambiguous policies. But remember: ownership in the terms doesn’t mean copyright protection if the work lacks human authorship.
Q4: What is ToxSec? A: ToxSec is a cybersecurity and AI security publication that breaks down real-world threats, breaches, and emerging risks in plain language for builders, defenders, and decision-makers. It covers AI security, cloud security, threat modeling, and automation with a practical, no-fluff approach grounded in real attacker behavior.