
Welcome to the show.
Today we're talking about fear and AI, and why panic prevents writers from making informed decisions about the most powerful creative tool of our generation.
I spent six months researching AI and copyright law. Not because I wanted to become a lawyer. Because I owed it to myself—and to every writer considering AI collaboration—to understand what was actually happening versus what people feared might happen.
Marie Curie said it best: "Nothing in life is to be feared, it is only to be understood. Now is the time to understand more, so that we may fear less."
That's what we're doing today.
Fear Creates Paralysis
Here's what nobody tells you about the AI panic: most of it is based on incomplete information.
I watched writer after writer make career decisions—refusing to learn AI, avoiding collaboration entirely, missing opportunities—all based on headlines they'd read but never investigated.
Think about it: Would you make major financial decisions based on Reddit threads? Would you choose medical treatment based on Twitter arguments?
Then why make creative career decisions based on panic and hype?
I realized something crucial: the biggest risk isn't AI itself. It's writers making decisions based on incomplete information.
That's when I decided to do the work. Six months of research. Court documents. Terms of service. Legal analysis. Not to become an expert. To become informed.
Fear #1: AI Companies Stole My Book to Train Their Models
This is the fear that dominates writer forums. "They used my book without permission. They're profiting from my work. It's theft."
In 2024, authors filed a class action lawsuit against Anthropic. The allegation: Anthropic used millions of copyrighted books to train Claude without permission.
The headlines screamed: "AI Company Caught Using Pirated Books!"
But here's what the headlines often missed.
The court ruled that Anthropic's use of the books for training was acceptable as "fair use." The court distinguished between two separate issues:
Can AI companies legally train on copyrighted material? Yes, according to the court.
Can they acquire that material through illegal means? No. That's the problem.
Anthropic settled the lawsuit not because training on books is illegal, but because acquiring them from pirated "shadow libraries" was. The court treated those illegal downloads as a separate act of infringement.
What This Means for Writers
Let me be explicit:
Your published books CAN be legally used for AI training—if companies acquire them legally
You cannot stop legal AI training
The only legal protection is against piracy—the illegal acquisition method, not the training itself
Just because the courts declared it legal doesn't mean writers have to like it. Many don't. That's valid.
But understanding the legal reality helps you make informed decisions about your own AI use.
I can resist this reality and refuse to engage with AI. Or I can accept it and use the training that happened to improve my craft. I chose the second option.
Fear #2: My Conversations with AI Are Training Future Models
Here's what you need to know:
Both Anthropic and OpenAI now let you control whether your conversations train future models. You can opt out. Your conversations remain private either way.
And you own the output—both companies explicitly assign ownership of generated content to you.
Does opting out affect AI performance? No. There's no evidence that opting out reduces quality or capabilities.
Does allowing training make my writing available to others? No. Training data becomes part of the model's general knowledge—it cannot share your specific passages with other users.
The key: understand your options and make informed decisions.
Fear #3: AI Is Eliminating Writing Jobs
I admit: this is the most legitimate fear. It's also the oldest fear in the writing profession.
In 1474, scribes petitioned to outlaw the printing press. The petition was refused. Scribes did lose copying work at first, but many transitioned to original writing.
When word processors arrived in the 1980s, writers feared they would destroy authentic creativity. Gore Vidal warned: "The word processor is erasing literature."
Each technological leap displaced traditional roles while creating new opportunities.
Yes, some jobs will be lost. I believe this is inevitable. But there will always be a place for writers who keep growing, who take craft seriously, and who leverage every available tool. That means staying creative and looking for new opportunities instead of clinging to the past.
The pattern is clear: technology changes the profession. Writers who adapt and develop new skills alongside new tools tend to thrive.
The question isn't whether AI will change writing. It already has. The question is: will you develop skills that complement AI, or will you pretend it doesn't exist?
Fear #4: I'll Lose My Ability to Copyright
This fear stems from confusion about AI-generated versus AI-assisted work.
Here's the distinction:
AI-generated: The AI creates the content with minimal human input. Copyright protection is uncertain.
AI-assisted: A human makes all creative decisions, writes the prose, and uses AI as a coaching tool. Full copyright protection likely applies.
If you use AI as a writing coach—helping develop characters, plot, and scenes—but you write the actual prose and make all creative decisions, your human authorship receives full copyright protection.
The distinction between "generated by" and "assisted by" matters legally. Use AI ethically as a coach, not as a content generator, and your copyright remains intact.
The Biggest Risk
After six months of research, here's what I concluded:
The biggest risk is writers making decisions based on incomplete information.
Fear-based decisions mean refusing to learn AI collaboration skills, avoiding tools that could strengthen your craft, and making career choices based on panic instead of research.
Informed decisions mean understanding the legal landscape, knowing your rights and limitations, and developing skills that complement AI rather than compete with it.
Here's the truth about the four fears:
Stolen books: Courts say training on legally obtained books is fair use. You're only protected against piracy.
Training on conversations: You control this. Your conversations remain private. You own the output.
Job elimination: Technology always changes professions. Writers who adapt tend to thrive.
Copyright loss: AI-assisted work maintains copyright protection.
None of this means you must use AI. It means: make informed decisions based on facts, not fear.
Starting Your Own Journey
If you want to explore ethical AI collaboration—the kind that strengthens your craft rather than replaces it—I teach exactly this in my course.
Here's the truth: informed writers make better decisions than fearful writers.
Your voice matters. Your story matters. And you deserve accurate information.
Transform fear into clarity. Panic into informed choice. Paralysis into possibility.
See you in the next podcast.