Your Novel Is Not Training Data
In September 2023, the Authors Guild — alongside George R.R. Martin, John Grisham, Jodi Picoult, and over a dozen other authors — sued OpenAI for training its models on their books without permission. Since then, the number of copyright infringement lawsuits against AI companies has climbed past seventy. Brandon Sanderson delivered a keynote calling out the fundamental problem: that AI art tools take the creative labor of real people and use it as raw material, with or without consent.
This isn’t an abstract debate. If you’re a writer, it’s about your words.
The quiet risk of cloud writing tools
Most writers don’t read terms of service. But buried in the fine print of many cloud-based writing platforms, you’ll find language granting broad licenses to your content — for “improving the service,” “training models,” or “enhancing features.”
Even tools with good intentions often can’t guarantee what happens to your data:
- Acquisitions — the company that respects your privacy today might be bought by one that doesn’t tomorrow
- Policy changes — terms of service can be updated at any time, often without meaningful notice
- Third-party processors — your text might pass through APIs and services with their own data policies
- Breach exposure — unpublished manuscripts sitting on a server are one security incident away from being public
The moment your draft leaves your machine, you’ve lost a degree of control you can never fully get back.
What authors are saying
The concern isn’t theoretical. Brandon Sanderson argues that even if AI models were trained on entirely non-copyrighted work, the fundamental problem remains — the machine does the creative labor without being changed by it, while real artists lose the market for their craft. Sarah Silverman, Richard Kadrey, and Christopher Golden filed early lawsuits against OpenAI. The New York Times sued OpenAI and Microsoft in late 2023 for copying millions of its articles.
And in one of the most significant rulings so far, a federal court found that while AI training itself might qualify as fair use, acquiring training data through pirated book collections does not. The message is clear: the creative community is drawing a line.
For novelists, the takeaway is simple. Your unpublished manuscript is your most valuable asset. Treat it accordingly.
How Writefully So keeps your work yours
Writefully So is local-first in the most literal sense. Your projects live in a folder on your computer — ~/Documents/Writefully-So/ by default. The app stores your data in a local SQLite database inside each project folder. No servers, no accounts, no sync services, no network requests at all.
That means:
- No one can train on your writing — it never leaves your machine
- No terms of service govern your manuscript — it’s just files on your disk
- No company shutdown can take your work — you have the only copy
- No breach can expose your draft — there’s nothing to breach
We don’t collect telemetry, analytics, or crash reports. The app is invisible to us, and your writing is invisible to everyone but you.
The caveat: you own the backup too
The tradeoff of local-first is real. If your hard drive dies and you don’t have a backup, your novel goes with it. There’s no “restore from cloud” safety net. You’re in control — and that means backups are your responsibility.
Built-in protection
Writefully So has a few layers of protection built in:
- Project backups — the app creates timestamped database backups inside your project’s 06_Backups/ folder, keeping the five most recent copies automatically
- Chapter snapshots — you can take manual snapshots of any chapter at any time, and the app auto-snapshots every ten minutes while you write. If you restore a snapshot, it first saves your current state, so restores are always reversible
- Graveyard system — deleted chapters, characters, maps, and codex entries aren’t actually deleted. They move to a graveyard where you can recover them anytime
These protect you from accidental edits and deletions. But they live inside the same project folder — so they won’t save you from a dead drive.
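The timestamp-and-prune rotation behind project backups is simple enough to sketch in a few lines of shell. This is an illustrative script, not the app’s actual implementation; the database filename (project.db) and the function name are assumptions.

```shell
# Hypothetical sketch of timestamp-and-prune backup rotation.
# backup_rotate copies a project's database into 06_Backups/ and keeps
# only the five newest copies; "project.db" is an assumed filename.
backup_rotate() {
  project="$1"
  backups="$project/06_Backups"
  mkdir -p "$backups"
  cp "$project/project.db" "$backups/project-$(date +%Y%m%d-%H%M%S).db"
  # list newest first; everything after the fifth entry is pruned
  ls -1t "$backups"/project-*.db | tail -n +6 | while read -r old; do
    rm -- "$old"
  done
}

# e.g. backup_rotate "$HOME/Documents/Writefully-So/my-novel"
```

Because old copies are pruned oldest-first, the folder never grows without bound, but you always have several recent states to roll back to.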
External backup strategies
For real disaster protection, you need copies outside your machine. Here are proven approaches:
- Time Machine (macOS) — set it up once and forget it. Your entire Documents folder, including all Writefully So projects, gets backed up automatically to an external drive
- External drive — periodically copy your ~/Documents/Writefully-So/ folder to a USB drive or external SSD. Simple and reliable
- Cloud sync for backups — use Dropbox, iCloud Drive, Google Drive, or similar to sync your project folder. Yes, this puts copies in the cloud — but you’re syncing your own files on your own terms, not handing them to an app’s servers under a terms-of-service license
- Git — if you’re technical, run git init inside a project folder and commit periodically. You get full version history and can push to a private repo for off-site backup
- 3-2-1 rule — the gold standard: three copies, on two different types of media, with one stored off-site
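The Git approach can be wrapped in a small helper you run whenever you finish a session. A minimal sketch, assuming your project lives under the default folder; the function name and commit identity are illustrative, not part of the app.

```shell
# git_snapshot: version a project folder with Git (illustrative helper,
# not part of Writefully So). Safe to re-run; each call commits whatever
# changed since the last snapshot.
git_snapshot() {
  project="$1"
  git -C "$project" init -q          # idempotent: re-running is harmless
  git -C "$project" add -A           # stage every file in the project
  git -C "$project" -c user.name="Backup" -c user.email="backup@localhost" \
    commit -q -m "Snapshot $(date +%F-%H%M%S)" || true   # no-op if unchanged
}

# e.g. git_snapshot "$HOME/Documents/Writefully-So/my-novel"
# For off-site copies, add a *private* remote and push after each snapshot.
```

A push to a private repository you control also satisfies the off-site leg of the 3-2-1 rule.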
The key difference: when you choose to put a copy in Dropbox or on a USB drive, you control the terms. No one is quietly licensing your work to train a model. That’s fundamentally different from writing inside a cloud app that processes your keystrokes on someone else’s infrastructure.
Write freely, back up deliberately
The rise of AI hasn’t changed what good writing requires — vulnerability, honesty, and the willingness to write badly before you write well. But it has changed the stakes of where you do that writing.
Your first draft — messy, raw, full of ideas you haven’t figured out yet — deserves to exist in a space that belongs only to you. And it deserves a backup strategy that keeps it safe without signing it away.
Writefully So is free to download and use. Your stories are yours. Keep them that way.
This post was drafted with AI assistance and reviewed by our team. We use AI for product communication — never for creative writing, and never to train on yours.