Opinion: AI, Fair Use, and the Content Scraping Crisis. Innovation or Exploitation?

Estimated reading time: 4 minutes

This one’s not about SEO. I have many thoughts on AI and SEO. With this latest Anthropic settlement, it’s worth documenting my thoughts in this blog (so I can go back and read it again in a few years, if the world hasn’t ended).

The AI boom has been huge in our SEO industry. It also bulldozed straight into one of the oldest legal battlefields in America: copyright and Fair Use.

Big Tech wants AI amnesty. It wants to keep scraping millions of books, articles, news sites, blogs, images, and training materials without permission. After all, it’s all “transformative,” therefore Fair Use, right? Sounds a bit iffy to my common sense. Meanwhile, courts, authors, journalists, and indie publishers are lining up to say: absolutely not.

Fair Use doctrine is at the center of this AI amnesty fight. Fair Use, if I understand this correctly (I hope I do), is a rule designed for teaching, commentary, and transformative work — not for scraping millions of copyrighted books to train trillion-parameter models.

Large language models exist because humans created the data that trains them. That’s the paradox no one has solved. AI needs data to learn. Humans create that data. So who owns what? Who pays whom? And how much “training” becomes too much?

No one knows. That’s exactly why Silicon Valley is now lobbying for an amnesty that would legally give them permission to keep using scraped content under the umbrella of innovation. They’re trying to codify the status quo into law before the lawsuits pile high enough to make it impossible.

“Let’s forget how we got here and focus on innovation.” Am I right? That line reminded me of the snafu with the COVID pandemic too. “Let’s forget how we got here and focus on the next pandemic.” All in favor say aye. Those who got kickbacks, say it louder.

The $1.5 billion Anthropic settlement is the clearest sign yet that courts aren’t buying the amnesty narrative. “You used copyrighted data. You profited from it. Now pay.”

That’s fair. The thing about AI (and I say this over and over) is that there are too many unknowns, legal challenges among them. If courts rule that AI cannot be trained on copyrighted work without explicit permission, does every model we use today become legally radioactive? The entire ecosystem would need to be rebuilt from scratch.

If courts say AI companies can scrape anything under Fair Use, then content creators, researchers, authors, etc., the ones who actually fuel the machine, lose every ounce of leverage they have left.

I cheer for innovation, but I know I’m exploited.

Many indie publishers, journalists, and professional writers are watching paychecks shrink, contracts disappear, and second or third jobs become a necessity. Their work becomes raw material for billion-dollar AI models.

I’m thinking of Stalin’s “Who? Whom?” Who is doing something, and to whom is it being done? (No, I’m not a commie. That was just the example that came to mind.)

Apply it to the AI content battle.

Who benefits? Big Tech. Their models get smarter and faster.

Whom does it happen to? Writers. Journalists. Indie publishers. The small creators who produced the very corpus that made AI possible.

Meanwhile, on the global stage, the dynamics get even more complicated.

Outside the U.S., many drive-by SEOs don’t worry about copyright exposure at all. They can spin up disposable sites, pump out AI content at scale, burn the domains down, rinse, repeat. U.S. laws can’t reach them. The financial incentives are enormous, especially in countries where multiple websites can generate meaningful income or where agencies resell AI-generated content to naive clients who can’t tell the difference.

It’s not all exploitative. As much as I dislike the socially negative things I see in AI, the same technology that fuels spam also opens doors for many.

Students in developing countries now have free tutors. Aspiring professionals can learn coding, marketing, and writing skills they’d never have had access to. Workers can upskill, start businesses, and compete globally in ways that weren’t possible even five years ago.

AI hurts creators.
AI empowers creators.

AI destroys industries.
AI creates industries.

Without a clear regulatory path, the courts, Congress, and Big Tech are all playing chicken with the future of the creator economy.