There have been a lot of predictions around how AI is going to change the world of SEO and content marketing, such as:
- “AI is going to remove the need for writers”
- “You’re going to be able to produce 10X the content with AI”
- “Search is going to become so much more competitive because companies will be able to churn out 100s of blog posts in the amount of time it takes a human to produce 1 blog post”
- “Keyword strategy won’t be important because you can produce so much content that you’ll be able to rank for any keyword you want to target”
- “Writers will become prompt engineers – all you need to be good at is telling AI what to write for you”
All of these arguments center on the theme of companies being able to do far more with fewer resources, but none of the conversation has focused on the quality of AI-written content and whether your customers will actually want to read anything produced with AI.
To me, that’s where the center of the conversation should be.
Here are some of the questions that I’ve been asking myself:
- Can AI produce the content that we produce better than a human on our team can?
- Can it educate customers on a product or industry better than a human can?
- Does it have unique and compelling arguments that are different from what everyone else writes on the subject?
The answer to all of those questions at this point is no. This is why we haven’t been as bullish on AI as other content agencies and vendors pushing the AI narrative.
That’s not to say AI can’t get there, but I think people are focusing too much on the speed of work instead of the quality of work, and that’s a dangerous game to play if you want results from your content investment.
Let me explain why I think this.
ChatGPT Content is Mirage Content 2.0
In 2016, I wrote an article called Mirage Content. It argued that most of the long-form content produced by marketing departments looked, on the surface, like it was well-written, but if you actually read the details, it was just high-level fluff. Specifically, when you do a Google search on a topic, many of the articles on the first page have interesting titles, but when you click into them, they all just regurgitate versions of the same few talking points.
I argued in that piece that this problem stemmed from two flaws in the content production process:
- Most companies would hire freelance writers to write on a subject they had no expertise in. That writer would “research” the topic by Googling it and reading the top 10 results and then rehashing what those articles already said. We coined this The Google Research Paper approach, as it mimics a high school student producing a research paper where you can clearly tell the person doesn’t have expertise on the topic they’re writing about.
- The second flaw was that most articles weren’t specific enough. They covered broad concepts, which forced them to stay at a high level. They never got deep enough into the details that people actually care about. This specificity problem was also a function of the writer not having expertise on the topic they were writing about.
Now if we think about what AI does, it basically mimics the flawed approach outlined above. AI is trained on a large portion of the web, books, and articles written on a topic, so when you ask it about anything, it simply rehashes what’s already been written on that topic (in grammatically pristine English, to be fair).
But what is the “correct” answer? How does AI decide what to say? What does it say on topics where there is no right answer? Does it ever take a strong stance on a topic?
The answer is that AI writers are biased toward saying the most average thing because that’s how they’re built. They’re programmed to produce a response that is as similar as possible to the information they were trained on. All AI-writing tools know is what they’ve been trained on, which is, by definition, things that people have already said. They are literally programmed to find the most likely word that goes after the word they just wrote, over and over again.
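That “most likely next word, over and over” loop can be made concrete with a toy sketch. This is not how any real LLM is implemented (they use neural networks trained on vast corpora, not word-pair counts), but the core generation loop is the same shape: look at what came before, emit the statistically most common continuation, repeat.

```python
# Toy bigram "language model": count which word follows which in the
# training text, then greedily emit the most frequent follower each step.
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for every word, how often each other word followed it."""
    counts = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def generate(counts, start, length=5):
    """Repeatedly append the single most likely next word (greedy decoding)."""
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break  # never saw this word mid-sentence; nothing to predict
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

# The model can only ever echo back phrasings it was trained on.
corpus = "content marketing is hard and content marketing is everywhere"
model = train_bigrams(corpus)
print(generate(model, "content", length=3))  # → "content marketing is hard"
```

Note what the toy model cannot do: it never produces a word sequence that wasn’t already in its training text, which is the point of the argument above, just at a much smaller scale.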
This is why AI content feels so much like what I defined as Mirage Content 7 years ago: when you first read it, it sounds really articulate and well-written, but when you stop to really dissect the arguments, you realize it’s just stating the same generic arguments in different ways.
So as a content marketing tool, AI is effectively just making it easier to produce mirage content at scale. Many companies and agencies are taking the same flawed approach they used before to produce content and are now doing it faster with technology. Instead of hiring freelancers to Google a topic for an hour and regurgitate what the top 5 results are saying into their own blog post, they can now just use AI to do the exact same thing, but faster.
So if this is the type of “content marketing” you do (largely surface-level coverage of introductory topics), I absolutely agree that AI could be a good replacement for this type of writing. This is writing where you don’t care about selling your own or your company’s expertise to someone else, and where you don’t care about having your company’s or CEO’s opinion on the topic woven into the pieces. There are a large number of writers and agencies who produce mirage content and who will get replaced, because their approach and strategy were flawed to begin with.
But most companies don’t want to produce content that says the same thing as everyone else. They want to share unique, original, or provocative opinions. They want to explain why the approach their competitors are taking to solve industry problems is wrong, and why their approach is better. They want to be known for their thoughts and their opinions and be considered “thought leaders” in their industry (easier said than done).
This kind of high-quality content is effective. It helps convince customers about your product over competitors and it helps your brand and ideas stand out from the rest. But if your process to produce great content starts with getting the argumentation on a topic from AI, you’ve already lost. You’re just going to end up with the same unoriginal content as everyone else.
Going forward, I think AI-written content will be a race to the bottom. Companies are going to experiment with it, the web is going to get flooded with more bad content, and I think in a year or two the most valued content will be the pieces produced by humans. People have always wanted to read something that challenges their thinking, educates them on something new, or offers a differing opinion. The way to produce content like that going forward isn’t with AI; it’s with humans.
There are many people who might disagree with this opinion, but I have yet to see anyone show me an example of a good piece of content produced with AI. So if you have one, feel free to share a link to it below.