Is it because I’m in the SEO world or are you picking up on the debate around artificial intelligence (AI) vs. human content creation too, particularly around ChatGPT?
It was a feature on The Last Leg a couple of weeks ago: will AI replace human writers? They put ChatGPT to the test by seeing if it could successfully write a joke. The answer? Not really.
Then, on 27th February, Last Week Tonight’s John Oliver spoke passionately about AI’s pros and cons, as well as the rise of ChatGPT. This segment highlighted some notable AI fails (for example, a self-driving car that hit and killed a pedestrian because it hadn’t been programmed to recognise jaywalkers), as well as AI successes, not least ChatGPT’s skill at writing an Eminem-style rap about kittens!
(At the time of writing this article, the Last Week Tonight episode isn’t available in the UK on YouTube, but you’ll probably find it here when it is – it’s worth a watch!).
In the last five days, I’ve received 15 newsletters from different businesses I follow all talking about ChatGPT! The varying perspectives are fascinating. Some people are clearly rattled by using AI for content creation, while others are excited and inspired.
This is a huge conversation with lots to consider, so I thought you might find it helpful if I pull together my thoughts about the AI vs. human content creation debate in a blog.
Buckle up, it might be a long one!
What is ChatGPT?
Right now, ChatGPT is the AI tool that everyone is talking about. The prototype launched at the end of November 2022 and became an overnight sensation.
ChatGPT is essentially an AI-based chatbot that uses natural language processing and deep learning techniques to generate human-like responses to a wide range of conversational queries.
Broadly speaking, deep learning consists of collecting, analysing and interpreting large amounts of data to make future predictions or to create content based on what the AI is able to understand from the data.
This tool draws on a massive amount of text data from the internet, including books, articles and websites, to generate responses and content that is relevant, informative and – sometimes – indistinguishable from what a human might write in response to the same query.
The training data for ChatGPT includes both structured and unstructured data. It’s what is known as a “generative model”, which means that it uses the knowledge and patterns learned from the training data to generate new text rather than repeating the training data itself.
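Broadly, that “learn patterns, then generate new text” idea can be illustrated with a toy example. The Python sketch below is a deliberately simplified bigram model – nothing like the scale or sophistication of ChatGPT – that learns which word tends to follow which in a tiny training text, then strings together new text from those learned patterns rather than repeating the training text verbatim:

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Learn which words tend to follow which in the training text."""
    words = text.split()
    model = defaultdict(list)
    for current, nxt in zip(words, words[1:]):
        model[current].append(nxt)
    return model

def generate(model, start, length, seed=0):
    """Generate new text by repeatedly sampling a learned next word."""
    rng = random.Random(seed)  # seeded for repeatable output
    word = start
    out = [word]
    for _ in range(length - 1):
        followers = model.get(word)
        if not followers:
            break  # no learned continuation for this word
        word = rng.choice(followers)
        out.append(word)
    return " ".join(out)

corpus = "the cat sat on the mat and the dog sat on the rug"
model = train_bigrams(corpus)
print(generate(model, "the", 6))
```

Real large language models do something conceptually similar – predicting likely next words – but with neural networks trained on billions of documents rather than a one-line corpus, which is what makes their output so human-like.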
People are excited about tools such as ChatGPT because they have the potential to be used in a number of ways – for example, to provide customers with conversational live-chat experiences or to deliver a good-quality customer service or technical support response at any time.
Additionally, people are now using ChatGPT to create content for websites, blogs, courses, product descriptions, social media posts, and more.
Currently, ChatGPT is a free service, although it sometimes struggles with the number of people logging in to use it. As a result, there is a paid option ($20/month at the time of writing) that gives subscribers priority access to the AI chatbot, even at peak times when free users may have to wait.
Other AI tools like ChatGPT
There are many other AI tools now available, depending on whether the user needs help with writing and content generation, coding, translation, research, productivity or online conversation.
One area where people are particularly excited is in the field of medicine and research, where AI may be a gamechanger in terms of the prediction and early identification of certain illnesses.
Another area creating waves is the host of AI art generator tools that can help users to generate artwork in various styles, usually from a text description. Again, debate is raging about the ethics of these tools. Is it plagiarism if AI tools draw on the work of human artists to create? On the flipside, don’t all creators draw on the artists and artwork that has inspired them? Is the ability to create a distinctly human ability? Does AI-generated artwork devalue the work of human artists?
It’s a minefield! Right now, Getty Images is suing Stability AI, the creators of AI art tool Stable Diffusion, over alleged copyright infringement. I imagine people will be watching the outcome of this case with interest and that a variety of lawsuits may follow.
The pros and cons of this technology are thought-provoking and potentially far-reaching.
It’s worth being aware that the current AI tools are built around the same open-source text sets for training and learning purposes, even if how they learn or generate content differs slightly. Google’s AI tool, Bard (more about this below), might shake things up though as it will have access to Google’s search index.
The potential of tools like ChatGPT
I can see why ChatGPT and other AI tools are creating such a buzz right now. Gone are the days when anyone needs to stare at a blank page and wonder where to start. ChatGPT can draw on information from across the internet – the output of millions of human minds – to inspire you.
Stuck for blog ideas? ChatGPT has loads. Need to write hundreds of product descriptions? ChatGPT can help. Want advice about structuring a compelling course? ChatGPT can give you a logical structure.
And because ChatGPT remembers what you said earlier in the conversation, it can learn from your searches and requirements and refine its answers to better reflect what you want.
Leveraging data from the internet, AI tools can also help people to create content that’s optimised for searches and better reflects the search intent that is likely to bring your ideal customers to your website.
I think ChatGPT is particularly exciting for small business owners who don’t currently have the budget to outsource but still want to improve their content and SEO while saving precious time.
BUT – and this is crucial – I genuinely don’t think it’s sensible to hand all content creation over to ChatGPT. Human input is essential. While AI tools can generate high-quality content quickly and efficiently, they can’t replace the creativity and strategic thinking that humans bring to the table.
Also, we have to recognise that the output of AI tools is dictated by the data that goes in. If that data is flawed from the outset – and, given that it’s human-generated, it might be – the output can be equally flawed. Who can forget how, in 2016, Twitter users taught Tay, Microsoft’s AI chatbot, to become racist in less than 24 hours?
The limitations of AI tools like ChatGPT (or why I believe human input won’t become obsolete)
Before you start a new conversation in ChatGPT, it clearly sets out some limitations on the welcome screen:
Specifically, we’re told that the tool may:
- Occasionally generate incorrect information
- Occasionally produce harmful instructions or biased content
- Have limited knowledge of the world and events after 2021
A risk to reputations
The point about incorrect information is significant. In February, Search Engine Land and other sources reported that an AI-generated article published on the Men’s Journal website included 18 errors, several of them “serious” inaccuracies.
As a health-focused “your money or your life” (YMYL) type website, accuracy is everything. Google and other search engines want people to be able to come to YMYL websites and know that they can trust the advice they read. The article even included a statement that it had been “reviewed and fact-checked by our editorial team”, reassuring people that there had been a human element to the content creation. However, the article’s inaccuracies would suggest otherwise and could potentially damage the credibility of the site in question.
This example illustrates why it would be foolhardy to generate content using AI tools and then send it, unchecked and unedited, into the world.
Let’s consider biased content too.
It would seem that, due to the biases within the training data, AI tools are learning long-standing prejudices. As Diginomica reports, there have been some high-profile ‘embarrassments’ that have made it into the public domain, such as:
- AI tools struggling to recognise non-white faces because they were trained on images that were overwhelmingly white (in 2015, a Google image recognition algorithm auto-tagged pictures of black people as gorillas).
- Sentencing algorithms piloted in the US were accused of being more likely to discriminate against black defendants. This appears to be because the algorithms looked at factors such as prior arrests when determining whether someone was likely to reoffend. We know that black neighbourhoods in America tend to have a heavier police presence and that black people are more likely than white people to be arrested for the same offence. This bias in society translated to the AI algorithm, even though black and white offenders reoffend at an almost identical rate.
- An Amazon hiring tool used to rank job candidates and sift through CVs learned to downgrade applicants who attended women’s colleges or mentioned the word “women’s” in their application. This is because the training data came from the CVs of a predominantly male workforce.
One term that’s been coined for these biases is “pale male data” (PMD), and it’s certainly something that programmers will need to address if AI is to be equitable.
ChatGPT says it has been “trained to decline inappropriate requests” but the reality is that AI is potentially a mirror to the best and worst of human nature and we must be alert to potential bias.
AI limitations within the context of SEO and content creation
I thought I’d ask ChatGPT itself what it considers its limitations to be in terms of SEO and content creation. Here’s what it told me:
From an SEO and content creation perspective, these are serious limitations that we need to keep front of mind.
AI tools can only respond to the query the user inputs; they may not understand the wider context or audience, which can affect the quality of the reply. You spend all day interacting with your customers, but ChatGPT doesn’t.
As we’ve mentioned in a different context above, another limitation is that AI tools have to learn from the human-generated content that’s already available online. This content might be inaccurate, biased, or out of date. The chances are that this is how 18 errors ended up in that Men’s Journal article I referenced above! I imagine that it won’t be long before people are required by search engines to identify AI-generated content to help address this.
Also, I’ve noticed that ChatGPT will sometimes return answers to a query with a statement of fact but won’t cite the source, or it will mention the source but not link to it, leaving you with a hunt to find where the information originated. I’d be wary of presenting any information as fact unless you have evidence from a reputable source to back it up.
The potential for plagiarism or, certainly, a lack of originality could be problematic too, and I know this is something with which schools and universities are already grappling.
While researching this topic, I came across an article from Local Marketing Makeovers where they had asked ChatGPT to generate the Home page copy for five different restaurant websites; significant sections of the text were duplicated across all five responses, creating the sense of one voice rather than five unique brand identities.
In today’s marketplace, businesses need to let their personalities shine, rather than being a carbon-copy of what’s already out there.
Knowing this, I think we can be excited about ChatGPT but also realistic about its limitations.
How can AI tools help small business owners?
Before writing this blog, I spent time on ChatGPT, exploring different scenarios to see where it might deliver the best results.
For me, the number one benefit is the tool’s ability to pull together ideas and inspiration for content. In one scenario, I asked it to produce ideas for a free challenge I could run to help small business owners feel more confident about blogging. It instantly presented me with 10 challenge ideas – all very workable – that I could run on social media or for my mailing list.
ChatGPT didn’t write the content. It didn’t break down how to run the challenge (although it could if I asked it), but it did give me a starting point.
I’ve also asked ChatGPT to give me blog headline ideas around a topic, points I could include in a blog, challenge names, insights into assorted topics, the name for an Etsy shop, recommended keywords, and other scenarios. I find it to be an exciting antidote to writer’s block!
If you sometimes find a blank page daunting or you struggle to come up with content ideas, then there’s no doubt that ChatGPT can be an effective timesaver.
I can see the potential from an SEO perspective too. AI tools could help business owners to identify relevant keywords to target for their business, and provide an at-a-glance list of high-volume, low-competition keywords.
There’s scope to get tips from ChatGPT about optimising content for specific keywords and how to structure a blog for optimal SEO. You can even give ChatGPT your business web address and ask for meta descriptions for every page!
In addition, the tool can provide guidance on technical aspects of SEO, such as improving page load speed, fixing broken links or implementing Schema markup.
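To give a flavour of what Schema markup involves, here’s a minimal sketch of a LocalBusiness snippet in JSON-LD, built in Python. The business details are hypothetical placeholders – check Google’s structured data guidelines for the properties relevant to your own site:

```python
import json

# A minimal sketch of LocalBusiness Schema markup (JSON-LD).
# All business details below are hypothetical placeholders.
schema = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "url": "https://www.example.com",
    "telephone": "+44 1234 567890",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 High Street",
        "addressLocality": "Exeter",
        "addressCountry": "GB",
    },
}

# The resulting JSON goes inside a <script type="application/ld+json">
# tag in the page's HTML so search engines can read it.
print(json.dumps(schema, indent=2))
```

Markup like this helps search engines understand who you are and can make your listings eligible for rich results.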
Will Google penalise AI-generated content?
This is an interesting conversation with lots of nuances, and something SEO experts are being asked about a lot at the moment. I think it’s fair to say that we’re all still learning.
Google is known to be heavily invested in natural language processing (NLP) and has released several NLP-related products such as Google Assistant and Google Translate. In fact, in a recent article Google’s CEO Sundar Pichai said that the company had been reorientated around AI development six years ago, so it’s a priority.
It was in the same article that Pichai introduced Google’s “experimental conversational AI service” called Bard. This is currently being trialled by “trusted testers” but is due for release to the public “in the coming weeks”.
Pichai says that Google is working to bring AI advancements into all of its products, starting with search. He says that soon we will see “AI-powered features in Search that distil complex information and multiple perspectives into easy-to-digest formats, so you can quickly understand the big picture and learn more from the web”.
Right now, Bard looks similar in its capabilities to ChatGPT. However, Bard will be learning from Google’s internal LaMDA (Language Model for Dialogue Applications) and search index, while ChatGPT uses an older language model and the same open-source data as other AI tools.
As for the question of whether Google is likely to penalise AI-generated content in search results, it will be interesting to see how Bard influences this.
In April 2022, Google’s John Mueller warned against using AI content, likening it to automatically generated content and suggesting that it might be penalised. However, fairly soon afterwards, Google’s guidelines changed slightly to reflect more tolerance towards AI content.
Under its Spam Policies, Google now makes it clear that auto-generated content could be penalised if it is unoriginal or fails to add sufficient value to the user experience. As you can see from the screenshot below, the guidelines emphasise a continuing need for human review or curation and regard for quality.
In early February 2023, Search Engine Land reported that “Google sets the record straight on AI-generated content in search results”, highlighting that Google would prioritise “high-quality content, regardless of whether humans or machines generate it”.
Google reiterates that content should be created for people first, then search engines, with a focus on experience, expertise, authoritativeness and trustworthiness (E-E-A-T). If you’re using AI solely to manipulate Google’s rankings, it could spell trouble.
The search engine says it is continuing to develop its tools and algorithms to consider:
- Who created the content
- How they created it (AI or human-generated)
- Why it was created
I think AI content – when mixed with some human input – will still rank well if it provides genuine value to the intended audience.
AI vs human content creation
Knowing all of the above, should so-called “white-collar” workers, “knowledge workers” and creatives be worried?
There’s no doubt that people are questioning what AI tools such as ChatGPT mean for those of us who spend our lives creating digital content of one kind or another.
Will businesses continue to employ SEO experts, copywriters, content writers or social media managers if they can get an AI tool to write their content for free? Will tools like this drive down prices? Will content creators need to pivot their businesses in some way?
One e-newsletter I received last week suggested exactly this, reasoning that many creatives will need to reposition their roles in order to save their livelihoods. People may challenge the need for a copywriter to toil over a first draft, for example, if ChatGPT can churn one out in seconds. Instead, the sender mused, we need to focus on different skillsets. Yes, AI tools can produce a passable first draft, but what about the planning, the personality, the strategy? These are areas where humans are currently irreplaceable. So, instead of content writers, will we suddenly see an increase in “Content Strategists”?
Time will tell, of course.
Right now, I think it’s important to keep the role of AI tools like ChatGPT in perspective. As we’ve seen, there are limitations.
The most prudent approach, in my view, is to see AI tools as just that, tools that help us to deliver amazing services to our customers.
But we mustn’t lose touch with our unique voices, skills, creativity, ability to think outside the box or our big, open hearts. From what I’ve seen so far, a lot of AI content is extremely beige and lacks the personality that builds trust. Humans bring a richness that can’t be replicated (for now, at least!).
We need to find a way to adapt and use these incredible but also flawed tools in a way that feels right for our audience, Google and our own ethics.
I believe the best content is genuinely helpful, provides real value and gives the writer’s unique perspective. AI can support that, not supersede it. So, instead of seeing AI as an enemy, our challenge is to work out how to make it an ally.
What are your thoughts on using AI for content creation?
Hazel Jarrett, director of SEO at SEO+, is well-known in the SEO space, has won many awards during her 20-year career and has been published on various well-known sites. Through her services and training programs, her SEO strategies have generated tens of millions of sales for her clients, earning her a big reputation for delivering the results that matter.
Want to follow Hazel on social media? You’ll find her via the icons below.