In the near future, your favorite news platform, journal, or blog will use AI. In fact, in some capacity or other, it probably already does.
This applies to almost every industry. Sending emails is faster, writing a report is no longer labor-intensive, and summarizing meeting minutes is far easier.
But while AI tools are exploding in popularity across almost every sector, like any tool, they can be blunted by misuse.
CCN spoke with Jon Gillham, Founder and CEO of Originality.ai, to explore the broader implications of AI technology in journalism.
In May 2023, NewsGuard identified 49 news and information sites that “appear to be almost entirely written by artificial intelligence software.” Seven months later, that number had ballooned to 687.
While some would naively assume that journalistic integrity is at the heart of any news report, the reality is often far more nuanced.
Plagiarism in newsrooms, and indeed across any institution, is one of the gravest professional offenses.
But the fact remains that news platforms must balance efficiency against authenticity, all while trying to stay competitive in an industry that demands rigorous fact-checking alongside a relentless news cycle.
Given the pressure on journalists to meet tight deadlines, it can be tempting to use one or several of the AI tools available. While popular models such as ChatGPT offer efficiency, if you don’t check the content produced, you risk compromising the authenticity of what you are reporting.
Add hallucinations to the mix, and you end up with a poorly written article that includes things that didn’t actually happen.
Originality.ai’s Jon Gillham advises transparency about AI usage to build trust with the audience. Some news platforms are, in fact, doing just that. It has become normal to see the tag at the bottom of a news article: “Written by AI using a human editor.”
“Journalists need to be extremely careful with AI use, although I appreciate that given their ever-increasing pressure on turnaround times, that certainly isn’t easy,” Gillham tells CCN.
“If [journalists] are going to undertake a new story, what truly sets it apart and provides readers with a well-thought-out piece is to offer unique insights, as well as a clear and distinguishable voice,” he adds.
Gillham explains that Originality.ai uses “sophisticated methods” to distinguish between AI-generated and human-written content. Their AI detection tool leverages multiple models and supervised machine learning, trained on millions of content pieces.
“Our engineers train the detector on millions of content pieces (such as web content, blog posts, and books) made by humans and AI so it can learn how to identify and differentiate them.”
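To make the idea concrete, here is a deliberately simplified sketch of what supervised training for text classification looks like. This is an illustrative toy, not Originality.ai's actual detector: the tiny hand-labeled training set, the perceptron model, and the bag-of-words features are all assumptions chosen for brevity, standing in for the multiple models and millions of documents a production system would use.

```python
# Toy supervised text classifier: a perceptron over bag-of-words features.
# Illustrative only -- NOT Originality.ai's detector or training data.
from collections import Counter

# Hypothetical hand-labeled examples: 1 = AI-generated style, 0 = human-written.
TRAIN = [
    ("in conclusion it is important to note that the topic is multifaceted", 1),
    ("furthermore this comprehensive overview delves into key considerations", 1),
    ("i missed the bus again so i jogged the last mile in the rain", 0),
    ("her editor called at midnight with a tip about the council vote", 0),
]

def features(text):
    """Bag-of-words counts serve as the feature vector."""
    return Counter(text.lower().split())

def train(examples, epochs=10):
    """Perceptron learning: nudge word weights toward the correct label."""
    weights = Counter()
    for _ in range(epochs):
        for text, label in examples:
            feats = features(text)
            score = sum(weights[w] * c for w, c in feats.items())
            pred = 1 if score > 0 else 0
            if pred != label:  # misclassified: update weights
                sign = 1 if label == 1 else -1
                for w, c in feats.items():
                    weights[w] += sign * c
    return weights

def classify(weights, text):
    """Return 1 if the text scores as AI-like under the learned weights."""
    score = sum(weights[w] * c for w, c in features(text).items())
    return 1 if score > 0 else 0

weights = train(TRAIN)
print(classify(weights, "it is important to note the key considerations"))  # → 1
```

The same loop scaled up to millions of labeled documents, richer features, and multiple combined models is the essence of the supervised approach Gillham describes.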
In theory, this approach ensures accuracy and minimizes false positives, a common frustration for content creators whose original work is erroneously flagged as AI-generated.
It isn’t that AI writes better than humans; in fact, it often writes very poorly. The result is that we are seeing the same type of content ad infinitum.
As Gillham notes, content creators can lean into their unique voice and style to make their content authentically theirs—even if they choose to get some support from AI.
“Google is now prioritizing content with new opinions and expertise,” says Gillham, suggesting that generic, copy-paste content will eventually fall by the wayside.
The April 2024 core algorithm update is also an indication that Google plans to penalize AI-generated content. What we can infer from these updates is that Google is tightening its policies against using AI to produce unoriginal content at scale.
“Google’s recent algorithm updates have been extremely interesting. They make sense when you think about how Google’s business model operates,” says Gillham. “After all, if the top search engine results for queries are all AI-generated, why would you use Google? What would prevent you from just asking tools like ChatGPT directly?”
Google’s algorithm updates will likely penalize AI-generated content to preserve the search engine’s relevance. Gillham predicts a greater emphasis on the authorship and legitimacy of content producers as Google seeks to prioritize original, high-quality content in its search results.
Whether an AI or human creates content, the key to success in Google’s search rankings is to ensure high-quality and original content that provides value to users. Google’s algorithms are designed to reward content that meets these standards.
Originality.ai’s CEO believes this is closely tied to Google’s AI search initiative:
“I’m anticipating that even more focus will be placed on the authorship of content, the legitimacy and reputation of the person producing it, and how the content improves and builds on what is already publicly available.”
He adds, “After all, something’s got to feed their AI search initiative, and if you keep promoting AI content, why would anyone continue to produce new content?”