

Should people care if AI writes the news?

Growth opportunities for publishers who embrace humanity in their integration of AI

AI’s ability to generate content creates an existential question for journalists - are we needed anymore? The answer is, of course, yes. But with AI playing a growing role in writing the “first draft of history”, two equally sizable questions arise - do people care if AI writes the news, and should they?

To those questions, many people’s answers are far less straightforward. The crux of this dilemma could create an opportunity for the organisations that get AI right, not only in its technological integration but in its adoption by staff. Building trust around the use of this technological marvel could generate opportunities for growth as readers look for guarantees of integrity and, most importantly, humanity in its application.


The newsrooms that have adopted AI are experimenting in different ways. Some have focused on boosting engagement - such as Forbes’ AI chatbot, which allows users to search its archives conversationally. Others have focused on the newsroom itself - such as Newsquest, which has built a tool that takes journalists’ research and produces stories for review. This “human in the loop” system has been integrated with the aim of “alleviating the burden of the mundane but very important tasks, freeing them up to do that human touch journalism that really resonates”.1

AI is especially useful for streamlining repetitive tasks: transcribing interviews, generating summaries, creating new formats with text-to-video or text-to-audio tools, finding the right picture or video for a story, improving headline SEO, and much more. Freeing journalists from these types of tasks can help them focus on the intrinsic value humans add - connecting the dots and creating clear narratives.

The potential applications are exciting and far-reaching, giving publishers an opportunity to catalyse growth. Yet, as with many advancements, caution must still be exercised.

In July 2023 the National Union of Journalists expressed “grave concern” over an AI-generated article titled “Opinion: Should refugees in Ireland go home?” published by the Limerick Leader.

Although the content of the article itself was “benign”, its clickbait headline, which would have been unlikely to pass human editing, “largely ignores the human dimension, the pain and suffering of those forced to flee persecution or human rights abuses”.

This example shows the important role journalists play in the creation of news, bringing an understanding of the “human dimension” and lived experience that a computer programme lacks. Because studies have shown that media coverage can profoundly shape public attitudes,2 effective guardrails such as “human in the loop” systems and frameworks to ensure accuracy must be prioritised over quick integration.

But do readers care? AI is playing an increasingly important role in our daily lives, with young people in particular embracing its use cases in creative ways. A recent BBC article uncovered how younger generations have turned to “therapy bots”. These chatbots, accessible through Character.AI, have become incredibly popular, with the most used of them receiving more than 78 million messages in just over a year. This level of popularity shows that people will embrace AI tools when they are convenient, despite their limitations and concerns about their competency.


Despite this, recent studies have found that 70% of global senior leaders in media companies think that AI will weaken trust in the news overall,3 40% of people have concerns over the accuracy and quality of AI-generated content, and 72% of people would prefer to read content written by a human.4 These responses show that people do care about AI’s role in the newsroom, even if they are willing to embrace the convenience of the technology in other spheres. The news industry has faced many systemic challenges over the past two decades: in the US alone, 33% of newspapers have closed and 66% of newspaper journalists have lost their jobs since 2005. It is vitally important that publishers avoid the sort of mistakes that alienate readers and instead use AI in a way that promotes their values and doesn’t undermine the quality of their work.

Audiences need to be aware of how media organisations are using AI, whether that is to produce content, enhance it or support a human in its creation. Readers should have full knowledge of its role in writing a story so they can critically assess the information accordingly, just as they would if it were written by a specific journalist.

Many news organisations have already begun to do this, highlighting the importance of reinforcing existing relationships with readers. Heidi News, a Swiss science and health brand, published an “ethical charter” on AI explaining “the framework and ethics that govern our activity, and above all the relationship of trust with our readers”. The Guardian has followed suit, publishing a “clear and concise explanation of how we plan to employ generative AI” to ensure it upholds the “highest journalistic standards”.

The New York Times has embraced the human appeal of its publication, redesigning its byline pages to highlight the writers behind the stories, as it is these journalists who it believes “set them apart”. This human factor is vital to engaging readers. Individual voices offer perspectives that can differ from the majority, creating a unique narrative that cannot be replicated. AI, on the other hand, offers an amalgamation of information, drawing the mean from a variety of sources and voicing only the majority view on an issue. The difference can be likened to a store-bought cake mix versus a homemade recipe: both have their uses in different contexts, but the latter, done well, is more likely to wow your dinner guests. When it comes to engaging readers to the point of payment, the human touch could be the difference maker in a crowded market.


People should - and do - care about the provenance of their news, whether it comes from a human or a machine. AI cannot replace the “lived experience” of journalists that the New York Times has so effectively begun to showcase in its new bylines. This, however, is a good thing for AI adoption in newsrooms. Readers caring about the integrity of their news will help to ensure that AI is used safely, holding news organisations to account so that the news remains accurate and honest. There is a clear growth opportunity for the organisations that invest their time and resources in ensuring transparency and the ethical use of AI, train their newsrooms to get the most out of the technology, and continue to emphasise the human element of the news. ChatGPT can source a Michelin-star recipe, but the true art - the cooking and tasting of the meal - still requires a “human in the loop”.

If you want to learn more about how we can help integrate AI into your organisation, get in touch here.

1 Doherty-Cove, J. (2023), Artificial Intelligence in Journalism, London: NCTJ

2 Kellstedt, P.M. (2003), The Mass Media and the Dynamics of American Racial Attitudes, New York: Cambridge University Press

3 Newman, N. (2024), Journalism, Media, and Technology Trends and Predictions 2024, Reuters Institute

4 Dansie, H. (2023), Why readers are telling us they want humans, not AI, reporting the news, The Media Leader

About the author

Joseph Miller joined FT Strategies in September 2023 and is passionate about helping clients undergo sustainable transformation to safeguard the future of the news industry. Before joining FT Strategies, Joe worked at the Financial Times’ global events business, FT Live, overseeing content and events on topics ranging from digital transformation to ESG to accelerating fan engagement in football. Joe holds an MSc in International Relations, specialising in social media’s effect on the news cycle.
