
Report

A Future with AI: Livestream Report

Written in collaboration with FT Strategies and the Google News Initiative

We recently hosted News in the Digital Age: A Future with AI - a livestream event in partnership with Google. With over 1200 registrations for the event, there is clearly strong interest in and demand for an industry conversation about AI.

“The Financial Times and Google are aligned on a vital mission to take a bold and responsible approach to the opportunities that AI creates. With FT Strategies’ deep understanding of news publishing, and Google’s expertise in AI, we’re working together to enrich knowledge of the AI landscape.”

Sulina Connal, Managing Director, News and Books Partnerships, EMEA, Google

This livestream event was the first of four programmes we are running together with Google in 2023 and early 2024 to help news and media organisations seize the opportunity that AI brings. You can access the on-demand video here. The programme supports Google News Initiative's mission to work with publishers and journalists to fight misinformation, share resources and build a diverse and innovative news ecosystem. You can learn more about our AI Design Sprint here, and AI Foundation programme here.

Two lively expert panels on the day covered the following themes: thinking strategically about AI, and implementing and experimenting with AI.

We define artificial intelligence (AI) as algorithms, automations and computer programs which learn from data to analyse patterns and perform specific tasks. There is a distinction between non-generative AI and generative AI (GenAI): non-generative AI is most often used to analyse existing data, while generative AI creates new data (e.g. text, images, video), unlocking a new category of use cases built on media and content data.

Prior to the event we surveyed over 1000 registered attendees from organisations around the globe to understand the current state of their AI usage, and the main challenges to meeting their AI goals. Alongside data capabilities (15%), and finding the right talent (10%), the primary challenges for media organisations looking to use AI are:

  1. Defining and building an AI strategy (only 4% of respondents have a “clear, embedded vision and strategy” for AI in their organisation).
  2. Implementing AI technology (30% of respondents see “adapting to new technologies” as a main challenge for their organisation).

Our panel of industry experts aimed to cut through the noise of AI hype to provide practical insight into AI strategy development and the realities of implementing and experimenting with AI. We were pleased to be joined by:

Panel 1 - Thinking strategically about AI:

Moderated by Kate Sargent (Chief Data Officer, The Financial Times)

Panel 2 - Implementing and experimenting with AI:

Moderated by Chris Gathercole (Director of Tech Research, The Financial Times)

Companies should develop an AI strategy to support their wider business strategy

Developing an AI strategy is crucial to ensuring your organisation makes the most of AI opportunities and overcomes challenges. We define an AI strategy as a plan for responding to the opportunities and risks posed by AI in a way which supports broader business goals and helps inform longer-term capability building.

  • AI strategy should not be isolated from overall business priorities - Your AI strategy should be aligned with your business strategy. As Joseph Teasdale put it: “having an AI strategy is part of having a good business strategy”. Lucky Gunasekara added that to make a start, you can ask questions like: “What are the opportunities?”, “What do we want to prioritise?”, and “How does this serve both our business and the relationship with readers?”
  • Consider your organisation’s values when defining a strategy - As with business strategy, your organisation’s values will help to shape your AI strategy. Kate Sargent explained that alongside implementing AI, she is working to ensure “the power of this technology continues to chime with our values”. Olle Zachrison echoed this, saying “for us, we always have to go back to our public service mission, [...] we’re trying to put these public service values at the centre of the technology”. Joseph Teasdale suggested starting with questions such as “Who are we?” and “What are we trying to do in the world?” to help define and prioritise why and how you want to use AI in your organisation, and to set guardrails for your AI experimentation.
  • It is important to test, learn and iterate - This was emphasised as a key point by many panellists across both discussions. Once you are aligned on values, guidelines, and KPIs (between business strategy and AI strategy, and between cross-functional teams), it is crucial to start testing and experimenting. This will help you quickly identify the areas that are working, and those that are not. It gives your team the agency to try things, learn, share feedback, and try again, allowing your organisation to gain knowledge at pace.

“(AI) is like surfing, you want to get out in the water, you want to get used to it, you want to get in that practice of running experiments regularly.”

Lucky Gunasekara, CEO, Miso.AI

AI can be applied to many, but not all, areas of news and media businesses

AI is not new. We found that 84% of news and media organisations are already experimenting with AI or using AI models in some area of their business. Established AI use cases include segmentation, personalisation, and different forms of data analysis such as anomaly detection. We asked our livestream viewers “What are the AI use cases which will most likely improve your performance?” - the leading responses were:

  • Content Production (e.g. headline generation) - 35% of respondents
  • Audience Engagement (e.g. comment moderation) - 32% of respondents

“AI is not just going to change the world for you, it’s going to change the world for your customers as well.”

Joseph Teasdale, Head of Tech, Enders Analysis

Generative AI is more accessible than previous forms of AI as there are more tools and applications for non-technical users. Given its greater applicability to content data, GenAI also brings a new set of opportunities to the newsroom.

  • GenAI can be used to support operational efficiency, commercial optimisation, editorial decision-making, and content creation - Our panel identified four major areas as opportunities for generative AI. Paige Bailey gave an example of Google using AI to drive operational efficiency by automating workflows to improve code quality. Lucky Gunasekara described how AI can significantly improve the quality of search systems, given AI models can better understand the context of data and articles, which is a use case with both operational and customer-facing possibilities. Bram De Ruyck discussed how GenAI is being used to analyse Mediahuis’ content against their values and provide insights to the editorial team to ensure content stays true to their aims, effectively supporting editorial decision-making. Other use cases for GenAI include commercial optimisation (e.g. personalised paywalls and pricing), and content creation (e.g. headline generation, summarisation, and translation).
  • AI could unlock significant value from content archives - Bram De Ruyck discussed how one of the most impactful uses of AI in his organisation has been allowing journalists and editors to find relevant archive content to support new content production - saving time for hundreds of journalists in their daily work (a minimal sketch of this kind of archive search follows this list). Joseph Teasdale commented on the possibilities of licensing these data, saying that, for many, the “monetary value of [their archive] data to AI providers is maybe not that great” unless it covers a domain-specific niche that could “give a particular model an edge” that “it doesn’t already have”.
  • AI will augment, enhance, and change roles, but human expertise remains crucial - Joseph Teasdale explained that GenAI can “empower the most effective creatives”, by “giving everyone access to [...] specialised useful skills”. The panel discussed how this is not new - computers have had this effect since they were invented - but new forms of AI open the door to coding, data analysis, language (through translation), creative workflows and more. However, importantly, our panel noted that AI is not ‘on the ground’ and able to discover and investigate new facts about the world, unlike expert journalists. As Joseph Teasdale put it, “(AI) models don’t actually have access to reality, which is an important part if you’re breaking a news story”.
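To make the contextual and archive search ideas above concrete, the sketch below shows retrieval over a small article archive using sentence embeddings, so that a query matches on meaning rather than exact keywords. It is a minimal illustration under assumptions of our own: the archive snippets, the all-MiniLM-L6-v2 model choice, and the search_archive helper are hypothetical and do not describe any panellist's actual system.

```python
# Minimal sketch: semantic search over an article archive using sentence embeddings.
# Assumes the `sentence-transformers` package is installed; a real newsroom archive
# would sit behind a vector database rather than an in-memory list.
from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice

archive = [
    "Regulators debate new transparency rules for AI-generated news content.",
    "Publisher revenues rise as digital subscriptions overtake print.",
    "Newsroom adopts automated translation to reach international readers.",
]

# Pre-compute one embedding per archive article (done once, then stored).
archive_vectors = model.encode(archive, normalize_embeddings=True)

def search_archive(query: str, top_k: int = 2) -> list[tuple[float, str]]:
    """Return the top_k archive items most similar in meaning to the query."""
    query_vector = model.encode([query], normalize_embeddings=True)[0]
    # With normalised vectors, cosine similarity is just a dot product.
    scores = archive_vectors @ query_vector
    ranked = np.argsort(scores)[::-1][:top_k]
    return [(float(scores[i]), archive[i]) for i in ranked]

if __name__ == "__main__":
    for score, text in search_archive("rules about disclosing AI use in journalism"):
        print(f"{score:.2f}  {text}")
```

In practice the embeddings would be stored in a vector index so that journalists can query the full archive interactively, which is the time-saving workflow described above.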

“AI models are only tools to help try to make experts more efficient”

Paige Bailey, Lead Product Manager (Generative Models), Google DeepMind

Data and cross-functional collaboration are necessary for implementing AI

Our panel discussed how they have managed previous transformations in the news and media industry, and how they are setting up their organisations and teams to adopt AI and start experimenting now. When we asked our livestream viewers “What is most important for media organisations adopting AI?”, their responses were:

  • Strong data and analytics capabilities - 38%
  • Sound strategy - 35%
  • A clear ethics framework - 19%
  • Good governance principles - 8%

  • Stakeholder collaboration is required for AI implementation - Louise Story argued that the adaptation to AI is not too dissimilar from previous technological changes in the news and media industry, emphasising the need to break down silos: “you need to have cross-company collaboration and common KPIs”. Lucky Gunasekara added that you need to involve stakeholders from across the business, not just the tech team but also editorial, commercial, legal, ethics, and marketing.
  • Organised and accessible data is key to unlocking AI’s potential - Louise Story emphasised the importance of good and accessible data: “the first thing you need to do is get your data in order. [...] If companies have data that is not accessible, they cannot play in this game”. Data is not only required for training AI models - the process by which models learn the underlying patterns (or, in the case of GenAI, the tone, style or specific content) - but also for making predictions and synthesising new content. In order to make useful models for your organisation, it’s therefore important that your data are available and accessible. As Louise Story advised: “you need to work with your tech teams to be building platform capabilities to collect, analyse, and take action on your data in real time.”
  • AI training and understanding should not be limited to the tech teams - Our panel agreed that to start implementing AI in your organisation, all employees should invest time in understanding it, not just the tech teams. Louise Story said it is “really important that leaders of all areas, not just tech” invest in understanding AI. On training current staff, Paige Bailey put it memorably: “demos are better than memos”. Being able to “show people something visceral, that they can see, touch, and try” will bring stakeholders on board and make understanding AI tools easier. Where your organisation does not have the required skill set, the panel suggested recruiting data specialists to maximise the potential opportunities of AI.

Specific actions can and should bring focus to the ethical considerations of AI implementation

Creating guidelines around ethics and transparency is key to setting the foundations for your AI strategy. This foundation will help to set parameters for your AI experiments, and for how your organisation utilises AI in the long term. An ethics framework will help to guide ethical decisions, and an AI usage policy will define specific (dis)allowed use cases and tools. Together, they should provide your employees with agency when experimenting with AI.

Depending on where AI is being used, the level of risk and scrutiny may vary. For example, use cases in business operations are typically less scrutinised than use cases which influence reader experiences.

  • Building ethics guidelines will help guide and guardrail your AI use, but these documents should be allowed to change - Our panel discussed the usefulness of building an AI ethics framework and AI usage policy to help guide and guardrail your organisation’s adoption of AI. Bram De Ruyck described how Mediahuis transparently developed an AI framework to set out how it will use AI technology. He noted that this is a “living document” that can change, “and maybe faster than some people expect.” Lucky Gunasekara suggested questions for organisations to consider: “What are the red-lines we’re not going to cross?” and “Are we going to own this (AI) model, or use someone else’s (AI) model?”.

“We’ve always been [...] very open to answer questions from the audience, competitors, stakeholders, or the political sphere”

Olle Zachrison, Head of AI, Swedish Radio

  • Collaboration and transparency are key to implementing AI - Collaboration and transparency, both between organisations in the industry and with consumers, have an important role to play when adopting and using AI. Collaboration will allow organisations to develop practice standards and align on where and how AI should and should not be used. Olle Zachrison said “The obligation to be transparent [...] and explain this better to the audience is increasing”. He added that Swedish media outlets are working to agree a shared transparency standard, rather than waiting for regulation to be set.
  • Generative AI requires guardrails to ensure accuracy - Our panel discussed an example of GenAI going wrong in the newsroom by misinterpreting the meaning of a headline and producing misinformation as a result. This simple error shows why processes need to be in place when using GenAI. A possible solution is a human-in-the-loop process: ensuring that a human reviews and accepts any content generated by an AI before it is published. Chris Gathercole noted that allowing GenAI to publish content directly to your audience is a fragile system, and Paige Bailey emphasised the importance of human-in-the-loop processes to ensure any errors are caught before publishing (a minimal sketch of such a review gate follows this list).
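As a rough illustration of that human-in-the-loop point, the sketch below shows a publishing gate in which AI-drafted copy can only go live once a named editor has explicitly approved it. The Draft structure, the stubbed generate_draft call, and the function names are hypothetical stand-ins for whatever GenAI tooling and content system an organisation actually uses.

```python
# Minimal sketch of a human-in-the-loop publishing gate: AI-generated drafts are
# held for review and can only be published after a named editor signs them off.
from dataclasses import dataclass

@dataclass
class Draft:
    headline: str
    body: str
    approved_by: str | None = None  # set only when a human signs off

def generate_draft(topic: str) -> Draft:
    """Hypothetical stand-in for a GenAI call that drafts copy on a topic."""
    return Draft(headline=f"AI draft about {topic}", body="...generated body text...")

def approve(draft: Draft, editor: str) -> Draft:
    """Record the human reviewer who checked the draft for accuracy."""
    draft.approved_by = editor
    return draft

def publish(draft: Draft) -> None:
    """Refuse to publish anything a human has not reviewed."""
    if draft.approved_by is None:
        raise PermissionError("Draft has not been reviewed by an editor.")
    print(f"Published: {draft.headline} (approved by {draft.approved_by})")

if __name__ == "__main__":
    draft = generate_draft("media industry AI adoption")
    # Calling publish(draft) here would raise PermissionError: no human has reviewed it.
    publish(approve(draft, editor="duty editor"))
```

The design choice is simply that publication is impossible without an explicit human approval step, which is the guardrail Chris Gathercole and Paige Bailey described.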

Thank you for reading

Access the on-demand video here: News in the Digital Age: A Future with AI

We hope you enjoyed a recap of the FT Strategies & Google livestream event, News in the Digital Age: A Future with AI. If you would like to chat about any of these topics with our FT Strategies expert consultants, please contact us at [email protected].

About the author

Sam Gould, Senior Consultant

Sam has 6 years of experience helping clients to solve strategic business challenges using data. He has helped organisations in both the public and private sectors to define strategic roadmaps and processes for using AI. He has also designed and built innovative data solutions, working with senior stakeholders as part of critical delivery-focused teams.

Yosef Abebe, Junior Associate Consultant

Yosef has a degree in marketing and communications and 4 years of experience in marketing, content production, and project management. He is currently working to understand AI's effects on the media industry.
