Balancing innovation and ethics in the AI age

Maeve Harris

As AI continues to dominate across the board, creating a framework to uphold journalism ethics has never been more important.

AI in the newsroom is nothing new. Organisations, including the Associated Press, have been using it for the past decade to carry out tasks such as partially automating stock market or sports reports.

What’s new is that, with the advent of generative AI, this technology has gone mainstream and is now directly impacting audiences. 

Where generative AI and journalism meet

“We have centred our coverage around this whole burgeoning of generative AI – how does it impact ordinary people?” said Thomson Reuters Foundation editor-in-chief Yasir Khan in reference to the foundation’s recent investigation into AI being used for surveillance in the US prison system. 

This focus on accountability and the socio-economic implications of AI underscores journalism’s critical role in addressing the broader impacts of technological advancements – beyond the traditional realm of reporting on funding rounds and new consumer products. 

“While keeping an eye on the development of AI, we are also tapping into where it’s being used in ways that require accountability,” said Yasir, explaining the company’s focus on digital rights, data privacy, surveillance, and inequalities that AI technology may create or exacerbate.

It’s clear that AI is no longer the preserve of the technology reporter. It’s cropping up regularly in news stories across politics, finance, entertainment, education and more. 

But do people fully understand how AI is impacting society? It’s pervasive, surfacing in some surprising news stories in 2023. In reference to the Hollywood strikes, Associated Press (AP) executive editor Julie Pace explained, “a lot [of coverage has been] about what the risks of AI could be … I think we all have to become internal experts in order to have it infused throughout the coverage”.


Julie underscored the importance of fostering AI literacy within news organisations. AP’s proactive approach, for example, involves developing internal frameworks and standards to identify AI-generated content, particularly in relation to disinformation. 

“We can’t inform our audiences about what’s real and what’s not – about what to look for – unless we know how to do that internally as well,” Julie pointed out.

This reflects a broader trend in journalism – a balancing act between embracing technological advancements and upholding ethical standards. As AI becomes more integrated into newsrooms, the industry faces the dual challenge of utilising AI’s potential while also mitigating its risks.

Yasir emphasised treating media literacy as a public health issue, underscoring the importance of equipping future generations to be shrewd news consumers. According to Yasir, however, generative AI should not be met with panic when it comes to trusting the veracity of images or pieces of writing online. 

AI-generated fakes might be slightly more convincing, but fake news has always existed. “Malcolm X said – a long time before I was born – that you don’t believe everything you read in the newspapers, right? That was always a good rule; be a savvy consumer of the media,” said Yasir.

To find out more about the 2023 trends across media, marketing, and the creator economy, take a look at our marketing and media insight report.

Image: Web Summit
