The European Federation of Journalists (EFJ), together with the Austrian journalists’ union GPA, organised a meeting of its Expert Group members on 18-19 September 2023. Around 40 journalists and trade unionists from 25 European countries gathered in Vienna to discuss the challenges and opportunities posed by Artificial Intelligence (AI) in journalism, and its implications for ethics, jobs and authors’ rights.

“AI in journalism doesn’t mean AI does journalism.” This is the approach taken by the Austrian news agency APA, which some of the EFJ Expert Group members visited. APA has been using automated processes since 2019, first for elections and then for Covid-19 coverage, and is one of the first media outlets in Europe to have developed guidelines for the use of AI in journalism.

Katharina Schell, APA deputy editor-in-chief in charge of innovation, stressed the importance of aligning the use of AI with the news agency’s journalistic values, i.e. providing independent, fact-based, trustworthy, transparent and balanced information. At no point does the AI intervene in the editorial process, she explained: “At APA, all original texts are written by journalists. Machines do not write stories, but are used as tools to change texts into different formats and help journalists extract large amounts of data that would otherwise take a long time to process and be a potential source of error”. ChatGPT, for instance, is not a threat in Katharina Schell’s view, as “it doesn’t meet the APA quality standards”. For the time being.

The need for regulation

Austrian public broadcaster ORF has been using AI to summarise and translate texts and to transcribe interviews, and is currently testing avatars to present news programmes. However, “no decision should be taken by AI,” insisted ORF digital project manager Florian Matscheko during a panel discussion. “Humans decide, AI does. We should think of AI as a tool, not a workforce”. With AI already widely used in the journalism and film industries, Matscheko called for regulation that would require, among other things, the use of AI to be declared and AI-generated journalistic output to be labelled for transparency purposes. The need for regulation was echoed by Deniz Wagner, adviser at the Office of the OSCE Representative on Freedom of the Media, who suggested looking at what AI means for democracy and the media. AI must remain under democratic control, she said, and this should be achieved through transparency obligations.

The ethical challenges posed by AI are huge in a world where public trust in journalism is in decline. “If we look at things in a positive light, we could think that artificial intelligence could lead to the resurgence of journalism and media as gatekeepers against fakes, but that requires being able to identify them,” said Allan Boye Thulstrup from the Danish Journalists’ Union (DJ). Fewer repetitive tasks, more time to work in the field and investigate, more public appreciation of journalistic work: that sounds like the dream of many journalists.

Yet the reality seems to be quite different. Several news media outlets have already announced job cuts, partly because of the development of automated programmes. Should we be preparing for a future with fewer journalists in newsrooms? The concerns for jobs and working conditions are real. The fear of being downgraded, of losing one’s job, of having to acquire complex new skills, of losing income for freelancers, must be addressed in collective agreements, said journalists and trade unionists, in order to protect workers and ensure that they are involved in discussions with management and developers.

Fair remuneration of authors

Further questions were raised about copyright. Who owns the output of ChatGPT? How do you pay a journalist whose text has been used by an AI tool? What if, as a journalist, I don’t want my work to be used by AI? These are the billion-dollar questions, literally. “AI is using original content and making money out of it,” said Mogens Blicher Bjerregard, authors’ rights expert and member of the Danish journalists’ union. “My answer for the time being is that journalists should grant exclusive rights as much as possible in order to prohibit free use of original content”.

In June 2023, it was reported that AI and some media companies are discussing copyright issues, and in particular a pricing model for news content used as training data for AI models. While talks are in their early stages, several options are on the table, such as creating a “quantitative model” similar to the one developed by the music industry – which would require AI companies to start tracking and disclosing their usage of media content – or an annual agreement for unlimited use of media companies’ content. 

For an ethical AI

At the European level, discussions on the AI Act, the first-ever regulation on Artificial Intelligence, are still ongoing. Stakeholders, such as the International Federation of Reproduction Rights Organisations (IFRRO) and the EFJ, are calling for boundaries to be set to preserve the integrity of copyright and licensing systems. They ask for appropriate remuneration of rightsholders and for transparency obligations with regard to the use of copyrighted material in the training of AI models.

Journalists’ and media freedom organisations are also joining forces for an ethical AI. Last month, the EFJ joined the international committee to develop a charter aimed at regulating the use of AI in media. The committee intends to come up with “a strong international reference” to preserve the quality of information and public trust in journalism. It will develop a set of principles, rights, and obligations for information professionals – free from any economic rationale. This initiative, launched by Reporters without Borders (RSF), will deliver its first results before the end of the year 2023.

This article was taken from the European Federation of Journalists (EFJ) website.
