Journalists, bloggers, authors and others who create original content should think twice before leaving journalism to robots
The video on YouTube seems very professional at first. A male voice narrates in English over footage from the streets of Helsingør city centre and from Danish classrooms with schoolchildren in front of computers. It's about Google being kicked out of Danish primary schools because of problems with the tech company's data management. The whole Google Chromebook case is outlined in the long video, and slowly you realise that the genuine footage has been spiced up with a lot of so-called stock photos: staged archive images that are not from Denmark.
Recently, Jesper Graugaard, the father from Helsingør who started the case, sent the video to me because we both appear in it. The YouTube channel Sporting Nation has found photos of us on the web, which it uses when quoting us. The text comes from an article in the technology magazine Wired, which we both approved. But Sporting Nation copied that text and mixed it with other material, so not all of it is correct. It is simply fake news generated by artificial intelligence. And Sporting Nation? I can't find anything about them, other than that they are from the USA, which is the only thing they say on their YouTube channel.
The video, mediocre and flawed in every way, is made for one purpose: to make money on advertising. The video begins with adverts for Momondo and the Nordic Swan Ecolabel, and it is interrupted several times by adverts from Wolt and McDonald's, among others. I wonder whether these companies have any idea. At the same time, the film has probably been made in close cooperation, and with revenue sharing, with Google-owned YouTube, which usually makes a big deal of fighting misinformation.
It's not new that big tech is monetising fake news. Nor is it news that the internet will be flooded with it. In a previous column, I quoted a Europol report estimating that 90 per cent of all content on the internet will be generated by artificial intelligence by 2026, and that much of it will be lies. What's new is that more and more editorial media outlets and content creators are going berserk with excitement over generative AI such as ChatGPT, and many are experimenting with automating journalism. And this is where I want to raise a warning finger.
Marketers, influencers, school students and many others are now using robots to generate content. In the vast majority of cases, what comes out of tools such as ChatGPT is mediocre, because it is created from all the content that is already on the web.
There is no doubt that journalists and other original content creators should use artificially intelligent tools to assist, inspire and guide them. But if they want to differentiate themselves from the masses and get customers to pay for content and adverts, they need to be far better than automated content. They need to deliver hand-crafted, creative and original content, and they're not going to do that with generative AI, at least not right now. There are people who believe that machines will become much better than humans, even when it comes to creativity.
Therefore, it is also a good idea for editorial media to establish a set of guidelines for the use of generative AI. They could be inspired by Wired's guidelines:
- We do not publish (or edit) stories with text generated by AI except when the fact that it’s AI-generated is the whole point of the story. (In such cases we’ll disclose the use and flag any errors.)
- We may try using AI to suggest headlines or text for short social media posts.
- We may experiment with using AI as a research or analytical tool.
- We do not publish AI-generated images or video. We specifically do not use AI-generated images instead of stock photography.
That way, Wired ensures that it continues to rise above mediocrity.
Translated with the help of www.DeepL.com/Translator (free version)
Photo: Philipp Schmitt & AT&T Laboratories Cambridge / Better Images of AI / Data flock (faces) / CC-BY 4.0
The main part of this article was first published in Politiken