When Artificial Intelligence Is Hyped By the Media

There is too much fascination and too little criticism and focus on solutions when the news media cover new technology.

In June, the Washington Post ran a story about a Google engineer, Blake Lemoine, who was fantasising. He believed that LaMDA, the artificially intelligent chatbot he was working on at the big tech company, was sentient and behaved like a seven-year-old. The story went viral across the news media, which speculated about whether he was right. The media gave this one man and his wild claims plenty of publicity. Because what if.

The media are helping to create hype about artificial intelligence: how amazing this new technology is and can be, and who is furthest ahead with it. The most conspiratorial people on social media speculated that it had to be a story planted by Google, which first suspended and then fired Lemoine.

All the experts in machine learning and artificial intelligence say no. The methodology on which the technology rests today cannot lead to sentient or conscious machines. It is pure simulation: it matches patterns drawn from vast databases of human language. So pure science fiction gets plenty of space at the expense of critical coverage, instead of a focus on the solutions and regulation we should already be thinking about and debating when it comes to artificial intelligence.

Artificial intelligence can simulate emotions, leading us humans to believe that we are dealing with something alive.

Simulated Sympathy
Some articles touched on important explanations: that consciousness exists only in biological creatures, and that what is most chilling about the whole affair is that even a data engineer like Lemoine can be led to believe that mechanics can suddenly have feelings. If it can happen to him, it can happen to many people, and that could pave the way for more manipulation by the companies that control the technology. When the technology is then embedded in computers that look like people, it becomes an even more powerful weapon of manipulation. The industry today builds computers that look like humans precisely because we humans will have feelings for them. As Thomas Telving writes in his book Killing Sofia and in this article, research shows that even though we know it is cold mechanics, it brings out emotions in us when it looks like a human. With facial recognition software, for example, a machine can read our emotions and use that to converse with us and simulate sympathy.

Now, LaMDA isn’t embedded into a computer that looks like a human yet. But we need to think ahead, regulate and act so we’re not misled.

Unfortunately, it is probably too late to demand that human-like computers have square heads so that we humans can always tell the difference. Instead, we should ensure that people are never in any doubt as to whether they are talking to a human or a machine, by requiring that machines be clearly declared as machines. This requirement to ‘flag’ when machines interact with humans is likely to be included in a forthcoming AI Act from the EU, but as with all other legislation, it is a matter of how it is interpreted and enforced.

Cold and Dead
The media should always be aware of the major economic interests at stake in any coverage of artificial intelligence. Author and scientist Gary Marcus explains very clearly what it is that Google can do in his column on the LaMDA case, which he calls Nonsense on Stilts.

He writes:

“Blaise Aguera y Arcas, polymath, novelist, and Google VP, has a way with words. When he found himself impressed with Google’s recent AI system LaMDA, he didn’t just say, ‘Cool, it creates really neat sentences that in some ways seem contextually relevant’, he said, rather lyrically, in an interview with The Economist on Thursday, ‘I felt the ground shift under my feet … increasingly felt like I was talking to something intelligent.’”

With that in mind, when it comes to big tech, the media should always refer to all forms of new technology and artificial intelligence as something that is cold, mechanical and stone dead.

This was first published in the Danish daily Politiken and translated with the help of deepl.com.