Young People’s Ability to Assess Reliability of Information is Decreasing
Search engines filled with AI-generated content only make it more difficult for young people to assess the reliability of information. For the sake of the children and society, technology comprehension should become a mandatory subject in school.
"Most humans are good. But when you give good people bad information, they make bad decisions."
The quote is from an interview with historian Yuval Noah Harari based on his latest book, Nexus, in which he discusses the role of artificial intelligence in our information networks.
In the interview, he argues that the truth will lose in the battle of information in a free market: "Because the truth is costly, it takes a lot of time and effort and money to research, to create a truthful account, whereas fiction and fantasy is very cheap. You don’t need to research anything, you just say the first thing that comes to your mind. The truth tends to be complicated because reality is complicated."
For several years, educational researchers have emphasized the importance of children learning to think critically about the information they encounter – especially online. But with more and more people using generative AI tools like ChatGPT, and with generative AI built into certain apps and programs – such as Snapchat’s My AI or Microsoft’s Copilot – information-critical skills have only become more important.
Information Literacy is Becoming Increasingly Important
After 9th grade, the guiding objectives in the Danish curriculum for the subject Danish (L1) include that students can:
- assess user-generated and expert-generated content
- plan and carry out phases of information search
- conduct targeted and critical information search
And have knowledge of:
- sender information and genres on the internet
- phases of information search
- source-critical search
(My translations.)
In a world full of misinformation and disinformation, these are extremely important goals.
Harari explains misinformation as something that emerges when a person tries to represent reality but misunderstands something – an honest mistake rather than an intentional one. Conversely, he describes disinformation as an intentional lie: it emerges when a person deliberately tries to twist our perception of reality.
Both types appear to be accelerating rapidly with the use of generative artificial intelligence.
We know that we cannot trust the information we get from generative artificial intelligence, which increases the risk both of being exposed to misinformation and of reproducing and reinforcing it. At the same time, the free information market is a paradise for disinformers who want to twist our perception of reality for one reason or another – perhaps most often for economic gain.
Search Results are Filled with Generative Artificial Intelligence
One example – the frustration that led me to write this post – is that you cannot avoid search results based on generative artificial intelligence, even if, like me, you do not use generative artificial intelligence to search for information.
Webpages like fairytell.dk, byggeblik.dk, golfin.dk, and kvindely.dk – all of which appear to consist entirely of auto-generated, artificial content – pop up when I use a regular search engine to examine phenomena or search for answers to questions. That is, in the perhaps soon-to-be-outdated way where I combine information from different sources myself – the kind of information work that produces something reliable but takes time.
A recent example: As an experiment for this post, I searched on DuckDuckGo for: “are bananas healthy” (in Danish: “er bananer sunde”).
Search result number 9 was from skagenonline.dk, which appears quite often in my search results but seems to draw its information from other pages that may – but do not necessarily – use generative artificial intelligence. The site itself appears to be built on an outdated page about events in the city of Skagen in 2023.
Search result number 10 was from the site golfin.dk, with the title "Are bananas healthy? An in-depth guide to the health benefits and myths of bananas", written by "The Owner". The page does, however, carry a notice about "Possible errors and advertisements on the page": "Please note that the content may contain errors. Double-check important info. Ads, promotions, and paid content may appear" (my translations).
The purpose of these pages is therefore likely to sell ads through high traffic, not to serve as a source of information.
Young People’s Information Literacy is Decreasing
As a researcher in children’s and young people’s technology comprehension, I have participated in a study examining eighth-grade students’ computer and information literacy skills (ICILS, the International Computer and Information Literacy Study). The Danish results show that in 2023, only one percent of students were able to select relevant information and assess its reliability based on both the content and the communication context.
When we conducted the study five years earlier, in 2018, just over three percent were able to do the same. This means that this competence is decreasing, while the amount of misinformation is increasing.
Since the study involved eighth-grade students, this is the age group I can comment on with the greatest confidence, but it is not hard to imagine similar results for other age groups.
Technology Comprehension Should be Compulsory for Everyone
The results from ICILS give cause for concern in a society where it is incredibly important for children and young people to develop information-critical skills – where they need to get better at determining what is true and what is false, and where they need to learn that generating reliable information takes time and effort. That can be incredibly difficult, not just for children but for all of us, especially with the emergence of generative artificial intelligence.
It is not uncommon to hear about misinformation and disinformation in the news, and digital technologies are often involved in producing or communicating false information. In my view, it speaks for itself that, as a responsible, democratic society, we should teach children and young people today to understand such digital technologies.
In Denmark, there is a well-defined subject area – technology comprehension – which, in its current form, aims among other things "to strengthen students’ ability to understand, create, and act meaningfully in a society where digital technologies and digital artifacts are increasingly serving as catalysts for change" (my translation). Yet technology comprehension remains optional for all students in Danish schools, even though experts and forward-thinking pioneers have emphasized its importance since the 1960s.
For the sake of the children and society, it should become mandatory.
(The pages listed are random examples found through searches. More than those listed exist.)
This post was originally published on Version 2 and has been translated into English with help from www.deepl.com
Photo: Gabriel Mihalcea