Researchers model next-generation AI on child language development

Researchers at the Vrije Universiteit Brussel (VUB) and Université de Namur have developed artificial intelligence that acquires language in the same way as children do: through interaction, play and actively deciphering meaning.
Their approach is radically different to that of the current generation of large language models such as ChatGPT, which rely purely on text statistics.
"Children learn their mother tongue by communicating with the people around them,” Katrien Beuls of UNamur said in a press release.
“As they play and experiment with language, they try to interpret the intentions of their conversation partners. In this way, they gradually learn to understand and use language constructions. This process, in which language is acquired through interaction and meaningful context, is at the heart of human language acquisition.”
Powerful but biased
Models such as ChatGPT learn by analysing huge amounts of text to determine which words tend to occur close together, allowing them to generate texts that are often indistinguishable from those written by humans.
“This leads to models that are extremely powerful in many forms of text generation, from summarising or translating texts to answering questions, but which at the same time exhibit a number of inherent limitations,” says Paul Van Eecke of VUB.
They are prone to bias and hallucinations, often struggle with reasoning and require large amounts of data and energy to build and use.
Direct interaction
The researchers propose an alternative model in which artificial agents learn by participating in meaningful interactions to develop language constructs that are directly linked to their environment and sensory perceptions.
"This process, in which language is acquired through interaction and meaningful context, is at the heart of human language acquisition"
This leads to models that are less prone to hallucinations and bias, because their understanding of language is grounded in direct interaction with the world, and that use data and energy more efficiently. They also understand language and context in a more human-like way.
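The article does not spell out how such situated learning works in practice, but research in this tradition is often illustrated with “naming game” experiments, in which agents jointly perceive a scene, invent words for the objects in it, and gradually align on shared conventions through success and failure. The Python sketch below is purely illustrative and assumes such a setup; the names used (Agent, play_round and so on) are hypothetical and not taken from the researchers’ system.

```python
import random

class Agent:
    """A minimal agent that maps object IDs to invented word forms."""

    def __init__(self):
        # vocabulary: object -> list of competing candidate words
        self.vocabulary = {}

    def name_for(self, obj):
        """Speaker role: pick a known word for the object, inventing one if needed."""
        if obj not in self.vocabulary or not self.vocabulary[obj]:
            invented = "w" + "".join(random.choices("aeioubdgkl", k=4))
            self.vocabulary.setdefault(obj, []).append(invented)
        return random.choice(self.vocabulary[obj])

    def interpret(self, word, context):
        """Hearer role: return the object this word points to, if known."""
        for obj in context:
            if word in self.vocabulary.get(obj, []):
                return obj
        return None

    def align_success(self, obj, word):
        """After success, keep only the winning word for this object."""
        self.vocabulary[obj] = [word]

    def adopt(self, obj, word):
        """After failure, the hearer learns the word the speaker used."""
        self.vocabulary.setdefault(obj, [])
        if word not in self.vocabulary[obj]:
            self.vocabulary[obj].append(word)


def play_round(speaker, hearer, world):
    """One situated interaction: a shared context, with pointing as feedback."""
    context = random.sample(world, k=3)   # both agents perceive the same scene
    topic = random.choice(context)        # speaker picks something to talk about
    word = speaker.name_for(topic)
    guess = hearer.interpret(word, context)
    if guess == topic:                    # communicative success: both agents align
        speaker.align_success(topic, word)
        hearer.align_success(topic, word)
        return True
    hearer.adopt(topic, word)             # corrective feedback, like pointing
    return False


if __name__ == "__main__":
    world = [f"object-{i}" for i in range(5)]
    agents = [Agent() for _ in range(4)]
    successes = []
    for _ in range(2000):
        speaker, hearer = random.sample(agents, k=2)
        successes.append(play_round(speaker, hearer, world))
    # communicative success typically climbs as shared conventions settle
    recent = successes[-200:]
    print(f"success rate over last 200 rounds: {sum(recent) / len(recent):.0%}")
```

In this kind of toy setting, the words the agents end up sharing are meaningful only because they are tied to objects the agents actually perceive and to the outcome of each interaction, which is the point the researchers make about grounding language in communication rather than in text statistics alone.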
“Integrating communicative and situated interactions into AI models is a crucial step in the development of the next generation of language models,” the researchers say. “This research offers a promising path towards language technologies that are closer to how humans understand and use language.”