The arrival of AI has had such an impact that even dictionaries have had to keep pace and consider terms related to this technology. Such is the case with the Cambridge English Dictionary, which recently announced its word of the year for 2023 and added a new definition to it. Naturally, the new definition directly references artificial intelligence and the phenomenon it represents, although not in the best light.
To begin with, it’s worth mentioning that the word selected as the most prominent of 2023 is “hallucinate”. The word has a strong connection with the AI phenomenon, particularly with its manifestations in the form of chatbots. Among them, the best known is ChatGPT, the creation of OpenAI. Despite its popularity, it is not exempt from making certain errors or, as the aforementioned dictionary puts it, “hallucinating”, producing information that is often misleading or outright false.
AI hallucinations remain a concerning issue
The Cambridge English Dictionary has updated the definition of “hallucinate” to include an additional meaning: “When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information.” With this new definition, the dictionary takes a significant step toward making users aware that AI is not infallible and is capable of providing inaccurate and false information that can nonetheless appear entirely plausible.
There have already been cases of misleading and false results from AI chatbots such as ChatGPT and Google Bard, which recently opened its doors to younger users. In fact, OpenAI’s chatbot caused problems for a US law firm that, after relying on it for legal research, ended up citing fictitious cases in court. The Cambridge Dictionary itself has added two usage examples to clarify the new definition, one of which describes precisely the kind of error experienced by that law firm.
Wendalyn Nichols, Director of Publishing at Cambridge Dictionary, mentioned: “The fact that AIs can ‘hallucinate’ reminds us that humans still need to bring their critical thinking skills to the use of these tools. AIs are fantastic at processing large amounts of data to extract specific information and consolidate it. But the more original you ask them to be, the more likely they are to go astray.”
“Managing the tendency of generative AI tools to hallucinate will be key to ensuring that our users can continue to trust us,” Nichols added. “The emergence of a new meaning of ‘hallucinate’ is a good example of this. It’s human experts who track and capture changes in language that make the Cambridge Dictionary a reliable source of information about new words and senses, ones that publicly-facing AI tools have not yet learned.”
Beyond all this, the Cambridge Dictionary has not only highlighted the problems that can arise from the use of, and blind trust in, AI-based technology (these hallucinations will likely diminish over time, but for now they are far from disappearing and can cause serious issues); it has also added several AI-related definitions throughout 2023, and this type of language is expected to have an even greater presence and impact in 2024.