The word of the year is “hallucinate.” At first glance, it may not seem AI-related. You might have guessed words like “artificial” or “AI” itself. But “hallucinate,” as Dictionary.com explains, is a major word in the world of AI and one that the site chose for a specific purpose.
As Dictionary.com defines it, in AI terms, hallucinating means “producing false information contrary to the user’s intent and presenting it as if it were true and factual.”
In a year when AI has gone mainstream, hallucinate has emerged as a particularly important word. Dictionary.com noted that it saw a 46% increase in searches in 2023 and an 85% increase in media usage.
“Hallucinate is particularly notable among the terms that AI has popularized because it refers not to one aspect of how AI works but to one of the ways in which it can malfunction,” Dictionary.com wrote in a statement announcing the word of the year. “In this way, it is akin to other technical cautionary terms, like spam and virus, which are now anchored in our language. This is precisely one of the reasons why our lexicographers expect the word to remain relevant, at least in the near future.”
For better or worse, we will all be learning and using AI-related terms for the foreseeable future. Mashable’s Cecily Mauran, in fact, wrote a comprehensive glossary of the AI terms you need to know. Among the words in the glossary: hallucination. As Mauran notes, some people might think that AI is omniscient and ultra-capable, but the very existence of the term proves otherwise.
Mauran wrote: “[Hallucination] occurs because generative AI models work by predicting words based on a probabilistic relationship with the previous word. It is not capable of understanding what it generates. It’s a reminder that ChatGPT can seem sentient, but it’s not.”
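To make Mauran’s point concrete, here is a deliberately tiny sketch of that idea: a bigram model that predicts the next word purely from the one before it, based on counts from a toy corpus. This is an illustrative simplification, not how ChatGPT actually works (real models condition on far more context with neural networks), but it shows the core mechanism she describes: the model emits statistically likely words without any understanding of whether they are true.

```python
from collections import defaultdict, Counter

# Toy training corpus (purely illustrative).
corpus = (
    "the model predicts the next word "
    "the model generates the next sentence "
    "the word of the year is hallucinate"
).split()

# Count which words follow each word: a "probabilistic relationship
# with the previous word," and nothing more.
transitions = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev][nxt] += 1

def predict_next(word):
    """Return the most frequently observed follower of `word`, or None."""
    followers = transitions[word]
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("the"))
```

The predictor has no notion of truth: it will happily chain plausible-looking words together, which is exactly why fluent but false output, the “hallucination” of the word of the year, is possible.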