In Defense of AI Hallucinations

by admin
January 6, 2024
in Technology

By WIRED


No one knows whether artificial intelligence will be a boon or curse in the far future. But right now, there’s almost universal discomfort and contempt for one habit of these chatbots and agents: hallucinations, those made-up facts that appear in the outputs of large language models like ChatGPT. In the middle of what seems like a carefully constructed answer, the LLM will slip in something that seems reasonable but is a total fabrication. Your typical chatbot can make disgraced ex-congressman George Santos look like Abe Lincoln. Since it looks inevitable that chatbots will one day generate the vast majority of all prose ever written, all the AI companies are obsessed with minimizing and eliminating hallucinations, or at least convincing the world the problem is in hand.

Obviously, the value of LLMs will reach a new level when and if hallucinations approach zero. But before that happens, I ask you to raise a toast to AI’s confabulations.

Hallucinations fascinate me, even though AI scientists have a pretty good idea why they happen. An AI startup called Vectara has studied them and their prevalence, even compiling the hallucination rates of various models when asked to summarize a document. (OpenAI’s GPT-4 does best, hallucinating only around 3 percent of the time; Google’s now outdated Palm Chat—not its chatbot Bard!—had a shocking 27 percent rate, although to be fair, summarizing documents wasn’t in Palm Chat’s wheelhouse.) Vectara’s CTO, Amin Ahmad, says that LLMs create a compressed representation of all the training data fed through their artificial neurons. “The nature of compression is that the fine details can get lost,” he says. A model ends up primed with the most likely answers to queries from users but doesn’t have the exact facts at its disposal. “When it gets to the details it starts making things up,” he says.
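Ahmad’s compression point can be sketched in a few lines of Python. This is a purely illustrative toy—not Vectara’s method or how any real model works—but it shows the mechanism he describes: squeeze a “training set” down to its most frequent facts, and rare details must be guessed at answer time.

```python
from collections import Counter

def compress(facts, k):
    # Crude stand-in for lossy training: keep only the k most
    # frequent facts; rarer details are dropped in compression.
    return set(fact for fact, _ in Counter(facts).most_common(k))

corpus = [("Paris", "capital of France")] * 50 + \
         [("Ulm", "Einstein's birthplace")]  # a fine detail, seen once

model = compress(corpus, k=1)  # heavy compression

def answer(model, subject):
    # If the detail survived compression, recall it; otherwise
    # the "model" confabulates rather than say nothing.
    for subj, attr in model:
        if subj == subject:
            return attr
    return "a plausible-sounding guess"  # the hallucination
```

Asked about Paris, this toy recalls the fact that survived; asked about Ulm, it fills the gap with something plausible—exactly the “making things up” Ahmad describes.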

Santosh Vempala, a computer science professor at Georgia Tech, has also studied hallucinations. “A language model is just a probabilistic model of the world,” he says, not a truthful mirror of reality. Vempala explains that an LLM’s answer strives for a general calibration with the real world—as represented in its training data—which is “a weak version of accuracy.” His research, published with OpenAI’s Adam Kalai, found that hallucinations are unavoidable for facts that can’t be verified using the information in a model’s training data.
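Vempala’s point that a language model is “just a probabilistic model” can be made concrete with a toy bigram model (a hypothetical corpus and sketch of my own, not from his paper): a sentence that never occurs in the training data can still receive substantial probability, because the model is calibrated to word statistics, not to facts.

```python
from collections import defaultdict, Counter

# Toy training corpus: three "true" sentences.
corpus = [
    "the senator won a national award",
    "the senator won a national election",
    "the writer won a national award",
]

# Estimate next-word probabilities from bigram counts.
bigrams = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        bigrams[a][b] += 1

def seq_prob(sentence):
    """Probability the bigram model assigns to a word sequence."""
    words = sentence.split()
    p = 1.0
    for a, b in zip(words, words[1:]):
        total = sum(bigrams[a].values())
        if total == 0:
            return 0.0
        p *= bigrams[a][b] / total
    return p

# Never appears in the corpus, yet gets probability 1/9:
fabricated = "the writer won a national election"
```

The model assigns the fabricated sentence a probability of 1/9—each word follows plausibly from the last—even though nothing in its “world” ever said it. That gap between statistical plausibility and truth is the hallucination.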

That’s the science/math of AI hallucinations, but they’re also notable for the experience they can elicit in humans. At times, these generative fabrications can seem more plausible than actual facts, which are often astonishingly bizarre and unsatisfying. How often do you hear something described as so strange that no screenwriter would dare script it in a movie? These days, all the time! Hallucinations can seduce us by appearing to ground us in a world less jarring than the actual one we live in. What’s more, I find it telling to note just which details the bots tend to concoct. In their desperate attempt to fill in the blanks of a satisfying narrative, they gravitate toward the most statistically likely version of reality as represented in their internet-scale training data, which can be a truth in itself. I liken it to a fiction writer penning a novel inspired by real events. A good author will veer from what actually happened to an imagined scenario that reveals a deeper truth, striving to create something more real than reality.

When I asked ChatGPT to write an obituary for me—admit it, you’ve tried this too—it got many things right but a few things wrong. It gave me grandchildren I didn’t have, bestowed an earlier birth date, and added a National Magazine Award to my résumé for articles I didn’t write about the dotcom bust in the late 1990s. In the LLM’s assessment of my life, this is something that should have happened based on the facts of my career. I agree! It’s only because of real life’s imperfection that the American Society of Magazine Editors failed to award me the metal elephant sculpture that comes with that honor. After almost 50 years of magazine writing, that’s on them, not me! It’s almost as if ChatGPT took a poll of possible multiverses and found that in most of them I had an Ellie award. Sure, I would have preferred that, here in my own corner of the multiverse, human judges had called me to the podium. But recognition from a vamping artificial neural net is better than nothing.


Contacts

Email: [email protected]
Phone: +1 506-871-6371

© 2021 Peril of Africa
