Silicon Valley professionals are using this method to reduce AI hallucinations

A Gen AI tool is said to hallucinate when it starts making up answers that are factually incorrect or unverifiable.

June 18, 2024 / 07:30 IST

Generative AI, or Gen AI, is a type of artificial intelligence that can generate original content like text, art and music by drawing on existing content on the internet. But Generative AI is not always right in its answers; it can hallucinate. Silicon Valley professionals have come up with some ways to reduce how often Gen AI hallucinates, the most popular approach being Retrieval Augmented Generation (RAG), according to a report by Wired.

When a person hallucinates, they begin to see things or people that aren’t actually there. In a somewhat similar way, a Gen AI tool is said to hallucinate when it starts making up answers that are factually incorrect or unverifiable.

How does Retrieval Augmented Generation (RAG) reduce AI hallucinations?

A RAG-based approach doesn’t just fetch information encoded during the initial training of the Gen AI model, the way a general question thrown at a tool like ChatGPT or Copilot does. Instead, the Retrieval Augmented Generation (RAG) process fortifies or augments your prompt by first sourcing information from a “custom database”, after which a Large Language Model (LLM) generates the response based on that retrieved data.

What sets Retrieval Augmented Generation (RAG) apart from a basic Gen AI query is that it reportedly causes the search engine to “pull in real documents” on a given topic and then “anchor the response of the model to those documents.” This is what Pablo Arredondo, vice president of CoCounsel at Thomson Reuters, told Wired in an interview.
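To make that flow concrete, here is a minimal sketch of a retrieve-then-generate pipeline. It assumes a small in-memory “custom database”, TF-IDF similarity for the retrieval step, and a placeholder generate() function standing in for whatever LLM API a real system would call; none of these specifics come from the Wired report.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A tiny stand-in for the "custom database" the article describes.
documents = [
    "RAG retrieves documents from a custom database before generating an answer.",
    "LLMs can hallucinate when they answer only from their training data.",
    "Grounding a response in retrieved documents reduces hallucinations.",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank the documents by similarity to the query and return the top k."""
    matrix = TfidfVectorizer().fit_transform(docs + [query])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    ranked = sorted(zip(scores, docs), reverse=True)
    return [doc for _, doc in ranked[:k]]

def generate(prompt: str) -> str:
    """Placeholder for the LLM call; a real system would call a model API here."""
    return f"[model answer grounded in]\n{prompt}"

query = "Why does RAG reduce hallucinations?"
context = "\n".join(retrieve(query, documents))
# Augment the prompt with the retrieved documents so the model's answer is
# anchored to them rather than drawn only from its training data.
prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"
print(generate(prompt))
```

The key step is the augmented prompt: the retrieved documents travel with the question, so the model is asked to answer from them rather than from memory alone.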

Using Retrieval Augmented Generation (RAG) is not a foolproof method, however; the Gen AI model can still hallucinate, but RAG can reduce the number of times it does.

Additionally, various factors reportedly come into play when professionals look to reduce AI hallucinations. These include the quality of the content RAG pulls data from, the quality of the search, and whether the right content is retrieved for the question put to the Gen AI model. In the end, the output of the Gen AI model should be grounded in the data provided and, at the same time, be factually correct.
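To illustrate the retrieval-quality point, here is a small, self-contained sketch that keeps only documents similar enough to the question to be worth grounding an answer in. The TF-IDF similarity measure and the 0.1 relevance cutoff are illustrative assumptions, not details from the article.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def relevant_documents(query: str, docs: list[str], threshold: float = 0.1) -> list[str]:
    """Return only the documents similar enough to the query to ground an answer."""
    matrix = TfidfVectorizer().fit_transform(docs + [query])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    return [doc for doc, score in zip(docs, scores) if score >= threshold]

docs = [
    "Quarterly revenue rose 12 percent year over year.",
    "The office cafeteria menu changes on Mondays.",
]
# Only the revenue document should clear the cutoff for this question.
print(relevant_documents("What was the revenue growth last quarter?", docs))
```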


Utkarsh Saurbh
first published: Jun 18, 2024 07:30 am
