
Silicon Valley professionals are using this method to reduce AI hallucinations

A Gen AI tool is said to hallucinate when it starts making up answers that are factually incorrect or unverifiable.

June 18, 2024 / 07:30 IST

Generative AI, or Gen AI, is a type of artificial intelligence that can generate original content like text, art and music by drawing on existing content from the internet. But Generative AI is not always right in its answers; it can hallucinate. Silicon Valley professionals have come up with ways to reduce Gen AI hallucinations, the most popular approach being Retrieval Augmented Generation (RAG), according to a report by Wired.

When a person hallucinates, they begin to see things or people that aren’t actually there. In a somewhat similar way, a Gen AI tool is said to hallucinate when it starts making up answers that are factually incorrect or unverifiable.


How does Retrieval Augmented Generation (RAG) reduce AI hallucinations?

When you pose a general question to a Gen AI tool like ChatGPT or Copilot, the model answers using only the information encoded during its initial training. A RAG-based approach works differently: it fortifies, or augments, your prompt by first retrieving relevant information from a "custom database", and the Large Language Model (LLM) then generates its response grounded in that retrieved data.
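To make the flow concrete, here is a minimal sketch of a RAG pipeline in Python. It assumes a toy in-memory "custom database" and simple keyword matching in place of the vector search a production system would use; the documents, function names and the final step of sending the prompt to an LLM are illustrative, not taken from the Wired report.

```python
# A minimal RAG sketch: retrieve relevant documents, then augment the
# user's prompt with them before it is sent to an LLM.

# Hypothetical custom database: a handful of documents.
DOCUMENTS = [
    "The company's refund policy allows returns within 30 days of purchase.",
    "Support hours are 9 a.m. to 5 p.m. IST, Monday through Friday.",
    "Premium subscribers get priority email support within 24 hours.",
]

def retrieve(query: str, top_k: int = 2) -> list[str]:
    """Score each document by how many words it shares with the query
    and return the best matches (a stand-in for vector search)."""
    query_words = set(query.lower().split())
    scored = sorted(
        DOCUMENTS,
        key=lambda doc: len(query_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_augmented_prompt(query: str) -> str:
    """Fortify the user's prompt with retrieved context; in a real
    system this string would be sent to an LLM API."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query))
    return (
        "Answer using ONLY the context below. "
        "If the answer is not in the context, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

if __name__ == "__main__":
    # Grounding the model in retrieved facts, and instructing it to
    # stay within them, is what reduces hallucination.
    print(build_augmented_prompt("What is the refund policy?"))
```

The key design point is visible in the final prompt: the LLM is asked to answer from the supplied context rather than from whatever it absorbed during training, which is why RAG answers are easier to verify against a known source.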