
Gemini not always reliable in responding to prompts: Google after chatbot's response on PM

Google acknowledges concerns over AI tool Gemini's biased response regarding PM Narendra Modi, vows to address issues and admits reliability limitations.

February 24, 2024 / 14:59 IST
Google responds to criticism over AI tool Gemini's biased response on PM Narendra Modi, admits chatbot's unreliability on political topics.

Under fire over AI tool Gemini’s objectionable and biased response to a question on PM Narendra Modi, Google on Saturday said it has worked quickly to address the issue and conceded that the chatbot “may not always be reliable” in responding to certain prompts related to current events and political topics.

On Friday, Union Minister Rajeev Chandrasekhar had warned that Google’s AI tool Gemini’s response to a question about the Prime Minister was in direct violation of IT rules as well as several provisions of the criminal code.


Chandrasekhar, Minister of State for IT and Electronics, had taken cognizance of the issue raised by a journalist’s verified account alleging bias in Google Gemini, which responded to a question on Modi but gave no clear answer when a similar question was asked about Trump and Zelenskyy.

In an emailed statement, a Google spokesperson said, “We’ve worked quickly to address this issue.” Google further said Gemini is built as a creativity and productivity tool and “may not always be reliable, especially when it comes to responding to some prompts about current events, political topics, or evolving news”.
“This is something that we’re constantly working on improving,” the spokesperson said.