Alphabet Inc., the parent company of Google, has reportedly warned employees not to share confidential information with chatbots such as ChatGPT and its own Bard AI. According to a report by Reuters, the company has told employees that human reviewers may read their conversations with the bots. There is also a risk of the AI bots reproducing that data, which could lead to leaks.
Sources told Reuters that the company has also advised engineers and programmers to avoid using computer code generated by AI bots, because the bots can sometimes produce undesired code suggestions.
According to Insider, Google had issued a similar warning to employees in February this year, a month before it launched Bard AI. Google's privacy notice sent out in June stated, "Don't include confidential or sensitive information in your Bard conversations".
By default, both ChatGPT and Bard AI save user conversation histories on their servers. Users can opt out by turning the feature off on the settings page. Both Google and OpenAI also offer pricier enterprise subscriptions that do not save conversation data on their servers.