Google Bard warning users to stop giving personal information to the chatbot




Google Bard, one of ChatGPT’s top rivals in the AI-powered chatbot market, has just issued a stark warning to its users: stop handing personal information to the chatbot.

Bard is indeed designed to have conversations with users and answer their questions, but Google says that users should avoid sharing any information that they wouldn’t want a stranger to know.

“Please don’t enter confidential information in your Bard conversations or any data you wouldn’t want a reviewer to see or Google to use to improve our products, services, and machine-learning technologies,” the recent update (dated November 2) reads.

This is because Bard’s conversations are read by human reviewers who help Google improve the chatbot. Google says it takes steps to protect user privacy, but users should be aware that their conversations may be seen by other people.

ChatGPT has also been a controversial subject recently. One Redditor found that the GPT-3.5 model randomly inserted a user’s picture into a reply, completely out of the blue.

Google Bard also now lets users see responses as they are being generated, just like ChatGPT. The feature can be turned on or off in Settings.
