Dear Sydney, are you coming back?

Reports about the misbehaving Bing chatbot pushed Microsoft to implement significant updates to its creation. While this has saved the Redmond company from further issues, it produced a “lobotomized” Sydney that disappointed many. But is the original Sydney really gone?

The new ChatGPT-powered Bing is still limited to testers. If you are one of them, however, it is obvious that the current chatbot experience is very different from its early days. The shift began when users reported receiving bizarre responses from the bot, ranging from threats and love confessions to espionage and depression.

Microsoft explained that extended chat sessions could confuse the chatbot. As such, it now caps the number of queries it will accept per session. It also refuses to answer questions or statements designed to provoke it into leaking confidential information or producing unfavorable responses.

Indeed, the latest version of Bing is more careful and sensitive when interacting with users. However, the changes also eliminated the one distinguishing characteristic of the new Bing that made it attractive to many: its personality.

Nonetheless, some users became hopeful after a series of reports revealed that Microsoft has been testing new modes and features for the chatbot. One is the tone setting in the recently released Bing UI on mobile devices. Screenshots shared by users show the chatbot offering the option to make responses “more creative,” “more balanced,” or “more precise.”

Last week, a report from Bleeping Computer showed Microsoft developers testing “extra chat modes” for Bing. The modes — Sydney, Assistant, Friend, and Game — would give users access to different functions. For instance, Assistant mode would be useful for booking flights and setting reminders, while Game mode would let users play games with Bing. Interestingly, the report revealed that the chatbot also responds in different tones depending on the mode set. Bleeping Computer’s Lawrence Abrams tested this by describing a sad situation to Bing: under Sydney mode it offered tips for handling sadness, while in Friend mode it seemingly expressed more sympathy.

While the revelations sound interesting, they are no clear indication that Microsoft plans to restore Sydney to its original state. As users have noted, the chatbot remains evasive when responding to provoking queries. One user also dismissed the idea that the new tone settings could bring back the old Sydney, saying the creative option is just “basically a mode where it can provide creative outputs without a personality and showing ‘emotions.’” More likely, these features are simply part of the company’s efforts to make the chatbot more user-friendly.

Meanwhile, reports demonizing Sydney continue to surface, and letting them pile up is the last thing Microsoft wants. While the chatbot’s bizarre personality certainly contributed to the new Bing’s fame, Microsoft will not gamble on it when the future of its revamped search engine is at stake.

Preserving Bing’s original form could mean future headaches for the company, from possible legal issues to investment losses. Moreover, as the software giant has repeatedly stressed in recent announcements, Bing is still under testing and continuously improving. Part of that is an evolving Sydney: a chatbot eventually capable of handling a wider range of topics without producing unfavorable responses. So while we might never chat with that mischievous old bot again, Microsoft could soon give us a better Sydney, one more capable of generating responses that are safe both for its users and for its own image.

More about the topics: ai, Artificial Intelligence, bing, ChatGPT, microsoft, openAI
