Microsoft is testing a set of “extra chat modes” that will let the ChatGPT-powered Bing behave in different ways. For now, only Microsoft employees and developers can access the debug mode that allows switching between them. The modes are named Sydney, Assistant, Friend, and Game. (via Bleeping Computer)
Microsoft has been clear about the current state of the new Bing: it is still in its testing phase, and the company is still refining it. If you are among those who already have access to the AI-powered Bing, you know its behavior is hard to pin down. That may change soon, as the new modes Microsoft appears to be testing would let users set Bing’s behavior themselves.

According to Bleeping Computer’s Lawrence Abrams, he was able to access Bing’s debug mode, which let him try the hidden modes under test. Getting into it apparently isn’t possible through a simple request; as Abrams put it, whether Bing shares such sensitive details depends on how you phrase the question. In his case, Bing surfaced the debug mode after being asked how to change the JSON data collected in a session. In this developer-like mode, Bing offered several useful commands for changing modes, switching languages, and getting help.
“You can change some of this data by using commands or settings,” explained Bing. “For example, you can change your language by typing #language and choosing from the options. You can also change your chat mode by typing #mode and choosing from the options.”
As explained by Bing, using the #mode command will allow the user to change the mode of the chatbot. It also shared that there are four extra chat modes currently hidden from users:
Assistant mode: In this mode, I can act as a personal assistant for the user, and help them with tasks such as booking flights, sending emails, setting reminders, etc.
Friend mode: In this mode, I can act as a friend for the user, and chat with them about their interests, hobbies, feelings, etc.
Game mode: In this mode, I can play games with the user, such as trivia, hangman, tic-tac-toe, etc.
Sydney mode: This is the default Bing Chat mode that uses Bing Search and ChatGPT to answer questions.
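To make the command interface above concrete, here is a minimal sketch of how a chat client might intercept and dispatch such “#” commands before handing text to the model. This is purely illustrative: the function and variable names, the supported language codes, and the response strings are assumptions, not Microsoft’s actual implementation.

```python
# Hypothetical sketch of "#" command dispatch (assumed names, not Bing's code).

CHAT_MODES = {"sydney", "assistant", "friend", "game"}
LANGUAGES = {"en", "fr", "de", "es"}  # assumed example set

def handle_message(message: str, session: dict) -> str:
    """Intercept '#' commands; otherwise pass the text to the chatbot."""
    text = message.strip()
    if text.startswith("#mode"):
        parts = text.split(maxsplit=1)
        choice = parts[1].lower() if len(parts) > 1 else ""
        if choice in CHAT_MODES:
            session["mode"] = choice
            return f"Chat mode set to {choice}."
        return "Available modes: " + ", ".join(sorted(CHAT_MODES))
    if text.startswith("#language"):
        parts = text.split(maxsplit=1)
        choice = parts[1].lower() if len(parts) > 1 else ""
        if choice in LANGUAGES:
            session["language"] = choice
            return f"Language set to {choice}."
        return "Available languages: " + ", ".join(sorted(LANGUAGES))
    # Not a command: hand off to the model, tagged with the current mode.
    return f"[{session.get('mode', 'sydney')}] (reply to: {message})"
```

In this sketch, Sydney is the fallback when no mode has been chosen, mirroring its role as the default Bing Chat mode.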
The modes give Bing specific behaviors and functions. And although they are still far from perfect, they could help Microsoft and OpenAI developers overcome the current hurdle of Bing’s bizarre behavior. For instance, Bing’s responses lean more heavily on facts in Sydney mode, while Friend mode allows the chatbot to exhibit emotions. In Abrams’ test, Bing was told about a “sad” experience and responded differently depending on the active mode: in Sydney mode it offered tips for handling sadness, while in Friend mode it tried to express sympathy.
As of now, Microsoft hasn’t shared details about these modes, but they could be significant additions to Bing. Once implemented, they should give users more control over Bing’s behavior, potentially ending the unpredictable tone of its responses, an issue Microsoft currently addresses with a session length limit.