Meta's BlenderBot 3: An improved chatbot designed to “learn from conversations”


You’re probably one of those folks who fell in love with the humor of TARS, the AI robot in the movie Interstellar. We heard it crack jokes and deliver one-liners like a cool guy at a bar. Unfortunately, current technology is still far from producing robots that can converse with us as casually and naturally as a living person. Nonetheless, Meta is taking a new step toward making that possible by introducing its latest AI research project, the BlenderBot 3 chatbot.

BlenderBot 3 is the latest entry in Meta’s BlenderBot series, which aims to let AI “carry out meaningful conversations” with the help of internet data and long-term memory. Compared to its predecessors, BlenderBot 3 uses a much larger language model, 58 times the size of BlenderBot 2. It is trained on a massive amount of publicly available language data containing countless conversations on a wide variety of topics, from recipes to interesting places and vacation spots.

The main highlight of BlenderBot 3, however, is its ability to learn from the conversations it has with different people on the web. Meta says it will use this technique to gather feedback about the model, which it will then use to improve its responses.

“The goal of our research is to collect and release feedback data that we and the broader AI research community can leverage over time. That way, we can find new ways for AI systems to be safer and more engaging for people who use them,” said Meta in a blog post. “While BlenderBot 3 significantly advances publicly available chatbots, it’s certainly not at a human level. It’s occasionally incorrect, inconsistent and off-topic. As more people interact with our demo, we’ll improve our models using their feedback and release data to benefit the wider AI community.”

Despite the new improvements, Meta admitted that BlenderBot 3 is far from perfect. While it is twice as knowledgeable as its predecessors and scores 31% better on conversational tasks, Meta noted that it can still behave in unfavorable ways. In fact, Meta said that 0.16% of its responses were flagged as inappropriate during testing. Meanwhile, to protect the chatbot from those who want to misuse it, Meta said it also gave BlenderBot 3 the ability to avoid learning from offensive remarks, especially inappropriate statements from trolls.

“We understand that not everyone who uses chatbots has good intentions, so we also developed new learning algorithms to distinguish between helpful responses and harmful examples,” Meta shared. “Over time, we will use this technique to make our models more responsible and safe for all users.”

BlenderBot 3 is already available for US users to try, but, as expected, it is not flawless. While it can deliver sentences smoothly, the chatbot sometimes loses the flow of conversation and jumps from one topic to another. For instance, in a conversation shared by Wired, BlenderBot 3 introduced ideas that were not particularly relevant to the discussion.

In another chat, this time with BuzzFeed data scientist Max Woolf, the bot praised Mark Zuckerberg, saying the Meta CEO is “a good business man.” However, when Yahoo! asked a rephrased version of the question, the bot contradicted the answer it gave Woolf, saying, “Oh man, big time. I don’t really like him at all. He’s too creepy and manipulative.”

With all that, BlenderBot 3 still has a long way to go before it becomes the intelligent chatbot the world can rely on. Nonetheless, with the right interactions and feedback, Meta should get the material it needs to further improve the system and (hopefully) help it decide whether Zuckerberg is actually a good businessman or a creepy, manipulative dude.

Image credits: Wired, Max Woolf, and Yahoo!