Microsoft: Copilot will have an inner monologue before answering your question


Large language models (LLMs) like Microsoft Copilot/Bing Chat are becoming increasingly sophisticated, prompting questions about their capabilities and whether they possess an inner monologue, the ability to engage in internal thought and deliberation.

But first, what is inner monologue?

Inner monologue is when you talk to yourself in your head. It’s like having a little voice telling you what you think and feel. Everybody does it, and it helps us figure out things about ourselves, make choices, and be creative.

A recent Twitter exchange between a user and Mikhail Parakhin, the President of Search and Advertising at Microsoft, highlighted this debate. The user cautioned against interpreting head-to-head comparisons of LLM responses as evidence of their psychology, emphasizing that their outputs are highly sensitive to prompts and don’t reveal much about their inner workings.

Parakhin, however, maintained that Microsoft Copilot/Bing Chat does have an inner monologue, describing the model’s process of planning tasks, gathering information, generating responses, and checking their reasonableness. This suggests that the model is not merely a text generator but a system capable of internal deliberation.
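
To make Parakhin's description concrete: the process he outlines resembles a plan, gather, generate, and check loop. The following is a minimal illustrative sketch only; it does not reflect Copilot's actual implementation, and `call_llm`, `answer_with_inner_monologue`, and all prompts are invented for this example.

```python
def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real model API; replace with an actual client."""
    return "OK"  # placeholder response so the sketch runs end to end


def answer_with_inner_monologue(question: str) -> str:
    # 1. Plan: ask the model to outline the steps needed to answer.
    plan = call_llm(f"Outline the steps needed to answer: {question}")

    # 2. Gather: collect information for each step (a real system
    #    might call a search tool here instead of the model itself).
    notes = call_llm(f"Following this plan, gather relevant facts:\n{plan}")

    # 3. Generate: draft an answer from the plan and the gathered notes.
    draft = call_llm(f"Question: {question}\nNotes: {notes}\nWrite an answer.")

    # 4. Check: ask the model to review the draft for reasonableness
    #    and revise it if needed before anything is shown to the user.
    verdict = call_llm(
        "Is this answer reasonable and consistent with the notes? "
        f"Reply OK, or REVISE: followed by a corrected answer.\nAnswer: {draft}"
    )
    if verdict.startswith("REVISE:"):
        return verdict.removeprefix("REVISE:").strip()
    return draft
```

The key point of such a design, and presumably of Parakhin's claim, is that the intermediate plan, notes, and self-check are never shown to the user; only the final answer is, which is what makes the deliberation "inner."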

The debate over LLM inner monologue raises fundamental questions about consciousness and artificial intelligence. As LLMs evolve, it is crucial to consider whether they are developing something akin to human thought or simply mimicking its appearance.