Microsoft is developing its own large language model with 500 billion parameters
Key notes
- The creation of MAI-1 signals Microsoft’s determination to reduce its reliance on OpenAI for AI models.
Through its strong partnership with OpenAI, Microsoft already leverages one of the industry’s top large language models (LLMs). Beyond OpenAI’s offerings, Microsoft Research consistently releases smaller language models (SLMs) to stay competitive with AI startups and open-source projects. Now, The Information reports that Microsoft is creating its own in-house LLM, code-named MAI-1.
While the Phi model family was trained with a maximum of 14 billion parameters, MAI-1 will utilize approximately 500 billion parameters. This positions MAI-1 to directly rival industry-leading LLMs like Google Gemini and Amazon Titan.
Mustafa Suleyman, who recently joined Microsoft, is leading the effort to develop MAI-1. Apart from hiring Suleyman and key members of the Inflection AI team, Microsoft also licensed access to Inflection's technology, so it may be drawing on that technology to develop MAI-1.
The MAI-1 effort makes clear that Microsoft doesn't want to depend entirely on OpenAI for AI models. Instead, it is building a three-tier portfolio: access to OpenAI's state-of-the-art models, the in-house MAI-1 intended to match those models' capabilities, and the Phi family of small models for on-device scenarios.