Amazon Olympus AI language model reportedly has more parameters than GPT-4

November 8, 2023


Amazon is entering the AI race against OpenAI and Google. Sources familiar with the matter told Reuters that the tech giant has allocated millions of dollars to build a new language model called Olympus.

By the looks of it, Olympus could become the largest language model trained to date. OpenAI’s latest GPT-4, for example, reportedly has one trillion parameters; Amazon’s ambitious plan doubles that to 2 trillion. Google Bard, by comparison, has 137 billion.

A parameter is a number that acts like a tuning knob for an AI language model. These knobs are set during training using data, and adjusting them lets the model fit its task better. Broadly speaking, more parameters mean more capacity to learn, though also higher training and running costs.
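To make the "tuning knob" idea concrete, here is a minimal sketch (my own illustration, not code from Amazon or OpenAI) of a model with a single parameter `w` that a training loop nudges toward the right value using example data:

```python
# A one-parameter model y = w * x learns w from examples.
# Real language models do the same thing with billions or
# trillions of such knobs instead of one.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # pairs where y = 2x

w = 0.0    # the single parameter, our "tuning knob"
lr = 0.05  # learning rate: how far each nudge goes

for _ in range(200):
    for x, y in data:
        pred = w * x
        grad = 2 * (pred - y) * x  # gradient of squared error w.r.t. w
        w -= lr * grad             # adjust the knob to shrink the error

print(round(w, 3))  # converges to 2.0, the true relationship in the data
```

Training a trillion-parameter model is the same loop in spirit, just scaled up across enormous datasets and compute clusters, which is why parameter counts are treated as a rough proxy for a model's size and cost.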

The report says that former Alexa head Rohit Prasad has brought together researchers from Alexa AI and the Amazon science team to lead the project. There is no word yet on when the language model will be released.

You may also remember that a few months back, Amazon invested over $4 billion in AI startup Anthropic, on top of another $300 million for Anthropic last year.

The sources also told the publication that the model could make Amazon’s cloud computing platform, AWS, more appealing to enterprise customers who demand access to the most cutting-edge AI tools.
