Amazon Olympus AI language model reportedly has more parameters than GPT-4




Amazon is entering the AI race against OpenAI and Google. Sources familiar with the matter told Reuters that the tech giant has allocated millions of dollars to build a new language model called Olympus.

If the report holds, Olympus could become the largest language model trained to date. OpenAI's latest GPT-4, for example, reportedly has one trillion parameters; Amazon's ambitious plan doubles that to two trillion. Google's Bard, by comparison, runs on a model with 137 billion.

A parameter is a number that acts like a tuning knob for an AI language model. These knobs are set during training using data, and adjusting them lets the model fit its task better. Generally, more parameters mean more capacity to capture nuance, though at a far higher training cost.
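To make the "tuning knob" idea concrete, here is a minimal toy sketch: a model with a single parameter, adjusted by gradient descent on a few data points. This is purely illustrative and has nothing to do with how Olympus or GPT-4 are actually trained; real models adjust billions or trillions of such knobs at once.

```python
# Toy example: one parameter ("tuning knob") fit to data y = 3 * x.
data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]

w = 0.0    # the single parameter, starting untuned
lr = 0.05  # learning rate: how far each adjustment moves the knob

for _ in range(200):
    for x, y in data:
        pred = w * x
        grad = 2 * (pred - y) * x  # gradient of squared error w.r.t. w
        w -= lr * grad             # nudge the knob to reduce the error

print(round(w, 2))  # settles near 3.0, the value that fits the data
```

A trillion-parameter model repeats this same adjust-to-reduce-error loop, just across vastly more knobs and data.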

The report says that former Alexa head Rohit Prasad has brought together researchers from Alexa AI and the Amazon science team to lead the project. There is no word yet on exactly when the language model will be released.

You may also remember that a few months back, Amazon splurged over $4 billion on AI model startup Anthropic, on top of another $300 million for Anthropic last year.

The sources also told the publication that Amazon believes the model will make its cloud computing platform, AWS, more appealing to enterprise customers who demand access to the most cutting-edge AI tools.
