Amazon Is Building an LLM Twice the Size of OpenAI’s GPT-4

Few markets have grown as fast, in as short a time, as artificial intelligence (AI).

And as the technology is increasingly deployed across industries ranging from marketing to payments to insurance, execution speed is only becoming more important.

Against that backdrop, per a Tuesday (Nov. 7) report, Amazon is working on an ambitious new large language model (LLM), which it could announce as soon as December.

Code-named “Olympus,” the rumored LLM is said to be one of the largest foundation models ever trained, at an alleged 2 trillion parameters — double that of its closest competitor, OpenAI’s state-of-the-art GPT-4 model, which reportedly has 1 trillion parameters.

#big7