by Maria Deutscher

When OpenAI introduced GPT-3 in mid-2020, it detailed that the initial version of the model had 175 billion parameters. The company disclosed that GPT-4 is larger but hasn’t yet shared specific numbers. Some reports suggest that OpenAI’s flagship LLM includes 1.76 trillion parameters while Google LLC’s Gemini Ultra, which has comparable performance to GPT-4, reportedly features 1.6 trillion.

That Microsoft’s MAI-1 reportedly comprises 500 billion parameters suggests it could be positioned as a kind of midrange option between GPT-3 and GPT-4. Such a configuration would allow the model to provide high response accuracy while using significantly less power than OpenAI’s flagship LLM. That would translate into lower inference costs for Microsoft.

According to The Information, the development of MAI-1 is being overseen by Mustafa Suleyman, a co-founder of LLM developer Inflection AI Inc. Suleyman joined Microsoft in March along with most of the startup’s employees through a deal reportedly worth $625 million. The executive earlier co-founded DeepMind, the AI research group that Google LLC later acquired.
