
2024-09-19

by Kyt Dotson

Alibaba Cloud, the cloud computing arm of China’s Alibaba Group Ltd., today announced the release of more than 100 new open-source artificial intelligence large language models as part of its Qwen 2.5 family.

Revealed at the company’s Apsara Conference, the new model series follows last year’s release of the company’s foundation model, Tongyi Qianwen, or Qwen. Since then, the Qwen models have been downloaded more than 40 million times across platforms such as Hugging Face and ModelScope.

The new models range in size from as small as half a billion parameters to as large as 72 billion parameters. In an LLM, parameters are the learned weights that determine the model’s behavior and underpin its ability to make predictions in areas such as mathematics, coding or expert knowledge.
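As a rough illustration of what those parameter counts mean in practice, the number of parameters translates directly into the memory needed just to hold a model’s weights. The sketch below uses the 0.5-billion and 72-billion endpoints of the Qwen 2.5 range mentioned above; the bytes-per-parameter figure is a generic assumption (16-bit precision), not a detail from the article.

```python
# Back-of-the-envelope memory estimate for storing model weights.
# Assumes 2 bytes per parameter (fp16/bf16) -- a common but generic
# choice, not a Qwen 2.5 specification.

def weight_memory_gib(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate GiB required just to hold the weights in memory."""
    return num_params * bytes_per_param / (1024 ** 3)

for billions in (0.5, 72):
    gib = weight_memory_gib(billions * 1e9)
    print(f"{billions}B parameters ~ {gib:.0f} GiB at 16-bit precision")
```

This is why the half-billion-parameter end of the range can run on modest hardware (roughly 1 GiB of weights) while the 72-billion-parameter end needs well over 100 GiB, typically spread across multiple accelerators.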

Smaller, lightweight models can be trained quickly on more focused datasets using far less processing power, and they excel at simpler tasks. Larger models, by contrast, demand heavy processing power and longer training times, but generally perform better on complex tasks that require deep language understanding.
