
SmolLM2 is a family of compact language models available in three sizes: 135M, 360M, and 1.7B parameters. They are capable of solving a wide range of tasks while being lightweight enough to run on-device.

Our most powerful model is SmolLM2-1.7B-Instruct, which you can use as an assistant with transformers, trl, or with quantized versions through tools like llama.cpp, MLX, and transformers.js. For lighter applications, you can also use the smaller models SmolLM2-360M and SmolLM2-135M, which are suitable for on-device usage and can be integrated in the same way.
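As a rough sketch of what assistant-style usage with transformers looks like (the generation settings and the example prompt here are illustrative choices, not prescribed by the release):

```python
# Minimal chat sketch with transformers; sampling settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "HuggingFaceTB/SmolLM2-1.7B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

messages = [{"role": "user", "content": "What is the capital of France?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=64, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```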

You can find more details on how to leverage the model for use cases such as text summarization, text rewriting and function calling in the model card: https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B-Instruct
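For instance, a summarization request can go through the same chat interface; the prompt wording below is a sketch rather than the exact format from the model card:

```python
# Summarization prompt sketch (wording is illustrative, not taken from the model card).
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "HuggingFaceTB/SmolLM2-1.7B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

article = "SmolLM2 is a family of compact language models available in three sizes ..."
messages = [{"role": "user",
             "content": f"Summarize the following text in one sentence:\n\n{article}"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```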

You can use the models locally with frameworks like llama.cpp, MLX, and transformers.js, which support SmolLM2. All models are available in this collection.
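As one sketch of local usage, the llama-cpp-python bindings can load a quantized GGUF checkpoint from the Hub; the repository id and filename pattern below are assumptions, so check the collection for the exact names:

```python
# Local inference sketch via llama-cpp-python.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="HuggingFaceTB/SmolLM2-1.7B-Instruct-GGUF",  # assumed GGUF repo; verify in the collection
    filename="*q4_k_m.gguf",                             # pick a quantization that fits your device
)
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Give me a one-line summary of SmolLM2."}],
    max_tokens=64,
)
print(response["choices"][0]["message"]["content"])
```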
