My weekend side project: MiniLLM, a minimal system for running modern LLMs on consumer GPUs ✨ 🐦 Supports multiple LLMs (LLaMA, BLOOM, OPT)


Submitted by
Style Pass
2023-03-16 20:30:04

My weekend side project: MiniLLM, a minimal system for running modern LLMs on consumer GPUs ✨ 🐦 Supports multiple LLMs (LLaMA, BLOOM, OPT) ⚙️ Supports NVIDIA GPUs, not just Apple Silicon 🧚‍♀️ Tiny, easy-to-use codebase in Python (<500 LOC) https://github.com/kuleshov/minillm
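Why consumer GPUs are a constraint at all comes down to memory arithmetic: a 7B-parameter model stored in 16-bit precision needs roughly 14 GB of VRAM just for the weights, which is why lower-precision storage is the usual trick for fitting these models on consumer cards. The back-of-envelope sketch below illustrates that (weights only, ignoring activations and KV cache); the function and figures are illustrative, not code from the MiniLLM repo:

```python
# Rough VRAM estimate for holding model weights on a GPU.
# Illustrative only: counts weights, not activations or KV cache.

def weight_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate GB of GPU memory needed just to store the weights."""
    # 1e9 parameters * bytes per parameter ~= that many GB
    return params_billions * bytes_per_param

fp16_gb = weight_vram_gb(7, 2.0)   # a 7B model at 16-bit: ~14 GB
int4_gb = weight_vram_gb(7, 0.5)   # the same model at 4-bit: ~3.5 GB

print(f"16-bit: {fp16_gb:.1f} GB, 4-bit: {int4_gb:.1f} GB")
```

At 16-bit, a 7B model barely fits a 24 GB consumer card; at lower precision it fits comfortably, which is what makes "modern LLMs on consumer GPUs" feasible.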

