The easiest way to run an LLM locally on your Mac


I wrote a book! Check out A Quick Guide to Coding with AI. Become a super programmer! Learn how to use Generative AI coding tools as a force multiplier for your career.

I’ve written about running LLMs (large language models) on your local machine for a while now. I play with this sort of thing nearly every day. So, I’m always looking for cool things to do in this space and easy ways to introduce others (like you) to the world of LLMs. This is my latest installment.

While I’ve been using Ollama a ton lately, I saw this new product called LM Studio come up and thought I’d give it a shot. I’ll install it, try it out, and share my impressions. But there’s a twist!

I’m doing this on my trusty old Mac Mini! Usually, I do LLM work on my Digital Storm PC, which runs Windows 11 and Arch Linux with an NVIDIA 4090, and it runs local models really fast. But I’ve wanted to try this stuff on my M1 Mac for a while now, so I decided to try out LM Studio on it.

The first screen that comes up is the LM Studio home screen, and it’s pretty cool. It has a bunch of models listed, and you can click on them to see more information about them.
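Beyond the chat UI, LM Studio can also run a local server that speaks an OpenAI-compatible chat API, which makes it easy to script against. Here’s a minimal sketch of what a request to it might look like; the port (1234 is LM Studio’s usual default) and the model name are assumptions, so check the app’s Local Server tab for your actual settings:

```python
import json

# Assumed default endpoint for LM Studio's local server; verify in the app.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload for the local server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

# Model name below is a placeholder; use whatever you downloaded in LM Studio.
payload = build_chat_request("llama-3.2-1b-instruct", "Say hello in one sentence.")
print(json.dumps(payload, indent=2))

# To actually send it (requires the server running in LM Studio):
#   import urllib.request
#   req = urllib.request.Request(
#       f"{BASE_URL}/chat/completions",
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

Because the server mimics the OpenAI API shape, most existing OpenAI client code can be pointed at it just by changing the base URL.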
