
Ollama is now packaged for openSUSE


TL;DR: I’ve packaged Ollama for openSUSE! As of February 27, 2024, it’s available for Tumbleweed (Leap will hopefully pick it up at some point). Just run `sudo zypper ref && sudo zypper in ollama` to get it! Please note that it will not use your GPU for now; if you need GPU support, keep installing Ollama from https://ollama.com instead. Now for the main post…
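If you want to try it right now, the whole flow is roughly the following sketch. (I'm assuming the package ships a systemd unit named `ollama`, mirroring what the upstream install script sets up; adjust accordingly if yours differs.)

```sh
# Refresh repositories and install the new openSUSE package
sudo zypper ref && sudo zypper in ollama

# Assumption: the package provides a systemd unit named "ollama",
# like the one the upstream install script creates
sudo systemctl enable --now ollama

# Pull a model and chat with it to verify everything works
ollama run llama2
```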

As somebody who has been a local LLM enthusiast ever since the advent of the Llama models [citation needed], I have been using Ollama for some time now. However, Ollama has one small problem: the officially recommended way to install it on Linux is to pipe a shell script straight from the internet into your shell.

That’s right, it’s 2024 and we still haven’t learned that Flatpaks and AppImages are the best way to distribute a Linux application that will work on any distro (yes, I am part of the Snap hater community). Instead, the Ollama authors have created a script that will download Ollama, create a systemd service, and even install CUDA drivers if necessary.
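For reference, the install instructions on ollama.com boil down to this single curl-pipe-to-shell line at the time of writing:

```sh
# Downloads the installer and pipes it straight into your shell
curl -fsSL https://ollama.com/install.sh | sh
```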

Now, to be fair, Flatpaks and AppImages aren’t intended to be used for systemd services. Also, while Ollama’s website doesn’t inform you of the security risks of running a random script from the internet, it at least gives you a link to the script source, so you can read over it and make sure it’s not going to do anything nefarious.
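If piping a script straight into your shell makes you uneasy, a more cautious approach is to download it, read it, and only then run it:

```sh
# Download the install script without executing it
curl -fsSL https://ollama.com/install.sh -o ollama-install.sh

# Read through it and check it isn't doing anything nefarious
less ollama-install.sh

# Run it only once you're happy with what it does
sh ollama-install.sh
```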
