Running large language models locally
Model Shop

Image © sdecoret, 123RF.com
Ollama and Open WebUI let you join the AI revolution without relying on the cloud.
Large language models (LLMs) such as the ones behind OpenAI's [1] ChatGPT [2] are too resource-intensive to run locally on your own computer, which is why they're deployed as online services that you pay for. Since ChatGPT's release, however, there have been significant advances in smaller LLMs. Many of these smaller models are open source or have a liberal license (see the "Licenses" box). You can run them on your own computer without sending your input to a cloud server and without paying a fee to an online service.
Because these LLMs are computationally intensive and need a lot of RAM, running them on your CPU can be slow. For optimal performance, you need a GPU – GPUs have many parallel compute cores and a lot of dedicated RAM. An NVIDIA or AMD GPU with 8GB RAM or more is recommended.
In addition to the hardware and the models, you also need software that enables you to run the models. One popular package is Ollama [3], named for Meta AI's large language model Llama [4]. Ollama is a command-line application that runs on Linux, macOS, and Windows, and you can also run it as a server that other software connects to.
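As a sketch of the typical workflow, the commands below install Ollama on Linux with its official install script, run a model interactively, and start the server mode that tools such as Open WebUI connect to. The model name (llama3.2) is just an example; the install URL and default port reflect Ollama's current conventions and may change:

```shell
# Install Ollama on Linux using the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Download a model (if necessary) and chat with it interactively
ollama run llama3.2

# Alternatively, start Ollama as a server that other software,
# such as Open WebUI, can connect to
ollama serve

# Query the server's REST API on its default port, 11434
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Why is the sky blue?"}'
```

On many distributions the install script also registers Ollama as a systemd service, in which case the server is already running and `ollama serve` is unnecessary.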
[...]