Running large language models locally
Model Shop

Ollama and Open WebUI let you join the AI revolution without relying on the cloud.
Large language models (LLMs) such as the ones used by OpenAI's [1] ChatGPT [2] are too resource-intensive to run on your own computer. That's why they're deployed as online services that you pay for. However, since ChatGPT's release, some significant advancements have occurred around smaller LLMs. Many of these smaller LLMs are open source or have a liberal license (see the "Licenses" box). You can run them on your own computer without having to send your input to a cloud server and without having to pay a fee to an online service.
Because these LLMs are computationally intensive and need a lot of RAM, running them on your CPU can be slow. For optimal performance, you need a GPU – GPUs have many parallel compute cores and a lot of dedicated RAM. An NVIDIA or AMD GPU with 8GB RAM or more is recommended.
In addition to the hardware and the models, you also need software that enables you to run the models. One popular package is Ollama [3], named for Meta AI's large language model Llama [4]. Ollama is a command-line application that runs on Linux, macOS, and Windows, and you can also run it as a server that other software connects to.
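The basic Ollama workflow can be sketched in a few commands. The model tag used here, llama3.2, is just an example; substitute any model from the Ollama library that fits in your GPU's RAM:

```shell
# Install Ollama on Linux using the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Download a model from the Ollama library
# (llama3.2 is an example tag; smaller models need less RAM)
ollama pull llama3.2

# Chat with the model interactively in the terminal
ollama run llama3.2

# Or start Ollama as a server that other software can connect to;
# by default it listens on http://localhost:11434
ollama serve
```

Once the server is running, front ends such as Open WebUI connect to it over its local REST API, so your prompts never leave your machine.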
[...]