Linux
Install
To install Ollama, run the following command:
curl -fsSL https://ollama.com/install.sh | sh
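
If curl is not available, the same script can be fetched with wget instead (assuming wget is installed); this is equivalent to the one-liner above:
wget -qO- https://ollama.com/install.sh | sh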

Manual install
Download and extract the package:
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz

Start Ollama:
ollama serve

In another terminal, verify that Ollama is running:
ollama -v
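
As an additional check, you can query the local API, which listens on port 11434 by default (a quick check, assuming the default bind address):
curl http://localhost:11434/api/version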

AMD GPU install
If you have an AMD GPU, also download and extract the additional ROCm package:
curl -L https://ollama.com/download/ollama-linux-amd64-rocm.tgz -o ollama-linux-amd64-rocm.tgz
sudo tar -C /usr -xzf ollama-linux-amd64-rocm.tgz

ARM64 install
Download and extract the ARM64-specific package:
curl -L https://ollama.com/download/ollama-linux-arm64.tgz -o ollama-linux-arm64.tgz
sudo tar -C /usr -xzf ollama-linux-arm64.tgz

Adding Ollama as a startup service (recommended)
Create a user and group for Ollama:
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
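
Note that the group change for your own user takes effect the next time you log in. To confirm that the ollama user was created and to check group membership, a quick check with standard tools:
id ollama
id $(whoami)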

Create a service file in /etc/systemd/system/ollama.service:
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"

[Install]
WantedBy=default.target

Then start the service:
sudo systemctl daemon-reload
sudo systemctl enable ollama
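
To confirm the service is registered to start at boot, you can ask systemd directly:
systemctl is-enabled ollama   # should print "enabled"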

Install CUDA drivers (optional)
Download and install CUDA.
Verify that the drivers are installed by running the following command, which should print details about your GPU:
nvidia-smi

Install AMD ROCm drivers (optional)
Download and install ROCm v6.
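
To verify that ROCm can see your GPU, you can run rocminfo (assuming it is on your PATH; it typically lives in /opt/rocm/bin):
rocminfo | grep -i "marketing name"   # should list your Radeon GPU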

Start Ollama
Start Ollama and verify it is running:
sudo systemctl start ollama
sudo systemctl status ollama

NOTE
While AMD has contributed the amdgpu driver upstream to the official Linux kernel source, the version there is older and may not support all ROCm features. We recommend installing the latest driver from https://www.amd.com/en/support/linux-drivers for the best support of your Radeon GPU.

Customizing
To customize the installation of Ollama, you can edit the systemd service file or set environment variables by running:
sudo systemctl edit ollama

Alternatively, create an override file manually in /etc/systemd/system/ollama.service.d/override.conf:
[Service]
Environment="OLLAMA_DEBUG=1"
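
Two commonly used variables are OLLAMA_HOST (the address the server binds to) and OLLAMA_MODELS (where models are stored). A sketch of an override setting both; the model path here is only an example, and the directory must be writable by the ollama user:
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_MODELS=/data/ollama/models"

After editing, reload systemd and restart the service so the changes take effect:
sudo systemctl daemon-reload
sudo systemctl restart ollama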

Updating
Update Ollama by running the install script again:
curl -fsSL https://ollama.com/install.sh | sh

Or by re-downloading Ollama:
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
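
If Ollama is running as a systemd service, restart it after replacing the binary so the new version is picked up:
sudo systemctl restart ollama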

Installing specific versions
Use the OLLAMA_VERSION environment variable with the install script to install a specific version of Ollama, including pre-releases. You can find version numbers on the releases page.
For example:
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.3.9 sh

Viewing logs
To view logs of Ollama running as a startup service, run:
journalctl -e -u ollama
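
To follow the logs live while reproducing an issue, use journalctl's -f flag to stream new entries as they arrive:
journalctl -u ollama -f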

Uninstall
Remove the ollama service:
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service

Remove the ollama binary from your bin directory (either /usr/local/bin, /usr/bin, or /bin):
sudo rm $(which ollama)

Remove the downloaded models and Ollama service user and group:
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama
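
Depending on which release you installed, the tarball may also have placed bundled libraries under /usr/lib/ollama; if that directory exists on your system, you can remove it as well (check before deleting):
ls /usr/lib/ollama && sudo rm -rf /usr/lib/ollama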