How to Run Ollama on Linux with AMD MI50
Tested Environment: Ubuntu 22.04
Quick Start Summary
# 1. Install ROCm driver (ROCm 5.7.1 for MI50)
sudo mkdir -p /etc/apt/keyrings
wget https://repo.radeon.com/rocm/rocm.gpg.key -O - | gpg --dearmor | sudo tee /etc/apt/keyrings/rocm.gpg > /dev/null
sudo tee /etc/apt/sources.list.d/amdgpu.list <<'EOF'
deb [arch=amd64 signed-by=/etc/apt/keyrings/rocm.gpg] https://repo.radeon.com/amdgpu/5.7.1/ubuntu jammy main
EOF
sudo tee /etc/apt/sources.list.d/rocm.list <<'EOF'
deb [arch=amd64 signed-by=/etc/apt/keyrings/rocm.gpg] https://repo.radeon.com/rocm/apt/5.7.1 jammy main
EOF
echo -e 'Package: *\nPin: release o=repo.radeon.com\nPin-Priority: 600' | sudo tee /etc/apt/preferences.d/rocm-pin-600
sudo apt update
sudo apt install amdgpu-dkms rocm-hip-libraries
sudo reboot
# 2. Install Ollama
curl -LO https://ollama.com/download/ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
# 3. Install ROCm package for Ollama
curl -L https://ollama.com/download/ollama-linux-amd64-rocm.tgz -o ollama-linux-amd64-rocm.tgz
sudo tar -C /usr -xzf ollama-linux-amd64-rocm.tgz
# 4. Run Ollama
ollama serve
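With the server running, you can pull and chat with a model from a second terminal. llama3.2 is only an example; any model from the Ollama library works:
# Pulls the model on first use, then opens an interactive prompt
ollama run llama3.2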
Install ROCm Driver
To run Ollama on Linux with an AMD MI50 GPU, you need to install the AMD graphics driver first.
I recommend Ubuntu 22.04: it is stable for AMD graphics drivers and local large language models, and it is the setup I tested successfully.
Official ROCm installation guide: 👉 ROCm Quick Start
1. Download and Convert the Package Signing Key
sudo mkdir --parents --mode=0755 /etc/apt/keyrings
wget https://repo.radeon.com/rocm/rocm.gpg.key -O - | \
gpg --dearmor | sudo tee /etc/apt/keyrings/rocm.gpg > /dev/null
2. Add the Repositories
sudo tee /etc/apt/sources.list.d/amdgpu.list <<'EOF'
deb [arch=amd64 signed-by=/etc/apt/keyrings/rocm.gpg] https://repo.radeon.com/amdgpu/5.7.1/ubuntu jammy main
EOF
sudo tee /etc/apt/sources.list.d/rocm.list <<'EOF'
deb [arch=amd64 signed-by=/etc/apt/keyrings/rocm.gpg] https://repo.radeon.com/rocm/apt/5.7.1 jammy main
EOF
echo -e 'Package: *\nPin: release o=repo.radeon.com\nPin-Priority: 600' \
| sudo tee /etc/apt/preferences.d/rocm-pin-600
3. Update the Package List and Install the Kernel Driver
sudo apt update
sudo apt install amdgpu-dkms
4. Install ROCm Runtimes
sudo apt install rocm-hip-libraries
5. Reboot the System
sudo reboot
Important: Always reboot after installing the ROCm driver.
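After rebooting, it is worth confirming that your user can access the GPU and that ROCm sees the card. The group additions below follow the standard ROCm post-install steps; rocminfo and rocm-smi may need to be installed separately (packages rocminfo and rocm-smi-lib in the same repository) and live under /opt/rocm/bin if they are not on your PATH:
# Allow your user to access the GPU device nodes (log out and back in afterwards)
sudo usermod -a -G render,video $LOGNAME
# The MI50 should show up as gfx906
rocminfo | grep -i gfx906
# Temperature, clocks, and VRAM usage
rocm-smi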
Why Choose ROCm 5.7?
- The AMD MI50 (gfx906) is an older GPU, and newer ROCm releases no longer support it.
- ROCm 6.4.x is not stable on this GPU.
- ROCm 5.7.1 is the last release with official, though deprecated, support.
Support Status:
The current ROCm release has limited support for this hardware. Existing features are maintained, but no new features or optimizations will be added. Support will be removed in a future release.
Install Ollama
Once the driver is installed, follow the official Ollama AMD GPU guide.
Manual Install
If upgrading from a previous version, remove old libraries first:
sudo rm -rf /usr/lib/ollama
Download and Install:
curl -LO https://ollama.com/download/ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
Start Ollama:
ollama serve
Verify Installation:
ollama -v
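You can also confirm the API server is reachable. Run this in a second terminal while `ollama serve` is running; the root endpoint answers with a short status message:
curl http://localhost:11434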
Install AMD GPU Support for Ollama
curl -L https://ollama.com/download/ollama-linux-amd64-rocm.tgz -o ollama-linux-amd64-rocm.tgz
sudo tar -C /usr -xzf ollama-linux-amd64-rocm.tgz
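To check that Ollama is actually using the MI50 rather than falling back to CPU, load a model and inspect the process list; `ollama ps` reports whether a loaded model sits in GPU or CPU memory. llama3.2 is only an example model:
ollama run llama3.2 "Hello"
ollama ps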
Add Ollama as a Systemd Service (Recommended)
Create a user and group:
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
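Optionally verify the service account and your group membership (the new group takes effect after you log out and back in):
id ollama
groups $(whoami)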
Create the service file /etc/systemd/system/ollama.service:
[Unit]
Description=Ollama Service
After=network-online.target
[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"
[Install]
WantedBy=multi-user.target
Enable and Start the Service:
sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama
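Then confirm the service is active:
systemctl status ollama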
Customizing Ollama Settings
You can add settings with:
sudo systemctl edit ollama
This opens (or you can create by hand) the override file /etc/systemd/system/ollama.service.d/override.conf.
Example (custom settings):
[Service]
Environment="OLLAMA_DEBUG=1"
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
Environment="OLLAMA_KEEP_ALIVE=24h"
Environment="OLLAMA_NUM_PARALLEL=2"
Environment="OLLAMA_MAX_LOADED_MODELS=2"
Key Parameters:
| Parameter | Description |
| --- | --- |
| OLLAMA_MODELS | Model storage directory. On Windows, avoid drive C: and use another drive. |
| OLLAMA_HOST | Address (and optional port) Ollama listens on; default 127.0.0.1:11434. Use 0.0.0.0 for LAN access, or 0.0.0.0:PORT to change the port. |
| OLLAMA_ORIGINS | Allowed HTTP origins. Use * for no restriction. |
| OLLAMA_KEEP_ALIVE | How long a model stays in memory (default 5m). 24h is convenient for faster reuse. |
| OLLAMA_NUM_PARALLEL | Number of requests each model handles in parallel (default 1). |
| OLLAMA_MAX_QUEUE | Request queue length (default 512). Requests beyond this limit are rejected. |
| OLLAMA_DEBUG | Set to 1 for debug logs. |
| OLLAMA_MAX_LOADED_MODELS | Maximum number of models loaded at once (default 1). |
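With OLLAMA_HOST=0.0.0.0, the API is reachable from other machines on the LAN. A quick test from another host; 192.168.1.10 and llama3.2 are placeholders for your server's address and a model you have already pulled:
curl http://192.168.1.10:11434/api/generate -d '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}'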
Updating Ollama
Re-run the install script:
curl -fsSL https://ollama.com/install.sh | sh
Or manually download and extract:
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
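If you removed /usr/lib/ollama before updating (as in the manual install section), re-extract the ROCm component as well, matching the install step above:
curl -L https://ollama.com/download/ollama-linux-amd64-rocm.tgz -o ollama-linux-amd64-rocm.tgz
sudo tar -C /usr -xzf ollama-linux-amd64-rocm.tgz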
Viewing Logs
journalctl -e -u ollama
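To follow the log live while testing, add -f:
journalctl -u ollama -f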
Uninstalling Ollama
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
sudo rm $(which ollama)
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama
sudo rm -rf /usr/lib/ollama