Introduce a new compose.yml to deploy Ollama with AMD GPU hardware acceleration on Docker Swarm.
- Uses the ROCm-enabled Ollama image.
- Mounts the necessary GPU devices (/dev/kfd and /dev/dri) with root permissions for hardware access.
- Sets HSA_OVERRIDE_GFX_VERSION to 10.3.0 to support Navi 21 (6900 XT) cards.
- Configures persistent storage on a local SSD and joins an external proxy network.
- Includes a deployment constraint to target nodes labeled as having a GPU.
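The points above can be sketched as a compose file. This is a minimal, hypothetical sketch, not the actual file from the repo: the image tag, the SSD mount path (/mnt/ssd/ollama), the network name (proxy), and the node label (gpu) are all assumptions.

```yaml
# Hypothetical sketch of the compose.yml described above.
services:
  ollama:
    image: ollama/ollama:rocm            # ROCm-enabled Ollama image
    user: root                           # root permissions for GPU device access
    devices:
      - /dev/kfd                         # AMD kernel compute interface
      - /dev/dri                         # direct rendering interface
    environment:
      HSA_OVERRIDE_GFX_VERSION: "10.3.0" # Navi 21 (6900 XT) support
    volumes:
      - /mnt/ssd/ollama:/root/.ollama    # persistent storage on local SSD (assumed path)
    networks:
      - proxy
    deploy:
      placement:
        constraints:
          - node.labels.gpu == true      # only schedule on GPU-labeled nodes (assumed label)

networks:
  proxy:
    external: true                       # pre-existing external proxy network
```

A node would be labeled for this constraint with something like `docker node update --label-add gpu=true <node-name>` before deploying the stack.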
Local Homelab Repository
This Git repository is exclusively for my Local Homelab setup.
Contents
This repository contains docker-compose.yml files for various services that I run in my homelab, including:
- Home Assistant: For smart home automation.
- Monitoring Tools: For keeping an eye on the health and performance of my systems.
- Teslamate: For logging data from my Tesla.