Add a new ComfyUI service to the Docker Compose configuration, specifically tailored for AMD GPUs using ROCm.
- Uses the rocm/pytorch base image for hardware acceleration
- Includes an initialization script that clones and installs ComfyUI on the container's first start
- Configures necessary hardware device mappings (/dev/dri, /dev/kfd) and environment variables for AMD compatibility
- Sets up persistent storage and network integration with the existing proxy and AI internal networks
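The service described above could look roughly like the following sketch. The service name, init-script path, volume name, network names (`proxy`, `ai-internal`), and the `HSA_OVERRIDE_GFX_VERSION` value are all assumptions for illustration, not the actual file contents:

```yaml
# Hypothetical sketch of the ComfyUI service for AMD/ROCm.
services:
  comfyui:
    image: rocm/pytorch:latest
    devices:
      - /dev/dri          # GPU render nodes
      - /dev/kfd          # ROCm kernel compute interface
    environment:
      # AMD compatibility override; exact value depends on the card (assumed here)
      - HSA_OVERRIDE_GFX_VERSION=10.3.0
    volumes:
      - comfyui-data:/workspace               # persistent storage (name assumed)
      - ./comfyui-init.sh:/comfyui-init.sh:ro # clones/installs ComfyUI if absent
    entrypoint: ["/bin/bash", "/comfyui-init.sh"]
    networks:
      - proxy
      - ai-internal

networks:
  proxy:
    external: true
  ai-internal:
    external: true

volumes:
  comfyui-data:
```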
Introduce a new compose.yml to deploy Ollama with AMD GPU hardware acceleration on Docker Swarm.
- Uses the ROCm-enabled Ollama image.
- Mounts necessary GPU devices (/dev/kfd and /dev/dri) with root permissions for hardware access.
- Sets HSA_OVERRIDE_GFX_VERSION to 10.3.0 to support Navi 21 (6900 XT) cards.
- Configures persistent storage on local SSD and integrates with an external proxy network.
- Includes deployment constraints to target nodes labeled with GPUs.
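A sketch of what this Swarm deployment might contain. Note that Swarm mode ignores the Compose `devices:` key, so the device nodes are bind-mounted and the container runs as root, matching the "root permissions" note above; the SSD path, node label, and network name are assumptions:

```yaml
# Hypothetical sketch of the Ollama service for Docker Swarm with ROCm.
services:
  ollama:
    image: ollama/ollama:rocm
    environment:
      # Navi 21 (RX 6900 XT) is gfx1030, hence the 10.3.0 override
      - HSA_OVERRIDE_GFX_VERSION=10.3.0
    user: root
    volumes:
      - /mnt/ssd/ollama:/root/.ollama   # persistent model storage (path assumed)
      - /dev/kfd:/dev/kfd               # bind-mounted because Swarm ignores `devices:`
      - /dev/dri:/dev/dri
    deploy:
      placement:
        constraints:
          - node.labels.gpu == true     # node label name assumed
    networks:
      - proxy

networks:
  proxy:
    external: true
```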
Add a new compose.yml file to deploy a headless Chromium instance using the browserless image. The cline-browser service is configured with automatic restarts and with the KeepAlive environment variable enabled to support AI-driven browser interactions.
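A minimal sketch of such a service, assuming the browserless v2 Chromium image and the conventional `KEEP_ALIVE` spelling of the variable; the image tag and port mapping are illustrative, not taken from the actual file:

```yaml
# Hypothetical sketch of the cline-browser service.
services:
  cline-browser:
    image: ghcr.io/browserless/chromium:latest  # image name/tag assumed
    restart: always                             # automatic restarts
    environment:
      - KEEP_ALIVE=true   # keeps browser sessions alive between connections
    ports:
      - "3000:3000"       # browserless default port (mapping assumed)
```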
Restructure the project by moving all application-specific docker-compose.yml
files into a new `Local Homelab/` parent directory.
This groups all local homelab service configurations under a single, clearly
named directory and improves overall project organization.
Affected files:
- `homeassistant/docker-compose.yml`
- `monitoring/docker-compose.yml`
- `teslamate/docker-compose.yml`
All are now located under `Local Homelab/<service>/docker-compose.yml`.