# Nomad Homelab Stacks
This repository contains HashiCorp Nomad job definitions for various services deployed in a homelab environment. It also includes a Gitea Actions workflow for continuous deployment of these stacks.
## Repository Structure
The `stacks/` directory contains subdirectories for different service categories, with each subdirectory holding one or more Nomad job files (`.nomad`). Each stack also has its own `README.md` with detailed information.
- [`stacks/ai/ai-backend/README.md`](stacks/ai/ai-backend/README.md): Documentation for the Ollama AI backend.
- [`stacks/ai/ai-frontend/README.md`](stacks/ai/ai-frontend/README.md): Documentation for the Open WebUI and LobeChat AI frontends.
- [`stacks/networking/newt/README.md`](stacks/networking/newt/README.md): Documentation for the Project Newt networking agent.
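For orientation, the layout looks roughly like this (a sketch inferred from the paths above; each stack directory holds its `.nomad` job file and `README.md`, and may contain additional files):

```
.
├── .gitea/
│   └── workflows/
│       └── deploy.yaml
└── stacks/
    ├── ai/
    │   ├── ai-backend/      # ai-backend.nomad, README.md (Ollama backend)
    │   └── ai-frontend/     # ai-frontend.nomad, README.md (Open WebUI, LobeChat)
    └── networking/
        └── newt/            # newt.nomad, README.md (Project Newt agent)
```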
## Gitea Actions Deployment Workflow
The `.gitea/workflows/deploy.yaml` file defines a Gitea Actions workflow to automate the deployment of Nomad jobs from this repository.
### Workflow: `Deploy to Nomad`
- **Trigger**: Manually dispatched via `workflow_dispatch`.
- **Inputs**: Requires `stack_name`, a choice of `ai-backend`, `ai-frontend`, or `newt`. This corresponds to the name of the Nomad job file (without the `.nomad` extension).
- **Jobs**:
  - **`deploy`**: Runs on `ubuntu-latest`.
    1. **Checkout**: Clones the repository.
    2. **Install Nomad CLI (Universal)**: Detects the runner architecture (amd64 or arm64), installs `unzip` and `curl`, then downloads and installs Nomad CLI 1.9.2.
    3. **Run Deploy**:
       - Sets the `NOMAD_ADDR` environment variable to `http://192.168.1.133:4646`.
       - Finds the specified `STACK.nomad` file within the `stacks/` directory (including subfolders).
       - Executes `nomad job run <FILE_PATH>` to deploy the selected Nomad job (a rough shell equivalent of steps 2 and 3 is sketched below).
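In shell terms, steps 2 and 3 boil down to roughly the following (an illustrative sketch, not a copy of `deploy.yaml`; the real workflow may detect the architecture and install the CLI differently):

```sh
# Install prerequisites and the Nomad CLI (the workflow pins Nomad 1.9.2)
sudo apt-get update && sudo apt-get install -y unzip curl
ARCH=$(dpkg --print-architecture)   # amd64 or arm64; one way to detect the architecture
curl -fsSL -o nomad.zip \
  "https://releases.hashicorp.com/nomad/1.9.2/nomad_1.9.2_linux_${ARCH}.zip"
unzip -o nomad.zip
sudo mv nomad /usr/local/bin/

# Deploy the selected stack (STACK_NAME comes from the workflow input)
export NOMAD_ADDR=http://192.168.1.133:4646
STACK_NAME=ai-backend                                   # example value
FILE=$(find stacks/ -name "${STACK_NAME}.nomad" | head -n 1)
nomad job run "$FILE"
```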
### How to Use the Gitea Workflow
1. Navigate to the "Actions" tab in your Gitea repository.
2. Select the "Deploy to Nomad" workflow.
3. Click "Run workflow" and choose the `stack_name` you wish to deploy (e.g., `ai-backend`, `ai-frontend`, or `newt`).
4. Confirm the deployment. The workflow will automatically install the Nomad CLI, locate the job file, and deploy it to your configured Nomad server.
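From a workstation that can reach the Nomad server, you can also check on (or manually reproduce) a deployment with the Nomad CLI. The commands below are illustrative and assume the job name inside the `.nomad` file matches the stack name:

```sh
# Point the CLI at the same Nomad server the workflow uses
export NOMAD_ADDR=http://192.168.1.133:4646

nomad job status               # list all jobs and their current status
nomad job status ai-backend    # details for one job (name assumed to match the stack)

# Manual equivalent of the workflow's deploy step
# (path assumes the job file sits next to its README)
nomad job run stacks/ai/ai-backend/ai-backend.nomad
```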
This workflow provides a consistent, automated way to deploy and update services in your homelab environment.
## Projects Involved (Overview)
- **[HashiCorp Nomad](https://www.nomadproject.io/)**: A workload orchestrator.
- **[Gitea Actions](https://docs.gitea.io/en-us/actions/)**: A CI/CD solution integrated with Gitea.
- **[Podman](https://podman.io/)**: A daemonless container engine (used by most Nomad jobs in this repo).
- **[Traefik](https://traefik.io/traefik/)**: An open-source Edge Router (used for some services in this repo).
- **[HashiCorp Consul](https://www.consul.io/)**: A service mesh solution (used for service discovery).
- **[Ollama](https://ollama.com/)**: A tool to run large language models locally.
- **[Open WebUI](https://docs.openwebui.com/)**: A user-friendly, open-source web interface for LLMs.
- **[LobeChat](https://github.com/lobehub/lobe-chat)**: An open-source, high-performance, extensible LLM chatbot framework.
- **[Project Newt](https://github.com/fosrl/newt)**: A project for secure and resilient overlay networking.