# AdvisoryAI packaging (AIAI-31-008)

Artifacts delivered for on-prem / air-gapped deployment:

- `ops/advisory-ai/Dockerfile` builds the WebService and Worker images (multi-role via the `PROJECT`/`APP_DLL` build args).
- `ops/advisory-ai/docker-compose.advisoryai.yaml` runs WebService + Worker with a shared data volume; ships the remote inference toggle envs.
- `ops/advisory-ai/helm/` provides a minimal chart (web + worker) with storage mounts, an optional PVC, and remote inference settings.

## Build images

```bash
# WebService
docker build -f ops/advisory-ai/Dockerfile \
  -t stellaops-advisoryai-web:dev \
  --build-arg PROJECT=src/AdvisoryAI/StellaOps.AdvisoryAI.WebService/StellaOps.AdvisoryAI.WebService.csproj \
  --build-arg APP_DLL=StellaOps.AdvisoryAI.WebService.dll .

# Worker
docker build -f ops/advisory-ai/Dockerfile \
  -t stellaops-advisoryai-worker:dev \
  --build-arg PROJECT=src/AdvisoryAI/StellaOps.AdvisoryAI.Worker/StellaOps.AdvisoryAI.Worker.csproj \
  --build-arg APP_DLL=StellaOps.AdvisoryAI.Worker.dll .
```

## Local/offline compose

```bash
cd ops/advisory-ai
docker compose -f docker-compose.advisoryai.yaml up -d --build
```

- Set `ADVISORYAI__INFERENCE__MODE=Remote` plus `ADVISORYAI__INFERENCE__REMOTE__BASEADDRESS`/`APIKEY` to offload inference; a compose override sketch appears at the end of this document.
- The default mode is Local (offline-friendly). Queue, cache, and output data live under `/app/data` (bound to the `advisoryai-data` volume).

## Helm (cluster)

```bash
helm upgrade --install advisoryai ops/advisory-ai/helm \
  --set image.repository=stellaops-advisoryai-web \
  --set image.tag=dev \
  --set inference.mode=Local
```

- Enable remote inference: `--set inference.mode=Remote --set inference.remote.baseAddress=https://inference.your.domain --set inference.remote.apiKey=`.
- Enable persistence: `--set storage.persistence.enabled=true --set storage.persistence.size=10Gi`, or reuse a claim with `--set storage.persistence.existingClaim=`.
- Worker replicas: `--set worker.replicas=2` (or `--set worker.enabled=false` to run the WebService only).
- A combined values-file sketch for air-gapped installs appears at the end of this document.

## Operational notes

- Data paths (`/app/data/plans`, `/app/data/queue`, `/app/data/outputs`) are configurable via env and are pre-created at startup.
- Guardrail phrases or policy knobs can be mounted under `/app/etc`; point `ADVISORYAI__GUARDRAILS__PHRASESLIST` at the mounted file (a compose mount sketch appears at the end of this document).
- Observability follows standard ASP.NET JSON logging; add OTEL exporters via the `OTEL_EXPORTER_OTLP_ENDPOINT` env when allowed. Keep this disabled in sealed/offline deployments.
- For air-gapped clusters, publish the built images to your registry and reference them via `--set image.repository=/stellaops/advisoryai-web`.
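
## Configuration sketches (examples)

To keep the remote inference toggle out of the shipped compose file, the envs can be layered in an override file. This is a minimal sketch, assuming the services are named `advisoryai-web` and `advisoryai-worker` (check `docker-compose.advisoryai.yaml` for the actual names); only the `ADVISORYAI__INFERENCE__*` env names come from this document.

```yaml
# docker-compose.override.yaml (sketch) — service names are assumptions,
# adjust them to match docker-compose.advisoryai.yaml.
services:
  advisoryai-web:
    environment:
      ADVISORYAI__INFERENCE__MODE: "Remote"
      ADVISORYAI__INFERENCE__REMOTE__BASEADDRESS: "https://inference.your.domain"
      ADVISORYAI__INFERENCE__REMOTE__APIKEY: "${ADVISORYAI_REMOTE_API_KEY}"  # supplied via .env or your secret tooling
  advisoryai-worker:
    environment:
      ADVISORYAI__INFERENCE__MODE: "Remote"
      ADVISORYAI__INFERENCE__REMOTE__BASEADDRESS: "https://inference.your.domain"
      ADVISORYAI__INFERENCE__REMOTE__APIKEY: "${ADVISORYAI_REMOTE_API_KEY}"
```

Apply it alongside the shipped file: `docker compose -f docker-compose.advisoryai.yaml -f docker-compose.override.yaml up -d`.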
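
For cluster installs it is usually easier to collect the documented `--set` flags into a values file. This sketch only combines the chart values listed above; the registry host is a placeholder you must replace.

```yaml
# values.airgap.yaml (sketch) — same keys as the documented --set flags.
image:
  repository: registry.internal.example/stellaops/advisoryai-web  # your mirrored image
  tag: dev
inference:
  mode: Local              # keep Local for sealed/offline clusters
storage:
  persistence:
    enabled: true
    size: 10Gi
    # existingClaim: <claim>   # alternatively reuse a pre-provisioned PVC
worker:
  replicas: 2
```

Install with: `helm upgrade --install advisoryai ops/advisory-ai/helm -f values.airgap.yaml`.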
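
Guardrail phrase lists can be provided the same way. This is a sketch, assuming a compose deployment, a service named `advisoryai-web`, and a plain-text file `guardrail-phrases.txt`; only the `/app/etc` mount point and the `ADVISORYAI__GUARDRAILS__PHRASESLIST` env come from this document.

```yaml
# Sketch: mount a guardrail phrase list read-only and point the service at it.
services:
  advisoryai-web:
    volumes:
      - ./etc/guardrail-phrases.txt:/app/etc/guardrail-phrases.txt:ro
    environment:
      ADVISORYAI__GUARDRAILS__PHRASESLIST: "/app/etc/guardrail-phrases.txt"
```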