- Introduced a comprehensive deployment guide for AdvisoryAI, detailing local builds, remote inference toggles, and scaling guidance.
- Created a multi-role Dockerfile for building WebService and Worker images.
- Added a docker-compose file for local and offline deployment.
- Implemented a Helm chart for Kubernetes deployment with persistence and remote inference options.
- Established a new API endpoint `/advisories/summary` for deterministic summaries of observations and linksets.
- Introduced a JSON schema for risk profiles and a validator to ensure compliance with the schema.
- Added unit tests for the risk profile validator to ensure functionality and error handling.
# AdvisoryAI packaging (AIAI-31-008)
Artifacts delivered for on-prem / air-gapped deployment:
- `ops/advisory-ai/Dockerfile` builds WebService and Worker images (multi-role via `PROJECT`/`APP_DLL` args).
- `ops/advisory-ai/docker-compose.advisoryai.yaml` runs WebService + Worker with a shared data volume; ships remote inference toggle envs.
- `ops/advisory-ai/helm/` provides a minimal chart (web + worker) with storage mounts, optional PVC, and remote inference settings.
## Build images
```shell
# WebService
docker build -f ops/advisory-ai/Dockerfile \
  -t stellaops-advisoryai-web:dev \
  --build-arg PROJECT=src/AdvisoryAI/StellaOps.AdvisoryAI.WebService/StellaOps.AdvisoryAI.WebService.csproj \
  --build-arg APP_DLL=StellaOps.AdvisoryAI.WebService.dll .

# Worker
docker build -f ops/advisory-ai/Dockerfile \
  -t stellaops-advisoryai-worker:dev \
  --build-arg PROJECT=src/AdvisoryAI/StellaOps.AdvisoryAI.Worker/StellaOps.AdvisoryAI.Worker.csproj \
  --build-arg APP_DLL=StellaOps.AdvisoryAI.Worker.dll .
```
## Local/offline compose
```shell
cd ops/advisory-ai
docker compose -f docker-compose.advisoryai.yaml up -d --build
```
- Set `ADVISORYAI__INFERENCE__MODE=Remote` plus `ADVISORYAI__INFERENCE__REMOTE__BASEADDRESS`/`APIKEY` to offload inference.
- Default mode is `Local` (offline-friendly). Queue/cache/output live under `/app/data` (binds to the `advisoryai-data` volume).
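As a sketch, the remote toggle can live in a compose override file so the base file stays offline-friendly. The override file name and the service names below are assumptions; match them to the services defined in `docker-compose.advisoryai.yaml`:

```yaml
# docker-compose.remote.yaml — hypothetical override enabling remote inference.
# Service names are assumptions; align them with docker-compose.advisoryai.yaml.
services:
  advisoryai-web:
    environment:
      ADVISORYAI__INFERENCE__MODE: "Remote"
      ADVISORYAI__INFERENCE__REMOTE__BASEADDRESS: "https://inference.your.domain"
      ADVISORYAI__INFERENCE__REMOTE__APIKEY: "${ADVISORYAI_REMOTE_APIKEY}"
  advisoryai-worker:
    environment:
      ADVISORYAI__INFERENCE__MODE: "Remote"
      ADVISORYAI__INFERENCE__REMOTE__BASEADDRESS: "https://inference.your.domain"
      ADVISORYAI__INFERENCE__REMOTE__APIKEY: "${ADVISORYAI_REMOTE_APIKEY}"
```

Apply it with `docker compose -f docker-compose.advisoryai.yaml -f docker-compose.remote.yaml up -d`; later `-f` files override matching keys in earlier ones.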
## Helm (cluster)
```shell
helm upgrade --install advisoryai ops/advisory-ai/helm \
  --set image.repository=stellaops-advisoryai-web \
  --set image.tag=dev \
  --set inference.mode=Local
```
- Enable remote inference: `--set inference.mode=Remote --set inference.remote.baseAddress=https://inference.your.domain --set inference.remote.apiKey=<token>`.
- Enable persistence: `--set storage.persistence.enabled=true --set storage.persistence.size=10Gi` or `--set storage.persistence.existingClaim=<pvc>`.
- Worker replicas: `--set worker.replicas=2` (or `--set worker.enabled=false` to run WebService only).
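For repeatable deploys, the `--set` flags above can be collected into a values file instead. This is a sketch only: the key paths mirror the flags shown, and anything beyond them (the file name, defaults) is an assumption against the chart in `ops/advisory-ai/helm/`:

```yaml
# values-prod.yaml — hypothetical values file mirroring the --set flags above.
image:
  repository: stellaops-advisoryai-web
  tag: dev
inference:
  mode: Remote
  remote:
    baseAddress: https://inference.your.domain
    apiKey: <token>          # prefer a Secret over a plain value in real clusters
storage:
  persistence:
    enabled: true
    size: 10Gi
worker:
  enabled: true
  replicas: 2
```

Install with `helm upgrade --install advisoryai ops/advisory-ai/helm -f values-prod.yaml`; `--set` flags still win over file values if you combine both.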
## Operational notes
- Data paths (`/app/data/plans`, `/app/data/queue`, `/app/data/outputs`) are configurable via env and pre-created at startup.
- Guardrail phrases or policy knobs can be mounted under `/app/etc`; point `ADVISORYAI__GUARDRAILS__PHRASESLIST` to the mounted file.
- Observability follows standard ASP.NET JSON logs; add OTEL exporters via the `OTEL_EXPORTER_OTLP_ENDPOINT` env when allowed. Keep it disabled in sealed/offline deployments.
- For air-gapped clusters, publish built images to your registry and reference them via `--set image.repository=<registry>/stellaops/advisoryai-web`.
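For the air-gapped publish step, mirroring a locally built image into a private registry is the standard `docker tag`/`docker push` pair; the registry host below is a placeholder, not part of this delivery:

```shell
# Retag the locally built image for the private registry (host is a placeholder).
docker tag stellaops-advisoryai-web:dev registry.internal:5000/stellaops/advisoryai-web:dev

# Push from a host that can reach the registry. In fully sealed sites, use
# `docker save -o advisoryai-web.tar ...` and `docker load -i ...` via
# removable media instead of a network push.
docker push registry.internal:5000/stellaops/advisoryai-web:dev
```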