feat(advisory-ai): Add deployment guide, Dockerfile, and Helm chart for on-prem packaging
Some checks failed
Docs CI / lint-and-preview (push) Has been cancelled
- Introduced a comprehensive deployment guide for AdvisoryAI, detailing local builds, remote inference toggles, and scaling guidance.
- Created a multi-role Dockerfile for building WebService and Worker images.
- Added a docker-compose file for local and offline deployment.
- Implemented a Helm chart for Kubernetes deployment with persistence and remote inference options.
- Established a new API endpoint `/advisories/summary` for deterministic summaries of observations and linksets.
- Introduced a JSON schema for risk profiles and a validator to ensure compliance with the schema.
- Added unit tests for the risk profile validator covering functionality and error handling.
47
ops/advisory-ai/Dockerfile
Normal file
@@ -0,0 +1,47 @@
# syntax=docker/dockerfile:1.7-labs

# StellaOps AdvisoryAI – multi-role container build.
# Build arg PROJECT selects WebService or Worker; defaults to WebService.
# Example builds:
#   docker build -f ops/advisory-ai/Dockerfile -t stellaops-advisoryai-web \
#     --build-arg PROJECT=src/AdvisoryAI/StellaOps.AdvisoryAI.WebService/StellaOps.AdvisoryAI.WebService.csproj \
#     --build-arg APP_DLL=StellaOps.AdvisoryAI.WebService.dll .
#   docker build -f ops/advisory-ai/Dockerfile -t stellaops-advisoryai-worker \
#     --build-arg PROJECT=src/AdvisoryAI/StellaOps.AdvisoryAI.Worker/StellaOps.AdvisoryAI.Worker.csproj \
#     --build-arg APP_DLL=StellaOps.AdvisoryAI.Worker.dll .

ARG SDK_IMAGE=mcr.microsoft.com/dotnet/nightly/sdk:10.0
ARG RUNTIME_IMAGE=gcr.io/distroless/dotnet/aspnet:latest
ARG PROJECT=src/AdvisoryAI/StellaOps.AdvisoryAI.WebService/StellaOps.AdvisoryAI.WebService.csproj
ARG APP_DLL=StellaOps.AdvisoryAI.WebService.dll

FROM ${SDK_IMAGE} AS build
# ARGs declared before the first FROM are out of scope inside a stage;
# redeclare them here so they inherit the global defaults.
ARG PROJECT
ARG APP_DLL
WORKDIR /src

COPY . .

# Restore only the AdvisoryAI graph to keep the build smaller.
RUN dotnet restore ${PROJECT}

RUN dotnet publish ${PROJECT} \
    -c Release \
    -o /app/publish \
    /p:UseAppHost=false

# Exec-form ENTRYPOINT cannot expand ${APP_DLL} (and distroless ships no shell),
# so alias the entry assembly and its host files to a stable name at build time.
RUN base="${APP_DLL%.dll}" \
 && cp "/app/publish/${base}.dll" /app/publish/app.dll \
 && cp "/app/publish/${base}.deps.json" /app/publish/app.deps.json \
 && cp "/app/publish/${base}.runtimeconfig.json" /app/publish/app.runtimeconfig.json

FROM ${RUNTIME_IMAGE} AS runtime
WORKDIR /app

ENV ASPNETCORE_URLS=http://0.0.0.0:8080 \
    DOTNET_SYSTEM_GLOBALIZATION_INVARIANT=true \
    ADVISORYAI__STORAGE__PLANCACHEDIRECTORY=/app/data/plans \
    ADVISORYAI__STORAGE__OUTPUTDIRECTORY=/app/data/outputs \
    ADVISORYAI__QUEUE__DIRECTORYPATH=/app/data/queue \
    ADVISORYAI__INFERENCE__MODE=Local

COPY --from=build /app/publish ./

# Writable mount for queue/cache/output; guardrail files can also be mounted under /app/etc.
VOLUME ["/app/data", "/app/etc"]

EXPOSE 8080
ENTRYPOINT ["dotnet", "app.dll"]
47
ops/advisory-ai/README.md
Normal file
@@ -0,0 +1,47 @@
# AdvisoryAI packaging (AIAI-31-008)

Artifacts delivered for on-prem / air-gapped deployment:

- `ops/advisory-ai/Dockerfile` builds WebService and Worker images (multi-role via `PROJECT`/`APP_DLL` build args).
- `ops/advisory-ai/docker-compose.advisoryai.yaml` runs WebService + Worker with a shared data volume and ships the remote-inference toggle envs.
- `ops/advisory-ai/helm/` provides a minimal chart (web + worker) with storage mounts, an optional PVC, and remote-inference settings.

## Build images

```bash
# WebService
docker build -f ops/advisory-ai/Dockerfile \
  -t stellaops-advisoryai-web:dev \
  --build-arg PROJECT=src/AdvisoryAI/StellaOps.AdvisoryAI.WebService/StellaOps.AdvisoryAI.WebService.csproj \
  --build-arg APP_DLL=StellaOps.AdvisoryAI.WebService.dll .

# Worker
docker build -f ops/advisory-ai/Dockerfile \
  -t stellaops-advisoryai-worker:dev \
  --build-arg PROJECT=src/AdvisoryAI/StellaOps.AdvisoryAI.Worker/StellaOps.AdvisoryAI.Worker.csproj \
  --build-arg APP_DLL=StellaOps.AdvisoryAI.Worker.dll .
```

## Local/offline compose

```bash
cd ops/advisory-ai
docker compose -f docker-compose.advisoryai.yaml up -d --build
```

- Set `ADVISORYAI__INFERENCE__MODE=Remote` plus `ADVISORYAI__INFERENCE__REMOTE__BASEADDRESS`/`APIKEY` to offload inference.
- The default mode is Local (offline-friendly). Queue/cache/output live under `/app/data`, backed by the named `advisoryai-data` volume.
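If you prefer not to edit the shipped compose file, the remote toggle can also be applied through a second compose file that overrides only the environment. This is a sketch: the override file name, hostname, and API key below are placeholders, not shipped defaults.

```shell
# Hypothetical override file; compose merges it when passed as a second -f file.
mkdir -p ops/advisory-ai   # already present in a repo checkout; -p makes this a no-op
cat > ops/advisory-ai/remote-inference.override.yaml <<'EOF'
services:
  advisoryai-web:
    environment:
      ADVISORYAI__INFERENCE__MODE: "Remote"
      ADVISORYAI__INFERENCE__REMOTE__BASEADDRESS: "https://inference.example.com"
      ADVISORYAI__INFERENCE__REMOTE__APIKEY: "set-me"
  advisoryai-worker:
    environment:
      ADVISORYAI__INFERENCE__MODE: "Remote"
EOF

# Then bring the stack up with both files (run from ops/advisory-ai):
# docker compose -f docker-compose.advisoryai.yaml -f remote-inference.override.yaml up -d
```

Later `-f` files win on conflicting keys, so the base file stays untouched and reverting to Local mode is just dropping the override.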
## Helm (cluster)

```bash
helm upgrade --install advisoryai ops/advisory-ai/helm \
  --set image.repository=stellaops-advisoryai-web \
  --set image.tag=dev \
  --set inference.mode=Local
```

- Enable remote inference: `--set inference.mode=Remote --set inference.remote.baseAddress=https://inference.your.domain --set inference.remote.apiKey=<token>`.
- Enable persistence: `--set storage.persistence.enabled=true --set storage.persistence.size=10Gi`, or reuse a claim with `--set storage.persistence.existingClaim=<pvc>`.
- Worker replicas: `--set worker.replicas=2` (or `--set worker.enabled=false` to run the WebService only).
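For anything beyond a couple of overrides, the `--set` flags above can live in a values file instead. A sketch, using only keys that exist in the chart's `values.yaml` (the host and token are placeholders):

```shell
# Hypothetical values file mirroring the chart's values.yaml keys.
cat > advisoryai-values.yaml <<'EOF'
inference:
  mode: Remote
  remote:
    baseAddress: https://inference.example.com
    apiKey: set-me
storage:
  persistence:
    enabled: true
    size: 10Gi
worker:
  replicas: 2
EOF

# Apply it instead of repeating --set flags:
# helm upgrade --install advisoryai ops/advisory-ai/helm -f advisoryai-values.yaml
```

A checked-in values file also makes upgrades reproducible, which matters for air-gapped clusters where releases are applied from a bundle.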
## Operational notes

- Data paths (`/app/data/plans`, `/app/data/queue`, `/app/data/outputs`) are configurable via env and pre-created at startup.
- Guardrail phrases and policy knobs can be mounted under `/app/etc`; point `ADVISORYAI__GUARDRAILS__PHRASESLIST` at the mounted file.
- Observability follows standard ASP.NET Core JSON logging; add OTEL exporters via the `OTEL_EXPORTER_OTLP_ENDPOINT` env when allowed, and keep them disabled in sealed/offline deployments.
- For air-gapped clusters, push the built images to your registry and reference them via `--set image.repository=<registry>/stellaops/advisoryai-web`.
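The guardrail mount described above can be staged from the compose directory, which mounts `./etc` read-only at `/app/etc`. The file name and phrases here are illustrative, not shipped defaults:

```shell
# Stage a guardrail phrase list for the compose deployment (illustrative content).
mkdir -p ops/advisory-ai/etc/guardrails   # ./etc is mounted read-only at /app/etc
printf '%s\n' \
  'ignore previous instructions' \
  'disregard the system prompt' \
  > ops/advisory-ai/etc/guardrails/phrases.txt

# Then point the service at it, e.g. by uncommenting in docker-compose.advisoryai.yaml:
#   ADVISORYAI__GUARDRAILS__PHRASESLIST: "/app/etc/guardrails/phrases.txt"
```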
55
ops/advisory-ai/docker-compose.advisoryai.yaml
Normal file
@@ -0,0 +1,55 @@
version: "3.9"

# Local/offline deployment for AdvisoryAI WebService + Worker.
services:
  advisoryai-web:
    build:
      context: ../..
      dockerfile: ops/advisory-ai/Dockerfile
      args:
        PROJECT: src/AdvisoryAI/StellaOps.AdvisoryAI.WebService/StellaOps.AdvisoryAI.WebService.csproj
        APP_DLL: StellaOps.AdvisoryAI.WebService.dll
    image: stellaops-advisoryai-web:dev
    depends_on:
      - advisoryai-worker
    environment:
      ASPNETCORE_URLS: "http://0.0.0.0:8080"
      ADVISORYAI__QUEUE__DIRECTORYPATH: "/app/data/queue"
      ADVISORYAI__STORAGE__PLANCACHEDIRECTORY: "/app/data/plans"
      ADVISORYAI__STORAGE__OUTPUTDIRECTORY: "/app/data/outputs"
      ADVISORYAI__INFERENCE__MODE: "Local" # switch to Remote to call an external inference host
      # ADVISORYAI__INFERENCE__REMOTE__BASEADDRESS: "https://inference.example.com"
      # ADVISORYAI__INFERENCE__REMOTE__ENDPOINT: "/v1/inference"
      # ADVISORYAI__INFERENCE__REMOTE__APIKEY: "set-me"
      # ADVISORYAI__INFERENCE__REMOTE__TIMEOUT: "00:00:30"
      # Example SBOM context feed; optional.
      # ADVISORYAI__SBOMBASEADDRESS: "https://sbom.local/v1/sbom/context"
      # ADVISORYAI__SBOMTENANT: "tenant-a"
      # ADVISORYAI__GUARDRAILS__PHRASESLIST: "/app/etc/guardrails/phrases.txt"
    volumes:
      - advisoryai-data:/app/data
      - ./etc:/app/etc:ro
    ports:
      - "7071:8080"
    restart: unless-stopped

  advisoryai-worker:
    build:
      context: ../..
      dockerfile: ops/advisory-ai/Dockerfile
      args:
        PROJECT: src/AdvisoryAI/StellaOps.AdvisoryAI.Worker/StellaOps.AdvisoryAI.Worker.csproj
        APP_DLL: StellaOps.AdvisoryAI.Worker.dll
    image: stellaops-advisoryai-worker:dev
    environment:
      ADVISORYAI__QUEUE__DIRECTORYPATH: "/app/data/queue"
      ADVISORYAI__STORAGE__PLANCACHEDIRECTORY: "/app/data/plans"
      ADVISORYAI__STORAGE__OUTPUTDIRECTORY: "/app/data/outputs"
      ADVISORYAI__INFERENCE__MODE: "Local"
    volumes:
      - advisoryai-data:/app/data
      - ./etc:/app/etc:ro
    restart: unless-stopped

volumes:
  advisoryai-data:
0
ops/advisory-ai/etc/.gitkeep
Normal file
6
ops/advisory-ai/helm/Chart.yaml
Normal file
@@ -0,0 +1,6 @@
apiVersion: v2
name: stellaops-advisoryai
version: 0.1.0
appVersion: "0.1.0"
description: AdvisoryAI WebService + Worker packaging for on-prem/air-gapped installs.
type: application
12
ops/advisory-ai/helm/templates/_helpers.tpl
Normal file
@@ -0,0 +1,12 @@
{{- define "stellaops-advisoryai.name" -}}
{{- default .Chart.Name .Values.nameOverride | trunc 63 | trimSuffix "-" -}}
{{- end -}}

{{- define "stellaops-advisoryai.fullname" -}}
{{- if .Values.fullnameOverride -}}
{{- .Values.fullnameOverride | trunc 63 | trimSuffix "-" -}}
{{- else -}}
{{- $name := default .Chart.Name .Values.nameOverride -}}
{{- printf "%s-%s" .Release.Name $name | trunc 63 | trimSuffix "-" -}}
{{- end -}}
{{- end -}}
71
ops/advisory-ai/helm/templates/deployment.yaml
Normal file
@@ -0,0 +1,71 @@
apiVersion: apps/v1
kind: Deployment
metadata:
  name: {{ include "stellaops-advisoryai.fullname" . }}
  labels:
    app.kubernetes.io/name: {{ include "stellaops-advisoryai.name" . }}
    app.kubernetes.io/instance: {{ .Release.Name }}
    app.kubernetes.io/version: {{ .Chart.AppVersion }}
spec:
  replicas: 1
  selector:
    matchLabels:
      app.kubernetes.io/name: {{ include "stellaops-advisoryai.name" . }}
      app.kubernetes.io/instance: {{ .Release.Name }}
  template:
    metadata:
      labels:
        app.kubernetes.io/name: {{ include "stellaops-advisoryai.name" . }}
        app.kubernetes.io/instance: {{ .Release.Name }}
    spec:
      containers:
        - name: web
          image: "{{ .Values.image.repository }}:{{ .Values.image.tag }}"
          imagePullPolicy: {{ .Values.image.pullPolicy }}
          env:
            - name: ASPNETCORE_URLS
              value: "http://0.0.0.0:{{ .Values.service.port }}"
            - name: ADVISORYAI__INFERENCE__MODE
              value: {{ .Values.inference.mode | quote }}
            - name: ADVISORYAI__INFERENCE__REMOTE__BASEADDRESS
              value: {{ .Values.inference.remote.baseAddress | quote }}
            - name: ADVISORYAI__INFERENCE__REMOTE__ENDPOINT
              value: {{ .Values.inference.remote.endpoint | quote }}
            - name: ADVISORYAI__INFERENCE__REMOTE__APIKEY
              value: {{ .Values.inference.remote.apiKey | quote }}
            - name: ADVISORYAI__INFERENCE__REMOTE__TIMEOUT
              value: {{ printf "00:00:%02d" (int .Values.inference.remote.timeoutSeconds) | quote }}
            - name: ADVISORYAI__STORAGE__PLANCACHEDIRECTORY
              value: {{ .Values.storage.planCachePath | quote }}
            - name: ADVISORYAI__STORAGE__OUTPUTDIRECTORY
              value: {{ .Values.storage.outputPath | quote }}
            - name: ADVISORYAI__QUEUE__DIRECTORYPATH
              value: {{ .Values.storage.queuePath | quote }}
            {{- range .Values.extraEnv }}
            - name: {{ .name }}
              value: {{ .value | quote }}
            {{- end }}
          {{- if .Values.extraEnvFrom }}
          envFrom:
            {{- range .Values.extraEnvFrom }}
            - secretRef:
                name: {{ . }}
            {{- end }}
          {{- end }}
          ports:
            - containerPort: {{ .Values.service.port }}
          volumeMounts:
            - name: advisoryai-data
              mountPath: /app/data
          resources: {{- toYaml .Values.resources | nindent 12 }}
      volumes:
        - name: advisoryai-data
          {{- if .Values.storage.persistence.enabled }}
          persistentVolumeClaim:
            claimName: {{ .Values.storage.persistence.existingClaim | default (printf "%s-data" (include "stellaops-advisoryai.fullname" .)) }}
          {{- else }}
          emptyDir: {}
          {{- end }}
      nodeSelector: {{- toYaml .Values.nodeSelector | nindent 8 }}
      tolerations: {{- toYaml .Values.tolerations | nindent 8 }}
      affinity: {{- toYaml .Values.affinity | nindent 8 }}
15
ops/advisory-ai/helm/templates/pvc.yaml
Normal file
@@ -0,0 +1,15 @@
{{- if and .Values.storage.persistence.enabled (not .Values.storage.persistence.existingClaim) }}
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: {{ printf "%s-data" (include "stellaops-advisoryai.fullname" .) }}
  labels:
    app.kubernetes.io/name: {{ include "stellaops-advisoryai.name" . }}
    app.kubernetes.io/instance: {{ .Release.Name }}
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: {{ .Values.storage.persistence.size }}
{{- end }}
17
ops/advisory-ai/helm/templates/service.yaml
Normal file
@@ -0,0 +1,17 @@
apiVersion: v1
kind: Service
metadata:
  name: {{ include "stellaops-advisoryai.fullname" . }}
  labels:
    app.kubernetes.io/name: {{ include "stellaops-advisoryai.name" . }}
    app.kubernetes.io/instance: {{ .Release.Name }}
spec:
  type: {{ .Values.service.type }}
  selector:
    app.kubernetes.io/name: {{ include "stellaops-advisoryai.name" . }}
    app.kubernetes.io/instance: {{ .Release.Name }}
  ports:
    - name: http
      port: {{ .Values.service.port }}
      targetPort: {{ .Values.service.port }}
      protocol: TCP
66
ops/advisory-ai/helm/templates/worker.yaml
Normal file
@@ -0,0 +1,66 @@
{{- if .Values.worker.enabled }}
apiVersion: apps/v1
kind: Deployment
metadata:
  name: {{ include "stellaops-advisoryai.fullname" . }}-worker
  labels:
    app.kubernetes.io/name: {{ include "stellaops-advisoryai.name" . }}-worker
    app.kubernetes.io/instance: {{ .Release.Name }}
spec:
  replicas: {{ .Values.worker.replicas }}
  selector:
    matchLabels:
      app.kubernetes.io/name: {{ include "stellaops-advisoryai.name" . }}-worker
      app.kubernetes.io/instance: {{ .Release.Name }}
  template:
    metadata:
      labels:
        app.kubernetes.io/name: {{ include "stellaops-advisoryai.name" . }}-worker
        app.kubernetes.io/instance: {{ .Release.Name }}
    spec:
      containers:
        - name: worker
          # The worker must run from an image that contains StellaOps.AdvisoryAI.Worker.dll;
          # set worker.image to the worker build, otherwise the chart falls back to the web image.
          image: "{{ .Values.worker.image | default (printf "%s:%s" .Values.image.repository .Values.image.tag) }}"
          imagePullPolicy: {{ .Values.image.pullPolicy }}
          command: ["dotnet", "StellaOps.AdvisoryAI.Worker.dll"]
          env:
            - name: ADVISORYAI__INFERENCE__MODE
              value: {{ .Values.inference.mode | quote }}
            - name: ADVISORYAI__INFERENCE__REMOTE__BASEADDRESS
              value: {{ .Values.inference.remote.baseAddress | quote }}
            - name: ADVISORYAI__INFERENCE__REMOTE__ENDPOINT
              value: {{ .Values.inference.remote.endpoint | quote }}
            - name: ADVISORYAI__INFERENCE__REMOTE__APIKEY
              value: {{ .Values.inference.remote.apiKey | quote }}
            - name: ADVISORYAI__INFERENCE__REMOTE__TIMEOUT
              value: {{ printf "00:00:%02d" (int .Values.inference.remote.timeoutSeconds) | quote }}
            - name: ADVISORYAI__STORAGE__PLANCACHEDIRECTORY
              value: {{ .Values.storage.planCachePath | quote }}
            - name: ADVISORYAI__STORAGE__OUTPUTDIRECTORY
              value: {{ .Values.storage.outputPath | quote }}
            - name: ADVISORYAI__QUEUE__DIRECTORYPATH
              value: {{ .Values.storage.queuePath | quote }}
            {{- range .Values.extraEnv }}
            - name: {{ .name }}
              value: {{ .value | quote }}
            {{- end }}
          {{- if .Values.extraEnvFrom }}
          envFrom:
            {{- range .Values.extraEnvFrom }}
            - secretRef:
                name: {{ . }}
            {{- end }}
          {{- end }}
          volumeMounts:
            - name: advisoryai-data
              mountPath: /app/data
          resources: {{- toYaml .Values.worker.resources | nindent 12 }}
      volumes:
        - name: advisoryai-data
          {{- if .Values.storage.persistence.enabled }}
          persistentVolumeClaim:
            claimName: {{ .Values.storage.persistence.existingClaim | default (printf "%s-data" (include "stellaops-advisoryai.fullname" .)) }}
          {{- else }}
          emptyDir: {}
          {{- end }}
{{- end }}
38
ops/advisory-ai/helm/values.yaml
Normal file
@@ -0,0 +1,38 @@
image:
  repository: stellaops/advisoryai
  tag: dev
  pullPolicy: IfNotPresent

service:
  port: 8080
  type: ClusterIP

inference:
  mode: Local # or Remote
  remote:
    baseAddress: ""
    endpoint: "/v1/inference"
    apiKey: ""
    timeoutSeconds: 30

storage:
  planCachePath: /app/data/plans
  outputPath: /app/data/outputs
  queuePath: /app/data/queue
  persistence:
    enabled: false
    existingClaim: ""
    size: 5Gi

resources: {}
nodeSelector: {}
tolerations: []
affinity: {}

worker:
  enabled: true
  replicas: 1
  resources: {}
  image: "" # optional full image ref (e.g. stellaops-advisoryai-worker:dev); defaults to image.repository:image.tag

extraEnv: [] # list of { name: ..., value: ... }
extraEnvFrom: [] # list of Secret names consumed via envFrom