# Advisory AI CI Runner Harness (DEVOPS-AIAI-31-001)

Purpose: a deterministic, offline-friendly CI harness for the Advisory AI service/worker. It produces a warmed-cache restore, a build binlog, and TRX outputs for the core test suite so downstream sprints can validate without bespoke pipelines.

## Usage

- From the repo root, run `ops/devops/advisoryai-ci-runner/run-advisoryai-ci.sh` (see the example below).
- Outputs land in `ops/devops/artifacts/advisoryai-ci/<UTC timestamp>/`:
  - `build.binlog` (solution build)
  - `tests/advisoryai.trx` (VSTest results)
  - `summary.json` (paths + hashes + durations)
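
A minimal end-to-end invocation might look like the following sketch; the `latest` variable and the lexicographic `sort` are illustrative conveniences, not part of the runner itself:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Run the harness from the repo root.
ops/devops/advisoryai-ci-runner/run-advisoryai-ci.sh

# UTC-timestamped folders sort lexicographically (see Notes), so the newest run is last.
latest="$(ls -1d ops/devops/artifacts/advisoryai-ci/*/ | sort | tail -n 1)"

# Spot-check the artefacts; these hashes should match the ones recorded in summary.json.
sha256sum "${latest}build.binlog" "${latest}tests/advisoryai.trx"
```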

## Environment

- Defaults: `DOTNET_CLI_TELEMETRY_OPTOUT=1`, `DOTNET_SKIP_FIRST_TIME_EXPERIENCE=1`, `NUGET_PACKAGES=$REPO/.nuget/packages`.
- Sources default to `local-nugets`, then the warmed cache; override via `NUGET_SOURCES` (semicolon-separated; see the override example below).
- No external services are required; tests are isolated/local.
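
If the defaults need to change, for example on an air-gapped runner with an extra mirrored feed, the documented variables can be exported before invoking the script; the mirror path below is purely illustrative:

```bash
# Semicolon-separated NuGet sources; keep local-nugets first and append a mirrored feed.
export NUGET_SOURCES="local-nugets;/mnt/offline-mirror/nuget"

# Optional: relocate the warmed package cache (default is $REPO/.nuget/packages).
export NUGET_PACKAGES="$PWD/.nuget/packages"

ops/devops/advisoryai-ci-runner/run-advisoryai-ci.sh
```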

## What it does

- Warm the NuGet cache from `local-nugets/` into `$NUGET_PACKAGES` for air-gap parity.
- `dotnet restore` + `dotnet build` on `src/AdvisoryAI/StellaOps.AdvisoryAI.sln` with `/bl`.
- Run the Advisory AI test project (`__Tests/StellaOps.AdvisoryAI.Tests`) with TRX output; the optional `TEST_FILTER` env var narrows scope.
- Emit `summary.json` with artefact paths and SHA256 hashes for reproducibility (the full sequence is sketched below).
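
The sequence above corresponds roughly to the sketch below. It is not the script itself: the `OUT` layout, the timestamp format, and the exact `dotnet` flags are assumptions for illustration only.

```bash
set -euo pipefail

# Illustrative output layout; the real script derives its own UTC timestamp.
OUT="ops/devops/artifacts/advisoryai-ci/$(date -u +%Y%m%dT%H%M%SZ)"
mkdir -p "$OUT/tests"

SLN=src/AdvisoryAI/StellaOps.AdvisoryAI.sln

# Restore against the local feed so the packages folder ends up warmed for air-gap parity.
dotnet restore "$SLN" --packages "${NUGET_PACKAGES:-$PWD/.nuget/packages}" --source local-nugets

# Build without re-restoring, capturing a binary log.
dotnet build "$SLN" --no-restore "-bl:$OUT/build.binlog"

# Run the test project with TRX output; TEST_FILTER (if set) narrows scope.
dotnet test __Tests/StellaOps.AdvisoryAI.Tests --no-build \
  ${TEST_FILTER:+--filter "$TEST_FILTER"} \
  --logger "trx;LogFileName=advisoryai.trx" \
  --results-directory "$OUT/tests"
```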

## Notes

- Timestamped output folders keep ordering deterministic; consumers should sort lexicographically.
- Use `TEST_FILTER="Name~Inference"` to target inference/monitoring-specific tests when iterating (example below).
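
For example, to iterate on inference/monitoring tests only:

```bash
# Narrow the suite without editing the script; TEST_FILTER is passed through to the test run.
TEST_FILTER="Name~Inference" ops/devops/advisoryai-ci-runner/run-advisoryai-ci.sh
```

Any expression accepted by `dotnet test --filter` should work the same way, assuming the harness forwards `TEST_FILTER` verbatim.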