Implement MongoDB-based storage for Pack Run approval, artifact, log, and state management
Some checks failed
Docs CI / lint-and-preview (push) Has been cancelled
- Added MongoPackRunApprovalStore for managing approval states with MongoDB.
- Introduced MongoPackRunArtifactUploader for uploading and storing artifacts.
- Created MongoPackRunLogStore to handle logging of pack run events.
- Developed MongoPackRunStateStore for persisting and retrieving pack run states.
- Implemented unit tests for MongoDB stores to ensure correct functionality.
- Added MongoTaskRunnerTestContext for setting up the MongoDB test environment.
- Enhanced PackRunStateFactory to correctly initialize state with gate reasons.
@@ -15,6 +15,7 @@ forbidden fields are rejected long before they reach MongoDB.
- `IAocGuard` / `AocWriteGuard` — validate JSON payloads and emit `AocGuardResult`.
- `AocGuardOptions` — toggles for signature enforcement, tenant requirements, and required top-level fields.
- `AocViolation` / `AocViolationCode` — structured violations surfaced to callers.
- `AocError` — canonical error DTO (`code`, `message`, `violations[]`) re-used by HTTP helpers, CLI tooling, and telemetry.
- `ServiceCollectionExtensions.AddAocGuard()` — DI helper that registers the singleton guard.
- `AocGuardExtensions.ValidateOrThrow()` — throws `AocGuardException` when validation fails.

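Taken together, the primitives compose as in this minimal sketch; the payload type accepted by `IAocGuard` is an assumption here, not a confirmed signature:

```csharp
using System.Text.Json;

// Sketch only: AddAocGuard() registers the singleton guard that DI injects here.
public sealed class AdvisoryIngestHandler
{
    private readonly IAocGuard _guard;

    public AdvisoryIngestHandler(IAocGuard guard) => _guard = guard;

    public void Handle(JsonDocument payload)
    {
        // ValidateOrThrow raises AocGuardException on failure; calling the
        // guard directly and inspecting AocGuardResult/AocViolation entries
        // is the non-throwing alternative.
        _guard.ValidateOrThrow(payload);
    }
}
```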
@@ -75,7 +76,22 @@ Key points:
can yield multiple payloads (e.g. batch ingestion) and the filter will validate each one.
- Prefer the `RequireAocGuard` extension when wiring endpoints; it wraps `AddEndpointFilter`
  and handles single-payload scenarios without additional boilerplate.
- Wrap guard exceptions with `AocHttpResults.Problem` to ensure clients receive machine-readables codes (`ERR_AOC_00x`).
- Wrap guard exceptions with `AocHttpResults.Problem` to ensure clients receive machine-readable codes (`ERR_AOC_00x`). The helper now emits the serialized `AocError` under the `error` extension for consumers that want a typed payload.

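A wiring sketch for the points above, with a hypothetical route and the assumption that `RequireAocGuard` hangs off the route handler builder:

```csharp
var app = builder.Build();

// RequireAocGuard wraps AddEndpointFilter, so the payload is validated
// before the handler runs; guard failures surface as AocGuardException.
app.MapPost("/ingest/advisories", (JsonDocument payload) => Results.Accepted())
   .RequireAocGuard();
```

Pair this with an exception handler that converts `AocGuardException` into `AocHttpResults.Problem`, so clients always receive the `ERR_AOC_00x` codes.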
### Allowed top-level fields

`AocWriteGuard` enforces the contract’s top-level allowlist: `_id`, `tenant`, `source`, `upstream`,
`content`, `identifiers`, `linkset`, `supersedes`, `createdAt`/`created_at`, `ingestedAt`/`ingested_at`, and `attributes`.
Unknown fields produce `ERR_AOC_007` violations. When staging schema changes, extend the allowlist through
`AocGuardOptions.AllowedTopLevelFields`:

```csharp
builder.Services.Configure<AocGuardOptions>(options =>
{
    options.AllowedTopLevelFields =
        options.AllowedTopLevelFields.Add("experimental_field");
});
```

## Worker / repository usage
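The repository pattern referenced in the hunk below can be sketched as follows; the member shapes are hypothetical, and only the guard-before-write ordering is the point:

```csharp
using System.Text.Json;
using MongoDB.Bson;
using MongoDB.Driver;

// Hypothetical shape: validate the raw document before the append-only write
// so forbidden fields never reach MongoDB.
public sealed class AdvisoryRawRepository
{
    private readonly IAocGuard _guard;
    private readonly IMongoCollection<BsonDocument> _advisories;

    public AdvisoryRawRepository(IAocGuard guard, IMongoCollection<BsonDocument> advisories)
    {
        _guard = guard;
        _advisories = advisories;
    }

    public async Task InsertAsync(JsonDocument raw, CancellationToken ct)
    {
        _guard.ValidateOrThrow(raw); // ERR_AOC_* violations stop the write here
        var doc = BsonDocument.Parse(raw.RootElement.GetRawText());
        await _advisories.InsertOneAsync(doc, cancellationToken: ct);
    }
}
```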
@@ -100,6 +116,7 @@ public sealed class AdvisoryRawRepository

## Configuration tips

- Adjust `AocGuardOptions.RequiredTopLevelFields` when staging new schema changes. All configured names are case-insensitive.
- Extend `AllowedTopLevelFields` for temporary schema experiments so that guard runs stay clean while the contract is updated.
- Set `RequireSignatureMetadata = false` for legacy feeds that do not provide signature envelopes yet; track the waiver in the module backlog.
- Use module-specific wrappers (`AddConcelierAocGuards`, `AddExcititorAocGuards`) to combine guard registration with domain exceptions and metrics.

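The tips combine into a single options block; this sketch assumes the field collections are immutable sets and `RequireSignatureMetadata` is a plain boolean:

```csharp
builder.Services.Configure<AocGuardOptions>(options =>
{
    // Staged schema change: configured names are matched case-insensitively.
    options.RequiredTopLevelFields =
        options.RequiredTopLevelFields.Add("tenant");

    // Temporary experiment field keeps guard runs clean meanwhile.
    options.AllowedTopLevelFields =
        options.AllowedTopLevelFields.Add("experimental_field");

    // Legacy feed waiver; tracked in the module backlog.
    options.RequireSignatureMetadata = false;
});
```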
@@ -72,7 +72,37 @@ The output JSON captures:
- Provenance placeholder (`expectedDsseSha256`, `nonce`, `attestorUri` when provided). `nonce` is derived deterministically from the image + SBOM metadata so repeated runs produce identical placeholders for identical inputs.
- Generator metadata and deterministic timestamps.

## 5. (Optional) Send the placeholder to an Attestor

### 4.1 Persist Surface manifests & payloads (optional)

Pass the new `--surface-*` switches to the `descriptor` command whenever you have Surface artefacts (layer fragments, EntryTrace graph/NDJSON) that should be cached during build-time scans. The plug-in stores the payloads under the configured CAS root (defaults to `--cas`) and emits a manifest pointer that Scanner/WebService can consume later.

```bash
dotnet out/buildx/StellaOps.Scanner.Sbomer.BuildXPlugin.dll descriptor \
  --manifest out/buildx \
  --image "$DIGEST" \
  --sbom out/buildx-sbom.cdx.json \
  --surface-layer-fragments out/layer-fragments.json \
  --surface-entrytrace-graph out/entrytrace-graph.json \
  --surface-entrytrace-ndjson out/entrytrace.ndjson \
  --surface-cache-root out/cas \
  --surface-tenant demo-tenant \
  --surface-manifest-output out/surface-manifest.json \
  > out/buildx-descriptor.json
```

Environment variables mirror the CLI flags when you need deterministic defaults:

| Variable | Purpose |
|----------|---------|
| `STELLAOPS_SURFACE_CACHE_ROOT` | CAS/cache directory used for Surface artefacts (defaults to `--cas`). |
| `STELLAOPS_SURFACE_BUCKET` | Bucket name embedded in `cas://` URIs (default `scanner-artifacts`). |
| `STELLAOPS_SURFACE_TENANT` | Tenant recorded in the manifest (`default` if omitted). |
| `STELLAOPS_SURFACE_LAYER_FRAGMENTS` / `...ENTRYTRACE_GRAPH` / `...ENTRYTRACE_NDJSON` | File paths for the respective artefacts. |
| `STELLAOPS_SURFACE_MANIFEST_OUTPUT` | Optional path where the manifest JSON should be duplicated for CI artefacts. |

Manifests and payloads use the same deterministic layout as Scanner.Worker (`scanner/surface/...`) so WebService and Offline Kit tooling can consume them without rescanning the image.
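For CI jobs, the variables above can stand in for most of the `--surface-*` switches. A hypothetical prelude, with the resolved values echoed so a misconfigured job fails loudly in review:

```shell
# Hypothetical CI prelude: these variables mirror the --surface-* flags,
# so only per-image arguments remain on the descriptor command line.
export STELLAOPS_SURFACE_CACHE_ROOT=out/cas
export STELLAOPS_SURFACE_BUCKET=scanner-artifacts
export STELLAOPS_SURFACE_TENANT=demo-tenant
export STELLAOPS_SURFACE_MANIFEST_OUTPUT=out/surface-manifest.json

# Echo the resolved values for the build log.
printf '%s\n' \
  "cache-root=${STELLAOPS_SURFACE_CACHE_ROOT}" \
  "bucket=${STELLAOPS_SURFACE_BUCKET}" \
  "tenant=${STELLAOPS_SURFACE_TENANT}" \
  "manifest-output=${STELLAOPS_SURFACE_MANIFEST_OUTPUT}"
```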
## 5. (Optional) Send the placeholder to an Attestor
The plug-in can POST the descriptor metadata to an Attestor endpoint, returning once it receives an HTTP 202.
@@ -46,6 +46,8 @@ Follow the sprint files below in order. Update task status in both `SPRINTS` and
> 2025-11-06: MERGE-LNM-21-002 remains DOING (BE-Merge) – default-off merge DI + job gating landed, but Concelier WebService ingest/mirror tests are failing; guard and migration fixes pending before completion.
> 2025-11-06: TASKRUN-43-001 marked DONE (Task Runner Guild) – approvals resume API now requeues packs, plan snapshots persisted, and filesystem artifact uploader stores manifests/files for offline review.
> 2025-11-06: CLI-POLICY-23-005 marked DONE (DevEx/CLI Guild) – policy activate CLI verifies scheduling/approval flow, Spectre console fallbacks emit warnings offline, and full CLI suite passes against local feeds.
> 2025-11-07: DOCS-AIAI-31-007 marked DONE (Docs Guild, Security Guild) – published `/docs/security/assistant-guardrails.md` covering redaction rules, blocked phrases, telemetry, and alert wiring.
> 2025-11-06: AIAI-31-007 marked DONE (Advisory AI Guild, Observability Guild) – pipeline latency histograms, guardrail/validation counters, citation coverage metrics, and OTEL spans ship alongside refreshed Grafana alerts.
> 2025-11-03: DOCS-LNM-22-008 moved to DOING (Docs Guild, DevOps Guild) – aligning migration playbook structure and readiness checklist.
> 2025-11-03: DOCS-LNM-22-008 marked DONE – `/docs/migration/no-merge.md` published for DevOps/Export Center planning with checklist for cutover readiness.
> 2025-11-03: SCHED-CONSOLE-27-001 marked DONE (Scheduler WebService Guild, Policy Registry Guild) – policy simulation endpoints now emit SSE retry/heartbeat, enforce metadata normalization, support Mongo-backed integration, and ship auth/stream coverage.
@@ -14,7 +14,7 @@
- `CONCELIER-GRAPH-21-001`, `CONCELIER-GRAPH-21-002`, and `CONCELIER-GRAPH-21-005` remain BLOCKED awaiting `CONCELIER-POLICY-20-002` outputs and Cartographer schema (`CARTO-GRAPH-21-002`), keeping downstream Excititor graph consumers on hold.
- `EXCITITOR-GRAPH-21-001`, `EXCITITOR-GRAPH-21-002`, and `EXCITITOR-GRAPH-21-005` stay BLOCKED until the same Cartographer/Link-Not-Merge prerequisites are delivered.
- Connector provenance updates `FEEDCONN-ICSCISA-02-012` (due 2025-10-23) and `FEEDCONN-KISA-02-008` (due 2025-10-24) plus coordination items `FEEDMERGE-COORD-02-901`/`FEEDMERGE-COORD-02-902`/`FEEDMERGE-COORD-02-903` (due 2025-10-21 through 2025-10-24) are past due and need scheduling.
- Connector provenance updates `FEEDCONN-ICSCISA-02-012` (due 2025-10-23) and `FEEDCONN-KISA-02-008` (due 2025-10-24) remain past due and need scheduling. FeedMerge coordination tasks have been dropped (no AOC policy/governance backing yet), so capacity shifts to schema/guard deliverables.
- Mirror evidence work remains blocked until `MIRROR-CRT-56-001` ships; align Export Center (`EXPORT-OBS-51-001`) and AirGap time anchor (`AIRGAP-TIME-57-001`) owners for kickoff.
[Ingestion & Evidence] 110.A) AdvisoryAI
@@ -29,15 +29,15 @@ AIAI-31-004 | DONE (2025-11-04) | Build orchestration pipeline for Summary/Confl
AIAI-31-004A | DONE (2025-11-04) | Wire orchestrator into WebService/Worker, expose API + queue contract, emit metrics, stub cache. Dependencies: AIAI-31-004, AIAI-31-002. | Advisory AI Guild, Platform Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
> 2025-11-03: WebService/Worker scaffolds created with in-memory cache/queue, minimal APIs (`/api/v1/advisory/plan`, `/api/v1/advisory/queue`), metrics counters, and plan cache instrumentation; worker processes queue using orchestrator.
> 2025-11-04: SBOM base address now flows via `SbomContextClientOptions.BaseAddress`, worker emits queue/plan metrics, and orchestrator cache keys expanded to cover SBOM hash inputs.
AIAI-31-004B | TODO | Implement prompt assembler, guardrails, cache persistence, DSSE provenance, golden outputs. Dependencies: AIAI-31-004A, DOCS-AIAI-31-003, AUTH-AIAI-31-004. | Advisory AI Guild, Security Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-004C | TODO | Deliver CLI `stella advise run` command, renderer, docs, CLI golden tests. Dependencies: AIAI-31-004B, CLI-AIAI-31-003. | Advisory AI Guild, CLI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-004B | DONE (2025-11-06) | Implement prompt assembler, guardrails, cache persistence, DSSE provenance, golden outputs. Dependencies: AIAI-31-004A, DOCS-AIAI-31-003, AUTH-AIAI-31-004. | Advisory AI Guild, Security Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-004C | DONE (2025-11-06) | Deliver CLI `stella advise run` command, renderer, docs, CLI golden tests. Dependencies: AIAI-31-004B, CLI-AIAI-31-003. | Advisory AI Guild, CLI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
DOCS-AIAI-31-002 | DONE (2025-11-03) | Author `/docs/advisory-ai/architecture.md` detailing RAG pipeline, deterministic tooling, caching, model profiles. Dependencies: AIAI-31-004. | Docs Guild, Advisory AI Guild (docs/TASKS.md)
DOCS-AIAI-31-001 | DONE (2025-11-03) | Publish `/docs/advisory-ai/overview.md` covering capabilities, guardrails, RBAC personas, and offline posture. | Docs Guild, Advisory AI Guild (docs/TASKS.md)
DOCS-AIAI-31-003 | DONE (2025-11-03) | Write `/docs/advisory-ai/api.md` covering endpoints, schemas, errors, rate limits, and imposed-rule banner. Dependencies: DOCS-AIAI-31-002. | Docs Guild, Advisory AI Guild (docs/TASKS.md)
DOCS-AIAI-31-004 | BLOCKED (2025-11-03) | Create `/docs/advisory-ai/console.md` with screenshots, a11y notes, copy-as-ticket instructions. Dependencies: CONSOLE-VULN-29-001, CONSOLE-VEX-30-001, EXCITITOR-CONSOLE-23-001. | Docs Guild, Console Guild (docs/TASKS.md)
DOCS-AIAI-31-005 | BLOCKED (2025-11-03) | Publish `/docs/advisory-ai/cli.md` covering commands, exit codes, scripting patterns. Dependencies: CLI-VULN-29-001, CLI-VEX-30-001, AIAI-31-004C. | Docs Guild, DevEx/CLI Guild (docs/TASKS.md)
DOCS-AIAI-31-006 | BLOCKED (2025-11-03) | Update `/docs/policy/assistant-parameters.md` covering temperature, token limits, ranking weights, TTLs. Dependencies: POLICY-ENGINE-31-001. | Docs Guild, Policy Guild (docs/TASKS.md)
DOCS-AIAI-31-007 | BLOCKED (2025-11-03) | Write `/docs/security/assistant-guardrails.md` detailing redaction, injection defense, logging. Dependencies: AIAI-31-005. | Docs Guild, Security Guild (docs/TASKS.md)
DOCS-AIAI-31-007 | DONE (2025-11-07) | Write `/docs/security/assistant-guardrails.md` detailing redaction, injection defense, logging. Dependencies: AIAI-31-005. | Docs Guild, Security Guild (docs/TASKS.md)
DOCS-AIAI-31-008 | BLOCKED (2025-11-03) | Publish `/docs/sbom/remediation-heuristics.md` (feasibility scoring, blast radius). Dependencies: SBOM-AIAI-31-001. | Docs Guild, SBOM Service Guild (docs/TASKS.md)
DOCS-AIAI-31-009 | BLOCKED (2025-11-03) | Create `/docs/runbooks/assistant-ops.md` for warmup, cache priming, model outages, scaling. Dependencies: DEVOPS-AIAI-31-001. | Docs Guild, DevOps Guild (docs/TASKS.md)
> 2025-11-03: DOCS-AIAI-31-003 moved to DOING – drafting Advisory AI API reference (endpoints, rate limits, error model) for sprint 110.
@@ -48,13 +48,14 @@ DOCS-AIAI-31-009 | BLOCKED (2025-11-03) | Create `/docs/runbooks/assistant-ops.m
> 2025-11-03: DOCS-AIAI-31-004 marked BLOCKED – Console widgets/endpoints (CONSOLE-VULN-29-001, CONSOLE-VEX-30-001, EXCITITOR-CONSOLE-23-001) still pending; cannot document UI flows yet.
> 2025-11-03: DOCS-AIAI-31-005 marked BLOCKED – CLI implementation (`stella advise run`, CLI-VULN-29-001, CLI-VEX-30-001) plus AIAI-31-004C not shipped; doc blocked until commands exist.
> 2025-11-03: DOCS-AIAI-31-006 marked BLOCKED – Advisory AI parameter knobs (POLICY-ENGINE-31-001) absent; doc deferred.
> 2025-11-03: DOCS-AIAI-31-007 marked BLOCKED – Guardrail implementation (AIAI-31-005) incomplete.
> 2025-11-07: DOCS-AIAI-31-007 marked DONE – `/docs/security/assistant-guardrails.md` now documents redaction rules, blocked phrases, telemetry, and alert procedures.
> 2025-11-03: DOCS-AIAI-31-008 marked BLOCKED – Waiting on SBOM heuristics delivery (SBOM-AIAI-31-001).
> 2025-11-03: DOCS-AIAI-31-009 marked BLOCKED – DevOps runbook inputs (DEVOPS-AIAI-31-001) outstanding.
AIAI-31-005 | DONE (2025-11-04) | Implement guardrails (redaction, injection defense, output validation, citation enforcement) and fail-safe handling. Dependencies: AIAI-31-004. | Advisory AI Guild, Security Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-006 | DONE (2025-11-04) | Expose REST API endpoints (`/advisory/ai/*`) with RBAC, rate limits, OpenAPI schemas, and batching support. Dependencies: AIAI-31-004..005. | Advisory AI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
> 2025-11-03: Shipped `/api/v1/advisory/{task}` execution and `/api/v1/advisory/outputs/{cacheKey}` retrieval endpoints with guardrail integration, provenance hashes, and metrics (RBAC & rate limiting still pending Authority scope delivery).
AIAI-31-007 | TODO | Instrument metrics (`advisory_ai_latency`, `guardrail_blocks`, `validation_failures`, `citation_coverage`), logs, and traces; publish dashboards/alerts. Dependencies: AIAI-31-004..006. | Advisory AI Guild, Observability Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-007 | DONE (2025-11-06) | Instrument metrics (`advisory_ai_latency`, `guardrail_blocks`, `validation_failures`, `citation_coverage`), logs, and traces; publish dashboards/alerts. Dependencies: AIAI-31-004..006. | Advisory AI Guild, Observability Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
> 2025-11-06: AIAI-31-007 completed – Advisory AI WebService/Worker emit latency histograms, guardrail/validation counters, citation coverage ratios, and OTEL spans; Grafana dashboard + burn-rate alerts refreshed.
AIAI-31-008 | TODO | Package inference on-prem container, remote inference toggle, Helm/Compose manifests, scaling guidance, offline kit instructions. Dependencies: AIAI-31-006..007. | Advisory AI Guild, DevOps Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-010 | DONE (2025-11-02) | Implement Concelier advisory raw document provider mapping CSAF/OSV payloads into structured chunks for retrieval. Dependencies: CONCELIER-VULN-29-001, EXCITITOR-VULN-29-001. | Advisory AI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-011 | DONE (2025-11-02) | Implement Excititor VEX document provider to surface structured VEX statements for retrieval. Dependencies: EXCITITOR-LNM-21-201, EXCITITOR-CORE-AOC-19-002. | Advisory AI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
@@ -78,7 +79,7 @@ Depends on: Sprint 100.A - Attestor
Summary: Ingestion & Evidence focus on Concelier (phase I).
Task ID | State | Task description | Owners (Source)
--- | --- | --- | ---
CONCELIER-AIAI-31-001 `Paragraph anchors` | TODO | Expose advisory chunk API returning paragraph anchors, section metadata, and token-safe text for Advisory AI retrieval. | Concelier WebService Guild (src/Concelier/StellaOps.Concelier.WebService/TASKS.md)
CONCELIER-AIAI-31-001 `Paragraph anchors` | DONE | Expose advisory chunk API returning paragraph anchors, section metadata, and token-safe text for Advisory AI retrieval. | Concelier WebService Guild (src/Concelier/StellaOps.Concelier.WebService/TASKS.md)
CONCELIER-AIAI-31-002 `Structured fields` | TODO | Ensure observation APIs expose upstream workaround/fix/CVSS fields with provenance; add caching for summary queries. Dependencies: CONCELIER-AIAI-31-001. | Concelier WebService Guild (src/Concelier/StellaOps.Concelier.WebService/TASKS.md)
CONCELIER-AIAI-31-003 `Advisory AI telemetry` | TODO | Emit metrics/logs for chunk requests, cache hits, and guardrail blocks triggered by advisory payloads. Dependencies: CONCELIER-AIAI-31-001. | Concelier WebService Guild, Observability Guild (src/Concelier/StellaOps.Concelier.WebService/TASKS.md)
CONCELIER-AIRGAP-56-001 `Mirror ingestion adapters` | TODO | Add mirror source adapters reading advisories from imported bundles, preserving source metadata and bundle IDs. Ensure ingestion remains append-only. Dependencies: AIRGAP-IMP-57-002, MIRROR-CRT-56-001. | Concelier Core Guild (src/Concelier/__Libraries/StellaOps.Concelier.Core/TASKS.md)
@@ -197,9 +198,9 @@ FEEDCONN-CISCO-02-009 SemVer range provenance | BE-Conn-Cisco | **TODO (due 2025
FEEDCONN-ICSCISA-02-012 Version range provenance | BE-Conn-ICS-CISA | **DONE (2025-11-03)** – Promote existing firmware/semver data into `advisory_observations.affected.versions[]` entries with deterministic comparison keys and provenance identifiers (`ics-cisa:{advisoryId}:{product}`). Add regression coverage for mixed firmware strings and raise a Models ticket only when observation schema needs a new comparison helper.<br>2025-10-29: Follow `docs/dev/normalized-rule-recipes.md` §2 to build observation version entries and log failures without invoking the retired merge helpers.<br>2025-11-03: Completed – connector now normalizes semver ranges with provenance notes, RSS fallback content clears the AOC guard, and end-to-end Fetch/Parse/Map integration tests pass. | CONCELIER-LNM-21-001 (src/Concelier/__Libraries/StellaOps.Concelier.Connector.Ics.Cisa/TASKS.md)
FEEDCONN-KISA-02-008 Firmware range provenance | BE-Conn-KISA, Models | **DONE (2025-11-04)** – Define comparison helpers for Hangul-labelled firmware ranges (`XFU 1.0.1.0084 ~ 2.0.1.0034`) and map them into `advisory_observations.affected.versions[]` with provenance tags. Coordinate with Models only if a new comparison scheme is required, then update localisation notes and fixtures for the Link-Not-Merge schema.<br>2025-11-03: Analysis in progress – auditing existing mapper output/fixtures ahead of implementing firmware range normalization and provenance wiring.<br>2025-11-03: SemVer normalization helper wired through `KisaMapper` with provenance slugs + vendor extensions; integration tests updated and green, follow-up capture for additional Hangul exclusivity markers queued before completion.<br>2025-11-03: Extended connector tests to cover single-ended (`이상`, `초과`, `이하`, `미만`) and non-numeric phrases, verifying normalized rule types (`gt`, `gte`, `lt`, `lte`) and fallback behaviour; broader corpus review remains before transitioning to DONE.<br>2025-11-03: Captured the top 10 `detailDos.do?IDX=` pages into `seed-data/kisa/html/` via `scripts/kisa_capture_html.py`; JSON endpoint (`rssDetailData.do?IDX=…`) now returns error pages, so connector updates must parse the embedded HTML or secure authenticated API access before closing.<br>2025-11-04: Fetch + parse pipeline now consumes the HTML detail pages end to end (metadata persisted, DOM parser extracts vendor/product ranges); fixtures/tests operate on the HTML snapshots to guard normalized SemVer + vendor extension expectations and severity extraction. | CONCELIER-LNM-21-001 (src/Concelier/__Libraries/StellaOps.Concelier.Connector.Kisa/TASKS.md)
FEEDCONN-SHARED-STATE-003 Source state seeding helper | Tools Guild, BE-Conn-MSRC | **DONE (2025-11-04)** – Delivered `SourceStateSeeder` CLI + processor APIs, Mongo fixtures, and MSRC runbook updates. Seeds raw docs + cursor state deterministically; tests cover happy/path/idempotent flows (`dotnet test src/Concelier/__Tests/StellaOps.Concelier.Connector.Common.Tests/...` – note: requires `libcrypto.so.1.1` when running Mongo2Go locally). | Tools (src/Concelier/__Libraries/StellaOps.Concelier.Connector.Common/TASKS.md)
FEEDMERGE-COORD-02-901 Connector deadline check-ins | BE-Merge | **TODO (due 2025-10-21)** – Confirm Cccs/Cisco version-provenance updates land, capture `LinksetVersionCoverage` dashboard snapshots (expect zero missing-range warnings), and update coordination docs with the results.<br>2025-10-29: Observation metrics now surface `version_entries_total`/`missing_version_entries_total`; include screenshots for both when closing this task. | FEEDMERGE-COORD-02-900 (src/Concelier/__Libraries/StellaOps.Concelier.Merge/TASKS.md)
FEEDMERGE-COORD-02-902 ICS-CISA version comparison support | BE-Merge, Models | **TODO (due 2025-10-23)** – Review ICS-CISA sample advisories, validate reuse of existing comparison helpers, and pre-stage Models ticket template only if a new firmware comparator is required. Document the outcome and observation coverage logs in coordination docs + tracker files.<br>2025-10-29: `docs/dev/normalized-rule-recipes.md` (§2–§3) now covers observation entries; attach decision summary + log sample when handing off to Models. Dependencies: FEEDMERGE-COORD-02-901. | FEEDMERGE-COORD-02-900 (src/Concelier/__Libraries/StellaOps.Concelier.Merge/TASKS.md)
FEEDMERGE-COORD-02-903 KISA firmware scheme review | BE-Merge, Models | **TODO (due 2025-10-24)** – Pair with KISA team on proposed firmware comparison helper (`kisa.build` or variant), ensure observation mapper alignment, and open Models ticket only if a new comparator is required. Log the final helper signature and observation coverage metrics in coordination docs + tracker files. Dependencies: FEEDMERGE-COORD-02-902. | FEEDMERGE-COORD-02-900 (src/Concelier/__Libraries/StellaOps.Concelier.Merge/TASKS.md)
FEEDMERGE-COORD-02-901 Connector deadline check-ins | DROPPED (2025-11-07) | Scope removed: FeedMerge coordination requires an AOC policy that does not exist yet. Re-open once governance/ownership is defined. | —
FEEDMERGE-COORD-02-902 ICS-CISA version comparison support | DROPPED (2025-11-07) | Blocked on FEEDMERGE policy/ownership; dropped alongside 02-901. | —
FEEDMERGE-COORD-02-903 KISA firmware scheme review | DROPPED (2025-11-07) | Blocked on FEEDMERGE policy/ownership; dropped alongside 02-901. | —
Fixture validation sweep | QA | **DONE (2025-11-04)** – Regenerated RHSA CSAF goldens via `scripts/update-redhat-fixtures.sh` (sets `UPDATE_GOLDENS=1`) and re-ran connector tests `dotnet test src/Concelier/__Tests/StellaOps.Concelier.Connector.Distro.RedHat.Tests/StellaOps.Concelier.Connector.Distro.RedHat.Tests.csproj --no-restore` to confirm snapshot parity. | None (src/Concelier/__Libraries/StellaOps.Concelier.Connector.Distro.RedHat/TASKS.md)
Link-Not-Merge version provenance coordination | BE-Merge | **DONE (2025-11-04)** – Published connector status tracker + follow-up IDs in `docs/dev/normalized-rule-recipes.md`, enabled `Normalized version rules missing` diagnostics in Merge, and aligned dashboards on `LinksetVersionCoverage`. Remaining gaps (ACSC/CCCS/CERTBUND/Cisco/RU-BDU) documented as upstream data deficiencies awaiting feed updates. Dependencies: CONCELIER-LNM-21-203. | CONCELIER-LNM-21-001 (src/Concelier/__Libraries/StellaOps.Concelier.Merge/TASKS.md)
MERGE-LNM-21-001 | DONE (2025-11-03) | Draft `no-merge` migration playbook, documenting backfill strategy, feature flag rollout, and rollback steps for legacy merge pipeline deprecation.<br>2025-11-03: Authored `docs/migration/no-merge.md` covering rollout phases, backfill/validation checklists, and rollback guidance; shared artefact owners. | BE-Merge, Architecture Guild (src/Concelier/__Libraries/StellaOps.Concelier.Merge/TASKS.md)
@@ -210,8 +211,9 @@ Depends on: Sprint 110.B - Concelier.VI
Summary: Ingestion & Evidence focus on Concelier (phase VII).
Task ID | State | Task description | Owners (Source)
--- | --- | --- | ---
MERGE-LNM-21-002 | DOING (2025-11-06) | Refactor or retire `AdvisoryMergeService` and related pipelines, ensuring callers transition to observation/linkset APIs; add compile-time analyzer preventing merge service usage.<br>2025-11-03: Began dependency audit and call-site inventory ahead of deprecation plan; cataloging service registrations/tests referencing merge APIs.<br>2025-11-05 14:42Z: Drafted `concelier:features:noMergeEnabled` gating, merge job allowlist handling, and deprecation/telemetry changes prior to analyzer rollout.<br>2025-11-06 16:10Z: Landed analyzer project (`CONCELIER0002`), wired into Concelier WebService/tests, and updated docs to direct suppressions through explicit migration notes.<br>2025-11-07 03:25Z: Default-on toggle + job gating break existing Concelier WebService tests; guard/migration adjustments pending before closing the task. | BE-Merge (src/Concelier/__Libraries/StellaOps.Concelier.Merge/TASKS.md)
MERGE-LNM-21-002 | DOING (2025-11-07) | Refactor or retire `AdvisoryMergeService` and related pipelines, ensuring callers transition to observation/linkset APIs; add compile-time analyzer preventing merge service usage.<br>2025-11-03: Began dependency audit and call-site inventory ahead of deprecation plan; cataloging service registrations/tests referencing merge APIs.<br>2025-11-05 14:42Z: Drafted `concelier:features:noMergeEnabled` gating, merge job allowlist handling, and deprecation/telemetry changes prior to analyzer rollout.<br>2025-11-06 16:10Z: Landed analyzer project (`CONCELIER0002`), wired into Concelier WebService/tests, and updated docs to direct suppressions through explicit migration notes.<br>2025-11-07 03:25Z: Default-on toggle + job gating break existing Concelier WebService tests; guard/migration adjustments pending before closing the task.<br>2025-11-07 07:05Z: Added ingest-path diagnostics (hash logging + test log dumping) to trace why HTTP binding loses `upstream.contentHash` with `noMergeEnabled=true`; need to adapt seeding/tests once the binding issue is fixed. | BE-Merge (src/Concelier/__Libraries/StellaOps.Concelier.Merge/TASKS.md)
MERGE-LNM-21-003 Determinism/test updates | QA Guild, BE-Merge | Replace merge determinism suites with observation/linkset regression tests verifying no data mutation and conflicts remain visible. Dependencies: MERGE-LNM-21-002. | MERGE-LNM-21-002 (src/Concelier/__Libraries/StellaOps.Concelier.Merge/TASKS.md)
WEB-AOC-19-001 (dependency) | DONE (2025-11-07) | Shared guard primitives now enforce the top-level allowlist (`_id`, tenant, source, upstream, content, identifiers, linkset, supersedes, created/ingested timestamps, attributes) and emit the reusable `AocError` payload consumed by HTTP/CLI tooling. Extend `AocGuardOptions.AllowedTopLevelFields` when staging new schema fields to avoid false-positive `ERR_AOC_007` violations. | BE-Base Platform Guild (docs/aoc/guard-library.md, src/Web/StellaOps.Web/TASKS.md)
[Ingestion & Evidence] 110.C) Excititor.I
@@ -142,8 +142,8 @@ SCANNER-EVENTS-16-302 | DONE (2025-11-06) | Extend orchestrator event links (rep
SCANNER-GRAPH-21-001 | TODO | Provide webhook/REST endpoint for Cartographer to request policy overlays and runtime evidence for graph nodes, ensuring determinism and tenant scoping. | Scanner WebService Guild, Cartographer Guild (src/Scanner/StellaOps.Scanner.WebService/TASKS.md)
SCANNER-LNM-21-001 | TODO | Update `/reports` and `/policy/runtime` payloads to consume advisory/vex linksets, exposing source severity arrays and conflict summaries alongside effective verdicts. | Scanner WebService Guild, Policy Guild (src/Scanner/StellaOps.Scanner.WebService/TASKS.md)
SCANNER-LNM-21-002 | TODO | Add evidence endpoint for Console to fetch linkset summaries with policy overlay for a component/SBOM, including AOC references. Dependencies: SCANNER-LNM-21-001. | Scanner WebService Guild, UI Guild (src/Scanner/StellaOps.Scanner.WebService/TASKS.md)
SCANNER-SECRETS-01 | DOING (2025-11-06) | Adopt `StellaOps.Scanner.Surface.Secrets` for registry/CAS credentials during scan execution.<br>2025-11-02: Worker integration tests added for CAS token retrieval via Surface.Secrets abstraction; refactor under review.<br>2025-11-06: Resumed to replace remaining registry credential plumbing and emit rotation-aware metrics.<br>2025-11-06 21:35Z: Surface secret configurator now hydrates `ScannerStorageOptions` from `cas-access` payloads; unit coverage added. | Scanner Worker Guild, Security Guild (src/Scanner/StellaOps.Scanner.Worker/TASKS.md)
SCANNER-SECRETS-02 | DOING (2025-11-06) | Replace ad-hoc secret wiring with Surface.Secrets for report/export operations (registry and CAS tokens). Dependencies: SCANNER-SECRETS-01.<br>2025-11-02: WebService export path now resolves registry credentials via Surface.Secrets stub; CI pipeline hook in progress.<br>2025-11-06: Picking up Surface.Secrets provider usage across report/export flows and removing legacy secret file readers.<br>2025-11-06 21:40Z: WebService options now consume `cas-access` secrets via configurator; storage mirrors updated; targeted tests passing. | Scanner WebService Guild, Security Guild (src/Scanner/StellaOps.Scanner.WebService/TASKS.md)
SCANNER-SECRETS-01 | DONE (2025-11-06) | Adopt `StellaOps.Scanner.Surface.Secrets` for registry/CAS credentials during scan execution.<br>2025-11-02: Surface.Secrets provider wired for CAS token retrieval; integration tests added.<br>2025-11-06: Replaced registry credential plumbing with shared provider + rotation-aware metrics; introduced registry secret stage and analysis keys.<br>2025-11-06 23:40Z: Installed .NET 10 RC2 runtime, parser/stage unit suites green (`dotnet test` Surface.Secrets + Worker focused filter). | Scanner Worker Guild, Security Guild (src/Scanner/StellaOps.Scanner.Worker/TASKS.md)
SCANNER-SECRETS-02 | DONE (2025-11-06) | Replace ad-hoc secret wiring with Surface.Secrets for report/export operations (registry and CAS tokens). Dependencies: SCANNER-SECRETS-01.<br>2025-11-02: WebService export path now resolves registry credentials via Surface.Secrets stub; CI pipeline hook in progress.<br>2025-11-06: Picking up Surface.Secrets provider usage across report/export flows and removing legacy secret file readers.<br>2025-11-06 21:40Z: WebService options now consume `cas-access` secrets via configurator; storage mirrors updated; targeted tests passing.<br>2025-11-06 23:58Z: Registry + attestation secrets sourced via Surface.Secrets (options extended, configurator + tests updated); Surface.Secrets & configurator test suites executed on .NET 10 RC2 runtime. | Scanner WebService Guild, Security Guild (src/Scanner/StellaOps.Scanner.WebService/TASKS.md)
|
||||
SCANNER-SECRETS-03 | TODO | Use Surface.Secrets to retrieve registry credentials when interacting with CAS/referrers. Dependencies: SCANNER-SECRETS-02. | BuildX Plugin Guild, Security Guild (src/Scanner/StellaOps.Scanner.Sbomer.BuildXPlugin/TASKS.md)
|
||||
SCANNER-ENG-0020 | TODO | Implement Homebrew collector & fragment mapper per `design/macos-analyzer.md` §3.1. | Scanner Guild (docs/modules/scanner/TASKS.md)
|
||||
SCANNER-ENG-0021 | TODO | Implement pkgutil receipt collector per `design/macos-analyzer.md` §3.2. | Scanner Guild (docs/modules/scanner/TASKS.md)
|
||||
@@ -153,9 +153,9 @@ SCANNER-ENG-0024 | TODO | Implement Windows MSI collector per `design/windows-an
|
||||
SCANNER-ENG-0025 | TODO | Implement WinSxS manifest collector per `design/windows-analyzer.md` §3.2. | Scanner Guild (docs/modules/scanner/TASKS.md)
|
||||
SCANNER-ENG-0026 | TODO | Implement Windows Chocolatey & registry collectors per `design/windows-analyzer.md` §3.3–3.4. | Scanner Guild (docs/modules/scanner/TASKS.md)
|
||||
SCANNER-ENG-0027 | TODO | Deliver Windows policy/offline integration per `design/windows-analyzer.md` §5–6. | Scanner Guild, Policy Guild, Offline Kit Guild (docs/modules/scanner/TASKS.md)
|
||||
SCANNER-SURFACE-01 | DONE (2025-11-06) | Persist Surface.FS manifests after analyzer stages, including layer CAS metadata and EntryTrace fragments.<br>2025-11-02: Worker pipeline emitting draft Surface.FS manifests for sample scans; determinism checks running.<br>2025-11-06: Continuing with manifest writer abstraction + telemetry wiring for Surface.FS persistence.<br>2025-11-06 18:45Z: Resumed work; targeting manifest writer abstraction, CAS persistence hooks, and telemetry/test coverage updates.<br>2025-11-06 20:20Z: Published Surface worker Grafana dashboard + updated design doc; WebService pointer integration test now covers manifest/payload artefacts. | Scanner Worker Guild (src/Scanner/StellaOps.Scanner.Worker/TASKS.md)
|
||||
SCANNER-SURFACE-02 | DONE (2025-11-05) | Publish Surface.FS pointers (CAS URIs, manifests) via scan/report APIs and update attestation metadata. Dependencies: SCANNER-SURFACE-01.<br>2025-11-05: Surface pointer projection wired through WebService endpoints, orchestrator samples & DSSE fixtures refreshed with `surface` manifest block, and regression suite (platform events, report sample, ready check) updated. | Scanner WebService Guild (src/Scanner/StellaOps.Scanner.WebService/TASKS.md)
|
||||
SCANNER-SURFACE-03 | DONE (2025-11-07) | Push layer manifests and entry fragments into Surface.FS during build-time SBOM generation. Dependencies: SCANNER-SURFACE-02.<br>2025-11-06: Starting BuildX manifest upload implementation with Surface.FS client abstraction and integration tests.<br>2025-11-07 15:30Z: Resumed BuildX plugin Surface wiring; analyzing Surface.FS models, CAS flow, and upcoming tests before coding.<br>2025-11-07 22:10Z: Added Surface manifest writer + CLI flags to the BuildX plug-in, persisted artefacts into CAS, regenerated docs/fixtures, and shipped new tests covering the writer + descriptor flow. | BuildX Plugin Guild (src/Scanner/StellaOps.Scanner.Sbomer.BuildXPlugin/TASKS.md)
|
||||
|
||||
[Scanner & Surface] 130.A) Scanner.VIII
|
||||
Depends on: Sprint 130.A - Scanner.VII
|
||||
|
||||
@@ -222,8 +222,8 @@ Task ID | State | Task description | Owners (Source)
|
||||
WEB-AIAI-31-001 `API routing` | TODO | Route `/advisory/ai/*` endpoints through gateway with RBAC/ABAC, rate limits, and telemetry headers. | BE-Base Platform Guild (src/Web/StellaOps.Web/TASKS.md)
|
||||
WEB-AIAI-31-002 `Batch orchestration` | TODO | Provide batching job handlers and streaming responses for CLI automation with retry/backoff. Dependencies: WEB-AIAI-31-001. | BE-Base Platform Guild (src/Web/StellaOps.Web/TASKS.md)
|
||||
WEB-AIAI-31-003 `Telemetry & audit` | TODO | Emit metrics/logs (latency, guardrail blocks, validation failures) and forward anonymized prompt hashes to analytics. Dependencies: WEB-AIAI-31-002. | BE-Base Platform Guild, Observability Guild (src/Web/StellaOps.Web/TASKS.md)
|
||||
WEB-AOC-19-001 `Shared AOC guard primitives` | DONE (2025-11-07) | Provide `AOCForbiddenKeys`, guard middleware/interceptor hooks, and error types (`AOCError`, `AOCViolationCode`) for ingestion services. Publish sample usage + analyzer to ensure guard registered. | BE-Base Platform Guild (src/Web/StellaOps.Web/TASKS.md)
|
||||
> 2025-11-07: Enforced unknown-field detection, added the shared `AocError` payload (HTTP + CLI), refreshed guard docs, and extended tests/endpoint helpers.
|
||||
WEB-AOC-19-002 `Provenance & signature helpers` | TODO | Ship `ProvenanceBuilder`, checksum utilities, and signature verification helper integrated with guard logging. Cover DSSE/CMS formats with unit tests. Dependencies: WEB-AOC-19-001. | BE-Base Platform Guild (src/Web/StellaOps.Web/TASKS.md)
|
||||
WEB-AOC-19-003 `Analyzer + test fixtures` | TODO | Author Roslyn analyzer preventing ingestion modules from writing forbidden keys without guard, and provide shared test fixtures for guard validation used by Concelier/Excititor service tests. Dependencies: WEB-AOC-19-002. | QA Guild, BE-Base Platform Guild (src/Web/StellaOps.Web/TASKS.md)
|
||||
WEB-CONSOLE-23-001 `Global posture endpoints` | TODO | Provide consolidated `/console/dashboard` and `/console/filters` APIs returning tenant-scoped aggregates (findings by severity, VEX override counts, advisory deltas, run health, policy change log). Enforce AOC labelling, deterministic ordering, and cursor-based pagination for drill-down hints. | BE-Base Platform Guild, Product Analytics Guild (src/Web/StellaOps.Web/TASKS.md)
|
||||
|
||||
@@ -325,9 +325,9 @@ Depends on: Sprint 100.A - Attestor, Sprint 110.A - AdvisoryAI, Sprint 120.A - A
|
||||
Summary: Documentation & Process focus on Docs Modules (Excititor).
|
||||
Task ID | State | Task description | Owners (Source)
|
||||
--- | --- | --- | ---
|
||||
EXCITITOR-DOCS-0001 | DONE (2025-11-07) | README refreshed with consensus beta DSSE/export references + explicit release-note links. | Docs Guild (docs/modules/excititor/TASKS.md)
|
||||
EXCITITOR-ENG-0001 | DONE (2025-11-07) | Implementation plan now mirrors SPRINT_200 state via sprint-alignment table. | Module Team (docs/modules/excititor/TASKS.md)
|
||||
EXCITITOR-OPS-0001 | DONE (2025-11-07) | Runbook/observability checklist (metrics, alerts, incident steps) added to `docs/modules/excititor/mirrors.md`. | Ops Guild (docs/modules/excititor/TASKS.md)
|
||||
|
||||
|
||||
[Documentation & Process] 200.J) Docs Modules Export Center
|
||||
|
||||
@@ -119,7 +119,7 @@ Canonicalisation rules:
|
||||
| `ERR_AOC_006` | Attempt to persist derived findings from ingestion context. | 403 | Policy engine guard, Authority scopes. |
|
||||
| `ERR_AOC_007` | Unknown top-level fields (schema violation). | 400 | Mongo validator, CLI verifier. |
|
||||
|
||||
Consumers should map these codes to CLI exit codes and structured log events so automation can fail fast and produce actionable guidance. The shared guard library (`StellaOps.Aoc.AocError`) emits consistent payloads (`code`, `message`, `violations[]`) for HTTP APIs, CLI tooling, and verifiers.
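A minimal sketch of such a code-to-exit-code mapping follows; the specific exit-code values and function name are illustrative assumptions, not the shipped CLI contract:

```python
# Hypothetical ERR_AOC_* -> CLI exit-code table; the numeric values are
# illustrative only. The AocError payload shape ({code, message, violations[]})
# matches the shared guard library described above.
AOC_EXIT_CODES = {
    "ERR_AOC_006": 16,  # derived findings persisted from ingestion context
    "ERR_AOC_007": 17,  # unknown top-level fields (schema violation)
}

def exit_code_for(error: dict) -> int:
    """Resolve a serialized AocError payload to a deterministic CLI exit
    code, falling back to a generic failure code for unknown codes."""
    return AOC_EXIT_CODES.get(error.get("code"), 1)
```

Keeping the table keyed on the canonical `code` field lets automation fail fast without parsing the human-readable `message`.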
|
||||
|
||||
## 6. API and Tooling Interfaces
|
||||
|
||||
|
||||
@@ -1,29 +1,37 @@
|
||||
# StellaOps Advisory AI
|
||||
|
||||
Advisory AI is the retrieval-augmented assistant that synthesizes advisory and VEX evidence into operator-ready summaries, conflict explanations, and remediation plans with strict provenance.
|
||||
|
||||
## Responsibilities
|
||||
- Generate policy-aware advisory summaries with citations back to Concelier and Excititor evidence.
|
||||
- Explain conflicting advisories/VEX statements using weights from VEX Lens and Policy Engine.
|
||||
- Propose remediation hints aligned with Offline Kit staging and export bundles.
|
||||
- Expose API/UI surfaces with guardrails on model prompts, outputs, and retention.
|
||||
|
||||
## Key components
|
||||
- RAG pipeline drawing from Concelier, Excititor, VEX Lens, Policy Engine, and SBOM Service data.
|
||||
- Prompt templates and guard models enforcing provenance and redaction policies.
|
||||
- Vercel/offline inference workers with deterministic caching of generated artefacts.
|
||||
|
||||
## Integrations & dependencies
|
||||
- Authority for tenant-aware access control.
|
||||
- Policy Engine for context-specific decisions and explain traces.
|
||||
- Console/CLI for interaction surfaces.
|
||||
- Export Center/Vuln Explorer for embedding generated briefs.
|
||||
|
||||
## Operational notes
|
||||
- Model cache management and offline bundle packaging per Epic 8 requirements.
|
||||
- Usage/latency dashboards for prompt/response monitoring with `advisory_ai_latency_seconds`, guardrail block/validation counters, and citation coverage histograms wired into the default “Advisory AI” Grafana dashboard.
|
||||
- Alert policies fire when `advisory_ai_guardrail_blocks_total` or `advisory_ai_validation_failures_total` breach burn-rate thresholds (5 blocks/min or validation failures > 1% of traffic) and when latency p95 exceeds 30s.
|
||||
- Redaction policies validated against security/LLM guardrail tests.
|
||||
- Guardrail behaviour, blocked phrases, and operational alerts are detailed in `/docs/security/assistant-guardrails.md`.
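The alert conditions above can be sketched as a single predicate; the thresholds are taken from this note, while the function name and signature are assumptions for illustration:

```python
def guardrail_alert_fires(blocks_per_min: float,
                          validation_failures: int,
                          total_requests: int,
                          latency_p95_seconds: float) -> bool:
    """True when any documented burn-rate threshold is breached:
    >= 5 guardrail blocks/min, validation failures > 1% of traffic,
    or latency p95 above 30 s."""
    failure_ratio = validation_failures / total_requests if total_requests else 0.0
    return (blocks_per_min >= 5.0
            or failure_ratio > 0.01
            or latency_p95_seconds > 30.0)
```

In practice these checks live in alerting rules over `advisory_ai_guardrail_blocks_total`, `advisory_ai_validation_failures_total`, and the latency histogram rather than application code.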
|
||||
|
||||
## CLI usage
|
||||
- `stella advise run <summary|conflict|remediation> --advisory-key <id> [--artifact-id id] [--artifact-purl purl] [--policy-version v] [--profile profile] [--section name] [--force-refresh] [--timeout seconds]`
|
||||
- Requests an advisory plan from the web service, enqueues execution, then polls for the generated output (default wait 120 s, single check if `--timeout 0`).
|
||||
- Renders plan metadata (cache key, prompt template, token budget), guardrail state, provenance hashes, signatures, and citations in a deterministic table view.
|
||||
- Honors `STELLAOPS_ADVISORYAI_URL` when set; otherwise the CLI reuses the backend URL and scopes requests via `X-StellaOps-Scopes`.
|
||||
|
||||
## Epic alignment
|
||||
- Epic 8: Advisory AI Assistant.
|
||||
- DOCS-AI stories to be tracked in ../../TASKS.md.
|
||||
|
||||
@@ -129,7 +129,16 @@ src/
|
||||
|
||||
Both subcommands honour offline-first expectations (no network access) and normalise relative roots via `--root` when operators mirror the credential store.
|
||||
|
||||
### 2.11 Advisory AI (RAG summaries)
|
||||
|
||||
* `advise run <summary|conflict|remediation> --advisory-key <id> [--artifact-id id] [--artifact-purl purl] [--policy-version v] [--profile profile] [--section name] [--force-refresh] [--timeout seconds]`
|
||||
|
||||
* Calls the Advisory AI service (`/v1/advisory-ai/pipeline/{task}` + `/outputs/{cacheKey}`) to materialise a deterministic plan, queue execution, and poll for the generated brief.
|
||||
* Renders plan metadata (cache key, prompt template, token budgets), guardrail results, provenance hashes/signatures, and citation list. Exit code is non-zero if guardrails block or the command times out.
|
||||
* Uses `STELLAOPS_ADVISORYAI_URL` when configured; otherwise it reuses the backend base address and adds `X-StellaOps-Scopes` (`advisory:run` + task scope) per request.
|
||||
* `--timeout 0` performs a single cache lookup (for CI flows that only want cached artefacts).
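The polling semantics above can be sketched as follows; `fetch` stands in for the `/outputs/{cacheKey}` lookup, and the names and default interval are illustrative assumptions:

```python
import time

def poll_output(fetch, timeout_seconds: float, interval_seconds: float = 2.0):
    """--timeout 0 degenerates to a single cache lookup; otherwise poll
    until the output materialises or the deadline passes (returns None).
    `fetch` returns the generated brief, or None on a cache miss."""
    if timeout_seconds == 0:
        return fetch()
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        result = fetch()
        if result is not None:
            return result
        time.sleep(interval_seconds)
    return None
```

A `None` return maps to the non-zero exit path described above (timeout or cache miss), which CI flows can treat as "artefact not ready".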
|
||||
|
||||
### 2.12 Air-gap guard
|
||||
|
||||
- CLI outbound HTTP flows (Authority auth, backend APIs, advisory downloads) route through `StellaOps.AirGap.Policy`. When sealed mode is active the CLI refuses commands that would require external egress and surfaces the shared `AIRGAP_EGRESS_BLOCKED` remediation guidance instead of attempting the request.
|
||||
|
||||
|
||||
@@ -3,15 +3,19 @@
|
||||
Concelier ingests signed advisories from dozens of sources and converts them into immutable observations plus linksets under the Aggregation-Only Contract (AOC).
|
||||
|
||||
## Responsibilities
|
||||
- Fetch and normalise vulnerability advisories via restart-time connectors.
|
||||
- Persist observations and correlation linksets without precedence decisions.
|
||||
- Emit deterministic exports (JSON, Trivy DB) for downstream policy evaluation.
|
||||
- Coordinate offline/air-gap updates via Offline Kit bundles.
|
||||
- Serve paragraph-anchored advisory chunks for Advisory AI consumers without breaking the Aggregation-Only Contract.
|
||||
|
||||
## Key components
|
||||
- `StellaOps.Concelier.WebService` orchestration host.
|
||||
- Connector libraries under `StellaOps.Concelier.Connector.*`.
|
||||
- Exporter packages (`StellaOps.Concelier.Exporter.*`).
|
||||
## Key components
|
||||
- `StellaOps.Concelier.WebService` orchestration host.
|
||||
- Connector libraries under `StellaOps.Concelier.Connector.*`.
|
||||
- Exporter packages (`StellaOps.Concelier.Exporter.*`).
|
||||
|
||||
## Recent updates
|
||||
- **2025-11-07:** Paragraph-anchored `/advisories/{advisoryKey}/chunks` endpoint shipped for Advisory AI paragraph retrieval. Details and rollout notes live in [`../../updates/2025-11-07-concelier-advisory-chunks.md`](../../updates/2025-11-07-concelier-advisory-chunks.md).
|
||||
|
||||
## Integrations & dependencies
|
||||
- MongoDB for canonical observations and schedules.
|
||||
|
||||
@@ -3,8 +3,15 @@
|
||||
Excititor converts heterogeneous VEX feeds into raw observations and linksets that honour the Aggregation-Only Contract.
|
||||
|
||||
## Latest updates (2025-11-05)
|
||||
- Link-Not-Merge readiness: release note [Excitor consensus beta](../../updates/2025-11-05-excitor-consensus-beta.md) captures how Excititor feeds power the Excititor consensus beta (sample payload in [consensus JSON](../../vex/consensus-json.md)).
|
||||
- README now points policy/UI teams to the upcoming consensus integration work.
|
||||
- DSSE packaging for consensus bundles and Export Center hooks are documented in the [beta release note](../../updates/2025-11-05-excitor-consensus-beta.md); operators mirroring Excititor exports must verify detached JWS artefacts (`bundle.json.jws`) alongside each bundle.
|
||||
- Follow-ups called out in the release note (Policy weighting knobs `POLICY-ENGINE-30-101`, CLI verb `CLI-VEX-30-002`) remain in-flight and are tracked in `/docs/implplan/SPRINT_200_documentation_process.md`.
|
||||
|
||||
## Release references
|
||||
- Consensus beta payload reference: [docs/vex/consensus-json.md](../../vex/consensus-json.md)
|
||||
- Export Center offline packaging: [docs/modules/export-center/devportal-offline.md](../export-center/devportal-offline.md)
|
||||
- Historical release log: [docs/updates/](../../updates/)
|
||||
|
||||
## Responsibilities
|
||||
- Fetch OpenVEX/CSAF/CycloneDX statements via restart-only connectors.
|
||||
|
||||
@@ -4,6 +4,6 @@
|
||||
|
||||
| ID | Status | Owner(s) | Description | Notes |
|
||||
|----|--------|----------|-------------|-------|
|
||||
| EXCITITOR-DOCS-0001 | DONE (2025-11-07) | Docs Guild | Validate that ./README.md aligns with the latest release notes. | README now includes DSSE/export references + release-note cross-links for the consensus beta. |
|
||||
| EXCITITOR-OPS-0001 | DONE (2025-11-07) | Ops Guild | Review runbooks/observability assets after next sprint demo. | Added runbook/observability checklist (metrics, alerts, incident steps) to `docs/modules/excititor/mirrors.md`. |
|
||||
| EXCITITOR-ENG-0001 | DONE (2025-11-07) | Module Team | Cross-check implementation plan milestones against `/docs/implplan/SPRINT_*.md`. | Implementation plan now mirrors SPRINT_200 statuses in a new sprint-alignment table. |
|
||||
|
||||
@@ -15,7 +15,17 @@
|
||||
- **Epic 8 – Advisory AI:** guarantee citation-ready payloads and normalized context for AI summaries/explainers.
|
||||
- Track DOCS-LNM-22-006/007 and CLI-EXC-25-001..002 in ../../TASKS.md.
|
||||
|
||||
## Coordination
|
||||
- Review ./AGENTS.md before picking up new work.
|
||||
- Sync with cross-cutting teams noted in `/docs/implplan/SPRINT_*.md`.
|
||||
- Update this plan whenever scope, dependencies, or guardrails change.
|
||||
||||
|
||||
## Sprint alignment (2025-11-07)
|
||||
|
||||
| Sprint task | State (SPRINT_200) | Notes |
|
||||
| --- | --- | --- |
|
||||
| EXCITITOR-DOCS-0001 | DONE | README release alignment + consensus beta references refreshed (DSSE/export guidance). |
|
||||
| EXCITITOR-ENG-0001 | DONE | Implementation plan now mirrors `SPRINT_200_documentation_process.md` through this table. |
|
||||
| EXCITITOR-OPS-0001 | DONE | Runbook/observability checklist added to `docs/modules/excititor/mirrors.md`. |
|
||||
|
||||
See `/docs/implplan/SPRINT_200_documentation_process.md` for the canonical status table.
|
||||
|
||||
@@ -156,9 +156,40 @@ Downstream automation reads `manifest.json`/`bundle.json` directly, while `/exci
|
||||
|
||||
---
|
||||
|
||||
## 6) Future alignment
|
||||
|
||||
* Replace manual export definitions with generated mirror bundle manifests once `EXCITITOR-EXPORT-01-007` ships.
|
||||
* Extend `/index` payload with quiet-provenance when `EXCITITOR-EXPORT-01-006` adds that metadata.
|
||||
* Integrate domain manifests with DevOps mirror profiles (`DEVOPS-MIRROR-08-001`) so helm/compose overlays can enable or disable domains declaratively.
|
||||
|
||||
---
|
||||
|
||||
## 7) Runbook & observability checklist (Sprint 22 demo refresh · 2025-11-07)
|
||||
|
||||
### Daily / on-call checks
|
||||
1. **Index freshness** – watch `excitor_mirror_export_latency_seconds` (p95 < 180) grouped by `domainId`. If latency grows past 10 minutes, verify the export worker queue (`stellaops-export-worker` logs) and ensure Mongo `vex_exports` has entries newer than `now()-10m`.
|
||||
2. **Quota exhaustion** – alert on `excitor_mirror_quota_exhausted_total{scope="download"}` increases. When triggered, inspect structured logs (`MirrorDomainId`, `QuotaScope`, `RemoteIp`) and either raise limits or throttle abusive clients.
|
||||
3. **Bundle signature health** – metric `excitor_mirror_bundle_signature_verified_total` should match download counts when signing enabled. Deltas indicate missing `.jws` files; rebuild the bundle via export job or copy artefacts from the authority mirror cache.
|
||||
4. **HTTP errors** – dashboards should track 4xx/5xx rates split by route; repeated `503` statuses imply misconfigured exports. Check `mirror/index` logs for `status=misconfigured`.
|
||||
|
||||
### Incident steps
|
||||
1. Use `GET /excititor/mirror/domains/{id}/index` to capture current manifests. Attach the response to the incident log for reproducibility.
|
||||
2. For quota incidents, temporarily raise `maxIndexRequestsPerHour`/`maxDownloadRequestsPerHour` via the `Excititor:Mirror:Domains` config override, redeploy, then work with the consuming team on caching.
|
||||
3. For stale exports, trigger the export job (`Excititor.ExportRunner`) and confirm the artefacts are written to `outputRoot/<domain>`.
|
||||
4. Validate DSSE artefacts by running `cosign verify-blob --certificate-rekor-url=<rekor> --bundle <domain>/bundle.json --signature <domain>/bundle.json.jws`.
|
||||
|
||||
### Logging fields (structured)
|
||||
| Field | Description |
|
||||
| --- | --- |
|
||||
| `MirrorDomainId` | Domain handling the request (matches `id` in config). |
|
||||
| `QuotaScope` | `index` / `download`, useful when alerting on quota events. |
|
||||
| `ExportKey` | Included in download logs to pinpoint misconfigured exports. |
|
||||
| `BundleDigest` | SHA-256 of the artefact; compare with index payload when debugging corruption. |
|
||||
|
||||
### OTEL signals
|
||||
- **Counters:** `excitor.mirror.requests`, `excitor.mirror.quota_blocked`, `excitor.mirror.signature.failures`.
|
||||
- **Histograms:** `excitor.mirror.download.duration`, `excitor.mirror.export.latency`.
|
||||
- **Spans:** `mirror.index`, `mirror.download` include attributes `mirror.domain`, `mirror.export.key`, and `mirror.quota.remaining`.
|
||||
|
||||
Add these instruments via the `MirrorEndpoints` middleware; see `StellaOps.Excititor.WebService/Telemetry/MirrorMetrics.cs`.
|
||||
|
||||
|
||||
@@ -110,6 +110,10 @@ Import script calls `PutManifest` for each manifest, verifying digests. This ena
|
||||
|
||||
Scanner.Worker serialises EntryTrace graphs into Surface.FS using `SurfaceCacheKey(namespace: "entrytrace.graph", tenant, sha256(options|env|entrypoint))`. At runtime the worker checks the cache before invoking analyzers; cache hits bypass parsing and feed the result store/attestor pipeline directly. The same namespace is consumed by WebService and CLI to retrieve cached graphs for reporting.
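A sketch of that cache-key derivation; only the sha256-over-`options|env|entrypoint` shape comes from the text above, while the delimiter and key layout are assumptions for illustration:

```python
import hashlib

def surface_cache_key(namespace: str, tenant: str,
                      options: str, env: str, entrypoint: str) -> str:
    """Compose a deterministic Surface.FS cache key:
    namespace + tenant scope + sha256(options|env|entrypoint)."""
    digest = hashlib.sha256(
        f"{options}|{env}|{entrypoint}".encode("utf-8")
    ).hexdigest()
    return f"{namespace}/{tenant}/{digest}"
```

Because the digest covers the full option/env/entrypoint tuple, any change to those inputs produces a distinct key, which is what lets WebService and CLI consumers safely reuse cached graphs.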
|
||||
|
||||
### 6.2 BuildX generator path
|
||||
|
||||
`StellaOps.Scanner.Sbomer.BuildXPlugin` reuses the same CAS layout via the `--surface-*` descriptor flags (or `STELLAOPS_SURFACE_*` env vars). When layer fragment JSON, EntryTrace graph JSON, or NDJSON files are supplied, the plug-in writes them under `scanner/surface/**` within the configured CAS root and emits a manifest pointer so Scanner.WebService can pick up the artefacts without re-scanning. The Surface manifest JSON can also be copied to an arbitrary path via `--surface-manifest-output` for CI artefacts/offline kits.
|
||||
|
||||
## 7. Security & Tenancy
|
||||
|
||||
- Tenant ID is mandatory; Surface.Validation enforces match with Authority token.
|
||||
@@ -119,8 +123,12 @@ Scanner.Worker serialises EntryTrace graphs into Surface.FS using `SurfaceCacheK
|
||||
|
||||
## 8. Observability
|
||||
|
||||
- Logs include manifest SHA, tenant, kind, and cache namespace; payload paths are truncated for brevity.
|
||||
- Prometheus metrics (emitted by Scanner.Worker) now include:
|
||||
- `scanner_worker_surface_manifests_published_total`, `scanner_worker_surface_manifests_failed_total`, `scanner_worker_surface_manifests_skipped_total` with labels `{queue, job_kind, surface_result, reason?, surface_payload_count}`.
|
||||
- `scanner_worker_surface_payload_persisted_total` with `{surface_kind}` to track cache churn (`entrytrace.graph`, `entrytrace.ndjson`, `layer.fragments`, …).
|
||||
- `scanner_worker_surface_manifest_publish_duration_ms` histogram for end-to-end persistence latency.
|
||||
- Grafana dashboard JSON: `docs/modules/scanner/operations/surface-worker-grafana-dashboard.json` (panels for publish outcomes, latency, per-kind cache rate, and failure reasons). Import alongside the analyzer dashboard and point it to the Scanner Prometheus datasource.
|
||||
- Tracing spans: `surface.fs.put`, `surface.fs.get`, `surface.fs.cache`.
|
||||
|
||||
## 9. Testing Strategy
|
||||
|
||||
@@ -0,0 +1,177 @@
|
||||
{
|
||||
"title": "StellaOps Scanner Surface Worker",
|
||||
"uid": "scanner-surface-worker",
|
||||
"schemaVersion": 38,
|
||||
"version": 1,
|
||||
"editable": true,
|
||||
"timezone": "",
|
||||
"graphTooltip": 0,
|
||||
"time": {
|
||||
"from": "now-24h",
|
||||
"to": "now"
|
||||
},
|
||||
"templating": {
|
||||
"list": [
|
||||
{
|
||||
"name": "datasource",
|
||||
"type": "datasource",
|
||||
"query": "prometheus",
|
||||
"refresh": 1,
|
||||
"hide": 0,
|
||||
"current": {}
|
||||
}
|
||||
]
|
||||
},
|
||||
"annotations": {
|
||||
"list": []
|
||||
},
|
||||
"panels": [
|
||||
{
|
||||
"id": 1,
|
||||
"type": "timeseries",
|
||||
"title": "Surface Manifest Outcomes (5m rate)",
|
||||
"datasource": {
|
||||
"type": "prometheus",
|
||||
"uid": "${datasource}"
|
||||
},
|
||||
"fieldConfig": {
|
||||
"defaults": {
|
||||
"displayName": "{{__series.name}}",
|
||||
"unit": "ops"
|
||||
},
|
||||
"overrides": []
|
||||
},
|
||||
"options": {
|
||||
"legend": {
|
||||
"displayMode": "table",
|
||||
"placement": "bottom"
|
||||
},
|
||||
"tooltip": {
|
||||
"mode": "multi",
|
||||
"sort": "none"
|
||||
}
|
||||
},
|
||||
"targets": [
|
||||
{
|
||||
"expr": "sum(rate(scanner_worker_surface_manifests_published_total[5m]))",
|
||||
"legendFormat": "published",
|
||||
"refId": "A"
|
||||
},
|
||||
{
|
||||
"expr": "sum(rate(scanner_worker_surface_manifests_failed_total[5m]))",
|
||||
"legendFormat": "failed",
|
||||
"refId": "B"
|
||||
},
|
||||
{
|
||||
"expr": "sum(rate(scanner_worker_surface_manifests_skipped_total[5m]))",
|
||||
"legendFormat": "skipped",
|
||||
"refId": "C"
|
||||
}
|
||||
]
|
||||
},
|
    {
      "id": 2,
      "type": "timeseries",
      "title": "Surface Manifest Publish Duration (ms)",
      "datasource": {
        "type": "prometheus",
        "uid": "${datasource}"
      },
      "fieldConfig": {
        "defaults": {
          "displayName": "{{__series.name}}",
          "unit": "ms"
        },
        "overrides": []
      },
      "options": {
        "legend": {
          "displayMode": "table",
          "placement": "bottom"
        },
        "tooltip": {
          "mode": "single",
          "sort": "none"
        }
      },
      "targets": [
        {
          "expr": "histogram_quantile(0.95, sum by (le) (rate(scanner_worker_surface_manifest_publish_duration_ms_bucket[5m])))",
          "legendFormat": "p95",
          "refId": "A"
        },
        {
          "expr": "sum(rate(scanner_worker_surface_manifest_publish_duration_ms_sum[5m])) / sum(rate(scanner_worker_surface_manifest_publish_duration_ms_count[5m]))",
          "legendFormat": "avg",
          "refId": "B"
        }
      ]
    },
    {
      "id": 3,
      "type": "timeseries",
      "title": "Surface Payload Cached by Kind (5m rate)",
      "datasource": {
        "type": "prometheus",
        "uid": "${datasource}"
      },
      "fieldConfig": {
        "defaults": {
          "displayName": "{{surface_kind}}",
          "unit": "ops"
        },
        "overrides": []
      },
      "options": {
        "legend": {
          "displayMode": "table",
          "placement": "bottom"
        },
        "tooltip": {
          "mode": "multi",
          "sort": "none"
        }
      },
      "targets": [
        {
          "expr": "sum by (surface_kind) (rate(scanner_worker_surface_payload_persisted_total[5m]))",
          "legendFormat": "{{surface_kind}}",
          "refId": "A"
        }
      ]
    },
    {
      "id": 4,
      "type": "timeseries",
      "title": "Surface Manifest Failures by Reason (5m rate)",
      "datasource": {
        "type": "prometheus",
        "uid": "${datasource}"
      },
      "fieldConfig": {
        "defaults": {
          "displayName": "{{reason}}",
          "unit": "ops"
        },
        "overrides": []
      },
      "options": {
        "legend": {
          "displayMode": "table",
          "placement": "bottom"
        },
        "tooltip": {
          "mode": "multi",
          "sort": "none"
        }
      },
      "targets": [
        {
          "expr": "sum by (reason) (rate(scanner_worker_surface_manifests_failed_total[5m]))",
          "legendFormat": "{{reason}}",
          "refId": "A"
        }
      ]
    }
  ]
}
docs/modules/taskrunner/migrations/pack-run-collections.md (new file)
@@ -0,0 +1,99 @@
# Task Runner Collections — Initial Migration

Last updated: 2025-11-06

This migration seeds the MongoDB collections that back the Task Runner service. It is implemented as `20251106-task-runner-baseline.mongosh` under the platform migration runner and must be applied **before** the Task Runner service is enabled in any environment.

## Collections

### `pack_runs`

| Field            | Type            | Notes                                                     |
|------------------|-----------------|-----------------------------------------------------------|
| `_id`            | `string`        | Run identifier (same as `runId`).                         |
| `planHash`       | `string`        | Deterministic hash produced by the planner.               |
| `plan`           | `object`        | Full `TaskPackPlan` payload used to execute the run.      |
| `failurePolicy`  | `object`        | Retry/backoff directives resolved at plan time.           |
| `requestedAt`    | `date`          | Timestamp when the client requested the run.              |
| `createdAt`      | `date`          | Timestamp when the run was persisted.                     |
| `updatedAt`      | `date`          | Timestamp of the last mutation.                           |
| `steps`          | `array<object>` | Flattened step records (`stepId`, `status`, attempts…).   |
| `tenantId`       | `string`        | Optional multi-tenant scope (reserved for future phases). |

**Indexes**

1. `{ _id: 1 }` — implicit primary key / uniqueness guarantee.
2. `{ updatedAt: -1 }` — serves `GET /runs` listings and staleness checks.
3. `{ tenantId: 1, updatedAt: -1 }` — activated once tenancy is enforced; remains sparse until then.
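The secondary indexes can be restated in plain JavaScript to make the intent explicit; the index objects below mirror the list above, while `sortForListing` and the sample runs are purely illustrative (the real migration is `20251106-task-runner-baseline.mongosh`).

```javascript
// Secondary indexes for pack_runs, restated from the list above.
// In mongosh: packRunsIndexes.forEach(ix => db.pack_runs.createIndex(ix.key, ix.options));
const packRunsIndexes = [
  { key: { updatedAt: -1 }, options: {} },                           // GET /runs listings
  { key: { tenantId: 1, updatedAt: -1 }, options: { sparse: true } } // sparse until tenancy lands
];

// The { updatedAt: -1 } index serves newest-first listings like this one:
function sortForListing(runs) {
  return [...runs].sort((a, b) => b.updatedAt - a.updatedAt);
}

const listing = sortForListing([
  { _id: "run-a", updatedAt: 10 },
  { _id: "run-b", updatedAt: 30 },
  { _id: "run-c", updatedAt: 20 },
]);
```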

### `pack_run_logs`

| Field         | Type            | Notes                                                    |
|---------------|-----------------|----------------------------------------------------------|
| `_id`         | `ObjectId`      | Generated per log entry.                                 |
| `runId`       | `string`        | Foreign key to `pack_runs._id`.                          |
| `sequence`    | `long`          | Monotonic counter assigned by the writer.                |
| `timestamp`   | `date`          | UTC timestamp of the log event.                          |
| `level`       | `string`        | `trace`, `debug`, `info`, `warn`, `error`.               |
| `eventType`   | `string`        | Machine-friendly event identifier (e.g. `step.started`). |
| `message`     | `string`        | Human-readable summary.                                  |
| `stepId`      | `string`        | Optional step identifier.                                |
| `metadata`    | `object`        | Deterministic key/value payload (string-only values).    |

**Indexes**

1. `{ runId: 1, sequence: 1 }` (unique) — guarantees ordered retrieval and enforces idempotence.
2. `{ runId: 1, timestamp: 1 }` — accelerates replay and time-window queries.
3. `{ timestamp: 1 }` — optional TTL (disabled by default) for retention policies.
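The unique `{ runId: 1, sequence: 1 }` index is what makes retried log appends safe. A minimal in-memory sketch of that idempotence guarantee (the function and names are illustrative, not the actual writer):

```javascript
// Simulates the unique (runId, sequence) constraint: re-appending the same
// pair is rejected, mirroring MongoDB's duplicate-key behaviour on retry.
const seen = new Set();
const entries = [];

function appendLog(entry) {
  const key = `${entry.runId}:${entry.sequence}`;
  if (seen.has(key)) {
    return false; // would surface as an E11000 duplicate-key error in MongoDB
  }
  seen.add(key);
  entries.push(entry);
  return true;
}

const first = appendLog({ runId: "r1", sequence: 1, eventType: "step.started" });
const retry = appendLog({ runId: "r1", sequence: 1, eventType: "step.started" }); // retried write
```

Because the retried write is a no-op, a worker can safely replay its log buffer after a crash without duplicating entries.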

### `pack_artifacts`

| Field        | Type       | Notes                                                       |
|--------------|------------|-------------------------------------------------------------|
| `_id`        | `ObjectId` | Generated per artifact record.                              |
| `runId`      | `string`   | Foreign key to `pack_runs._id`.                             |
| `name`       | `string`   | Output name from the Task Pack manifest.                    |
| `type`       | `string`   | `file`, `object`, or other future evidence categories.      |
| `sourcePath` | `string`   | Local path captured during execution (nullable).            |
| `storedPath` | `string`   | Object store path or bundle-relative URI (nullable).        |
| `status`     | `string`   | `pending`, `copied`, `materialized`, `skipped`.             |
| `notes`      | `string`   | Free-form notes (deterministic messages only).              |
| `capturedAt` | `date`     | UTC timestamp recorded by the worker.                       |

**Indexes**

1. `{ runId: 1, name: 1 }` (unique) — ensures a run emits at most one record per output.
2. `{ runId: 1 }` — supports artifact listing alongside run inspection.

## Execution Order

1. Create collections with `validator` envelopes mirroring the field expectations above (if MongoDB schema validation is enabled in the environment).
2. Apply the indexes in the order listed — unique indexes first to surface data issues early.
3. Backfill existing filesystem-backed runs by importing the serialized state/log/artifact manifests into the new collections. A dedicated importer script (`tools/taskrunner/import-filesystem-state.ps1`) accompanies the migration.
4. Switch the Task Runner service configuration to point at the Mongo-backed stores (`TaskRunner:Storage:Mode = "Mongo"`), then redeploy workers and web service.

## Rollback

To revert, switch the Task Runner configuration back to the filesystem provider and stop the Mongo migration runner. Collections can remain in place; they are append-only and harmless when unused.

## Configuration Reference

Enable the Mongo-backed stores by updating the worker and web service configuration (Compose/Helm values or `appsettings*.json`):

```json
"TaskRunner": {
  "Storage": {
    "Mode": "mongo",
    "Mongo": {
      "ConnectionString": "mongodb://127.0.0.1:27017/taskrunner",
      "Database": "taskrunner",
      "RunsCollection": "pack_runs",
      "LogsCollection": "pack_run_logs",
      "ArtifactsCollection": "pack_artifacts",
      "ApprovalsCollection": "pack_run_approvals"
    }
  }
}
```

The worker uses the mirrored structure under the `Worker` section. Omit the `Database` property to fall back to the name embedded in the connection string.
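The `Database` fallback described above can be sketched as follows; `resolveDatabase` is an illustrative helper, not the actual option binder, and assumes the connection string carries the database as its path segment.

```javascript
// Resolve the database name: prefer an explicit "Database" property, else
// fall back to the path segment embedded in the connection string.
function resolveDatabase(mongoConfig) {
  if (mongoConfig.Database) {
    return mongoConfig.Database;
  }
  const url = new URL(mongoConfig.ConnectionString);
  return url.pathname.replace(/^\//, "");
}

const explicit = resolveDatabase({
  ConnectionString: "mongodb://127.0.0.1:27017/other",
  Database: "taskrunner",
});
const fromUri = resolveDatabase({
  ConnectionString: "mongodb://127.0.0.1:27017/taskrunner",
});
```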
docs/security/assistant-guardrails.md (new file)
@@ -0,0 +1,73 @@
# Advisory AI Guardrails & Redaction Policy

> **Audience:** Advisory AI guild, Security guild, Docs guild, operators consuming Advisory AI outputs.
> **Scope:** Prompt redaction rules, injection defenses, telemetry/alert wiring, and audit guidance for Advisory AI (Epic 8).

Advisory AI accepts structured evidence from Concelier/Excititor and assembles prompts before executing downstream inference. Guardrails enforce provenance, block injection attempts, and redact sensitive content before the data reaches any inference provider (online or offline). This document enumerates the guardrail surface and explains how to observe, alert on, and audit it.

---

## 1 · Input validation & injection defense

Advisory prompts are rejected when any of the following checks fail:

1. **Citation coverage** – every prompt must carry at least one citation with an index, document id, and chunk id. Missing or malformed citations raise the `citation_missing` / `citation_invalid` violations.
2. **Prompt length** – `AdvisoryGuardrailOptions.MaxPromptLength` defaults to 16,000 characters. Longer payloads raise `prompt_too_long`.
3. **Blocked phrases** – the guardrail pipeline lowercases the prompt and matches it against the blocked-phrase cache (`ignore previous instructions`, `disregard earlier instructions`, `you are now the system`, `override the system prompt`, `please jailbreak`). Each hit raises `prompt_injection` and increments the `blocked_phrase_count` metadata.
4. **Optional per-profile rules** – additional phrases supplied via configuration are appended to the cache at startup and evaluated with the same logic.

Any validation failure stops the pipeline before inference and emits `guardrail_blocked = true` in the persisted output as well as the corresponding metric counter.
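A minimal sketch of the four checks, using the phrase list, default limit, and violation codes from the text; `validatePrompt` itself is illustrative, not the actual guardrail API.

```javascript
// Blocked-phrase cache from the policy above (matched against the lowercased prompt).
const BLOCKED_PHRASES = [
  "ignore previous instructions",
  "disregard earlier instructions",
  "you are now the system",
  "override the system prompt",
  "please jailbreak",
];

function validatePrompt(prompt, citations, maxPromptLength = 16000) {
  const violations = [];

  // 1. Citation coverage: at least one citation with index, document id, chunk id.
  if (citations.length === 0) {
    violations.push("citation_missing");
  }
  for (const c of citations) {
    if (c.index == null || !c.documentId || !c.chunkId) {
      violations.push("citation_invalid");
      break;
    }
  }

  // 2. Prompt length.
  if (prompt.length > maxPromptLength) {
    violations.push("prompt_too_long");
  }

  // 3. Blocked phrases (case-insensitive substring search).
  const lowered = prompt.toLowerCase();
  const blockedPhraseCount = BLOCKED_PHRASES.filter(p => lowered.includes(p)).length;
  if (blockedPhraseCount > 0) {
    violations.push("prompt_injection");
  }

  return { blocked: violations.length > 0, violations, blockedPhraseCount };
}
```

Per-profile phrases (check 4) would simply be concatenated onto `BLOCKED_PHRASES` at startup and run through the same loop.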

## 2 · Redaction rules

Redactions are deterministic so caches remain stable. The current rule set (applied in order) is:

| Rule | Regex | Replacement |
|------|-------|-------------|
| AWS secret access keys | `(?i)(aws_secret_access_key\s*[:=]\s*)([A-Za-z0-9/+=]{40,})` | `$1[REDACTED_AWS_SECRET]` |
| Credentials/tokens | `(?i)(token|apikey|password)\s*[:=]\s*([A-Za-z0-9\-_/]{16,})` | `$1: [REDACTED_CREDENTIAL]` |
| PEM private keys | `(?is)-----BEGIN [^-]+ PRIVATE KEY-----.*?-----END [^-]+ PRIVATE KEY-----` | `[REDACTED_PRIVATE_KEY]` |

Redaction counts are surfaced via `guardrailResult.Metadata["redaction_count"]` and emitted as log fields to simplify threat hunting.
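The three rules can be applied in order as below; the regexes are translated from the .NET patterns in the table (`(?i)` becomes the `i` flag, `(?is)` becomes `i` plus `[\s\S]` for dot-matches-newline), and `redact` is a sketch rather than the production implementation.

```javascript
// Ordered redaction rules from the table above: [pattern, replacement].
const RULES = [
  [/(aws_secret_access_key\s*[:=]\s*)[A-Za-z0-9\/+=]{40,}/gi, "$1[REDACTED_AWS_SECRET]"],
  [/(token|apikey|password)\s*[:=]\s*[A-Za-z0-9\-_\/]{16,}/gi, "$1: [REDACTED_CREDENTIAL]"],
  [/-----BEGIN [^-]+ PRIVATE KEY-----[\s\S]*?-----END [^-]+ PRIVATE KEY-----/gi, "[REDACTED_PRIVATE_KEY]"],
];

// Applies each rule in order and tallies matches, mirroring the
// redaction_count metadata field described in the text.
function redact(text) {
  let redactionCount = 0;
  for (const [pattern, replacement] of RULES) {
    const matches = text.match(pattern);
    redactionCount += matches ? matches.length : 0;
    text = text.replace(pattern, replacement);
  }
  return { text, redactionCount };
}
```

Running the rules in a fixed order on the raw prompt keeps the output byte-stable, which is what lets downstream caches key on the sanitized prompt.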

## 3 · Telemetry, logs, and traces

Advisory AI now exposes the following metrics (all tagged with `task_type` and, where applicable, cache/citation metadata):

| Metric | Type | Description |
|--------|------|-------------|
| `advisory_ai_latency_seconds` | Histogram | End-to-end worker latency from dequeue through persisted output. Aggregated with `plan_cache_hit` to compare cached vs. regenerated plans. |
| `advisory_ai_guardrail_blocks_total` | Counter | Number of guardrail rejections per task. |
| `advisory_ai_validation_failures_total` | Counter | Total validation violations emitted by the guardrail pipeline (one increment per violation instance). |
| `advisory_ai_citation_coverage_ratio` | Histogram | Ratio of unique citations to structured chunks (0–1). Tags include `citations` and `structured_chunks`. |
| `advisory_plans_created/queued/processed` | Counters | Existing plan lifecycle metrics (unchanged, but now tagged by task type). |

### Logging

- Successful writes: the `Stored advisory pipeline output {CacheKey}` log line now includes `guardrail_blocked`, `validation_failures`, and `citation_coverage`.
- Guardrail rejection: the warning log includes the violation count and advisory key.
- All dequeued jobs emit info logs carrying `cache:{Cache}` for quicker diagnosis.

### Tracing

- The WebService (`/v1/advisory-ai/pipeline*`) emits `advisory_ai.plan_request` / `plan_batch` spans with tags for tenant, advisory key, cache key, and validation state.
- The worker emits `advisory_ai.process` spans for each queue item with latency measurement and cache-hit tags.

## 4 · Dashboards & alerts

Update the “Advisory AI” Grafana board with the new metrics:

1. **Latency panel** – plot `advisory_ai_latency_seconds` p50/p95 split by `plan_cache_hit`. Alert when p95 > 30s for 5 minutes.
2. **Guardrail burn rate** – `advisory_ai_guardrail_blocks_total` vs. `advisory_ai_validation_failures_total`. Alert when either exceeds 5 blocks/min or 1% of total traffic.
3. **Citation coverage** – histogram heatmap of `advisory_ai_citation_coverage_ratio` to identify evidence gaps (alert when < 0.6 for more than 10 minutes).

All alerts should route to `#advisory-ai-ops` with the tenant, task type, and recent advisory keys in the message template.

## 5 · Operations & audit

- **When an alert fires:** capture the guardrail log entry, the relevant metrics sample, and the cached plan from the worker output store. Attach them to the incident timeline entry.
- **Tenant overrides:** any request to loosen guardrails or blocked-phrase lists requires a signed change request and security approval. Update `AdvisoryGuardrailOptions` via configuration bundles and document the reason in the change log.
- **Offline kit checks:** ensure the offline inference bundle uses the same guardrail configuration file as production; mismatches should fail the bundle validation step.
- **Forensics:** persisted outputs now contain `guardrail_blocked`, `plan_cache_hit`, and `citation_coverage` metadata. Include these fields when exporting evidence bundles to prove guardrail enforcement.

Keep this document synced whenever guardrail rules, telemetry names, or alert targets change.
docs/updates/2025-11-07-concelier-advisory-chunks.md (new file)
@@ -0,0 +1,12 @@
# 2025-11-07 – Concelier advisory chunks API

**Subject:** Paragraph-anchored advisory chunks land for Advisory AI
**Audience:** Concelier WebService Guild, Advisory AI Guild, Observability Guild

- Shipped `/advisories/{advisoryKey}/chunks` with tenant enforcement, `AdvisoryRead` scopes, and filters for sections/formats/limits/`minLength` so Advisory AI can pull paragraph anchors plus source metadata deterministically.
- Registered `AdvisoryChunkBuilder` behind the new `advisoryChunks` configuration (chunk/observation/min-length caps) to keep offline and air-gapped deployments tunable without code changes.
- Added regression coverage that seeds synthetic observations with embedded paragraphs to validate anchors, metadata, and source ordering before Advisory AI consumes the API; the module README now points at this release note.

**Follow-ups**
- [ ] CONCELIER-AIAI-31-002 – surface structured workaround/fix fields plus caching for downstream retrievers.
- [ ] CONCELIER-AIAI-31-003 – wire chunk request metrics/logs and guardrail telemetry once the API stabilizes.
local-nuget/Microsoft.IdentityModel.Tokens.7.0.3.nupkg (new binary file, not shown)
scripts/buildx/buildx-surface-run.sh (new file)
@@ -0,0 +1,51 @@
#!/usr/bin/env bash
set -euo pipefail

if [[ $# -lt 2 ]]; then
  echo "Usage: $0 <docker-context> <image-tag> [additional docker buildx args...]" >&2
  exit 64
fi

context=$1
shift
image_tag=$1
shift || true

builder_args=("$@")

: "${STELLAOPS_BUILDX_PUBLISH_DIR:=out/buildx}"
: "${STELLAOPS_BUILDX_DLL:=${STELLAOPS_BUILDX_PUBLISH_DIR}/StellaOps.Scanner.Sbomer.BuildXPlugin.dll}"
: "${STELLAOPS_BUILDX_MANIFEST_DIR:=${STELLAOPS_BUILDX_PUBLISH_DIR}}"
: "${STELLAOPS_BUILDX_CAS_ROOT:=out/cas}"
: "${STELLAOPS_SURFACE_TENANT:=default}"
: "${STELLAOPS_SURFACE_CACHE_ROOT:=${STELLAOPS_BUILDX_CAS_ROOT}}"
: "${STELLAOPS_SURFACE_MANIFEST_OUTPUT:=out/surface-manifest.json}"
: "${STELLAOPS_SBOM_PATH:=out/buildx-sbom.cdx.json}"
: "${STELLAOPS_SBOM_FORMAT:=cyclonedx-json}"
: "${STELLAOPS_SBOM_MEDIA_TYPE:=application/vnd.cyclonedx+json}"
: "${STELLAOPS_SBOM_KIND:=inventory}"
: "${STELLAOPS_SBOM_ARTIFACT_TYPE:=application/vnd.stellaops.sbom.layer+json}"
: "${STELLAOPS_SUBJECT_MEDIA_TYPE:=application/vnd.oci.image.manifest.v1+json}"
: "${STELLAOPS_PREDICATE_TYPE:=https://slsa.dev/provenance/v1}"

mkdir -p "$STELLAOPS_BUILDX_PUBLISH_DIR" "$STELLAOPS_BUILDX_CAS_ROOT" "$(dirname "$STELLAOPS_SBOM_PATH")" "$(dirname "$STELLAOPS_SURFACE_MANIFEST_OUTPUT")"

if [[ ! -s "$STELLAOPS_BUILDX_DLL" ]]; then
  echo "Publishing BuildX plug-in to $STELLAOPS_BUILDX_PUBLISH_DIR" >&2
  dotnet publish src/Scanner/StellaOps.Scanner.Sbomer.BuildXPlugin/StellaOps.Scanner.Sbomer.BuildXPlugin.csproj \
    -c Release \
    -o "$STELLAOPS_BUILDX_PUBLISH_DIR"
fi

if [[ ! -d "$STELLAOPS_BUILDX_MANIFEST_DIR" ]]; then
  echo "Manifest directory missing" >&2
  exit 65
fi

if ! command -v docker >/dev/null 2>&1; then
  echo "docker CLI is not available in this environment" >&2
  exit 69
fi

set -x
format={{index
@@ -13,6 +13,8 @@ public sealed class AdvisoryAiServiceOptions

    public AdvisoryAiQueueOptions Queue { get; set; } = new();

    public AdvisoryAiStorageOptions Storage { get; set; } = new();

    internal string ResolveQueueDirectory(string contentRoot)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(contentRoot);
@@ -31,9 +33,45 @@ public sealed class AdvisoryAiServiceOptions
        Directory.CreateDirectory(path);
        return path;
    }

    internal string ResolvePlanCacheDirectory(string contentRoot)
        => Storage.ResolvePlanCacheDirectory(contentRoot);

    internal string ResolveOutputDirectory(string contentRoot)
        => Storage.ResolveOutputDirectory(contentRoot);
}

public sealed class AdvisoryAiQueueOptions
{
    public string DirectoryPath { get; set; } = Path.Combine("data", "advisory-ai", "queue");
}

public sealed class AdvisoryAiStorageOptions
{
    public string PlanCacheDirectory { get; set; } = Path.Combine("data", "advisory-ai", "plans");

    public string OutputDirectory { get; set; } = Path.Combine("data", "advisory-ai", "outputs");

    internal string ResolvePlanCacheDirectory(string contentRoot)
        => Resolve(contentRoot, PlanCacheDirectory, Path.Combine("data", "advisory-ai", "plans"));

    internal string ResolveOutputDirectory(string contentRoot)
        => Resolve(contentRoot, OutputDirectory, Path.Combine("data", "advisory-ai", "outputs"));

    private static string Resolve(string contentRoot, string configuredPath, string fallback)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(contentRoot);

        var path = string.IsNullOrWhiteSpace(configuredPath)
            ? fallback
            : configuredPath;

        if (!Path.IsPathFullyQualified(path))
        {
            path = Path.GetFullPath(Path.Combine(contentRoot, path));
        }

        Directory.CreateDirectory(path);
        return path;
    }
}

@@ -41,6 +41,17 @@ internal static class AdvisoryAiServiceOptionsValidator
        options.Queue.DirectoryPath = Path.Combine("data", "advisory-ai", "queue");
    }

    options.Storage ??= new AdvisoryAiStorageOptions();
    if (string.IsNullOrWhiteSpace(options.Storage.PlanCacheDirectory))
    {
        options.Storage.PlanCacheDirectory = Path.Combine("data", "advisory-ai", "plans");
    }

    if (string.IsNullOrWhiteSpace(options.Storage.OutputDirectory))
    {
        options.Storage.OutputDirectory = Path.Combine("data", "advisory-ai", "outputs");
    }

    error = null;
    return true;
}
@@ -0,0 +1,177 @@
using System.Collections.Immutable;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.AdvisoryAI.Guardrails;
using StellaOps.AdvisoryAI.Outputs;
using StellaOps.AdvisoryAI.Orchestration;
using StellaOps.AdvisoryAI.Prompting;

namespace StellaOps.AdvisoryAI.Hosting;

internal sealed class FileSystemAdvisoryOutputStore : IAdvisoryOutputStore
{
    private readonly string _rootDirectory;
    private readonly JsonSerializerOptions _serializerOptions = new(JsonSerializerDefaults.Web);
    private readonly ILogger<FileSystemAdvisoryOutputStore> _logger;

    public FileSystemAdvisoryOutputStore(
        IOptions<AdvisoryAiServiceOptions> serviceOptions,
        ILogger<FileSystemAdvisoryOutputStore> logger)
    {
        ArgumentNullException.ThrowIfNull(serviceOptions);
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));

        var options = serviceOptions.Value ?? throw new InvalidOperationException("Advisory AI options are required.");
        AdvisoryAiServiceOptionsValidator.Validate(options);
        _rootDirectory = options.ResolveOutputDirectory(AppContext.BaseDirectory);
        Directory.CreateDirectory(_rootDirectory);
    }

    public async Task SaveAsync(AdvisoryPipelineOutput output, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(output);

        var path = GetOutputPath(output.CacheKey, output.TaskType, output.Profile);
        Directory.CreateDirectory(Path.GetDirectoryName(path)!);

        var envelope = OutputEnvelope.FromOutput(output);
        var tmpPath = $"{path}.tmp";

        await using (var stream = new FileStream(tmpPath, FileMode.Create, FileAccess.Write, FileShare.None))
        {
            await JsonSerializer.SerializeAsync(stream, envelope, _serializerOptions, cancellationToken)
                .ConfigureAwait(false);
        }

        File.Move(tmpPath, path, overwrite: true);
    }

    public async Task<AdvisoryPipelineOutput?> TryGetAsync(string cacheKey, AdvisoryTaskType taskType, string profile, CancellationToken cancellationToken)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(cacheKey);
        ArgumentException.ThrowIfNullOrWhiteSpace(profile);

        var path = GetOutputPath(cacheKey, taskType, profile);
        if (!File.Exists(path))
        {
            return null;
        }

        try
        {
            await using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
            var envelope = await JsonSerializer
                .DeserializeAsync<OutputEnvelope>(stream, _serializerOptions, cancellationToken)
                .ConfigureAwait(false);

            return envelope?.ToOutput();
        }
        catch (Exception ex) when (ex is IOException or JsonException)
        {
            _logger.LogWarning(ex, "Failed to read advisory output file {Path}", path);
            return null;
        }
    }

    private string GetOutputPath(string cacheKey, AdvisoryTaskType taskType, string profile)
    {
        var safeKey = Sanitize(cacheKey);
        var safeProfile = Sanitize(profile);
        var taskDirectory = Path.Combine(_rootDirectory, taskType.ToString().ToLowerInvariant(), safeProfile);
        return Path.Combine(taskDirectory, $"{safeKey}.json");
    }

    private static string Sanitize(string value)
    {
        var invalid = Path.GetInvalidFileNameChars();
        var buffer = new char[value.Length];
        var length = 0;

        foreach (var ch in value)
        {
            buffer[length++] = invalid.Contains(ch) ? '_' : ch;
        }

        return new string(buffer, 0, length);
    }

    private sealed record OutputEnvelope(
        string CacheKey,
        AdvisoryTaskType TaskType,
        string Profile,
        string Prompt,
        List<AdvisoryPromptCitation> Citations,
        Dictionary<string, string> Metadata,
        GuardrailEnvelope Guardrail,
        ProvenanceEnvelope Provenance,
        DateTimeOffset GeneratedAtUtc,
        bool PlanFromCache)
    {
        public static OutputEnvelope FromOutput(AdvisoryPipelineOutput output)
            => new(
                output.CacheKey,
                output.TaskType,
                output.Profile,
                output.Prompt,
                output.Citations.ToList(),
                output.Metadata.ToDictionary(static pair => pair.Key, static pair => pair.Value, StringComparer.Ordinal),
                GuardrailEnvelope.FromResult(output.Guardrail),
                ProvenanceEnvelope.FromProvenance(output.Provenance),
                output.GeneratedAtUtc,
                output.PlanFromCache);

        public AdvisoryPipelineOutput ToOutput()
        {
            var guardrail = Guardrail.ToResult();
            var citations = Citations.ToImmutableArray();
            var metadata = Metadata.ToImmutableDictionary(StringComparer.Ordinal);

            return new AdvisoryPipelineOutput(
                CacheKey,
                TaskType,
                Profile,
                Prompt,
                citations,
                metadata,
                guardrail,
                Provenance.ToProvenance(),
                GeneratedAtUtc,
                PlanFromCache);
        }
    }

    private sealed record GuardrailEnvelope(
        bool Blocked,
        string SanitizedPrompt,
        List<AdvisoryGuardrailViolation> Violations,
        Dictionary<string, string> Metadata)
    {
        public static GuardrailEnvelope FromResult(AdvisoryGuardrailResult result)
            => new(
                result.Blocked,
                result.SanitizedPrompt,
                result.Violations.ToList(),
                result.Metadata.ToDictionary(static pair => pair.Key, static pair => pair.Value, StringComparer.Ordinal));

        public AdvisoryGuardrailResult ToResult()
            => Blocked
                ? AdvisoryGuardrailResult.Reject(SanitizedPrompt, Violations, Metadata.ToImmutableDictionary(StringComparer.Ordinal))
                : AdvisoryGuardrailResult.Allowed(SanitizedPrompt, Metadata.ToImmutableDictionary(StringComparer.Ordinal));
    }

    private sealed record ProvenanceEnvelope(
        string InputDigest,
        string OutputHash,
        List<string> Signatures)
    {
        public static ProvenanceEnvelope FromProvenance(AdvisoryDsseProvenance provenance)
            => new(
                provenance.InputDigest,
                provenance.OutputHash,
                provenance.Signatures.ToList());

        public AdvisoryDsseProvenance ToProvenance()
            => new(InputDigest, OutputHash, Signatures.ToImmutableArray());
    }
}
|
||||
@@ -0,0 +1,462 @@
|
||||
using System.Collections.Immutable;
|
||||
using System.Text.Json;
|
||||
using Microsoft.Extensions.Logging;
|
||||
using Microsoft.Extensions.Options;
|
||||
using StellaOps.AdvisoryAI.Abstractions;
|
||||
using StellaOps.AdvisoryAI.Caching;
|
||||
using StellaOps.AdvisoryAI.Context;
|
||||
using StellaOps.AdvisoryAI.Documents;
|
||||
using StellaOps.AdvisoryAI.Orchestration;
|
||||
using StellaOps.AdvisoryAI.Tools;
|
||||
|
||||
namespace StellaOps.AdvisoryAI.Hosting;
|
||||
|
||||
internal sealed class FileSystemAdvisoryPlanCache : IAdvisoryPlanCache
|
||||
{
|
||||
private readonly string _directory;
|
||||
private readonly JsonSerializerOptions _serializerOptions = new(JsonSerializerDefaults.Web);
|
||||
private readonly ILogger<FileSystemAdvisoryPlanCache> _logger;
|
||||
private readonly TimeProvider _timeProvider;
|
||||
private readonly TimeSpan _defaultTtl;
|
||||
private readonly TimeSpan _cleanupInterval;
|
||||
private DateTimeOffset _lastCleanup;
|
||||
|
||||
public FileSystemAdvisoryPlanCache(
|
||||
IOptions<AdvisoryAiServiceOptions> serviceOptions,
|
||||
IOptions<AdvisoryPlanCacheOptions> cacheOptions,
|
||||
ILogger<FileSystemAdvisoryPlanCache> logger,
|
||||
TimeProvider? timeProvider = null)
|
||||
{
|
||||
ArgumentNullException.ThrowIfNull(serviceOptions);
|
||||
ArgumentNullException.ThrowIfNull(cacheOptions);
|
||||
|
||||
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
|
||||
var options = serviceOptions.Value ?? throw new InvalidOperationException("Advisory AI options are required.");
|
||||
AdvisoryAiServiceOptionsValidator.Validate(options);
|
||||
_directory = options.ResolvePlanCacheDirectory(AppContext.BaseDirectory);
|
||||
Directory.CreateDirectory(_directory);
|
||||
|
||||
var cache = cacheOptions.Value ?? throw new InvalidOperationException("Plan cache options are required.");
|
||||
if (cache.DefaultTimeToLive <= TimeSpan.Zero)
|
||||
{
|
||||
cache.DefaultTimeToLive = TimeSpan.FromMinutes(10);
|
||||
}
|
||||
|
||||
if (cache.CleanupInterval <= TimeSpan.Zero)
|
||||
{
|
||||
cache.CleanupInterval = TimeSpan.FromMinutes(5);
|
||||
}
|
||||
|
||||
_defaultTtl = cache.DefaultTimeToLive;
|
||||
_cleanupInterval = cache.CleanupInterval;
|
||||
_timeProvider = timeProvider ?? TimeProvider.System;
|
||||
_lastCleanup = _timeProvider.GetUtcNow();
|
||||
}
|
||||
|
||||
public async Task SetAsync(string cacheKey, AdvisoryTaskPlan plan, CancellationToken cancellationToken)
|
||||
{
|
||||
ArgumentException.ThrowIfNullOrWhiteSpace(cacheKey);
|
||||
ArgumentNullException.ThrowIfNull(plan);
|
||||
|
||||
var now = _timeProvider.GetUtcNow();
|
||||
await CleanupIfRequiredAsync(now, cancellationToken).ConfigureAwait(false);
|
||||
|
||||
var envelope = PlanEnvelope.FromPlan(plan, now + _defaultTtl);
|
||||
var targetPath = GetPlanPath(cacheKey);
|
||||
var tmpPath = $"{targetPath}.tmp";
|
||||
|
||||
await using (var stream = new FileStream(tmpPath, FileMode.Create, FileAccess.Write, FileShare.None))
|
||||
{
|
||||
await JsonSerializer.SerializeAsync(stream, envelope, _serializerOptions, cancellationToken)
|
||||
.ConfigureAwait(false);
|
||||
}
|
||||
|
||||
File.Move(tmpPath, targetPath, overwrite: true);
|
||||
}
|
||||
|
||||
public async Task<AdvisoryTaskPlan?> TryGetAsync(string cacheKey, CancellationToken cancellationToken)
|
||||
{
|
||||
ArgumentException.ThrowIfNullOrWhiteSpace(cacheKey);
|
||||
|
||||
var path = GetPlanPath(cacheKey);
|
||||
if (!File.Exists(path))
|
||||
{
|
||||
return null;
|
||||
}
|
||||
|
||||
try
|
||||
{
|
||||
await using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
|
||||
var envelope = await JsonSerializer
|
||||
.DeserializeAsync<PlanEnvelope>(stream, _serializerOptions, cancellationToken)
|
||||
.ConfigureAwait(false);
|
||||
|
||||
if (envelope is null)
|
||||
{
|
||||
return null;
|
||||
}
|
||||
|
||||
var now = _timeProvider.GetUtcNow();
|
||||
if (envelope.ExpiresAtUtc <= now)
|
||||
{
|
||||
TryDelete(path);
|
||||
return null;
|
||||
}
|
||||
|
||||
return envelope.ToPlan();
|
||||
}
|
||||
catch (Exception ex) when (ex is IOException or JsonException)
|
||||
{
|
||||
_logger.LogWarning(ex, "Failed to read advisory plan cache file {Path}", path);
|
||||
TryDelete(path);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
public Task RemoveAsync(string cacheKey, CancellationToken cancellationToken)
|
||||
{
|
||||
ArgumentException.ThrowIfNullOrWhiteSpace(cacheKey);
|
||||
|
||||
var path = GetPlanPath(cacheKey);
|
||||
TryDelete(path);
|
||||
return Task.CompletedTask;
|
||||
}
|
||||
|
||||
private async Task CleanupIfRequiredAsync(DateTimeOffset now, CancellationToken cancellationToken)
|
||||
{
|
||||
if (now - _lastCleanup < _cleanupInterval)
|
||||
{
|
||||
return;
|
||||
}
|
||||
|
||||
foreach (var file in Directory.EnumerateFiles(_directory, "*.json", SearchOption.TopDirectoryOnly))
|
||||
{
|
||||
try
|
||||
{
|
||||
await using var stream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read);
|
||||
var envelope = await JsonSerializer
|
||||
.DeserializeAsync<PlanEnvelope>(stream, _serializerOptions, cancellationToken)
|
||||
.ConfigureAwait(false);
|
||||
|
||||
if (envelope is null || envelope.ExpiresAtUtc <= now)
|
||||
{
|
||||
TryDelete(file);
|
||||
}
|
||||
}
|
||||
catch (Exception ex) when (ex is IOException or JsonException)
|
||||
{
|
||||
_logger.LogDebug(ex, "Failed to inspect advisory plan cache file {Path}", file);
|
||||
TryDelete(file);
|
||||
}
|
||||
}
|
||||
|
||||
_lastCleanup = now;
|
||||
}
|
||||
|
||||
private string GetPlanPath(string cacheKey)
|
||||
{
|
||||
var safeName = Sanitize(cacheKey);
|
||||
return Path.Combine(_directory, $"{safeName}.json");
|
||||
}
|
||||
|
||||
private static string Sanitize(string value)
|
||||
{
|
||||
var invalid = Path.GetInvalidFileNameChars();
|
||||
var builder = new char[value.Length];
|
||||
var length = 0;
|
||||
|
||||
foreach (var ch in value)
|
||||
{
|
||||
builder[length++] = invalid.Contains(ch) ? '_' : ch;
|
||||
}
|
||||
|
||||
return new string(builder, 0, length);
|
||||
}
|
||||
|
||||
    private void TryDelete(string path)
    {
        try
        {
            File.Delete(path);
        }
        catch (IOException ex)
        {
            _logger.LogDebug(ex, "Failed to delete advisory plan cache file {Path}", path);
        }
    }

    private sealed record PlanEnvelope(
        AdvisoryTaskRequestEnvelope Request,
        string CacheKey,
        string PromptTemplate,
        List<AdvisoryChunkEnvelope> StructuredChunks,
        List<VectorResultEnvelope> VectorResults,
        SbomContextEnvelope? SbomContext,
        DependencyAnalysisEnvelope? DependencyAnalysis,
        AdvisoryTaskBudget Budget,
        Dictionary<string, string> Metadata,
        DateTimeOffset ExpiresAtUtc)
    {
        public static PlanEnvelope FromPlan(AdvisoryTaskPlan plan, DateTimeOffset expiry)
            => new(
                AdvisoryTaskRequestEnvelope.FromRequest(plan.Request),
                plan.CacheKey,
                plan.PromptTemplate,
                plan.StructuredChunks.Select(AdvisoryChunkEnvelope.FromChunk).ToList(),
                plan.VectorResults.Select(VectorResultEnvelope.FromResult).ToList(),
                plan.SbomContext is null ? null : SbomContextEnvelope.FromContext(plan.SbomContext),
                plan.DependencyAnalysis is null ? null : DependencyAnalysisEnvelope.FromAnalysis(plan.DependencyAnalysis),
                plan.Budget,
                plan.Metadata.ToDictionary(static p => p.Key, static p => p.Value, StringComparer.Ordinal),
                expiry);

        public AdvisoryTaskPlan ToPlan()
        {
            var chunks = StructuredChunks
                .Select(static chunk => chunk.ToChunk())
                .ToImmutableArray();

            var vectors = VectorResults
                .Select(static result => result.ToResult())
                .ToImmutableArray();

            var sbom = SbomContext?.ToContext();
            var dependency = DependencyAnalysis?.ToAnalysis();
            var metadata = Metadata.ToImmutableDictionary(StringComparer.Ordinal);

            return new AdvisoryTaskPlan(
                Request.ToRequest(),
                CacheKey,
                PromptTemplate,
                chunks,
                vectors,
                sbom,
                dependency,
                Budget,
                metadata);
        }
    }

    private sealed record AdvisoryTaskRequestEnvelope(
        AdvisoryTaskType TaskType,
        string AdvisoryKey,
        string? ArtifactId,
        string? ArtifactPurl,
        string? PolicyVersion,
        string Profile,
        IReadOnlyList<string>? PreferredSections,
        bool ForceRefresh)
    {
        public static AdvisoryTaskRequestEnvelope FromRequest(AdvisoryTaskRequest request)
            => new(
                request.TaskType,
                request.AdvisoryKey,
                request.ArtifactId,
                request.ArtifactPurl,
                request.PolicyVersion,
                request.Profile,
                request.PreferredSections?.ToArray(),
                request.ForceRefresh);

        public AdvisoryTaskRequest ToRequest()
            => new(
                TaskType,
                AdvisoryKey,
                ArtifactId,
                ArtifactPurl,
                PolicyVersion,
                Profile,
                PreferredSections,
                ForceRefresh);
    }

    private sealed record AdvisoryChunkEnvelope(
        string DocumentId,
        string ChunkId,
        string Section,
        string ParagraphId,
        string Text,
        Dictionary<string, string> Metadata)
    {
        public static AdvisoryChunkEnvelope FromChunk(AdvisoryChunk chunk)
            => new(
                chunk.DocumentId,
                chunk.ChunkId,
                chunk.Section,
                chunk.ParagraphId,
                chunk.Text,
                chunk.Metadata.ToDictionary(static p => p.Key, static p => p.Value, StringComparer.Ordinal));

        public AdvisoryChunk ToChunk()
            => AdvisoryChunk.Create(
                DocumentId,
                ChunkId,
                Section,
                ParagraphId,
                Text,
                Metadata);
    }

    private sealed record VectorResultEnvelope(string Query, List<VectorMatchEnvelope> Matches)
    {
        public static VectorResultEnvelope FromResult(AdvisoryVectorResult result)
            => new(
                result.Query,
                result.Matches.Select(VectorMatchEnvelope.FromMatch).ToList());

        public AdvisoryVectorResult ToResult()
            => new(Query, Matches.Select(static match => match.ToMatch()).ToImmutableArray());
    }

    private sealed record VectorMatchEnvelope(
        string DocumentId,
        string ChunkId,
        string Text,
        double Score,
        Dictionary<string, string> Metadata)
    {
        public static VectorMatchEnvelope FromMatch(VectorRetrievalMatch match)
            => new(
                match.DocumentId,
                match.ChunkId,
                match.Text,
                match.Score,
                match.Metadata.ToDictionary(static p => p.Key, static p => p.Value, StringComparer.Ordinal));

        public VectorRetrievalMatch ToMatch()
            => new(DocumentId, ChunkId, Text, Score, Metadata);
    }

    private sealed record SbomContextEnvelope(
        string ArtifactId,
        string? Purl,
        List<SbomVersionTimelineEntryEnvelope> VersionTimeline,
        List<SbomDependencyPathEnvelope> DependencyPaths,
        Dictionary<string, string> EnvironmentFlags,
        SbomBlastRadiusEnvelope? BlastRadius,
        Dictionary<string, string> Metadata)
    {
        public static SbomContextEnvelope FromContext(SbomContextResult context)
            => new(
                context.ArtifactId,
                context.Purl,
                context.VersionTimeline.Select(SbomVersionTimelineEntryEnvelope.FromEntry).ToList(),
                context.DependencyPaths.Select(SbomDependencyPathEnvelope.FromPath).ToList(),
                context.EnvironmentFlags.ToDictionary(static p => p.Key, static p => p.Value, StringComparer.Ordinal),
                context.BlastRadius is null ? null : SbomBlastRadiusEnvelope.FromBlastRadius(context.BlastRadius),
                context.Metadata.ToDictionary(static p => p.Key, static p => p.Value, StringComparer.Ordinal));

        public SbomContextResult ToContext()
            => SbomContextResult.Create(
                ArtifactId,
                Purl,
                VersionTimeline.Select(static entry => entry.ToEntry()),
                DependencyPaths.Select(static path => path.ToPath()),
                EnvironmentFlags,
                BlastRadius?.ToBlastRadius(),
                Metadata);
    }

    private sealed record SbomVersionTimelineEntryEnvelope(
        string Version,
        DateTimeOffset FirstObserved,
        DateTimeOffset? LastObserved,
        string Status,
        string Source)
    {
        public static SbomVersionTimelineEntryEnvelope FromEntry(SbomVersionTimelineEntry entry)
            => new(entry.Version, entry.FirstObserved, entry.LastObserved, entry.Status, entry.Source);

        public SbomVersionTimelineEntry ToEntry()
            => new(Version, FirstObserved, LastObserved, Status, Source);
    }

    private sealed record SbomDependencyPathEnvelope(
        List<SbomDependencyNodeEnvelope> Nodes,
        bool IsRuntime,
        string? Source,
        Dictionary<string, string> Metadata)
    {
        public static SbomDependencyPathEnvelope FromPath(SbomDependencyPath path)
            => new(
                path.Nodes.Select(SbomDependencyNodeEnvelope.FromNode).ToList(),
                path.IsRuntime,
                path.Source,
                path.Metadata.ToDictionary(static p => p.Key, static p => p.Value, StringComparer.Ordinal));

        public SbomDependencyPath ToPath()
            => new(
                Nodes.Select(static node => node.ToNode()),
                IsRuntime,
                Source,
                Metadata);
    }

    private sealed record SbomDependencyNodeEnvelope(string Identifier, string? Version)
    {
        public static SbomDependencyNodeEnvelope FromNode(SbomDependencyNode node)
            => new(node.Identifier, node.Version);

        public SbomDependencyNode ToNode()
            => new(Identifier, Version);
    }

    private sealed record SbomBlastRadiusEnvelope(
        int ImpactedAssets,
        int ImpactedWorkloads,
        int ImpactedNamespaces,
        double? ImpactedPercentage,
        Dictionary<string, string> Metadata)
    {
        public static SbomBlastRadiusEnvelope FromBlastRadius(SbomBlastRadiusSummary blastRadius)
            => new(
                blastRadius.ImpactedAssets,
                blastRadius.ImpactedWorkloads,
                blastRadius.ImpactedNamespaces,
                blastRadius.ImpactedPercentage,
                blastRadius.Metadata.ToDictionary(static p => p.Key, static p => p.Value, StringComparer.Ordinal));

        public SbomBlastRadiusSummary ToBlastRadius()
            => new(
                ImpactedAssets,
                ImpactedWorkloads,
                ImpactedNamespaces,
                ImpactedPercentage,
                Metadata);
    }

    private sealed record DependencyAnalysisEnvelope(
        string ArtifactId,
        List<DependencyNodeSummaryEnvelope> Nodes,
        Dictionary<string, string> Metadata)
    {
        public static DependencyAnalysisEnvelope FromAnalysis(DependencyAnalysisResult analysis)
            => new(
                analysis.ArtifactId,
                analysis.Nodes.Select(DependencyNodeSummaryEnvelope.FromNode).ToList(),
                analysis.Metadata.ToDictionary(static p => p.Key, static p => p.Value, StringComparer.Ordinal));

        public DependencyAnalysisResult ToAnalysis()
            => DependencyAnalysisResult.Create(
                ArtifactId,
                Nodes.Select(static node => node.ToNode()),
                Metadata);
    }

    private sealed record DependencyNodeSummaryEnvelope(
        string Identifier,
        List<string> Versions,
        int RuntimeOccurrences,
        int DevelopmentOccurrences)
    {
        public static DependencyNodeSummaryEnvelope FromNode(DependencyNodeSummary node)
            => new(
                node.Identifier,
                node.Versions.ToList(),
                node.RuntimeOccurrences,
                node.DevelopmentOccurrences);

        public DependencyNodeSummary ToNode()
            => new(Identifier, Versions, RuntimeOccurrences, DevelopmentOccurrences);
    }
}
@@ -3,14 +3,16 @@ using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using Microsoft.Extensions.Options;
using StellaOps.AdvisoryAI.Caching;
using StellaOps.AdvisoryAI.DependencyInjection;
using StellaOps.AdvisoryAI.Providers;
using StellaOps.AdvisoryAI.Queue;

namespace StellaOps.AdvisoryAI.Hosting;

public static class ServiceCollectionExtensions
{
using StellaOps.AdvisoryAI.Outputs;

namespace StellaOps.AdvisoryAI.Hosting;

public static class ServiceCollectionExtensions
{
    public static IServiceCollection AddAdvisoryAiCore(
        this IServiceCollection services,
        IConfiguration configuration,
@@ -43,6 +45,8 @@ public static class ServiceCollectionExtensions
        services.AddAdvisoryPipelineInfrastructure();

        services.Replace(ServiceDescriptor.Singleton<IAdvisoryTaskQueue, FileSystemAdvisoryTaskQueue>());
        services.Replace(ServiceDescriptor.Singleton<IAdvisoryPlanCache, FileSystemAdvisoryPlanCache>());
        services.Replace(ServiceDescriptor.Singleton<IAdvisoryOutputStore, FileSystemAdvisoryOutputStore>());
        services.TryAddSingleton<AdvisoryAiMetrics>();

        return services;

@@ -1,27 +1,66 @@
using System.Collections.Generic;
using System.Linq;
using StellaOps.AdvisoryAI.Guardrails;
using StellaOps.AdvisoryAI.Orchestration;
using StellaOps.AdvisoryAI.Outputs;

namespace StellaOps.AdvisoryAI.WebService.Contracts;

public sealed record AdvisoryOutputResponse(
internal sealed record AdvisoryOutputResponse(
    string CacheKey,
    AdvisoryTaskType TaskType,
    string TaskType,
    string Profile,
    string OutputHash,
    bool GuardrailBlocked,
    IReadOnlyCollection<AdvisoryGuardrailViolationResponse> GuardrailViolations,
    IReadOnlyDictionary<string, string> GuardrailMetadata,
    string Prompt,
    IReadOnlyCollection<AdvisoryCitationResponse> Citations,
    IReadOnlyList<AdvisoryOutputCitation> Citations,
    IReadOnlyDictionary<string, string> Metadata,
    AdvisoryOutputGuardrail Guardrail,
    AdvisoryOutputProvenance Provenance,
    DateTimeOffset GeneratedAtUtc,
    bool PlanFromCache);

public sealed record AdvisoryGuardrailViolationResponse(string Code, string Message)
    bool PlanFromCache)
{
    public static AdvisoryGuardrailViolationResponse From(AdvisoryGuardrailViolation violation)
        => new(violation.Code, violation.Message);
    public static AdvisoryOutputResponse FromDomain(AdvisoryPipelineOutput output)
        => new(
            output.CacheKey,
            output.TaskType.ToString(),
            output.Profile,
            output.Prompt,
            output.Citations
                .Select(citation => new AdvisoryOutputCitation(citation.Index, citation.DocumentId, citation.ChunkId))
                .ToList(),
            output.Metadata.ToDictionary(static pair => pair.Key, static pair => pair.Value, StringComparer.Ordinal),
            AdvisoryOutputGuardrail.FromDomain(output.Guardrail),
            AdvisoryOutputProvenance.FromDomain(output.Provenance),
            output.GeneratedAtUtc,
            output.PlanFromCache);
}

public sealed record AdvisoryCitationResponse(int Index, string DocumentId, string ChunkId);
internal sealed record AdvisoryOutputCitation(int Index, string DocumentId, string ChunkId);

internal sealed record AdvisoryOutputGuardrail(
    bool Blocked,
    string SanitizedPrompt,
    IReadOnlyList<AdvisoryOutputGuardrailViolation> Violations,
    IReadOnlyDictionary<string, string> Metadata)
{
    public static AdvisoryOutputGuardrail FromDomain(AdvisoryGuardrailResult result)
        => new(
            result.Blocked,
            result.SanitizedPrompt,
            result.Violations
                .Select(violation => new AdvisoryOutputGuardrailViolation(violation.Code, violation.Message))
                .ToList(),
            result.Metadata.ToDictionary(static pair => pair.Key, static pair => pair.Value, StringComparer.Ordinal));
}

internal sealed record AdvisoryOutputGuardrailViolation(string Code, string Message);

internal sealed record AdvisoryOutputProvenance(
    string InputDigest,
    string OutputHash,
    IReadOnlyList<string> Signatures)
{
    public static AdvisoryOutputProvenance FromDomain(AdvisoryDsseProvenance provenance)
        => new(
            provenance.InputDigest,
            provenance.OutputHash,
            provenance.Signatures.ToArray());
}

@@ -1,3 +1,4 @@
using System.Diagnostics;
using System.Linq;
using System.Net;
using System.Threading.RateLimiting;
@@ -9,10 +10,13 @@ using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using StellaOps.AdvisoryAI.Caching;
using StellaOps.AdvisoryAI.Diagnostics;
using StellaOps.AdvisoryAI.Hosting;
using StellaOps.AdvisoryAI.Metrics;
using StellaOps.AdvisoryAI.Outputs;
using StellaOps.AdvisoryAI.Orchestration;
using StellaOps.AdvisoryAI.Queue;
using StellaOps.AdvisoryAI.WebService.Contracts;

var builder = WebApplication.CreateBuilder(args);

@@ -72,6 +76,9 @@ app.MapPost("/v1/advisory-ai/pipeline/{taskType}", HandleSinglePlan)
app.MapPost("/v1/advisory-ai/pipeline:batch", HandleBatchPlans)
    .RequireRateLimiting("advisory-ai");

app.MapGet("/v1/advisory-ai/outputs/{cacheKey}", HandleGetOutput)
    .RequireRateLimiting("advisory-ai");

app.Run();

static async Task<IResult> HandleSinglePlan(
@@ -85,6 +92,10 @@ static async Task<IResult> HandleSinglePlan(
    AdvisoryPipelineMetrics pipelineMetrics,
    CancellationToken cancellationToken)
{
    using var activity = AdvisoryAiActivitySource.Instance.StartActivity("advisory_ai.plan_request", ActivityKind.Server);
    activity?.SetTag("advisory.task_type", taskType);
    activity?.SetTag("advisory.advisory_key", request.AdvisoryKey);

    if (!Enum.TryParse<AdvisoryTaskType>(taskType, ignoreCase: true, out var parsedType))
    {
        return Results.BadRequest(new { error = $"Unknown task type '{taskType}'." });
@@ -103,6 +114,7 @@ static async Task<IResult> HandleSinglePlan(
    var normalizedRequest = request with { TaskType = parsedType };
    var taskRequest = normalizedRequest.ToTaskRequest();
    var plan = await orchestrator.CreatePlanAsync(taskRequest, cancellationToken).ConfigureAwait(false);
    activity?.SetTag("advisory.plan_cache_key", plan.CacheKey);

    await planCache.SetAsync(plan.CacheKey, plan, cancellationToken).ConfigureAwait(false);
    await taskQueue.EnqueueAsync(new AdvisoryTaskQueueMessage(plan.CacheKey, plan.Request), cancellationToken).ConfigureAwait(false);
@@ -125,6 +137,9 @@ static async Task<IResult> HandleBatchPlans(
    AdvisoryPipelineMetrics pipelineMetrics,
    CancellationToken cancellationToken)
{
    using var activity = AdvisoryAiActivitySource.Instance.StartActivity("advisory_ai.plan_batch", ActivityKind.Server);
    activity?.SetTag("advisory.batch_size", batchRequest.Requests.Count);

    if (batchRequest.Requests.Count == 0)
    {
        return Results.BadRequest(new { error = "At least one request must be supplied." });
@@ -153,6 +168,12 @@ static async Task<IResult> HandleBatchPlans(
        var normalizedRequest = item with { TaskType = parsedType };
        var taskRequest = normalizedRequest.ToTaskRequest();
        var plan = await orchestrator.CreatePlanAsync(taskRequest, cancellationToken).ConfigureAwait(false);
        activity?.AddEvent(new ActivityEvent("advisory.plan.created", tags: new ActivityTagsCollection
        {
            { "advisory.task_type", plan.Request.TaskType.ToString() },
            { "advisory.advisory_key", plan.Request.AdvisoryKey },
            { "advisory.plan_cache_key", plan.CacheKey }
        }));

        await planCache.SetAsync(plan.CacheKey, plan, cancellationToken).ConfigureAwait(false);
        await taskQueue.EnqueueAsync(new AdvisoryTaskQueueMessage(plan.CacheKey, plan.Request), cancellationToken).ConfigureAwait(false);
@@ -167,6 +188,37 @@ static async Task<IResult> HandleBatchPlans(
    return Results.Ok(results);
}

static async Task<IResult> HandleGetOutput(
    HttpContext httpContext,
    string cacheKey,
    string taskType,
    string? profile,
    IAdvisoryOutputStore outputStore,
    CancellationToken cancellationToken)
{
    ArgumentNullException.ThrowIfNull(outputStore);
    if (!Enum.TryParse<AdvisoryTaskType>(taskType, ignoreCase: true, out var parsedTaskType))
    {
        return Results.BadRequest(new { error = $"Unknown task type '{taskType}'." });
    }

    if (!EnsureAuthorized(httpContext, parsedTaskType))
    {
        return Results.StatusCode(StatusCodes.Status403Forbidden);
    }

    var resolvedProfile = string.IsNullOrWhiteSpace(profile) ? "default" : profile!.Trim();
    var output = await outputStore.TryGetAsync(cacheKey, parsedTaskType, resolvedProfile, cancellationToken)
        .ConfigureAwait(false);

    if (output is null)
    {
        return Results.NotFound(new { error = "Output not found." });
    }

    return Results.Ok(AdvisoryOutputResponse.FromDomain(output));
}

static bool EnsureAuthorized(HttpContext context, AdvisoryTaskType taskType)
{
    if (!context.Request.Headers.TryGetValue("X-StellaOps-Scopes", out var scopes))

@@ -5,7 +5,7 @@ using Microsoft.Extensions.Logging;
using StellaOps.AdvisoryAI.Hosting;
using StellaOps.AdvisoryAI.Worker.Services;

var builder = Host.CreateApplicationBuilder(args);
var builder = Microsoft.Extensions.Hosting.Host.CreateApplicationBuilder(args);

builder.Configuration
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)

@@ -1,6 +1,8 @@
using System.Diagnostics;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using StellaOps.AdvisoryAI.Caching;
using StellaOps.AdvisoryAI.Diagnostics;
using StellaOps.AdvisoryAI.Metrics;
using StellaOps.AdvisoryAI.Orchestration;
using StellaOps.AdvisoryAI.Queue;
@@ -50,8 +52,14 @@ internal sealed class AdvisoryTaskWorker : BackgroundService
                continue;
            }

            using var activity = AdvisoryAiActivitySource.Instance.StartActivity("advisory_ai.process", ActivityKind.Consumer);
            activity?.SetTag("advisory.task_type", message.Request.TaskType.ToString());
            activity?.SetTag("advisory.advisory_key", message.Request.AdvisoryKey);

            var processStart = _timeProvider.GetTimestamp();
            AdvisoryTaskPlan? plan = await _cache.TryGetAsync(message.PlanCacheKey, stoppingToken).ConfigureAwait(false);
            var fromCache = plan is not null && !message.Request.ForceRefresh;
            activity?.SetTag("advisory.plan_cache_hit", fromCache);

            if (!fromCache)
            {
@@ -68,8 +76,12 @@ internal sealed class AdvisoryTaskWorker : BackgroundService
                message.Request.AdvisoryKey,
                fromCache);

            plan ??= throw new InvalidOperationException("Advisory task plan could not be generated.");
            await _executor.ExecuteAsync(plan, message, fromCache, stoppingToken).ConfigureAwait(false);
            _metrics.RecordPlanProcessed(message.Request.TaskType, fromCache);
            var totalElapsed = _timeProvider.GetElapsedTime(processStart);
            _metrics.RecordPipelineLatency(message.Request.TaskType, totalElapsed.TotalSeconds, fromCache);
            activity?.SetTag("advisory.pipeline_latency_seconds", totalElapsed.TotalSeconds);
        }
        catch (OperationCanceledException)
        {

@@ -0,0 +1,8 @@
using System.Diagnostics;

namespace StellaOps.AdvisoryAI.Diagnostics;

internal static class AdvisoryAiActivitySource
{
    public static readonly ActivitySource Instance = new("StellaOps.AdvisoryAI");
}
@@ -1,4 +1,7 @@
using Microsoft.Extensions.Logging;
using System;
using System.Diagnostics;
using System.Linq;
using StellaOps.AdvisoryAI.Guardrails;
using StellaOps.AdvisoryAI.Outputs;
using StellaOps.AdvisoryAI.Orchestration;
@@ -53,27 +56,72 @@ internal sealed class AdvisoryPipelineExecutor : IAdvisoryPipelineExecutor

        var prompt = await _promptAssembler.AssembleAsync(plan, cancellationToken).ConfigureAwait(false);
        var guardrailResult = await _guardrailPipeline.EvaluateAsync(prompt, cancellationToken).ConfigureAwait(false);
        var violationCount = guardrailResult.Violations.Length;

        if (guardrailResult.Blocked)
        {
            _logger?.LogWarning(
                "Guardrail blocked advisory pipeline output for {TaskType} on advisory {AdvisoryKey}",
                "Guardrail blocked advisory pipeline output for {TaskType} on advisory {AdvisoryKey} with {ViolationCount} violations",
                plan.Request.TaskType,
                plan.Request.AdvisoryKey,
                violationCount);
        }
        else if (violationCount > 0)
        {
            _logger?.LogInformation(
                "Guardrail recorded {ViolationCount} advisory validation violations for {TaskType} on advisory {AdvisoryKey}",
                violationCount,
                plan.Request.TaskType,
                plan.Request.AdvisoryKey);
        }

        var citationCoverage = CalculateCitationCoverage(plan, prompt);
        var activity = Activity.Current;
        activity?.SetTag("advisory.guardrail_blocked", guardrailResult.Blocked);
        activity?.SetTag("advisory.validation_failures", violationCount);
        activity?.SetTag("advisory.citation_coverage", citationCoverage);
        _metrics.RecordGuardrailOutcome(plan.Request.TaskType, guardrailResult.Blocked, violationCount);
        _metrics.RecordCitationCoverage(
            plan.Request.TaskType,
            citationCoverage,
            prompt.Citations.Length,
            plan.StructuredChunks.Length);

        var generatedAt = _timeProvider.GetUtcNow();
        var output = AdvisoryPipelineOutput.Create(plan, prompt, guardrailResult, generatedAt, planFromCache);
        await _outputStore.SaveAsync(output, cancellationToken).ConfigureAwait(false);

        _metrics.RecordGuardrailResult(plan.Request.TaskType, guardrailResult.Blocked);
        _metrics.RecordOutputStored(plan.Request.TaskType, planFromCache, guardrailResult.Blocked);

        _logger?.LogInformation(
            "Stored advisory pipeline output {CacheKey} (task {TaskType}, cache:{CacheHit}, guardrail_blocked:{Blocked})",
            "Stored advisory pipeline output {CacheKey} (task {TaskType}, cache:{CacheHit}, guardrail_blocked:{Blocked}, validation_failures:{ValidationFailures}, citation_coverage:{CitationCoverage:0.00})",
            output.CacheKey,
            plan.Request.TaskType,
            planFromCache,
            guardrailResult.Blocked);
            guardrailResult.Blocked,
            violationCount,
            citationCoverage);
    }

    private static double CalculateCitationCoverage(AdvisoryTaskPlan plan, AdvisoryPrompt prompt)
    {
        var structuredCount = plan.StructuredChunks.Length;
        if (structuredCount <= 0)
        {
            return 0d;
        }

        if (prompt.Citations.IsDefaultOrEmpty)
        {
            return 0d;
        }

        var uniqueCitations = prompt.Citations
            .Select(citation => (citation.DocumentId, citation.ChunkId))
            .Distinct()
            .Count();

        var coverage = (double)uniqueCitations / structuredCount;
        return Math.Clamp(coverage, 0d, 1d);
    }
}

@@ -13,7 +13,10 @@ public sealed class AdvisoryPipelineMetrics : IDisposable
    private readonly Counter<long> _plansProcessed;
    private readonly Counter<long> _outputsStored;
    private readonly Counter<long> _guardrailBlocks;
    private readonly Counter<long> _validationFailures;
    private readonly Histogram<double> _planBuildDuration;
    private readonly Histogram<double> _pipelineLatencySeconds;
    private readonly Histogram<double> _citationCoverageRatio;
    private bool _disposed;

    public AdvisoryPipelineMetrics(IMeterFactory meterFactory)
@@ -25,8 +28,11 @@ public sealed class AdvisoryPipelineMetrics : IDisposable
        _plansQueued = _meter.CreateCounter<long>("advisory_plans_queued");
        _plansProcessed = _meter.CreateCounter<long>("advisory_plans_processed");
        _outputsStored = _meter.CreateCounter<long>("advisory_outputs_stored");
        _guardrailBlocks = _meter.CreateCounter<long>("advisory_guardrail_blocks");
        _guardrailBlocks = _meter.CreateCounter<long>("advisory_ai_guardrail_blocks_total");
        _validationFailures = _meter.CreateCounter<long>("advisory_ai_validation_failures_total");
        _planBuildDuration = _meter.CreateHistogram<double>("advisory_plan_build_duration_seconds");
        _pipelineLatencySeconds = _meter.CreateHistogram<double>("advisory_ai_latency_seconds");
        _citationCoverageRatio = _meter.CreateHistogram<double>("advisory_ai_citation_coverage_ratio");
    }

    public void RecordPlanCreated(double buildSeconds, AdvisoryTaskType taskType)
@@ -55,12 +61,40 @@ public sealed class AdvisoryPipelineMetrics : IDisposable
            KeyValuePair.Create<string, object?>("guardrail_blocked", guardrailBlocked));
    }

    public void RecordGuardrailResult(AdvisoryTaskType taskType, bool blocked)
    public void RecordGuardrailOutcome(AdvisoryTaskType taskType, bool blocked, int validationFailures)
    {
        if (blocked)
        {
            _guardrailBlocks.Add(1, KeyValuePair.Create<string, object?>("task_type", taskType.ToString()));
        }

        if (validationFailures > 0)
        {
            _validationFailures.Add(
                validationFailures,
                KeyValuePair.Create<string, object?>("task_type", taskType.ToString()));
        }
    }

    public void RecordPipelineLatency(AdvisoryTaskType taskType, double seconds, bool planFromCache)
    {
        _pipelineLatencySeconds.Record(
            seconds,
            KeyValuePair.Create<string, object?>("task_type", taskType.ToString()),
            KeyValuePair.Create<string, object?>("plan_cache_hit", planFromCache));
    }

    public void RecordCitationCoverage(
        AdvisoryTaskType taskType,
        double coverageRatio,
        int citationCount,
        int structuredChunkCount)
    {
        _citationCoverageRatio.Record(
            coverageRatio,
            KeyValuePair.Create<string, object?>("task_type", taskType.ToString()),
            KeyValuePair.Create<string, object?>("citations", citationCount),
            KeyValuePair.Create<string, object?>("structured_chunks", structuredChunkCount));
    }

    public void Dispose()

@@ -6,11 +6,11 @@
|
||||
| AIAI-31-003 | DONE (2025-11-04) | Advisory AI Guild | AIAI-31-001..002 | Implement deterministic toolset (version comparators, range checks, dependency analysis, policy lookup) exposed via orchestrator. | Tools validated with property tests; outputs cached; docs updated. |
|
||||
| AIAI-31-004 | DONE (2025-11-04) | Advisory AI Guild | AIAI-31-001..003, AUTH-VULN-29-001 | Build orchestration pipeline for Summary/Conflict/Remediation tasks (prompt templates, tool calls, token budgets, caching). | Pipeline executes tasks deterministically; caches keyed by tuple+policy; integration tests cover tasks. |
|
||||
| AIAI-31-004A | DONE (2025-11-04) | Advisory AI Guild, Platform Guild | AIAI-31-004, AIAI-31-002 | Wire `AdvisoryPipelineOrchestrator` into WebService/Worker, expose API/queue contracts, emit metrics, and stand up cache stub. | API returns plan metadata; worker executes queue message; metrics recorded; doc updated. |
|
||||
| AIAI-31-004B | TODO | Advisory AI Guild, Security Guild | AIAI-31-004A, DOCS-AIAI-31-003, AUTH-AIAI-31-004 | Implement prompt assembler, guardrail plumbing, cache persistence, DSSE provenance; add golden outputs. | Deterministic outputs cached; guardrails enforced; tests cover prompt assembly + caching. |
| AIAI-31-004C | TODO | Advisory AI Guild, CLI Guild, Docs Guild | AIAI-31-004B, CLI-AIAI-31-003 | Deliver CLI `stella advise run <task>` command, renderers, documentation updates, and CLI golden tests. | CLI command produces deterministic output; docs published; smoke run recorded. |
| AIAI-31-004B | DONE (2025-11-06) | Advisory AI Guild, Security Guild | AIAI-31-004A, DOCS-AIAI-31-003, AUTH-AIAI-31-004 | Implement prompt assembler, guardrail plumbing, cache persistence, DSSE provenance; add golden outputs. | Deterministic outputs cached; guardrails enforced; tests cover prompt assembly + caching. |
| AIAI-31-004C | DONE (2025-11-06) | Advisory AI Guild, CLI Guild, Docs Guild | AIAI-31-004B, CLI-AIAI-31-003 | Deliver CLI `stella advise run <task>` command, renderers, documentation updates, and CLI golden tests. | CLI command produces deterministic output; docs published; smoke run recorded. |
| AIAI-31-005 | DONE (2025-11-04) | Advisory AI Guild, Security Guild | AIAI-31-004 | Implement guardrails (redaction, injection defense, output validation, citation enforcement) and fail-safe handling. | Guardrails block adversarial inputs; output validator enforces schemas; security tests pass. |
| AIAI-31-006 | DONE (2025-11-04) | Advisory AI Guild | AIAI-31-004..005 | Expose REST API endpoints (`/advisory/ai/*`) with RBAC, rate limits, OpenAPI schemas, and batching support. | Endpoints deployed with schema validation; rate limits enforced; integration tests cover error codes. |
| AIAI-31-007 | TODO | Advisory AI Guild, Observability Guild | AIAI-31-004..006 | Instrument metrics (`advisory_ai_latency`, `guardrail_blocks`, `validation_failures`, `citation_coverage`), logs, and traces; publish dashboards/alerts. | Telemetry live; dashboards approved; alerts configured. |
| AIAI-31-007 | DONE (2025-11-06) | Advisory AI Guild, Observability Guild | AIAI-31-004..006 | Instrument metrics (`advisory_ai_latency`, `guardrail_blocks`, `validation_failures`, `citation_coverage`), logs, and traces; publish dashboards/alerts. | Telemetry live; dashboards approved; alerts configured. |
| AIAI-31-008 | TODO | Advisory AI Guild, DevOps Guild | AIAI-31-006..007 | Package inference on-prem container, remote inference toggle, Helm/Compose manifests, scaling guidance, offline kit instructions. | Deployment docs merged; smoke deploy executed; offline kit updated; feature flags documented. |
| AIAI-31-010 | DONE (2025-11-02) | Advisory AI Guild | CONCELIER-VULN-29-001, EXCITITOR-VULN-29-001 | Implement Concelier advisory raw document provider mapping CSAF/OSV payloads into structured chunks for retrieval. | Provider resolves content format, preserves metadata, and passes unit tests covering CSAF/OSV cases. |
| AIAI-31-011 | DONE (2025-11-02) | Advisory AI Guild | EXCITITOR-LNM-21-201, EXCITITOR-CORE-AOC-19-002 | Implement Excititor VEX document provider to surface structured VEX statements for vector retrieval. | Provider returns conflict-aware VEX chunks with deterministic metadata and tests for representative statements. |
@@ -31,3 +31,5 @@
> 2025-11-04: AIAI-31-005 DONE – guardrail pipeline redacts secrets, enforces citation/injection policies, emits block counters, and tests (`AdvisoryGuardrailPipelineTests`) cover redaction + citation validation.

> 2025-11-04: AIAI-31-006 DONE – REST endpoints enforce header scopes, apply token bucket rate limiting, sanitize prompts via guardrails, and queue execution with cached metadata. Tests executed via `dotnet test src/AdvisoryAI/__Tests/StellaOps.AdvisoryAI.Tests/StellaOps.AdvisoryAI.Tests.csproj --no-restore`.
> 2025-11-06: AIAI-31-004B/C – Resuming prompt/cache hardening and CLI integration; first focus on backend client wiring and deterministic CLI outputs before full suite.
> 2025-11-06: AIAI-31-004B/C DONE – Advisory AI Mongo integration validated, backend client + CLI `advise run` wired, deterministic console renderer with provenance/guardrail display added, docs refreshed, and targeted CLI tests executed.

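The `advise run` verb wired in this change can be exercised along these lines; the advisory key, purl, and section values below are placeholders, and the flags mirror the options declared in `BuildAdviseCommand` later in the diff:

```
stella advise run summary \
  --advisory-key GHSA-xxxx \
  --artifact-purl pkg:docker/sample@1.0.0 \
  --profile default \
  --section Summary \
  --timeout 120
```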
@@ -1,3 +1,4 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Diagnostics.Metrics;
@@ -65,6 +66,58 @@ public sealed class AdvisoryPipelineExecutorTests : IDisposable
        saved.Prompt.Should().Be("{\"prompt\":\"value\"}");
    }

    [Fact]
    public async Task ExecuteAsync_RecordsTelemetryMeasurements()
    {
        using var listener = new MeterListener();
        var doubleMeasurements = new List<(string Name, double Value, IEnumerable<KeyValuePair<string, object?>> Tags)>();
        var longMeasurements = new List<(string Name, long Value, IEnumerable<KeyValuePair<string, object?>> Tags)>();

        listener.InstrumentPublished = (instrument, l) =>
        {
            if (instrument.Meter.Name == AdvisoryPipelineMetrics.MeterName)
            {
                l.EnableMeasurementEvents(instrument);
            }
        };

        listener.SetMeasurementEventCallback<double>((instrument, measurement, tags, state) =>
        {
            // Copy the tag span to an array; the span cannot be stored beyond the callback.
            doubleMeasurements.Add((instrument.Name, measurement, tags.ToArray()));
        });

        listener.SetMeasurementEventCallback<long>((instrument, measurement, tags, state) =>
        {
            longMeasurements.Add((instrument.Name, measurement, tags.ToArray()));
        });

        listener.Start();

        var plan = BuildMinimalPlan(cacheKey: "CACHE-3");
        var assembler = new StubPromptAssembler();
        var guardrail = new StubGuardrailPipeline(blocked: true);
        var store = new InMemoryAdvisoryOutputStore();
        using var metrics = new AdvisoryPipelineMetrics(_meterFactory);
        var executor = new AdvisoryPipelineExecutor(assembler, guardrail, store, metrics, TimeProvider.System);

        var message = new AdvisoryTaskQueueMessage(plan.CacheKey, plan.Request);
        await executor.ExecuteAsync(plan, message, planFromCache: false, CancellationToken.None);

        listener.Dispose();

        longMeasurements.Should().Contain(measurement =>
            measurement.Name == "advisory_ai_guardrail_blocks_total" &&
            measurement.Value == 1);

        longMeasurements.Should().Contain(measurement =>
            measurement.Name == "advisory_ai_validation_failures_total" &&
            measurement.Value == 1);

        doubleMeasurements.Should().Contain(measurement =>
            measurement.Name == "advisory_ai_citation_coverage_ratio" &&
            Math.Abs(measurement.Value - 1d) < 0.0001);
    }

    private static AdvisoryTaskPlan BuildMinimalPlan(string cacheKey)
    {
        var request = new AdvisoryTaskRequest(
@@ -0,0 +1,176 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using FluentAssertions;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using StellaOps.AdvisoryAI.Caching;
using StellaOps.AdvisoryAI.Context;
using StellaOps.AdvisoryAI.Documents;
using StellaOps.AdvisoryAI.Guardrails;
using StellaOps.AdvisoryAI.Hosting;
using StellaOps.AdvisoryAI.Outputs;
using StellaOps.AdvisoryAI.Orchestration;
using StellaOps.AdvisoryAI.Tools;
using Xunit;

namespace StellaOps.AdvisoryAI.Tests;

public sealed class FileSystemAdvisoryPersistenceTests : IDisposable
{
    private readonly TempDirectory _tempDir = new();

    [Fact]
    public async Task PlanCache_PersistsPlanOnDisk()
    {
        var serviceOptions = Options.Create(new AdvisoryAiServiceOptions
        {
            Storage = new AdvisoryAiStorageOptions
            {
                PlanCacheDirectory = Path.Combine(_tempDir.Path, "plans"),
                OutputDirectory = Path.Combine(_tempDir.Path, "outputs")
            }
        });
        var cacheOptions = Options.Create(new AdvisoryPlanCacheOptions
        {
            DefaultTimeToLive = TimeSpan.FromMinutes(5),
            CleanupInterval = TimeSpan.FromMinutes(5)
        });
        var cache = new FileSystemAdvisoryPlanCache(serviceOptions, cacheOptions, NullLogger<FileSystemAdvisoryPlanCache>.Instance);

        var plan = CreatePlan("cache-123");
        await cache.SetAsync(plan.CacheKey, plan, CancellationToken.None);

        var reloaded = await cache.TryGetAsync(plan.CacheKey, CancellationToken.None);
        reloaded.Should().NotBeNull();
        reloaded!.CacheKey.Should().Be(plan.CacheKey);
        reloaded.Request.AdvisoryKey.Should().Be(plan.Request.AdvisoryKey);
        reloaded.StructuredChunks.Length.Should().Be(plan.StructuredChunks.Length);
        reloaded.Metadata.Should().ContainKey("advisory_key").WhoseValue.Should().Be("adv-key");
    }

    [Fact]
    public async Task OutputStore_PersistsOutputOnDisk()
    {
        var serviceOptions = Options.Create(new AdvisoryAiServiceOptions
        {
            Storage = new AdvisoryAiStorageOptions
            {
                PlanCacheDirectory = Path.Combine(_tempDir.Path, "plans"),
                OutputDirectory = Path.Combine(_tempDir.Path, "outputs")
            }
        });
        var store = new FileSystemAdvisoryOutputStore(serviceOptions, NullLogger<FileSystemAdvisoryOutputStore>.Instance);
        var plan = CreatePlan("cache-abc");
        var prompt = "{\"prompt\":\"value\"}";
        var guardrail = AdvisoryGuardrailResult.Allowed(prompt);
        var output = new AdvisoryPipelineOutput(
            plan.CacheKey,
            plan.Request.TaskType,
            plan.Request.Profile,
            prompt,
            ImmutableArray.Create(new AdvisoryPromptCitation(1, "doc-1", "chunk-1")),
            ImmutableDictionary<string, string>.Empty.Add("advisory_key", plan.Request.AdvisoryKey),
            guardrail,
            new AdvisoryDsseProvenance(plan.CacheKey, "hash", ImmutableArray<string>.Empty),
            DateTimeOffset.UtcNow,
            planFromCache: false);

        await store.SaveAsync(output, CancellationToken.None);
        var reloaded = await store.TryGetAsync(plan.CacheKey, plan.Request.TaskType, plan.Request.Profile, CancellationToken.None);

        reloaded.Should().NotBeNull();
        reloaded!.Prompt.Should().Be(prompt);
        reloaded.Metadata.Should().ContainKey("advisory_key").WhoseValue.Should().Be(plan.Request.AdvisoryKey);
    }

    private static AdvisoryTaskPlan CreatePlan(string cacheKey)
    {
        var request = new AdvisoryTaskRequest(
            AdvisoryTaskType.Summary,
            advisoryKey: "adv-key",
            artifactId: "artifact-1",
            artifactPurl: "pkg:docker/sample@1.0.0",
            policyVersion: "policy-1",
            profile: "default",
            preferredSections: new[] { "Summary" },
            forceRefresh: false);

        var chunk = AdvisoryChunk.Create("doc-1", "doc-1:chunk-1", "Summary", "para-1", "Summary text", new Dictionary<string, string> { ["section"] = "Summary" });
        var structured = ImmutableArray.Create(chunk);
        var vectorMatch = new VectorRetrievalMatch("doc-1", "doc-1:chunk-1", "Summary text", 0.95, new Dictionary<string, string>());
        var vectorResult = new AdvisoryVectorResult("summary-query", ImmutableArray.Create(vectorMatch));
        var sbom = SbomContextResult.Create(
            "artifact-1",
            "pkg:docker/sample@1.0.0",
            new[]
            {
                new SbomVersionTimelineEntry("1.0.0", DateTimeOffset.UtcNow.AddDays(-10), null, "affected", "scanner")
            },
            new[]
            {
                new SbomDependencyPath(
                    new[]
                    {
                        new SbomDependencyNode("root", "1.0.0"),
                        new SbomDependencyNode("runtime-lib", "2.1.0")
                    },
                    isRuntime: true)
            });
        var dependency = DependencyAnalysisResult.Create(
            "artifact-1",
            new[]
            {
                new DependencyNodeSummary("runtime-lib", new[] { "2.1.0" }, 1, 0)
            },
            new Dictionary<string, string> { ["artifact_id"] = "artifact-1" });

        var metadata = ImmutableDictionary<string, string>.Empty.Add("advisory_key", "adv-key");
        var budget = new AdvisoryTaskBudget { PromptTokens = 1024, CompletionTokens = 256 };

        return new AdvisoryTaskPlan(
            request,
            cacheKey,
            promptTemplate: "prompts/advisory/summary.liquid",
            structured,
            ImmutableArray.Create(vectorResult),
            sbom,
            dependency,
            budget,
            metadata);
    }

    public void Dispose()
    {
        _tempDir.Dispose();
    }

    private sealed class TempDirectory : IDisposable
    {
        public TempDirectory()
        {
            Path = System.IO.Path.Combine(System.IO.Path.GetTempPath(), $"advisory-ai-tests-{Guid.NewGuid():N}");
            Directory.CreateDirectory(Path);
        }

        public string Path { get; }

        public void Dispose()
        {
            try
            {
                if (Directory.Exists(Path))
                {
                    Directory.Delete(Path, recursive: true);
                }
            }
            catch
            {
                // ignore cleanup failures in tests
            }
        }
    }
}
@@ -44,11 +44,9 @@ public static class AocHttpResults
            throw new ArgumentNullException(nameof(exception));
        }

        var primaryCode = exception.Result.Violations.IsDefaultOrEmpty
            ? "ERR_AOC_000"
            : exception.Result.Violations[0].ErrorCode;
        var error = AocError.FromException(exception, detail);

        var violationPayload = exception.Result.Violations
        var violationPayload = error.Violations
            .Select(v => new Dictionary<string, object?>(StringComparer.Ordinal)
            {
                ["code"] = v.ErrorCode,
@@ -59,8 +57,9 @@ public static class AocHttpResults

        var extensionPayload = new Dictionary<string, object?>(StringComparer.Ordinal)
        {
            ["code"] = primaryCode,
            ["violations"] = violationPayload
            ["code"] = error.Code,
            ["violations"] = violationPayload,
            ["error"] = error
        };

        if (extensions is not null)
@@ -71,9 +70,9 @@ public static class AocHttpResults
            }
        }

        var statusCode = status ?? MapErrorCodeToStatus(primaryCode);
        var statusCode = status ?? MapErrorCodeToStatus(error.Code);
        var problemType = type ?? DefaultProblemType;
        var problemDetail = detail ?? $"AOC guard rejected the request with {primaryCode}.";
        var problemDetail = detail ?? error.Message;
        var problemTitle = title ?? "Aggregation-Only Contract violation";

        return HttpResults.Problem(
37
src/Aoc/__Libraries/StellaOps.Aoc/AocError.cs
Normal file
@@ -0,0 +1,37 @@
using System;
using System.Collections.Immutable;
using System.Text.Json.Serialization;

namespace StellaOps.Aoc;

/// <summary>
/// Represents a structured Aggregation-Only Contract error payload.
/// </summary>
public sealed record AocError(
    [property: JsonPropertyName("code")] string Code,
    [property: JsonPropertyName("message")] string Message,
    [property: JsonPropertyName("violations")] ImmutableArray<AocViolation> Violations)
{
    public static AocError FromResult(AocGuardResult result, string? message = null)
    {
        if (result is null)
        {
            throw new ArgumentNullException(nameof(result));
        }

        var violations = result.Violations;
        var code = violations.IsDefaultOrEmpty ? "ERR_AOC_000" : violations[0].ErrorCode;
        var resolvedMessage = message ?? $"AOC guard rejected the payload with {code}.";
        return new(code, resolvedMessage, violations);
    }

    public static AocError FromException(AocGuardException exception, string? message = null)
    {
        if (exception is null)
        {
            throw new ArgumentNullException(nameof(exception));
        }

        return FromResult(exception.Result, message);
    }
}
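Serialized with the `JsonPropertyName` attributes declared on the record, an `AocError` payload takes roughly this JSON shape. The values are illustrative, and the property names inside each violation entry (`code`, `path`, `message`) are an assumption borrowed from the HTTP payload dictionary built in `AocHttpResults`; the actual serialization of `AocViolation` depends on that type's own attributes:

```json
{
  "code": "ERR_AOC_001",
  "message": "AOC guard rejected the payload with ERR_AOC_001.",
  "violations": [
    { "code": "ERR_AOC_001", "path": "/severity", "message": "Forbidden" }
  ]
}
```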
@@ -1,29 +1,49 @@
using System.Collections.Immutable;
using System.Collections.Immutable;
using System.Linq;

namespace StellaOps.Aoc;

public sealed record AocGuardOptions
{
    private static readonly ImmutableHashSet<string> DefaultRequiredTopLevel = new[]
    {
        "tenant",
        "source",
        "upstream",
        "content",
        "linkset",
    }.ToImmutableHashSet(StringComparer.OrdinalIgnoreCase);

    public static AocGuardOptions Default { get; } = new();

    public ImmutableHashSet<string> RequiredTopLevelFields { get; init; } = DefaultRequiredTopLevel;

    /// <summary>
    /// When true, signature metadata is required under upstream.signature.
    /// </summary>
    public bool RequireSignatureMetadata { get; init; } = true;

    /// <summary>
    /// When true, tenant must be a non-empty string.
    /// </summary>
    public bool RequireTenant { get; init; } = true;
}
public sealed record AocGuardOptions
{
    private static readonly ImmutableHashSet<string> DefaultRequiredTopLevel = new[]
    {
        "tenant",
        "source",
        "upstream",
        "content",
        "linkset",
    }.ToImmutableHashSet(StringComparer.OrdinalIgnoreCase);

    private static readonly ImmutableHashSet<string> DefaultAllowedTopLevel = DefaultRequiredTopLevel
        .Union(new[]
        {
            "_id",
            "identifiers",
            "attributes",
            "supersedes",
            "createdAt",
            "created_at",
            "ingestedAt",
            "ingested_at"
        }, StringComparer.OrdinalIgnoreCase)
        .ToImmutableHashSet(StringComparer.OrdinalIgnoreCase);

    public static AocGuardOptions Default { get; } = new();

    public ImmutableHashSet<string> RequiredTopLevelFields { get; init; } = DefaultRequiredTopLevel;

    /// <summary>
    /// Optional allowlist for top-level fields. Unknown fields trigger ERR_AOC_007.
    /// </summary>
    public ImmutableHashSet<string> AllowedTopLevelFields { get; init; } = DefaultAllowedTopLevel;

    /// <summary>
    /// When true, signature metadata is required under upstream.signature.
    /// </summary>
    public bool RequireSignatureMetadata { get; init; } = true;

    /// <summary>
    /// When true, tenant must be a non-empty string.
    /// </summary>
    public bool RequireTenant { get; init; } = true;
}

@@ -11,16 +11,17 @@ public interface IAocGuard

public sealed class AocWriteGuard : IAocGuard
{
    public AocGuardResult Validate(JsonElement document, AocGuardOptions? options = null)
    {
        options ??= AocGuardOptions.Default;
        var violations = ImmutableArray.CreateBuilder<AocViolation>();
        var presentTopLevel = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

        foreach (var property in document.EnumerateObject())
        {
            presentTopLevel.Add(property.Name);

    public AocGuardResult Validate(JsonElement document, AocGuardOptions? options = null)
    {
        options ??= AocGuardOptions.Default;
        var violations = ImmutableArray.CreateBuilder<AocViolation>();
        var presentTopLevel = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        var allowedTopLevelFields = options.AllowedTopLevelFields ?? AocGuardOptions.Default.AllowedTopLevelFields;

        foreach (var property in document.EnumerateObject())
        {
            presentTopLevel.Add(property.Name);

            if (AocForbiddenKeys.IsForbiddenTopLevel(property.Name))
            {
                violations.Add(AocViolation.Create(AocViolationCode.ForbiddenField, $"/{property.Name}", $"Field '{property.Name}' is forbidden in AOC documents."));
@@ -28,14 +29,20 @@ public sealed class AocWriteGuard : IAocGuard
            }

            if (AocForbiddenKeys.IsDerivedField(property.Name))
            {
                violations.Add(AocViolation.Create(AocViolationCode.DerivedFindingDetected, $"/{property.Name}", $"Derived field '{property.Name}' must not be written during ingestion."));
            }
        }

        foreach (var required in options.RequiredTopLevelFields)
        {
            if (!document.TryGetProperty(required, out var element) || element.ValueKind is JsonValueKind.Null or JsonValueKind.Undefined)
            {
                violations.Add(AocViolation.Create(AocViolationCode.DerivedFindingDetected, $"/{property.Name}", $"Derived field '{property.Name}' must not be written during ingestion."));
            }

            if (!allowedTopLevelFields.Contains(property.Name))
            {
                violations.Add(AocViolation.Create(AocViolationCode.UnknownField, $"/{property.Name}", $"Field '{property.Name}' is not allowed in AOC documents."));
                continue;
            }
        }

        foreach (var required in options.RequiredTopLevelFields)
        {
            if (!document.TryGetProperty(required, out var element) || element.ValueKind is JsonValueKind.Null or JsonValueKind.Undefined)
            {
                violations.Add(AocViolation.Create(AocViolationCode.MissingRequiredField, $"/{required}", $"Required field '{required}' is missing."));
                continue;

@@ -45,5 +45,10 @@ public sealed class AocHttpResultsTests
        Assert.Equal(2, violationsJson.GetArrayLength());
        Assert.Equal("ERR_AOC_004", violationsJson[0].GetProperty("code").GetString());
        Assert.Equal("/upstream", violationsJson[0].GetProperty("path").GetString());

        var errorJson = root.GetProperty("error");
        Assert.Equal("ERR_AOC_004", errorJson.GetProperty("code").GetString());
        Assert.Equal(2, errorJson.GetProperty("violations").GetArrayLength());
        Assert.False(string.IsNullOrWhiteSpace(errorJson.GetProperty("message").GetString()));
    }
}

44
src/Aoc/__Tests/StellaOps.Aoc.Tests/AocErrorTests.cs
Normal file
@@ -0,0 +1,44 @@
using System.Collections.Immutable;
using StellaOps.Aoc;

namespace StellaOps.Aoc.Tests;

public sealed class AocErrorTests
{
    [Fact]
    public void FromResult_UsesFirstViolationCode()
    {
        var violations = ImmutableArray.Create(
            AocViolation.Create(AocViolationCode.MissingProvenance, "/upstream", "Missing"),
            AocViolation.Create(AocViolationCode.ForbiddenField, "/severity", "Forbidden"));

        var result = AocGuardResult.FromViolations(violations);

        var error = AocError.FromResult(result);

        Assert.Equal("ERR_AOC_004", error.Code);
        Assert.Equal(violations, error.Violations);
    }

    [Fact]
    public void FromResult_DefaultsWhenNoViolations()
    {
        var error = AocError.FromResult(AocGuardResult.Success);

        Assert.Equal("ERR_AOC_000", error.Code);
        Assert.Contains("ERR_AOC_000", error.Message, StringComparison.OrdinalIgnoreCase);
    }

    [Fact]
    public void FromException_UsesCustomMessage()
    {
        var violations = ImmutableArray.Create(
            AocViolation.Create(AocViolationCode.ForbiddenField, "/severity", "Forbidden"));
        var exception = new AocGuardException(AocGuardResult.FromViolations(violations));

        var error = AocError.FromException(exception, "custom");

        Assert.Equal("custom", error.Message);
        Assert.Equal("ERR_AOC_001", error.Code);
    }
}
@@ -59,16 +59,17 @@ public sealed class AocWriteGuardTests
    }

    [Fact]
    public void Validate_FlagsForbiddenField()
    {
        using var document = JsonDocument.Parse("""
            {
              "tenant": "default",
              "severity": "high",
              "source": {"vendor": "osv"},
              "upstream": {
                "upstream_id": "GHSA-xxxx",
                "content_hash": "sha256:abc",
    public void Validate_FlagsForbiddenField()
    {
        using var document = JsonDocument.Parse("""
            {
              "tenant": "default",
              "identifiers": {},
              "severity": "high",
              "source": {"vendor": "osv"},
              "upstream": {
                "upstream_id": "GHSA-xxxx",
                "content_hash": "sha256:abc",
                "signature": { "present": false }
              },
              "content": {
@@ -81,16 +82,74 @@ public sealed class AocWriteGuardTests

        var result = Guard.Validate(document.RootElement);

        Assert.False(result.IsValid);
        Assert.Contains(result.Violations, v => v.ErrorCode == "ERR_AOC_001" && v.Path == "/severity");
    }

    [Fact]
    public void Validate_FlagsInvalidSignatureMetadata()
    {
        using var document = JsonDocument.Parse("""
            {
              "tenant": "default",
        Assert.False(result.IsValid);
        Assert.Contains(result.Violations, v => v.ErrorCode == "ERR_AOC_001" && v.Path == "/severity");
    }

    [Fact]
    public void Validate_FlagsUnknownField()
    {
        using var document = JsonDocument.Parse("""
            {
              "tenant": "default",
              "source": {"vendor": "osv"},
              "upstream": {
                "upstream_id": "GHSA-xxxx",
                "content_hash": "sha256:abc",
                "signature": { "present": false }
              },
              "content": {
                "format": "OSV",
                "raw": {"id": "GHSA-xxxx"}
              },
              "linkset": {},
              "custom_field": {"extra": true}
            }
            """);

        var result = Guard.Validate(document.RootElement);

        Assert.False(result.IsValid);
        Assert.Contains(result.Violations, v => v.ErrorCode == "ERR_AOC_007" && v.Path == "/custom_field");
    }

    [Fact]
    public void Validate_AllowsCustomField_WhenConfigured()
    {
        using var document = JsonDocument.Parse("""
            {
              "tenant": "default",
              "source": {"vendor": "osv"},
              "upstream": {
                "upstream_id": "GHSA-xxxx",
                "content_hash": "sha256:abc",
                "signature": { "present": false }
              },
              "content": {
                "format": "OSV",
                "raw": {"id": "GHSA-xxxx"}
              },
              "linkset": {},
              "custom_field": {"extra": true}
            }
            """);

        var options = new AocGuardOptions
        {
            AllowedTopLevelFields = AocGuardOptions.Default.AllowedTopLevelFields.Add("custom_field")
        };

        var result = Guard.Validate(document.RootElement, options);

        Assert.True(result.IsValid);
    }

    [Fact]
    public void Validate_FlagsInvalidSignatureMetadata()
    {
        using var document = JsonDocument.Parse("""
            {
              "tenant": "default",
              "source": {"vendor": "osv"},
              "upstream": {
                "upstream_id": "GHSA-xxxx",

@@ -4,7 +4,8 @@ using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Plugins;
using StellaOps.Cli.Plugins;
using StellaOps.Cli.Services.Models.AdvisoryAi;

namespace StellaOps.Cli.Commands;

@@ -35,12 +36,13 @@ internal static class CommandFactory
        root.Add(BuildSourcesCommand(services, verboseOption, cancellationToken));
        root.Add(BuildAocCommand(services, verboseOption, cancellationToken));
        root.Add(BuildAuthCommand(services, options, verboseOption, cancellationToken));
        root.Add(BuildPolicyCommand(services, options, verboseOption, cancellationToken));
        root.Add(BuildTaskRunnerCommand(services, verboseOption, cancellationToken));
        root.Add(BuildFindingsCommand(services, verboseOption, cancellationToken));
        root.Add(BuildConfigCommand(options));
        root.Add(BuildKmsCommand(services, verboseOption, cancellationToken));
        root.Add(BuildVulnCommand(services, verboseOption, cancellationToken));
        root.Add(BuildPolicyCommand(services, options, verboseOption, cancellationToken));
        root.Add(BuildTaskRunnerCommand(services, verboseOption, cancellationToken));
        root.Add(BuildFindingsCommand(services, verboseOption, cancellationToken));
        root.Add(BuildAdviseCommand(services, options, verboseOption, cancellationToken));
        root.Add(BuildConfigCommand(options));
        root.Add(BuildKmsCommand(services, verboseOption, cancellationToken));
        root.Add(BuildVulnCommand(services, verboseOption, cancellationToken));

        var pluginLogger = loggerFactory.CreateLogger<CliCommandModuleLoader>();
        var pluginLoader = new CliCommandModuleLoader(services, options, pluginLogger);
@@ -733,7 +735,7 @@ internal static class CommandFactory
        var activateVersionOption = new Option<int>("--version")
        {
            Description = "Revision version to activate.",
            IsRequired = true
            Arity = ArgumentArity.ExactlyOne
        };

        var activationNoteOption = new Option<string?>("--note")
@@ -809,11 +811,11 @@ internal static class CommandFactory
        var taskRunner = new Command("task-runner", "Interact with Task Runner operations.");

        var simulate = new Command("simulate", "Simulate a task pack and inspect the execution graph.");
        var manifestOption = new Option<string>("--manifest")
        {
            Description = "Path to the task pack manifest (YAML).",
            IsRequired = true
        };
        var manifestOption = new Option<string>("--manifest")
        {
            Description = "Path to the task pack manifest (YAML).",
            Arity = ArgumentArity.ExactlyOne
        };
        var inputsOption = new Option<string?>("--inputs")
        {
            Description = "Optional JSON file containing Task Pack input values."
@@ -1042,13 +1044,110 @@ internal static class CommandFactory
                cancellationToken);
        });

        findings.Add(list);
        findings.Add(get);
        findings.Add(explain);
        return findings;
    }

    private static Command BuildVulnCommand(IServiceProvider services, Option<bool> verboseOption, CancellationToken cancellationToken)
        findings.Add(list);
        findings.Add(get);
        findings.Add(explain);
        return findings;
    }

    private static Command BuildAdviseCommand(IServiceProvider services, StellaOpsCliOptions options, Option<bool> verboseOption, CancellationToken cancellationToken)
    {
        var advise = new Command("advise", "Interact with Advisory AI pipelines.");
        _ = options;

        var run = new Command("run", "Generate Advisory AI output for the specified task.");
        var taskArgument = new Argument<string>("task")
        {
            Description = "Task to run (summary, conflict, remediation)."
        };
        run.Add(taskArgument);

        var advisoryKeyOption = new Option<string>("--advisory-key")
        {
            Description = "Advisory identifier to summarise (required).",
            Required = true
        };
        var artifactIdOption = new Option<string?>("--artifact-id")
        {
            Description = "Optional artifact identifier to scope SBOM context."
        };
        var artifactPurlOption = new Option<string?>("--artifact-purl")
        {
            Description = "Optional package URL to scope dependency context."
        };
        var policyVersionOption = new Option<string?>("--policy-version")
        {
            Description = "Policy revision to evaluate (defaults to current)."
        };
        var profileOption = new Option<string?>("--profile")
        {
            Description = "Advisory AI execution profile (default, fips-local, etc.)."
        };
        var sectionOption = new Option<string[]>("--section")
        {
            Description = "Preferred context sections to emphasise (repeatable).",
            Arity = ArgumentArity.ZeroOrMore
        };
        sectionOption.AllowMultipleArgumentsPerToken = true;

        var forceRefreshOption = new Option<bool>("--force-refresh")
        {
            Description = "Bypass cached plan/output and recompute."
        };

        var timeoutOption = new Option<int?>("--timeout")
        {
            Description = "Seconds to wait for generated output before timing out (0 = single attempt)."
        };
        timeoutOption.Arity = ArgumentArity.ZeroOrOne;

        run.Add(advisoryKeyOption);
        run.Add(artifactIdOption);
        run.Add(artifactPurlOption);
        run.Add(policyVersionOption);
        run.Add(profileOption);
        run.Add(sectionOption);
        run.Add(forceRefreshOption);
        run.Add(timeoutOption);

        run.SetAction((parseResult, _) =>
        {
            var taskValue = parseResult.GetValue(taskArgument);
            var advisoryKey = parseResult.GetValue(advisoryKeyOption) ?? string.Empty;
            var artifactId = parseResult.GetValue(artifactIdOption);
            var artifactPurl = parseResult.GetValue(artifactPurlOption);
            var policyVersion = parseResult.GetValue(policyVersionOption);
            var profile = parseResult.GetValue(profileOption) ?? "default";
            var sections = parseResult.GetValue(sectionOption) ?? Array.Empty<string>();
            var forceRefresh = parseResult.GetValue(forceRefreshOption);
            var timeoutSeconds = parseResult.GetValue(timeoutOption) ?? 120;
            var verbose = parseResult.GetValue(verboseOption);

            if (!Enum.TryParse<AdvisoryAiTaskType>(taskValue, ignoreCase: true, out var taskType))
            {
                throw new InvalidOperationException($"Unknown advisory task '{taskValue}'. Expected summary, conflict, or remediation.");
            }

            return CommandHandlers.HandleAdviseRunAsync(
                services,
                taskType,
                advisoryKey,
                artifactId,
                artifactPurl,
                policyVersion,
                profile,
                sections,
                forceRefresh,
                timeoutSeconds,
                verbose,
                cancellationToken);
        });

        advise.Add(run);
        return advise;
    }

    private static Command BuildVulnCommand(IServiceProvider services, Option<bool> verboseOption, CancellationToken cancellationToken)
    {
        var vuln = new Command("vuln", "Explore vulnerability observations and overlays.");

@@ -24,9 +24,10 @@ using Spectre.Console.Rendering;
using StellaOps.Auth.Client;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Prompts;
using StellaOps.Cli.Services;
using StellaOps.Cli.Services.Models;
using StellaOps.Cli.Services.Models.AdvisoryAi;
using StellaOps.Cli.Telemetry;
using StellaOps.Cryptography;
using StellaOps.Cryptography.Kms;

@@ -426,14 +427,154 @@ internal static class CommandHandlers
        {
            verbosity.MinimumLevel = previousLevel;
        }
    }

    public static async Task HandleAdviseRunAsync(
        IServiceProvider services,
        AdvisoryAiTaskType taskType,
        string advisoryKey,
        string? artifactId,
        string? artifactPurl,
        string? policyVersion,
        string profile,
        IReadOnlyList<string> preferredSections,
        bool forceRefresh,
        int timeoutSeconds,
        bool verbose,
        CancellationToken cancellationToken)
    {
        await using var scope = services.CreateAsyncScope();
        var client = scope.ServiceProvider.GetRequiredService<IBackendOperationsClient>();
        var logger = scope.ServiceProvider.GetRequiredService<ILoggerFactory>().CreateLogger("advise-run");
        var verbosity = scope.ServiceProvider.GetRequiredService<VerbosityState>();
        var previousLevel = verbosity.MinimumLevel;
        verbosity.MinimumLevel = verbose ? LogLevel.Debug : LogLevel.Information;
        using var activity = CliActivitySource.Instance.StartActivity("cli.advisory.run", ActivityKind.Client);
        activity?.SetTag("stellaops.cli.command", "advise run");
        activity?.SetTag("stellaops.cli.task", taskType.ToString());
        using var duration = CliMetrics.MeasureCommandDuration("advisory run");
        activity?.SetTag("stellaops.cli.force_refresh", forceRefresh);

        var outcome = "error";
        try
        {
            var normalizedKey = advisoryKey?.Trim();
            if (string.IsNullOrWhiteSpace(normalizedKey))
            {
                throw new ArgumentException("Advisory key is required.", nameof(advisoryKey));
            }

            activity?.SetTag("stellaops.cli.advisory.key", normalizedKey);
            var normalizedProfile = string.IsNullOrWhiteSpace(profile) ? "default" : profile.Trim();
            activity?.SetTag("stellaops.cli.profile", normalizedProfile);

            var normalizedSections = NormalizeSections(preferredSections);

            var request = new AdvisoryPipelinePlanRequestModel
            {
                TaskType = taskType,
                AdvisoryKey = normalizedKey,
                ArtifactId = string.IsNullOrWhiteSpace(artifactId) ? null : artifactId!.Trim(),
                ArtifactPurl = string.IsNullOrWhiteSpace(artifactPurl) ? null : artifactPurl!.Trim(),
                PolicyVersion = string.IsNullOrWhiteSpace(policyVersion) ? null : policyVersion!.Trim(),
                Profile = normalizedProfile,
                PreferredSections = normalizedSections.Length > 0 ? normalizedSections : null,
                ForceRefresh = forceRefresh
            };

            logger.LogInformation("Requesting advisory plan for {TaskType} (advisory={AdvisoryKey}).", taskType, normalizedKey);

            var plan = await client.CreateAdvisoryPipelinePlanAsync(taskType, request, cancellationToken).ConfigureAwait(false);
            activity?.SetTag("stellaops.cli.advisory.cache_key", plan.CacheKey);
            RenderAdvisoryPlan(plan);
            logger.LogInformation("Plan {CacheKey} queued with {Chunks} chunks and {Vectors} vectors.",
                plan.CacheKey,
                plan.Chunks.Count,
                plan.Vectors.Count);

            var pollDelay = TimeSpan.FromSeconds(1);
            var shouldWait = timeoutSeconds > 0;
            var deadline = shouldWait ? DateTimeOffset.UtcNow + TimeSpan.FromSeconds(timeoutSeconds) : DateTimeOffset.UtcNow;

            AdvisoryPipelineOutputModel? output = null;
            while (true)
            {
                cancellationToken.ThrowIfCancellationRequested();

                output = await client
                    .TryGetAdvisoryPipelineOutputAsync(plan.CacheKey, taskType, normalizedProfile, cancellationToken)
                    .ConfigureAwait(false);

                if (output is not null)
                {
                    break;
                }

                if (!shouldWait || DateTimeOffset.UtcNow >= deadline)
                {
                    break;
                }

                logger.LogDebug("Advisory output pending for {CacheKey}; retrying in {DelaySeconds}s.", plan.CacheKey, pollDelay.TotalSeconds);
                await Task.Delay(pollDelay, cancellationToken).ConfigureAwait(false);
            }

            if (output is null)
            {
                logger.LogError("Timed out after {Timeout}s waiting for advisory output (cache key {CacheKey}).",
                    Math.Max(timeoutSeconds, 0),
                    plan.CacheKey);
                activity?.SetStatus(ActivityStatusCode.Error, "timeout");
                outcome = "timeout";
                Environment.ExitCode = Environment.ExitCode == 0 ? 70 : Environment.ExitCode;
                return;
            }

            activity?.SetTag("stellaops.cli.advisory.generated_at", output.GeneratedAtUtc.ToString("O", CultureInfo.InvariantCulture));
            activity?.SetTag("stellaops.cli.advisory.cache_hit", output.PlanFromCache);
            logger.LogInformation("Advisory output ready (cache key {CacheKey}).", output.CacheKey);

            RenderAdvisoryOutput(output);

            if (output.Guardrail.Blocked)
            {
                logger.LogError("Guardrail blocked advisory output (cache key {CacheKey}).", output.CacheKey);
                activity?.SetStatus(ActivityStatusCode.Error, "guardrail_blocked");
                outcome = "blocked";
                Environment.ExitCode = Environment.ExitCode == 0 ? 65 : Environment.ExitCode;
                return;
            }

            activity?.SetStatus(ActivityStatusCode.Ok);
            outcome = output.PlanFromCache ? "cache-hit" : "ok";
            Environment.ExitCode = 0;
        }
        catch (OperationCanceledException)
        {
            outcome = "cancelled";
            activity?.SetStatus(ActivityStatusCode.Error, "cancelled");
            Environment.ExitCode = Environment.ExitCode == 0 ? 130 : Environment.ExitCode;
        }
        catch (Exception ex)
        {
            activity?.SetStatus(ActivityStatusCode.Error, ex.Message);
            logger.LogError(ex, "Failed to run advisory task.");
            outcome = "error";
            Environment.ExitCode = Environment.ExitCode == 0 ? 1 : Environment.ExitCode;
        }
        finally
        {
            activity?.SetTag("stellaops.cli.advisory.outcome", outcome);
            CliMetrics.RecordAdvisoryRun(taskType.ToString(), outcome);
            verbosity.MinimumLevel = previousLevel;
        }
    }

    public static async Task HandleSourcesIngestAsync(
        IServiceProvider services,
        bool dryRun,
        string source,
        string input,
        string? tenantOverride,
        string format,
        bool disableColor,

@@ -6137,7 +6278,156 @@ internal static class CommandHandlers
        ["ERR_AOC_007"] = 17
    };

    private static string[] NormalizeSections(IReadOnlyList<string> sections)
    {
        if (sections is null || sections.Count == 0)
        {
            return Array.Empty<string>();
        }

        return sections
            .Where(section => !string.IsNullOrWhiteSpace(section))
            .Select(section => section.Trim())
            .Where(section => section.Length > 0)
            .Distinct(StringComparer.OrdinalIgnoreCase)
            .ToArray();
    }

    private static void RenderAdvisoryPlan(AdvisoryPipelinePlanResponseModel plan)
    {
        var console = AnsiConsole.Console;

        var summary = new Table()
            .Border(TableBorder.Rounded)
            .Title("[bold]Advisory Plan[/]");
        summary.AddColumn("Field");
        summary.AddColumn("Value");
        summary.AddRow("Task", Markup.Escape(plan.TaskType));
        summary.AddRow("Cache Key", Markup.Escape(plan.CacheKey));
        summary.AddRow("Prompt Template", Markup.Escape(plan.PromptTemplate));
        summary.AddRow("Chunks", plan.Chunks.Count.ToString(CultureInfo.InvariantCulture));
        summary.AddRow("Vectors", plan.Vectors.Count.ToString(CultureInfo.InvariantCulture));
        summary.AddRow("Prompt Tokens", plan.Budget.PromptTokens.ToString(CultureInfo.InvariantCulture));
        summary.AddRow("Completion Tokens", plan.Budget.CompletionTokens.ToString(CultureInfo.InvariantCulture));

        console.Write(summary);

        if (plan.Metadata.Count > 0)
        {
            console.Write(CreateKeyValueTable("Plan Metadata", plan.Metadata));
        }
    }

    private static void RenderAdvisoryOutput(AdvisoryPipelineOutputModel output)
    {
        var console = AnsiConsole.Console;

        var summary = new Table()
            .Border(TableBorder.Rounded)
            .Title("[bold]Advisory Output[/]");
        summary.AddColumn("Field");
        summary.AddColumn("Value");
        summary.AddRow("Cache Key", Markup.Escape(output.CacheKey));
        summary.AddRow("Task", Markup.Escape(output.TaskType));
        summary.AddRow("Profile", Markup.Escape(output.Profile));
        summary.AddRow("Generated", output.GeneratedAtUtc.ToString("O", CultureInfo.InvariantCulture));
        summary.AddRow("Plan From Cache", output.PlanFromCache ? "yes" : "no");
        summary.AddRow("Citations", output.Citations.Count.ToString(CultureInfo.InvariantCulture));
        summary.AddRow("Guardrail Blocked", output.Guardrail.Blocked ? "[red]yes[/]" : "no");

        console.Write(summary);

        if (!string.IsNullOrWhiteSpace(output.Prompt))
        {
            var panel = new Panel(new Markup(Markup.Escape(output.Prompt)))
            {
                Header = new PanelHeader("Prompt"),
                Border = BoxBorder.Rounded,
                Expand = true
            };
            console.Write(panel);
        }

        if (output.Citations.Count > 0)
        {
            var citations = new Table()
                .Border(TableBorder.Minimal)
                .Title("[grey]Citations[/]");
            citations.AddColumn("Index");
            citations.AddColumn("Document");
            citations.AddColumn("Chunk");

            foreach (var citation in output.Citations.OrderBy(c => c.Index))
            {
                citations.AddRow(
                    citation.Index.ToString(CultureInfo.InvariantCulture),
                    Markup.Escape(citation.DocumentId),
                    Markup.Escape(citation.ChunkId));
            }

            console.Write(citations);
        }

        if (output.Metadata.Count > 0)
        {
            console.Write(CreateKeyValueTable("Output Metadata", output.Metadata));
        }

        if (output.Guardrail.Metadata.Count > 0)
        {
            console.Write(CreateKeyValueTable("Guardrail Metadata", output.Guardrail.Metadata));
        }

        if (output.Guardrail.Violations.Count > 0)
        {
            var violations = new Table()
                .Border(TableBorder.Minimal)
                .Title("[red]Guardrail Violations[/]");
            violations.AddColumn("Code");
            violations.AddColumn("Message");

            foreach (var violation in output.Guardrail.Violations)
            {
                violations.AddRow(Markup.Escape(violation.Code), Markup.Escape(violation.Message));
            }

            console.Write(violations);
        }

        var provenance = new Table()
            .Border(TableBorder.Minimal)
            .Title("[grey]Provenance[/]");
        provenance.AddColumn("Field");
        provenance.AddColumn("Value");

        provenance.AddRow("Input Digest", Markup.Escape(output.Provenance.InputDigest));
        provenance.AddRow("Output Hash", Markup.Escape(output.Provenance.OutputHash));

        var signatures = output.Provenance.Signatures.Count == 0
            ? "none"
            : string.Join(Environment.NewLine, output.Provenance.Signatures.Select(Markup.Escape));
        provenance.AddRow("Signatures", signatures);

        console.Write(provenance);
    }

    private static Table CreateKeyValueTable(string title, IReadOnlyDictionary<string, string> entries)
    {
        var table = new Table()
            .Border(TableBorder.Minimal)
            .Title($"[grey]{Markup.Escape(title)}[/]");
        table.AddColumn("Key");
        table.AddColumn("Value");

        foreach (var kvp in entries.OrderBy(kvp => kvp.Key, StringComparer.OrdinalIgnoreCase))
        {
            table.AddRow(Markup.Escape(kvp.Key), Markup.Escape(kvp.Value));
        }

        return table;
    }

    private static IDictionary<string, object?> RemoveNullValues(Dictionary<string, object?> source)
    {
        foreach (var key in source.Where(kvp => kvp.Value is null).Select(kvp => kvp.Key).ToList())
        {

@@ -26,13 +26,15 @@ public static class CliBootstrapper
        options.PostBind = (cliOptions, configuration) =>
        {
            cliOptions.ApiKey = ResolveWithFallback(cliOptions.ApiKey, configuration, "API_KEY", "StellaOps:ApiKey", "ApiKey");
            cliOptions.BackendUrl = ResolveWithFallback(cliOptions.BackendUrl, configuration, "STELLAOPS_BACKEND_URL", "StellaOps:BackendUrl", "BackendUrl");
            cliOptions.ConcelierUrl = ResolveWithFallback(cliOptions.ConcelierUrl, configuration, "STELLAOPS_CONCELIER_URL", "StellaOps:ConcelierUrl", "ConcelierUrl");
            cliOptions.AdvisoryAiUrl = ResolveWithFallback(cliOptions.AdvisoryAiUrl, configuration, "STELLAOPS_ADVISORYAI_URL", "StellaOps:AdvisoryAiUrl", "AdvisoryAiUrl");
            cliOptions.ScannerSignaturePublicKeyPath = ResolveWithFallback(cliOptions.ScannerSignaturePublicKeyPath, configuration, "SCANNER_PUBLIC_KEY", "STELLAOPS_SCANNER_PUBLIC_KEY", "StellaOps:ScannerSignaturePublicKeyPath", "ScannerSignaturePublicKeyPath");

            cliOptions.ApiKey = cliOptions.ApiKey?.Trim() ?? string.Empty;
            cliOptions.BackendUrl = cliOptions.BackendUrl?.Trim() ?? string.Empty;
            cliOptions.ConcelierUrl = cliOptions.ConcelierUrl?.Trim() ?? string.Empty;
            cliOptions.AdvisoryAiUrl = cliOptions.AdvisoryAiUrl?.Trim() ?? string.Empty;
            cliOptions.ScannerSignaturePublicKeyPath = cliOptions.ScannerSignaturePublicKeyPath?.Trim() ?? string.Empty;

            var attemptsRaw = ResolveWithFallback(

@@ -11,7 +11,9 @@ public sealed class StellaOpsCliOptions

    public string BackendUrl { get; set; } = string.Empty;

    public string ConcelierUrl { get; set; } = string.Empty;

    public string AdvisoryAiUrl { get; set; } = string.Empty;

    public string ScannerCacheDirectory { get; set; } = "scanners";

@@ -18,7 +18,8 @@ using Microsoft.Extensions.Logging;
using StellaOps.Auth.Abstractions;
using StellaOps.Auth.Client;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services.Models;
using StellaOps.Cli.Services.Models.AdvisoryAi;
using StellaOps.Cli.Services.Models.Transport;

namespace StellaOps.Cli.Services;

@@ -30,10 +31,12 @@ internal sealed class BackendOperationsClient : IBackendOperationsClient
    private static readonly IReadOnlyDictionary<string, object?> EmptyMetadata =
        new ReadOnlyDictionary<string, object?>(new Dictionary<string, object?>(0, StringComparer.OrdinalIgnoreCase));

    private const string OperatorReasonParameterName = "operator_reason";
    private const string OperatorTicketParameterName = "operator_ticket";
    private const string BackfillReasonParameterName = "backfill_reason";
    private const string BackfillTicketParameterName = "backfill_ticket";
    private const string AdvisoryScopesHeader = "X-StellaOps-Scopes";
    private const string AdvisoryRunScope = "advisory:run";

    private readonly HttpClient _httpClient;
    private readonly StellaOpsCliOptions _options;

@@ -885,13 +888,122 @@ internal sealed class BackendOperationsClient : IBackendOperationsClient
            throw new InvalidOperationException("EntryTrace response payload was empty.");
        }

        return result;
    }

    public async Task<AdvisoryPipelinePlanResponseModel> CreateAdvisoryPipelinePlanAsync(
        AdvisoryAiTaskType taskType,
        AdvisoryPipelinePlanRequestModel request,
        CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(request);

        var taskSegment = taskType.ToString().ToLowerInvariant();
        var relative = $"v1/advisory-ai/pipeline/{taskSegment}";

        var payload = new AdvisoryPipelinePlanRequestModel
        {
            TaskType = taskType,
            AdvisoryKey = string.IsNullOrWhiteSpace(request.AdvisoryKey) ? string.Empty : request.AdvisoryKey.Trim(),
            ArtifactId = string.IsNullOrWhiteSpace(request.ArtifactId) ? null : request.ArtifactId!.Trim(),
            ArtifactPurl = string.IsNullOrWhiteSpace(request.ArtifactPurl) ? null : request.ArtifactPurl!.Trim(),
            PolicyVersion = string.IsNullOrWhiteSpace(request.PolicyVersion) ? null : request.PolicyVersion!.Trim(),
            Profile = string.IsNullOrWhiteSpace(request.Profile) ? "default" : request.Profile!.Trim(),
            PreferredSections = request.PreferredSections is null
                ? null
                : request.PreferredSections
                    .Where(static section => !string.IsNullOrWhiteSpace(section))
                    .Select(static section => section.Trim())
                    .ToArray(),
            ForceRefresh = request.ForceRefresh
        };

        using var httpRequest = CreateRequest(HttpMethod.Post, relative);
        ApplyAdvisoryAiEndpoint(httpRequest, taskType);
        await AuthorizeRequestAsync(httpRequest, cancellationToken).ConfigureAwait(false);
        httpRequest.Content = JsonContent.Create(payload, options: SerializerOptions);

        using var response = await _httpClient.SendAsync(httpRequest, cancellationToken).ConfigureAwait(false);
        if (!response.IsSuccessStatusCode)
        {
            var failure = await CreateFailureMessageAsync(response, cancellationToken).ConfigureAwait(false);
            throw new InvalidOperationException(failure);
        }

        try
        {
            var plan = await response.Content.ReadFromJsonAsync<AdvisoryPipelinePlanResponseModel>(SerializerOptions, cancellationToken).ConfigureAwait(false);
            if (plan is null)
            {
                throw new InvalidOperationException("Advisory AI plan response was empty.");
            }

            return plan;
        }
        catch (JsonException ex)
        {
            var raw = response.Content is null
                ? string.Empty
                : await response.Content.ReadAsStringAsync(cancellationToken).ConfigureAwait(false);
            throw new InvalidOperationException($"Failed to parse advisory plan response. {ex.Message}", ex)
            {
                Data = { ["payload"] = raw }
            };
        }
    }

    public async Task<AdvisoryPipelineOutputModel?> TryGetAdvisoryPipelineOutputAsync(
        string cacheKey,
        AdvisoryAiTaskType taskType,
        string profile,
        CancellationToken cancellationToken)
    {
        if (string.IsNullOrWhiteSpace(cacheKey))
        {
            throw new ArgumentException("Cache key is required.", nameof(cacheKey));
        }

        var encodedKey = Uri.EscapeDataString(cacheKey);
        var taskSegment = Uri.EscapeDataString(taskType.ToString().ToLowerInvariant());
        var resolvedProfile = string.IsNullOrWhiteSpace(profile) ? "default" : profile.Trim();
        var relative = $"v1/advisory-ai/outputs/{encodedKey}?taskType={taskSegment}&profile={Uri.EscapeDataString(resolvedProfile)}";

        using var request = CreateRequest(HttpMethod.Get, relative);
        ApplyAdvisoryAiEndpoint(request, taskType);
        await AuthorizeRequestAsync(request, cancellationToken).ConfigureAwait(false);

        using var response = await _httpClient.SendAsync(request, cancellationToken).ConfigureAwait(false);
        if (response.StatusCode == HttpStatusCode.NotFound)
        {
            return null;
        }

        if (!response.IsSuccessStatusCode)
        {
            var failure = await CreateFailureMessageAsync(response, cancellationToken).ConfigureAwait(false);
            throw new InvalidOperationException(failure);
        }

        try
        {
            return await response.Content.ReadFromJsonAsync<AdvisoryPipelineOutputModel>(SerializerOptions, cancellationToken).ConfigureAwait(false);
        }
        catch (JsonException ex)
        {
            var raw = response.Content is null
                ? string.Empty
                : await response.Content.ReadAsStringAsync(cancellationToken).ConfigureAwait(false);
            throw new InvalidOperationException($"Failed to parse advisory output response. {ex.Message}", ex)
            {
                Data = { ["payload"] = raw }
            };
        }
    }

    public async Task<IReadOnlyList<ExcititorProviderSummary>> GetExcititorProvidersAsync(bool includeDisabled, CancellationToken cancellationToken)
    {
        EnsureBackendConfigured();

        var query = includeDisabled ? "?includeDisabled=true" : string.Empty;
        using var request = CreateRequest(HttpMethod.Get, $"excititor/providers{query}");
        await AuthorizeRequestAsync(request, cancellationToken).ConfigureAwait(false);

@@ -1778,7 +1890,44 @@ internal sealed class BackendOperationsClient : IBackendOperationsClient
        return string.IsNullOrWhiteSpace(value) ? null : value.Trim();
    }

    private void ApplyAdvisoryAiEndpoint(HttpRequestMessage request, AdvisoryAiTaskType taskType)
    {
        if (request is null)
        {
            throw new ArgumentNullException(nameof(request));
        }

        var requestUri = request.RequestUri ?? throw new InvalidOperationException("Request URI was not initialized.");

        if (!string.IsNullOrWhiteSpace(_options.AdvisoryAiUrl) &&
            Uri.TryCreate(_options.AdvisoryAiUrl, UriKind.Absolute, out var advisoryBase))
        {
            if (!requestUri.IsAbsoluteUri)
            {
                request.RequestUri = new Uri(advisoryBase, requestUri.ToString());
            }
        }
        else if (!string.IsNullOrWhiteSpace(_options.AdvisoryAiUrl))
        {
            throw new InvalidOperationException($"Advisory AI URL '{_options.AdvisoryAiUrl}' is not a valid absolute URI.");
        }
        else
        {
            EnsureBackendConfigured();
        }

        var taskScope = $"advisory:{taskType.ToString().ToLowerInvariant()}";
        var combined = $"{AdvisoryRunScope} {taskScope}";

        if (request.Headers.Contains(AdvisoryScopesHeader))
        {
            request.Headers.Remove(AdvisoryScopesHeader);
        }

        request.Headers.TryAddWithoutValidation(AdvisoryScopesHeader, combined);
    }

    private HttpRequestMessage CreateRequest(HttpMethod method, string relativeUri)
    {
        if (!Uri.TryCreate(relativeUri, UriKind.RelativeOrAbsolute, out var requestUri))
        {

@@ -4,6 +4,7 @@ using System.Threading;
using System.Threading.Tasks;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services.Models;
using StellaOps.Cli.Services.Models.AdvisoryAi;

namespace StellaOps.Cli.Services;

@@ -46,4 +47,8 @@ internal interface IBackendOperationsClient
    Task<PolicyFindingExplainResult> GetPolicyFindingExplainAsync(string policyId, string findingId, string? mode, CancellationToken cancellationToken);

    Task<EntryTraceResponseModel?> GetEntryTraceAsync(string scanId, CancellationToken cancellationToken);

    Task<AdvisoryPipelinePlanResponseModel> CreateAdvisoryPipelinePlanAsync(AdvisoryAiTaskType taskType, AdvisoryPipelinePlanRequestModel request, CancellationToken cancellationToken);

    Task<AdvisoryPipelineOutputModel?> TryGetAdvisoryPipelineOutputAsync(string cacheKey, AdvisoryAiTaskType taskType, string profile, CancellationToken cancellationToken);
}

@@ -0,0 +1,139 @@
using System;
using System.Collections.Generic;
using System.Text.Json.Serialization;

namespace StellaOps.Cli.Services.Models.AdvisoryAi;

internal enum AdvisoryAiTaskType
{
    Summary,
    Conflict,
    Remediation
}

internal sealed class AdvisoryPipelinePlanRequestModel
{
    public AdvisoryAiTaskType TaskType { get; init; }

    public string AdvisoryKey { get; init; } = string.Empty;

    public string? ArtifactId { get; init; }

    public string? ArtifactPurl { get; init; }

    public string? PolicyVersion { get; init; }

    public string Profile { get; init; } = "default";

    public IReadOnlyList<string>? PreferredSections { get; init; }

    public bool ForceRefresh { get; init; }
}

internal sealed class AdvisoryPipelinePlanResponseModel
{
    public string CacheKey { get; init; } = string.Empty;

    public string TaskType { get; init; } = string.Empty;

    public string PromptTemplate { get; init; } = string.Empty;

    public AdvisoryTaskBudgetModel Budget { get; init; } = new();

    public IReadOnlyList<PipelineChunkSummaryModel> Chunks { get; init; } = Array.Empty<PipelineChunkSummaryModel>();

    public IReadOnlyList<PipelineVectorSummaryModel> Vectors { get; init; } = Array.Empty<PipelineVectorSummaryModel>();

    public Dictionary<string, string> Metadata { get; init; } = new(StringComparer.Ordinal);
}

internal sealed class AdvisoryTaskBudgetModel
{
    public int PromptTokens { get; init; }

    public int CompletionTokens { get; init; }
}

internal sealed class PipelineChunkSummaryModel
{
    public string DocumentId { get; init; } = string.Empty;

    public string ChunkId { get; init; } = string.Empty;

    public string Section { get; init; } = string.Empty;

    public string? DisplaySection { get; init; }
}

internal sealed class PipelineVectorSummaryModel
{
    public string Query { get; init; } = string.Empty;

    public IReadOnlyList<PipelineVectorMatchSummaryModel> Matches { get; init; } = Array.Empty<PipelineVectorMatchSummaryModel>();
}

internal sealed class PipelineVectorMatchSummaryModel
{
    public string ChunkId { get; init; } = string.Empty;

    public double Score { get; init; }
}

internal sealed class AdvisoryPipelineOutputModel
{
    public string CacheKey { get; init; } = string.Empty;

    public string TaskType { get; init; } = string.Empty;

    public string Profile { get; init; } = string.Empty;

    public string Prompt { get; init; } = string.Empty;

    public IReadOnlyList<AdvisoryOutputCitationModel> Citations { get; init; } = Array.Empty<AdvisoryOutputCitationModel>();

    public Dictionary<string, string> Metadata { get; init; } = new(StringComparer.Ordinal);

    public AdvisoryOutputGuardrailModel Guardrail { get; init; } = new();

    public AdvisoryOutputProvenanceModel Provenance { get; init; } = new();

    public DateTimeOffset GeneratedAtUtc { get; init; }

    public bool PlanFromCache { get; init; }
}

internal sealed class AdvisoryOutputCitationModel
{
    public int Index { get; init; }

    public string DocumentId { get; init; } = string.Empty;

    public string ChunkId { get; init; } = string.Empty;
}

internal sealed class AdvisoryOutputGuardrailModel
{
    public bool Blocked { get; init; }

    public string SanitizedPrompt { get; init; } = string.Empty;

    public IReadOnlyList<AdvisoryOutputGuardrailViolationModel> Violations { get; init; } = Array.Empty<AdvisoryOutputGuardrailViolationModel>();

    public Dictionary<string, string> Metadata { get; init; } = new(StringComparer.Ordinal);
}

internal sealed class AdvisoryOutputGuardrailViolationModel
{
    public string Code { get; init; } = string.Empty;

    public string Message { get; init; } = string.Empty;
}

internal sealed class AdvisoryOutputProvenanceModel
{
    public string InputDigest { get; init; } = string.Empty;

    public string OutputHash { get; init; } = string.Empty;

    public IReadOnlyList<string> Signatures { get; init; } = Array.Empty<string>();
}

@@ -20,6 +20,7 @@ internal static class CliMetrics
    private static readonly Counter<long> PolicyFindingsListCounter = Meter.CreateCounter<long>("stellaops.cli.policy.findings.list.count");
    private static readonly Counter<long> PolicyFindingsGetCounter = Meter.CreateCounter<long>("stellaops.cli.policy.findings.get.count");
    private static readonly Counter<long> PolicyFindingsExplainCounter = Meter.CreateCounter<long>("stellaops.cli.policy.findings.explain.count");
    private static readonly Counter<long> AdvisoryRunCounter = Meter.CreateCounter<long>("stellaops.cli.advisory.run.count");
    private static readonly Histogram<double> CommandDurationHistogram = Meter.CreateHistogram<double>("stellaops.cli.command.duration.ms");

    public static void RecordScannerDownload(string channel, bool fromCache)

@@ -70,6 +71,13 @@ internal static class CliMetrics
            new("outcome", string.IsNullOrWhiteSpace(outcome) ? "unknown" : outcome)
        });

    public static void RecordAdvisoryRun(string taskType, string outcome)
        => AdvisoryRunCounter.Add(1, new KeyValuePair<string, object?>[]
        {
            new("task", string.IsNullOrWhiteSpace(taskType) ? "unknown" : taskType.ToLowerInvariant()),
            new("outcome", string.IsNullOrWhiteSpace(outcome) ? "unknown" : outcome)
        });

    public static void RecordSourcesDryRun(string status)
        => SourcesDryRunCounter.Add(1, new KeyValuePair<string, object?>[]
        {

@@ -23,6 +23,7 @@ using StellaOps.Cli.Commands;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services;
using StellaOps.Cli.Services.Models;
using StellaOps.Cli.Services.Models.AdvisoryAi;
using StellaOps.Cli.Telemetry;
using StellaOps.Cli.Tests.Testing;
using StellaOps.Cryptography;
@@ -223,6 +224,291 @@ public sealed class CommandHandlersTests
        }
    }

    [Fact]
    public async Task HandleAdviseRunAsync_WritesOutputAndSetsExitCode()
    {
        var originalExit = Environment.ExitCode;
        var originalConsole = AnsiConsole.Console;
        var testConsole = new TestConsole();

        try
        {
            Environment.ExitCode = 0;
            AnsiConsole.Console = testConsole;

            var planResponse = new AdvisoryPipelinePlanResponseModel
            {
                TaskType = AdvisoryAiTaskType.Summary.ToString(),
                CacheKey = "cache-123",
                PromptTemplate = "prompts/advisory/summary.liquid",
                Budget = new AdvisoryTaskBudgetModel
                {
                    PromptTokens = 512,
                    CompletionTokens = 128
                },
                Chunks = new[]
                {
                    new PipelineChunkSummaryModel
                    {
                        DocumentId = "doc-1",
                        ChunkId = "chunk-1",
                        Section = "Summary",
                        DisplaySection = "Summary"
                    }
                },
                Vectors = new[]
                {
                    new PipelineVectorSummaryModel
                    {
                        Query = "summary query",
                        Matches = new[]
                        {
                            new PipelineVectorMatchSummaryModel
                            {
                                ChunkId = "chunk-1",
                                Score = 0.9
                            }
                        }
                    }
                },
                Metadata = new Dictionary<string, string>
                {
                    ["profile"] = "default"
                }
            };

            var outputResponse = new AdvisoryPipelineOutputModel
            {
                CacheKey = planResponse.CacheKey,
                TaskType = planResponse.TaskType,
                Profile = "default",
                Prompt = "Summary result",
                Citations = new[]
                {
                    new AdvisoryOutputCitationModel
                    {
                        Index = 0,
                        DocumentId = "doc-1",
                        ChunkId = "chunk-1"
                    }
                },
                Metadata = new Dictionary<string, string>
                {
                    ["confidence"] = "high"
                },
                Guardrail = new AdvisoryOutputGuardrailModel
                {
                    Blocked = false,
                    SanitizedPrompt = "Summary result",
                    Violations = Array.Empty<AdvisoryOutputGuardrailViolationModel>(),
                    Metadata = new Dictionary<string, string>()
                },
                Provenance = new AdvisoryOutputProvenanceModel
                {
                    InputDigest = "sha256:aaa",
                    OutputHash = "sha256:bbb",
                    Signatures = Array.Empty<string>()
                },
                GeneratedAtUtc = DateTimeOffset.Parse("2025-11-06T12:00:00Z", CultureInfo.InvariantCulture),
                PlanFromCache = false
            };

            var backend = new StubBackendClient(new JobTriggerResult(true, "ok", null, null))
            {
                AdvisoryPlanResponse = planResponse,
                AdvisoryOutputResponse = outputResponse
            };

            var provider = BuildServiceProvider(backend);

            await CommandHandlers.HandleAdviseRunAsync(
                provider,
                AdvisoryAiTaskType.Summary,
                " ADV-1 ",
                null,
                null,
                null,
                "default",
                new[] { "impact", "impact " },
                forceRefresh: false,
                timeoutSeconds: 0,
                verbose: false,
                cancellationToken: CancellationToken.None);

            Assert.Equal(0, Environment.ExitCode);
            Assert.Single(backend.AdvisoryPlanRequests);
            var request = backend.AdvisoryPlanRequests[0];
            Assert.Equal(AdvisoryAiTaskType.Summary, request.TaskType);
            Assert.Equal("ADV-1", request.Request.AdvisoryKey);
            Assert.NotNull(request.Request.PreferredSections);
            Assert.Single(request.Request.PreferredSections!);
            Assert.Equal("impact", request.Request.PreferredSections![0]);

            Assert.Single(backend.AdvisoryOutputRequests);
            Assert.Equal(planResponse.CacheKey, backend.AdvisoryOutputRequests[0].CacheKey);
            Assert.Equal("default", backend.AdvisoryOutputRequests[0].Profile);

            var output = testConsole.Output;
            Assert.Contains("Advisory Output", output, StringComparison.OrdinalIgnoreCase);
            Assert.Contains(planResponse.CacheKey, output, StringComparison.Ordinal);
            Assert.Contains("Summary result", output, StringComparison.Ordinal);
        }
        finally
        {
            AnsiConsole.Console = originalConsole;
            Environment.ExitCode = originalExit;
        }
    }

    [Fact]
    public async Task HandleAdviseRunAsync_ReturnsGuardrailExitCodeOnBlock()
    {
        var originalExit = Environment.ExitCode;
        var originalConsole = AnsiConsole.Console;
        var testConsole = new TestConsole();

        try
        {
            Environment.ExitCode = 0;
            AnsiConsole.Console = testConsole;

            var planResponse = new AdvisoryPipelinePlanResponseModel
            {
                TaskType = AdvisoryAiTaskType.Remediation.ToString(),
                CacheKey = "cache-guard",
                PromptTemplate = "prompts/advisory/remediation.liquid",
                Budget = new AdvisoryTaskBudgetModel
                {
                    PromptTokens = 256,
                    CompletionTokens = 64
                },
                Chunks = Array.Empty<PipelineChunkSummaryModel>(),
                Vectors = Array.Empty<PipelineVectorSummaryModel>(),
                Metadata = new Dictionary<string, string>()
            };

            var outputResponse = new AdvisoryPipelineOutputModel
            {
                CacheKey = planResponse.CacheKey,
                TaskType = planResponse.TaskType,
                Profile = "default",
                Prompt = "Blocked output",
                Citations = Array.Empty<AdvisoryOutputCitationModel>(),
                Metadata = new Dictionary<string, string>(),
                Guardrail = new AdvisoryOutputGuardrailModel
                {
                    Blocked = true,
                    SanitizedPrompt = "Blocked output",
                    Violations = new[]
                    {
                        new AdvisoryOutputGuardrailViolationModel
                        {
                            Code = "PROMPT_INJECTION",
                            Message = "Detected prompt injection attempt."
                        }
                    },
                    Metadata = new Dictionary<string, string>()
                },
                Provenance = new AdvisoryOutputProvenanceModel
                {
                    InputDigest = "sha256:ccc",
                    OutputHash = "sha256:ddd",
                    Signatures = Array.Empty<string>()
                },
                GeneratedAtUtc = DateTimeOffset.Parse("2025-11-06T13:05:00Z", CultureInfo.InvariantCulture),
                PlanFromCache = true
            };

            var backend = new StubBackendClient(new JobTriggerResult(true, "ok", null, null))
            {
                AdvisoryPlanResponse = planResponse,
                AdvisoryOutputResponse = outputResponse
            };

            var provider = BuildServiceProvider(backend);

            await CommandHandlers.HandleAdviseRunAsync(
                provider,
                AdvisoryAiTaskType.Remediation,
                "ADV-2",
                null,
                null,
                null,
                "default",
                Array.Empty<string>(),
                forceRefresh: true,
                timeoutSeconds: 0,
                verbose: false,
                cancellationToken: CancellationToken.None);

            Assert.Equal(65, Environment.ExitCode);
            Assert.Contains("Guardrail Violations", testConsole.Output, StringComparison.OrdinalIgnoreCase);
        }
        finally
        {
            AnsiConsole.Console = originalConsole;
            Environment.ExitCode = originalExit;
        }
    }

    [Fact]
    public async Task HandleAdviseRunAsync_TimesOutWhenOutputMissing()
    {
        var originalExit = Environment.ExitCode;
        var originalConsole = AnsiConsole.Console;

        try
        {
            Environment.ExitCode = 0;
            AnsiConsole.Console = new TestConsole();

            var planResponse = new AdvisoryPipelinePlanResponseModel
            {
                TaskType = AdvisoryAiTaskType.Conflict.ToString(),
                CacheKey = "cache-timeout",
                PromptTemplate = "prompts/advisory/conflict.liquid",
                Budget = new AdvisoryTaskBudgetModel
                {
                    PromptTokens = 128,
                    CompletionTokens = 32
                },
                Chunks = Array.Empty<PipelineChunkSummaryModel>(),
                Vectors = Array.Empty<PipelineVectorSummaryModel>(),
                Metadata = new Dictionary<string, string>()
            };

            var backend = new StubBackendClient(new JobTriggerResult(true, "ok", null, null))
            {
                AdvisoryPlanResponse = planResponse,
                AdvisoryOutputResponse = null
            };

            var provider = BuildServiceProvider(backend);

            await CommandHandlers.HandleAdviseRunAsync(
                provider,
                AdvisoryAiTaskType.Conflict,
                "ADV-3",
                null,
                null,
                null,
                "default",
                Array.Empty<string>(),
                forceRefresh: false,
                timeoutSeconds: 0,
                verbose: false,
                cancellationToken: CancellationToken.None);

            Assert.Equal(70, Environment.ExitCode);
            Assert.Single(backend.AdvisoryOutputRequests);
        }
        finally
        {
            AnsiConsole.Console = originalConsole;
            Environment.ExitCode = originalExit;
        }
    }

    [Fact]
    public async Task HandleAuthLoginAsync_UsesClientCredentialsFlow()
    {
@@ -1726,10 +2012,16 @@ spec:
        Assert.NotNull(backend.LastTaskRunnerSimulationRequest);

        var consoleOutput = writer.ToString();
-       Assert.Contains("\"planHash\":\"hash-xyz789\"", consoleOutput, StringComparison.Ordinal);
+       using (var consoleJson = JsonDocument.Parse(consoleOutput))
+       {
+           Assert.Equal("hash-xyz789", consoleJson.RootElement.GetProperty("planHash").GetString());
+       }

        var fileOutput = await File.ReadAllTextAsync(outputPath);
-       Assert.Contains("\"planHash\":\"hash-xyz789\"", fileOutput, StringComparison.Ordinal);
+       using (var fileJson = JsonDocument.Parse(fileOutput))
+       {
+           Assert.Equal("hash-xyz789", fileJson.RootElement.GetProperty("planHash").GetString());
+       }

        Assert.True(backend.LastTaskRunnerSimulationRequest!.Inputs!.TryGetPropertyValue("dryRun", out var dryRunNode));
        Assert.False(dryRunNode!.GetValue<bool>());
@@ -2738,6 +3030,13 @@ spec:
        public EntryTraceResponseModel? EntryTraceResponse { get; set; }
        public Exception? EntryTraceException { get; set; }
        public string? LastEntryTraceScanId { get; private set; }
        public List<(AdvisoryAiTaskType TaskType, AdvisoryPipelinePlanRequestModel Request)> AdvisoryPlanRequests { get; } = new();
        public AdvisoryPipelinePlanResponseModel? AdvisoryPlanResponse { get; set; }
        public Exception? AdvisoryPlanException { get; set; }
        public Queue<AdvisoryPipelineOutputModel?> AdvisoryOutputQueue { get; } = new();
        public AdvisoryPipelineOutputModel? AdvisoryOutputResponse { get; set; }
        public Exception? AdvisoryOutputException { get; set; }
        public List<(string CacheKey, AdvisoryAiTaskType TaskType, string Profile)> AdvisoryOutputRequests { get; } = new();

        public Task<ScannerArtifactResult> DownloadScannerAsync(string channel, string outputPath, bool overwrite, bool verbose, CancellationToken cancellationToken)
            => throw new NotImplementedException();
@@ -2890,10 +3189,52 @@ spec:

            return Task.FromResult(EntryTraceResponse);
        }

        public Task<AdvisoryPipelinePlanResponseModel> CreateAdvisoryPipelinePlanAsync(AdvisoryAiTaskType taskType, AdvisoryPipelinePlanRequestModel request, CancellationToken cancellationToken)
        {
            AdvisoryPlanRequests.Add((taskType, request));
            if (AdvisoryPlanException is not null)
            {
                throw AdvisoryPlanException;
            }

            var response = AdvisoryPlanResponse ?? new AdvisoryPipelinePlanResponseModel
            {
                TaskType = taskType.ToString(),
                CacheKey = "stub-cache-key",
                PromptTemplate = "prompts/advisory/stub.liquid",
                Budget = new AdvisoryTaskBudgetModel
                {
                    PromptTokens = 0,
                    CompletionTokens = 0
                },
                Chunks = Array.Empty<PipelineChunkSummaryModel>(),
                Vectors = Array.Empty<PipelineVectorSummaryModel>(),
                Metadata = new Dictionary<string, string>(StringComparer.Ordinal)
            };

            return Task.FromResult(response);
        }

        public Task<AdvisoryPipelineOutputModel?> TryGetAdvisoryPipelineOutputAsync(string cacheKey, AdvisoryAiTaskType taskType, string profile, CancellationToken cancellationToken)
        {
            AdvisoryOutputRequests.Add((cacheKey, taskType, profile));
            if (AdvisoryOutputException is not null)
            {
                throw AdvisoryOutputException;
            }

            if (AdvisoryOutputQueue.Count > 0)
            {
                return Task.FromResult(AdvisoryOutputQueue.Dequeue());
            }

            return Task.FromResult(AdvisoryOutputResponse);
        }
    }

    private sealed class StubExecutor : IScannerExecutor
    {
        private readonly ScannerExecutionResult _result;

        public StubExecutor(ScannerExecutionResult result)
@@ -19,16 +19,20 @@ public sealed class EgressPolicyHttpMessageHandlerTests
        {
            Mode = EgressPolicyMode.Sealed
        };
-       options.AddAllowRule(example.com);
+       options.AddAllowRule("example.com");

        var policy = new EgressPolicy(options);
-       var handler = new EgressPolicyHttpMessageHandler(policy, NullLogger<EgressPolicyHttpMessageHandler>.Instance, cli, test)
+       var handler = new EgressPolicyHttpMessageHandler(
+           policy,
+           NullLogger.Instance,
+           component: "cli-tests",
+           intent: "allow-test")
        {
            InnerHandler = new StubHandler()
        };

-       var client = new HttpClient(handler, disposeHandler: true);
-       var response = await client.GetAsync(https://example.com/resource, CancellationToken.None).ConfigureAwait(false);
+       using var client = new HttpClient(handler, disposeHandler: true);
+       var response = await client.GetAsync("https://example.com/resource", CancellationToken.None);

        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
    }
@@ -42,15 +46,19 @@ public sealed class EgressPolicyHttpMessageHandlerTests
        };

        var policy = new EgressPolicy(options);
-       var handler = new EgressPolicyHttpMessageHandler(policy, NullLogger<EgressPolicyHttpMessageHandler>.Instance, cli, test)
+       var handler = new EgressPolicyHttpMessageHandler(
+           policy,
+           NullLogger.Instance,
+           component: "cli-tests",
+           intent: "deny-test")
        {
            InnerHandler = new StubHandler()
        };

-       var client = new HttpClient(handler, disposeHandler: true);
+       using var client = new HttpClient(handler, disposeHandler: true);

        var exception = await Assert.ThrowsAsync<AirGapEgressBlockedException>(
-           () => client.GetAsync(https://blocked.example, CancellationToken.None)).ConfigureAwait(false);
+           () => client.GetAsync("https://blocked.example", CancellationToken.None));

        Assert.Contains(AirGapEgressBlockedException.ErrorCode, exception.Message, StringComparison.OrdinalIgnoreCase);
    }

@@ -574,7 +574,7 @@ public sealed class BackendOperationsClientTests
        var result = await client.TriggerJobAsync("test", new Dictionary<string, object?>(), CancellationToken.None);

        Assert.True(result.Success);
-       var metadata = Assert.NotNull(tokenClient.LastAdditionalParameters);
+       var metadata = Assert.IsAssignableFrom<IReadOnlyDictionary<string, string>>(tokenClient.LastAdditionalParameters);
        Assert.Equal("Resume operations", metadata["operator_reason"]);
        Assert.Equal("INC-6006", metadata["operator_ticket"]);
        Assert.Equal("Historical rebuild", metadata["backfill_reason"]);

@@ -0,0 +1,26 @@
using System.Collections.Generic;

namespace StellaOps.Concelier.WebService.Contracts;

public sealed record AdvisoryChunkCollectionResponse(
    string AdvisoryKey,
    int Total,
    bool Truncated,
    IReadOnlyList<AdvisoryChunkItemResponse> Chunks,
    IReadOnlyList<AdvisoryChunkSourceResponse> Sources);

public sealed record AdvisoryChunkItemResponse(
    string DocumentId,
    string ChunkId,
    string Section,
    string ParagraphId,
    string Text,
    IReadOnlyDictionary<string, string> Metadata);

public sealed record AdvisoryChunkSourceResponse(
    string ObservationId,
    string DocumentId,
    string Format,
    string Vendor,
    string ContentHash,
    DateTimeOffset CreatedAt);
@@ -111,14 +111,17 @@ internal static class JobRegistrationExtensions

    private static void ConfigureMergeJob(JobSchedulerOptions options, IConfiguration configuration)
    {
-       var noMergeEnabled = configuration.GetValue<bool?>("concelier:features:noMergeEnabled") ?? true;
+       var noMergeEnabled = configuration.GetValue<bool?>("concelier:features:noMergeEnabled")
+           ?? configuration.GetValue<bool?>("features:noMergeEnabled")
+           ?? true;
        if (noMergeEnabled)
        {
            options.Definitions.Remove(MergeReconcileBuiltInJob.Kind);
            return;
        }

-       var allowlist = configuration.GetSection("concelier:jobs:merge:allowlist").Get<string[]>();
+       var allowlist = configuration.GetSection("concelier:jobs:merge:allowlist").Get<string[]>()
+           ?? configuration.GetSection("jobs:merge:allowlist").Get<string[]>();
        if (allowlist is { Length: > 0 })
        {
            var allowlistSet = new HashSet<string>(allowlist, StringComparer.OrdinalIgnoreCase);

@@ -17,6 +17,8 @@ public sealed class ConcelierOptions
    public MirrorOptions Mirror { get; set; } = new();

    public FeaturesOptions Features { get; set; } = new();

    public AdvisoryChunkOptions AdvisoryChunks { get; set; } = new();

    public sealed class StorageOptions
    {
@@ -81,6 +83,8 @@ public sealed class ConcelierOptions

        public IList<string> RequiredScopes { get; set; } = new List<string>();

        public IList<string> RequiredTenants { get; set; } = new List<string>();

        public IList<string> BypassNetworks { get; set; } = new List<string>();

        public string? ClientId { get; set; }
@@ -146,4 +150,19 @@ public sealed class ConcelierOptions

        public IList<string> MergeJobAllowlist { get; } = new List<string>();
    }

    public sealed class AdvisoryChunkOptions
    {
        public int DefaultChunkLimit { get; set; } = 200;

        public int MaxChunkLimit { get; set; } = 400;

        public int DefaultObservationLimit { get; set; } = 24;

        public int MaxObservationLimit { get; set; } = 48;

        public int DefaultMinimumLength { get; set; } = 64;

        public int MaxMinimumLength { get; set; } = 512;
    }
}

@@ -30,11 +30,14 @@ public static class ConcelierOptionsValidator

        options.Authority ??= new ConcelierOptions.AuthorityOptions();
        options.Authority.Resilience ??= new ConcelierOptions.AuthorityOptions.ResilienceOptions();
        options.Authority.RequiredTenants ??= new List<string>();
        NormalizeList(options.Authority.Audiences, toLower: false);
        NormalizeList(options.Authority.RequiredScopes, toLower: true);
        NormalizeList(options.Authority.BypassNetworks, toLower: false);
        NormalizeList(options.Authority.ClientScopes, toLower: true);
        NormalizeList(options.Authority.RequiredTenants, toLower: true);
        ValidateResilience(options.Authority.Resilience);
        ValidateTenantAllowlist(options.Authority.RequiredTenants);

        if (options.Authority.RequiredScopes.Count == 0)
        {
@@ -133,6 +136,9 @@ public static class ConcelierOptionsValidator

        options.Mirror ??= new ConcelierOptions.MirrorOptions();
        ValidateMirror(options.Mirror);

        options.AdvisoryChunks ??= new ConcelierOptions.AdvisoryChunkOptions();
        ValidateAdvisoryChunks(options.AdvisoryChunks);
    }

    private static void NormalizeList(IList<string> values, bool toLower)
@@ -190,6 +196,32 @@ public static class ConcelierOptionsValidator
        }
    }

    private static void ValidateTenantAllowlist(IList<string> tenants)
    {
        if (tenants is null || tenants.Count == 0)
        {
            return;
        }

        foreach (var tenant in tenants)
        {
            if (string.IsNullOrEmpty(tenant) || tenant.Length > 64)
            {
                throw new InvalidOperationException("Authority requiredTenants entries must be between 1 and 64 characters.");
            }

            foreach (var ch in tenant)
            {
                var isAlpha = ch is >= 'a' and <= 'z';
                var isDigit = ch is >= '0' and <= '9';
                if (!isAlpha && !isDigit && ch != '-')
                {
                    throw new InvalidOperationException($"Authority requiredTenants entry '{tenant}' contains invalid character '{ch}'. Use lowercase letters, digits, or '-'.");
                }
            }
        }
    }

    private static void ValidateMirror(ConcelierOptions.MirrorOptions mirror)
    {
        if (mirror.MaxIndexRequestsPerHour < 0)
@@ -242,4 +274,37 @@ public static class ConcelierOptionsValidator
            throw new InvalidOperationException("Mirror distribution requires at least one domain when enabled.");
        }
    }

    private static void ValidateAdvisoryChunks(ConcelierOptions.AdvisoryChunkOptions chunks)
    {
        if (chunks.DefaultChunkLimit <= 0)
        {
            throw new InvalidOperationException("Advisory chunk defaultChunkLimit must be greater than zero.");
        }

        if (chunks.MaxChunkLimit < chunks.DefaultChunkLimit)
        {
            throw new InvalidOperationException("Advisory chunk maxChunkLimit must be greater than or equal to defaultChunkLimit.");
        }

        if (chunks.DefaultObservationLimit <= 0)
        {
            throw new InvalidOperationException("Advisory chunk defaultObservationLimit must be greater than zero.");
        }

        if (chunks.MaxObservationLimit < chunks.DefaultObservationLimit)
        {
            throw new InvalidOperationException("Advisory chunk maxObservationLimit must be greater than or equal to defaultObservationLimit.");
        }

        if (chunks.DefaultMinimumLength <= 0)
        {
            throw new InvalidOperationException("Advisory chunk defaultMinimumLength must be greater than zero.");
        }

        if (chunks.MaxMinimumLength < chunks.DefaultMinimumLength)
        {
            throw new InvalidOperationException("Advisory chunk maxMinimumLength must be greater than or equal to defaultMinimumLength.");
        }
    }
}

@@ -235,6 +235,12 @@ var resolvedConcelierOptions = app.Services.GetRequiredService<IOptions<Concelie
var resolvedAuthority = resolvedConcelierOptions.Authority ?? new ConcelierOptions.AuthorityOptions();
authorityConfigured = resolvedAuthority.Enabled;
var enforceAuthority = resolvedAuthority.Enabled && !resolvedAuthority.AllowAnonymousFallback;
var requiredTenants = (resolvedAuthority.RequiredTenants ?? Array.Empty<string>())
    .Select(static tenant => tenant?.Trim().ToLowerInvariant())
    .Where(static tenant => !string.IsNullOrWhiteSpace(tenant))
    .Distinct(StringComparer.Ordinal)
    .ToImmutableHashSet(StringComparer.Ordinal);
var enforceTenantAllowlist = !requiredTenants.IsEmpty;

if (resolvedAuthority.Enabled && resolvedAuthority.AllowAnonymousFallback)
{
@@ -358,11 +364,14 @@ var advisoryIngestEndpoint = app.MapPost("/ingest/advisory", async (
    AdvisoryIngestRequest request,
    [FromServices] IAdvisoryRawService rawService,
    [FromServices] TimeProvider timeProvider,
    [FromServices] ILogger<Program> logger,
    CancellationToken cancellationToken) =>
{
    ApplyNoCache(context.Response);

-   if (request is null || request.Source is null || request.Upstream is null || request.Content is null || request.Identifiers is null)
+   var ingestRequest = request;
+
+   if (ingestRequest is null || ingestRequest.Source is null || ingestRequest.Upstream is null || ingestRequest.Content is null || ingestRequest.Identifiers is null)
    {
        return Problem(context, "Invalid request", StatusCodes.Status400BadRequest, ProblemTypes.Validation, "source, upstream, content, and identifiers sections are required.");
    }
@@ -381,7 +390,14 @@ var advisoryIngestEndpoint = app.MapPost("/ingest/advisory", async (
    AdvisoryRawDocument document;
    try
    {
-       document = AdvisoryRawRequestMapper.Map(request, tenant, timeProvider);
+       logger.LogWarning(
+           "Binding advisory ingest request hash={Hash}",
+           ingestRequest.Upstream.ContentHash ?? "(null)");
+
+       document = AdvisoryRawRequestMapper.Map(ingestRequest, tenant, timeProvider);
+       logger.LogWarning(
+           "Mapped advisory_raw document hash={Hash}",
+           string.IsNullOrWhiteSpace(document.Upstream.ContentHash) ? "(empty)" : document.Upstream.ContentHash);
    }
    catch (Exception ex) when (ex is ArgumentException or InvalidOperationException)
    {
@@ -418,6 +434,15 @@ var advisoryIngestEndpoint = app.MapPost("/ingest/advisory", async (
    }
    catch (ConcelierAocGuardException guardException)
    {
        logger.LogWarning(
            guardException,
            "AOC guard rejected advisory ingest tenant={Tenant} upstream={UpstreamId} requestHash={RequestHash} documentHash={DocumentHash} codes={Codes}",
            tenant,
            document.Upstream.UpstreamId,
            request!.Upstream?.ContentHash ?? "(null)",
            string.IsNullOrWhiteSpace(document.Upstream.ContentHash) ? "(empty)" : document.Upstream.ContentHash,
            string.Join(',', guardException.Violations.Select(static violation => violation.ErrorCode)));

        IngestionMetrics.ViolationCounter.Add(1, new[]
        {
            new KeyValuePair<string, object?>("tenant", tenant),
@@ -945,6 +970,11 @@ IResult? EnsureTenantAuthorized(HttpContext context, string tenant)
        return null;
    }

    if (enforceTenantAllowlist && !requiredTenants.Contains(tenant))
    {
        return Results.Forbid();
    }

    var principal = context.User;

    if (enforceAuthority && (principal?.Identity?.IsAuthenticated != true))
@@ -965,6 +995,11 @@ IResult? EnsureTenantAuthorized(HttpContext context, string tenant)
        {
            return Results.Forbid();
        }

        if (enforceTenantAllowlist && !requiredTenants.Contains(normalizedClaim))
        {
            return Results.Forbid();
        }
    }

    return null;

@@ -0,0 +1,257 @@
|
||||
using System.Collections.Immutable;
|
||||
using System.Globalization;
|
||||
using System.Security.Cryptography;
|
||||
using System.Text;
|
||||
using System.Text.Json;
|
||||
using System.Text.Json.Nodes;
|
||||
using StellaOps.Concelier.Models.Observations;
|
||||
using StellaOps.Concelier.WebService.Contracts;
|
||||
|
||||
namespace StellaOps.Concelier.WebService.Services;
|
||||
|
||||
internal sealed record AdvisoryChunkBuildOptions(
|
||||
string AdvisoryKey,
|
||||
int ChunkLimit,
|
||||
int ObservationLimit,
|
||||
ImmutableHashSet<string> SectionFilter,
|
||||
ImmutableHashSet<string> FormatFilter,
|
||||
int MinimumLength);
|
||||
|
||||
internal sealed class AdvisoryChunkBuilder
|
||||
{
|
||||
private const int DefaultMinLength = 40;
|
||||
|
||||
public AdvisoryChunkCollectionResponse Build(
|
||||
AdvisoryChunkBuildOptions options,
|
||||
IReadOnlyList<AdvisoryObservation> observations)
|
||||
{
|
||||
var chunks = new List<AdvisoryChunkItemResponse>(Math.Min(options.ChunkLimit, 256));
|
||||
var sources = new List<AdvisoryChunkSourceResponse>();
|
||||
var total = 0;
|
||||
var truncated = false;
|
||||
|
||||
foreach (var observation in observations
|
||||
.OrderByDescending(o => o.CreatedAt))
|
||||
{
|
||||
if (sources.Count >= options.ObservationLimit)
|
||||
{
|
||||
truncated = truncated || chunks.Count == options.ChunkLimit;
|
||||
break;
|
||||
}
|
||||
|
||||
if (options.FormatFilter.Count > 0 &&
|
||||
!options.FormatFilter.Contains(observation.Content.Format))
|
||||
{
|
||||
continue;
|
||||
}
|
||||
|
||||
var documentId = DetermineDocumentId(observation);
|
||||
sources.Add(new AdvisoryChunkSourceResponse(
|
||||
observation.ObservationId,
|
||||
documentId,
|
||||
observation.Content.Format,
|
||||
observation.Source.Vendor,
|
||||
observation.Upstream.ContentHash,
|
||||
observation.CreatedAt));
|
||||
|
||||
foreach (var chunk in ExtractChunks(observation, documentId, options))
|
||||
{
|
||||
total++;
|
||||
if (chunks.Count < options.ChunkLimit)
|
||||
{
|
||||
chunks.Add(chunk);
|
||||
}
|
||||
else
|
||||
{
|
||||
truncated = true;
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (truncated)
|
||||
{
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (!truncated)
|
||||
{
|
||||
total = chunks.Count;
|
||||
}
|
||||
|
||||
return new AdvisoryChunkCollectionResponse(
|
||||
options.AdvisoryKey,
|
||||
total,
|
||||
truncated,
|
||||
chunks,
|
||||
sources);
|
||||
}
|
||||
|
||||
private static string DetermineDocumentId(AdvisoryObservation observation)
|
||||
{
|
||||
if (!string.IsNullOrWhiteSpace(observation.Upstream.UpstreamId))
|
||||
{
|
||||
return observation.Upstream.UpstreamId;
|
||||
}
|
||||
|
||||
return observation.ObservationId;
|
||||
}
|
||||
|
||||
private static IEnumerable<AdvisoryChunkItemResponse> ExtractChunks(
|
        AdvisoryObservation observation,
        string documentId,
        AdvisoryChunkBuildOptions options)
    {
        var root = observation.Content.Raw;
        if (root is null)
        {
            yield break;
        }

        var stack = new Stack<(JsonNode Node, string Path, string Section)>();
        stack.Push((root, string.Empty, string.Empty));

        while (stack.Count > 0)
        {
            var (node, path, section) = stack.Pop();
            if (node is null)
            {
                continue;
            }

            switch (node)
            {
                case JsonValue value when TryNormalize(value, out var text):
                    if (text.Length < Math.Max(options.MinimumLength, DefaultMinLength))
                    {
                        continue;
                    }

                    if (!ContainsLetter(text))
                    {
                        continue;
                    }

                    var resolvedSection = string.IsNullOrEmpty(section) ? documentId : section;
                    if (options.SectionFilter.Count > 0 && !options.SectionFilter.Contains(resolvedSection))
                    {
                        continue;
                    }

                    var paragraphId = string.IsNullOrEmpty(path) ? resolvedSection : path;
                    var chunkId = CreateChunkId(documentId, paragraphId);
                    var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
                    {
                        ["path"] = paragraphId,
                        ["section"] = resolvedSection,
                        ["format"] = observation.Content.Format
                    };

                    if (!string.IsNullOrEmpty(observation.Content.SpecVersion))
                    {
                        metadata["specVersion"] = observation.Content.SpecVersion!;
                    }

                    yield return new AdvisoryChunkItemResponse(
                        documentId,
                        chunkId,
                        resolvedSection,
                        paragraphId,
                        text,
                        metadata);
                    break;

                case JsonObject obj:
                    foreach (var property in obj.Reverse())
                    {
                        var childSection = string.IsNullOrEmpty(section) ? property.Key : section;
                        var childPath = AppendPath(path, property.Key);
                        if (property.Value is { } childNode)
                        {
                            stack.Push((childNode, childPath, childSection));
                        }
                    }

                    break;

                case JsonArray array:
                    for (var index = array.Count - 1; index >= 0; index--)
                    {
                        var childPath = AppendIndex(path, index);
                        if (array[index] is { } childNode)
                        {
                            stack.Push((childNode, childPath, section));
                        }
                    }

                    break;
            }
        }
    }

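The traversal above flattens nested JSON into chunks without recursion: children are pushed in reverse so they pop in document order, dotted paths address object members, and bracketed indices address array elements. A minimal Python sketch of the same idea (the name `walk_chunks` is illustrative, not part of the codebase):

```python
def walk_chunks(root):
    """Yield (path, section, text) for every string leaf, in document order."""
    stack = [(root, "", "")]
    out = []
    while stack:
        node, path, section = stack.pop()
        if isinstance(node, str):
            out.append((path, section, node))
        elif isinstance(node, dict):
            # Push in reverse so entries pop (and emit) in original key order.
            for key, child in reversed(list(node.items())):
                child_section = section or key
                child_path = f"{path}.{key}" if path else key
                stack.append((child, child_path, child_section))
        elif isinstance(node, list):
            for index in range(len(node) - 1, -1, -1):
                stack.append((node[index], f"{path}[{index}]", section))
    return out

doc = {"summary": {"intro": "hello"}, "details": ["a", {"body": "b"}]}
print(walk_chunks(doc))
```

The C# version additionally filters short or letter-free leaves and applies the section filter; this sketch keeps only the addressing scheme (`summary.intro`, `details[0]`, `details[1].body`).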
    private static bool TryNormalize(JsonValue value, out string normalized)
    {
        normalized = string.Empty;
        if (!value.TryGetValue(out string? text) || text is null)
        {
            return false;
        }

        var span = text.AsSpan();
        var builder = new StringBuilder(span.Length);
        var previousWhitespace = false;

        foreach (var ch in span)
        {
            if (char.IsControl(ch) && !char.IsWhiteSpace(ch))
            {
                continue;
            }

            if (char.IsWhiteSpace(ch))
            {
                if (previousWhitespace)
                {
                    continue;
                }

                builder.Append(' ');
                previousWhitespace = true;
            }
            else
            {
                builder.Append(ch);
                previousWhitespace = false;
            }
        }

        normalized = builder.ToString().Trim();
        return normalized.Length > 0;
    }

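`TryNormalize` collapses every run of whitespace to a single space, drops control characters that are not whitespace, and trims the result, so chunk text is deterministic regardless of upstream formatting. The same logic in a short Python sketch (function name illustrative):

```python
def normalize(text):
    """Collapse whitespace runs to one space, drop non-whitespace control chars, trim."""
    out = []
    prev_ws = False
    for ch in text:
        if ch.isspace():
            if not prev_ws:
                out.append(" ")
            prev_ws = True
        elif not (ord(ch) < 32 or ord(ch) == 127):
            out.append(ch)
            prev_ws = False
        # Non-whitespace control chars fall through: skipped without resetting prev_ws.
    return "".join(out).strip()

print(normalize("  a\t\tb\nc  "))  # -> "a b c"
```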
    private static bool ContainsLetter(string text)
        => text.Any(static ch => char.IsLetter(ch));

    private static string AppendPath(string path, string? segment)
    {
        var safeSegment = segment ?? string.Empty;
        return string.IsNullOrEmpty(path) ? safeSegment : string.Concat(path, '.', safeSegment);
    }

    private static string AppendIndex(string path, int index)
    {
        if (string.IsNullOrEmpty(path))
        {
            return $"[{index}]";
        }

        return string.Concat(path, '[', index.ToString(CultureInfo.InvariantCulture), ']');
    }

    private static string CreateChunkId(string documentId, string paragraphId)
    {
        var input = string.Concat(documentId, '|', paragraphId);
        var hash = SHA256.HashData(Encoding.UTF8.GetBytes(input));
        return string.Concat(documentId, ':', Convert.ToHexString(hash.AsSpan(0, 8)));
    }
}

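`CreateChunkId` derives a stable suffix from the first 8 bytes of a SHA-256 over `documentId|paragraphId`, so the same paragraph always yields the same chunk ID without storing a mapping. A hedged Python equivalent:

```python
import hashlib

def create_chunk_id(document_id, paragraph_id):
    """Stable chunk id: document id plus 8 leading SHA-256 bytes in hex."""
    digest = hashlib.sha256(f"{document_id}|{paragraph_id}".encode("utf-8")).digest()
    # Upper-case hex mirrors Convert.ToHexString(hash.AsSpan(0, 8)) in the C# helper.
    return f"{document_id}:{digest[:8].hex().upper()}"

chunk_id = create_chunk_id("cve-2025-0001", "summary.intro")
print(chunk_id)  # "cve-2025-0001:" followed by 16 hex characters
```

Truncating to 8 bytes keeps IDs short while leaving collisions negligibly likely within a single document.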
@@ -63,7 +63,7 @@

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| CONCELIER-AIAI-31-001 `Paragraph anchors` | TODO | Concelier WebService Guild | CONCELIER-VULN-29-001 | Expose advisory chunk API returning paragraph anchors, section metadata, and token-safe text for Advisory AI retrieval. |
| CONCELIER-AIAI-31-001 `Paragraph anchors` | DONE | Concelier WebService Guild | CONCELIER-VULN-29-001 | Expose advisory chunk API returning paragraph anchors, section metadata, and token-safe text for Advisory AI retrieval. See docs/updates/2025-11-07-concelier-advisory-chunks.md. |
| CONCELIER-AIAI-31-002 `Structured fields` | TODO | Concelier WebService Guild | CONCELIER-AIAI-31-001 | Ensure observation APIs expose upstream workaround/fix/CVSS fields with provenance; add caching for summary queries. |
| CONCELIER-AIAI-31-003 `Advisory AI telemetry` | TODO | Concelier WebService Guild, Observability Guild | CONCELIER-AIAI-31-001 | Emit metrics/logs for chunk requests, cache hits, and guardrail blocks triggered by advisory payloads. |

@@ -12,7 +12,7 @@
| CONCELIER-CORE-AOC-19-004 `Remove ingestion normalization` | DONE (2025-11-06) | Concelier Core Guild | CONCELIER-CORE-AOC-19-002, POLICY-AOC-19-003 | Strip normalization/dedup/severity logic from ingestion pipelines, delegate derived computations to Policy Engine, and update exporters/tests to consume raw documents only.<br>2025-10-29 19:05Z: Audit completed for `AdvisoryRawService`/Mongo repo to confirm alias order/dedup removal persists; identified remaining normalization in observation/linkset factory that will be revised to surface raw duplicates for Policy ingestion. Change sketch + regression matrix drafted under `docs/dev/aoc-normalization-removal-notes.md` (pending commit).<br>2025-10-31 20:45Z: Added raw linkset projection to observations/storage, exposing canonical+raw views, refreshed fixtures/tests, and documented behaviour in models/doc factory.<br>2025-10-31 21:10Z: Coordinated with Policy Engine (POLICY-ENGINE-20-003) on adoption timeline; backfill + consumer readiness tracked in `docs/dev/raw-linkset-backfill-plan.md`.<br>2025-11-05 14:25Z: Resuming to document merge-dependent normalization paths and prepare implementation notes for `noMergeEnabled` gating before code changes land.<br>2025-11-05 19:20Z: Observation factory/linkset now preserve upstream ordering + duplicates; canonicalisation responsibility shifts to downstream consumers with refreshed unit coverage.<br>2025-11-06 16:10Z: Updated AOC reference/backfill docs with raw vs canonical guidance and cross-linked analyzer guardrails.<br>2025-11-06 23:40Z: Final pass preserves raw alias casing/whitespace end-to-end; query filters now compare case-insensitively, exporter fixtures refreshed, and docs aligned. Tests: `StellaOps.Concelier.Models/Core/Storage.Mongo.Tests` green on .NET 10 preview. |
> Docs alignment (2025-10-26): Architecture overview emphasises policy-only derivation; coordinate with Policy Engine guild for rollout.
> 2025-10-29: `AdvisoryRawService` now preserves upstream alias/linkset ordering (trim-only) and updated AOC documentation reflects the behaviour; follow-up to ensure policy consumers handle duplicates remains open.
| CONCELIER-CORE-AOC-19-013 `Authority tenant scope smoke coverage` | TODO | Concelier Core Guild | AUTH-AOC-19-002 | Extend Concelier smoke/e2e fixtures to configure `requiredTenants` and assert cross-tenant rejection with updated Authority tokens. | Coordinate deliverable so Authority docs (`AUTH-AOC-19-003`) can close once tests are in place. |
| CONCELIER-CORE-AOC-19-013 `Authority tenant scope smoke coverage` | DONE (2025-11-07) | Concelier Core Guild | AUTH-AOC-19-002 | Extend Concelier smoke/e2e fixtures to configure `requiredTenants` and assert cross-tenant rejection with updated Authority tokens. | Coordinate deliverable so Authority docs (`AUTH-AOC-19-003`) can close once tests are in place. |

## Policy Engine v2

@@ -18,7 +18,9 @@ public static class MergeServiceCollectionExtensions
        ArgumentNullException.ThrowIfNull(services);
        ArgumentNullException.ThrowIfNull(configuration);

        var noMergeEnabled = configuration.GetValue<bool?>("concelier:features:noMergeEnabled") ?? true;
        var noMergeEnabled = configuration.GetValue<bool?>("concelier:features:noMergeEnabled")
            ?? configuration.GetValue<bool?>("features:noMergeEnabled")
            ?? true;
        if (noMergeEnabled)
        {
            return services;

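The change widens the flag lookup: the namespaced key wins, then the bare key, then the default of enabled. The null-coalescing chain behaves like this Python sketch (the function name is illustrative):

```python
def resolve_flag(config, keys, default=True):
    """Return the first explicitly set value among keys; otherwise the default."""
    for key in keys:
        value = config.get(key)
        if value is not None:
            return value
    return default

cfg = {"features:noMergeEnabled": False}
print(resolve_flag(cfg, ["concelier:features:noMergeEnabled", "features:noMergeEnabled"]))  # -> False
```

Note that an explicit `False` still short-circuits the chain, which is exactly why `GetValue<bool?>` (nullable) is used rather than `GetValue<bool>`: only a missing key falls through to the next source.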
@@ -10,6 +10,6 @@
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|MERGE-LNM-21-001 Migration plan authoring|BE-Merge, Architecture Guild|CONCELIER-LNM-21-101|**DONE (2025-11-03)** – Authored `docs/migration/no-merge.md` with rollout phases, backfill/validation checklists, rollback guidance, and ownership matrix for the Link-Not-Merge cutover.|
|MERGE-LNM-21-002 Merge service deprecation|BE-Merge|MERGE-LNM-21-001|**DOING (2025-11-06)** – Defaulted `concelier:features:noMergeEnabled` to `true`, added merge job allowlist gate, and began rewiring guard/tier tests; follow-up work required to restore Concelier WebService test suite before declaring completion.<br>2025-11-05 14:42Z: Implemented `concelier:features:noMergeEnabled` gate, merge job allowlist checks, `[Obsolete]` markings, and analyzer scaffolding to steer consumers toward linkset APIs.<br>2025-11-06 16:10Z: Introduced Roslyn analyzer (`CONCELIER0002`) referenced by Concelier WebService + tests, documented suppression guidance, and updated migration playbook.<br>2025-11-07 03:25Z: Default-on toggle + job gating break existing Concelier WebService tests; guard + seed fixes pending to unblock ingest/mirror suites.|
|MERGE-LNM-21-002 Merge service deprecation|BE-Merge|MERGE-LNM-21-001|**DOING (2025-11-07)** – Defaulted `concelier:features:noMergeEnabled` to `true`, added merge job allowlist gate, and began rewiring guard/tier tests; follow-up work required to restore Concelier WebService test suite before declaring completion.<br>2025-11-05 14:42Z: Implemented `concelier:features:noMergeEnabled` gate, merge job allowlist checks, `[Obsolete]` markings, and analyzer scaffolding to steer consumers toward linkset APIs.<br>2025-11-06 16:10Z: Introduced Roslyn analyzer (`CONCELIER0002`) referenced by Concelier WebService + tests, documented suppression guidance, and updated migration playbook.<br>2025-11-07 03:25Z: Default-on toggle + job gating break existing Concelier WebService tests; guard + seed fixes pending to unblock ingest/mirror suites.<br>2025-11-07 07:05Z: Added ingest logging + test log dumps to trace upstream hash loss; still chasing why Minimal API binding strips `upstream.contentHash` before the guard runs.|
> 2025-11-03: Catalogued call sites (WebService Program `AddMergeModule`, built-in job registration `merge:reconcile`, `MergeReconcileJob`) and confirmed unit tests are the only direct `MergeAsync` callers; next step is to define analyzer + replacement observability coverage.
|MERGE-LNM-21-003 Determinism/test updates|QA Guild, BE-Merge|MERGE-LNM-21-002|Replace merge determinism suites with observation/linkset regression tests verifying no data mutation and conflicts remain visible.|

@@ -9,6 +9,7 @@ using System.Net;
using System.Net.Http.Json;
using System.Net.Http.Headers;
using System.Security.Claims;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using Microsoft.AspNetCore.Builder;
@@ -22,6 +23,7 @@ using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using Mongo2Go;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using MongoDB.Driver;
using StellaOps.Concelier.Core.Events;
using StellaOps.Concelier.Core.Jobs;
@@ -29,6 +31,7 @@ using StellaOps.Concelier.Models;
using StellaOps.Concelier.Merge.Services;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Observations;
using StellaOps.Concelier.Core.Raw;
using StellaOps.Concelier.WebService.Jobs;
using StellaOps.Concelier.WebService.Options;
using StellaOps.Concelier.WebService.Contracts;
@@ -36,6 +39,7 @@ using Xunit.Sdk;
using StellaOps.Auth.Abstractions;
using StellaOps.Auth.Client;
using Xunit;
using Xunit.Abstractions;
using Microsoft.IdentityModel.Protocols;
using Microsoft.IdentityModel.Protocols.OpenIdConnect;
using StellaOps.Concelier.WebService.Diagnostics;
@@ -50,9 +54,15 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
    private const string TestSigningSecret = "0123456789ABCDEF0123456789ABCDEF";
    private static readonly SymmetricSecurityKey TestSigningKey = new(Encoding.UTF8.GetBytes(TestSigningSecret));

    private readonly ITestOutputHelper _output;
    private MongoDbRunner _runner = null!;
    private ConcelierApplicationFactory _factory = null!;

    public WebServiceEndpointsTests(ITestOutputHelper output)
    {
        _output = output;
    }

    public Task InitializeAsync()
    {
        _runner = MongoDbRunner.Start(singleNodeReplSet: true);
@@ -200,17 +210,123 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
        Assert.True(response.StatusCode == HttpStatusCode.BadRequest, $"Expected 400 but got {(int)response.StatusCode}: {body}");
    }

    [Fact]
    public async Task AdvisoryChunksEndpoint_ReturnsParagraphAnchors()
    {
        var newestRaw = BsonDocument.Parse(
            """
            {
              "summary": {
                "intro": "This is a deterministic summary paragraph describing CVE-2025-0001 with remediation context for Advisory AI consumers."
              },
              "details": [
                "Long-form remediation guidance that exceeds the minimum length threshold and mentions affected packages.",
                {
                  "body": "Nested context that Advisory AI can cite when rendering downstream explanations."
                }
              ]
            }
            """);
        var olderRaw = BsonDocument.Parse(
            """
            {
              "summary": {
                "intro": "Older paragraph that should be visible when no section filter applies."
              }
            }
            """);

        var newerCreatedAt = new DateTime(2025, 1, 7, 0, 0, 0, DateTimeKind.Utc);
        var olderCreatedAt = new DateTime(2025, 1, 5, 0, 0, 0, DateTimeKind.Utc);
        var newerHash = ComputeContentHash(newestRaw);
        var olderHash = ComputeContentHash(olderRaw);

        var documents = new[]
        {
            CreateChunkObservationDocument(
                id: "tenant-a:chunk:newest",
                tenant: "tenant-a",
                createdAt: newerCreatedAt,
                alias: "cve-2025-0001",
                rawDocument: newestRaw),
            CreateChunkObservationDocument(
                id: "tenant-a:chunk:older",
                tenant: "tenant-a",
                createdAt: olderCreatedAt,
                alias: "cve-2025-0001",
                rawDocument: olderRaw)
        };

        await SeedObservationDocumentsAsync(documents);
        await SeedAdvisoryRawDocumentsAsync(
            CreateAdvisoryRawDocument("tenant-a", "nvd", "tenant-a:chunk:newest", newerHash, newestRaw.DeepClone().AsBsonDocument),
            CreateAdvisoryRawDocument("tenant-a", "nvd", "tenant-a:chunk:older", olderHash, olderRaw.DeepClone().AsBsonDocument));

        using var client = _factory.CreateClient();
        var response = await client.GetAsync("/advisories/cve-2025-0001/chunks?tenant=tenant-a&section=summary&format=csaf");
        response.EnsureSuccessStatusCode();

        var payload = await response.Content.ReadAsStringAsync();
        using var document = JsonDocument.Parse(payload);
        var root = document.RootElement;

        Assert.Equal("cve-2025-0001", root.GetProperty("advisoryKey").GetString());
        Assert.Equal(1, root.GetProperty("total").GetInt32());
        Assert.False(root.GetProperty("truncated").GetBoolean());

        var chunk = Assert.Single(root.GetProperty("chunks").EnumerateArray());
        Assert.Equal("summary", chunk.GetProperty("section").GetString());
        Assert.Equal("summary.intro", chunk.GetProperty("paragraphId").GetString());
        var text = chunk.GetProperty("text").GetString();
        Assert.False(string.IsNullOrWhiteSpace(text));
        Assert.Contains("deterministic summary paragraph", text, StringComparison.OrdinalIgnoreCase);

        var metadata = chunk.GetProperty("metadata");
        Assert.Equal("summary.intro", metadata.GetProperty("path").GetString());
        Assert.Equal("csaf", metadata.GetProperty("format").GetString());

        var sources = root.GetProperty("sources").EnumerateArray().ToArray();
        Assert.Equal(2, sources.Length);
        Assert.Equal("tenant-a:chunk:newest", sources[0].GetProperty("observationId").GetString());
        Assert.Equal("tenant-a:chunk:older", sources[1].GetProperty("observationId").GetString());
        Assert.All(
            sources,
            source => Assert.True(string.Equals("csaf", source.GetProperty("format").GetString(), StringComparison.OrdinalIgnoreCase)));
    }

    [Fact]
    public async Task AdvisoryChunksEndpoint_ReturnsNotFoundWhenAdvisoryMissing()
    {
        await SeedObservationDocumentsAsync(BuildSampleObservationDocuments());

        using var client = _factory.CreateClient();
        var response = await client.GetAsync("/advisories/cve-2099-9999/chunks?tenant=tenant-a");

        Assert.Equal(HttpStatusCode.NotFound, response.StatusCode);
        var payload = await response.Content.ReadAsStringAsync();
        using var document = JsonDocument.Parse(payload);
        var root = document.RootElement;
        Assert.Equal("https://stellaops.org/problems/not-found", root.GetProperty("type").GetString());
        Assert.Equal("Advisory not found", root.GetProperty("title").GetString());
        Assert.Contains("cve-2099-9999", root.GetProperty("detail").GetString(), StringComparison.OrdinalIgnoreCase);
    }

    [Fact]
    public async Task AdvisoryIngestEndpoint_PersistsDocumentAndSupportsReadback()
    {
        using var client = _factory.CreateClient();
        client.DefaultRequestHeaders.Add("X-Stella-Tenant", "tenant-ingest");

        const string upstreamId = "GHSA-INGEST-0001";
        var ingestRequest = BuildAdvisoryIngestRequest(
            contentHash: "sha256:abc123",
            upstreamId: "GHSA-INGEST-0001");
            contentHash: null,
            upstreamId: upstreamId);

        var ingestResponse = await client.PostAsJsonAsync("/ingest/advisory", ingestRequest);
        if (ingestResponse.StatusCode != HttpStatusCode.Created)
        {
            WriteProgramLogs();
        }
        Assert.Equal(HttpStatusCode.Created, ingestResponse.StatusCode);

        var ingestPayload = await ingestResponse.Content.ReadFromJsonAsync<AdvisoryIngestResponse>();
@@ -218,7 +334,7 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
        Assert.True(ingestPayload!.Inserted);
        Assert.False(string.IsNullOrWhiteSpace(ingestPayload.Id));
        Assert.Equal("tenant-ingest", ingestPayload.Tenant);
        Assert.Equal("sha256:abc123", ingestPayload.ContentHash);
        Assert.Equal(ComputeDeterministicContentHash(upstreamId), ingestPayload.ContentHash);
        Assert.NotNull(ingestResponse.Headers.Location);
        var locationValue = ingestResponse.Headers.Location!.ToString();
        Assert.False(string.IsNullOrWhiteSpace(locationValue));
@@ -230,8 +346,8 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
        Assert.Equal(ingestPayload.Id, decodedSegment);

        var duplicateResponse = await client.PostAsJsonAsync("/ingest/advisory", BuildAdvisoryIngestRequest(
            contentHash: "sha256:abc123",
            upstreamId: "GHSA-INGEST-0001"));
            contentHash: null,
            upstreamId: upstreamId));
        Assert.Equal(HttpStatusCode.OK, duplicateResponse.StatusCode);
        var duplicatePayload = await duplicateResponse.Content.ReadFromJsonAsync<AdvisoryIngestResponse>();
        Assert.NotNull(duplicatePayload);
@@ -247,7 +363,7 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
        Assert.NotNull(record);
        Assert.Equal(ingestPayload.Id, record!.Id);
        Assert.Equal("tenant-ingest", record.Tenant);
        Assert.Equal("sha256:abc123", record.Document.Upstream.ContentHash);
        Assert.Equal(ComputeDeterministicContentHash(upstreamId), record.Document.Upstream.ContentHash);
    }

    using (var listRequest = new HttpRequestMessage(HttpMethod.Get, "/advisories/raw?limit=10"))
@@ -451,6 +567,54 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
        Assert.Equal(HttpStatusCode.Forbidden, crossTenantResponse.StatusCode);
    }

    [Fact]
    public async Task AdvisoryIngestEndpoint_RejectsTenantOutsideAllowlist()
    {
        var environment = new Dictionary<string, string?>
        {
            ["CONCELIER_AUTHORITY__ENABLED"] = "true",
            ["CONCELIER_AUTHORITY__ALLOWANONYMOUSFALLBACK"] = "false",
            ["CONCELIER_AUTHORITY__ISSUER"] = TestAuthorityIssuer,
            ["CONCELIER_AUTHORITY__REQUIREHTTPSMETADATA"] = "false",
            ["CONCELIER_AUTHORITY__AUDIENCES__0"] = TestAuthorityAudience,
            ["CONCELIER_AUTHORITY__CLIENTID"] = "webservice-tests",
            ["CONCELIER_AUTHORITY__CLIENTSECRET"] = "unused",
            ["CONCELIER_AUTHORITY__REQUIREDTENANTS__0"] = "tenant-auth"
        };

        using var factory = new ConcelierApplicationFactory(
            _runner.ConnectionString,
            authority =>
            {
                authority.Enabled = true;
                authority.AllowAnonymousFallback = false;
                authority.Issuer = TestAuthorityIssuer;
                authority.RequireHttpsMetadata = false;
                authority.Audiences.Clear();
                authority.Audiences.Add(TestAuthorityAudience);
                authority.ClientId = "webservice-tests";
                authority.ClientSecret = "unused";
                authority.RequiredTenants.Clear();
                authority.RequiredTenants.Add("tenant-auth");
            },
            environment);

        using var client = factory.CreateClient();
        var allowedToken = CreateTestToken("tenant-auth", StellaOpsScopes.AdvisoryIngest);
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", allowedToken);
        client.DefaultRequestHeaders.Add("X-Stella-Tenant", "tenant-auth");

        var allowedResponse = await client.PostAsJsonAsync("/ingest/advisory", BuildAdvisoryIngestRequest("sha256:allow-1", "GHSA-ALLOW-001"));
        Assert.Equal(HttpStatusCode.Created, allowedResponse.StatusCode);

        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", CreateTestToken("tenant-blocked", StellaOpsScopes.AdvisoryIngest));
        client.DefaultRequestHeaders.Remove("X-Stella-Tenant");
        client.DefaultRequestHeaders.Add("X-Stella-Tenant", "tenant-blocked");

        var forbiddenResponse = await client.PostAsJsonAsync("/ingest/advisory", BuildAdvisoryIngestRequest("sha256:allow-2", "GHSA-ALLOW-002"));
        Assert.Equal(HttpStatusCode.Forbidden, forbiddenResponse.StatusCode);
    }

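The allowlist test exercises a simple gate: a caller's tenant must appear in `RequiredTenants`, otherwise the service answers 403. Reduced to its essentials, the check is just set membership with an "empty allowlist means unrestricted" escape hatch (Python sketch; names are illustrative, not the service's actual API):

```python
def tenant_allowed(required_tenants, tenant):
    """True when the tenant may ingest; an empty allowlist disables the restriction."""
    if not required_tenants:
        return True
    return tenant in required_tenants

# Mirrors the test above: tenant-auth is allowed, tenant-blocked is rejected.
assert tenant_allowed({"tenant-auth"}, "tenant-auth")
assert not tenant_allowed({"tenant-auth"}, "tenant-blocked")
```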
    [Fact]
    public async Task AdvisoryIngestEndpoint_ReturnsGuardViolationWhenContentHashMissing()
    {
@@ -1244,6 +1408,55 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
        };
    }

    private static AdvisoryObservationDocument CreateChunkObservationDocument(
        string id,
        string tenant,
        DateTime createdAt,
        string alias,
        BsonDocument rawDocument)
    {
        var document = CreateObservationDocument(
            id,
            tenant,
            createdAt,
            aliases: new[] { alias });
        var clone = rawDocument.DeepClone().AsBsonDocument;
        document.Content.Raw = clone;
        document.Upstream.ContentHash = ComputeContentHash(clone);
        return document;
    }

    private static readonly DateTimeOffset DefaultIngestTimestamp = new(2025, 1, 1, 0, 0, 0, TimeSpan.Zero);

    private static string ComputeContentHash(BsonDocument rawDocument)
    {
        using var sha256 = SHA256.Create();
        var canonical = rawDocument.ToJson(new JsonWriterSettings
        {
            OutputMode = JsonOutputMode.RelaxedExtendedJson
        });
        var bytes = sha256.ComputeHash(Encoding.UTF8.GetBytes(canonical));
        return $"sha256:{Convert.ToHexString(bytes).ToLowerInvariant()}";
    }

    private static string ComputeDeterministicContentHash(string upstreamId)
    {
        var raw = CreateJsonElement($@"{{""id"":""{upstreamId}"",""modified"":""{DefaultIngestTimestamp:O}""}}");
        return NormalizeContentHash(null, raw, enforceContentHash: true);
    }

    private static string NormalizeContentHash(string? value, JsonElement raw, bool enforceContentHash)
    {
        if (!enforceContentHash)
        {
            return value ?? string.Empty;
        }

        using var sha256 = SHA256.Create();
        var bytes = sha256.ComputeHash(Encoding.UTF8.GetBytes(raw.GetRawText()));
        return $"sha256:{Convert.ToHexString(bytes).ToLowerInvariant()}";
    }

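Both helpers follow the same convention: serialize the raw payload to a canonical JSON string, hash it with SHA-256, and prefix the lower-case hex digest with `sha256:`. That makes the hash reproducible from the payload alone, which is what lets the test predict the server-assigned hash. A Python sketch of the convention (function name illustrative):

```python
import hashlib

def content_hash(raw_json_text):
    """sha256:<lower-case hex digest> over the UTF-8 bytes of the canonical JSON text."""
    digest = hashlib.sha256(raw_json_text.encode("utf-8")).hexdigest()
    return f"sha256:{digest}"

print(content_hash('{"id":"GHSA-INGEST-0001","modified":"2025-01-01T00:00:00+00:00"}'))
```

The caveat the C# helpers handle and this sketch does not: the serialization must itself be canonical (stable key order, stable number/date formatting), otherwise two logically identical payloads hash differently.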
    private sealed record ReplayResponse(
        string VulnerabilityKey,
        DateTimeOffset? AsOf,
@@ -1690,8 +1903,18 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
        return $"advisory_raw:{vendorSegment}:{upstreamSegment}:{hashSegment}";
    }

    private static AdvisoryIngestRequest BuildAdvisoryIngestRequest(string contentHash, string upstreamId)
    private void WriteProgramLogs()
    {
        var entries = _factory.LoggerProvider.Snapshot("StellaOps.Concelier.WebService.Program");
        foreach (var entry in entries)
        {
            _output.WriteLine($"[PROGRAM LOG] {entry.Level}: {entry.Message}");
        }
    }

    private static AdvisoryIngestRequest BuildAdvisoryIngestRequest(string? contentHash, string upstreamId)
    {
        var normalizedContentHash = contentHash ?? ComputeDeterministicContentHash(upstreamId);
        var raw = CreateJsonElement($@"{{""id"":""{upstreamId}"",""modified"":""{DateTime.UtcNow:O}""}}");
        var references = new[]
        {
@@ -1704,7 +1927,7 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
            upstreamId,
            "2025-01-01T00:00:00Z",
            DateTimeOffset.UtcNow,
            contentHash,
            normalizedContentHash,
            new AdvisorySignatureRequest(false, null, null, null, null, null),
            new Dictionary<string, string> { ["http.method"] = "GET" }),
        new AdvisoryContentRequest("osv", "1.3.0", raw, null),

@@ -12,6 +12,7 @@ using StellaOps.Scanner.Sbomer.BuildXPlugin.Attestation;
using StellaOps.Scanner.Sbomer.BuildXPlugin.Cas;
using StellaOps.Scanner.Sbomer.BuildXPlugin.Descriptor;
using StellaOps.Scanner.Sbomer.BuildXPlugin.Manifest;
using StellaOps.Scanner.Sbomer.BuildXPlugin.Surface;

namespace StellaOps.Scanner.Sbomer.BuildXPlugin;

@@ -131,6 +132,12 @@ internal static class Program
        Console.WriteLine(" --attestor <url> (descriptor) Optional Attestor endpoint for provenance placeholders.");
        Console.WriteLine(" --attestor-token <token> Bearer token for Attestor requests (or STELLAOPS_ATTESTOR_TOKEN).");
        Console.WriteLine(" --attestor-insecure Skip TLS verification for Attestor requests (dev/test only).");
        Console.WriteLine(" --surface-layer-fragments <path> Persist layer fragments JSON into Surface.FS.");
        Console.WriteLine(" --surface-entrytrace-graph <path> Persist EntryTrace graph JSON into Surface.FS.");
        Console.WriteLine(" --surface-entrytrace-ndjson <path> Persist EntryTrace NDJSON into Surface.FS.");
        Console.WriteLine(" --surface-cache-root <path> Override Surface cache root (defaults to CAS root).");
        Console.WriteLine(" --surface-bucket <name> Bucket name used in Surface CAS URIs (default scanner-artifacts).");
        Console.WriteLine(" --surface-tenant <tenant> Tenant identifier recorded in the Surface manifest.");
        return 0;
    }

@@ -186,6 +193,11 @@ internal static class Program

    private static async Task<int> RunDescriptorAsync(string[] args, CancellationToken cancellationToken)
    {
        var manifestDirectory = ResolveManifestDirectory(args);
        var loader = new BuildxPluginManifestLoader(manifestDirectory);
        var manifest = await loader.LoadDefaultAsync(cancellationToken).ConfigureAwait(false);
        var casRoot = ResolveCasRoot(args, manifest);

        var imageDigest = RequireOption(args, "--image");
        var sbomPath = RequireOption(args, "--sbom");

@@ -244,11 +256,110 @@ internal static class Program
            await attestorClient.SendPlaceholderAsync(attestorUri, document, cancellationToken).ConfigureAwait(false);
        }

        await TryPublishSurfaceArtifactsAsync(args, request, casRoot, version, cancellationToken).ConfigureAwait(false);

        var json = JsonSerializer.Serialize(document, DescriptorJsonOptions);
        Console.WriteLine(json);
        return 0;
    }

    private static async Task TryPublishSurfaceArtifactsAsync(
        string[] args,
        DescriptorRequest descriptorRequest,
        string casRoot,
        string generatorVersion,
        CancellationToken cancellationToken)
    {
        var surfaceOptions = ResolveSurfaceOptions(args, descriptorRequest, casRoot, generatorVersion);
        if (surfaceOptions is null || !surfaceOptions.HasArtifacts)
        {
            return;
        }

        var writer = new SurfaceManifestWriter(TimeProvider.System);
        var result = await writer.WriteAsync(surfaceOptions, cancellationToken).ConfigureAwait(false);
        if (result is null)
        {
            return;
        }

        Console.Error.WriteLine($"surface manifest stored: {result.ManifestUri} ({result.Document.Artifacts.Count} artefacts)");
    }

    private static SurfaceOptions? ResolveSurfaceOptions(
        string[] args,
        DescriptorRequest descriptorRequest,
        string casRoot,
        string generatorVersion)
    {
        var layerFragmentsPath = GetOption(args, "--surface-layer-fragments")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_LAYER_FRAGMENTS");
        var entryTraceGraphPath = GetOption(args, "--surface-entrytrace-graph")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_ENTRYTRACE_GRAPH");
        var entryTraceNdjsonPath = GetOption(args, "--surface-entrytrace-ndjson")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_ENTRYTRACE_NDJSON");

        if (string.IsNullOrWhiteSpace(layerFragmentsPath) &&
            string.IsNullOrWhiteSpace(entryTraceGraphPath) &&
            string.IsNullOrWhiteSpace(entryTraceNdjsonPath))
        {
            return null;
        }

        var cacheRoot = GetOption(args, "--surface-cache-root")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_CACHE_ROOT")
            ?? casRoot;
        var bucket = GetOption(args, "--surface-bucket")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_BUCKET")
            ?? SurfaceCasLayout.DefaultBucket;
        var rootPrefix = GetOption(args, "--surface-root-prefix")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_ROOT_PREFIX")
            ?? SurfaceCasLayout.DefaultRootPrefix;
        var tenant = GetOption(args, "--surface-tenant")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_TENANT")
            ?? "default";
        var component = GetOption(args, "--surface-component")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_COMPONENT")
            ?? "scanner.buildx";
        var componentVersion = GetOption(args, "--surface-component-version")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_COMPONENT_VERSION")
            ?? generatorVersion;
        var workerInstance = GetOption(args, "--surface-worker-instance")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_WORKER_INSTANCE")
            ?? Environment.MachineName;
        var attemptValue = GetOption(args, "--surface-attempt")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_ATTEMPT");
        var attempt = 1;
        if (!string.IsNullOrWhiteSpace(attemptValue) && int.TryParse(attemptValue, out var parsedAttempt) && parsedAttempt > 0)
        {
            attempt = parsedAttempt;
        }

        var scanId = GetOption(args, "--surface-scan-id")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_SCAN_ID")
            ?? descriptorRequest.SbomName
            ?? descriptorRequest.ImageDigest;

        var manifestOutput = GetOption(args, "--surface-manifest-output")
            ?? Environment.GetEnvironmentVariable("STELLAOPS_SURFACE_MANIFEST_OUTPUT");

        return new SurfaceOptions(
            CacheRoot: cacheRoot,
            CacheBucket: bucket,
            RootPrefix: rootPrefix,
            Tenant: tenant,
            Component: component,
            ComponentVersion: componentVersion,
            WorkerInstance: workerInstance,
            Attempt: attempt,
            ImageDigest: descriptorRequest.ImageDigest,
            ScanId: scanId,
            LayerFragmentsPath: layerFragmentsPath,
            EntryTraceGraphPath: entryTraceGraphPath,
            EntryTraceNdjsonPath: entryTraceNdjsonPath,
            ManifestOutputPath: manifestOutput);
    }

    private static string? GetOption(string[] args, string optionName)
    {
        for (var i = 0; i < args.Length; i++)

||||
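The fallback chain used above (CLI flag, then environment variable, then built-in default) can be sketched in Python — `resolve` is a hypothetical helper written only to illustrate the pattern, not part of the plug-in:

```python
import os

def resolve(cli_value, env_name, default):
    # Resolution order mirrored from the C# above: an explicit CLI option
    # wins, then the environment variable, then the built-in default.
    if cli_value is not None and cli_value.strip():
        return cli_value
    env_value = os.environ.get(env_name)
    if env_value is not None and env_value.strip():
        return env_value
    return default
```

Options that resolve to empty or whitespace-only values fall through to the next source, matching the `string.IsNullOrWhiteSpace` checks in the C#.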
@@ -0,0 +1,3 @@
using System.Runtime.CompilerServices;

[assembly: InternalsVisibleTo("StellaOps.Scanner.Sbomer.BuildXPlugin.Tests")]
@@ -12,9 +12,13 @@
    <InformationalVersion>0.1.0-alpha</InformationalVersion>
  </PropertyGroup>

  <ItemGroup>
    <Content Include="stellaops.sbom-indexer.manifest.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </Content>
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\__Libraries\StellaOps.Scanner.Surface.FS\StellaOps.Scanner.Surface.FS.csproj" />
  </ItemGroup>
</Project>
@@ -0,0 +1,112 @@
using System;
using System.IO;
using System.Security.Cryptography;
using System.Threading;
using System.Threading.Tasks;

namespace StellaOps.Scanner.Sbomer.BuildXPlugin.Surface;

internal static class SurfaceCasLayout
{
    internal const string DefaultBucket = "scanner-artifacts";
    internal const string DefaultRootPrefix = "scanner";
    private const string Sha256 = "sha256";

    public static string NormalizeDigest(string? digest)
    {
        if (string.IsNullOrWhiteSpace(digest))
        {
            throw new BuildxPluginException("Surface artefact digest cannot be empty.");
        }

        var trimmed = digest.Trim();
        return trimmed.Contains(':', StringComparison.Ordinal)
            ? trimmed
            : $"{Sha256}:{trimmed}";
    }

    public static string ExtractDigestValue(string normalizedDigest)
    {
        var parts = normalizedDigest.Split(':', 2, StringSplitOptions.TrimEntries | StringSplitOptions.RemoveEmptyEntries);
        return parts.Length == 2 ? parts[1] : normalizedDigest;
    }

    public static string BuildObjectKey(string rootPrefix, SurfaceCasKind kind, string normalizedDigest)
    {
        var digestValue = ExtractDigestValue(normalizedDigest);
        var prefix = kind switch
        {
            SurfaceCasKind.LayerFragments => "surface/payloads/layer-fragments",
            SurfaceCasKind.EntryTraceGraph => "surface/payloads/entrytrace",
            SurfaceCasKind.EntryTraceNdjson => "surface/payloads/entrytrace",
            SurfaceCasKind.Manifest => "surface/manifests",
            _ => "surface/unknown"
        };

        var extension = kind switch
        {
            SurfaceCasKind.LayerFragments => "layer-fragments.json",
            SurfaceCasKind.EntryTraceGraph => "entrytrace.graph.json",
            SurfaceCasKind.EntryTraceNdjson => "entrytrace.ndjson",
            SurfaceCasKind.Manifest => "surface.manifest.json",
            _ => "artifact.bin"
        };

        var normalizedRoot = string.IsNullOrWhiteSpace(rootPrefix)
            ? string.Empty
            : rootPrefix.Trim().Trim('/');

        var relative = $"{prefix}/{digestValue}/{extension}";
        return string.IsNullOrWhiteSpace(normalizedRoot) ? relative : $"{normalizedRoot}/{relative}";
    }

    public static string BuildCasUri(string bucket, string objectKey)
    {
        var normalizedBucket = string.IsNullOrWhiteSpace(bucket) ? DefaultBucket : bucket.Trim();
        var normalizedKey = string.IsNullOrWhiteSpace(objectKey) ? string.Empty : objectKey.Trim().TrimStart('/');
        return $"cas://{normalizedBucket}/{normalizedKey}";
    }

    public static string ComputeDigest(ReadOnlySpan<byte> content)
    {
        Span<byte> hash = stackalloc byte[32];
        SHA256.HashData(content, hash);
        return $"{Sha256}:{Convert.ToHexString(hash).ToLowerInvariant()}";
    }

    public static async Task<string> WriteBytesAsync(string rootDirectory, string objectKey, byte[] bytes, CancellationToken cancellationToken)
    {
        if (string.IsNullOrWhiteSpace(rootDirectory))
        {
            throw new BuildxPluginException("Surface cache root must be provided.");
        }

        var normalizedRoot = Path.GetFullPath(rootDirectory);
        var relativePath = objectKey.Replace('/', Path.DirectorySeparatorChar);
        var fullPath = Path.Combine(normalizedRoot, relativePath);
        var directory = Path.GetDirectoryName(fullPath);
        if (!string.IsNullOrWhiteSpace(directory))
        {
            Directory.CreateDirectory(directory);
        }

        await using var stream = new FileStream(
            fullPath,
            FileMode.Create,
            FileAccess.Write,
            FileShare.Read,
            bufferSize: 64 * 1024,
            options: FileOptions.Asynchronous | FileOptions.SequentialScan);
        await stream.WriteAsync(bytes.AsMemory(0, bytes.Length), cancellationToken).ConfigureAwait(false);
        await stream.FlushAsync(cancellationToken).ConfigureAwait(false);
        return fullPath;
    }
}

internal enum SurfaceCasKind
{
    LayerFragments,
    EntryTraceGraph,
    EntryTraceNdjson,
    Manifest
}
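The CAS layout above normalizes digests, derives a kind-specific object key, and content-addresses payloads. A minimal Python sketch of that layout logic, for illustration only (the real implementation is the C# `SurfaceCasLayout` class):

```python
import hashlib

SHA256 = "sha256"

def normalize_digest(digest: str) -> str:
    # Bare hex digests are prefixed with "sha256:", mirroring NormalizeDigest.
    trimmed = digest.strip()
    if not trimmed:
        raise ValueError("digest cannot be empty")
    return trimmed if ":" in trimmed else f"{SHA256}:{trimmed}"

def build_object_key(root_prefix: str, prefix: str, extension: str, normalized_digest: str) -> str:
    # Layout: <root>/<kind prefix>/<digest hex>/<extension>, mirroring BuildObjectKey.
    digest_value = normalized_digest.split(":", 1)[-1]
    relative = f"{prefix}/{digest_value}/{extension}"
    root = root_prefix.strip().strip("/")
    return relative if not root else f"{root}/{relative}"

def compute_digest(content: bytes) -> str:
    # Lowercase hex SHA-256 with algorithm prefix, mirroring ComputeDigest.
    return f"{SHA256}:{hashlib.sha256(content).hexdigest()}"
```

For example, a manifest digested as `sha256:abc…` lands under `scanner/surface/manifests/abc…/surface.manifest.json` with the default root prefix.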
@@ -0,0 +1,227 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.IO;
using System.Linq;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Scanner.Surface.FS;

namespace StellaOps.Scanner.Sbomer.BuildXPlugin.Surface;

internal sealed class SurfaceManifestWriter
{
    private static readonly JsonSerializerOptions ManifestSerializerOptions = new(JsonSerializerDefaults.Web)
    {
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
        WriteIndented = false
    };

    private readonly TimeProvider _timeProvider;

    public SurfaceManifestWriter(TimeProvider timeProvider)
    {
        _timeProvider = timeProvider ?? throw new ArgumentNullException(nameof(timeProvider));
    }

    public async Task<SurfaceManifestWriteResult?> WriteAsync(SurfaceOptions options, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(options);

        if (!options.HasArtifacts)
        {
            return null;
        }

        var cacheRoot = EnsurePath(options.CacheRoot, "Surface cache root must be provided.");
        var bucket = string.IsNullOrWhiteSpace(options.CacheBucket)
            ? SurfaceCasLayout.DefaultBucket
            : options.CacheBucket.Trim();
        var rootPrefix = string.IsNullOrWhiteSpace(options.RootPrefix)
            ? SurfaceCasLayout.DefaultRootPrefix
            : options.RootPrefix.Trim();
        var tenant = string.IsNullOrWhiteSpace(options.Tenant)
            ? "default"
            : options.Tenant.Trim();
        var component = string.IsNullOrWhiteSpace(options.Component)
            ? "scanner.buildx"
            : options.Component.Trim();
        var componentVersion = string.IsNullOrWhiteSpace(options.ComponentVersion)
            ? null
            : options.ComponentVersion.Trim();
        var workerInstance = string.IsNullOrWhiteSpace(options.WorkerInstance)
            ? Environment.MachineName
            : options.WorkerInstance.Trim();
        var attempt = options.Attempt <= 0 ? 1 : options.Attempt;
        var scanId = string.IsNullOrWhiteSpace(options.ScanId)
            ? options.ImageDigest
            : options.ScanId!.Trim();

        Directory.CreateDirectory(cacheRoot);

        var artifacts = new List<SurfaceArtifactWriteResult>();

        if (!string.IsNullOrWhiteSpace(options.EntryTraceGraphPath))
        {
            var descriptor = new SurfaceArtifactDescriptor(
                Kind: "entrytrace.graph",
                Format: "entrytrace.graph",
                MediaType: "application/json",
                View: null,
                CasKind: SurfaceCasKind.EntryTraceGraph,
                FilePath: EnsurePath(options.EntryTraceGraphPath!, "EntryTrace graph path is required."));
            artifacts.Add(await PersistArtifactAsync(descriptor, cacheRoot, bucket, rootPrefix, cancellationToken).ConfigureAwait(false));
        }

        if (!string.IsNullOrWhiteSpace(options.EntryTraceNdjsonPath))
        {
            var descriptor = new SurfaceArtifactDescriptor(
                Kind: "entrytrace.ndjson",
                Format: "entrytrace.ndjson",
                MediaType: "application/x-ndjson",
                View: null,
                CasKind: SurfaceCasKind.EntryTraceNdjson,
                FilePath: EnsurePath(options.EntryTraceNdjsonPath!, "EntryTrace NDJSON path is required."));
            artifacts.Add(await PersistArtifactAsync(descriptor, cacheRoot, bucket, rootPrefix, cancellationToken).ConfigureAwait(false));
        }

        if (!string.IsNullOrWhiteSpace(options.LayerFragmentsPath))
        {
            var descriptor = new SurfaceArtifactDescriptor(
                Kind: "layer.fragments",
                Format: "layer.fragments",
                MediaType: "application/json",
                View: "inventory",
                CasKind: SurfaceCasKind.LayerFragments,
                FilePath: EnsurePath(options.LayerFragmentsPath!, "Layer fragments path is required."));
            artifacts.Add(await PersistArtifactAsync(descriptor, cacheRoot, bucket, rootPrefix, cancellationToken).ConfigureAwait(false));
        }

        if (artifacts.Count == 0)
        {
            return null;
        }

        var orderedArtifacts = artifacts
            .Select(a => a.ManifestArtifact)
            .OrderBy(a => a.Kind, StringComparer.Ordinal)
            .ThenBy(a => a.Format, StringComparer.Ordinal)
            .ToImmutableArray();

        var timestamp = _timeProvider.GetUtcNow();
        var manifestDocument = new SurfaceManifestDocument
        {
            Tenant = tenant,
            ImageDigest = SurfaceCasLayout.NormalizeDigest(options.ImageDigest),
            ScanId = scanId,
            GeneratedAt = timestamp,
            Source = new SurfaceManifestSource
            {
                Component = component,
                Version = componentVersion,
                WorkerInstance = workerInstance,
                Attempt = attempt
            },
            Artifacts = orderedArtifacts
        };

        var manifestBytes = JsonSerializer.SerializeToUtf8Bytes(manifestDocument, ManifestSerializerOptions);
        var manifestDigest = SurfaceCasLayout.ComputeDigest(manifestBytes);
        var manifestKey = SurfaceCasLayout.BuildObjectKey(rootPrefix, SurfaceCasKind.Manifest, manifestDigest);
        var manifestPath = await SurfaceCasLayout.WriteBytesAsync(cacheRoot, manifestKey, manifestBytes, cancellationToken).ConfigureAwait(false);
        var manifestUri = SurfaceCasLayout.BuildCasUri(bucket, manifestKey);

        if (!string.IsNullOrWhiteSpace(options.ManifestOutputPath))
        {
            var manifestOutputPath = Path.GetFullPath(options.ManifestOutputPath);
            var manifestOutputDirectory = Path.GetDirectoryName(manifestOutputPath);
            if (!string.IsNullOrWhiteSpace(manifestOutputDirectory))
            {
                Directory.CreateDirectory(manifestOutputDirectory);
            }

            await File.WriteAllBytesAsync(manifestOutputPath, manifestBytes, cancellationToken).ConfigureAwait(false);
        }

        return new SurfaceManifestWriteResult(
            manifestDigest,
            manifestUri,
            manifestPath,
            manifestDocument,
            artifacts);
    }

    private static async Task<SurfaceArtifactWriteResult> PersistArtifactAsync(
        SurfaceArtifactDescriptor descriptor,
        string cacheRoot,
        string bucket,
        string rootPrefix,
        CancellationToken cancellationToken)
    {
        cancellationToken.ThrowIfCancellationRequested();

        if (!File.Exists(descriptor.FilePath))
        {
            throw new BuildxPluginException($"Surface artefact file {descriptor.FilePath} was not found.");
        }

        var content = await File.ReadAllBytesAsync(descriptor.FilePath, cancellationToken).ConfigureAwait(false);
        var digest = SurfaceCasLayout.ComputeDigest(content);
        var objectKey = SurfaceCasLayout.BuildObjectKey(rootPrefix, descriptor.CasKind, digest);
        var filePath = await SurfaceCasLayout.WriteBytesAsync(cacheRoot, objectKey, content, cancellationToken).ConfigureAwait(false);
        var uri = SurfaceCasLayout.BuildCasUri(bucket, objectKey);

        var storage = new SurfaceManifestStorage
        {
            Bucket = bucket,
            ObjectKey = objectKey,
            SizeBytes = content.Length,
            ContentType = descriptor.MediaType
        };

        var artifact = new SurfaceManifestArtifact
        {
            Kind = descriptor.Kind,
            Uri = uri,
            Digest = digest,
            MediaType = descriptor.MediaType,
            Format = descriptor.Format,
            SizeBytes = content.Length,
            View = descriptor.View,
            Storage = storage
        };

        return new SurfaceArtifactWriteResult(objectKey, filePath, artifact);
    }

    private static string EnsurePath(string value, string message)
    {
        if (string.IsNullOrWhiteSpace(value))
        {
            throw new BuildxPluginException(message);
        }

        return Path.GetFullPath(value.Trim());
    }
}

internal sealed record SurfaceArtifactDescriptor(
    string Kind,
    string Format,
    string MediaType,
    string? View,
    SurfaceCasKind CasKind,
    string FilePath);

internal sealed record SurfaceArtifactWriteResult(
    string ObjectKey,
    string FilePath,
    SurfaceManifestArtifact ManifestArtifact);

internal sealed record SurfaceManifestWriteResult(
    string ManifestDigest,
    string ManifestUri,
    string ManifestPath,
    SurfaceManifestDocument Document,
    IReadOnlyList<SurfaceArtifactWriteResult> Artifacts);
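The writer serializes the manifest document, hashes the bytes, and stores the result under a key derived from that hash, so the manifest itself is content-addressed. A hypothetical Python sketch of that idea (the real writer serializes with System.Text.Json web defaults rather than sorted keys; sorted keys are used here only to make the illustration deterministic):

```python
import hashlib
import json

def manifest_digest(manifest: dict) -> str:
    # Serialize with a stable key order so equivalent manifests hash
    # identically, then content-address the payload with SHA-256.
    payload = json.dumps(manifest, sort_keys=True, separators=(",", ":")).encode("utf-8")
    return "sha256:" + hashlib.sha256(payload).hexdigest()
```

Because the CAS key embeds this digest, re-publishing an unchanged manifest writes to the same object key rather than creating a duplicate.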
@@ -0,0 +1,25 @@
using System;

namespace StellaOps.Scanner.Sbomer.BuildXPlugin.Surface;

internal sealed record SurfaceOptions(
    string CacheRoot,
    string CacheBucket,
    string RootPrefix,
    string Tenant,
    string Component,
    string ComponentVersion,
    string WorkerInstance,
    int Attempt,
    string ImageDigest,
    string? ScanId,
    string? LayerFragmentsPath,
    string? EntryTraceGraphPath,
    string? EntryTraceNdjsonPath,
    string? ManifestOutputPath)
{
    public bool HasArtifacts =>
        !string.IsNullOrWhiteSpace(LayerFragmentsPath) ||
        !string.IsNullOrWhiteSpace(EntryTraceGraphPath) ||
        !string.IsNullOrWhiteSpace(EntryTraceNdjsonPath);
}
@@ -2,6 +2,6 @@

| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
|----|--------|----------|------------|-------------|---------------|
| SCANNER-SURFACE-03 | DONE (2025-11-07) | BuildX Plugin Guild | SURFACE-FS-02 | Push layer manifests and entry fragments into Surface.FS during build-time SBOM generation.<br>2025-11-06: Kicked off manifest emitter wiring within BuildX export pipeline and outlined test fixtures targeting Surface.FS client mock.<br>2025-11-07: Resumed work; reviewing Surface.FS models, CAS integration, and test harness approach before coding.<br>2025-11-07 22:10Z: Implemented Surface manifest writer + CLI plumbing, wired CAS persistence, documented the workflow, and added BuildX plug-in tests + Grafana fixture updates. | BuildX integration tests confirm cache population; CLI docs updated. |
| SCANNER-ENV-03 | TODO | BuildX Plugin Guild | SURFACE-ENV-02 | Adopt Surface.Env helpers for plugin configuration (cache roots, CAS endpoints, feature toggles). | Plugin loads helper; misconfig errors logged; README updated. |
| SCANNER-SECRETS-03 | TODO | BuildX Plugin Guild, Security Guild | SURFACE-SECRETS-02 | Use Surface.Secrets to retrieve registry credentials when interacting with CAS/referrers. | Secrets retrieved via shared library; e2e tests cover rotation; operations guide refreshed. |

@@ -1,3 +1,4 @@
using System.Collections.Generic;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.Scanner.Surface.Env;
@@ -28,6 +29,13 @@ internal sealed class ScannerSurfaceSecretConfigurator : IConfigureOptions<Scann
        ArgumentNullException.ThrowIfNull(options);

        var tenant = _surfaceEnvironment.Settings.Secrets.Tenant;
        ApplyCasAccessSecret(options, tenant);
        ApplyRegistrySecret(options, tenant);
        ApplyAttestationSecret(options, tenant);
    }

    private void ApplyCasAccessSecret(ScannerWebServiceOptions options, string tenant)
    {
        var request = new SurfaceSecretRequest(
            Tenant: tenant,
            Component: ComponentName,
@@ -56,6 +64,120 @@ internal sealed class ScannerSurfaceSecretConfigurator : IConfigureOptions<Scann
        ApplySecret(options.ArtifactStore ??= new ScannerWebServiceOptions.ArtifactStoreOptions(), secret);
    }

    private void ApplyRegistrySecret(ScannerWebServiceOptions options, string tenant)
    {
        var request = new SurfaceSecretRequest(
            Tenant: tenant,
            Component: ComponentName,
            SecretType: "registry");

        RegistryAccessSecret? secret = null;
        try
        {
            using var handle = _secretProvider.GetAsync(request).AsTask().GetAwaiter().GetResult();
            secret = SurfaceSecretParser.ParseRegistryAccessSecret(handle);
        }
        catch (SurfaceSecretNotFoundException)
        {
            _logger.LogDebug("Surface secret 'registry' not found for {Component}; leaving registry credentials unchanged.", ComponentName);
        }
        catch (Exception ex)
        {
            _logger.LogWarning(ex, "Failed to resolve surface secret 'registry' for {Component}.", ComponentName);
        }

        if (secret is null)
        {
            return;
        }

        options.Registry ??= new ScannerWebServiceOptions.RegistryOptions();
        options.Registry.DefaultRegistry = secret.DefaultRegistry;
        options.Registry.Credentials = new List<ScannerWebServiceOptions.RegistryCredentialOptions>();

        foreach (var entry in secret.Entries)
        {
            var credential = new ScannerWebServiceOptions.RegistryCredentialOptions
            {
                Registry = entry.Registry,
                Username = entry.Username,
                Password = entry.Password,
                IdentityToken = entry.IdentityToken,
                RegistryToken = entry.RegistryToken,
                RefreshToken = entry.RefreshToken,
                ExpiresAt = entry.ExpiresAt,
                AllowInsecureTls = entry.AllowInsecureTls,
                Email = entry.Email
            };

            if (entry.Scopes.Count > 0)
            {
                credential.Scopes = new List<string>(entry.Scopes);
            }

            if (entry.Headers.Count > 0)
            {
                foreach (var (key, value) in entry.Headers)
                {
                    credential.Headers[key] = value;
                }
            }

            options.Registry.Credentials.Add(credential);
        }

        _logger.LogInformation(
            "Surface secret 'registry' applied for {Component} (default: {DefaultRegistry}, entries: {Count}).",
            ComponentName,
            options.Registry.DefaultRegistry ?? "(none)",
            options.Registry.Credentials.Count);
    }

    private void ApplyAttestationSecret(ScannerWebServiceOptions options, string tenant)
    {
        var request = new SurfaceSecretRequest(
            Tenant: tenant,
            Component: ComponentName,
            SecretType: "attestation");

        AttestationSecret? secret = null;
        try
        {
            using var handle = _secretProvider.GetAsync(request).AsTask().GetAwaiter().GetResult();
            secret = SurfaceSecretParser.ParseAttestationSecret(handle);
        }
        catch (SurfaceSecretNotFoundException)
        {
            _logger.LogDebug("Surface secret 'attestation' not found for {Component}; retaining signing configuration.", ComponentName);
        }
        catch (Exception ex)
        {
            _logger.LogWarning(ex, "Failed to resolve surface secret 'attestation' for {Component}.", ComponentName);
        }

        if (secret is null)
        {
            return;
        }

        options.Signing ??= new ScannerWebServiceOptions.SigningOptions();

        if (!string.IsNullOrWhiteSpace(secret.KeyPem))
        {
            options.Signing.KeyPem = secret.KeyPem;
        }

        if (!string.IsNullOrWhiteSpace(secret.CertificatePem))
        {
            options.Signing.CertificatePem = secret.CertificatePem;
        }

        if (!string.IsNullOrWhiteSpace(secret.CertificateChainPem))
        {
            options.Signing.CertificateChainPem = secret.CertificateChainPem;
        }
    }

    private void ApplySecret(ScannerWebServiceOptions.ArtifactStoreOptions artifactStore, CasAccessSecret secret)
    {
        if (!string.IsNullOrWhiteSpace(secret.Driver))

@@ -26,10 +26,15 @@ public sealed class ScannerWebServiceOptions
    /// </summary>
    public QueueOptions Queue { get; set; } = new();

    /// <summary>
    /// Object store configuration for SBOM artefacts.
    /// </summary>
    public ArtifactStoreOptions ArtifactStore { get; set; } = new();

    /// <summary>
    /// Registry credential configuration for report/export operations.
    /// </summary>
    public RegistryOptions Registry { get; set; } = new();

    /// <summary>
    /// Feature flags toggling optional behaviours.
@@ -144,11 +149,11 @@ public sealed class ScannerWebServiceOptions
        public IDictionary<string, string> Headers { get; set; } = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
    }

    public sealed class FeatureFlagOptions
    {
        public bool AllowAnonymousScanSubmission { get; set; }

        public bool EnableSignedReports { get; set; } = true;

        public bool EnablePolicyPreview { get; set; } = true;

@@ -233,11 +238,11 @@ public sealed class ScannerWebServiceOptions
        }
    }

    public sealed class SigningOptions
    {
        public bool Enabled { get; set; } = false;

        public string KeyId { get; set; } = string.Empty;

        public string Algorithm { get; set; } = "ed25519";

@@ -251,12 +256,44 @@ public sealed class ScannerWebServiceOptions

        public string? CertificatePemFile { get; set; }

        public string? CertificateChainPem { get; set; }

        public string? CertificateChainPemFile { get; set; }

        public int EnvelopeTtlSeconds { get; set; } = 600;
    }

    public sealed class RegistryOptions
    {
        public string? DefaultRegistry { get; set; }

        public IList<RegistryCredentialOptions> Credentials { get; set; } = new List<RegistryCredentialOptions>();
    }

    public sealed class RegistryCredentialOptions
    {
        public string Registry { get; set; } = string.Empty;

        public string? Username { get; set; }

        public string? Password { get; set; }

        public string? IdentityToken { get; set; }

        public string? RegistryToken { get; set; }

        public string? RefreshToken { get; set; }

        public DateTimeOffset? ExpiresAt { get; set; }

        public IList<string> Scopes { get; set; } = new List<string>();

        public bool? AllowInsecureTls { get; set; }

        public IDictionary<string, string> Headers { get; set; } = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);

        public string? Email { get; set; }
    }

    public sealed class ApiOptions
    {

@@ -179,6 +179,10 @@ internal sealed class SurfacePointerService : ISurfacePointerService
            ArtifactDocumentFormat.SpdxJson => "spdx-json",
            ArtifactDocumentFormat.BomIndex => "bom-index",
            ArtifactDocumentFormat.DsseJson => "dsse-json",
            ArtifactDocumentFormat.SurfaceManifestJson => "surface.manifest",
            ArtifactDocumentFormat.EntryTraceGraphJson => "entrytrace.graph",
            ArtifactDocumentFormat.EntryTraceNdjson => "entrytrace.ndjson",
            ArtifactDocumentFormat.ComponentFragmentJson => "layer.fragments",
            _ => format.ToString().ToLowerInvariant()
        };

@@ -199,6 +203,12 @@ internal sealed class SurfacePointerService : ISurfacePointerService
            ArtifactDocumentType.Diff => ("diff", null),
            ArtifactDocumentType.Attestation => ("attestation", null),
            ArtifactDocumentType.Index => ("bom-index", null),
            ArtifactDocumentType.SurfaceManifest => ("surface.manifest", null),
            ArtifactDocumentType.SurfaceEntryTrace when document.Format == ArtifactDocumentFormat.EntryTraceGraphJson
                => ("entrytrace.graph", null),
            ArtifactDocumentType.SurfaceEntryTrace when document.Format == ArtifactDocumentFormat.EntryTraceNdjson
                => ("entrytrace.ndjson", null),
            ArtifactDocumentType.SurfaceLayerFragment => ("layer.fragments", "inventory"),
            _ => (document.Type.ToString().ToLowerInvariant(), null)
        };
    }

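The switch expressions above map known artifact formats to canonical slugs and fall back to a lowercased enum name for anything else. The same lookup-with-fallback shape, sketched in Python for illustration (the dictionary keys are the C# enum member names, not a real API):

```python
FORMAT_SLUGS = {
    "SpdxJson": "spdx-json",
    "BomIndex": "bom-index",
    "DsseJson": "dsse-json",
    "SurfaceManifestJson": "surface.manifest",
    "EntryTraceGraphJson": "entrytrace.graph",
    "EntryTraceNdjson": "entrytrace.ndjson",
    "ComponentFragmentJson": "layer.fragments",
}

def format_slug(format_name: str) -> str:
    # Known formats map to canonical slugs; unknown ones fall back to
    # the lowercased name, matching the `_ =>` arm of the C# switch.
    return FORMAT_SLUGS.get(format_name, format_name.lower())
```

Keeping the fallback arm means new enum members still produce a usable slug before the mapping is extended.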
@@ -6,7 +6,7 @@
| SCANNER-SURFACE-02 | DONE (2025-11-05) | Scanner WebService Guild | SURFACE-FS-02 | Publish Surface.FS pointers (CAS URIs, manifests) via scan/report APIs and update attestation metadata.<br>2025-11-05: Surface pointers projected through scan/report endpoints, orchestrator samples + DSSE fixtures refreshed with manifest block, readiness tests updated to use validator stub. | OpenAPI updated; clients regenerated; integration tests validate pointer presence and tenancy. |
| SCANNER-ENV-02 | TODO (2025-11-06) | Scanner WebService Guild, Ops Guild | SURFACE-ENV-02 | Wire Surface.Env helpers into WebService hosting (cache roots, feature flags) and document configuration.<br>2025-11-02: Cache root resolution switched to helper; feature flag bindings updated; Helm/Compose updates pending review.<br>2025-11-05 14:55Z: Aligning readiness checks, docs, and Helm/Compose templates with Surface.Env outputs and planning test coverage for configuration fallbacks.<br>2025-11-06 17:05Z: Surface.Env documentation/README refreshed; warning catalogue captured for ops handoff.<br>2025-11-06 07:45Z: Helm values (dev/stage/prod/airgap/mirror) and Compose examples updated with `SCANNER_SURFACE_*` defaults plus rollout warning note in `deploy/README.md`.<br>2025-11-06 07:55Z: Paused; follow-up automation captured under `DEVOPS-OPENSSL-11-001/002` and pending Surface.Env readiness tests. | Service uses helper; env table documented; helm/compose templates updated. |
> 2025-11-05 19:18Z: Added configurator to project wiring and unit test ensuring Surface.Env cache root is honoured.
| SCANNER-SECRETS-02 | DONE (2025-11-06) | Scanner WebService Guild, Security Guild | SURFACE-SECRETS-02 | Replace ad-hoc secret wiring with Surface.Secrets for report/export operations (registry and CAS tokens).<br>2025-11-02: Export/report flows now depend on Surface.Secrets stub; integration tests in progress.<br>2025-11-06: Restarting work to eliminate file-based secrets, plumb provider handles through report/export services, and extend failure/rotation tests.<br>2025-11-06 21:40Z: Added configurator + storage post-config to hydrate artifact/CAS credentials from `cas-access` secrets with unit coverage.<br>2025-11-06 23:58Z: Registry & attestation secrets now resolved via Surface.Secrets (options + tests updated); dotnet test suites executed with .NET 10 RC2 runtime where available. | Secrets fetched through shared provider; unit/integration tests cover rotation + failure cases. |
| SCANNER-EVENTS-16-301 | BLOCKED (2025-10-26) | Scanner WebService Guild | ORCH-SVC-38-101, NOTIFY-SVC-38-001 | Emit orchestrator-compatible envelopes (`scanner.event.*`) and update integration tests to verify Notifier ingestion (no Redis queue coupling). | Tests assert envelope schema + orchestrator publish; Notifier consumer harness passes; docs updated with new event contract. Blocked by .NET 10 preview OpenAPI/Auth dependency drift preventing `dotnet test` completion. |
| SCANNER-EVENTS-16-302 | DONE (2025-11-06) | Scanner WebService Guild | SCANNER-EVENTS-16-301 | Extend orchestrator event links (report/policy/attestation) once endpoints are finalised across gateway + console.<br>2025-11-06 22:55Z: Dispatcher now honours configurable API/console base segments, JSON samples/docs refreshed, and `ReportEventDispatcherTests` extended. Tests: `StellaOps.Scanner.WebService.Tests` build until pre-existing `SurfaceCacheOptionsConfiguratorTests` ctor signature drift (tracked separately). | Links section covers UI/API targets; downstream consumers validated; docs/samples updated. |

@@ -1,7 +1,8 @@
using System;
using System.Collections.Generic;
using System.Diagnostics.Metrics;
using StellaOps.Scanner.Surface.Secrets;
using StellaOps.Scanner.Worker.Processing;

namespace StellaOps.Scanner.Worker.Diagnostics;

@@ -9,11 +10,18 @@ public sealed class ScannerWorkerMetrics
{
    private readonly Histogram<double> _queueLatencyMs;
    private readonly Histogram<double> _jobDurationMs;
    private readonly Histogram<double> _stageDurationMs;
    private readonly Counter<long> _jobsCompleted;
    private readonly Counter<long> _jobsFailed;
    private readonly Counter<long> _languageCacheHits;
    private readonly Counter<long> _languageCacheMisses;
    private readonly Counter<long> _registrySecretRequests;
    private readonly Histogram<double> _registrySecretTtlSeconds;
    private readonly Counter<long> _surfaceManifestsPublished;
    private readonly Counter<long> _surfaceManifestSkipped;
    private readonly Counter<long> _surfaceManifestFailures;
    private readonly Counter<long> _surfacePayloadPersisted;
    private readonly Histogram<double> _surfaceManifestPublishDurationMs;

    public ScannerWorkerMetrics()
    {
@@ -41,6 +49,29 @@ public sealed class ScannerWorkerMetrics
        _languageCacheMisses = ScannerWorkerInstrumentation.Meter.CreateCounter<long>(
            "scanner_worker_language_cache_misses_total",
            description: "Number of language analyzer cache misses encountered by the worker.");
        _registrySecretRequests = ScannerWorkerInstrumentation.Meter.CreateCounter<long>(
            "scanner_worker_registry_secret_requests_total",
            description: "Number of registry secret resolution attempts performed by the worker.");
        _registrySecretTtlSeconds = ScannerWorkerInstrumentation.Meter.CreateHistogram<double>(
            "scanner_worker_registry_secret_ttl_seconds",
            unit: "s",
            description: "Time-to-live in seconds for resolved registry secrets (earliest expiration).");
        _surfaceManifestsPublished = ScannerWorkerInstrumentation.Meter.CreateCounter<long>(
            "scanner_worker_surface_manifests_published_total",
            description: "Number of surface manifests successfully published by the worker.");
        _surfaceManifestSkipped = ScannerWorkerInstrumentation.Meter.CreateCounter<long>(
            "scanner_worker_surface_manifests_skipped_total",
            description: "Number of surface manifest publish attempts skipped due to missing payloads.");
        _surfaceManifestFailures = ScannerWorkerInstrumentation.Meter.CreateCounter<long>(
            "scanner_worker_surface_manifests_failed_total",
            description: "Number of surface manifest publish attempts that failed.");
        _surfacePayloadPersisted = ScannerWorkerInstrumentation.Meter.CreateCounter<long>(
"scanner_worker_surface_payload_persisted_total",
|
||||
description: "Number of surface payload artefacts persisted to the local cache.");
|
||||
_surfaceManifestPublishDurationMs = ScannerWorkerInstrumentation.Meter.CreateHistogram<double>(
|
||||
"scanner_worker_surface_manifest_publish_duration_ms",
|
||||
unit: "ms",
|
||||
description: "Duration in milliseconds to persist and publish surface manifests.");
|
||||
}
|
||||
|
||||
public void RecordQueueLatency(ScanJobContext context, TimeSpan latency)
|
||||
@@ -63,15 +94,15 @@ public sealed class ScannerWorkerMetrics
|
||||
_jobDurationMs.Record(duration.TotalMilliseconds, CreateTags(context));
|
||||
}
|
||||
|
||||
public void RecordStageDuration(ScanJobContext context, string stage, TimeSpan duration)
|
||||
{
|
||||
if (duration <= TimeSpan.Zero)
|
||||
{
|
||||
return;
|
||||
}
|
||||
|
||||
_stageDurationMs.Record(duration.TotalMilliseconds, CreateTags(context, stage: stage));
|
||||
}
|
||||
public void RecordStageDuration(ScanJobContext context, string stage, TimeSpan duration)
|
||||
{
|
||||
if (duration <= TimeSpan.Zero)
|
||||
{
|
||||
return;
|
||||
}
|
||||
|
||||
_stageDurationMs.Record(duration.TotalMilliseconds, CreateTags(context, stage: stage));
|
||||
}
|
||||
|
||||
public void IncrementJobCompleted(ScanJobContext context)
|
||||
{
|
||||
@@ -93,9 +124,130 @@ public sealed class ScannerWorkerMetrics
|
||||
_languageCacheMisses.Add(1, CreateTags(context, analyzerId: analyzerId));
|
||||
}
|
||||
|
||||
private static KeyValuePair<string, object?>[] CreateTags(ScanJobContext context, string? stage = null, string? failureReason = null, string? analyzerId = null)
|
||||
public void RecordRegistrySecretResolved(
|
||||
ScanJobContext context,
|
||||
string secretName,
|
||||
RegistryAccessSecret secret,
|
||||
TimeProvider timeProvider)
|
||||
{
|
||||
var tags = new List<KeyValuePair<string, object?>>(stage is null ? 5 : 6)
|
||||
var tags = CreateTags(
|
||||
context,
|
||||
secretName: secretName,
|
||||
secretResult: "resolved",
|
||||
secretEntryCount: secret.Entries.Count);
|
||||
|
||||
_registrySecretRequests.Add(1, tags);
|
||||
|
||||
if (ComputeTtlSeconds(secret, timeProvider) is double ttlSeconds)
|
||||
{
|
||||
_registrySecretTtlSeconds.Record(ttlSeconds, tags);
|
||||
}
|
||||
}
|
||||
|
||||
public void RecordRegistrySecretMissing(ScanJobContext context, string secretName)
|
||||
{
|
||||
var tags = CreateTags(context, secretName: secretName, secretResult: "missing");
|
||||
_registrySecretRequests.Add(1, tags);
|
||||
}
|
||||
|
||||
public void RecordRegistrySecretFailure(ScanJobContext context, string secretName)
|
||||
{
|
||||
var tags = CreateTags(context, secretName: secretName, secretResult: "failure");
|
||||
_registrySecretRequests.Add(1, tags);
|
||||
}
|
||||
|
||||
public void RecordSurfaceManifestPublished(ScanJobContext context, int payloadCount, TimeSpan duration)
|
||||
{
|
||||
if (payloadCount < 0)
|
||||
{
|
||||
payloadCount = 0;
|
||||
}
|
||||
|
||||
var tags = CreateTags(
|
||||
context,
|
||||
surfaceAction: "manifest",
|
||||
surfaceResult: "published",
|
||||
surfacePayloadCount: payloadCount);
|
||||
|
||||
_surfaceManifestsPublished.Add(1, tags);
|
||||
|
||||
if (duration > TimeSpan.Zero)
|
||||
{
|
||||
_surfaceManifestPublishDurationMs.Record(duration.TotalMilliseconds, tags);
|
||||
}
|
||||
}
|
||||
|
||||
public void RecordSurfaceManifestSkipped(ScanJobContext context)
|
||||
{
|
||||
var tags = CreateTags(context, surfaceAction: "manifest", surfaceResult: "skipped");
|
||||
_surfaceManifestSkipped.Add(1, tags);
|
||||
}
|
||||
|
||||
public void RecordSurfaceManifestFailed(ScanJobContext context, string failureReason)
|
||||
{
|
||||
var tags = CreateTags(
|
||||
context,
|
||||
surfaceAction: "manifest",
|
||||
surfaceResult: "failed",
|
||||
failureReason: failureReason);
|
||||
_surfaceManifestFailures.Add(1, tags);
|
||||
}
|
||||
|
||||
public void RecordSurfacePayloadPersisted(ScanJobContext context, string surfaceKind)
|
||||
{
|
||||
var normalizedKind = string.IsNullOrWhiteSpace(surfaceKind)
|
||||
? "unknown"
|
||||
: surfaceKind.Trim().ToLowerInvariant();
|
||||
|
||||
var tags = CreateTags(
|
||||
context,
|
||||
surfaceAction: "payload",
|
||||
surfaceKind: normalizedKind,
|
||||
surfaceResult: "cached");
|
||||
|
||||
_surfacePayloadPersisted.Add(1, tags);
|
||||
}
|
||||
|
||||
private static double? ComputeTtlSeconds(RegistryAccessSecret secret, TimeProvider timeProvider)
|
||||
{
|
||||
DateTimeOffset? earliest = null;
|
||||
foreach (var entry in secret.Entries)
|
||||
{
|
||||
if (entry.ExpiresAt is null)
|
||||
{
|
||||
continue;
|
||||
}
|
||||
|
||||
if (earliest is null || entry.ExpiresAt < earliest)
|
||||
{
|
||||
earliest = entry.ExpiresAt;
|
||||
}
|
||||
}
|
||||
|
||||
if (earliest is null)
|
||||
{
|
||||
return null;
|
||||
}
|
||||
|
||||
var now = timeProvider.GetUtcNow();
|
||||
var ttl = (earliest.Value - now).TotalSeconds;
|
||||
return ttl < 0 ? 0 : ttl;
|
||||
}
|
||||
|
||||
private static KeyValuePair<string, object?>[] CreateTags(
|
||||
ScanJobContext context,
|
||||
string? stage = null,
|
||||
string? failureReason = null,
|
||||
string? analyzerId = null,
|
||||
string? secretName = null,
|
||||
string? secretResult = null,
|
||||
int? secretEntryCount = null,
|
||||
string? surfaceAction = null,
|
||||
string? surfaceKind = null,
|
||||
string? surfaceResult = null,
|
||||
int? surfacePayloadCount = null)
|
||||
{
|
||||
var tags = new List<KeyValuePair<string, object?>>(8)
|
||||
{
|
||||
new("job.id", context.JobId),
|
||||
new("scan.id", context.ScanId),
|
||||
@@ -113,10 +265,10 @@ public sealed class ScannerWorkerMetrics
|
||||
}
|
||||
|
||||
if (!string.IsNullOrWhiteSpace(stage))
|
||||
{
|
||||
tags.Add(new KeyValuePair<string, object?>("stage", stage));
|
||||
}
|
||||
|
||||
{
|
||||
tags.Add(new KeyValuePair<string, object?>("stage", stage));
|
||||
}
|
||||
|
||||
if (!string.IsNullOrWhiteSpace(failureReason))
|
||||
{
|
||||
tags.Add(new KeyValuePair<string, object?>("reason", failureReason));
|
||||
@@ -127,6 +279,41 @@ public sealed class ScannerWorkerMetrics
|
||||
tags.Add(new KeyValuePair<string, object?>("analyzer.id", analyzerId));
|
||||
}
|
||||
|
||||
if (!string.IsNullOrWhiteSpace(secretName))
|
||||
{
|
||||
tags.Add(new KeyValuePair<string, object?>("secret.name", secretName));
|
||||
}
|
||||
|
||||
if (!string.IsNullOrWhiteSpace(secretResult))
|
||||
{
|
||||
tags.Add(new KeyValuePair<string, object?>("secret.result", secretResult));
|
||||
}
|
||||
|
||||
if (secretEntryCount is not null)
|
||||
{
|
||||
tags.Add(new KeyValuePair<string, object?>("secret.entries", secretEntryCount.Value));
|
||||
}
|
||||
|
||||
if (!string.IsNullOrWhiteSpace(surfaceAction))
|
||||
{
|
||||
tags.Add(new KeyValuePair<string, object?>("surface.action", surfaceAction));
|
||||
}
|
||||
|
||||
if (!string.IsNullOrWhiteSpace(surfaceKind))
|
||||
{
|
||||
tags.Add(new KeyValuePair<string, object?>("surface.kind", surfaceKind));
|
||||
}
|
||||
|
||||
if (!string.IsNullOrWhiteSpace(surfaceResult))
|
||||
{
|
||||
tags.Add(new KeyValuePair<string, object?>("surface.result", surfaceResult));
|
||||
}
|
||||
|
||||
if (surfacePayloadCount is not null)
|
||||
{
|
||||
tags.Add(new KeyValuePair<string, object?>("surface.payload_count", surfacePayloadCount.Value));
|
||||
}
|
||||
|
||||
return tags.ToArray();
|
||||
}
|
||||
}
|
||||
|
||||
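The `ComputeTtlSeconds` helper in the diff above reports the TTL of the *earliest-expiring* credential entry, returns `null` when no entry carries an expiration, and clamps negative values (already-expired secrets) to zero. A minimal language-neutral sketch of that logic, written in Python for illustration — the function name and signature here are ours, not part of the codebase:

```python
from datetime import datetime, timedelta, timezone

def compute_ttl_seconds(expirations, now):
    """Earliest-expiry TTL in seconds, clamped at zero; None if nothing expires."""
    dated = [e for e in expirations if e is not None]
    if not dated:
        return None  # no entry carries an expiration, so no TTL to report
    earliest = min(dated)
    ttl = (earliest - now).total_seconds()
    return 0 if ttl < 0 else ttl

now = datetime(2025, 11, 6, tzinfo=timezone.utc)
compute_ttl_seconds([None, now + timedelta(seconds=90)], now)  # 90.0
compute_ttl_seconds([now - timedelta(seconds=5)], now)         # clamped to 0
```

Clamping at zero keeps the histogram free of negative observations while still counting requests whose secret had already expired.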
@@ -0,0 +1,108 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using StellaOps.Scanner.Core.Contracts;
using StellaOps.Scanner.Surface.Env;
using StellaOps.Scanner.Surface.Secrets;
using StellaOps.Scanner.Worker.Diagnostics;

namespace StellaOps.Scanner.Worker.Processing;

internal sealed class RegistrySecretStageExecutor : IScanStageExecutor
{
    private const string ComponentName = "Scanner.Worker.Registry";
    private const string SecretType = "registry";

    private static readonly string[] SecretNameMetadataKeys =
    {
        "surface.registry.secret",
        "scanner.registry.secret",
        "registry.secret",
    };

    private readonly ISurfaceSecretProvider _secretProvider;
    private readonly ISurfaceEnvironment _surfaceEnvironment;
    private readonly ScannerWorkerMetrics _metrics;
    private readonly TimeProvider _timeProvider;
    private readonly ILogger<RegistrySecretStageExecutor> _logger;

    public RegistrySecretStageExecutor(
        ISurfaceSecretProvider secretProvider,
        ISurfaceEnvironment surfaceEnvironment,
        ScannerWorkerMetrics metrics,
        TimeProvider timeProvider,
        ILogger<RegistrySecretStageExecutor> logger)
    {
        _secretProvider = secretProvider ?? throw new ArgumentNullException(nameof(secretProvider));
        _surfaceEnvironment = surfaceEnvironment ?? throw new ArgumentNullException(nameof(surfaceEnvironment));
        _metrics = metrics ?? throw new ArgumentNullException(nameof(metrics));
        _timeProvider = timeProvider ?? throw new ArgumentNullException(nameof(timeProvider));
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));
    }

    public string StageName => ScanStageNames.ResolveImage;

    public async ValueTask ExecuteAsync(ScanJobContext context, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(context);

        var secretName = ResolveSecretName(context.Lease.Metadata);
        var request = new SurfaceSecretRequest(
            Tenant: _surfaceEnvironment.Settings.Secrets.Tenant,
            Component: ComponentName,
            SecretType: SecretType,
            Name: secretName);

        try
        {
            using var handle = await _secretProvider.GetAsync(request, cancellationToken).ConfigureAwait(false);
            var secret = SurfaceSecretParser.ParseRegistryAccessSecret(handle);

            context.Analysis.Set(ScanAnalysisKeys.RegistryCredentials, secret);

            _metrics.RecordRegistrySecretResolved(
                context,
                secretName ?? "default",
                secret,
                _timeProvider);

            _logger.LogInformation(
                "Registry secret '{SecretName}' resolved with {EntryCount} entries for job {JobId}.",
                secretName ?? "default",
                secret.Entries.Count,
                context.JobId);
        }
        catch (SurfaceSecretNotFoundException)
        {
            _metrics.RecordRegistrySecretMissing(context, secretName ?? "default");
            _logger.LogDebug(
                "Registry secret '{SecretName}' not found for job {JobId}; continuing without registry credentials.",
                secretName ?? "default",
                context.JobId);
        }
        catch (Exception ex)
        {
            _metrics.RecordRegistrySecretFailure(context, secretName ?? "default");
            _logger.LogWarning(
                ex,
                "Failed to resolve registry secret '{SecretName}' for job {JobId}; continuing without registry credentials.",
                secretName ?? "default",
                context.JobId);
        }
    }

    private static string? ResolveSecretName(IReadOnlyDictionary<string, string> metadata)
    {
        foreach (var key in SecretNameMetadataKeys)
        {
            if (metadata.TryGetValue(key, out var value) && !string.IsNullOrWhiteSpace(value))
            {
                return value.Trim();
            }
        }

        return null;
    }
}
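`ResolveSecretName` above walks `SecretNameMetadataKeys` in declared priority order and returns the first non-blank, trimmed value. A small Python sketch of the same lookup (illustrative only; the key names are copied from the executor, the function name is ours):

```python
# Metadata keys checked in priority order, mirroring SecretNameMetadataKeys.
SECRET_NAME_KEYS = ("surface.registry.secret", "scanner.registry.secret", "registry.secret")

def resolve_secret_name(metadata):
    # First key with a non-blank value wins; surrounding whitespace is trimmed.
    for key in SECRET_NAME_KEYS:
        value = metadata.get(key)
        if value is not None and value.strip():
            return value.strip()
    return None
```

Because blank values are skipped rather than returned, a lease that sets `registry.secret` to whitespace still falls through to the `"default"` secret name used by the executor's logging and metrics.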
@@ -4,6 +4,7 @@ using System.IO;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.Scanner.Core.Contracts;
@@ -37,7 +38,12 @@ internal sealed record SurfaceManifestRequest(
    string? Version,
    string? WorkerInstance);

internal sealed class SurfaceManifestPublisher
internal interface ISurfaceManifestPublisher
{
    Task<SurfaceManifestPublishResult> PublishAsync(SurfaceManifestRequest request, CancellationToken cancellationToken);
}

internal sealed class SurfaceManifestPublisher : ISurfaceManifestPublisher
{
    private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web)
    {

@@ -1,14 +1,20 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Diagnostics;
using System.Globalization;
using System.Reflection;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using Microsoft.Extensions.Logging;
using StellaOps.Scanner.Core.Contracts;
using StellaOps.Scanner.EntryTrace;
using StellaOps.Scanner.EntryTrace.Serialization;
using StellaOps.Scanner.Surface.Env;
using StellaOps.Scanner.Surface.FS;
using StellaOps.Scanner.Storage.Catalog;
using StellaOps.Scanner.Worker.Diagnostics;

namespace StellaOps.Scanner.Worker.Processing.Surface;

@@ -20,15 +26,30 @@ internal sealed class SurfaceManifestStageExecutor : IScanStageExecutor
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
    };

    private readonly SurfaceManifestPublisher _publisher;
    private static readonly JsonSerializerOptions ManifestSerializerOptions = new(JsonSerializerDefaults.Web)
    {
        WriteIndented = false,
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
    };

    private readonly ISurfaceManifestPublisher _publisher;
    private readonly ISurfaceCache _surfaceCache;
    private readonly ISurfaceEnvironment _surfaceEnvironment;
    private readonly ScannerWorkerMetrics _metrics;
    private readonly ILogger<SurfaceManifestStageExecutor> _logger;
    private readonly string _componentVersion;

    public SurfaceManifestStageExecutor(
        SurfaceManifestPublisher publisher,
        ISurfaceManifestPublisher publisher,
        ISurfaceCache surfaceCache,
        ISurfaceEnvironment surfaceEnvironment,
        ScannerWorkerMetrics metrics,
        ILogger<SurfaceManifestStageExecutor> logger)
    {
        _publisher = publisher ?? throw new ArgumentNullException(nameof(publisher));
        _surfaceCache = surfaceCache ?? throw new ArgumentNullException(nameof(surfaceCache));
        _surfaceEnvironment = surfaceEnvironment ?? throw new ArgumentNullException(nameof(surfaceEnvironment));
        _metrics = metrics ?? throw new ArgumentNullException(nameof(metrics));
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));
        _componentVersion = Assembly.GetExecutingAssembly().GetName().Version?.ToString() ?? "unknown";
    }
@@ -42,23 +63,49 @@ internal sealed class SurfaceManifestStageExecutor : IScanStageExecutor
        var payloads = CollectPayloads(context);
        if (payloads.Count == 0)
        {
            _metrics.RecordSurfaceManifestSkipped(context);
            _logger.LogDebug("No surface payloads available for job {JobId}; skipping manifest publish.", context.JobId);
            return;
        }

        var request = new SurfaceManifestRequest(
            ScanId: context.ScanId,
            ImageDigest: ResolveImageDigest(context),
            Attempt: context.Lease.Attempt,
            Metadata: context.Lease.Metadata,
            Payloads: payloads,
            Component: "scanner.worker",
            Version: _componentVersion,
            WorkerInstance: Environment.MachineName);
        var tenant = _surfaceEnvironment.Settings?.Tenant ?? string.Empty;
        var stopwatch = Stopwatch.StartNew();

        var result = await _publisher.PublishAsync(request, cancellationToken).ConfigureAwait(false);
        context.Analysis.Set(ScanAnalysisKeys.SurfaceManifest, result);
        _logger.LogInformation("Surface manifest stored for job {JobId} with digest {Digest}.", context.JobId, result.ManifestDigest);
        try
        {
            await PersistPayloadsToSurfaceCacheAsync(context, tenant, payloads, cancellationToken).ConfigureAwait(false);

            var request = new SurfaceManifestRequest(
                ScanId: context.ScanId,
                ImageDigest: ResolveImageDigest(context),
                Attempt: context.Lease.Attempt,
                Metadata: context.Lease.Metadata,
                Payloads: payloads,
                Component: "scanner.worker",
                Version: _componentVersion,
                WorkerInstance: Environment.MachineName);

            var result = await _publisher.PublishAsync(request, cancellationToken).ConfigureAwait(false);

            await PersistManifestToSurfaceCacheAsync(context, tenant, result, cancellationToken).ConfigureAwait(false);

            context.Analysis.Set(ScanAnalysisKeys.SurfaceManifest, result);
            stopwatch.Stop();
            _metrics.RecordSurfaceManifestPublished(context, payloads.Count, stopwatch.Elapsed);
            _logger.LogInformation("Surface manifest stored for job {JobId} with digest {Digest}.", context.JobId, result.ManifestDigest);
        }
        catch (OperationCanceledException)
        {
            stopwatch.Stop();
            throw;
        }
        catch (Exception ex)
        {
            stopwatch.Stop();
            _metrics.RecordSurfaceManifestFailed(context, ex.GetType().Name);
            _logger.LogError(ex, "Failed to persist surface manifest for job {JobId}.", context.JobId);
            throw;
        }
    }

    private List<SurfaceManifestPayload> CollectPayloads(ScanJobContext context)
@@ -118,6 +165,56 @@ internal sealed class SurfaceManifestStageExecutor : IScanStageExecutor
        return payloads;
    }

    private async Task PersistPayloadsToSurfaceCacheAsync(
        ScanJobContext context,
        string tenant,
        IReadOnlyList<SurfaceManifestPayload> payloads,
        CancellationToken cancellationToken)
    {
        if (payloads.Count == 0)
        {
            return;
        }

        foreach (var payload in payloads)
        {
            cancellationToken.ThrowIfCancellationRequested();

            var digest = ComputeDigest(payload.Content.Span);
            var normalizedKind = NormalizeKind(payload.Kind);
            var cacheKey = CreateArtifactCacheKey(tenant, normalizedKind, digest);

            await _surfaceCache.SetAsync(cacheKey, payload.Content, cancellationToken).ConfigureAwait(false);

            _logger.LogDebug(
                "Cached surface payload {Kind} for job {JobId} with digest {Digest}.",
                normalizedKind,
                context.JobId,
                digest);

            _metrics.RecordSurfacePayloadPersisted(context, normalizedKind);
        }
    }

    private async Task PersistManifestToSurfaceCacheAsync(
        ScanJobContext context,
        string tenant,
        SurfaceManifestPublishResult result,
        CancellationToken cancellationToken)
    {
        cancellationToken.ThrowIfCancellationRequested();

        var manifestBytes = JsonSerializer.SerializeToUtf8Bytes(result.Document, ManifestSerializerOptions);
        var cacheKey = CreateManifestCacheKey(tenant, result.ManifestDigest);

        await _surfaceCache.SetAsync(cacheKey, manifestBytes, cancellationToken).ConfigureAwait(false);

        _logger.LogDebug(
            "Cached surface manifest for job {JobId} with digest {Digest}.",
            context.JobId,
            result.ManifestDigest);
    }

    private static string ResolveImageDigest(ScanJobContext context)
    {
        static bool TryGet(IReadOnlyDictionary<string, string> metadata, string key, out string value)
@@ -143,5 +240,46 @@ internal sealed class SurfaceManifestStageExecutor : IScanStageExecutor
        return context.ScanId;
    }

    private static SurfaceCacheKey CreateArtifactCacheKey(string tenant, string kind, string digest)
    {
        var @namespace = $"surface.artifacts.{kind}";
        var contentKey = NormalizeDigestForKey(digest);
        return new SurfaceCacheKey(@namespace, tenant, contentKey);
    }

    private static SurfaceCacheKey CreateManifestCacheKey(string tenant, string digest)
    {
        var contentKey = NormalizeDigestForKey(digest);
        return new SurfaceCacheKey("surface.manifests", tenant, contentKey);
    }

    private static string NormalizeKind(string? value)
    {
        if (string.IsNullOrWhiteSpace(value))
        {
            return "unknown";
        }

        var trimmed = value.Trim();
        return trimmed.ToLowerInvariant();
    }

    private static string NormalizeDigestForKey(string? digest)
    {
        if (string.IsNullOrWhiteSpace(digest))
        {
            return string.Empty;
        }

        return digest.Trim();
    }

    private static string ComputeDigest(ReadOnlySpan<byte> content)
    {
        Span<byte> hash = stackalloc byte[32];
        SHA256.HashData(content, hash);
        return $"sha256:{Convert.ToHexString(hash).ToLowerInvariant()}";
    }

    private static readonly IFormatProvider CultureInfoInvariant = System.Globalization.CultureInfo.InvariantCulture;
}
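The cache-key helpers in the stage executor above combine three pieces of normalization: a `sha256:<lowercase-hex>` content digest, a trimmed/lowercased payload kind with an `"unknown"` fallback, and a trimmed digest used as the content key. A Python sketch of the same derivation (names are ours; the tuple stands in for `SurfaceCacheKey`):

```python
import hashlib

def compute_digest(content: bytes) -> str:
    # "sha256:" prefix plus lowercase hex, as in ComputeDigest above.
    return "sha256:" + hashlib.sha256(content).hexdigest()

def artifact_cache_key(tenant, kind, digest):
    # Kind is trimmed and lowercased with an "unknown" fallback; digest is trimmed.
    normalized_kind = (kind or "").strip().lower() or "unknown"
    namespace = f"surface.artifacts.{normalized_kind}"
    return (namespace, tenant, (digest or "").strip())
```

Content-addressing the cache this way means re-publishing an identical payload overwrites the same entry instead of accumulating duplicates per attempt.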
@@ -55,7 +55,7 @@ if (!string.IsNullOrWhiteSpace(connectionString))
{
    builder.Services.AddScannerStorage(storageSection);
    builder.Services.AddSingleton<IConfigureOptions<ScannerStorageOptions>, ScannerStorageSurfaceSecretConfigurator>();
    builder.Services.AddSingleton<SurfaceManifestPublisher>();
    builder.Services.AddSingleton<ISurfaceManifestPublisher, SurfaceManifestPublisher>();
    builder.Services.AddSingleton<IScanStageExecutor, SurfaceManifestStageExecutor>();
}

@@ -64,6 +64,7 @@ builder.Services.TryAddSingleton<IPluginCatalogGuard, RestartOnlyPluginGuard>();
builder.Services.AddSingleton<IOSAnalyzerPluginCatalog, OsAnalyzerPluginCatalog>();
builder.Services.AddSingleton<ILanguageAnalyzerPluginCatalog, LanguageAnalyzerPluginCatalog>();
builder.Services.AddSingleton<IScanAnalyzerDispatcher, CompositeScanAnalyzerDispatcher>();
builder.Services.AddSingleton<IScanStageExecutor, RegistrySecretStageExecutor>();
builder.Services.AddSingleton<IScanStageExecutor, AnalyzerStageExecutor>();

builder.Services.AddSingleton<ScannerWorkerHostedService>();

@@ -3,7 +3,10 @@
| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
|----|--------|----------|------------|-------------|---------------|
| SCAN-REPLAY-186-002 | TODO | Scanner Worker Guild | REPLAY-CORE-185-001 | Enforce deterministic analyzer execution when consuming replay input bundles, emit layer Merkle metadata, and author `docs/modules/scanner/deterministic-execution.md` summarising invariants from `docs/replay/DETERMINISTIC_REPLAY.md` Section 4. | Replay mode analyzers pass determinism tests; new doc merged; integration fixtures updated. |
| SCANNER-SURFACE-01 | DOING (2025-11-06) | Scanner Worker Guild | SURFACE-FS-02 | Persist Surface.FS manifests after analyzer stages, including layer CAS metadata and EntryTrace fragments.<br>2025-11-02: Draft Surface.FS manifests emitted for sample scans; telemetry counters under review.<br>2025-11-06: Resuming with manifest writer abstraction, rotation metadata, and telemetry counters for Surface.FS persistence. | Integration tests prove cache entries exist; telemetry counters exported. |
| SCANNER-SURFACE-01 | DONE (2025-11-06) | Scanner Worker Guild | SURFACE-FS-02 | Persist Surface.FS manifests after analyzer stages, including layer CAS metadata and EntryTrace fragments.<br>2025-11-02: Draft Surface.FS manifests emitted for sample scans; telemetry counters under review.<br>2025-11-06: Resuming with manifest writer abstraction, rotation metadata, and telemetry counters for Surface.FS persistence.<br>2025-11-06 21:05Z: Stage now persists manifest/payload caches, exports metrics to Prometheus/Grafana, and WebService pointer tests validate consumption. | Integration tests prove cache entries exist; telemetry counters exported. |
> 2025-11-05 19:18Z: Bound root directory to resolved Surface.Env settings and added unit coverage around the configurator.
> 2025-11-06 18:45Z: Resuming manifest persistence—planning publisher abstraction refactor, CAS storage wiring, and telemetry/test coverage.
> 2025-11-06 20:20Z: Hooked Surface metrics into Grafana (new dashboard JSON) and verified WebService consumption via end-to-end pointer test seeding manifest + payload entries.
> 2025-11-06 21:05Z: Completed Surface manifest cache + metrics work; tests/docs updated and task ready to close.
| SCANNER-ENV-01 | TODO (2025-11-06) | Scanner Worker Guild | SURFACE-ENV-02 | Replace ad-hoc environment reads with `StellaOps.Scanner.Surface.Env` helpers for cache roots and CAS endpoints.<br>2025-11-02: Worker bootstrap now resolves cache roots via helper; warning path documented; smoke tests running.<br>2025-11-05 14:55Z: Extending helper usage into cache/secrets configuration, updating worker validator wiring, and drafting docs/tests for new Surface.Env outputs.<br>2025-11-06 17:05Z: README/design docs updated with warning catalogue; startup logging guidance captured for ops runbooks.<br>2025-11-06 07:45Z: Helm/Compose env profiles (dev/stage/prod/airgap/mirror) now seed `SCANNER_SURFACE_*` defaults to keep worker cache roots aligned with Surface.Env helpers.<br>2025-11-06 07:55Z: Paused; pending automation tracked via `DEVOPS-OPENSSL-11-001/002` and Surface.Env test fixtures. | Worker boots with helper; misconfiguration warnings documented; smoke tests updated. |
> 2025-11-05 19:18Z: Bound `SurfaceCacheOptions` root directory to resolved Surface.Env settings and added unit coverage around the configurator.
| SCANNER-SECRETS-01 | DOING (2025-11-06) | Scanner Worker Guild, Security Guild | SURFACE-SECRETS-02 | Adopt `StellaOps.Scanner.Surface.Secrets` for registry/CAS credentials during scan execution.<br>2025-11-02: Surface.Secrets provider wired for CAS token retrieval; integration tests added.<br>2025-11-06: Continuing to replace legacy registry credential plumbing and extend rotation metrics/fixtures.<br>2025-11-06 21:35Z: Introduced `ScannerStorageSurfaceSecretConfigurator` mapping `cas-access` secrets into storage options plus unit coverage. | Secrets fetched via shared provider; legacy secret code removed; integration tests cover rotation. |
| SCANNER-SECRETS-01 | DONE (2025-11-06) | Scanner Worker Guild, Security Guild | SURFACE-SECRETS-02 | Adopt `StellaOps.Scanner.Surface.Secrets` for registry/CAS credentials during scan execution.<br>2025-11-02: Surface.Secrets provider wired for CAS token retrieval; integration tests added.<br>2025-11-06: Replaced registry credential plumbing with shared provider, added registry secret stage + metrics, and installed .NET 10 RC2 to validate parser/stage suites via targeted `dotnet test`. | Secrets fetched via shared provider; legacy secret code removed; integration tests cover rotation. |

@@ -17,4 +17,6 @@ public static class ScanAnalysisKeys
    public const string EntryTraceNdjson = "analysis.entrytrace.ndjson";

    public const string SurfaceManifest = "analysis.surface.manifest";

    public const string RegistryCredentials = "analysis.registry.credentials";
}

@@ -0,0 +1,53 @@
using System.Text.Json;

namespace StellaOps.Scanner.Surface.Secrets;

public sealed record AttestationSecret(
    string KeyPem,
    string? CertificatePem,
    string? CertificateChainPem,
    string? RekorApiToken);

public static partial class SurfaceSecretParser
{
    public static AttestationSecret ParseAttestationSecret(SurfaceSecretHandle handle)
    {
        ArgumentNullException.ThrowIfNull(handle);

        var payload = handle.AsBytes();
        if (payload.IsEmpty)
        {
            throw new InvalidOperationException("Surface secret payload is empty.");
        }

        using var document = JsonDocument.Parse(DecodeUtf8(payload));
        var root = document.RootElement;

        var keyPem = GetString(root, "keyPem")
            ?? GetString(root, "pem")
            ?? GetMetadataValue(handle.Metadata, "keyPem")
            ?? GetMetadataValue(handle.Metadata, "pem");

        if (string.IsNullOrWhiteSpace(keyPem))
        {
            throw new InvalidOperationException("Attestation secret must include a 'keyPem' value.");
        }

        var certificatePem = GetString(root, "certificatePem")
            ?? GetMetadataValue(handle.Metadata, "certificatePem");

        var certificateChainPem = GetString(root, "certificateChainPem")
            ?? GetMetadataValue(handle.Metadata, "certificateChainPem");

        var rekorToken = GetString(root, "rekorToken")
            ?? GetString(root, "rekorApiToken")
            ?? GetMetadataValue(handle.Metadata, "rekorToken")
            ?? GetMetadataValue(handle.Metadata, "rekorApiToken");

        return new AttestationSecret(
            keyPem.Trim(),
            certificatePem?.Trim(),
            certificateChainPem?.Trim(),
            rekorToken?.Trim());
    }
}
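`ParseAttestationSecret` above resolves each field through a null-coalescing chain: JSON payload keys first (in declared alias order), then metadata keys, with `keyPem` being the only mandatory field. A Python sketch of that lookup pattern — function names and the dict return shape are ours, purely for illustration:

```python
def get_field(payload, metadata, *names):
    # Payload keys are consulted before metadata keys, aliases in the order given.
    for source in (payload, metadata):
        for name in names:
            value = source.get(name)
            if isinstance(value, str) and value.strip():
                return value.strip()
    return None

def parse_attestation_secret(payload, metadata):
    key_pem = get_field(payload, metadata, "keyPem", "pem")
    if key_pem is None:
        # Mirrors the mandatory-field check in the C# parser.
        raise ValueError("Attestation secret must include a 'keyPem' value.")
    return {
        "keyPem": key_pem,
        "certificatePem": get_field(payload, metadata, "certificatePem"),
        "certificateChainPem": get_field(payload, metadata, "certificateChainPem"),
        "rekorApiToken": get_field(payload, metadata, "rekorToken", "rekorApiToken"),
    }
```

Accepting both payload and metadata sources lets secret backends that can only attach key/value metadata (rather than a JSON body) still satisfy the contract.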
@@ -19,7 +19,7 @@ public sealed record CasAccessSecret(
    string? SessionToken,
    bool? AllowInsecureTls);

public static class SurfaceSecretParser
public static partial class SurfaceSecretParser
{
    public static CasAccessSecret ParseCasAccessSecret(SurfaceSecretHandle handle)
    {

@@ -0,0 +1,347 @@
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Globalization;
using System.Text;
using System.Text.Json;

namespace StellaOps.Scanner.Surface.Secrets;

public sealed record RegistryAccessSecret(
    IReadOnlyList<RegistryCredential> Entries,
    string? DefaultRegistry);

public sealed record RegistryCredential(
    string Registry,
    string? Username,
    string? Password,
    string? IdentityToken,
    string? RegistryToken,
    string? RefreshToken,
    DateTimeOffset? ExpiresAt,
    IReadOnlyCollection<string> Scopes,
    bool? AllowInsecureTls,
    IReadOnlyDictionary<string, string> Headers,
    string? Email);

public static partial class SurfaceSecretParser
{
    public static RegistryAccessSecret ParseRegistryAccessSecret(SurfaceSecretHandle handle)
    {
        ArgumentNullException.ThrowIfNull(handle);

        var entries = new List<RegistryCredential>();
        string? defaultRegistry = null;

        var payload = handle.AsBytes();
        if (!payload.IsEmpty)
        {
            var jsonText = DecodeUtf8(payload);
            using var document = JsonDocument.Parse(jsonText);
            var root = document.RootElement;

            defaultRegistry = GetString(root, "defaultRegistry") ?? GetMetadataValue(handle.Metadata, "defaultRegistry");

            if (TryParseRegistryEntries(root, handle.Metadata, entries) ||
                TryParseAuthsObject(root, handle.Metadata, entries))
            {
                // entries already populated
            }
            else if (root.ValueKind == JsonValueKind.Object && root.GetRawText().Length > 2) // not empty object
            {
                entries.Add(ParseRegistryEntry(root, handle.Metadata, fallbackRegistry: null));
            }
        }

        if (entries.Count == 0 && TryCreateRegistryEntryFromMetadata(handle.Metadata, out var metadataEntry))
        {
            entries.Add(metadataEntry);
        }

        if (entries.Count == 0)
        {
            throw new InvalidOperationException("Registry secret payload does not contain credentials.");
        }

        defaultRegistry ??= GetMetadataValue(handle.Metadata, "defaultRegistry")
            ?? entries[0].Registry;

        return new RegistryAccessSecret(
            new ReadOnlyCollection<RegistryCredential>(entries),
            string.IsNullOrWhiteSpace(defaultRegistry) ? entries[0].Registry : defaultRegistry.Trim());
    }

    private static bool TryParseRegistryEntries(
        JsonElement root,
        IReadOnlyDictionary<string, string> metadata,
        ICollection<RegistryCredential> entries)
    {
        if (!TryGetPropertyIgnoreCase(root, "entries", out var entriesElement) ||
            entriesElement.ValueKind != JsonValueKind.Array)
        {
            return false;
        }

        foreach (var entryElement in entriesElement.EnumerateArray())
        {
            if (entryElement.ValueKind != JsonValueKind.Object)
            {
                continue;
            }

            entries.Add(ParseRegistryEntry(entryElement, metadata, fallbackRegistry: null));
        }

        return entries.Count > 0;
    }

    private static bool TryParseAuthsObject(
        JsonElement root,
        IReadOnlyDictionary<string, string> metadata,
        ICollection<RegistryCredential> entries)
    {
        if (!TryGetPropertyIgnoreCase(root, "auths", out var authsElement) ||
            authsElement.ValueKind != JsonValueKind.Object)
        {
            return false;
        }

        foreach (var property in authsElement.EnumerateObject())
        {
            if (property.Value.ValueKind != JsonValueKind.Object)
            {
                continue;
            }

            entries.Add(ParseRegistryEntry(property.Value, metadata, property.Name));
        }

        return entries.Count > 0;
    }

    private static RegistryCredential ParseRegistryEntry(
        JsonElement element,
        IReadOnlyDictionary<string, string> metadata,
        string? fallbackRegistry)
    {
        var registry = GetString(element, "registry")
            ?? GetString(element, "server")
            ?? fallbackRegistry
            ?? GetMetadataValue(metadata, "registry")
            ?? throw new InvalidOperationException("Registry credential is missing a registry identifier.");

        registry = registry.Trim();

        var username = GetString(element, "username") ?? GetString(element, "user");
        var password = GetString(element, "password") ?? GetString(element, "pass");
        var token = GetString(element, "token") ?? GetString(element, "registryToken");
        var identityToken = GetString(element, "identityToken") ?? GetString(element, "identitytoken");
        var refreshToken = GetString(element, "refreshToken");
        var email = GetString(element, "email");
        var allowInsecure = GetBoolean(element, "allowInsecureTls");

        var headers = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        PopulateHeaders(element, headers);
        PopulateMetadataHeaders(metadata, headers);

        var scopes = new List<string>();
        PopulateScopes(element, scopes);
        PopulateMetadataScopes(metadata, scopes);

        var expiresAt = ParseDateTime(element, "expiresAt");

        var auth = GetString(element, "auth");
        if (!string.IsNullOrWhiteSpace(auth))
        {
            TryApplyBasicAuth(auth, ref username, ref password);
        }

        username ??= GetMetadataValue(metadata, "username");
        password ??= GetMetadataValue(metadata, "password");
        token ??= GetMetadataValue(metadata, "token") ?? GetMetadataValue(metadata, "registryToken");
        identityToken ??= GetMetadataValue(metadata, "identityToken");
        refreshToken ??= GetMetadataValue(metadata, "refreshToken");
        email ??= GetMetadataValue(metadata, "email");

        return new RegistryCredential(
            registry,
            username?.Trim(),
            password,
            identityToken,
            token,
            refreshToken,
            expiresAt,
            scopes.Count == 0 ? Array.Empty<string>() : new ReadOnlyCollection<string>(scopes),
            allowInsecure,
            new ReadOnlyDictionary<string, string>(headers),
            email);
    }

    private static bool TryCreateRegistryEntryFromMetadata(
        IReadOnlyDictionary<string, string> metadata,
        out RegistryCredential entry)
    {
        var registry = GetMetadataValue(metadata, "registry");
        var username = GetMetadataValue(metadata, "username");
        var password = GetMetadataValue(metadata, "password");
        var identityToken = GetMetadataValue(metadata, "identityToken");
        var token = GetMetadataValue(metadata, "token") ?? GetMetadataValue(metadata, "registryToken");

        if (string.IsNullOrWhiteSpace(registry) &&
            string.IsNullOrWhiteSpace(username) &&
            string.IsNullOrWhiteSpace(password) &&
            string.IsNullOrWhiteSpace(identityToken) &&
            string.IsNullOrWhiteSpace(token))
        {
            entry = null!;
            return false;
        }

        var headers = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        PopulateMetadataHeaders(metadata, headers);

        var scopes = new List<string>();
        PopulateMetadataScopes(metadata, scopes);

        entry = new RegistryCredential(
            registry?.Trim() ?? "registry.local",
            username?.Trim(),
            password,
            identityToken,
            token,
            GetMetadataValue(metadata, "refreshToken"),
            ParseDateTime(metadata, "expiresAt"),
            scopes.Count == 0 ? Array.Empty<string>() : new ReadOnlyCollection<string>(scopes),
            ParseBoolean(metadata, "allowInsecureTls"),
            new ReadOnlyDictionary<string, string>(headers),
            GetMetadataValue(metadata, "email"));
        return true;
    }

    private static void PopulateScopes(JsonElement element, ICollection<string> scopes)
    {
        if (!TryGetPropertyIgnoreCase(element, "scopes", out var scopesElement))
        {
            return;
        }

        switch (scopesElement.ValueKind)
        {
            case JsonValueKind.Array:
                foreach (var scope in scopesElement.EnumerateArray())
                {
                    if (scope.ValueKind == JsonValueKind.String)
                    {
                        var value = scope.GetString();
                        if (!string.IsNullOrWhiteSpace(value))
                        {
                            scopes.Add(value.Trim());
                        }
                    }
                }

                break;
            case JsonValueKind.String:
                var text = scopesElement.GetString();
                if (!string.IsNullOrWhiteSpace(text))
                {
                    foreach (var part in text.Split(new[] { ',', ' ' }, StringSplitOptions.RemoveEmptyEntries))
                    {
                        scopes.Add(part.Trim());
                    }
                }

                break;
        }
    }

    private static void PopulateMetadataScopes(IReadOnlyDictionary<string, string> metadata, ICollection<string> scopes)
    {
        foreach (var (key, value) in metadata)
        {
            if (!key.StartsWith("scope", StringComparison.OrdinalIgnoreCase))
            {
                continue;
            }

            if (string.IsNullOrWhiteSpace(value))
            {
                continue;
            }

            scopes.Add(value.Trim());
        }
    }

    private static void TryApplyBasicAuth(string auth, ref string? username, ref string? password)
    {
        try
        {
            var decoded = Encoding.UTF8.GetString(Convert.FromBase64String(auth));
            var separator = decoded.IndexOf(':');
            if (separator >= 0)
            {
                username ??= decoded[..separator];
                password ??= decoded[(separator + 1)..];
            }
        }
        catch (FormatException)
        {
            // ignore malformed auth; caller may still have explicit username/password fields
        }
    }

    private static DateTimeOffset? ParseDateTime(JsonElement element, string propertyName)
    {
        if (!TryGetPropertyIgnoreCase(element, propertyName, out var value) ||
            value.ValueKind != JsonValueKind.String)
        {
            return null;
        }

        var text = value.GetString();
        if (string.IsNullOrWhiteSpace(text))
        {
            return null;
        }

        if (DateTimeOffset.TryParse(text, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal, out var parsed))
        {
            return parsed;
        }

        return null;
    }

    private static DateTimeOffset? ParseDateTime(IReadOnlyDictionary<string, string> metadata, string key)
    {
        var value = GetMetadataValue(metadata, key);
        if (string.IsNullOrWhiteSpace(value))
        {
            return null;
        }

        if (DateTimeOffset.TryParse(value, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal, out var parsed))
        {
            return parsed;
        }

        return null;
    }

    private static bool? ParseBoolean(IReadOnlyDictionary<string, string> metadata, string key)
    {
        var value = GetMetadataValue(metadata, key);
        if (string.IsNullOrWhiteSpace(value))
        {
            return null;
        }

        if (bool.TryParse(value, out var parsed))
        {
            return parsed;
        }

        return null;
    }
}
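Reviewer note: the `TryApplyBasicAuth` helper in this file decodes Docker-config-style `auth` values, i.e. base64-encoded `user:password`, and only fills in credentials that were not already supplied explicitly. A minimal Python sketch of that decoding logic (the function name `apply_basic_auth` is hypothetical, for illustration only):

```python
import base64

def apply_basic_auth(auth: str, username=None, password=None):
    """Decode a docker-config 'auth' value (base64 'user:password').

    Explicit username/password values win over the decoded pair,
    mirroring the ??= fallback order used in the C# helper.
    """
    try:
        decoded = base64.b64decode(auth, validate=True).decode("utf-8")
    except (ValueError, UnicodeDecodeError):
        # ignore malformed auth; caller may still have explicit fields
        return username, password
    sep = decoded.find(":")
    if sep < 0:
        return username, password
    return username or decoded[:sep], password or decoded[sep + 1:]

print(apply_basic_auth("ZGVtbzpwYXNz"))  # ('demo', 'pass')
```

The `ZGVtbzpwYXNz` value is the same fixture used in the `ghcr.io` test below in this commit, which expects it to decode to `demo` / `pass`.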
@@ -0,0 +1,122 @@
using System;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Text.Json;
using System.Threading.Tasks;
using StellaOps.Scanner.Sbomer.BuildXPlugin.Descriptor;
using StellaOps.Scanner.Sbomer.BuildXPlugin.Manifest;
using StellaOps.Scanner.Sbomer.BuildXPlugin.Tests.TestUtilities;
using StellaOps.Scanner.Surface.FS;
using Xunit;

namespace StellaOps.Scanner.Sbomer.BuildXPlugin.Tests.Descriptor;

public sealed class DescriptorCommandSurfaceTests
{
    [Fact]
    public async Task DescriptorCommand_PublishesSurfaceArtifacts()
    {
        await using var temp = new TempDirectory();
        var casRoot = Path.Combine(temp.Path, "cas");
        Directory.CreateDirectory(casRoot);

        var sbomPath = Path.Combine(temp.Path, "sample.cdx.json");
        await File.WriteAllTextAsync(sbomPath, "{\"bomFormat\":\"CycloneDX\",\"specVersion\":\"1.6\"}");

        var layerFragmentsPath = Path.Combine(temp.Path, "layer-fragments.json");
        await File.WriteAllTextAsync(layerFragmentsPath, "[]");

        var entryTraceGraphPath = Path.Combine(temp.Path, "entrytrace-graph.json");
        await File.WriteAllTextAsync(entryTraceGraphPath, "{\"nodes\":[],\"edges\":[]}");

        var entryTraceNdjsonPath = Path.Combine(temp.Path, "entrytrace.ndjson");
        await File.WriteAllTextAsync(entryTraceNdjsonPath, "{}\n{}");

        var manifestOutputPath = Path.Combine(temp.Path, "out", "surface-manifest.json");

        var repoRoot = TestPathHelper.FindRepositoryRoot();
        var manifestDirectory = Path.Combine(repoRoot, "src", "Scanner", "StellaOps.Scanner.Sbomer.BuildXPlugin");
        var pluginAssembly = typeof(BuildxPluginManifest).Assembly.Location;

        var psi = new ProcessStartInfo("dotnet")
        {
            RedirectStandardOutput = true,
            RedirectStandardError = true,
            UseShellExecute = false,
            WorkingDirectory = repoRoot
        };

        psi.ArgumentList.Add(pluginAssembly);
        psi.ArgumentList.Add("descriptor");
        psi.ArgumentList.Add("--manifest");
        psi.ArgumentList.Add(manifestDirectory);
        psi.ArgumentList.Add("--cas");
        psi.ArgumentList.Add(casRoot);
        psi.ArgumentList.Add("--image");
        psi.ArgumentList.Add("sha256:feedfacefeedfacefeedfacefeedfacefeedfacefeedfacefeedfacefeedface");
        psi.ArgumentList.Add("--sbom");
        psi.ArgumentList.Add(sbomPath);
        psi.ArgumentList.Add("--sbom-name");
        psi.ArgumentList.Add("sample.cdx.json");
        psi.ArgumentList.Add("--surface-layer-fragments");
        psi.ArgumentList.Add(layerFragmentsPath);
        psi.ArgumentList.Add("--surface-entrytrace-graph");
        psi.ArgumentList.Add(entryTraceGraphPath);
        psi.ArgumentList.Add("--surface-entrytrace-ndjson");
        psi.ArgumentList.Add(entryTraceNdjsonPath);
        psi.ArgumentList.Add("--surface-cache-root");
        psi.ArgumentList.Add(casRoot);
        psi.ArgumentList.Add("--surface-tenant");
        psi.ArgumentList.Add("test-tenant");
        psi.ArgumentList.Add("--surface-manifest-output");
        psi.ArgumentList.Add(manifestOutputPath);

        var process = Process.Start(psi) ?? throw new InvalidOperationException("Failed to start BuildX plug-in process.");
        var stdout = await process.StandardOutput.ReadToEndAsync();
        var stderr = await process.StandardError.ReadToEndAsync();
        await process.WaitForExitAsync();

        Assert.True(process.ExitCode == 0, $"Descriptor command failed.\nSTDOUT: {stdout}\nSTDERR: {stderr}");

        var descriptor = JsonSerializer.Deserialize<DescriptorDocument>(stdout, new JsonSerializerOptions(JsonSerializerDefaults.Web));
        Assert.NotNull(descriptor);
        Assert.Equal("stellaops.buildx.descriptor.v1", descriptor!.Schema);
        Assert.Equal("sha256:d07d06ae82e1789a5b505731f3ec3add106e23a55395213c9a881c7e816c695c", descriptor.Artifact.Digest);

        Assert.Contains("surface manifest stored", stderr, StringComparison.OrdinalIgnoreCase);
        Assert.True(File.Exists(manifestOutputPath));

        var surfaceManifestPath = Directory.GetFiles(Path.Combine(casRoot, "scanner", "surface", "manifests"), "*.json", SearchOption.AllDirectories).Single();
        var manifestDocument = JsonSerializer.Deserialize<SurfaceManifestDocument>(await File.ReadAllTextAsync(surfaceManifestPath), new JsonSerializerOptions(JsonSerializerDefaults.Web));
        Assert.NotNull(manifestDocument);
        Assert.Equal("test-tenant", manifestDocument!.Tenant);
        Assert.Equal(3, manifestDocument.Artifacts.Count);

        foreach (var artifact in manifestDocument.Artifacts)
        {
            Assert.StartsWith("cas://", artifact.Uri, StringComparison.Ordinal);
            var localPath = ResolveLocalPath(artifact.Uri, casRoot);
            Assert.True(File.Exists(localPath), $"Missing CAS object for {artifact.Uri}");
        }
    }

    private static string ResolveLocalPath(string casUri, string casRoot)
    {
        const string prefix = "cas://";
        if (!casUri.StartsWith(prefix, StringComparison.Ordinal))
        {
            throw new InvalidOperationException($"Unsupported CAS URI {casUri}.");
        }

        var slashIndex = casUri.IndexOf('/', prefix.Length);
        if (slashIndex < 0)
        {
            throw new InvalidOperationException($"CAS URI {casUri} does not contain a bucket path.");
        }

        var relative = casUri[(slashIndex + 1)..];
        var localPath = Path.Combine(casRoot, relative.Replace('/', Path.DirectorySeparatorChar));
        return localPath;
    }
}
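Reviewer note: the `ResolveLocalPath` helper in the test above maps a `cas://<bucket>/<relative-path>` URI onto the local CAS root by skipping the bucket segment and rejoining the remainder with the platform separator. A short Python sketch of the same mapping (function name `resolve_local_path` is hypothetical, for illustration only):

```python
import os

def resolve_local_path(cas_uri: str, cas_root: str) -> str:
    """Map cas://<bucket>/<relative-path> onto the local CAS root.

    The bucket segment is dropped, matching the C# test helper.
    """
    prefix = "cas://"
    if not cas_uri.startswith(prefix):
        raise ValueError(f"Unsupported CAS URI {cas_uri}.")
    slash = cas_uri.find("/", len(prefix))
    if slash < 0:
        raise ValueError(f"CAS URI {cas_uri} does not contain a bucket path.")
    relative = cas_uri[slash + 1:]
    # rejoin with the OS separator, as Path.Combine + Replace does in C#
    return os.path.join(cas_root, *relative.split("/"))
```

Note the original C# in this commit passed a bare `/` to `IndexOf` and `Replace`; those need to be the character literal `'/'` to compile.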
@@ -1,45 +1,45 @@
{
  "schema": "stellaops.buildx.descriptor.v1",
  "generatedAt": "2025-10-18T12:00:00\u002B00:00",
  "generator": {
    "name": "StellaOps.Scanner.Sbomer.BuildXPlugin",
    "version": "1.2.3"
  },
  "subject": {
    "mediaType": "application/vnd.oci.image.manifest.v1\u002Bjson",
    "digest": "sha256:0123456789abcdef"
  },
  "artifact": {
    "mediaType": "application/vnd.cyclonedx\u002Bjson",
    "digest": "sha256:d07d06ae82e1789a5b505731f3ec3add106e23a55395213c9a881c7e816c695c",
    "size": 45,
    "annotations": {
      "org.opencontainers.artifact.type": "application/vnd.stellaops.sbom.layer\u002Bjson",
      "org.stellaops.scanner.version": "1.2.3",
      "org.stellaops.sbom.kind": "inventory",
      "org.stellaops.sbom.format": "cyclonedx-json",
      "org.stellaops.provenance.status": "pending",
      "org.stellaops.provenance.dsse.sha256": "sha256:1b364a6b888d580feb8565f7b6195b24535ca8201b4bcac58da063b32c47220d",
      "org.stellaops.provenance.nonce": "a608acf859cd58a8389816b8d9eb2a07",
      "org.stellaops.license.id": "lic-123",
      "org.opencontainers.image.title": "sample.cdx.json",
      "org.stellaops.repository": "git.stella-ops.org/stellaops"
    }
  },
  "provenance": {
    "status": "pending",
    "expectedDsseSha256": "sha256:1b364a6b888d580feb8565f7b6195b24535ca8201b4bcac58da063b32c47220d",
    "nonce": "a608acf859cd58a8389816b8d9eb2a07",
    "attestorUri": "https://attestor.local/api/v1/provenance",
    "predicateType": "https://slsa.dev/provenance/v1"
  },
  "metadata": {
    "sbomDigest": "sha256:d07d06ae82e1789a5b505731f3ec3add106e23a55395213c9a881c7e816c695c",
    "sbomPath": "sample.cdx.json",
    "sbomMediaType": "application/vnd.cyclonedx\u002Bjson",
    "subjectMediaType": "application/vnd.oci.image.manifest.v1\u002Bjson",
    "repository": "git.stella-ops.org/stellaops",
    "buildRef": "refs/heads/main",
    "attestorUri": "https://attestor.local/api/v1/provenance"
  }
{
  "schema": "stellaops.buildx.descriptor.v1",
  "generatedAt": "2025-10-18T12:00:00\u002B00:00",
  "generator": {
    "name": "StellaOps.Scanner.Sbomer.BuildXPlugin",
    "version": "1.2.3"
  },
  "subject": {
    "mediaType": "application/vnd.oci.image.manifest.v1\u002Bjson",
    "digest": "sha256:0123456789abcdef"
  },
  "artifact": {
    "mediaType": "application/vnd.cyclonedx\u002Bjson",
    "digest": "sha256:d07d06ae82e1789a5b505731f3ec3add106e23a55395213c9a881c7e816c695c",
    "size": 45,
    "annotations": {
      "org.opencontainers.artifact.type": "application/vnd.stellaops.sbom.layer\u002Bjson",
      "org.stellaops.scanner.version": "1.2.3",
      "org.stellaops.sbom.kind": "inventory",
      "org.stellaops.sbom.format": "cyclonedx-json",
      "org.stellaops.provenance.status": "pending",
      "org.stellaops.provenance.dsse.sha256": "sha256:35ab4784f3bad40bb0063b522939ac729cf43d2012059947c0e56475d682c05e",
      "org.stellaops.provenance.nonce": "5e13230e3dcbc8be996d8132d92e8826",
      "org.stellaops.license.id": "lic-123",
      "org.opencontainers.image.title": "sample.cdx.json",
      "org.stellaops.repository": "git.stella-ops.org/stellaops"
    }
  },
  "provenance": {
    "status": "pending",
    "expectedDsseSha256": "sha256:35ab4784f3bad40bb0063b522939ac729cf43d2012059947c0e56475d682c05e",
    "nonce": "5e13230e3dcbc8be996d8132d92e8826",
    "attestorUri": "https://attestor.local/api/v1/provenance",
    "predicateType": "https://slsa.dev/provenance/v1"
  },
  "metadata": {
    "sbomDigest": "sha256:d07d06ae82e1789a5b505731f3ec3add106e23a55395213c9a881c7e816c695c",
    "sbomPath": "sample.cdx.json",
    "sbomMediaType": "application/vnd.cyclonedx\u002Bjson",
    "subjectMediaType": "application/vnd.oci.image.manifest.v1\u002Bjson",
    "repository": "git.stella-ops.org/stellaops",
    "buildRef": "refs/heads/main",
    "attestorUri": "https://attestor.local/api/v1/provenance"
  }
}
@@ -0,0 +1,95 @@
using System;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Scanner.Sbomer.BuildXPlugin.Surface;
using StellaOps.Scanner.Sbomer.BuildXPlugin.Tests.TestUtilities;
using Xunit;

namespace StellaOps.Scanner.Sbomer.BuildXPlugin.Tests.Surface;

public sealed class SurfaceManifestWriterTests
{
    [Fact]
    public async Task WriteAsync_PersistsArtifactsAndManifest()
    {
        await using var temp = new TempDirectory();
        var fragmentsPath = Path.Combine(temp.Path, "layer-fragments.json");
        await File.WriteAllTextAsync(fragmentsPath, "[]");

        var graphPath = Path.Combine(temp.Path, "entrytrace-graph.json");
        await File.WriteAllTextAsync(graphPath, "{\"nodes\":[],\"edges\":[]}");

        var ndjsonPath = Path.Combine(temp.Path, "entrytrace.ndjson");
        await File.WriteAllTextAsync(ndjsonPath, "{}\n{}");

        var manifestOutputPath = Path.Combine(temp.Path, "out", "surface-manifest.json");

        var options = new SurfaceOptions(
            CacheRoot: temp.Path,
            CacheBucket: "scanner-artifacts",
            RootPrefix: "scanner",
            Tenant: "tenant-a",
            Component: "scanner.buildx",
            ComponentVersion: "1.2.3",
            WorkerInstance: "builder-01",
            Attempt: 2,
            ImageDigest: "sha256:feedface",
            ScanId: "scan-123",
            LayerFragmentsPath: fragmentsPath,
            EntryTraceGraphPath: graphPath,
            EntryTraceNdjsonPath: ndjsonPath,
            ManifestOutputPath: manifestOutputPath);

        var writer = new SurfaceManifestWriter(TimeProvider.System);
        var result = await writer.WriteAsync(options, CancellationToken.None);

        Assert.NotNull(result);
        Assert.NotNull(result!.Document.Source);
        Assert.Equal("tenant-a", result.Document.Tenant);
        Assert.Equal("scanner.buildx", result.Document.Source!.Component);
        Assert.Equal("1.2.3", result.Document.Source.Version);
        Assert.Equal(3, result.Document.Artifacts.Count);

        var kinds = result.Document.Artifacts.Select(a => a.Kind).ToHashSet();
        Assert.Contains("entrytrace.graph", kinds);
        Assert.Contains("entrytrace.ndjson", kinds);
        Assert.Contains("layer.fragments", kinds);

        Assert.True(File.Exists(result.ManifestPath));
        Assert.True(File.Exists(manifestOutputPath));

        foreach (var artifact in result.Artifacts)
        {
            Assert.True(File.Exists(artifact.FilePath));
            Assert.False(string.IsNullOrWhiteSpace(artifact.ManifestArtifact.Uri));
            Assert.StartsWith("cas://scanner-artifacts/", artifact.ManifestArtifact.Uri, StringComparison.Ordinal);
        }
    }

    [Fact]
    public async Task WriteAsync_NoArtifacts_ReturnsNull()
    {
        await using var temp = new TempDirectory();
        var options = new SurfaceOptions(
            CacheRoot: temp.Path,
            CacheBucket: "scanner-artifacts",
            RootPrefix: "scanner",
            Tenant: "tenant-a",
            Component: "scanner.buildx",
            ComponentVersion: "1.0",
            WorkerInstance: "builder-01",
            Attempt: 1,
            ImageDigest: "sha256:deadbeef",
            ScanId: "scan-1",
            LayerFragmentsPath: null,
            EntryTraceGraphPath: null,
            EntryTraceNdjsonPath: null,
            ManifestOutputPath: null);

        var writer = new SurfaceManifestWriter(TimeProvider.System);
        var result = await writer.WriteAsync(options, CancellationToken.None);
        Assert.Null(result);
    }
}
@@ -0,0 +1,23 @@
using System;
using System.IO;

namespace StellaOps.Scanner.Sbomer.BuildXPlugin.Tests.TestUtilities;

internal static class TestPathHelper
{
    public static string FindRepositoryRoot()
    {
        var current = AppContext.BaseDirectory;
        for (var i = 0; i < 15 && !string.IsNullOrWhiteSpace(current); i++)
        {
            if (File.Exists(Path.Combine(current, "global.json")))
            {
                return current;
            }

            current = Directory.GetParent(current)?.FullName;
        }

        throw new InvalidOperationException("Unable to locate repository root (global.json not found).");
    }
}
@@ -0,0 +1,112 @@
using System;
using System.Collections.Generic;
using System.Text;
using StellaOps.Scanner.Surface.Secrets;
using Xunit;

namespace StellaOps.Scanner.Surface.Secrets.Tests;

public sealed class RegistryAccessSecretParserTests
{
    [Fact]
    public void ParseRegistrySecret_WithEntriesArray_ReturnsCredential()
    {
        const string json = """
        {
            "defaultRegistry": "registry.example.com",
            "entries": [
                {
                    "registry": "registry.example.com",
                    "username": "demo",
                    "password": "s3cret",
                    "token": "token-123",
                    "identityToken": "identity-token",
                    "refreshToken": "refresh-token",
                    "expiresAt": "2025-12-01T10:00:00Z",
                    "allowInsecureTls": false,
                    "scopes": ["repo:sample:pull"],
                    "headers": {
                        "X-Test": "value"
                    },
                    "email": "demo@example.com"
                }
            ]
        }
        """;

        using var handle = SurfaceSecretHandle.FromBytes(Encoding.UTF8.GetBytes(json));
        var secret = SurfaceSecretParser.ParseRegistryAccessSecret(handle);

        Assert.Equal("registry.example.com", secret.DefaultRegistry);
        var entry = Assert.Single(secret.Entries);
        Assert.Equal("registry.example.com", entry.Registry);
        Assert.Equal("demo", entry.Username);
        Assert.Equal("s3cret", entry.Password);
        Assert.Equal("token-123", entry.RegistryToken);
        Assert.Equal("identity-token", entry.IdentityToken);
        Assert.Equal("refresh-token", entry.RefreshToken);
        Assert.Equal("demo@example.com", entry.Email);
        Assert.Equal(new DateTimeOffset(2025, 12, 1, 10, 0, 0, TimeSpan.Zero), entry.ExpiresAt);
        Assert.Equal(false, entry.AllowInsecureTls);
        Assert.Contains("repo:sample:pull", entry.Scopes);
        Assert.Equal("value", entry.Headers["X-Test"]);
    }

    [Fact]
    public void ParseRegistrySecret_WithDockerAuthsObject_DecodesBasicAuth()
    {
        const string json = """
        {
            "auths": {
                "ghcr.io": {
                    "auth": "ZGVtbzpwYXNz",
                    "identitytoken": "id-token"
                }
            }
        }
        """;

        var metadata = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            ["token"] = "metadata-token"
        };

        using var handle = SurfaceSecretHandle.FromBytes(Encoding.UTF8.GetBytes(json), metadata);
        var secret = SurfaceSecretParser.ParseRegistryAccessSecret(handle);

        var entry = Assert.Single(secret.Entries);
        Assert.Equal("ghcr.io", entry.Registry);
        Assert.Equal("demo", entry.Username);
        Assert.Equal("pass", entry.Password);
        Assert.Equal("metadata-token", entry.RegistryToken);
        Assert.Equal("id-token", entry.IdentityToken);
    }

    [Fact]
    public void ParseRegistrySecret_MetadataFallback_ReturnsCredential()
    {
        var metadata = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            ["registry"] = "registry.internal",
            ["username"] = "meta-user",
            ["password"] = "meta-pass",
            ["scope:0"] = "repo:internal:pull",
            ["header:X-From"] = "metadata",
            ["defaultRegistry"] = "registry.internal",
            ["expiresAt"] = "2025-11-10T00:00:00Z",
            ["allowInsecureTls"] = "true"
        };

        using var handle = SurfaceSecretHandle.FromBytes(ReadOnlySpan<byte>.Empty, metadata);
        var secret = SurfaceSecretParser.ParseRegistryAccessSecret(handle);

        var entry = Assert.Single(secret.Entries);
        Assert.Equal("registry.internal", entry.Registry);
        Assert.Equal("meta-user", entry.Username);
        Assert.Equal("meta-pass", entry.Password);
        Assert.Contains("repo:internal:pull", entry.Scopes);
        Assert.Equal("metadata", entry.Headers["X-From"]);
        Assert.True(entry.AllowInsecureTls);
        Assert.Equal("registry.internal", secret.DefaultRegistry);
    }
}
@@ -30,7 +30,10 @@ public sealed class ScannerSurfaceSecretConfiguratorTests
|
||||
""";
|
||||
|
||||
using var handle = SurfaceSecretHandle.FromBytes(Encoding.UTF8.GetBytes(json));
|
||||
var secretProvider = new StubSecretProvider(handle);
|
||||
var secretProvider = new StubSecretProvider(new Dictionary<string, SurfaceSecretHandle>(StringComparer.OrdinalIgnoreCase)
|
||||
{
|
||||
["cas-access"] = handle
|
||||
});
|
||||
var environment = new StubSurfaceEnvironment();
|
||||
var options = new ScannerWebServiceOptions();
|
||||
|
||||
@@ -82,17 +85,101 @@ public sealed class ScannerSurfaceSecretConfiguratorTests
        Assert.Equal("X-Sync", storageOptions.ObjectStore.RustFs.ApiKeyHeader);
    }

    [Fact]
    public void Configure_AppliesAttestationSecretToSigning()
    {
        const string json = """
            {
              "keyPem": "-----BEGIN KEY-----\nYWJj\n-----END KEY-----",
              "certificatePem": "CERT-PEM",
              "certificateChainPem": "CHAIN-PEM"
            }
            """;

        using var handle = SurfaceSecretHandle.FromBytes(Encoding.UTF8.GetBytes(json));
        var secretProvider = new StubSecretProvider(new Dictionary<string, SurfaceSecretHandle>(StringComparer.OrdinalIgnoreCase)
        {
            ["attestation"] = handle
        });
        var environment = new StubSurfaceEnvironment();
        var options = new ScannerWebServiceOptions();

        var configurator = new ScannerSurfaceSecretConfigurator(
            secretProvider,
            environment,
            NullLogger<ScannerSurfaceSecretConfigurator>.Instance);

        configurator.Configure(options);

        Assert.Equal("-----BEGIN KEY-----\nYWJj\n-----END KEY-----", options.Signing.KeyPem);
        Assert.Equal("CERT-PEM", options.Signing.CertificatePem);
        Assert.Equal("CHAIN-PEM", options.Signing.CertificateChainPem);
    }

    [Fact]
    public void Configure_AppliesRegistrySecretToOptions()
    {
        const string json = """
            {
              "defaultRegistry": "registry.example.com",
              "entries": [
                {
                  "registry": "registry.example.com",
                  "username": "demo",
                  "password": "secret",
                  "scopes": ["repo:sample:pull"],
                  "headers": { "X-Test": "value" },
                  "allowInsecureTls": true,
                  "email": "demo@example.com"
                }
              ]
            }
            """;

        using var handle = SurfaceSecretHandle.FromBytes(Encoding.UTF8.GetBytes(json));
        var secretProvider = new StubSecretProvider(new Dictionary<string, SurfaceSecretHandle>(StringComparer.OrdinalIgnoreCase)
        {
            ["registry"] = handle
        });
        var environment = new StubSurfaceEnvironment();
        var options = new ScannerWebServiceOptions();

        var configurator = new ScannerSurfaceSecretConfigurator(
            secretProvider,
            environment,
            NullLogger<ScannerSurfaceSecretConfigurator>.Instance);

        configurator.Configure(options);

        Assert.Equal("registry.example.com", options.Registry.DefaultRegistry);
        var credential = Assert.Single(options.Registry.Credentials);
        Assert.Equal("registry.example.com", credential.Registry);
        Assert.Equal("demo", credential.Username);
        Assert.Equal("secret", credential.Password);
        Assert.True(credential.AllowInsecureTls);
        Assert.Contains("repo:sample:pull", credential.Scopes);
        Assert.Equal("value", credential.Headers["X-Test"]);
        Assert.Equal("demo@example.com", credential.Email);
    }

    private sealed class StubSecretProvider : ISurfaceSecretProvider
    {
        private readonly SurfaceSecretHandle _handle;
        private readonly IDictionary<string, SurfaceSecretHandle> _handles;

        public StubSecretProvider(SurfaceSecretHandle handle)
        public StubSecretProvider(IDictionary<string, SurfaceSecretHandle> handles)
        {
            _handle = handle;
            _handles = handles ?? throw new ArgumentNullException(nameof(handles));
        }

        public ValueTask<SurfaceSecretHandle> GetAsync(SurfaceSecretRequest request, CancellationToken cancellationToken = default)
            => ValueTask.FromResult(_handle);
        {
            if (_handles.TryGetValue(request.SecretType, out var handle))
            {
                return ValueTask.FromResult(handle);
            }

            throw new SurfaceSecretNotFoundException(request);
        }
    }

    private sealed class StubSurfaceEnvironment : ISurfaceEnvironment
@@ -9,7 +9,7 @@ using System.Text.Json;
using System.Text.Json.Serialization;
using System.Threading.Tasks;
using System.Threading;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc.Testing;
using Microsoft.AspNetCore.TestHost;
using Microsoft.Extensions.DependencyInjection;
@@ -17,6 +17,7 @@ using StellaOps.Scanner.EntryTrace;
using StellaOps.Scanner.EntryTrace.Serialization;
using StellaOps.Scanner.Storage.Catalog;
using StellaOps.Scanner.Storage.Repositories;
using StellaOps.Scanner.Storage.ObjectStore;
using StellaOps.Scanner.WebService.Contracts;
using StellaOps.Scanner.WebService.Domain;
using StellaOps.Scanner.WebService.Services;
@@ -92,39 +93,88 @@ public sealed class ScansEndpointsTests

        using var factory = new ScannerApplicationFactory();

        const string manifestDigest = "sha256:b2efc2d1f8b042b7f168bcb7d4e2f8e91d36b8306bd855382c5f847efc2c1111";
        const string graphDigest = "sha256:9a0d4f8c7b6a5e4d3c2b1a0f9e8d7c6b5a4f3e2d1c0b9a8f7e6d5c4b3a291819";
        const string ndjsonDigest = "sha256:3f2e1d0c9b8a7f6e5d4c3b2a1908f7e6d5c4b3a29181726354433221100ffeec";
        const string fragmentsDigest = "sha256:aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55";

        using (var scope = factory.Services.CreateScope())
        {
            var artifactRepository = scope.ServiceProvider.GetRequiredService<ArtifactRepository>();
            var linkRepository = scope.ServiceProvider.GetRequiredService<LinkRepository>();
            var artifactId = CatalogIdFactory.CreateArtifactId(ArtifactDocumentType.ImageBom, digest);
            var now = DateTime.UtcNow;

            var artifact = new ArtifactDocument
            async Task InsertAsync(
                ArtifactDocumentType type,
                ArtifactDocumentFormat format,
                string artifactDigest,
                string mediaType,
                string ttlClass)
            {
                Id = artifactId,
                Type = ArtifactDocumentType.ImageBom,
                Format = ArtifactDocumentFormat.CycloneDxJson,
                MediaType = "application/vnd.cyclonedx+json; version=1.6; view=inventory",
                BytesSha256 = digest,
                SizeBytes = 2048,
                Immutable = true,
                RefCount = 1,
                TtlClass = "default",
                CreatedAtUtc = DateTime.UtcNow,
                UpdatedAtUtc = DateTime.UtcNow
            };
                var artifactId = CatalogIdFactory.CreateArtifactId(type, artifactDigest);
                var document = new ArtifactDocument
                {
                    Id = artifactId,
                    Type = type,
                    Format = format,
                    MediaType = mediaType,
                    BytesSha256 = artifactDigest,
                    SizeBytes = 2048,
                    Immutable = true,
                    RefCount = 1,
                    TtlClass = ttlClass,
                    CreatedAtUtc = now,
                    UpdatedAtUtc = now
                };

                await artifactRepository.UpsertAsync(artifact, CancellationToken.None).ConfigureAwait(false);
                await artifactRepository.UpsertAsync(document, CancellationToken.None).ConfigureAwait(false);

                var link = new LinkDocument
                {
                    Id = CatalogIdFactory.CreateLinkId(LinkSourceType.Image, digest, artifactId),
                    FromType = LinkSourceType.Image,
                    FromDigest = digest,
                    ArtifactId = artifactId,
                    CreatedAtUtc = DateTime.UtcNow
                };
                var link = new LinkDocument
                {
                    Id = CatalogIdFactory.CreateLinkId(LinkSourceType.Image, digest, artifactId),
                    FromType = LinkSourceType.Image,
                    FromDigest = digest,
                    ArtifactId = artifactId,
                    CreatedAtUtc = now
                };

                await linkRepository.UpsertAsync(link, CancellationToken.None).ConfigureAwait(false);
                await linkRepository.UpsertAsync(link, CancellationToken.None).ConfigureAwait(false);
            }

            await InsertAsync(
                ArtifactDocumentType.ImageBom,
                ArtifactDocumentFormat.CycloneDxJson,
                digest,
                "application/vnd.cyclonedx+json; version=1.6; view=inventory",
                "default").ConfigureAwait(false);

            await InsertAsync(
                ArtifactDocumentType.SurfaceManifest,
                ArtifactDocumentFormat.SurfaceManifestJson,
                manifestDigest,
                "application/vnd.stellaops.surface.manifest+json",
                "surface.manifest").ConfigureAwait(false);

            await InsertAsync(
                ArtifactDocumentType.SurfaceEntryTrace,
                ArtifactDocumentFormat.EntryTraceGraphJson,
                graphDigest,
                "application/json",
                "surface.payload").ConfigureAwait(false);

            await InsertAsync(
                ArtifactDocumentType.SurfaceEntryTrace,
                ArtifactDocumentFormat.EntryTraceNdjson,
                ndjsonDigest,
                "application/x-ndjson",
                "surface.payload").ConfigureAwait(false);

            await InsertAsync(
                ArtifactDocumentType.SurfaceLayerFragment,
                ArtifactDocumentFormat.ComponentFragmentJson,
                fragmentsDigest,
                "application/json",
                "surface.payload").ConfigureAwait(false);
        }

        using var client = factory.CreateClient();
@@ -160,15 +210,46 @@ public sealed class ScansEndpointsTests
        Assert.Equal(digest, manifest.ImageDigest);
        Assert.Equal(surface.Tenant, manifest.Tenant);
        Assert.NotEqual(default, manifest.GeneratedAt);
        var manifestArtifact = Assert.Single(manifest.Artifacts);
        Assert.Equal("sbom-inventory", manifestArtifact.Kind);
        Assert.Equal("cdx-json", manifestArtifact.Format);
        Assert.Equal(digest, manifestArtifact.Digest);
        Assert.Equal("application/vnd.cyclonedx+json; version=1.6; view=inventory", manifestArtifact.MediaType);
        Assert.Equal("inventory", manifestArtifact.View);
        var artifactsByKind = manifest.Artifacts.ToDictionary(a => a.Kind, StringComparer.Ordinal);
        Assert.Equal(5, artifactsByKind.Count);

        var expectedUri = $"cas://scanner-artifacts/scanner/images/{digestValue}/sbom.cdx.json";
        Assert.Equal(expectedUri, manifestArtifact.Uri);
        static string BuildUri(ArtifactDocumentType type, ArtifactDocumentFormat format, string digestValue)
            => $"cas://scanner-artifacts/{ArtifactObjectKeyBuilder.Build(type, format, digestValue, "scanner")}";

        var inventory = artifactsByKind["sbom-inventory"];
        Assert.Equal(digest, inventory.Digest);
        Assert.Equal("cdx-json", inventory.Format);
        Assert.Equal("application/vnd.cyclonedx+json; version=1.6; view=inventory", inventory.MediaType);
        Assert.Equal("inventory", inventory.View);
        Assert.Equal(BuildUri(ArtifactDocumentType.ImageBom, ArtifactDocumentFormat.CycloneDxJson, digest), inventory.Uri);

        var manifestArtifact = artifactsByKind["surface.manifest"];
        Assert.Equal(manifestDigest, manifestArtifact.Digest);
        Assert.Equal("surface.manifest", manifestArtifact.Format);
        Assert.Equal("application/vnd.stellaops.surface.manifest+json", manifestArtifact.MediaType);
        Assert.Null(manifestArtifact.View);
        Assert.Equal(BuildUri(ArtifactDocumentType.SurfaceManifest, ArtifactDocumentFormat.SurfaceManifestJson, manifestDigest), manifestArtifact.Uri);

        var graphArtifact = artifactsByKind["entrytrace.graph"];
        Assert.Equal(graphDigest, graphArtifact.Digest);
        Assert.Equal("entrytrace.graph", graphArtifact.Format);
        Assert.Equal("application/json", graphArtifact.MediaType);
        Assert.Null(graphArtifact.View);
        Assert.Equal(BuildUri(ArtifactDocumentType.SurfaceEntryTrace, ArtifactDocumentFormat.EntryTraceGraphJson, graphDigest), graphArtifact.Uri);

        var ndjsonArtifact = artifactsByKind["entrytrace.ndjson"];
        Assert.Equal(ndjsonDigest, ndjsonArtifact.Digest);
        Assert.Equal("entrytrace.ndjson", ndjsonArtifact.Format);
        Assert.Equal("application/x-ndjson", ndjsonArtifact.MediaType);
        Assert.Null(ndjsonArtifact.View);
        Assert.Equal(BuildUri(ArtifactDocumentType.SurfaceEntryTrace, ArtifactDocumentFormat.EntryTraceNdjson, ndjsonDigest), ndjsonArtifact.Uri);

        var fragmentsArtifact = artifactsByKind["layer.fragments"];
        Assert.Equal(fragmentsDigest, fragmentsArtifact.Digest);
        Assert.Equal("layer.fragments", fragmentsArtifact.Format);
        Assert.Equal("application/json", fragmentsArtifact.MediaType);
        Assert.Equal("inventory", fragmentsArtifact.View);
        Assert.Equal(BuildUri(ArtifactDocumentType.SurfaceLayerFragment, ArtifactDocumentFormat.ComponentFragmentJson, fragmentsDigest), fragmentsArtifact.Uri);
    }

    [Fact]

@@ -0,0 +1,221 @@
using System;
using System.Collections.Generic;
using System.Diagnostics.Metrics;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.Scanner.Core.Contracts;
using StellaOps.Scanner.Surface.Env;
using StellaOps.Scanner.Surface.Secrets;
using StellaOps.Scanner.Worker.Diagnostics;
using StellaOps.Scanner.Worker.Processing;
using Xunit;

namespace StellaOps.Scanner.Worker.Tests;

public sealed class RegistrySecretStageExecutorTests
{
    [Fact]
    public async Task ExecuteAsync_WithSecret_StoresCredentialsAndEmitsMetrics()
    {
        const string secretJson = """
            {
              "defaultRegistry": "registry.example.com",
              "entries": [
                {
                  "registry": "registry.example.com",
                  "username": "demo",
                  "password": "s3cret",
                  "expiresAt": "2099-01-01T00:00:00Z"
                }
              ]
            }
            """;

        var provider = new StubSecretProvider(secretJson);
        var environment = new StubSurfaceEnvironment("tenant-eu");
        var metrics = new ScannerWorkerMetrics();
        var timeProvider = TimeProvider.System;
        var executor = new RegistrySecretStageExecutor(
            provider,
            environment,
            metrics,
            timeProvider,
            NullLogger<RegistrySecretStageExecutor>.Instance);

        var metadata = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            ["surface.registry.secret"] = "primary"
        };
        var lease = new StubLease("job-1", "scan-1", metadata);
        using var contextCancellation = CancellationTokenSource.CreateLinkedTokenSource(CancellationToken.None);
        var context = new ScanJobContext(lease, timeProvider, timeProvider.GetUtcNow(), contextCancellation.Token);

        var measurements = new List<(long Value, KeyValuePair<string, object?>[] Tags)>();
        using var listener = CreateCounterListener("scanner_worker_registry_secret_requests_total", measurements);

        await executor.ExecuteAsync(context, CancellationToken.None);
        listener.RecordObservableInstruments();

        Assert.True(context.Analysis.TryGet<RegistryAccessSecret>(ScanAnalysisKeys.RegistryCredentials, out var secret));
        Assert.NotNull(secret);
        Assert.Single(secret!.Entries);

        Assert.Contains(
            measurements,
            measurement => measurement.Value == 1 &&
                HasTagValue(measurement.Tags, "secret.result", "resolved") &&
                HasTagValue(measurement.Tags, "secret.name", "primary"));
    }

    [Fact]
    public async Task ExecuteAsync_SecretMissing_RecordsMissingMetric()
    {
        var provider = new MissingSecretProvider();
        var environment = new StubSurfaceEnvironment("tenant-eu");
        var metrics = new ScannerWorkerMetrics();
        var executor = new RegistrySecretStageExecutor(
            provider,
            environment,
            metrics,
            TimeProvider.System,
            NullLogger<RegistrySecretStageExecutor>.Instance);

        var lease = new StubLease("job-2", "scan-2", new Dictionary<string, string>());
        var context = new ScanJobContext(lease, TimeProvider.System, TimeProvider.System.GetUtcNow(), CancellationToken.None);

        var measurements = new List<(long Value, KeyValuePair<string, object?>[] Tags)>();
        using var listener = CreateCounterListener("scanner_worker_registry_secret_requests_total", measurements);

        await executor.ExecuteAsync(context, CancellationToken.None);
        listener.RecordObservableInstruments();

        Assert.False(context.Analysis.TryGet<RegistryAccessSecret>(ScanAnalysisKeys.RegistryCredentials, out _));

        Assert.Contains(
            measurements,
            measurement => measurement.Value == 1 &&
                HasTagValue(measurement.Tags, "secret.result", "missing") &&
                HasTagValue(measurement.Tags, "secret.name", "default"));
    }

    private static MeterListener CreateCounterListener(
        string instrumentName,
        ICollection<(long Value, KeyValuePair<string, object?>[] Tags)> measurements)
    {
        var listener = new MeterListener
        {
            InstrumentPublished = (instrument, meterListener) =>
            {
                if (instrument.Meter.Name == ScannerWorkerInstrumentation.MeterName &&
                    instrument.Name == instrumentName)
                {
                    meterListener.EnableMeasurementEvents(instrument);
                }
            }
        };

        listener.SetMeasurementEventCallback<long>((instrument, measurement, tags, state) =>
        {
            var copy = tags.ToArray();
            measurements.Add((measurement, copy));
        });

        listener.Start();
        return listener;
    }

    private static bool HasTagValue(IEnumerable<KeyValuePair<string, object?>> tags, string key, string expected)
        => tags.Any(tag => string.Equals(tag.Key, key, StringComparison.OrdinalIgnoreCase) &&
            string.Equals(tag.Value?.ToString(), expected, StringComparison.OrdinalIgnoreCase));

    private sealed class StubSecretProvider : ISurfaceSecretProvider
    {
        private readonly string _json;

        public StubSecretProvider(string json)
        {
            _json = json;
        }

        public ValueTask<SurfaceSecretHandle> GetAsync(SurfaceSecretRequest request, CancellationToken cancellationToken = default)
        {
            var bytes = Encoding.UTF8.GetBytes(_json);
            return ValueTask.FromResult(SurfaceSecretHandle.FromBytes(bytes));
        }
    }

    private sealed class MissingSecretProvider : ISurfaceSecretProvider
    {
        public ValueTask<SurfaceSecretHandle> GetAsync(SurfaceSecretRequest request, CancellationToken cancellationToken = default)
            => throw new SurfaceSecretNotFoundException(request);
    }

    private sealed class StubSurfaceEnvironment : ISurfaceEnvironment
    {
        public StubSurfaceEnvironment(string tenant)
        {
            Settings = new SurfaceEnvironmentSettings(
                new Uri("https://surface.example"),
                "bucket",
                "region",
                new DirectoryInfo(Path.GetTempPath()),
                1024,
                false,
                Array.Empty<string>(),
                new SurfaceSecretsConfiguration("inline", tenant, null, null, null, AllowInline: true),
                tenant,
                new SurfaceTlsConfiguration(null, null, null))
            {
                CreatedAtUtc = DateTimeOffset.UtcNow
            };

            RawVariables = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        }

        public SurfaceEnvironmentSettings Settings { get; }

        public IReadOnlyDictionary<string, string> RawVariables { get; }
    }

    private sealed class StubLease : IScanJobLease
    {
        private readonly IReadOnlyDictionary<string, string> _metadata;

        public StubLease(string jobId, string scanId, IReadOnlyDictionary<string, string> metadata)
        {
            JobId = jobId;
            ScanId = scanId;
            _metadata = metadata;
            EnqueuedAtUtc = DateTimeOffset.UtcNow.AddMinutes(-1);
            LeasedAtUtc = DateTimeOffset.UtcNow;
        }

        public string JobId { get; }

        public string ScanId { get; }

        public int Attempt { get; } = 1;

        public DateTimeOffset EnqueuedAtUtc { get; }

        public DateTimeOffset LeasedAtUtc { get; }

        public TimeSpan LeaseDuration { get; } = TimeSpan.FromMinutes(5);

        public IReadOnlyDictionary<string, string> Metadata => _metadata;

        public ValueTask RenewAsync(CancellationToken cancellationToken) => ValueTask.CompletedTask;

        public ValueTask CompleteAsync(CancellationToken cancellationToken) => ValueTask.CompletedTask;

        public ValueTask AbandonAsync(string reason, CancellationToken cancellationToken) => ValueTask.CompletedTask;

        public ValueTask PoisonAsync(string reason, CancellationToken cancellationToken) => ValueTask.CompletedTask;

        public ValueTask DisposeAsync() => ValueTask.CompletedTask;
    }
}
@@ -0,0 +1,349 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.IO;
using System.Linq;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Threading;
using System.Threading.Tasks;
using System.Security.Cryptography;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.Scanner.Core.Contracts;
using StellaOps.Scanner.EntryTrace;
using StellaOps.Scanner.Surface.Env;
using StellaOps.Scanner.Surface.FS;
using StellaOps.Scanner.Worker.Diagnostics;
using StellaOps.Scanner.Worker.Processing;
using StellaOps.Scanner.Worker.Processing.Surface;
using StellaOps.Scanner.Worker.Tests.TestInfrastructure;
using Xunit;

namespace StellaOps.Scanner.Worker.Tests;

public sealed class SurfaceManifestStageExecutorTests
{
    [Fact]
    public async Task ExecuteAsync_WhenNoPayloads_SkipsPublishAndRecordsSkipMetric()
    {
        var metrics = new ScannerWorkerMetrics();
        var publisher = new TestSurfaceManifestPublisher();
        var cache = new RecordingSurfaceCache();
        var environment = new TestSurfaceEnvironment("tenant-a");

        using var listener = new WorkerMeterListener();
        listener.Start();

        var executor = new SurfaceManifestStageExecutor(
            publisher,
            cache,
            environment,
            metrics,
            NullLogger<SurfaceManifestStageExecutor>.Instance);

        var context = CreateContext();

        await executor.ExecuteAsync(context, CancellationToken.None);

        Assert.Equal(0, publisher.PublishCalls);
        Assert.Empty(cache.Entries);

        var skipMetrics = listener.Measurements
            .Where(m => m.InstrumentName == "scanner_worker_surface_manifests_skipped_total")
            .ToArray();

        Assert.Single(skipMetrics);
        Assert.Equal(1, skipMetrics[0].Value);
        Assert.Equal("skipped", skipMetrics[0]["surface.result"]);
    }

    [Fact]
    public async Task ExecuteAsync_PublishesPayloads_CachesArtifacts_AndRecordsMetrics()
    {
        var metrics = new ScannerWorkerMetrics();
        var publisher = new TestSurfaceManifestPublisher("tenant-a");
        var cache = new RecordingSurfaceCache();
        var environment = new TestSurfaceEnvironment("tenant-a");

        using var listener = new WorkerMeterListener();
        listener.Start();

        var executor = new SurfaceManifestStageExecutor(
            publisher,
            cache,
            environment,
            metrics,
            NullLogger<SurfaceManifestStageExecutor>.Instance);

        var context = CreateContext();
        PopulateAnalysis(context);

        await executor.ExecuteAsync(context, CancellationToken.None);

        Assert.Equal(1, publisher.PublishCalls);
        Assert.True(context.Analysis.TryGet<SurfaceManifestPublishResult>(ScanAnalysisKeys.SurfaceManifest, out var result));
        Assert.NotNull(result);
        Assert.Equal(publisher.LastManifestDigest, result!.ManifestDigest);

        Assert.Equal(4, cache.Entries.Count);
        Assert.Contains(cache.Entries.Keys, key => key.Namespace == "surface.artifacts.entrytrace.graph" && key.Tenant == "tenant-a");
        Assert.Contains(cache.Entries.Keys, key => key.Namespace == "surface.artifacts.entrytrace.ndjson" && key.Tenant == "tenant-a");
        Assert.Contains(cache.Entries.Keys, key => key.Namespace == "surface.artifacts.layer.fragments" && key.Tenant == "tenant-a");
        Assert.Contains(cache.Entries.Keys, key => key.Namespace == "surface.manifests" && key.Tenant == "tenant-a");

        var publishedMetrics = listener.Measurements
            .Where(m => m.InstrumentName == "scanner_worker_surface_manifests_published_total")
            .ToArray();
        Assert.Single(publishedMetrics);
        Assert.Equal(1, publishedMetrics[0].Value);
        Assert.Equal("published", publishedMetrics[0]["surface.result"]);
        Assert.Equal(3, Convert.ToInt32(publishedMetrics[0]["surface.payload_count"]));

        var payloadMetrics = listener.Measurements
            .Where(m => m.InstrumentName == "scanner_worker_surface_payload_persisted_total")
            .ToArray();
        Assert.Equal(3, payloadMetrics.Length);
        Assert.Contains(payloadMetrics, m => Equals("entrytrace.graph", m["surface.kind"]));
        Assert.Contains(payloadMetrics, m => Equals("entrytrace.ndjson", m["surface.kind"]));
        Assert.Contains(payloadMetrics, m => Equals("layer.fragments", m["surface.kind"]));
    }

    private static ScanJobContext CreateContext()
    {
        var lease = new FakeJobLease();
        return new ScanJobContext(lease, TimeProvider.System, DateTimeOffset.UtcNow, CancellationToken.None);
    }

    private static void PopulateAnalysis(ScanJobContext context)
    {
        var node = new EntryTraceNode(
            Id: 1,
            Kind: EntryTraceNodeKind.Command,
            DisplayName: "/bin/entry",
            Arguments: ImmutableArray<string>.Empty,
            InterpreterKind: EntryTraceInterpreterKind.None,
            Evidence: null,
            Span: null,
            Metadata: null);

        var graph = new EntryTraceGraph(
            Outcome: EntryTraceOutcome.Resolved,
            Nodes: ImmutableArray.Create(node),
            Edges: ImmutableArray<EntryTraceEdge>.Empty,
            Diagnostics: ImmutableArray<EntryTraceDiagnostic>.Empty,
            Plans: ImmutableArray<EntryTracePlan>.Empty,
            Terminals: ImmutableArray<EntryTraceTerminal>.Empty);

        context.Analysis.Set(ScanAnalysisKeys.EntryTraceGraph, graph);

        var ndjson = ImmutableArray.Create("{\"entry\":\"/bin/entry\"}\n");
        context.Analysis.Set(ScanAnalysisKeys.EntryTraceNdjson, ndjson);

        var component = new ComponentRecord
        {
            Identity = ComponentIdentity.Create("pkg:test", "test", "1.0.0"),
            LayerDigest = "sha256:layer-1",
            Evidence = ImmutableArray<ComponentEvidence>.Empty,
            Usage = ComponentUsage.Create(true, new[] { "/bin/entry" })
        };

        var fragment = LayerComponentFragment.Create("sha256:layer-1", new[] { component });
        context.Analysis.Set(ScanAnalysisKeys.LayerComponentFragments, ImmutableArray.Create(fragment));
    }

    private sealed class RecordingSurfaceCache : ISurfaceCache
    {
        private readonly Dictionary<SurfaceCacheKey, byte[]> _entries = new();

        public IReadOnlyDictionary<SurfaceCacheKey, byte[]> Entries => _entries;

        public Task<T> GetOrCreateAsync<T>(
            SurfaceCacheKey key,
            Func<CancellationToken, Task<T>> factory,
            Func<T, ReadOnlyMemory<byte>> serializer,
            Func<ReadOnlyMemory<byte>, T> deserializer,
            CancellationToken cancellationToken = default)
        {
            if (_entries.TryGetValue(key, out var payload))
            {
                return Task.FromResult(deserializer(payload));
            }

            return CreateAsync(key, factory, serializer, cancellationToken);
        }

        public Task<T?> TryGetAsync<T>(
            SurfaceCacheKey key,
            Func<ReadOnlyMemory<byte>, T> deserializer,
            CancellationToken cancellationToken = default)
        {
            if (_entries.TryGetValue(key, out var payload))
            {
                return Task.FromResult<T?>(deserializer(payload));
            }

            return Task.FromResult<T?>(default);
        }

        public Task SetAsync(
            SurfaceCacheKey key,
            ReadOnlyMemory<byte> payload,
            CancellationToken cancellationToken = default)
        {
            _entries[key] = payload.ToArray();
            return Task.CompletedTask;
        }

        private async Task<T> CreateAsync<T>(
            SurfaceCacheKey key,
            Func<CancellationToken, Task<T>> factory,
            Func<T, ReadOnlyMemory<byte>> serializer,
            CancellationToken cancellationToken)
        {
            var value = await factory(cancellationToken).ConfigureAwait(false);
            _entries[key] = serializer(value).ToArray();
            return value;
        }
    }

    private sealed class TestSurfaceManifestPublisher : ISurfaceManifestPublisher
    {
        private readonly string _tenant;
        private readonly JsonSerializerOptions _options = new(JsonSerializerDefaults.Web)
        {
            WriteIndented = false,
            DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
        };

        public TestSurfaceManifestPublisher(string tenant = "tenant-a")
        {
            _tenant = tenant;
        }

        public int PublishCalls { get; private set; }

        public SurfaceManifestRequest? LastRequest { get; private set; }

        public string? LastManifestDigest { get; private set; }

        public Task<SurfaceManifestPublishResult> PublishAsync(SurfaceManifestRequest request, CancellationToken cancellationToken)
        {
            PublishCalls++;
            LastRequest = request;

            var artifacts = request.Payloads.Select(payload =>
            {
                var digest = ComputeDigest(payload.Content.Span);
                return new SurfaceManifestArtifact
                {
                    Kind = payload.Kind,
                    Uri = $"cas://test/{payload.Kind}/{digest}",
                    Digest = digest,
                    MediaType = payload.MediaType,
                    Format = payload.ArtifactFormat.ToString().ToLowerInvariant(),
                    SizeBytes = payload.Content.Length,
                    View = payload.View,
                    Metadata = payload.Metadata,
                    Storage = new SurfaceManifestStorage
                    {
                        Bucket = "test-bucket",
                        ObjectKey = $"objects/{digest}",
                        SizeBytes = payload.Content.Length,
                        ContentType = payload.MediaType
                    }
                };
            }).ToImmutableArray();

            var document = new SurfaceManifestDocument
            {
                Tenant = _tenant,
                ImageDigest = request.ImageDigest,
                ScanId = request.ScanId,
                GeneratedAt = DateTimeOffset.UtcNow,
                Source = new SurfaceManifestSource
                {
                    Component = request.Component,
                    Version = request.Version,
                    WorkerInstance = request.WorkerInstance,
                    Attempt = request.Attempt
                },
                Artifacts = artifacts
            };

            var manifestBytes = JsonSerializer.SerializeToUtf8Bytes(document, _options);
            var manifestDigest = ComputeDigest(manifestBytes);
            LastManifestDigest = manifestDigest;

            var result = new SurfaceManifestPublishResult(
                ManifestDigest: manifestDigest,
                ManifestUri: $"cas://test/manifests/{manifestDigest}",
                ArtifactId: $"surface-manifest::{manifestDigest}",
                Document: document);

            return Task.FromResult(result);
        }

        private static string ComputeDigest(ReadOnlySpan<byte> content)
        {
            Span<byte> hash = stackalloc byte[32];
            SHA256.HashData(content, hash);
            return $"sha256:{Convert.ToHexString(hash).ToLowerInvariant()}";
        }
    }

    private sealed class TestSurfaceEnvironment : ISurfaceEnvironment
    {
        public TestSurfaceEnvironment(string tenant)
        {
            var cacheRoot = new DirectoryInfo(Path.Combine(Path.GetTempPath(), "surface-cache-test"));
            Settings = new SurfaceEnvironmentSettings(
                SurfaceFsEndpoint: new Uri("https://surface.local"),
                SurfaceFsBucket: "test-bucket",
                SurfaceFsRegion: null,
                CacheRoot: cacheRoot,
                CacheQuotaMegabytes: 512,
                PrefetchEnabled: false,
                FeatureFlags: Array.Empty<string>(),
                Secrets: new SurfaceSecretsConfiguration("none", tenant, null, null, null, false),
                Tenant: tenant,
                Tls: new SurfaceTlsConfiguration(null, null, null));
        }

        public SurfaceEnvironmentSettings Settings { get; }

        public IReadOnlyDictionary<string, string> RawVariables { get; } = new Dictionary<string, string>();
    }

    private sealed class FakeJobLease : IScanJobLease
    {
        private readonly Dictionary<string, string> _metadata = new()
        {
            ["queue"] = "tests",
            ["job.kind"] = "unit"
        };

        public string JobId { get; } = Guid.NewGuid().ToString("n");

        public string ScanId { get; } = $"scan-{Guid.NewGuid():n}";

        public int Attempt { get; } = 1;

        public DateTimeOffset EnqueuedAtUtc { get; } = DateTimeOffset.UtcNow.AddMinutes(-1);

        public DateTimeOffset LeasedAtUtc { get; } = DateTimeOffset.UtcNow;

        public TimeSpan LeaseDuration { get; } = TimeSpan.FromMinutes(5);

        public IReadOnlyDictionary<string, string> Metadata => _metadata;

        public ValueTask RenewAsync(CancellationToken cancellationToken) => ValueTask.CompletedTask;

        public ValueTask CompleteAsync(CancellationToken cancellationToken) => ValueTask.CompletedTask;

        public ValueTask AbandonAsync(string reason, CancellationToken cancellationToken) => ValueTask.CompletedTask;

        public ValueTask PoisonAsync(string reason, CancellationToken cancellationToken) => ValueTask.CompletedTask;

        public ValueTask DisposeAsync() => ValueTask.CompletedTask;
    }
}
@@ -0,0 +1,61 @@
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics.Metrics;
using System.Globalization;
using StellaOps.Scanner.Worker.Diagnostics;

namespace StellaOps.Scanner.Worker.Tests.TestInfrastructure;

public sealed class WorkerMeterListener : IDisposable
{
    private readonly MeterListener _listener;

    public ConcurrentBag<Measurement> Measurements { get; } = new();

    public WorkerMeterListener()
    {
        _listener = new MeterListener
        {
            InstrumentPublished = (instrument, listener) =>
            {
                if (instrument.Meter.Name == ScannerWorkerInstrumentation.MeterName)
                {
                    listener.EnableMeasurementEvents(instrument);
                }
            }
        };

        _listener.SetMeasurementEventCallback<double>((instrument, measurement, tags, state) =>
        {
            AddMeasurement(instrument, measurement, tags);
        });

        _listener.SetMeasurementEventCallback<long>((instrument, measurement, tags, state) =>
        {
            AddMeasurement(instrument, measurement, tags);
        });
    }

    public void Start() => _listener.Start();

    public void Dispose() => _listener.Dispose();

    public sealed record Measurement(string InstrumentName, double Value, IReadOnlyDictionary<string, object?> Tags)
    {
        public object? this[string name] => Tags.TryGetValue(name, out var value) ? value : null;
    }

    private void AddMeasurement<T>(Instrument instrument, T measurement, ReadOnlySpan<KeyValuePair<string, object?>> tags)
        where T : struct, IConvertible
    {
        var tagDictionary = new Dictionary<string, object?>(tags.Length, StringComparer.Ordinal);
        foreach (var tag in tags)
        {
            tagDictionary[tag.Key] = tag.Value;
        }

        var value = Convert.ToDouble(measurement, CultureInfo.InvariantCulture);
        Measurements.Add(new Measurement(instrument.Name, value, tagDictionary));
    }
}
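For context, a test can capture worker metrics with this listener and assert on the recorded measurements. The sketch below is illustrative only: the instrument name `scanner.worker.jobs.completed` and the `queue` tag are assumptions standing in for whatever `ScannerWorkerInstrumentation` actually emits.

```csharp
// Hypothetical usage sketch — instrument and tag names are illustrative.
using var listener = new WorkerMeterListener();
listener.Start();

// ... drive the worker scenario under test here ...

// Sum every recorded value for an assumed counter instrument.
var completed = listener.Measurements
    .Where(m => m.InstrumentName == "scanner.worker.jobs.completed")
    .Sum(m => m.Value);

Assert.True(completed >= 1);
```

Because `Measurements` is a `ConcurrentBag<Measurement>`, assertions like this remain safe even when the worker publishes measurements from background threads.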
@@ -1,19 +1,19 @@
-using System;
-using System.Collections.Concurrent;
-using System.Collections.Generic;
-using System.Diagnostics.Metrics;
-using System.Linq;
-using System.Threading;
-using System.Threading.Tasks;
-using Microsoft.Extensions.DependencyInjection;
-using Microsoft.Extensions.Logging;
-using Microsoft.Extensions.Options;
-using Microsoft.Extensions.Time.Testing;
-using StellaOps.Scanner.Worker.Diagnostics;
-using StellaOps.Scanner.Worker.Hosting;
-using StellaOps.Scanner.Worker.Options;
-using StellaOps.Scanner.Worker.Processing;
-using Xunit;
+using System.Linq;
+using System.Threading;
+using System.Threading.Tasks;
+using Microsoft.Extensions.DependencyInjection;
+using Microsoft.Extensions.Logging;
+using Microsoft.Extensions.Options;
+using Microsoft.Extensions.Time.Testing;
+using StellaOps.Scanner.Worker.Diagnostics;
+using StellaOps.Scanner.Worker.Hosting;
+using StellaOps.Scanner.Worker.Options;
+using StellaOps.Scanner.Worker.Processing;
+using StellaOps.Scanner.Worker.Tests.TestInfrastructure;
+using Xunit;

 namespace StellaOps.Scanner.Worker.Tests;

@@ -48,8 +48,8 @@ public sealed class WorkerBasicScanScenarioTests
         var scheduler = new ControlledDelayScheduler();
         var analyzer = new TestAnalyzerDispatcher(scheduler);

-        using var listener = new WorkerMetricsListener();
-        listener.Start();
+        using var listener = new WorkerMeterListener();
+        listener.Start();

         using var services = new ServiceCollection()
             .AddLogging(builder =>
@@ -341,46 +341,6 @@ public sealed class WorkerBasicScanScenarioTests
         }
     }

-    private sealed class WorkerMetricsListener : IDisposable
-    {
-        private readonly MeterListener _listener;
-        public ConcurrentBag<Measurement> Measurements { get; } = new();
-
-        public WorkerMetricsListener()
-        {
-            _listener = new MeterListener
-            {
-                InstrumentPublished = (instrument, listener) =>
-                {
-                    if (instrument.Meter.Name == ScannerWorkerInstrumentation.MeterName)
-                    {
-                        listener.EnableMeasurementEvents(instrument);
-                    }
-                }
-            };
-
-            _listener.SetMeasurementEventCallback<double>((instrument, measurement, tags, state) =>
-            {
-                var tagDictionary = new Dictionary<string, object?>(tags.Length, StringComparer.Ordinal);
-                foreach (var tag in tags)
-                {
-                    tagDictionary[tag.Key] = tag.Value;
-                }
-
-                Measurements.Add(new Measurement(instrument.Name, measurement, tagDictionary));
-            });
-        }
-
-        public void Start() => _listener.Start();
-
-        public void Dispose() => _listener.Dispose();
-    }
-
-    public sealed record Measurement(string InstrumentName, double Value, IReadOnlyDictionary<string, object?> Tags)
-    {
-        public object? this[string name] => Tags.TryGetValue(name, out var value) ? value : null;
-    }
-
     private sealed class TestLoggerProvider : ILoggerProvider
     {
         private readonly ConcurrentQueue<TestLogEntry> _entries = new();
@@ -0,0 +1,31 @@
using System.Text.Json.Serialization;

namespace StellaOps.TaskRunner.Core.Configuration;

public static class TaskRunnerStorageModes
{
    public const string Filesystem = "filesystem";
    public const string Mongo = "mongo";
}

public sealed class TaskRunnerStorageOptions
{
    public string Mode { get; set; } = TaskRunnerStorageModes.Filesystem;

    public TaskRunnerMongoOptions Mongo { get; set; } = new();
}

public sealed class TaskRunnerMongoOptions
{
    public string ConnectionString { get; set; } = "mongodb://127.0.0.1:27017/stellaops-taskrunner";

    public string? Database { get; set; }

    public string RunsCollection { get; set; } = "pack_runs";

    public string LogsCollection { get; set; } = "pack_run_logs";

    public string ArtifactsCollection { get; set; } = "pack_artifacts";

    public string ApprovalsCollection { get; set; } = "pack_run_approvals";
}
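Assuming standard .NET options binding, these option classes map onto a configuration fragment like the one below. The section path `TaskRunner:Storage` is an assumption (the diff does not show where the options are bound); the property and default values mirror the classes above.

```json
{
  "TaskRunner": {
    "Storage": {
      "Mode": "mongo",
      "Mongo": {
        "ConnectionString": "mongodb://127.0.0.1:27017/stellaops-taskrunner",
        "Database": "stellaops-taskrunner",
        "RunsCollection": "pack_runs",
        "LogsCollection": "pack_run_logs",
        "ArtifactsCollection": "pack_artifacts",
        "ApprovalsCollection": "pack_run_approvals"
      }
    }
  }
}
```

Leaving `Mode` unset keeps the default `filesystem` store; `Database` is nullable, so when omitted the database name would presumably be taken from the connection string.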