feat: Implement policy attestation features and service account delegation
- Added new policy scopes: `policy:publish` and `policy:promote` with interactive-only enforcement.
- Introduced metadata parameters for policy actions: `policy_reason`, `policy_ticket`, and `policy_digest`.
- Enhanced token validation to require fresh authentication for policy attestation tokens.
- Updated grant handlers to enforce policy scope checks and log audit information.
- Implemented service account delegation configuration, including quotas and validation.
- Seeded service accounts during application initialization based on configuration.
- Updated documentation and tasks to reflect new features and changes.
@@ -49,7 +49,7 @@ Authority persists every issued token in MongoDB so operators can audit or revok
- **Redirect URIs** (defaults): `https://console.stella-ops.local/oidc/callback`
- **Post-logout redirect**: `https://console.stella-ops.local/`
- **Tokens**: Access tokens inherit the global 2-minute lifetime; refresh tokens are longer-lived (30 days) and can be exchanged silently via `/token`.
- **Roles**: Assign Authority role `Orch.Viewer` (exposed to tenants as `role/orch-viewer`) when operators need read-only access to Orchestrator telemetry via Console dashboards. Policy Studio ships dedicated roles (`role/policy-author`, `role/policy-reviewer`, `role/policy-approver`, `role/policy-operator`, `role/policy-auditor`) plus the new attestation verbs (`policy:publish`, `policy:promote`) that align with the `policy:*` scope family; issue them per tenant so audit trails remain scoped and interactive attestations stay attributable.
Configuration sample (`etc/authority.yaml.sample`) seeds the client with a confidential secret so Console can negotiate the code exchange on the backend while browsers execute the PKCE dance.
@@ -91,7 +91,8 @@ Resource servers (Concelier WebService, Backend, Agent) **must not** assume in-m
- Client credentials that request `advisory:ingest`, `advisory:read`, `advisory-ai:view`, `advisory-ai:operate`, `advisory-ai:admin`, `vex:ingest`, `vex:read`, `signals:read`, `signals:write`, `signals:admin`, or `aoc:verify` now fail fast when the client registration lacks a tenant hint. Issued tokens are re-validated against persisted tenant metadata, and Authority rejects any cross-tenant replay (`invalid_client`/`invalid_token`), ensuring aggregation-only workloads remain tenant-scoped.
- Client credentials that request `export.viewer`, `export.operator`, or `export.admin` must provide a tenant hint. Requests for `export.admin` also need accompanying `export_reason` and `export_ticket` parameters; Authority returns `invalid_request` when either value is missing and records the denial in token audit events.
- Client credentials that request `notify.viewer`, `notify.operator`, or `notify.admin` must provide a tenant hint. Authority records scope violations when tenancy is missing and emits `authority.notify.scope_violation` audit metadata so operators can trace denied requests.
- Policy Studio scopes (`policy:author`, `policy:review`, `policy:approve`, `policy:operate`, `policy:publish`, `policy:promote`, `policy:audit`, `policy:simulate`, `policy:run`, `policy:activate`) require a tenant assignment; Authority rejects tokens missing the hint with `invalid_client` and records `scope.invalid` metadata for auditing. The `policy:publish`/`policy:promote` scopes are interactive-only and demand additional metadata (see “Policy attestation metadata” below).
- Policy attestation tokens must include three parameters: `policy_reason` (≤512 chars describing why the attestation is being produced), `policy_ticket` (≤128 chars change/request reference), and `policy_digest` (32–128 char hex digest of the policy package). Authority rejects requests missing any value, over the limits, or providing a non-hex digest. Password-grant issuance stamps these values into the resulting token/audit trail and enforces a five-minute fresh-auth window via the `auth_time` claim.
- Task Pack scopes (`packs.read`, `packs.write`, `packs.run`, `packs.approve`) require a tenant assignment; Authority rejects tokens missing the hint with `invalid_client` and logs `authority.pack_scope_violation` metadata for audit correlation.
- **AOC pairing guardrails** – Tokens that request `advisory:read`, `advisory-ai:view`, `advisory-ai:operate`, `advisory-ai:admin`, `vex:read`, or any `signals:*` scope must also request `aoc:verify`. Authority rejects mismatches with `invalid_scope` (e.g., `Scope 'aoc:verify' is required when requesting advisory/advisory-ai/vex read scopes.` or `Scope 'aoc:verify' is required when requesting signals scopes.`) so automation surfaces deterministic errors.
- **Signals ingestion guardrails** – Sensors and services requesting `signals:write`/`signals:admin` must also request `aoc:verify`; Authority records the `authority.aoc_scope_violation` tag when the pairing is missing so operators can trace failing sensors immediately.
@@ -119,6 +120,18 @@ For factory provisioning, issue sensors the **SignalsUploader** role template (`
These registrations are provided as examples in `etc/authority.yaml.sample`. Clone them per tenant (for example `concelier-tenant-a`, `concelier-tenant-b`) so tokens remain tenant-scoped by construction.
### Policy attestation metadata
- **Interactive only.** `policy:publish` and `policy:promote` are restricted to password/device-code flows (Console, CLI) and are rejected when requested via client credentials or app secrets. Tokens inherit the 5-minute fresh-auth window; resource servers reject stale tokens and emit `authority.policy_attestation_validated=false`.
- **Mandatory parameters.** Requests must include:
- `policy_reason` (≤512 chars) — human-readable justification (e.g., “Promote tenant A baseline to production”).
- `policy_ticket` (≤128 chars) — change request / CAB identifier (e.g., `CR-2025-1102`).
- `policy_digest` — lowercase hex digest (32–128 characters) of the policy bundle being published/promoted.
- **Audit surfaces.** On success, the metadata is copied into the access token (`stellaops:policy_reason`, `stellaops:policy_ticket`, `stellaops:policy_digest`, `stellaops:policy_operation`) and recorded in [`authority.password.grant`] audit events as `policy.*` properties.
- **Failure modes.** Missing/blank parameters, over-length values, or non-hex digests trigger `invalid_request` responses and `authority.policy_attestation_denied` audit tags. CLI/Console must bubble these errors to operators and provide retry UX.
- **CLI / Console UX.** The CLI stores attestation metadata in `stella.toml` (`authority.policy.publishReason`, `authority.policy.publishTicket`) or accepts `STELLA_POLICY_REASON` / `STELLA_POLICY_TICKET` / `STELLA_POLICY_DIGEST` environment variables. Console prompts operators for the same trio before issuing attestation tokens and refuses to cache values longer than the session.
- **Automation guidance.** CI workflows should compute the policy digest ahead of time (for example `sha256sum policy-package.tgz | cut -d' ' -f1`) and inject the reason/ticket/digest into CLI environment variables immediately before invoking `stella auth login --scope policy:publish`.
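The automation flow above can be sketched end-to-end: compute the digest, export the trio as environment variables, and run client-side pre-checks that mirror Authority's documented limits before the interactive login. The stand-in bundle, the `fail` helper, and the pre-check logic are illustrative assumptions; real enforcement happens server-side, and the `stella auth login` call is left commented because it requires interactive authentication.

```shell
# Stand-in for the built policy bundle; in CI this is the real artifact.
printf 'example policy bundle' > policy-package.tgz

# Compute the digest ahead of time and surface the trio as environment
# variables immediately before the CLI login.
STELLA_POLICY_DIGEST="$(sha256sum policy-package.tgz | cut -d' ' -f1)"
export STELLA_POLICY_DIGEST
export STELLA_POLICY_REASON="Promote tenant A baseline to production"
export STELLA_POLICY_TICKET="CR-2025-1102"

# Client-side pre-checks mirroring the documented limits (Authority still
# enforces them server-side and returns invalid_request on violations).
fail() { echo "invalid_request: $1" >&2; exit 1; }
[ -n "$STELLA_POLICY_REASON" ] || fail "policy_reason is blank"
[ "${#STELLA_POLICY_REASON}" -le 512 ] || fail "policy_reason exceeds 512 chars"
[ -n "$STELLA_POLICY_TICKET" ] || fail "policy_ticket is blank"
[ "${#STELLA_POLICY_TICKET}" -le 128 ] || fail "policy_ticket exceeds 128 chars"
len="${#STELLA_POLICY_DIGEST}"
{ [ "$len" -ge 32 ] && [ "$len" -le 128 ]; } || fail "policy_digest length out of range"
case "$STELLA_POLICY_DIGEST" in
  *[!0-9a-f]*) fail "policy_digest is not lowercase hex" ;;
esac
echo "attestation parameters ok"

# stella auth login --scope policy:publish   # interactive; 5-minute fresh-auth window
```

Failing any pre-check early gives operators the same `invalid_request` wording they would see from Authority, which keeps CI logs actionable.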
Graph Explorer introduces dedicated scopes: `graph:write` for Cartographer build jobs, `graph:read` for query/read operations, `graph:export` for long-running export downloads, and `graph:simulate` for what-if overlays. Assign only the scopes a client actually needs to preserve least privilege—UI-facing clients should typically request read/export access, while background services (Cartographer, Scheduler) require write privileges.
#### Least-privilege guidance for graph clients
@@ -24,6 +24,20 @@
| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
|----|--------|----------|------------|-------------|---------------|
| DOCS-SCANNER-BENCH-62-001 | DONE (2025-11-02) | Docs Guild, Scanner Guild | — | Maintain scanner comparison docs for Trivy, Grype, and Snyk; refresh deep dives and ecosystem matrix with source-linked implementation notes. | Comparison docs updated; matrix synced; deep dives cite source paths and highlight coverage gaps. |
| DOCS-SCANNER-BENCH-62-002 | TODO | Docs Guild, Product Guild | DOCS-SCANNER-BENCH-62-001 | Collect Windows/macOS analyzer demand signals per `docs/benchmarks/scanner/windows-macos-demand.md`. | Demand summary produced; intake form updated; design spike criteria evaluated. |
| DOCS-SCANNER-BENCH-62-003 | TODO | Docs Guild, Product Guild | DOCS-SCANNER-BENCH-62-002 | Capture Python lockfile / editable install requirements and document policy guidance once design completes. | Demand notes merged; policy template drafted. |
| DOCS-SCANNER-BENCH-62-004 | TODO | Docs Guild, Java Analyzer Guild | DOCS-SCANNER-BENCH-62-003 | Document Java lockfile ingestion plan and associated policy templates per `scanning-gaps-stella-misses-from-competitors.md`. | Draft guidance published; policy examples reviewed. |
| DOCS-SCANNER-BENCH-62-005 | TODO | Docs Guild, Go Analyzer Guild | DOCS-SCANNER-BENCH-62-004 | Document Go stripped-binary fallback enrichment guidance once implementation lands. | Docs updated with inferred module policy patterns. |
| DOCS-SCANNER-BENCH-62-006 | TODO | Docs Guild, Rust Analyzer Guild | DOCS-SCANNER-BENCH-62-005 | Document Rust fingerprint enrichment guidance and policy examples. | Docs cover heuristic vs authoritative crate handling. |
| DOCS-SCANNER-BENCH-62-007 | TODO | Docs Guild, Security Guild | DOCS-SCANNER-BENCH-62-006 | Produce secret leak detection documentation (rules, policy templates) once implementation lands. | Docs include rule bundle guidance and policy patterns. |
| DOCS-SCANNER-BENCH-62-008 | TODO | Docs Guild, EntryTrace Guild | DOCS-SCANNER-BENCH-62-007 | Publish EntryTrace explain/heuristic maintenance guide per `scanning-gaps-stella-misses-from-competitors.md`. | Guide covers cadence, contribution workflow, and policy predicates. |
| DOCS-SCANNER-BENCH-62-009 | DONE (2025-11-02) | Docs Guild, Ruby Analyzer Guild | DOCS-SCANNER-BENCH-62-008 | Extend Ruby ecosystem gap analysis in `scanning-gaps-stella-misses-from-competitors.md` with implementation notes, detection tables, and backlog mapping. | Ruby section updated with competitor techniques, task linkage, and scoring rationale. |
| DOCS-SCANNER-BENCH-62-010 | DONE (2025-11-02) | Docs Guild, PHP Analyzer Guild | DOCS-SCANNER-BENCH-62-009 | Document PHP analyzer parity gaps with detection technique tables and policy hooks. | PHP section merged with plan references and backlog linkage. |
| DOCS-SCANNER-BENCH-62-011 | DONE (2025-11-02) | Docs Guild, Language Analyzer Guild | DOCS-SCANNER-BENCH-62-010 | Capture Deno runtime gap analysis versus competitors, including detection/merge strategy table. | Deno section added with implementation notes and backlog callouts. |
| DOCS-SCANNER-BENCH-62-012 | DONE (2025-11-02) | Docs Guild, Language Analyzer Guild | DOCS-SCANNER-BENCH-62-011 | Add Dart ecosystem comparison and task linkage in `scanning-gaps-stella-misses-from-competitors.md`. | Dart section present with detection table, backlog references, and scoring. |
| DOCS-SCANNER-BENCH-62-013 | DONE (2025-11-02) | Docs Guild, Swift Analyzer Guild | DOCS-SCANNER-BENCH-62-012 | Expand Swift coverage analysis with implementation techniques and policy considerations. | Swift section integrated with detection table and backlog references. |
| DOCS-SCANNER-BENCH-62-014 | DONE (2025-11-02) | Docs Guild, Runtime Guild | DOCS-SCANNER-BENCH-62-013 | Detail Kubernetes/VM target coverage gaps and interplay with Zastava/Runtime docs. | Target coverage section merged with detection/merging approach and action items. |
| DOCS-SCANNER-BENCH-62-015 | DONE (2025-11-02) | Docs Guild, Export Center Guild | DOCS-SCANNER-BENCH-62-014 | Document DSSE/Rekor operator enablement guidance from competitor comparison. | DSSE section aligned with Export Center backlog and detection merge table. |
## Air-Gapped Mode (Epic 16)
| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
@@ -4,20 +4,20 @@
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| .NET | Dependency retrieval | Snyk | No pre-build lock/config ingestion (installed `deps.json` only). | No runtime graph; ignores `runtimeconfig`/installed assemblies. | Relies on Syft `deps.json` catalogs; no layer-aware runtime context. | Requires authenticated SaaS analysis; projects often need restore/build before scanning. | Evaluate adding lockfile analyzer parity (track via Scanner .NET guild tasks). | [dotnet.md](dotnet.md) |
| .NET | Runtime metadata & signing | StellaOps | Authenticode inspection optional; Windows-only coverage pending. | Does not capture signer metadata or assembly hashes. | No authenticode or RID metadata captured; package fields only. | No runtimeconfig/authenticode data; focuses on dependency manifests. | Harden Authenticode integration & document Windows variants. | [dotnet.md](dotnet.md) |
| Node.js | Workspace & pnpm resolution | Tie (StellaOps / Snyk) | Lack of pnpm lock validator tooling for CLI users. | pnpm virtual store resolved only via lockfile semantics; skips actual installs. | Depends on Syft catalogers; lacks pnpm workspace policy controls or dedupe tuning. | Manifest-based plugins (npm/yarn/pnpm) send dependency graphs to Snyk API; offline unsupported. | Track lockfile validator plan (`scanning-gaps-stella-misses-from-competitors.md`) and file analyzer/CLI backlog items. | [nodejs.md](nodejs.md) |
| Node.js | Usage tracking | StellaOps | EntryTrace launcher catalog requires regular updates. | No runtime usage model; inventory-only. | No runtime usage modelling; reports inventory only. | No runtime usage modelling (inventory only). | Establish cadence for launcher catalog review (EntryTrace TASKS). | [nodejs.md](nodejs.md) |
| Python | Evidence source | Tie (StellaOps / Trivy) | Build-only repos need supplemental workflow. | Accepts stale lockfiles; installed evidence optional. | Leverages Syft-installed metadata; build-only projects need external flow. | Requires language environment & build; manifest graph sent to Snyk service. | Track Python lockfile parity plan (`scanning-gaps-stella-misses-from-competitors.md`) in analyzer backlog. | [python.md](python.md) |
| Python | Usage awareness | StellaOps | EntryTrace hints dependent on shell heuristic coverage. | Missing runtime usage context entirely. | No runtime usage awareness. | No runtime usage metadata. | Expand EntryTrace shell heuristic coverage. | [python.md](python.md) |
| Java | Archive evidence | Tie (StellaOps / Snyk) | Gradle/SBT lockfiles out of scope; relies on observed archives. | No archive hash locators; depends on Java DB availability. | Relies on Syft archive metadata without manifest hashing/attestation. | Relies on Maven/Gradle plugins; no archive hashing or offline support. | Execute Java lockfile plan (`scanning-gaps-stella-misses-from-competitors.md`) and log analyzer/CLI backlog items. | [java.md](java.md) |
| Go | Stripped binaries | StellaOps | Fallback components limited to hash + binary metadata. | Drops binaries lacking build info; no fallback reporting. | Skips pseudo-version binaries without build info; no hashed fallback. | Go plugin inspects modules via manifests; binaries without modules not analysed. | Execute Go fallback enrichment plan (`scanning-gaps-stella-misses-from-competitors.md`) to add inferred metadata & policy hooks. | [golang.md](golang.md) |
| Rust | Binary heuristics | StellaOps | Fingerprint coverage incomplete for niche toolchains. | Unmatched binaries ignored; no fallback crates. | No fallback for binaries lacking Cargo metadata; depends on Syft crate data. | No Rust/Cargo support in CLI plugins. | Execute Rust fingerprint plan (`scanning-gaps-stella-misses-from-competitors.md`) and update analyzer backlog. | [rust.md](rust.md) |
| OS packages | Linux distro coverage & provenance | Tie (StellaOps / Grype) | Requires RustFS/object store deployment for full replay; Windows packaging still out of scope. | No per-layer fragment storage; provenance limited; Windows support likewise minimal. | No per-layer provenance; shares Syft catalog and Anchore DB only. | Snyk Container scanning depends on SaaS API; no per-layer provenance. | Document RustFS dependency & offline alternatives in ops backlog; evaluate Windows pkg roadmap. | [os-packages.md](os-packages.md) |
| OS packages | Linux flavor support (Alpine/Wolfi/Chainguard, Debian/Ubuntu, RHEL/Alma/Rocky, SUSE, Amazon/Bottlerocket) | Tie (Trivy / Snyk) | Windows/macOS package ecosystems still pending. | Coverage relies on package DB adapters; per-distro nuances (e.g., Chainguard signatures) not attested. | Supports major Linux feeds but no Windows/macOS package analyzers. | Supports documented distro list via Snyk Container but requires cloud connectivity. | Track demand for non-Linux package analyzers; document distro mapping in os-packages deep dive. | [os-packages.md](os-packages.md) |
| OS packages | Windows/macOS coverage | — | No Windows/macOS analyzer; backlog item for offline parity. | Coverage docs enumerate Linux distributions only; Windows/macOS packages unsupported. | Syft matchers focus on Linux ecosystems; Windows/macOS packages unsupported. | Coverage depends on Snyk’s SaaS service; no offline assurance for Windows/macOS packages. | Capture demand for Windows/macOS analyzers (see `docs/benchmarks/scanner/windows-macos-demand.md`) and scope feasibility. | [os-packages.md](os-packages.md) |
| Secrets | Handling posture | StellaOps | No leak scanning by design; Surface.Secrets manages retrieval/rotation with tenant scopes. | Leak detections lack governance hooks; operators must track rule updates. | No secret management abstraction; credentials configured manually. | Requires SaaS backend for secret scanning; no offline posture or secret storage guidance. | Document governance patterns for Surface.Secrets users and recommended companion tooling. | [secrets.md](secrets.md) |
| Secrets | Detection technique | Trivy | No content scanning; relies on Surface.Secrets integrations. | Regex/entropy detectors with configurable allow/deny lists across files/bytecode. | No detector available; Syft/Grype skip leak scanning entirely. | Snyk Code/Snyk secrets require uploading code to SaaS; offline detection unavailable. | Execute secrets leak detection plan (`scanning-gaps-stella-misses-from-competitors.md`) and plan policy templates. | [secrets.md](secrets.md) |
| EntryTrace | Runtime command resolution | StellaOps | Shell/language launcher coverage needs continuous tuning. | Not supported. | Not available. | Not available. | Maintain EntryTrace plan (`scanning-gaps-stella-misses-from-competitors.md`) and backlog cadence. | — |
| DSSE / Rekor | Attested SBOM/report signing | StellaOps | Rekor v2 adoption requires operator enablement guidance. | Not supported. | No attestation or transparency log integration. | No attestation workflow. | Already covered by Export Center backlog (no additional plan required). | — |
| Ruby | Language analyzer parity | Snyk | No Ruby analyzer implementation yet. | Lacks runtime usage/EntryTrace integration. | Supports Ruby via matcher but lacks runtime usage/attestation. | Supported through rubygems plugin (SaaS dependency graph). | Prioritise Ruby analyzer work (see `src/Scanner/StellaOps.Scanner.Analyzers.Lang.Ruby/TASKS.md`). | — |
| PHP | Language analyzer parity | Snyk | No PHP analyzer implementation yet. | No usage or evidence beyond lockfiles. | Composer handled via generic matcher; no runtime evidence. | Supported through PHP Composer plugin (requires Snyk API). | Track PHP analyzer backlog (`...Lang.Php/TASKS.md`). | — |
| Deno | Language analyzer parity | Trivy | Analyzer not yet implemented (tasks pending). | None (lockfile support limited but present). | No Deno support. | No Deno plugin. | Execute Deno analyzer epics in `...Lang.Deno/TASKS.md`. | — |
@@ -25,4 +25,4 @@
| Swift | Language analyzer parity | Snyk | No Swift support today. | Supports Package.resolved parsing but no runtime usage. | No Swift support. | Supported via swift plugin but SaaS-only. | Evaluate need for Swift analyzer based on customer demand. | — |
| SAST | Application code analysis | Snyk | No built-in SAST engine. | No SAST engine (focus on vuln & config). | No SAST support (SBOM matching only). | Requires SaaS upload of code; privacy considerations. | Evaluate integration points with existing SAST tooling / document partner options. | [sast.md](sast.md) |
| IaC / Misconfiguration | Built-in scanning | Snyk | No misconfiguration analyzer (policy engine focuses on runtime evidence). | Ships IaC scanning but lacks deterministic replay. | No IaC or misconfiguration scanners (vulnerability-only). | Handled via Snyk IaC (`snyk iac test`) with SaaS policy engine. | Coordinate with Policy/Surface guild on IaC roadmap assessment. | — |
| Kubernetes / VM targets | Target coverage breadth | Tie (Trivy / Snyk) | Scanner limited to images/filesystems; relies on other modules for runtime posture. | Supported but lacks attestation pipeline. | Scans images/filesystems; no live cluster or VM state analysis. | Snyk Container/K8s scanning available but cloud-managed; no offline runtime attestation. | Document complementary modules (Zastava/Runtime) in comparison appendix. | — |
@@ -0,0 +1,482 @@
# Scanning Gaps — Competitor Techniques Missing in StellaOps
## .NET lockfile ingestion (Trivy, Snyk)
### Scorecard
| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 4 | Enterprise tenants request pre-build dependency evidence for audits. |
| Competitive risk | 4 | Trivy and Snyk already parse NuGet lockfiles. |
| Engineering effort | 3 | Collector plus CLI toggle is moderate effort. |
| Policy/config impact | 4 | Policies must handle declared-only components. |
| Offline/air-gap impact | 3 | Offline-friendly; bundle size increases slightly. |
- **Competitor capability**: Trivy parses `packages.lock.json` / `packages.config` and Snyk uploads manifest graphs, enabling pre-build dependency visibility.
- **StellaOps gap**: Scanner currently inspects installed artifacts only (deps/runtimeconfig/assemblies); lockfiles are ignored.
- **Proposed plan**:
1. Add optional lockfile collectors under `StellaOps.Scanner.Analyzers.Lang.DotNet` that parse NuGet lockfiles without requiring restore, emitting auxiliary component records linked to installation evidence when present.
2. Extend Surface.Validation to gate lockfile parsing (size, tenant policy) and Surface.Secrets for private feed credentials when resolving lockfile registries.
3. Feed parsed lockfile metadata into Policy Engine via a new evidence flag so policy can distinguish “declared but not installed” dependencies.
4. Provide CLI toggle (`--dotnet-lockfiles`) and document policy defaults (fail if declarations lack runtime evidence unless waiver).
- **Policy considerations**: Introduce policy template allowing tenants to require lockfile parity or suppress pre-build-only components; leverage existing lattice logic to down-rank vulnerabilities lacking runtime evidence.
- **Next actions**: open analyzer story (SCANNER-ANALYZERS-LANG-DOTNET) and doc task for policy guidance once design is finalized.
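To make the "declared but not installed" idea concrete, here is a minimal sketch of the evidence the planned `DotNetLockfileCollector` would derive from a NuGet lockfile. The lockfile snippet and the grep/sed extraction are illustrative assumptions, not the real collector; the proposed `--dotnet-lockfiles` flag comes from the plan above and does not exist yet.

```shell
# Hypothetical packages.lock.json (trimmed to one direct dependency).
cat > packages.lock.json <<'EOF'
{
  "version": 1,
  "dependencies": {
    "net8.0": {
      "Newtonsoft.Json": { "type": "Direct", "resolved": "13.0.3" }
    }
  }
}
EOF

# Emit package id + resolved version as DeclaredOnly evidence; the aggregator
# would later merge this with installed deps.json evidence by id + version.
grep -o '"resolved": "[^"]*"' packages.lock.json \
  | sed 's/.*"resolved": "\(.*\)"/declared-only: Newtonsoft.Json@\1/'

# Proposed opt-in once the collector lands (flag from the plan above):
#   stella scan --dotnet-lockfiles ...
```

Installed evidence would supersede any matching declared-only record, so policy only sees the weaker provenance when no runtime artifact backs it.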
### Implementation details
- Collectors live under `StellaOps.Scanner.Analyzers.Lang.DotNet`:
- `DotNetDependencyCollector` (existing) resolves installed assemblies via `*.deps.json`, `*.runtimeconfig.json`, and manifest metadata.
- New `DotNetLockfileCollector` (plan) will parse `packages.lock.json` / `packages.config` without executing restore, emitting records flagged `DeclaredOnly`.
- Surface integrations:
- `Surface.Validation` controls lockfile parsing size, repository allowlists, and opt-in behaviour.
- `Surface.Secrets` provides credentials for private NuGet feeds referenced in lockfiles.
- Merging pipeline:
- `DotNetPackageAggregator` will merge installed and declared records by package key (id + version) with precedence rules (installed evidence supersedes declared-only).
- Policy Engine receives both authoritative and declared-only evidence, enabling parity checks and waivers.
### Detection techniques
| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Installed runtime evidence | `*.deps.json`, `*.runtimeconfig.json`, assemblies, authenticode metadata | `DotNetDependencyCollector`, `DotNetPackageAggregator`, optional `IDotNetAuthenticodeInspector` | Produces authoritative components (inventory + usage) keyed by assembly path and package id. |
| Lockfile ingestion (planned) | `packages.lock.json`, `packages.config`, restore graph metadata | `DotNetLockfileCollector` (new), integrated into `DotNetPackageAggregator` | Emits `DeclaredOnly` components; merged with installed evidence by package id/version; unresolved entries flagged for policy. |
| Runtime usage linkage | EntryTrace outputs | `EntryTrace` integration via `LanguageAnalyzerContext.UsageHints` | Marks components used by entrypoint closure; policy and explain traces highlight runtime relevance. |
## Node.js pnpm lock validation (Trivy, Snyk)
### Scorecard
| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 3 | Monorepo customers asking for pnpm parity; moderate demand. |
| Competitive risk | 4 | Competitors advertise pnpm support, creating parity pressure. |
| Engineering effort | 3 | Collector and CLI work similar to .NET lockfile effort. |
| Policy/config impact | 4 | Requires policy predicates and explain updates. |
| Offline/air-gap impact | 3 | Offline-compatible with additional parser rules bundled. |
- **Competitor capability**: Trivy and Snyk parse pnpm/yarn/npm lockfiles even when `node_modules` is absent, surfacing dependency graphs pre-install for policy gating.
- **StellaOps gap**: Scanner requires installed artifacts; there is no CLI helper to validate pnpm lockfiles or compare declared vs installed packages ahead of a scan.
- **Proposed plan**:
1. Introduce a lockfile-only collector under `StellaOps.Scanner.Analyzers.Lang.Node` that decodes `pnpm-lock.yaml`, `package-lock.json`, and `yarn.lock`, emitting provisional component records with provenance flag `DeclaredOnly`.
|
||||
2. Expose a CLI verb (`stella node lock-validate`) that runs the collector without enqueuing a full scan, honouring Surface.Validation controls (max lockfile size, allowed registries) and Surface.Secrets for private registries.
|
||||
3. Persist lockfile-derived dependencies alongside installed evidence so Policy Engine can enforce parity via new predicates (e.g., `node.lock.declaredMissing`, `node.lock.registryDisallowed`).
|
||||
4. Extend EntryTrace explain output (and policy explain traces) to highlight packages present in runtime closure but missing from lockfiles—or declared-only artifacts not shipped in the image.
|
||||
- **Policy considerations**: Provide sample policies that fail builds when lockfiles reference disallowed registries or when declared packages lack runtime evidence; use lattice weighting to downgrade issues limited to declared-only components.
|
||||
- **Next actions**: open analyzer story (SCANNER-ANALYZERS-LANG-NODE) plus CLI story for lock validation, and schedule Docs Guild update covering the new policy predicates.
|
||||
|
||||
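The declared-vs-installed parity check behind step 3 can be sketched as follows. This is a minimal Python model of the planned behaviour, not the shipped collector: the data shapes and the third predicate (`node.lock.undeclaredInstalled`) are illustrative assumptions, while `node.lock.declaredMissing` and `node.lock.registryDisallowed` mirror the predicates proposed above.

```python
# Sketch of the parity check that would back `stella node lock-validate`.
# Data shapes and the undeclaredInstalled predicate are assumptions.

ALLOWED_REGISTRIES = {"https://registry.npmjs.org"}

def lock_validate(declared, installed):
    """declared/installed map (name, version) -> source registry URL."""
    findings = []
    for key, registry in declared.items():
        if registry not in ALLOWED_REGISTRIES:
            findings.append(("node.lock.registryDisallowed", key))
        if key not in installed:
            # Declared in the lockfile but absent from node_modules:
            # would surface as a DeclaredOnly component for policy review.
            findings.append(("node.lock.declaredMissing", key))
    for key in installed:
        if key not in declared:
            # Shipped in the image but never declared (hypothetical predicate).
            findings.append(("node.lock.undeclaredInstalled", key))
    return sorted(findings)

declared = {
    ("left-pad", "1.3.0"): "https://registry.npmjs.org",
    ("internal-lib", "2.0.0"): "https://npm.corp.example",
}
installed = {
    ("left-pad", "1.3.0"): "https://registry.npmjs.org",
    ("lodash", "4.17.21"): "https://registry.npmjs.org",
}

for predicate, (name, version) in lock_validate(declared, installed):
    print(f"{predicate}: {name}@{version}")
```

In a real run the `declared` map would come from parsing `pnpm-lock.yaml` and `installed` from the `NodePackageCollector` walk; the findings feed Policy Engine unchanged.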
### Implementation details

- Collectors under `StellaOps.Scanner.Analyzers.Lang.Node`:
  - Existing `NodePackageCollector` walks `package.json` evidence across workspaces.
  - Planned `NodeLockfileCollector` will parse `pnpm-lock.yaml`, `package-lock.json`, `yarn.lock`.
- Surface integrations:
  - `Surface.Validation` to constrain lockfile size, allowed registries, and CLI access for `stella node lock-validate`.
  - `Surface.Secrets` for private registry credentials when validating lockfiles.
- Merge strategy:
  - `LanguageAnalyzerContext` merges installed and declared components; declared-only entries are flagged (`DeclaredOnly`) and kept for policy comparison.
  - EntryTrace usage hints link runtime scripts to packages, influencing policy weights and explain traces.

### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Installed package evidence | `package.json` + `node_modules` tree | `NodePackageCollector` | Produces authoritative components with install paths and workspace metadata. |
| Lockfile ingestion (planned) | `pnpm-lock.yaml`, `package-lock.json`, `yarn.lock` | `NodeLockfileCollector` (new) | Emits declared dependency graph with provenance; merged by package name/version; discrepancies flagged for policy. |
| Runtime usage linkage | EntryTrace launcher catalog (npm/yarn scripts, node entrypoints) | `EntryTrace` + `LanguageAnalyzerContext.UsageHints` | Annotates components used at runtime; unresolved launchers produce explain-trace diagnostics. |

## Python lockfile & editable install coverage (Trivy, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 3 | Editable install coverage requested by regulated Python users. |
| Competitive risk | 3 | Trivy supports multiple lock formats; Snyk SaaS highlights poetry support. |
| Engineering effort | 3 | Collector and editable path resolution are moderate effort. |
| Policy/config impact | 4 | Policy needs knobs for declared-only vs runtime packages. |
| Offline/air-gap impact | 3 | Offline workable while packaging parser tooling. |

- **Competitor capability**: Trivy parses Poetry/Pipenv/pip lockfiles (including editable installs) and Snyk uploads manifest graphs, exposing declared dependencies even when virtualenvs are absent.
- **StellaOps gap**: Scanner relies on installed `dist-info` metadata; editable installs or source-only lockfiles are skipped, and there is no automated parity check between declared requirements and runtime usage.
- **Proposed plan**:
  1. Add a lockfile collector in `StellaOps.Scanner.Analyzers.Lang.Python` that reads `poetry.lock`, `Pipfile.lock`, `requirements.txt` (including VCS URLs), tagging results as `DeclaredOnly`.
  2. Detect editable installs by parsing `pyproject.toml` / `setup.cfg`, resolving editable paths with Surface.FS, and linking to EntryTrace usage to ensure runtime awareness.
  3. Provide CLI support (`stella python lock-validate`) to diff declared vs installed artifacts, enforcing Surface.Validation constraints (lockfile size, allowed indexes) and Surface.Secrets for private PyPI mirrors.
  4. Persist declarative evidence separately so Policy Engine can evaluate predicates like `python.lock.declaredMissing` and `python.lock.indexDisallowed`.
  5. Extend explain traces to highlight editable or declared-only packages lacking runtime deployment, aiding remediation.
- **Policy considerations**: Ship policy templates distinguishing declared-only vs runtime packages, with lattice-based weighting to reduce noise when usage is absent; allow tenants to enforce registry allowlists.
- **Next actions**: create analyzer and CLI backlog items in the Python guild, plus Docs Guild task to cover new policy knobs once design is complete.

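Steps 1-2 above amount to extracting declared records (with editable installs flagged) from manifest text. A minimal sketch, assuming the proposed `DeclaredOnly` / `EditablePath` flag names and handling only the `requirements.txt` case; `poetry.lock` and `Pipfile.lock` would get their own parsers:

```python
# Sketch: declared-dependency extraction from requirements.txt text,
# flagging editable installs. Record shape and flag names are assumptions.

import re

def parse_requirements(text):
    """Return (name_or_path, flags) records for each requirement line."""
    records = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith(("-e ", "--editable ")):
            # Editable install: the local path would later be resolved
            # via Surface.FS and linked to EntryTrace usage.
            path = line.split(None, 1)[1]
            records.append((path, {"DeclaredOnly", "EditablePath"}))
        else:
            # Plain requirement, e.g. "requests==2.32.0": keep the name,
            # drop version specifiers, extras, and markers.
            name = re.split(r"[=<>!~\[;]", line, maxsplit=1)[0].strip()
            records.append((name, {"DeclaredOnly"}))
    return records

reqs = """\
# pinned runtime deps
requests==2.32.0
pyyaml>=6.0
-e ./libs/internal-auth
"""
for name, flags in parse_requirements(reqs):
    print(name, sorted(flags))
```

The `python.lock.declaredMissing` check would then diff these records against the installed `*.dist-info` inventory, exactly as in the Node parity sketch.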
### Implementation details

- Collectors under `StellaOps.Scanner.Analyzers.Lang.Python`:
  - Existing analyzer reads installed `*.dist-info` metadata via `PythonDistributionLoader`.
  - Planned lockfile collector parses `poetry.lock`, `Pipfile.lock`, `requirements.txt` (including VCS refs).
- Editable installs:
  - Detect via `pyproject.toml` / `setup.cfg` markers; use `Surface.FS` to resolve local paths and mark components as editable.
- Surface & policy integrations:
  - `Surface.Validation` constrains lockfile size and allowed indexes; `Surface.Secrets` handles private index credentials.
  - Policy Engine receives new flags (`DeclaredOnly`, `EditablePath`) to drive parity checks.
- CLI workflow: `stella python lock-validate` (planned) will reuse collectors without scheduling full scans.

### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Installed distributions | `*.dist-info` directories, RECORD, METADATA | `PythonLanguageAnalyzer` | Produces authoritative components with file hashes and EntryTrace usage hints. |
| Lockfile ingestion (planned) | `poetry.lock`, `Pipfile.lock`, `requirements.txt` | Planned lockfile collector integrated with analyzer | Emits declared dependency graph, tagged `DeclaredOnly`; merged by package name/version. |
| Editable install resolution | Local source directories referenced in lockfiles (`path =`, `editable = true`) | New editable resolver leveraging `Surface.FS` | Links editable packages to actual source paths; policy distinguishes editable vs packaged artefacts. |

## Java build-tool lockfile ingestion (Trivy, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 3 | Platform teams running Gradle/SBT builds request parity for pre-build evidence. |
| Competitive risk | 4 | Trivy supports Gradle/SBT lockfiles and Snyk ships Maven/Gradle/SBT plugins. |
| Engineering effort | 3 | Requires new lockfile collectors plus CLI wiring; moderate complexity. |
| Policy/config impact | 3 | Policy must handle declared-only Java components and disallowed repositories. |
| Offline/air-gap impact | 3 | Works offline but needs packaged parsers and repository allowlists. |

- **Competitor capability**: Trivy parses Gradle/Maven/SBT lockfiles and Snyk relies on dedicated plugins to surface declared dependencies even before artifacts are built.
- **StellaOps gap**: Scanner inspects installed archives only; it ignores Gradle/SBT lockfiles and lacks a workflow to compare declared dependencies against runtime archives.
- **Proposed plan**:
  1. Introduce lockfile collectors under `StellaOps.Scanner.Analyzers.Lang.Java` to parse `gradle.lockfile`, `pom.xml`/`pom.lock`, and `build.sbt` output, emitting `DeclaredOnly` components with repository metadata.
  2. Extend Surface.Validation for Java lockfiles (size limits, allowed repositories) and leverage Surface.Secrets for private Maven repository credentials.
  3. Provide a CLI verb (`stella java lock-validate`) to diff declared vs installed archives without running a full scan, emitting policy-ready diagnostics.
  4. Persist declarative evidence so Policy Engine can evaluate predicates (`java.lock.declaredMissing`, `java.lock.repoDisallowed`) and feed explain traces highlighting gaps.
- **Policy considerations**: Supply templates enforcing repository allowlists and declared-vs-runtime parity, using lattice weights to downgrade issues that never reach runtime.
- **Next actions**: log analyzer/CLI backlog stories with the Java guild and plan Docs Guild updates for new policy knobs once design stabilises.

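The merge described below (declared lockfile entries keyed on `groupId:artifactId:version`, with priority to installed evidence) can be modelled in a few lines. The field names (`provenance`, `declaredRepo`) are illustrative assumptions, not the shipped component model:

```python
# Sketch of the proposed merge on GAV coordinates: installed archive
# evidence wins; lockfile-only entries survive as DeclaredOnly records.

def merge_on_gav(installed, declared):
    """Both inputs map 'group:artifact:version' -> metadata dict."""
    merged = {}
    for gav, meta in installed.items():
        merged[gav] = {**meta, "provenance": "installed"}
    for gav, meta in declared.items():
        if gav in merged:
            # Keep installed evidence, but remember the declaring repository
            # so policy can check java.lock.repoDisallowed.
            merged[gav]["declaredRepo"] = meta.get("repo")
        else:
            # Declared but never shipped: java.lock.declaredMissing territory.
            merged[gav] = {**meta, "provenance": "DeclaredOnly"}
    return merged

installed = {"com.acme:core:1.4.0": {"path": "/app/lib/core-1.4.0.jar"}}
declared = {
    "com.acme:core:1.4.0": {"repo": "https://repo.maven.apache.org/maven2"},
    "com.acme:extras:0.9.0": {"repo": "https://repo.corp.example"},
}

merged = merge_on_gav(installed, declared)
for gav in sorted(merged):
    print(gav, merged[gav]["provenance"])
```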
### Implementation details

- Collectors under `StellaOps.Scanner.Analyzers.Lang.Java`:
  - Existing analyzer normalises installed JAR/WAR/EAR archives and extracts `MANIFEST.MF`, `pom.properties`.
  - Planned lockfile collectors will ingest `gradle.lockfile`, Maven `pom.xml`/`pom.lock`, and `build.sbt` outputs.
- Surface integrations:
  - `Surface.Validation` enforces lockfile size and repository allowlists; `Surface.Secrets` supplies credentials for private Maven repositories.
- Merge strategy:
  - New collector emits `DeclaredOnly` components with repository metadata; `JavaLanguageAnalyzer` merges them with observed archives keyed by `groupId:artifactId:version`.
  - EntryTrace usage hints link runtime launchers to archives, enabling policy to prioritise used components.
- CLI tooling:
  - `stella java lock-validate` (planned) exposes lockfile parity checks without full scan scheduling, reusing collectors.

### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Installed archives | JAR/WAR/EAR/PAR files, `MANIFEST.MF`, `pom.properties` | `JavaLanguageAnalyzer` | Produces authoritative components with archive hashes and runtime usage hints. |
| Lockfile ingestion (planned) | `gradle.lockfile`, Maven `pom.xml`/`pom.lock`, SBT build metadata | Planned lockfile collectors integrated with analyzer | Emits declared dependency entries (repository + checksum); merged on GAV coordinates with priority to installed evidence. |
| Runtime linkage | EntryTrace wrapper catalogue (java -jar, jetty, etc.) | `EntryTrace` integration | Marks archives invoked at runtime; unresolved launchers surfaced with remediation hints. |

## Go stripped binary enrichment (Trivy, Grype)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 3 | Teams shipping minimal Go binaries want richer provenance for runtime attestations. |
| Competitive risk | 3 | Trivy/Grype skip hashed fallbacks, but customers compare hashed provenance across tools. |
| Engineering effort | 2 | Extend fallback hashing with symbol inference; low-medium effort. |
| Policy/config impact | 3 | Policy needs knobs to treat inferred modules differently from authoritative results. |
| Offline/air-gap impact | 3 | Offline-friendly; requires bundling symbol parser logic. |

- **Competitor capability**: Trivy and Grype skip binaries without Go build info, leaving stripped binaries without component coverage.
- **StellaOps gap**: StellaOps emits hashed fallback components but lacks inferred module names, confidence scoring, and policy integration.
- **Proposed plan**:
  1. Enhance `GoBinaryScanner` fallback path to parse symbol tables (DWARF/ELF) and infer module/package names, tagging results with confidence metrics.
  2. Persist inferred metadata separately so Policy Engine can weight `go.inferred` components differently from authoritative modules.
  3. Expose CLI detail (`--go-fallback-detail`) and explain trace entries highlighting hashed/inferred provenance for stripped binaries.
  4. Update attestation manifests to surface inferred modules, enabling policy-controlled downgrade rather than omission.
- **Policy considerations**: Extend policy predicates to differentiate authoritative vs inferred Go modules; adjust lattice weights to reduce noise while keeping visibility.
- **Next actions**: create analyzer backlog story for enhanced fallback parsing and Docs Guild task to document policy/CLI behaviour.

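Plan step 1 pairs the existing hashed fallback with symbol-based module inference. A sketch of the idea, under stated assumptions: fully qualified Go symbols vote for a `host/org/repo` module prefix, and confidence is the vote share. The heuristic, threshold, and output fields are all illustrative, not the `GoBinaryScanner` implementation:

```python
# Sketch: infer module names for a stripped Go binary from symbol
# names, alongside the existing sha256 fallback component.

import hashlib
from collections import Counter

def module_of(symbol):
    # Heuristic: module path = first three "/"-separated segments,
    # trimming the function part from the last package segment,
    # e.g. "github.com/acme/mux.(*Router).Handle" -> "github.com/acme/mux".
    parts = symbol.split("/")
    if len(parts) < 3:
        return None
    return "/".join([parts[0], parts[1], parts[2].split(".")[0]])

def infer_modules(binary_bytes, symbols, min_confidence=0.3):
    votes = Counter(m for m in map(module_of, symbols) if m)
    total = sum(votes.values()) or 1
    inferred = [
        {"module": mod, "confidence": round(n / total, 2), "flag": "go.inferred"}
        for mod, n in votes.most_common()
        if n / total >= min_confidence
    ]
    fallback = {"component": "sha256:" + hashlib.sha256(binary_bytes).hexdigest(),
                "flag": "Fallback"}
    return fallback, inferred

symbols = [
    "github.com/acme/mux.(*Router).Handle",
    "github.com/acme/mux.NewRouter",
    "golang.org/x/sync/errgroup.(*Group).Go",
]
fallback, inferred = infer_modules(b"\x7fELF...", symbols)
print(fallback["flag"], [m["module"] for m in inferred])
```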
### Implementation details

- Analyzer: `StellaOps.Scanner.Analyzers.Lang.Go/GoLanguageAnalyzer` currently extracts Go build info (`module`, `buildSettings`) and DWARF metadata when available.
- Fallback enhancements (planned):
  - Extend `GoBinaryScanner` to parse ELF/Mach-O symbol tables when build info is missing.
  - Maintain fingerprint catalogue under `StellaOps.Scanner.Analyzers.Lang.Go.Fingerprints` with signed updates for Offline Kit.
- Surface & policy:
  - `Surface.Validation` governs fallback enablement; configuration stored alongside analyzer options.
  - Policy Engine will recognise inferred components via new flags (e.g., `go.inferred`), influencing lattice weights.
- CLI and explain:
  - Introduce `--go-fallback-detail` to surface hashed vs inferred provenance.
  - Explain traces include confidence scores and recommended remediation (e.g., rebuild with `-buildvcs`).

### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Authoritative build info | Go binary `buildinfo` section, DWARF metadata | `GoLanguageAnalyzer` | Produces authoritative modules with version/build metadata. |
| Fallback hashing | Binary bytes when build info missing | Existing fallback path in `GoBinaryScanner` | Emits hashed component (`sha256:...`) with `Fallback` flag for policy downgrading. |
| Symbol-based inference (planned) | ELF/Mach-O symbols, DWARF line info | Planned enhancement to `GoBinaryScanner` with fingerprint catalogue | Maps symbols to modules/packages, tagging confidence scores; merged with hashed fallback for explainability. |

## Rust fingerprint coverage (Trivy, Grype)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 3 | Regulated teams running Rust microservices want deterministic evidence even for stripped binaries. |
| Competitive risk | 3 | Competitors drop stripped binaries entirely; StellaOps can differentiate by improving heuristics. |
| Engineering effort | 3 | Requires enhancing fingerprint catalogue and symbol inference; moderate effort. |
| Policy/config impact | 3 | Policy needs knobs to treat heuristic vs authoritative crates differently. |
| Offline/air-gap impact | 3 | Offline-compatible; must distribute updated fingerprint datasets with Offline Kit. |

- **Competitor capability**: Trivy and Grype skip Rust binaries lacking Cargo metadata, offering no fallback or runtime insight.
- **StellaOps gap**: Although StellaOps stores hashed fallback and fingerprint components, coverage for niche toolchains and stripped binaries remains limited, reducing explainability.
- **Proposed plan**:
  1. Expand the fingerprint catalogue (`RustAnalyzerCollector`) with additional signature sources (e.g., crate fingerprint DB, community-sourced hash lists) and version inference heuristics.
  2. Parse symbol tables for stripped binaries (DWARF, `--build-id`) to infer crate names and link them to fingerprints, tagging results with confidence scores.
  3. Surface inferred vs authoritative crates distinctly in explain traces and CLI output (`--rust-fingerprint-detail`) so operators know when evidence is heuristic.
  4. Publish policy predicates (`rust.fingerprint.confidence`) allowing tenants to warn/fail when only heuristic evidence exists.
- **Policy considerations**: Extend lattice weights to downgrade heuristic-only findings while still surfacing them; provide policy templates for regulated environments.
- **Next actions**: open analyzer backlog story for fingerprint enrichment, schedule Docs Guild update for policy guidance, and coordinate Offline Kit team to package updated fingerprint datasets.

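The `rust.fingerprint.confidence` predicate from step 4 reduces to a threshold comparison per crate. A minimal sketch, assuming illustrative evidence kinds (`authoritative` vs `fingerprint`) and tenant-configurable warn/fail thresholds; none of these names are the shipped policy schema:

```python
# Sketch: warn/fail evaluation for heuristic-only Rust crate evidence.
# Evidence kinds and thresholds are assumptions for illustration.

def evaluate(components, warn_below=0.8, fail_below=0.4):
    verdicts = {}
    for crate, evidence in components.items():
        if evidence["kind"] == "authoritative":
            verdicts[crate] = "pass"   # Cargo metadata was present.
        elif evidence["confidence"] < fail_below:
            verdicts[crate] = "fail"   # heuristic evidence too weak
        elif evidence["confidence"] < warn_below:
            verdicts[crate] = "warn"   # surfaced, but lattice-downgraded
        else:
            verdicts[crate] = "pass"
    return verdicts

components = {
    "serde": {"kind": "authoritative"},
    "ring": {"kind": "fingerprint", "confidence": 0.65},
    "unknown-blob": {"kind": "fingerprint", "confidence": 0.2},
}
print(evaluate(components))
```

Tenants in regulated environments would lower `warn_below` toward 1.0 so any heuristic-only finding is flagged.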
## OS packages — Windows/macOS coverage (Trivy, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 2 | Requests are emerging but not yet widespread; gathering signals via `windows-macos-demand.md`. |
| Competitive risk | 3 | Competitors currently focus on Linux; future announcements could increase pressure. |
| Engineering effort | 4 | Full Windows/macOS analyzer support would require new parsers, evidence models, and Offline Kit updates. |
| Policy/config impact | 3 | Policy must account for OS-specific package sources and signing requirements. |
| Offline/air-gap impact | 4 | Supporting Windows/macOS would significantly expand Offline Kit footprint and mirroring workflows. |

- **Competitor capability**: Trivy and Grype document Linux distribution coverage; Snyk Container relies on SaaS services and likewise focuses on Linux bases. None offer first-class offline Windows/macOS package scanning today.
- **StellaOps gap**: Platform currently scopes scanners to Linux; regulated customers with Windows/macOS workloads need clarity on future coverage.
- **Proposed plan**:
  1. Continue demand intake per `docs/benchmarks/scanner/windows-macos-demand.md`, capturing customer interviews, sales telemetry, and community updates.
  2. If demand crosses the documented threshold, scope a design spike covering evidence models (e.g., MSI, Chocolatey, Homebrew), Surface integration, and policy ramifications.
  3. Document interim guidance for hybrid workflows (e.g., importing third-party SBOMs) while native analyzers are out of scope.
- **Policy considerations**: Policies would need repository allowlists, signing requirements, and OS-specific mitigations; defer concrete templates until design spike completes.
- **Next actions**: Execute tasks DOCS-SCANNER-BENCH-62-002/003/004/005/006 as demand signals accrue; only open engineering backlog after demand review approves scope expansion.

### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Layer package DB parsing | apk, dpkg, rpm status/databases per layer | `StellaOps.Scanner.Analyzers.OS.*` with RustFS CAS | Produces per-layer fragments keyed by layer digest; composed into inventory/usage SBOM with provenance pointers. |
| Manifest + attestation binding | Distro manifest attestations, vendor signatures | Export Center + Signer/Attestor hand-off | Binds package fragments to DSSE attestations; policy consumes provenance metadata for trust weighting. |
| External SBOM import (interim) | Third-party SBOMs for Windows/macOS | Scanner SBOM import API (planned) | Imports produce declared-only entries flagged for policy review until native analyzers exist. |

## Secrets leak detection (Trivy, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 4 | Security and compliance teams expect leak detection in parity with Trivy/Snyk. |
| Competitive risk | 4 | Trivy and Snyk market built-in secret scanners; lack of parity is visible. |
| Engineering effort | 4 | Requires deterministic scanner pipeline, rule packaging, and explainability. |
| Policy/config impact | 5 | Policy must gate rule sets, severities, and privacy guarantees. |
| Offline/air-gap impact | 3 | Rule packs must be versioned and bundled with Offline Kit. |

- **Competitor capability**: Trivy ships regex/entropy secret analyzers with configurable rule packs; Snyk Code offers SaaS-based secret detection.
- **StellaOps gap**: Scanner intentionally avoids leak detection to preserve determinism, leaving customers without first-party secret scanning.
- **Proposed plan**:
  1. Implement a deterministic secret scanner plugin (`StellaOps.Scanner.Analyzers.Secrets`) supporting rule bundles signed and versioned for offline parity.
  2. Provide rule configuration via Surface.Validation (rule allowlists, target paths) and Surface.Secrets to manage sensitive allow rules.
  3. Emit findings into Policy Engine with new evidence types (`secret.leak`) so policies can enforce severity thresholds, ticket workflows, or waivers.
  4. Offer CLI verb (`stella secrets scan`) and integration into existing scan workflows behind an opt-in flag.
  5. Expose explain traces detailing rule IDs, masked snippets, and remediation guidance while upholding privacy constraints.
- **Policy considerations**: Deliver policy templates for severity gating, rule packs per tenant, and privacy controls; lattice logic should discount low-confidence matches.
- **Next actions**: open analyzer/CLI backlog work, coordinate with Docs Guild on policy templates, and bundle signed rule packs for Offline Kit distribution.

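A deterministic rule (plan steps 1, 3, 5) is typically a regex gate plus a Shannon-entropy check, with the matched value masked before it ever leaves the scanner. The rule id, thresholds, and finding shape below are illustrative assumptions, not the planned rule-pack schema:

```python
# Sketch: one deterministic secret rule producing `secret.leak`
# evidence with a masked snippet. Rule id and thresholds are assumed.

import math
import re

RULE = {
    "id": "generic-api-key",
    "pattern": re.compile(r"(?i)api[_-]?key\s*[:=]\s*([A-Za-z0-9+/=_-]{16,})"),
    "min_entropy": 3.5,  # bits/char; filters low-entropy placeholders
}

def shannon_entropy(s):
    return -sum((n := s.count(c) / len(s)) * math.log2(n) for c in set(s))

def scan(text):
    findings = []
    for lineno, line in enumerate(text.splitlines(), 1):
        m = RULE["pattern"].search(line)
        if m and shannon_entropy(m.group(1)) >= RULE["min_entropy"]:
            secret = m.group(1)
            findings.append({
                "type": "secret.leak",
                "rule": RULE["id"],
                "line": lineno,
                # Mask all but the first/last 2 chars for explain traces.
                "snippet": secret[:2] + "*" * (len(secret) - 4) + secret[-2:],
            })
    return findings

sample = "debug=true\napi_key = q9X7v2LmKpZ4sTbW8dYc\n"
print(scan(sample))
```

Because both the regex and the entropy function are pure, the same input and rule pack always yield byte-identical findings, which is what keeps the scanner deterministic.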
### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Operational secret retrieval | `secret://` references resolved via Surface.Secrets providers | Surface.Secrets, Surface.Validation | Injects secrets at runtime; no SBOM entry created; policy ensures provenance of retrieved credentials. |
| Deterministic leak detection (planned) | File content, archives, bytecode | `StellaOps.Scanner.Analyzers.Secrets` (planned) | Emits `secret.leak` evidence with masked snippets; Policy Engine merges with package evidence using VEX gating. |
| Competitor leak scanning | Regex/entropy rulesets (Trivy pkg/fanal/secret), Snyk Code SaaS service | Trivy secret analyzer, Snyk Code API | Findings remain separate from SBOM data; StellaOps will map to policy evidence types once analyzer ships. |

## EntryTrace runtime command resolution (Trivy, Grype, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 4 | Runtime teams rely on EntryTrace to separate inventory vs usage for policy decisions. |
| Competitive risk | 4 | Competitors lack equivalent capability; maintaining lead is a critical marketing differentiator. |
| Engineering effort | 3 | Requires ongoing heuristics updates and parser maintenance for shells and launchers. |
| Policy/config impact | 3 | Policy uses EntryTrace outputs; enhancements must keep explainability stable. |
| Offline/air-gap impact | 2 | Heuristic catalog updates are lightweight and ship with Offline Kit. |

- **Competitor capability**: Trivy, Grype, and Snyk do not offer runtime command resolution comparable to EntryTrace.
- **StellaOps gap**: To maintain leadership, EntryTrace heuristics must expand to new shells/launchers and provide richer explainability for policy consumers.
- **Proposed plan**:
  1. Establish a quarterly EntryTrace heuristic review cadence to ingest new shell patterns and language launchers (npm/yarn, poetry, bundle exec, etc.).
  2. Add explain-trace improvements (confidence scores, unresolved reason catalog) so Policy Engine and UI can surface actionable guidance when resolution fails.
  3. Provide a CLI report (`stella entrytrace explain`) summarising resolved/unresolved paths with remediation hints, aligned with policy predicates.
  4. Publish contribution guidelines for customers to submit launcher patterns, keeping deterministic ordering and tests.
- **Policy considerations**: Ensure policy predicates (e.g., `entrytrace.resolution`) include confidence metadata; lattice logic should treat unresolved entrypoints with configurable severity.
- **Next actions**: open backlog item for heuristic upkeep and docs task for CLI/policy explain guidance.

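The wrapper-catalogue idea (plan steps 1-2) can be sketched as iteratively peeling known launchers off an ENTRYPOINT argv until a concrete command remains, returning a reason code when resolution fails. The catalogue entries, the per-step confidence penalty, and the reason names are all hypothetical, the `npm` rule in particular stands in for a real `package.json` script lookup:

```python
# Sketch: wrapper resolution with confidence scores and unresolved
# reason codes. Catalogue behaviour and reason names are assumptions.

WRAPPER_CATALOG = {
    # tini forwards to the command after an optional "--" separator.
    "tini": lambda argv: argv[2:] if argv[1:2] == ["--"] else argv[1:],
    # Stand-in for resolving "npm run start" via package.json scripts.
    "npm": lambda argv: ["node", "server.js"] if argv[1:3] == ["run", "start"] else None,
}

def resolve(argv, max_depth=5):
    confidence = 1.0
    for _ in range(max_depth):
        peel = WRAPPER_CATALOG.get(argv[0] if argv else None)
        if peel is None:
            # Not a known wrapper: treat as the concrete command.
            return {"command": argv, "confidence": confidence, "reason": None}
        nxt = peel(argv)
        if nxt is None:
            # Wrapper recognised but its arguments were not understood.
            return {"command": None, "confidence": 0.0,
                    "reason": "wrapper-args-unresolved"}
        argv, confidence = nxt, confidence * 0.9  # heuristic step penalty
    return {"command": None, "confidence": 0.0, "reason": "max-depth"}

print(resolve(["tini", "--", "npm", "run", "start"]))
```

The reason codes (`wrapper-args-unresolved`, `max-depth`) are what the unresolved-reason catalog in step 2 would standardise for explain traces.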
### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Shell AST parsing | Dockerfile ENTRYPOINT/CMD, shell scripts | `StellaOps.Scanner.EntryTrace` | Builds command graph with confidence scores; merged into usage SBOM to mark runtime components. |
| Wrapper catalogue resolution | Known launchers (npm, yarn, poetry, bundle exec, supervisor) | `EntryTrace.WrapperCatalog` | Resolves wrappers to underlying binaries; merges with language analyzers via UsageHints. |
| Fallback heuristics | History scripts, init configs, service manifests | EntryTrace heuristic expansions | Flags unresolved entrypoints with reasons; Policy Engine consumes to warn/fail. |
| Competitor baseline | — | Trivy/Grype/Snyk | No runtime resolution; StellaOps maintains differentiated capability. |

## DSSE/Rekor operator enablement (Trivy, Grype, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 4 | Regulated tenants require auditable attestations and Rekor proofs for compliance handoffs. |
| Competitive risk | 3 | Trivy and Grype export SBOMs but lack DSSE/Rekor workflows; Snyk relies on SaaS attestations. |
| Engineering effort | 2 | Capabilities exist; need enablement guides, default policies, and operator tooling. |
| Policy/config impact | 4 | Policies must ensure attestation upload/log and enforce Rekor verifiability by tenant. |
| Offline/air-gap impact | 2 | DSSE/Rekor flows already support offline bundles; need better documentation and guardrails. |

- **Competitor capability**: Trivy emits SBOMs and Cosign signatures but Rekor usage is manual; Grype consumes Syft SBOMs without attestations; Snyk Container signs via SaaS only.
- **StellaOps gap**: Signing pipeline exists (Signer → Attestor → Rekor v2) yet operators need prescriptive runbooks, policy defaults, and Export Center alignment.
- **Proposed plan**:
  1. Publish DSSE/Rekor operator guide detailing enablement, policy toggles, and verification CLI workflows.
  2. Extend Export Center profiles with attestation policy checks and Rekor proof bundling by default.
  3. Surface Rekor health metrics in Scanner.WebService and Notify to escalate failed submissions.
- **Policy considerations**: Provide policy predicates for attestation presence, Rekor inclusion, and proof expiry to enforce promotion gates.
- **Next actions**: Track via DOCS-SCANNER-BENCH-62-015 and SCANNER-ENG-0015 for playbook plus tooling updates.

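The three promotion-gate predicates named under policy considerations (attestation presence, Rekor inclusion, proof expiry) can be modelled as a simple all-must-pass check. The evidence dict shape and predicate names below are assumptions for illustration, not the actual Policy Engine schema:

```python
# Sketch: promotion gate over attestation evidence. Evidence shape
# and predicate names are illustrative assumptions.

from datetime import datetime, timedelta, timezone

def promotion_gate(evidence, max_proof_age=timedelta(days=30)):
    now = datetime.now(timezone.utc)
    checks = {
        # A DSSE envelope must exist for the artifact.
        "attestation.present": evidence.get("dsse") is not None,
        # The Rekor entry must carry an inclusion proof.
        "attestation.rekorIncluded":
            bool(evidence.get("rekor", {}).get("inclusionProof")),
        # The proof must be fresher than the tenant's expiry window.
        "attestation.proofFresh": (
            "rekor" in evidence
            and now - evidence["rekor"]["integratedTime"] <= max_proof_age
        ),
    }
    return checks, all(checks.values())

evidence = {
    "dsse": {"payloadType": "application/vnd.in-toto+json"},
    "rekor": {
        "inclusionProof": {"treeSize": 1024, "logIndex": 998},
        "integratedTime": datetime.now(timezone.utc) - timedelta(days=2),
    },
}
checks, ok = promotion_gate(evidence)
print(ok, checks)
```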
### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| SBOM emission | CycloneDX/SPDX payloads per scan | Scanner emit pipelines | Generates inventory/usage BOMs stored with CAS hashes for attestation. |
| DSSE signing | DSSE bundles, signing keys | `StellaOps.Signer` + `StellaOps.Attestor` | Signs SBOM/report metadata, forwards to Rekor v2, records proof identifiers. |
| Rekor proof packaging | Rekor inclusion proofs, bundle metadata | Export Center attestation packager | Bundles proofs into Offline Kit/export artifacts; Policy verifies before release. |
| Competitor approach | CLI or SaaS-managed signing | Trivy Cosign integration, Snyk SaaS, Grype none | Operators must integrate manually; no default policy enforcement. |

## Ruby analyzer parity (Trivy, Grype, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 4 | Rails and Sidekiq users expect first-party support with deterministic outputs. |
| Competitive risk | 4 | Trivy ships bundler/gemspec analyzers; Snyk offers SaaS rubygems scanning; Grype mirrors Syft data. |
| Engineering effort | 5 | Full analyzer stack (lockfile, runtime edges, capability signals) remains to be built. |
| Policy/config impact | 4 | Requires policy predicates for bundler groups, autoload resolution, and capability flags. |
| Offline/air-gap impact | 3 | Analyzer must ship with Offline Kit assets (fingerprints, autoload maps). |

- **Competitor capability**: Trivy parses bundler and gemspec data (pkg/fanal/analyzer/language/ruby); Grype relies on Syft ruby catalogers; Snyk CLI delegates to a rubygems plugin hitting SaaS.
- **StellaOps gap**: No Ruby analyzer in production; only backlog tasks exist.
- **Proposed plan**:
  1. Execute SCANNER-ANALYZERS-RUBY-28-001..012 to deliver lockfile parsing, autoload graphs, capability mapping, and observation outputs.
  2. Wire CLI and Offline Kit packaging once the analyzer stabilises.
  3. Provide policy templates covering bundler groups, native extension handling, and dynamic require warnings.
- **Policy considerations**: Policy Engine must treat declared groups versus runtime usage distinctly and allow waivers for development/test groups.
- **Next actions**: Coordinate via SCANNER-ENG-0009 and DOCS-SCANNER-BENCH-62-009 for documentation and rollout.

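The lockfile-parsing half of plan step 1 is mostly mechanical: `Gemfile.lock` pins top-level gems in the indented `specs:` block of its `GEM` section. A minimal sketch of that extraction (group metadata lives in the `Gemfile` itself and is not handled here; the output shape is an assumption):

```python
# Sketch: extract pinned gems from the GEM/specs section of a
# Gemfile.lock. Four-space indent = top-level pin; deeper indents
# are dependency constraints and are skipped.

import re

SPEC = re.compile(r"^    (\S+) \(([^)]+)\)$")

def parse_gemfile_lock(text):
    gems, in_specs = {}, False
    for line in text.splitlines():
        if line.strip() == "specs:":
            in_specs = True
            continue
        if in_specs:
            m = SPEC.match(line)
            if m:
                gems[m.group(1)] = m.group(2)   # top-level gem pin
            elif not line.startswith(" "):
                in_specs = False                # left the GEM block
    return gems

lock = """\
GEM
  remote: https://rubygems.org/
  specs:
    rack (3.0.8)
    sidekiq (7.2.0)
      rack (>= 2.2.4)

PLATFORMS
  ruby
"""
print(parse_gemfile_lock(lock))
```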
### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Bundler lock parsing | Gemfile, Gemfile.lock, vendor/bundle specs | Trivy bundler analyzer; planned StellaOps Ruby lock collector | Emits package graph with group metadata; merges with installed gems once analyzer ships. |
| Gemspec inspection | `*.gemspec` records, cached specs | Trivy gemspec analyzer; Syft gemspec cataloger | Provides metadata for packaged gems; merges for vendored dependencies. |
| Runtime require graph | require/require_relative, autoload hints | Planned StellaOps Ruby require analyzer | Links runtime usage to packages; Policy uses edges for explain traces. |
| Capability signals | exec, net/http, YAML load, Sidekiq configs | Planned StellaOps Ruby capability analyzer | Produces policy evidence for dangerous patterns and job schedulers. |

## PHP analyzer parity (Trivy, Grype, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 4 | Magento, WordPress, Laravel tenants request deterministic composer coverage. |
| Competitive risk | 4 | Trivy composer analyzer handles lock/json; Snyk PHP plugin uploads manifests to SaaS; Grype relies on Syft composer data. |
| Engineering effort | 5 | Requires composer parsing, include graph, framework detectors, PHAR support. |
| Policy/config impact | 4 | Policies must recognise autoload mappings, dangerous functions, extension requirements. |
| Offline/air-gap impact | 3 | Analyzer assets must ship with Offline Kit (PHAR readers, fingerprints). |

- **Competitor capability**: Trivy composer analyzer (pkg/fanal/analyzer/language/php/composer) walks composer.lock and composer.json; Snyk CLI defers to snyk-php-plugin; Grype inherits the Syft composer cataloger.
- **StellaOps gap**: No PHP analyzer yet; tasks scoped but unimplemented.
- **Proposed plan**:
  1. Deliver SCANNER-ANALYZERS-PHP-27-001..012 covering composer parsing, include graph, PHAR handling, capability analysis.
  2. Integrate extension detection with Surface.Validation and policy templates for required extensions.
  3. Provide CLI commands and Offline Kit documentation.
- **Policy considerations**: Configure policies for autoload coverage, dangerous constructs, upload limits, and extension presence.
- **Next actions**: SCANNER-ENG-0010 and DOCS-SCANNER-BENCH-62-010 own design and documentation deliverables.

### Detection techniques
|
||||
| Technique | Artifacts | Analyzer / Module | Merge strategy |
|
||||
|-----------|-----------|-------------------|----------------|
|
||||
| Composer lock parsing | composer.lock, composer.json | Trivy composer analyzer; planned StellaOps Composer collector | Generates package graph with direct versus transitive dependency tagging. |
|
||||
| Autoload resolution | psr-0/psr-4/classmap/files entries | Planned StellaOps PHP autoload analyzer | Builds module graph; merges with capability scanner to highlight runtime usage. |
|
||||
| Capability detection | exec, curl, unserialize, stream wrappers | Planned StellaOps PHP capability analyzer | Records evidence with file/line hashes; policy consumes for risk scoring. |
|
||||
| PHAR inspection | .phar archives, stub metadata | Planned StellaOps PHAR inspector | Expands embedded vendor trees; merges with package inventory. |
|
||||
|
||||
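The "direct versus transitive dependency tagging" described in the Composer lock parsing row can be sketched as follows: composer.lock lists every resolved package, while composer.json's `require` map names the direct set (platform requirements such as `php` carry no vendor prefix and are skipped). A hedged sketch, not the planned collector:

```python
import json

# Sketch: tag composer.lock packages as direct or transitive using the
# manifest's "require" map. Vendor packages contain "/"; "php"/"ext-*"
# platform requirements do not and are excluded from the direct set.

def tag_composer_packages(lock_json: str, manifest_json: str) -> list:
    lock = json.loads(lock_json)
    manifest = json.loads(manifest_json)
    direct = {name for name in manifest.get("require", {}) if "/" in name}
    return [
        {
            "name": pkg["name"],
            "version": pkg["version"],
            "relationship": "direct" if pkg["name"] in direct else "transitive",
        }
        for pkg in lock.get("packages", [])
    ]

lock = json.dumps({"packages": [
    {"name": "monolog/monolog", "version": "3.5.0"},
    {"name": "psr/log", "version": "3.0.0"},
]})
manifest = json.dumps({"require": {"php": ">=8.1", "monolog/monolog": "^3.0"}})
tagged = tag_composer_packages(lock, manifest)
```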
## Deno analyzer outlook (Trivy, Grype, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 2 | Limited but growing demand from edge/runtime teams adopting Deno. |
| Competitive risk | 2 | Trivy, Grype, and Snyk lack dedicated Deno analyzers; coverage relies on generic JavaScript workflows. |
| Engineering effort | 3 | Requires lockfile parser, import graph resolution, and permission model mapping. |
| Policy/config impact | 3 | Policies must treat Deno permissions (net/fs/run) and URL-based modules. |
| Offline/air-gap impact | 3 | Needs cached registry mirrors or import map handling for air-gapped runs. |

- **Competitor capability**: Current tooling leans on npm/pnpm analyzers; no first-party Deno parser ships in Trivy, Grype, or Snyk.
- **StellaOps gap**: No analyzer today; an opportunity to differentiate with deterministic import resolution and permission mapping.
- **Proposed plan**:
  1. Scope parsing for deno.lock and import maps with content-addressed module fetching.
  2. Map permission declarations into policy evidence.
  3. Provide Offline Kit guidance for cached module registries and pinned URLs.
- **Policy considerations**: Introduce policy predicates for Deno permission sets and remote module domains.
- **Next actions**: SCANNER-ENG-0011 and DOCS-SCANNER-BENCH-62-011 to draft the design spike and documentation.

### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Lockfile analysis (planned) | deno.lock, import maps | Planned StellaOps Deno collector | Produces module graph keyed by URL; merges with cached artifacts. |
| Permission audit | CLI flags, configuration files | Planned Deno policy analyzer | Records required permissions for policy gating. |
| Competitor fallback | Manifest-based npm/pnpm scans | Trivy npm analyzer; Snyk node plugins | Provides partial coverage; lacks Deno permissions and remote module mapping. |
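The two planned inputs above can be sketched together: the `remote` section of deno.lock maps module URLs to content hashes (useful for the "remote module domains" policy predicate), and `--allow-*` CLI flags declare the permission set. A hedged sketch under those assumptions:

```python
import json
from urllib.parse import urlparse

# Sketch: group deno.lock remote modules by registry domain, and parse
# --allow-* CLI flags into a permission map for policy evidence.

def module_domains(deno_lock_json: str) -> dict:
    lock = json.loads(deno_lock_json)
    by_domain: dict = {}
    for url in sorted(lock.get("remote", {})):
        by_domain.setdefault(urlparse(url).netloc, []).append(url)
    return by_domain

def parse_permissions(args: list) -> dict:
    perms: dict = {}
    for arg in args:
        if not arg.startswith("--allow-"):
            continue
        flag, _, scope = arg.partition("=")
        # "--allow-net=host" -> scoped list; bare "--allow-read" -> True
        perms[flag.removeprefix("--allow-")] = scope.split(",") if scope else True
    return perms

lock = json.dumps({"remote": {
    "https://deno.land/std@0.224.0/http/server.ts": "sha256-aaaa",
    "https://esm.sh/preact@10.19.2": "sha256-bbbb",
}})
domains = module_domains(lock)
perms = parse_permissions(["--allow-net=api.example.com", "--allow-read"])
```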
## Dart analyzer roadmap (Trivy, Grype, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 2 | Dart/Flutter containers are niche but emerging in regulated workloads. |
| Competitive risk | 3 | Trivy parses pubspec lockfiles; Snyk references a SaaS plugin; Grype lacks native support. |
| Engineering effort | 4 | Requires pubspec.lock parser, AOT snapshot fingerprinting, and runtime usage mapping. |
| Policy/config impact | 3 | Policies must recognise build modes (debug/release) and AOT binaries. |
| Offline/air-gap impact | 3 | Needs mirrored pub registries and snapshot tooling packaged offline. |

- **Competitor capability**: Trivy Dart analyzer (pkg/fanal/analyzer/language/dart/pub) parses pubspec.lock; Snyk delegates to SaaS; Grype is absent.
- **StellaOps gap**: No Dart analyzer or policy templates today.
- **Proposed plan**:
  1. Implement a pubspec.lock parser with dependency graph and hosted path resolution.
  2. Fingerprint Dart AOT snapshots to tie binaries back to packages.
  3. Emit capabilities (platform channels, native plugins) for policy gating.
- **Policy considerations**: Distinguish debug versus release builds; allow tenants to require AOT parity.
- **Next actions**: SCANNER-ENG-0012 and DOCS-SCANNER-BENCH-62-012 handle design and doc updates.

### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Lockfile parsing | pubspec.lock, pubspec.yaml | Trivy Dart analyzer; planned StellaOps Dart collector | Builds dependency graph with hosted path info; merges with runtime fingerprints. |
| Snapshot fingerprinting | AOT snapshots, dill files | Planned Dart snapshot analyzer | Maps binaries to packages and versions; flagged for policy when unmatched. |
| Capability mapping | Flutter platform channels, plugin manifests | Planned Dart capability analyzer | Records platform usage for policy weighting. |
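The "Lockfile parsing" row can be illustrated with a minimal, line-based reader for the subset of pubspec.lock needed to tag direct versus transitive dependencies. A real collector would use a full YAML parser; this sketch assumes the standard two-space indentation and flat scalar values.

```python
# Sketch: read the "packages:" section of pubspec.lock.
# Indent 2 = package name, indent 4 = key/value pairs under it.

def parse_pubspec_lock(text: str) -> dict:
    packages: dict = {}
    current = None
    in_packages = False
    for line in text.splitlines():
        if line.rstrip() == "packages:":
            in_packages = True
            continue
        if not in_packages or not line.strip():
            continue
        indent = len(line) - len(line.lstrip())
        if indent == 0:
            in_packages = False  # e.g. the trailing "sdks:" section
        elif indent == 2 and line.rstrip().endswith(":"):
            current = line.strip().rstrip(":")
            packages[current] = {}
        elif indent == 4 and current and ":" in line:
            key, _, value = line.strip().partition(":")
            packages[current][key] = value.strip().strip('"')
    return packages

sample = """packages:
  http:
    dependency: "direct main"
    source: hosted
    version: "1.2.0"
  async:
    dependency: transitive
    source: hosted
    version: "2.11.0"
sdks:
  dart: ">=3.0.0 <4.0.0"
"""
locked = parse_pubspec_lock(sample)
```

The `dependency` field already distinguishes `direct main` from `transitive`, which the dependency graph and policy tagging can reuse.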
## Swift analyzer assessment (Trivy, Grype, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 3 | iOS/macOS teams expect Package.resolved support once macOS scanning lands. |
| Competitive risk | 4 | Trivy supports SwiftPM and CocoaPods; Snyk ships a swift plugin; Grype lacks native Swift analyzers. |
| Engineering effort | 4 | Requires SwiftPM parsing, xcframework metadata, and runtime usage heuristics. |
| Policy/config impact | 4 | Policies must check binary signatures, platform targets, and dynamic library usage. |
| Offline/air-gap impact | 3 | Needs mirrored Swift package indexes for air-gapped runs. |

- **Competitor capability**: Trivy swift analyzers (pkg/fanal/analyzer/language/swift) parse Package.resolved and CocoaPods; the Snyk swift plugin relies on SaaS.
- **StellaOps gap**: No Swift analyzer yet; Windows/macOS coverage pending.
- **Proposed plan**:
  1. Design a SwiftPM parser and binary metadata collector under the Swift analyzer guild.
  2. Plan signature validation and entitlements capture for macOS targets.
  3. Coordinate with Offline Kit to mirror Swift registries and xcframework assets.
- **Policy considerations**: Provide predicates for platform targets, entitlements, and signing requirements.
- **Next actions**: SCANNER-ENG-0013 and DOCS-SCANNER-BENCH-62-013 drive design and documentation tasks.

### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| SwiftPM parsing | Package.swift, Package.resolved | Trivy swift analyzer; planned StellaOps Swift collector | Produces dependency graph with target info; merges into SBOM inventory. |
| CocoaPods integration | Podfile.lock, Pods directory | Trivy CocoaPods analyzer; planned StellaOps CocoaPods collector | Maps pods to app targets; merges with SwiftPM data. |
| Binary metadata | xcframeworks, Mach-O signatures | Planned Swift binary analyzer | Captures signing, architectures, and entitlements; fed into the policy engine. |
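The "SwiftPM parsing" row is straightforward because Package.resolved is JSON: the version-2 layout carries a `pins` array whose `state` records the resolved version and revision. A hedged sketch of reading it into inventory records:

```python
import json

# Sketch: flatten a version-2 Package.resolved pin list into
# (identity, version, revision) records for SBOM inventory.

def parse_package_resolved(text: str) -> list:
    doc = json.loads(text)
    return [
        {
            "identity": pin["identity"],
            "version": pin.get("state", {}).get("version", ""),
            "revision": pin.get("state", {}).get("revision", ""),
        }
        for pin in doc.get("pins", [])
    ]

sample = json.dumps({
    "pins": [
        {
            "identity": "swift-nio",
            "kind": "remoteSourceControl",
            "location": "https://github.com/apple/swift-nio.git",
            "state": {"revision": "abc123", "version": "2.65.0"},
        }
    ],
    "version": 2,
})
pins = parse_package_resolved(sample)
```

Branch-pinned dependencies lack a `version` in `state`, which is why the sketch defaults missing fields to empty strings rather than failing.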
## Kubernetes/VM target coverage alignment (Trivy, Snyk)

### Scorecard

| Dimension | Score (1-5) | Notes |
|-----------|-------------|-------|
| Customer demand | 4 | Platform teams expect coverage for live clusters, VMs, and admission controls. |
| Competitive risk | 3 | Trivy Operator and Snyk monitor clusters but lack deterministic attestations; Grype stays image-focused. |
| Engineering effort | 3 | Needs coordination between Scanner, Zastava, and runtime posture services. |
| Policy/config impact | 4 | Policies must combine runtime posture data with scan evidence. |
| Offline/air-gap impact | 3 | Requires offline posture bundles and admission controller configuration guidance. |

- **Competitor capability**: Trivy Operator scans clusters via live API calls; Snyk relies on SaaS; Grype covers images only.
- **StellaOps gap**: Scanner handles images/filesystems; runtime enforcement lives in Zastava and needs a coordinated roadmap.
- **Proposed plan**:
  1. Produce a joint roadmap clarifying Scanner versus Zastava responsibilities for clusters and VMs.
  2. Document how runtime posture feeds into Policy Engine and Export Center outputs.
  3. Provide Offline Kit assets for Zastava admission policies and scheduler hooks.
- **Policy considerations**: Expand policy predicates to ingest runtime posture signals (signed images, SBOM availability, policy verdicts).
- **Next actions**: SCANNER-ENG-0014 and DOCS-SCANNER-BENCH-62-014 deliver the roadmap and documentation.

### Detection techniques

| Technique | Artifacts | Analyzer / Module | Merge strategy |
|-----------|-----------|-------------------|----------------|
| Image/file scan | Container images, filesystem snapshots | StellaOps.Scanner.Worker | Provides deterministic inventory and usage data for images. |
| Runtime posture (planned) | Admission events, sensor telemetry | Zastava.Observer and Webhook | Supplies runtime evidence (signed images, drift) merged into policy overlays. |
| Cluster drift detection | Scheduler and Vuln Explorer events | Scheduler + Policy Engine | Detects advisory/VEX deltas and schedules analysis-only runs. |
| Competitor workflow | Live API queries and SaaS services | Trivy Operator; Snyk Kubernetes integration | Offers visibility but lacks attestations and offline parity. |
docs/benchmarks/scanner/windows-macos-demand.md (new file, 17 lines)
@@ -0,0 +1,17 @@
# Windows / macOS Analyzer Demand Capture

## Current competitive posture

- **Trivy** coverage tables enumerate Linux family distributions only (Alpine, Wolfi, Chainguard, Debian/Ubuntu, RHEL/Alma/Rocky, SUSE, Photon, Amazon, Bottlerocket) with no mention of Windows or macOS package managers (source: /tmp/trivy-docs/docs/docs/coverage/os/index.md).
- **Grype** matchers target Linux ecosystems via Syft catalogers (APK, DPKG/APT, RPM, Portage, Bitnami) with no coverage for Windows Installer, MSI, Chocolatey, or macOS Homebrew/app bundles (source: /tmp/grype-data/grype/matcher/{apk,dpkg,rpm}/matcher.go).
- **Snyk CLI** focuses on container, open source, IaC, and code scanning routed through the SaaS service; CLI documentation does not advertise Windows/macOS package coverage beyond container images (source: /tmp/snyk-cli/README.md).

## Signals to gather

1. **Customer interviews** – ask regulated customers deploying Windows Server or Windows container workloads which artifacts require SBOM + VEX and whether the current StellaOps scope (Linux images) blocks adoption.
2. **Sales & SE feedback loop** – capture any RFP items referencing Windows/macOS scanning and log them in the Scanner guild tracker (SCANNER-ANALYZERS-OS-*).
3. **Support telemetry** – review ticket tags for “windows”, “macos”, and “dotnet framework” to quantify inbound demand.
4. **Community landscape** – monitor Trivy/Grype/Snyk release notes for Windows/macOS announcements; update this note and the feature matrix when competitors change posture.

## Next actions

- Coordinate with Product Marketing to add Windows/macOS discovery prompts to upcoming customer advisory sessions (target: Sprint 132 intake).
- Instrument the scanner roadmap intake form with explicit checkboxes for Windows/macOS package ecosystems.
- If three or more qualified customers flag Windows/macOS coverage as a blocking requirement, open a design spike under the Scanner Analyzer Guild with scope/time estimates and Offline Kit considerations.
@@ -38,6 +38,7 @@ Follow the sprint files below in order. Update task status in both `SPRINTS` and
> 2025-11-02: SURFACE-FS-02 moved to DOING (Surface FS Guild) – building core abstractions and deterministic serializers.
> 2025-11-02: SURFACE-SECRETS-01 moved to DOING (Surface Secrets Guild) – updating secrets design for provider matrix.
> 2025-11-02: SURFACE-SECRETS-02 moved to DOING (Surface Secrets Guild) – implementing base providers + tests.
> 2025-11-02: AUTH-POLICY-27-002 marked DONE (Authority Core & Security Guild) – interactive-only policy publish/promote scopes delivered with metadata, fresh-auth enforcement, and audit/docs updates.
> 2025-11-02: SCANNER-ENTRYTRACE-18-506 moved to DOING (EntryTrace Guild, Scanner WebService Guild) – surfacing EntryTrace results via WebService/CLI with confidence metadata.
> 2025-11-02: ATTESTOR-74-001 marked DONE (Attestor Service Guild) – witness client integration, repository schema, and verification/reporting updates landed with tests.
> 2025-11-02: AUTH-OAS-63-001 moved to DOING (Authority Core & Security Guild, API Governance Guild) – verifying legacy `/oauth/*` deprecation signalling and notifications ahead of sunset.
@@ -51,6 +52,8 @@ Follow the sprint files below in order. Update task status in both `SPRINTS` and
> 2025-11-02: AUTH-ORCH-34-001 marked DONE (Authority Core & Security Guild) – `orch:backfill` scope enforced with reason/ticket metadata, Authority + CLI updated, docs/config refreshed for Orchestrator admins.
> 2025-11-02: AUTH-PACKS-41-001 moved to DOING (Authority Core & Security Guild) – defining packs scope catalogue, issuer templates, and offline defaults.
> 2025-11-02: AUTH-PACKS-41-001 added shared OpenSSL 1.1 test libs so Authority & Signals Mongo2Go suites run on OpenSSL 3.
> 2025-11-02: AUTH-NOTIFY-42-001 moved to DOING (Authority Core & Security Guild) – investigating `/notify/ack-tokens/rotate` 500 responses when key metadata missing.
> 2025-11-02: AUTH-NOTIFY-42-001 marked DONE (Authority Core & Security Guild) – bootstrap rotate defaults fixed, `StellaOpsBearer` test alias added, and notify ack rotation regression passes.
> 2025-11-02: ENTRYTRACE-SURFACE-02 moved to DOING (EntryTrace Guild) – replacing direct env/secret access with Surface.Secrets provider for EntryTrace runs.
> 2025-11-02: ENTRYTRACE-SURFACE-01 marked DONE (EntryTrace Guild) – Surface.Validation + Surface.FS cache now drive EntryTrace reuse with regression tests.
> 2025-11-02: ENTRYTRACE-SURFACE-02 marked DONE (EntryTrace Guild) – EntryTrace environment placeholders resolved via Surface.Secrets with updated docs/tests.
@@ -63,3 +66,26 @@ Follow the sprint files below in order. Update task status in both `SPRINTS` and
> 2025-11-02: CONCELIER-WEB-OAS-61-001 marked DONE (Concelier WebService Guild) – discovery endpoint now serves signed OpenAPI 3.1 document with ETag support.
> 2025-11-02: DOCS-SCANNER-BENCH-62-001 moved to DOING (Docs Guild, Scanner Guild) – refreshing Trivy/Grype/Snyk comparison docs and ecosystem matrix with source-linked coverage.
> 2025-11-02: DOCS-SCANNER-BENCH-62-001 marked DONE (Docs Guild, Scanner Guild) – matrix updated with Windows/macOS coverage row and secret detection techniques; deep dives cite Trivy/Grype/Snyk sources.
> 2025-11-02: DOCS-SCANNER-BENCH-62-003 added (Docs Guild, Product Guild) – recording Python lockfile/editable-install demand signals for policy guidance follow-up.
> 2025-11-02: DOCS-SCANNER-BENCH-62-004 added (Docs Guild, Java Analyzer Guild) – documenting Java lockfile ingestion plan and policy templates.
> 2025-11-02: DOCS-SCANNER-BENCH-62-005 added (Docs Guild, Go Analyzer Guild) – documenting Go stripped-binary fallback enrichment guidance.
> 2025-11-02: DOCS-SCANNER-BENCH-62-006 added (Docs Guild, Rust Analyzer Guild) – documenting Rust fingerprint enrichment guidance.
> 2025-11-02: DOCS-SCANNER-BENCH-62-007 added (Docs Guild, Security Guild) – documenting secret leak detection guidance.
> 2025-11-02: DOCS-SCANNER-BENCH-62-008 added (Docs Guild, EntryTrace Guild) – documenting EntryTrace heuristic maintenance guidance.
> 2025-11-02: DOCS-SCANNER-BENCH-62-009 added (Docs Guild, Ruby Analyzer Guild) – deepening Ruby gap analysis with detection tables; status set to DOING.
> 2025-11-02: DOCS-SCANNER-BENCH-62-010 added (Docs Guild, PHP Analyzer Guild) – documenting PHP analyzer parity gaps; status set to DOING.
> 2025-11-02: DOCS-SCANNER-BENCH-62-011 added (Docs Guild, Language Analyzer Guild) – capturing Deno runtime gap analysis; status set to DOING.
> 2025-11-02: DOCS-SCANNER-BENCH-62-012 added (Docs Guild, Language Analyzer Guild) – expanding Dart ecosystem comparison; status set to DOING.
> 2025-11-02: DOCS-SCANNER-BENCH-62-013 added (Docs Guild, Swift Analyzer Guild) – expanding Swift coverage analysis; status set to DOING.
> 2025-11-02: DOCS-SCANNER-BENCH-62-014 added (Docs Guild, Runtime Guild) – detailing Kubernetes/VM coverage plan; status set to DOING.
> 2025-11-02: DOCS-SCANNER-BENCH-62-015 added (Docs Guild, Export Center Guild) – outlining DSSE/Rekor operator enablement guidance; status set to DOING.
> 2025-11-02: DOCS-SCANNER-BENCH-62-009 marked DONE (Docs Guild, Ruby Analyzer Guild) – Ruby gap section delivered with detection tables and backlog links.
> 2025-11-02: DOCS-SCANNER-BENCH-62-010 marked DONE (Docs Guild, PHP Analyzer Guild) – PHP gap analysis updated with implementation notes.
> 2025-11-02: DOCS-SCANNER-BENCH-62-011 marked DONE (Docs Guild, Language Analyzer Guild) – Deno plan documented with detection technique table.
> 2025-11-02: DOCS-SCANNER-BENCH-62-012 marked DONE (Docs Guild, Language Analyzer Guild) – Dart coverage section fleshed out with detection strategies.
> 2025-11-02: DOCS-SCANNER-BENCH-62-013 marked DONE (Docs Guild, Swift Analyzer Guild) – Swift analyzer roadmap captured with policy hooks.
> 2025-11-02: DOCS-SCANNER-BENCH-62-014 marked DONE (Docs Guild, Runtime Guild) – Kubernetes/VM alignment section published.
> 2025-11-02: DOCS-SCANNER-BENCH-62-015 marked DONE (Docs Guild, Export Center Guild) – DSSE/Rekor enablement guidance appended to gap doc.
> 2025-11-02: AIAI-31-011 moved to DOING (Advisory AI Guild) – implementing Excititor VEX document provider.
> 2025-11-02: AIAI-31-011 marked DONE (Advisory AI Guild) – Excititor VEX provider + OpenVEX chunking shipped with tests.
> 2025-11-02: AIAI-31-002 moved to DOING (Advisory AI Guild, SBOM Service Guild) – building SBOM context retriever for timelines/paths/blast radius.
@@ -52,6 +52,8 @@ AUTH-NOTIFY-38-001 | DONE (2025-11-01) | Define `Notify.Viewer`, `Notify.Operato
> 2025-11-01: AUTH-NOTIFY-38-001 completed—Notify scope catalog, discovery metadata, docs, configuration samples, and service tests updated for new roles.
AUTH-NOTIFY-40-001 | DONE (2025-11-02) | Implement signed ack token key rotation, webhook allowlists, admin-only escalation settings, and audit logging of ack actions. Dependencies: AUTH-NOTIFY-38-001, WEB-NOTIFY-40-001. | Authority Core & Security Guild (src/Authority/StellaOps.Authority/TASKS.md)
> 2025-11-02: `/notify/ack-tokens/rotate` (notify.admin) now rotates DSSE keys with audit coverage and integration tests. Webhook allowlist + escalation scope enforcement verified.
AUTH-NOTIFY-42-001 | DONE (2025-11-02) | Investigate ack token rotation 500 errors (test Rotate_ReturnsBadRequest_WhenKeyIdMissing_AndAuditsFailure still failing). Capture logs, identify root cause, and patch handler. Dependencies: AUTH-NOTIFY-40-001. | Authority Core & Security Guild (src/Authority/StellaOps.Authority/TASKS.md)
> 2025-11-02: Added `StellaOpsBearer` mapping to test harness, fixed bootstrap rotate handler defaults, and reran targeted notify ack rotation test (now returning BadRequest instead of 500).
AUTH-OAS-62-001 | DONE (2025-11-02) | Provide SDK helpers for OAuth2/PAT flows, tenancy override header; add integration tests. Dependencies: AUTH-OAS-61-001, SDKGEN-63-001. | Authority Core & Security Guild, SDK Generator Guild (src/Authority/StellaOps.Authority/TASKS.md)
> 2025-11-02: Added HttpClient auth helper (OAuth2 + PAT) with tenant header support, plus coverage in `StellaOps.Auth.Client.Tests`.
AUTH-OAS-63-001 | DONE (2025-11-02) | Emit deprecation headers and notifications for legacy auth endpoints. Dependencies: AUTH-OAS-62-001, APIGOV-63-001. | Authority Core & Security Guild, API Governance Guild (src/Authority/StellaOps.Authority/TASKS.md)
@@ -77,9 +79,10 @@ Task ID | State | Task description | Owners (Source)
--- | --- | --- | ---
AUTH-POLICY-23-002 | BLOCKED (2025-10-29) | Implement optional two-person rule for activation: require two distinct `policy:activate` approvals when configured; emit audit logs. Dependencies: AUTH-POLICY-23-001. | Authority Core & Security Guild (src/Authority/StellaOps.Authority/TASKS.md)
AUTH-POLICY-23-003 | BLOCKED (2025-10-29) | Update documentation and sample configs for policy roles, approval workflow, and signing requirements. Dependencies: AUTH-POLICY-23-001. | Authority Core & Docs Guild (src/Authority/StellaOps.Authority/TASKS.md)
AUTH-POLICY-27-002 | TODO | Provide attestation signing service bindings (OIDC token exchange, cosign integration) and enforce publish/promote scope checks, fresh-auth requirements, and audit logging. Dependencies: AUTH-POLICY-27-001, REGISTRY-API-27-007. | Authority Core & Security Guild (src/Authority/StellaOps.Authority/TASKS.md)
AUTH-POLICY-27-003 | TODO | Update Authority configuration/docs for Policy Studio roles, signing policies, approval workflows, and CLI integration; include compliance checklist. Dependencies: AUTH-POLICY-27-001, AUTH-POLICY-27-002. | Authority Core & Docs Guild (src/Authority/StellaOps.Authority/TASKS.md)
AUTH-TEN-49-001 | TODO | Implement service accounts & delegation tokens (`act` chain), per-tenant quotas, audit stream of auth decisions, and revocation APIs. Dependencies: AUTH-TEN-47-001. | Authority Core & Security Guild (src/Authority/StellaOps.Authority/TASKS.md)
AUTH-POLICY-27-002 | DONE (2025-11-02) | Provide attestation signing service bindings (OIDC token exchange, cosign integration) and enforce publish/promote scope checks, fresh-auth requirements, and audit logging. Dependencies: AUTH-POLICY-27-001, REGISTRY-API-27-007. | Authority Core & Security Guild (src/Authority/StellaOps.Authority/TASKS.md)
> 2025-11-02: Added interactive-only `policy:publish`/`policy:promote` scopes with metadata requirements (`policy_reason`, `policy_ticket`, `policy_digest`), fresh-auth validation, audit enrichment, and updated config/docs for operators.
AUTH-POLICY-27-003 | DOING (2025-11-02) | Update Authority configuration/docs for Policy Studio roles, signing policies, approval workflows, and CLI integration; include compliance checklist. Dependencies: AUTH-POLICY-27-001, AUTH-POLICY-27-002. | Authority Core & Docs Guild (src/Authority/StellaOps.Authority/TASKS.md)
AUTH-TEN-49-001 | DOING (2025-11-02) | Implement service accounts & delegation tokens (`act` chain), per-tenant quotas, audit stream of auth decisions, and revocation APIs. Dependencies: AUTH-TEN-47-001. | Authority Core & Security Guild (src/Authority/StellaOps.Authority/TASKS.md)
AUTH-VULN-29-001 | TODO | Define Vuln Explorer scopes/roles (`vuln:view`, `vuln:investigate`, `vuln:operate`, `vuln:audit`) with ABAC attributes (env, owner, business_tier) and update discovery metadata/offline kit defaults. Dependencies: AUTH-POLICY-27-001. | Authority Core & Security Guild (src/Authority/StellaOps.Authority/TASKS.md)
AUTH-VULN-29-002 | TODO | Enforce CSRF/anti-forgery tokens for workflow actions, sign attachment tokens, and record audit logs with ledger event hashes. Dependencies: AUTH-VULN-29-001, LEDGER-29-002. | Authority Core & Security Guild (src/Authority/StellaOps.Authority/TASKS.md)
AUTH-VULN-29-003 | TODO | Update security docs/config samples for Vuln Explorer roles, ABAC policies, attachment signing, and ledger verification guidance. Dependencies: AUTH-VULN-29-001..002. | Authority Core & Docs Guild (src/Authority/StellaOps.Authority/TASKS.md)
@@ -5,16 +5,21 @@ Depends on: Sprint 100.A - Attestor
Summary: Ingestion & Evidence focus on AdvisoryAI.
Task ID | State | Task description | Owners (Source)
--- | --- | --- | ---
AIAI-31-001 | DOING (2025-11-02) | Implement structured and vector retrievers for advisories/VEX with paragraph anchors and citation metadata. Dependencies: CONCELIER-VULN-29-001, EXCITITOR-VULN-29-001. | Advisory AI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-002 | TODO | Build SBOM context retriever (purl version timelines, dependency paths, env flags, blast radius estimator). Dependencies: SBOM-VULN-29-001. | Advisory AI Guild, SBOM Service Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-001 | DONE (2025-11-02) | Implement structured and vector retrievers for advisories/VEX with paragraph anchors and citation metadata. Dependencies: CONCELIER-VULN-29-001, EXCITITOR-VULN-29-001. | Advisory AI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-002 | DOING | Build SBOM context retriever (purl version timelines, dependency paths, env flags, blast radius estimator). Dependencies: SBOM-VULN-29-001. | Advisory AI Guild, SBOM Service Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-003 | TODO | Implement deterministic toolset (version comparators, range checks, dependency analysis, policy lookup) exposed via orchestrator. Dependencies: AIAI-31-001..002. | Advisory AI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-004 | TODO | Build orchestration pipeline for Summary/Conflict/Remediation tasks (prompt templates, tool calls, token budgets, caching). Dependencies: AIAI-31-001..003, AUTH-VULN-29-001. | Advisory AI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-005 | TODO | Implement guardrails (redaction, injection defense, output validation, citation enforcement) and fail-safe handling. Dependencies: AIAI-31-004. | Advisory AI Guild, Security Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-006 | TODO | Expose REST API endpoints (`/advisory/ai/*`) with RBAC, rate limits, OpenAPI schemas, and batching support. Dependencies: AIAI-31-004..005. | Advisory AI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-007 | TODO | Instrument metrics (`advisory_ai_latency`, `guardrail_blocks`, `validation_failures`, `citation_coverage`), logs, and traces; publish dashboards/alerts. Dependencies: AIAI-31-004..006. | Advisory AI Guild, Observability Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-008 | TODO | Package inference on-prem container, remote inference toggle, Helm/Compose manifests, scaling guidance, offline kit instructions. Dependencies: AIAI-31-006..007. | Advisory AI Guild, DevOps Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-010 | DONE (2025-11-02) | Implement Concelier advisory raw document provider mapping CSAF/OSV payloads into structured chunks for retrieval. Dependencies: CONCELIER-VULN-29-001, EXCITITOR-VULN-29-001. | Advisory AI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-011 | DONE (2025-11-02) | Implement Excititor VEX document provider to surface structured VEX statements for retrieval. Dependencies: EXCITITOR-LNM-21-201, EXCITITOR-CORE-AOC-19-002. | Advisory AI Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)
AIAI-31-009 | TODO | Develop unit/golden/property/perf tests, injection harness, and regression suite; ensure determinism with seeded caches. Dependencies: AIAI-31-001..006. | Advisory AI Guild, QA Guild (src/AdvisoryAI/StellaOps.AdvisoryAI/TASKS.md)

> 2025-11-02: Structured + vector retrievers landed with deterministic CSAF/OSV/Markdown chunkers, deterministic hash embeddings, and unit coverage for sample advisories.
> 2025-11-02: SBOM context request/result models finalized; retriever tests now validate environment-flag toggles and dependency-path dedupe. SBOM guild to wire real context service client.


[Ingestion & Evidence] 110.B) Concelier.I
Depends on: Sprint 100.A - Attestor
@@ -206,6 +206,14 @@ Task ID | State | Task description | Owners (Source)
DOCS-SIG-26-008 | TODO | Write `/docs/migration/enable-reachability.md` guiding rollout, fallbacks, monitoring. Dependencies: DOCS-SIG-26-007. | Docs Guild, DevOps Guild (docs/TASKS.md)
DOCS-SURFACE-01 | TODO | Create `/docs/modules/scanner/scanner-engine.md` covering Surface.FS/Env/Secrets workflow between Scanner, Zastava, Scheduler, and Ops. | Docs Guild, Scanner Guild, Zastava Guild (docs/TASKS.md)
DOCS-SCANNER-BENCH-62-001 | DONE (2025-11-02) | Refresh scanner comparison docs (Trivy/Grype/Snyk) and keep ecosystem matrix aligned with source implementations. | Docs Guild, Scanner Guild (docs/TASKS.md)
DOCS-SCANNER-BENCH-62-002 | TODO | Capture customer demand for Windows/macOS analyzer coverage and document outcomes. | Docs Guild, Product Guild (docs/TASKS.md)
DOCS-SCANNER-BENCH-62-003 | TODO | Capture Python lockfile/editable install requirements and document policy guidance. | Docs Guild, Product Guild (docs/TASKS.md)
DOCS-SCANNER-BENCH-62-004 | TODO | Document Java lockfile ingestion guidance and policy templates. | Docs Guild, Java Analyzer Guild (docs/TASKS.md)
DOCS-SCANNER-BENCH-62-005 | TODO | Document Go stripped-binary fallback enrichment guidance once implementation lands. | Docs Guild, Go Analyzer Guild (docs/TASKS.md)
DOCS-SCANNER-BENCH-62-006 | TODO | Document Rust fingerprint enrichment guidance and policy examples. | Docs Guild, Rust Analyzer Guild (docs/TASKS.md)
DOCS-SCANNER-BENCH-62-007 | TODO | Produce secret leak detection documentation (rules, policy templates). | Docs Guild, Security Guild (docs/TASKS.md)
DOCS-SCANNER-BENCH-62-008 | TODO | Publish EntryTrace explain/heuristic maintenance guide. | Docs Guild, EntryTrace Guild (docs/TASKS.md)
DOCS-SCANNER-BENCH-62-009 | TODO | Produce SAST integration documentation (connector framework, policy templates). | Docs Guild, Policy Guild (docs/TASKS.md)
DOCS-TEN-47-001 | TODO | Publish `/docs/security/tenancy-overview.md` and `/docs/security/scopes-and-roles.md` outlining scope grammar, tenant model, imposed rule reminder. | Docs Guild, Authority Core (docs/TASKS.md)
DOCS-TEN-48-001 | TODO | Publish `/docs/operations/multi-tenancy.md`, `/docs/operations/rls-and-data-isolation.md`, `/docs/console/admin-tenants.md`. Dependencies: DOCS-TEN-47-001. | Docs Guild, Platform Ops (docs/TASKS.md)
DOCS-TEN-49-001 | TODO | Publish `/docs/modules/cli/guides/authentication.md`, `/docs/api/authentication.md`, `/docs/policy/examples/abac-overlays.md`, update `/docs/install/configuration-reference.md` with new env vars, all ending with imposed rule line. Dependencies: DOCS-TEN-48-001. | Docs & DevEx Guilds (docs/TASKS.md)
@@ -6,5 +6,20 @@
|----|--------|----------|-------------|-------|
| SCANNER-DOCS-0001 | DOING (2025-10-29) | Docs Guild | Validate that ./README.md aligns with the latest release notes. | See ./AGENTS.md |
| SCANNER-DOCS-0002 | DONE (2025-11-02) | Docs Guild | Keep scanner benchmark comparisons (Trivy/Grype/Snyk) and deep-dive matrix current with source references. | Coordinate with docs/benchmarks owners |
| SCANNER-DOCS-0003 | TODO | Docs Guild, Product Guild | Gather Windows/macOS analyzer demand signals and record findings in `docs/benchmarks/scanner/windows-macos-demand.md`. | Coordinate with Product Marketing & Sales enablement |
| SCANNER-ENG-0008 | TODO | EntryTrace Guild, QA Guild | Maintain EntryTrace heuristic cadence per `docs/benchmarks/scanner/scanning-gaps-stella-misses-from-competitors.md`. | Include quarterly pattern review + explain trace updates |
| SCANNER-ENG-0009 | TODO | Ruby Analyzer Guild | Deliver Ruby analyzer parity and observation pipeline per gap doc (lockfiles, runtime graph, policy signals); tracks SCANNER-ANALYZERS-RUBY-28-001..012. | Design complete; fixtures published; CLI/Offline docs updated. |
| SCANNER-ENG-0010 | TODO | PHP Analyzer Guild | Ship PHP analyzer pipeline (composer lock, autoload graph, capability signals) to close comparison gaps; tracks SCANNER-ANALYZERS-PHP-27-001..012. | Analyzer + policy integration merged; fixtures + docs aligned. |
| SCANNER-ENG-0011 | TODO | Language Analyzer Guild | Scope Deno runtime analyzer (lockfile resolver, import graphs) based on competitor techniques. | Design doc approved; backlog split into analyzer/runtime work. |
| SCANNER-ENG-0012 | TODO | Language Analyzer Guild | Evaluate Dart analyzer requirements (pubspec parsing, AOT artifacts) to restore parity. | Investigation summary + task split filed with Dart guild. |
| SCANNER-ENG-0013 | TODO | Swift Analyzer Guild | Plan Swift Package Manager coverage (Package.resolved, xcframeworks, runtime hints) with policy hooks. | Design brief approved; backlog seeded with analyzer tasks. |
| SCANNER-ENG-0014 | TODO | Runtime Guild, Zastava Guild | Align Kubernetes/VM target coverage roadmap between Scanner and Zastava per comparison findings. | Joint roadmap doc approved; cross-guild tasks opened. |
| SCANNER-ENG-0015 | TODO | Export Center Guild, Scanner Guild | Document DSSE/Rekor operator enablement guidance and rollout levers surfaced in gap analysis. | Playbook drafted; Export Center backlog updated. |
| SCANNER-ENG-0002 | TODO | Scanner Guild, CLI Guild | Design Node.js lockfile collector/CLI validator per `docs/benchmarks/scanner/scanning-gaps-stella-misses-from-competitors.md`. | Capture Surface & policy requirements before implementation |
| SCANNER-ENG-0003 | TODO | Python Analyzer Guild, CLI Guild | Design Python lockfile/editable install parity checks per `docs/benchmarks/scanner/scanning-gaps-stella-misses-from-competitors.md`. | Include policy predicates & CLI story in design |
| SCANNER-ENG-0004 | TODO | Java Analyzer Guild, CLI Guild | Design Java lockfile ingestion & validation per `docs/benchmarks/scanner/scanning-gaps-stella-misses-from-competitors.md`. | Cover Gradle/SBT collectors, CLI verb, policy hooks |
| SCANNER-ENG-0005 | TODO | Go Analyzer Guild | Enhance Go stripped-binary fallback inference per `docs/benchmarks/scanner/scanning-gaps-stella-misses-from-competitors.md`. | Include inferred module metadata & policy integration |
| SCANNER-ENG-0006 | TODO | Rust Analyzer Guild | Expand Rust fingerprint coverage per `docs/benchmarks/scanner/scanning-gaps-stella-misses-from-competitors.md`. | Ship enriched fingerprint catalogue + policy controls |
| SCANNER-ENG-0007 | TODO | Scanner Guild, Policy Guild | Design deterministic secret leak detection pipeline per `docs/benchmarks/scanner/scanning-gaps-stella-misses-from-competitors.md`. | Include rule packaging, Policy Engine integration, CLI workflow |
| SCANNER-OPS-0001 | TODO | Ops Guild | Review runbooks/observability assets after next sprint demo. | Sync outcomes back to ../../TASKS.md |
| SCANNER-ENG-0001 | TODO | Module Team | Cross-check implementation plan milestones against ../../implplan/SPRINTS.md. | Update status via ./AGENTS.md workflow |
@@ -45,6 +45,8 @@ Authority issues short-lived tokens bound to tenants and scopes. Sprint 19 int
 | `policy:review` | Policy Studio review panes | Review drafts, leave comments, request changes. | Tenant required; pair with `policy:simulate` for diff previews. |
 | `policy:approve` | Policy Studio approvals | Approve or reject policy drafts. | Tenant required; fresh-auth enforced by Console UI. |
 | `policy:operate` | Policy Studio promotion controls | Trigger batch simulations, promotions, and canary runs. | Tenant required; combine with `policy:run`/`policy:activate`. |
+| `policy:publish` | Policy Studio / CLI attestation flows | Publish approved policy versions and generate signing bundles. | Interactive only; tenant required; tokens must include `policy_reason`, `policy_ticket`, and policy digest (fresh-auth enforced). |
+| `policy:promote` | Policy Studio / CLI attestation flows | Promote policy attestations between environments (e.g., staging → prod). | Interactive only; tenant required; requires `policy_reason`, `policy_ticket`, digest, and fresh-auth within 5 minutes. |
 | `policy:audit` | Policy audit exports | Access immutable policy history, comments, and signatures. | Tenant required; read-only access. |
 | `policy:simulate` | Policy Studio / CLI simulations | Run simulations against tenant inventories. | Tenant required; available to authors, reviewers, operators. |
 | `vuln:read` | Vuln Explorer API/UI | Read normalized vulnerability data. | Tenant required. |
@@ -89,7 +91,7 @@ Authority issues short-lived tokens bound to tenants and scopes. Sprint 19 int
 - **`role/policy-author`** → `policy:author`, `policy:read`, `policy:simulate`, `findings:read`.
 - **`role/policy-reviewer`** → `policy:review`, `policy:read`, `policy:simulate`, `findings:read`.
 - **`role/policy-approver`** → `policy:approve`, `policy:review`, `policy:read`, `policy:simulate`, `findings:read`.
-- **`role/policy-operator`** → `policy:operate`, `policy:run`, `policy:activate`, `policy:read`, `policy:simulate`, `findings:read`.
+- **`role/policy-operator`** → `policy:operate`, `policy:run`, `policy:activate`, `policy:publish`, `policy:promote`, `policy:read`, `policy:simulate`, `findings:read`.
 - **`role/policy-auditor`** → `policy:audit`, `policy:read`, `policy:simulate`, `findings:read`.
 - **`role/export-viewer`** *(Authority role: `Export.Viewer`)* → `export.viewer`.
 - **`role/export-operator`** *(Authority role: `Export.Operator`)* → `export.viewer`, `export.operator`.
@@ -130,7 +132,7 @@ tenants:
     policy-approver:
       scopes: [policy:approve, policy:review, policy:read, policy:simulate, findings:read]
     policy-operator:
-      scopes: [policy:operate, policy:run, policy:activate, policy:read, policy:simulate, findings:read]
+      scopes: [policy:operate, policy:run, policy:activate, policy:publish, policy:promote, policy:read, policy:simulate, findings:read]
     policy-auditor:
       scopes: [policy:audit, policy:read, policy:simulate, findings:read]
     pack-viewer:
@@ -403,7 +403,7 @@ tenants:
     policy-approver:
       scopes: [ "policy:approve", "policy:review", "policy:read", "policy:simulate", "findings:read" ]
     policy-operator:
-      scopes: [ "policy:operate", "policy:run", "policy:activate", "policy:read", "policy:simulate", "findings:read" ]
+      scopes: [ "policy:operate", "policy:run", "policy:activate", "policy:publish", "policy:promote", "policy:read", "policy:simulate", "findings:read" ]
     policy-auditor:
       scopes: [ "policy:audit", "policy:read", "policy:simulate", "findings:read" ]
     pack-viewer:
193
src/AdvisoryAI/StellaOps.AdvisoryAI.sln
Normal file
@@ -0,0 +1,193 @@
Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio Version 17
VisualStudioVersion = 17.0.31903.59
MinimumVisualStudioVersion = 10.0.40219.1
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.AdvisoryAI", "StellaOps.AdvisoryAI\StellaOps.AdvisoryAI.csproj", "{E41E2FDA-3827-4B18-8596-B25BDE882D5F}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "__Tests", "__Tests", "{56BCE1BF-7CBA-7CE8-203D-A88051F1D642}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.AdvisoryAI.Tests", "__Tests\StellaOps.AdvisoryAI.Tests\StellaOps.AdvisoryAI.Tests.csproj", "{F6860DE5-0C7C-4848-8356-7555E3C391A3}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Concelier.Testing", "..\Concelier\__Libraries\StellaOps.Concelier.Testing\StellaOps.Concelier.Testing.csproj", "{B53E4FED-8988-4354-8D1A-D3C618DBFD78}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Concelier.Connector.Common", "..\Concelier\__Libraries\StellaOps.Concelier.Connector.Common\StellaOps.Concelier.Connector.Common.csproj", "{E98A7C01-1619-41A0-A586-84EF9952F75D}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Concelier.Storage.Mongo", "..\Concelier\__Libraries\StellaOps.Concelier.Storage.Mongo\StellaOps.Concelier.Storage.Mongo.csproj", "{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Concelier.Core", "..\Concelier\__Libraries\StellaOps.Concelier.Core\StellaOps.Concelier.Core.csproj", "{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Concelier.Models", "..\Concelier\__Libraries\StellaOps.Concelier.Models\StellaOps.Concelier.Models.csproj", "{BBB5CD3C-866A-4298-ACE1-598413631CF5}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Concelier.RawModels", "..\Concelier\__Libraries\StellaOps.Concelier.RawModels\StellaOps.Concelier.RawModels.csproj", "{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Concelier.Normalization", "..\Concelier\__Libraries\StellaOps.Concelier.Normalization\StellaOps.Concelier.Normalization.csproj", "{1313202A-E8A8-41E3-80BC-472096074681}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Plugin", "..\__Libraries\StellaOps.Plugin\StellaOps.Plugin.csproj", "{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.DependencyInjection", "..\__Libraries\StellaOps.DependencyInjection\StellaOps.DependencyInjection.csproj", "{F567F20C-552F-4761-941A-0552CEF68160}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Aoc", "..\Aoc\__Libraries\StellaOps.Aoc\StellaOps.Aoc.csproj", "{C8CE71D3-952A-43F7-9346-20113E37F672}"
EndProject
Global
	GlobalSection(SolutionConfigurationPlatforms) = preSolution
		Debug|Any CPU = Debug|Any CPU
		Debug|x64 = Debug|x64
		Debug|x86 = Debug|x86
		Release|Any CPU = Release|Any CPU
		Release|x64 = Release|x64
		Release|x86 = Release|x86
	EndGlobalSection
	GlobalSection(ProjectConfigurationPlatforms) = postSolution
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Debug|x64.ActiveCfg = Debug|Any CPU
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Debug|x64.Build.0 = Debug|Any CPU
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Debug|x86.ActiveCfg = Debug|Any CPU
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Debug|x86.Build.0 = Debug|Any CPU
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Release|Any CPU.Build.0 = Release|Any CPU
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Release|x64.ActiveCfg = Release|Any CPU
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Release|x64.Build.0 = Release|Any CPU
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Release|x86.ActiveCfg = Release|Any CPU
		{E41E2FDA-3827-4B18-8596-B25BDE882D5F}.Release|x86.Build.0 = Release|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Debug|x64.ActiveCfg = Debug|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Debug|x64.Build.0 = Debug|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Debug|x86.ActiveCfg = Debug|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Debug|x86.Build.0 = Debug|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Release|Any CPU.Build.0 = Release|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Release|x64.ActiveCfg = Release|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Release|x64.Build.0 = Release|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Release|x86.ActiveCfg = Release|Any CPU
		{F6860DE5-0C7C-4848-8356-7555E3C391A3}.Release|x86.Build.0 = Release|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Debug|x64.ActiveCfg = Debug|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Debug|x64.Build.0 = Debug|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Debug|x86.ActiveCfg = Debug|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Debug|x86.Build.0 = Debug|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Release|Any CPU.Build.0 = Release|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Release|x64.ActiveCfg = Release|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Release|x64.Build.0 = Release|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Release|x86.ActiveCfg = Release|Any CPU
		{B53E4FED-8988-4354-8D1A-D3C618DBFD78}.Release|x86.Build.0 = Release|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Debug|x64.ActiveCfg = Debug|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Debug|x64.Build.0 = Debug|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Debug|x86.ActiveCfg = Debug|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Debug|x86.Build.0 = Debug|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Release|Any CPU.Build.0 = Release|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Release|x64.ActiveCfg = Release|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Release|x64.Build.0 = Release|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Release|x86.ActiveCfg = Release|Any CPU
		{E98A7C01-1619-41A0-A586-84EF9952F75D}.Release|x86.Build.0 = Release|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Debug|x64.ActiveCfg = Debug|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Debug|x64.Build.0 = Debug|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Debug|x86.ActiveCfg = Debug|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Debug|x86.Build.0 = Debug|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Release|Any CPU.Build.0 = Release|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Release|x64.ActiveCfg = Release|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Release|x64.Build.0 = Release|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Release|x86.ActiveCfg = Release|Any CPU
		{973DD52D-AD3C-4526-92CB-F35FDD9AEA10}.Release|x86.Build.0 = Release|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Debug|x64.ActiveCfg = Debug|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Debug|x64.Build.0 = Debug|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Debug|x86.ActiveCfg = Debug|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Debug|x86.Build.0 = Debug|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Release|Any CPU.Build.0 = Release|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Release|x64.ActiveCfg = Release|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Release|x64.Build.0 = Release|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Release|x86.ActiveCfg = Release|Any CPU
		{F7FB8ABD-31D7-4B4D-8B2A-F4D2B696ACAF}.Release|x86.Build.0 = Release|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Debug|x64.ActiveCfg = Debug|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Debug|x64.Build.0 = Debug|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Debug|x86.ActiveCfg = Debug|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Debug|x86.Build.0 = Debug|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Release|Any CPU.Build.0 = Release|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Release|x64.ActiveCfg = Release|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Release|x64.Build.0 = Release|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Release|x86.ActiveCfg = Release|Any CPU
		{BBB5CD3C-866A-4298-ACE1-598413631CF5}.Release|x86.Build.0 = Release|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Debug|x64.ActiveCfg = Debug|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Debug|x64.Build.0 = Debug|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Debug|x86.ActiveCfg = Debug|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Debug|x86.Build.0 = Debug|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Release|Any CPU.Build.0 = Release|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Release|x64.ActiveCfg = Release|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Release|x64.Build.0 = Release|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Release|x86.ActiveCfg = Release|Any CPU
		{7E3D9A33-BD0E-424A-88E6-F4440E386A3C}.Release|x86.Build.0 = Release|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Debug|x64.ActiveCfg = Debug|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Debug|x64.Build.0 = Debug|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Debug|x86.ActiveCfg = Debug|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Debug|x86.Build.0 = Debug|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Release|Any CPU.Build.0 = Release|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Release|x64.ActiveCfg = Release|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Release|x64.Build.0 = Release|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Release|x86.ActiveCfg = Release|Any CPU
		{1313202A-E8A8-41E3-80BC-472096074681}.Release|x86.Build.0 = Release|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Debug|x64.ActiveCfg = Debug|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Debug|x64.Build.0 = Debug|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Debug|x86.ActiveCfg = Debug|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Debug|x86.Build.0 = Debug|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Release|Any CPU.Build.0 = Release|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Release|x64.ActiveCfg = Release|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Release|x64.Build.0 = Release|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Release|x86.ActiveCfg = Release|Any CPU
		{1CC5F6F8-DF9A-4BCC-8C69-79E2DF604F6D}.Release|x86.Build.0 = Release|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Debug|x64.ActiveCfg = Debug|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Debug|x64.Build.0 = Debug|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Debug|x86.ActiveCfg = Debug|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Debug|x86.Build.0 = Debug|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Release|Any CPU.Build.0 = Release|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Release|x64.ActiveCfg = Release|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Release|x64.Build.0 = Release|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Release|x86.ActiveCfg = Release|Any CPU
		{F567F20C-552F-4761-941A-0552CEF68160}.Release|x86.Build.0 = Release|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Debug|x64.ActiveCfg = Debug|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Debug|x64.Build.0 = Debug|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Debug|x86.ActiveCfg = Debug|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Debug|x86.Build.0 = Debug|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Release|Any CPU.Build.0 = Release|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Release|x64.ActiveCfg = Release|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Release|x64.Build.0 = Release|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Release|x86.ActiveCfg = Release|Any CPU
		{C8CE71D3-952A-43F7-9346-20113E37F672}.Release|x86.Build.0 = Release|Any CPU
	EndGlobalSection
	GlobalSection(SolutionProperties) = preSolution
		HideSolutionNode = FALSE
	EndGlobalSection
	GlobalSection(NestedProjects) = preSolution
		{F6860DE5-0C7C-4848-8356-7555E3C391A3} = {56BCE1BF-7CBA-7CE8-203D-A88051F1D642}
	EndGlobalSection
EndGlobal
@@ -0,0 +1,10 @@
using System.Threading;
using System.Threading.Tasks;
using StellaOps.AdvisoryAI.Context;

namespace StellaOps.AdvisoryAI.Abstractions;

public interface ISbomContextRetriever
{
    Task<SbomContextResult> RetrieveAsync(SbomContextRequest request, CancellationToken cancellationToken);
}
@@ -0,0 +1,82 @@
using System;

namespace StellaOps.AdvisoryAI.Abstractions;

/// <summary>
/// Defines the inputs required to build SBOM-derived context for Advisory AI prompts.
/// </summary>
public sealed class SbomContextRequest
{
    /// <summary>
    /// Maximum number of version timeline entries we will ever request from the SBOM service.
    /// </summary>
    public const int TimelineLimitCeiling = 500;

    /// <summary>
    /// Maximum number of dependency paths we will ever request from the SBOM service.
    /// </summary>
    public const int DependencyPathLimitCeiling = 200;

    public SbomContextRequest(
        string artifactId,
        string? purl = null,
        int maxTimelineEntries = 50,
        int maxDependencyPaths = 25,
        bool includeEnvironmentFlags = true,
        bool includeBlastRadius = true)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(artifactId);

        ArtifactId = artifactId.Trim();
        Purl = string.IsNullOrWhiteSpace(purl) ? null : purl.Trim();
        MaxTimelineEntries = NormalizeLimit(maxTimelineEntries, TimelineLimitCeiling);
        MaxDependencyPaths = NormalizeLimit(maxDependencyPaths, DependencyPathLimitCeiling);
        IncludeEnvironmentFlags = includeEnvironmentFlags;
        IncludeBlastRadius = includeBlastRadius;
    }

    /// <summary>
    /// The advisory artifact identifier (e.g. internal scan ID).
    /// </summary>
    public string ArtifactId { get; }

    /// <summary>
    /// Optional package URL used to scope SBOM data.
    /// </summary>
    public string? Purl { get; }

    /// <summary>
    /// Maximum number of timeline entries that should be returned. Set to 0 to disable.
    /// </summary>
    public int MaxTimelineEntries { get; }

    /// <summary>
    /// Maximum number of dependency paths that should be returned. Set to 0 to disable.
    /// </summary>
    public int MaxDependencyPaths { get; }

    /// <summary>
    /// Whether environment feature flags (prod/staging/etc.) should be returned.
    /// </summary>
    public bool IncludeEnvironmentFlags { get; }

    /// <summary>
    /// Whether blast radius summaries should be returned.
    /// </summary>
    public bool IncludeBlastRadius { get; }

    private static int NormalizeLimit(int requested, int ceiling)
    {
        if (requested <= 0)
        {
            return 0;
        }

        if (requested > ceiling)
        {
            return ceiling;
        }

        return requested;
    }
}
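`NormalizeLimit` is the piece most likely to surprise callers; a small Python sketch of the same clamping rule makes the three cases explicit — a non-positive request disables the feature, an oversized request clamps to the ceiling, and anything in between passes through unchanged:

```python
# Mirrors SbomContextRequest.NormalizeLimit and its two ceilings.
TIMELINE_LIMIT_CEILING = 500
DEPENDENCY_PATH_LIMIT_CEILING = 200

def normalize_limit(requested: int, ceiling: int) -> int:
    """Clamp a requested entry count: <= 0 disables, values above the ceiling cap."""
    if requested <= 0:
        return 0
    return min(requested, ceiling)
```

So the constructor defaults (50 timeline entries, 25 dependency paths) are passed through untouched, while a caller asking for thousands of paths silently gets the ceiling.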
@@ -0,0 +1,278 @@
using System.Collections.Immutable;
using System.Text.Json;
using StellaOps.AdvisoryAI.Documents;

namespace StellaOps.AdvisoryAI.Chunking;

internal sealed class CsafDocumentChunker : IDocumentChunker
{
    private static readonly ImmutableHashSet<string> SupportedNoteCategories =
        ImmutableHashSet.Create(StringComparer.OrdinalIgnoreCase, "summary", "description", "remediation");

    public bool CanHandle(DocumentFormat format) => format == DocumentFormat.Csaf;

    public IEnumerable<AdvisoryChunk> Chunk(AdvisoryDocument document)
    {
        ArgumentNullException.ThrowIfNull(document);

        using var jsonDocument = JsonDocument.Parse(document.Content);
        var root = jsonDocument.RootElement;

        var chunkIndex = 0;
        var sectionStack = new Stack<string>();
        sectionStack.Push("document");

        string NextChunkId() => $"{document.DocumentId}:{++chunkIndex:D4}";

        foreach (var chunk in ExtractDocumentNotes(document, root, NextChunkId))
        {
            yield return chunk;
        }

        foreach (var chunk in ExtractVulnerabilities(document, root, NextChunkId))
        {
            yield return chunk;
        }
    }

    private static IEnumerable<AdvisoryChunk> ExtractDocumentNotes(
        AdvisoryDocument document,
        JsonElement root,
        Func<string> chunkIdFactory)
    {
        if (!root.TryGetProperty("document", out var documentNode))
        {
            yield break;
        }

        if (!documentNode.TryGetProperty("notes", out var notesNode) || notesNode.ValueKind != JsonValueKind.Array)
        {
            yield break;
        }

        var index = 0;
        foreach (var note in notesNode.EnumerateArray())
        {
            index++;
            if (note.ValueKind != JsonValueKind.Object)
            {
                continue;
            }

            var text = note.GetPropertyOrDefault("text");
            if (string.IsNullOrWhiteSpace(text))
            {
                continue;
            }

            var category = note.GetPropertyOrDefault("category");
            if (category.Length > 0 && !SupportedNoteCategories.Contains(category))
            {
                continue;
            }

            var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
            {
                ["format"] = "csaf",
                ["section"] = "document.notes",
                ["index"] = index.ToString(),
            };

            if (category.Length > 0)
            {
                metadata["category"] = category;
            }

            yield return AdvisoryChunk.Create(
                document.DocumentId,
                chunkIdFactory(),
                section: "document.notes",
                paragraphId: $"document.notes[{index}]",
                text: text.Trim(),
                metadata: metadata);
        }
    }

    private static IEnumerable<AdvisoryChunk> ExtractVulnerabilities(
        AdvisoryDocument document,
        JsonElement root,
        Func<string> chunkIdFactory)
    {
        if (!root.TryGetProperty("vulnerabilities", out var vulnerabilitiesNode) ||
            vulnerabilitiesNode.ValueKind != JsonValueKind.Array)
        {
            yield break;
        }

        var vulnIndex = 0;
        foreach (var vulnerability in vulnerabilitiesNode.EnumerateArray())
        {
            vulnIndex++;
            if (vulnerability.ValueKind != JsonValueKind.Object)
            {
                continue;
            }

            var vulnerabilityId = vulnerability.GetPropertyOrDefault("id", fallback: vulnIndex.ToString());

            foreach (var chunk in ExtractVulnerabilityTitle(document, vulnerability, vulnerabilityId, chunkIdFactory))
            {
                yield return chunk;
            }

            foreach (var chunk in ExtractVulnerabilityNotes(document, vulnerability, vulnerabilityId, chunkIdFactory))
            {
                yield return chunk;
            }

            foreach (var chunk in ExtractRemediations(document, vulnerability, vulnerabilityId, chunkIdFactory))
            {
                yield return chunk;
            }
        }
    }

    private static IEnumerable<AdvisoryChunk> ExtractVulnerabilityTitle(
        AdvisoryDocument document,
        JsonElement vulnerability,
        string vulnerabilityId,
        Func<string> chunkIdFactory)
    {
        var title = vulnerability.GetPropertyOrDefault("title");
        var description = vulnerability.GetPropertyOrDefault("description");

        if (!string.IsNullOrWhiteSpace(title))
        {
            yield return CreateChunk(document, chunkIdFactory(), "vulnerabilities.title", vulnerabilityId, title!);
        }

        if (!string.IsNullOrWhiteSpace(description))
        {
            yield return CreateChunk(document, chunkIdFactory(), "vulnerabilities.description", vulnerabilityId, description!);
        }
    }

    private static IEnumerable<AdvisoryChunk> ExtractVulnerabilityNotes(
        AdvisoryDocument document,
        JsonElement vulnerability,
        string vulnerabilityId,
        Func<string> chunkIdFactory)
    {
        if (!vulnerability.TryGetProperty("notes", out var notes) || notes.ValueKind != JsonValueKind.Array)
        {
            yield break;
        }

        var noteIndex = 0;
        foreach (var note in notes.EnumerateArray())
        {
            noteIndex++;
            if (note.ValueKind != JsonValueKind.Object)
            {
                continue;
            }

            var text = note.GetPropertyOrDefault("text");
            if (string.IsNullOrWhiteSpace(text))
            {
                continue;
            }

            yield return CreateChunk(
                document,
                chunkIdFactory(),
                "vulnerabilities.notes",
                vulnerabilityId,
                text!,
                additionalMetadata: new Dictionary<string, string>
                {
                    ["noteIndex"] = noteIndex.ToString(),
                });
        }
    }

    private static IEnumerable<AdvisoryChunk> ExtractRemediations(
        AdvisoryDocument document,
        JsonElement vulnerability,
        string vulnerabilityId,
        Func<string> chunkIdFactory)
    {
        if (!vulnerability.TryGetProperty("remediations", out var remediations) || remediations.ValueKind != JsonValueKind.Array)
        {
            yield break;
        }

        var remediationIndex = 0;
        foreach (var remediation in remediations.EnumerateArray())
        {
            remediationIndex++;
|
||||
if (remediation.ValueKind != JsonValueKind.Object)
|
||||
{
|
||||
continue;
|
||||
}
|
||||
|
||||
var details = remediation.GetPropertyOrDefault("details");
|
||||
if (string.IsNullOrWhiteSpace(details))
|
||||
{
|
||||
continue;
|
||||
}
|
||||
|
||||
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
|
||||
{
|
||||
["remediationIndex"] = remediationIndex.ToString(),
|
||||
};
|
||||
|
||||
var type = remediation.GetPropertyOrDefault("category");
|
||||
if (!string.IsNullOrWhiteSpace(type))
|
||||
{
|
||||
metadata["category"] = type!;
|
||||
}
|
||||
|
||||
var productIds = remediation.GetPropertyOrDefault("product_ids");
|
||||
if (!string.IsNullOrWhiteSpace(productIds))
|
||||
{
|
||||
metadata["product_ids"] = productIds!;
|
||||
}
|
||||
|
||||
yield return CreateChunk(
|
||||
document,
|
||||
chunkIdFactory(),
|
||||
"vulnerabilities.remediations",
|
||||
vulnerabilityId,
|
||||
details!,
|
||||
metadata);
|
||||
}
|
||||
}
|
||||
|
||||
private static AdvisoryChunk CreateChunk(
|
||||
AdvisoryDocument document,
|
||||
string chunkId,
|
||||
string section,
|
||||
string paragraph,
|
||||
string text,
|
||||
IReadOnlyDictionary<string, string>? additionalMetadata = null)
|
||||
{
|
||||
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
|
||||
{
|
||||
["format"] = "csaf",
|
||||
["section"] = section,
|
||||
["paragraph"] = paragraph,
|
||||
};
|
||||
|
||||
if (additionalMetadata is not null)
|
||||
{
|
||||
foreach (var pair in additionalMetadata)
|
||||
{
|
||||
metadata[pair.Key] = pair.Value;
|
||||
}
|
||||
}
|
||||
|
||||
return AdvisoryChunk.Create(
|
||||
document.DocumentId,
|
||||
chunkId,
|
||||
section,
|
||||
paragraph,
|
||||
text.Trim(),
|
||||
metadata);
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,41 @@
using System.Linq;
using System.Text.Json;

namespace StellaOps.AdvisoryAI.Chunking;

internal static class JsonElementExtensions
{
    public static string GetPropertyOrDefault(this JsonElement element, string propertyName, string? fallback = "")
    {
        if (element.ValueKind != JsonValueKind.Object)
        {
            return fallback ?? string.Empty;
        }

        if (element.TryGetProperty(propertyName, out var property))
        {
            return property.ValueKind switch
            {
                JsonValueKind.String => property.GetString() ?? string.Empty,
                JsonValueKind.Array => string.Join(",", property.EnumerateArray().Select(ToStringValue).Where(static v => !string.IsNullOrWhiteSpace(v))),
                JsonValueKind.Object => property.ToString(),
                JsonValueKind.Number => property.ToString(),
                JsonValueKind.True => "true",
                JsonValueKind.False => "false",
                _ => string.Empty,
            };
        }

        return fallback ?? string.Empty;
    }

    private static string? ToStringValue(JsonElement element)
        => element.ValueKind switch
        {
            JsonValueKind.String => element.GetString(),
            JsonValueKind.Number => element.ToString(),
            JsonValueKind.True => "true",
            JsonValueKind.False => "false",
            _ => null,
        };
}
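The non-obvious part of `GetPropertyOrDefault` is the value coercion: arrays flatten to a comma-joined string and numbers keep their raw JSON text. A minimal standalone sketch of those branches (re-implemented here for illustration, since the real extension is `internal`; the sample JSON keys `aliases` and `score` are assumed):

```csharp
using System.Linq;
using System.Text.Json;

// Mirrors the Array/Number branches of GetPropertyOrDefault above.
static string Stringify(JsonElement value) => value.ValueKind switch
{
    JsonValueKind.String => value.GetString() ?? string.Empty,
    JsonValueKind.Array => string.Join(",", value.EnumerateArray()
        .Select(e => e.ValueKind == JsonValueKind.String ? (e.GetString() ?? string.Empty) : e.ToString())
        .Where(v => !string.IsNullOrWhiteSpace(v))),
    JsonValueKind.Number => value.ToString(),
    JsonValueKind.True => "true",
    JsonValueKind.False => "false",
    _ => string.Empty,
};

using var doc = JsonDocument.Parse("{\"aliases\":[\"CVE-2025-0001\",\"GHSA-xxxx\"],\"score\":7.5}");
var aliases = Stringify(doc.RootElement.GetProperty("aliases"));
var score = Stringify(doc.RootElement.GetProperty("score"));
Console.WriteLine(aliases); // arrays flatten to a comma-joined string
Console.WriteLine(score);   // numbers keep their raw JSON text
```

This is why callers downstream (e.g. the OSV chunker's `product_ids` and `versions` lookups) can treat array-valued properties as plain strings.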
@@ -0,0 +1,86 @@
using System.Text.RegularExpressions;
using StellaOps.AdvisoryAI.Documents;

namespace StellaOps.AdvisoryAI.Chunking;

internal sealed class MarkdownDocumentChunker : IDocumentChunker
{
    private static readonly Regex HeadingRegex = new("^(#+)\\s+(?<title>.+)$", RegexOptions.Compiled | RegexOptions.CultureInvariant);

    public bool CanHandle(DocumentFormat format) => format == DocumentFormat.Markdown;

    public IEnumerable<AdvisoryChunk> Chunk(AdvisoryDocument document)
    {
        var lines = document.Content.Replace("\r\n", "\n", StringComparison.Ordinal).Split('\n');
        var section = "body";
        var paragraphId = 0;
        var chunkIndex = 0;
        var buffer = new List<string>();

        IEnumerable<AdvisoryChunk> FlushBuffer()
        {
            if (buffer.Count == 0)
            {
                yield break;
            }

            var text = string.Join("\n", buffer).Trim();
            buffer.Clear();
            if (string.IsNullOrWhiteSpace(text))
            {
                yield break;
            }

            paragraphId++;
            var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
            {
                ["format"] = "markdown",
                ["section"] = section,
                ["paragraph"] = paragraphId.ToString(),
            };

            yield return AdvisoryChunk.Create(
                document.DocumentId,
                chunkId: $"{document.DocumentId}:{++chunkIndex:D4}",
                section,
                paragraphId: $"{section}#{paragraphId}",
                text,
                metadata);
        }

        foreach (var rawLine in lines)
        {
            var line = rawLine.TrimEnd();
            if (line.Length == 0)
            {
                foreach (var chunk in FlushBuffer())
                {
                    yield return chunk;
                }

                continue;
            }

            var headingMatch = HeadingRegex.Match(line);
            if (headingMatch.Success)
            {
                foreach (var chunk in FlushBuffer())
                {
                    yield return chunk;
                }

                var level = headingMatch.Groups[1].Value.Length;
                var title = headingMatch.Groups["title"].Value.Trim();
                section = level == 1 ? title : $"{section}/{title}";
                paragraphId = 0;
                continue;
            }

            buffer.Add(line);
        }

        foreach (var chunk in FlushBuffer())
        {
            yield return chunk;
        }
    }
}
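The section labels the Markdown chunker emits come from the `level == 1 ? title : $"{section}/{title}"` bookkeeping. A standalone trace of that one expression against an assumed three-heading sample (the sample document is hypothetical):

```csharp
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Traces only the heading/section bookkeeping from MarkdownDocumentChunker.
var headingRegex = new Regex("^(#+)\\s+(?<title>.+)$");
var section = "body";
var trail = new List<string>();
foreach (var line in new[] { "# Advisory", "## Impact", "## Remediation" })
{
    var match = headingRegex.Match(line);
    var level = match.Groups[1].Value.Length;
    var title = match.Groups["title"].Value.Trim();
    section = level == 1 ? title : $"{section}/{title}";
    trail.Add(section);
}

Console.WriteLine(string.Join(" | ", trail));
```

Note the behavior this exposes: level-1 headings reset the section, but every deeper heading appends to the current section, so sibling `##` headings accumulate (`Advisory/Impact/Remediation`) rather than replacing each other.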
@@ -0,0 +1,199 @@
using System.Text;
using System.Text.Json;
using StellaOps.AdvisoryAI.Documents;

namespace StellaOps.AdvisoryAI.Chunking;

internal sealed class OpenVexDocumentChunker : IDocumentChunker
{
    public bool CanHandle(DocumentFormat format) => format == DocumentFormat.OpenVex;

    public IEnumerable<AdvisoryChunk> Chunk(AdvisoryDocument document)
    {
        ArgumentNullException.ThrowIfNull(document);

        using var jsonDocument = JsonDocument.Parse(document.Content);
        var root = jsonDocument.RootElement;

        if (!root.TryGetProperty("statements", out var statements) || statements.ValueKind != JsonValueKind.Array)
        {
            yield break;
        }

        var index = 0;
        foreach (var statement in statements.EnumerateArray())
        {
            index++;
            if (statement.ValueKind != JsonValueKind.Object)
            {
                continue;
            }

            var vulnerabilityId = statement.GetPropertyOrDefault("vulnerability", fallback: string.Empty);
            var status = statement.GetPropertyOrDefault("status", fallback: string.Empty);
            var justification = statement.GetPropertyOrDefault("justification", fallback: string.Empty);
            var impact = statement.GetPropertyOrDefault("impact_statement", fallback: string.Empty);
            var notes = statement.GetPropertyOrDefault("status_notes", fallback: string.Empty);
            var timestamp = statement.GetPropertyOrDefault("timestamp", fallback: string.Empty);
            var lastUpdated = statement.GetPropertyOrDefault("last_updated", fallback: string.Empty);

            var products = ExtractProducts(statement);
            var section = "vex.statements";
            var paragraphId = $"statements[{index}]";
            var chunkId = $"{document.DocumentId}:{index:D4}";

            var text = BuildStatementSummary(vulnerabilityId, status, justification, impact, notes, products);

            var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
            {
                ["format"] = "openvex",
                ["section"] = section,
            };

            if (!string.IsNullOrWhiteSpace(vulnerabilityId))
            {
                metadata["vulnerability"] = vulnerabilityId;
            }

            if (!string.IsNullOrWhiteSpace(status))
            {
                metadata["status"] = status;
            }

            if (!string.IsNullOrWhiteSpace(justification))
            {
                metadata["justification"] = justification;
            }

            if (!string.IsNullOrWhiteSpace(impact))
            {
                metadata["impact_statement"] = impact;
            }

            if (!string.IsNullOrWhiteSpace(notes))
            {
                metadata["status_notes"] = notes;
            }

            if (products.Count > 0)
            {
                metadata["products"] = string.Join(",", products);
            }

            if (!string.IsNullOrWhiteSpace(timestamp))
            {
                metadata["timestamp"] = timestamp;
            }

            if (!string.IsNullOrWhiteSpace(lastUpdated))
            {
                metadata["last_updated"] = lastUpdated;
            }

            yield return AdvisoryChunk.Create(
                document.DocumentId,
                chunkId,
                section,
                paragraphId,
                text,
                metadata);
        }
    }

    private static List<string> ExtractProducts(JsonElement statement)
    {
        if (!statement.TryGetProperty("products", out var productsElement) ||
            productsElement.ValueKind != JsonValueKind.Array)
        {
            return new List<string>();
        }

        var results = new List<string>();
        foreach (var product in productsElement.EnumerateArray())
        {
            switch (product.ValueKind)
            {
                case JsonValueKind.String:
                    var value = product.GetString();
                    if (!string.IsNullOrWhiteSpace(value))
                    {
                        results.Add(value.Trim());
                    }

                    break;
                case JsonValueKind.Object:
                    var productId = product.GetPropertyOrDefault("product_id", fallback: string.Empty);
                    if (!string.IsNullOrWhiteSpace(productId))
                    {
                        results.Add(productId);
                        break;
                    }

                    var name = product.GetPropertyOrDefault("name", fallback: string.Empty);
                    if (!string.IsNullOrWhiteSpace(name))
                    {
                        results.Add(name);
                    }

                    break;
                default:
                    continue;
            }
        }

        return results;
    }

    private static string BuildStatementSummary(
        string vulnerabilityId,
        string status,
        string justification,
        string impact,
        string notes,
        IReadOnlyList<string> products)
    {
        var builder = new StringBuilder();

        if (!string.IsNullOrWhiteSpace(vulnerabilityId))
        {
            builder.Append(vulnerabilityId.Trim());
        }
        else
        {
            builder.Append("Unknown vulnerability");
        }

        if (products.Count > 0)
        {
            builder.Append(" affects ");
            builder.Append(string.Join(", ", products));
        }

        if (!string.IsNullOrWhiteSpace(status))
        {
            builder.Append(" → status: ");
            builder.Append(status.Trim());
        }

        if (!string.IsNullOrWhiteSpace(justification))
        {
            builder.Append(" (justification: ");
            builder.Append(justification.Trim());
            builder.Append(')');
        }

        if (!string.IsNullOrWhiteSpace(impact))
        {
            builder.Append(". Impact: ");
            builder.Append(impact.Trim());
        }

        if (!string.IsNullOrWhiteSpace(notes))
        {
            builder.Append(". Notes: ");
            builder.Append(notes.Trim());
        }

        return builder.ToString();
    }
}
@@ -0,0 +1,138 @@
using System.Text.Json;
using StellaOps.AdvisoryAI.Documents;

namespace StellaOps.AdvisoryAI.Chunking;

internal sealed class OsvDocumentChunker : IDocumentChunker
{
    public bool CanHandle(DocumentFormat format) => format == DocumentFormat.Osv;

    public IEnumerable<AdvisoryChunk> Chunk(AdvisoryDocument document)
    {
        using var jsonDocument = JsonDocument.Parse(document.Content);
        var root = jsonDocument.RootElement;

        var chunkIndex = 0;
        string NextChunkId() => $"{document.DocumentId}:{++chunkIndex:D4}";

        string summary = root.GetPropertyOrDefault("summary");
        if (!string.IsNullOrWhiteSpace(summary))
        {
            yield return CreateChunk(document, NextChunkId(), "summary", "summary", summary!);
        }

        string details = root.GetPropertyOrDefault("details");
        if (!string.IsNullOrWhiteSpace(details))
        {
            yield return CreateChunk(document, NextChunkId(), "details", "details", details!);
        }

        if (root.TryGetProperty("affected", out var affectedNode) && affectedNode.ValueKind == JsonValueKind.Array)
        {
            var affectedIndex = 0;
            foreach (var affected in affectedNode.EnumerateArray())
            {
                affectedIndex++;
                if (affected.ValueKind != JsonValueKind.Object)
                {
                    continue;
                }

                var packageName = string.Empty;
                var ecosystem = string.Empty;
                if (affected.TryGetProperty("package", out var package))
                {
                    packageName = package.GetPropertyOrDefault("name", string.Empty);
                    ecosystem = package.GetPropertyOrDefault("ecosystem", string.Empty);
                }

                var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
                {
                    ["package"] = packageName,
                    ["ecosystem"] = ecosystem,
                };

                if (affected.TryGetProperty("ranges", out var ranges) && ranges.ValueKind == JsonValueKind.Array)
                {
                    var rangeIndex = 0;
                    foreach (var range in ranges.EnumerateArray())
                    {
                        rangeIndex++;
                        var events = range.GetPropertyOrDefault("events");
                        if (string.IsNullOrWhiteSpace(events))
                        {
                            continue;
                        }

                        var rangeMetadata = new Dictionary<string, string>(metadata)
                        {
                            ["range.type"] = range.GetPropertyOrDefault("type", fallback: ""),
                            ["range.index"] = rangeIndex.ToString(),
                        };

                        yield return CreateChunk(
                            document,
                            NextChunkId(),
                            "affected.ranges",
                            $"affected[{affectedIndex}]",
                            events!,
                            metadata: rangeMetadata);
                    }
                }

                var versions = affected.GetPropertyOrDefault("versions");
                if (!string.IsNullOrWhiteSpace(versions))
                {
                    var versionMetadata = new Dictionary<string, string>(metadata)
                    {
                        ["section"] = "affected.versions",
                    };

                    yield return CreateChunk(
                        document,
                        NextChunkId(),
                        "affected.versions",
                        $"affected[{affectedIndex}]",
                        versions!,
                        metadata: versionMetadata);
                }
            }
        }

        var references = root.GetPropertyOrDefault("references");
        if (!string.IsNullOrWhiteSpace(references))
        {
            yield return CreateChunk(document, NextChunkId(), "references", "references", references!);
        }
    }

    private static AdvisoryChunk CreateChunk(
        AdvisoryDocument document,
        string chunkId,
        string section,
        string paragraph,
        string text,
        IReadOnlyDictionary<string, string>? metadata = null)
    {
        var meta = new Dictionary<string, string>(StringComparer.Ordinal)
        {
            ["format"] = "osv",
            ["section"] = section,
            ["paragraph"] = paragraph,
        };

        if (metadata is not null)
        {
            foreach (var pair in metadata)
            {
                meta[pair.Key] = pair.Value;
            }
        }

        return AdvisoryChunk.Create(
            document.DocumentId,
            chunkId,
            section,
            paragraph,
            text.Trim(),
            meta);
    }
}
src/AdvisoryAI/StellaOps.AdvisoryAI/Context/SbomContextResult.cs (new file, 189 lines)
@@ -0,0 +1,189 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;

namespace StellaOps.AdvisoryAI.Context;

/// <summary>
/// Represents SBOM-derived context that Advisory AI can hydrate into prompts.
/// </summary>
public sealed class SbomContextResult
{
    private SbomContextResult(
        string artifactId,
        string? purl,
        ImmutableArray<SbomVersionTimelineEntry> versionTimeline,
        ImmutableArray<SbomDependencyPath> dependencyPaths,
        ImmutableDictionary<string, string> environmentFlags,
        SbomBlastRadiusSummary? blastRadius,
        ImmutableDictionary<string, string> metadata)
    {
        ArtifactId = artifactId;
        Purl = purl;
        VersionTimeline = versionTimeline;
        DependencyPaths = dependencyPaths;
        EnvironmentFlags = environmentFlags;
        BlastRadius = blastRadius;
        Metadata = metadata;
    }

    public string ArtifactId { get; }

    public string? Purl { get; }

    public ImmutableArray<SbomVersionTimelineEntry> VersionTimeline { get; }

    public ImmutableArray<SbomDependencyPath> DependencyPaths { get; }

    public ImmutableDictionary<string, string> EnvironmentFlags { get; }

    public SbomBlastRadiusSummary? BlastRadius { get; }

    public ImmutableDictionary<string, string> Metadata { get; }

    public static SbomContextResult Create(
        string artifactId,
        string? purl,
        IEnumerable<SbomVersionTimelineEntry> versionTimeline,
        IEnumerable<SbomDependencyPath> dependencyPaths,
        IReadOnlyDictionary<string, string>? environmentFlags = null,
        SbomBlastRadiusSummary? blastRadius = null,
        IReadOnlyDictionary<string, string>? metadata = null)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(artifactId);
        ArgumentNullException.ThrowIfNull(versionTimeline);
        ArgumentNullException.ThrowIfNull(dependencyPaths);

        var timeline = versionTimeline.ToImmutableArray();
        var paths = dependencyPaths.ToImmutableArray();
        var flags = environmentFlags is null
            ? ImmutableDictionary<string, string>.Empty
            : environmentFlags.ToImmutableDictionary(StringComparer.Ordinal);
        var meta = metadata is null
            ? ImmutableDictionary<string, string>.Empty
            : metadata.ToImmutableDictionary(StringComparer.Ordinal);

        return new SbomContextResult(
            artifactId.Trim(),
            string.IsNullOrWhiteSpace(purl) ? null : purl.Trim(),
            timeline,
            paths,
            flags,
            blastRadius,
            meta);
    }

    public static SbomContextResult Empty(string artifactId, string? purl = null)
        => Create(artifactId, purl, Array.Empty<SbomVersionTimelineEntry>(), Array.Empty<SbomDependencyPath>());
}

public sealed class SbomVersionTimelineEntry
{
    public SbomVersionTimelineEntry(
        string version,
        DateTimeOffset firstObserved,
        DateTimeOffset? lastObserved,
        string status,
        string source)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(version);
        ArgumentException.ThrowIfNullOrWhiteSpace(status);
        ArgumentException.ThrowIfNullOrWhiteSpace(source);

        Version = version.Trim();
        FirstObserved = firstObserved;
        LastObserved = lastObserved;
        Status = status.Trim();
        Source = source.Trim();
    }

    public string Version { get; }

    public DateTimeOffset FirstObserved { get; }

    public DateTimeOffset? LastObserved { get; }

    public string Status { get; }

    public string Source { get; }
}

public sealed class SbomDependencyPath
{
    public SbomDependencyPath(
        IEnumerable<SbomDependencyNode> nodes,
        bool isRuntime,
        string? source = null,
        IReadOnlyDictionary<string, string>? metadata = null)
    {
        ArgumentNullException.ThrowIfNull(nodes);

        var immutableNodes = nodes.ToImmutableArray();
        if (immutableNodes.IsDefaultOrEmpty)
        {
            throw new ArgumentException("At least one node must be supplied.", nameof(nodes));
        }

        Nodes = immutableNodes;
        IsRuntime = isRuntime;
        Source = string.IsNullOrWhiteSpace(source) ? null : source.Trim();
        Metadata = metadata is null
            ? ImmutableDictionary<string, string>.Empty
            : metadata.ToImmutableDictionary(StringComparer.Ordinal);
    }

    public ImmutableArray<SbomDependencyNode> Nodes { get; }

    public bool IsRuntime { get; }

    public string? Source { get; }

    public ImmutableDictionary<string, string> Metadata { get; }
}

public sealed class SbomDependencyNode
{
    public SbomDependencyNode(string identifier, string? version)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(identifier);

        Identifier = identifier.Trim();
        Version = string.IsNullOrWhiteSpace(version) ? null : version.Trim();
    }

    public string Identifier { get; }

    public string? Version { get; }
}

public sealed class SbomBlastRadiusSummary
{
    public SbomBlastRadiusSummary(
        int impactedAssets,
        int impactedWorkloads,
        int impactedNamespaces,
        double? impactedPercentage,
        IReadOnlyDictionary<string, string>? metadata = null)
    {
        ImpactedAssets = Math.Max(0, impactedAssets);
        ImpactedWorkloads = Math.Max(0, impactedWorkloads);
        ImpactedNamespaces = Math.Max(0, impactedNamespaces);
        ImpactedPercentage = impactedPercentage.HasValue
            ? Math.Max(0, impactedPercentage.Value)
            : null;
        Metadata = metadata is null
            ? ImmutableDictionary<string, string>.Empty
            : metadata.ToImmutableDictionary(StringComparer.Ordinal);
    }

    public int ImpactedAssets { get; }

    public int ImpactedWorkloads { get; }

    public int ImpactedNamespaces { get; }

    public double? ImpactedPercentage { get; }

    public ImmutableDictionary<string, string> Metadata { get; }
}
@@ -6,4 +6,5 @@ public enum DocumentFormat
    Csaf,
    Osv,
    Markdown,
    OpenVex,
}
@@ -0,0 +1,24 @@
using System.Globalization;

namespace StellaOps.AdvisoryAI.Documents;

internal static class DocumentFormatMapper
{
    public static DocumentFormat Map(string? format)
    {
        if (string.IsNullOrWhiteSpace(format))
        {
            return DocumentFormat.Unknown;
        }

        var normalized = format.Trim().ToLowerInvariant();
        return normalized switch
        {
            "csaf" or "csaf-json" or "csaf_json" or "csaf/v2" => DocumentFormat.Csaf,
            "osv" => DocumentFormat.Osv,
            "markdown" or "md" => DocumentFormat.Markdown,
            "openvex" or "open-vex" or "vex" => DocumentFormat.OpenVex,
            _ => DocumentFormat.Unknown,
        };
    }
}
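Both document providers below gate ingestion on this mapping, so its normalization rules matter: trim, lower-case, then match a fixed alias set, with everything else falling through to `Unknown` (and being skipped with a warning). A standalone mirror of that logic, using strings in place of the `DocumentFormat` enum so the sketch runs on its own:

```csharp
// Standalone mirror of DocumentFormatMapper.Map (strings stand in for the enum).
static string Map(string? format)
{
    if (string.IsNullOrWhiteSpace(format))
    {
        return "Unknown";
    }

    return format.Trim().ToLowerInvariant() switch
    {
        "csaf" or "csaf-json" or "csaf_json" or "csaf/v2" => "Csaf",
        "osv" => "Osv",
        "markdown" or "md" => "Markdown",
        "openvex" or "open-vex" or "vex" => "OpenVex",
        _ => "Unknown",
    };
}

var a = Map(" CSAF-JSON "); // surrounding whitespace and case are normalized away
var b = Map("md");          // short aliases resolve to the canonical format
var c = Map("spdx");        // unrecognized formats fall through to Unknown
Console.WriteLine($"{a} {b} {c}");
```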
@@ -0,0 +1,3 @@
using System.Runtime.CompilerServices;

[assembly: InternalsVisibleTo("StellaOps.AdvisoryAI.Tests")]
@@ -0,0 +1,124 @@
using System.Collections.Immutable;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.AdvisoryAI.Abstractions;
using StellaOps.AdvisoryAI.Documents;
using StellaOps.Concelier.Core.Raw;

namespace StellaOps.AdvisoryAI.Providers;

public sealed class ConcelierAdvisoryDocumentProviderOptions
{
    public string Tenant { get; set; } = string.Empty;

    public ImmutableArray<string> Vendors { get; set; } = ImmutableArray<string>.Empty;

    public int MaxDocuments { get; set; } = 25;
}

internal sealed class ConcelierAdvisoryDocumentProvider : IAdvisoryDocumentProvider
{
    private readonly IAdvisoryRawService _rawService;
    private readonly ConcelierAdvisoryDocumentProviderOptions _options;
    private readonly ILogger<ConcelierAdvisoryDocumentProvider>? _logger;

    public ConcelierAdvisoryDocumentProvider(
        IAdvisoryRawService rawService,
        IOptions<ConcelierAdvisoryDocumentProviderOptions> options,
        ILogger<ConcelierAdvisoryDocumentProvider>? logger = null)
    {
        _rawService = rawService ?? throw new ArgumentNullException(nameof(rawService));
        _options = options?.Value ?? throw new ArgumentNullException(nameof(options));
        _logger = logger;

        if (string.IsNullOrWhiteSpace(_options.Tenant))
        {
            throw new ArgumentException("Tenant must be configured.", nameof(options));
        }

        if (_options.MaxDocuments <= 0)
        {
            throw new ArgumentOutOfRangeException(nameof(options), "MaxDocuments must be positive.");
        }
    }

    public async Task<IReadOnlyList<AdvisoryDocument>> GetDocumentsAsync(string advisoryKey, CancellationToken cancellationToken)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(advisoryKey);

        var options = new AdvisoryRawQueryOptions(_options.Tenant)
        {
            Aliases = ImmutableArray.Create(advisoryKey),
            UpstreamIds = ImmutableArray.Create(advisoryKey),
            Vendors = _options.Vendors,
            Limit = _options.MaxDocuments
        };

        var result = await _rawService.QueryAsync(options, cancellationToken).ConfigureAwait(false);
        if (result.Records.Count == 0)
        {
            _logger?.LogDebug("No advisory raw records returned for key {AdvisoryKey}", advisoryKey);
            return Array.Empty<AdvisoryDocument>();
        }

        var documents = new List<AdvisoryDocument>(result.Records.Count);
        foreach (var record in result.Records)
        {
            var raw = record.Document.Content;
            var format = DocumentFormatMapper.Map(raw.Format);
            if (format == DocumentFormat.Unknown)
            {
                _logger?.LogWarning("Unsupported advisory content format {Format} for advisory {AdvisoryKey}", raw.Format, advisoryKey);
                continue;
            }

            var documentId = DetermineDocumentId(record);
            var metadata = BuildMetadata(record);
            var json = record.Document.Content.Raw.GetRawText();

            documents.Add(AdvisoryDocument.Create(documentId, format, record.Document.Source.Vendor, json, metadata));
        }

        return documents;
    }

    private static string DetermineDocumentId(AdvisoryRawRecord record)
    {
        if (!string.IsNullOrWhiteSpace(record.Document.Upstream.UpstreamId))
        {
            return record.Document.Upstream.UpstreamId;
        }

        var primary = record.Document.Identifiers.PrimaryId;
        if (!string.IsNullOrWhiteSpace(primary))
        {
            return primary;
        }

        return record.Id;
    }

    private static IReadOnlyDictionary<string, string> BuildMetadata(AdvisoryRawRecord record)
    {
        var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
        {
            ["tenant"] = record.Document.Tenant,
            ["vendor"] = record.Document.Source.Vendor,
            ["connector"] = record.Document.Source.Connector,
            ["content_hash"] = record.Document.Upstream.ContentHash,
            ["ingested_at"] = record.IngestedAt.UtcDateTime.ToString("O"),
        };

        if (!string.IsNullOrWhiteSpace(record.Document.Source.Stream))
        {
            metadata["stream"] = record.Document.Source.Stream!;
        }

        if (!string.IsNullOrWhiteSpace(record.Document.Upstream.DocumentVersion))
        {
            metadata["document_version"] = record.Document.Upstream.DocumentVersion!;
        }

        return metadata;
    }
}
@@ -0,0 +1,202 @@
using System.Collections.Immutable;
using System.Globalization;
using System.Text.Json;
using System.Text.Json.Nodes;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.AdvisoryAI.Abstractions;
using StellaOps.AdvisoryAI.Documents;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Core.Observations;

namespace StellaOps.AdvisoryAI.Providers;

public sealed class ExcititorVexDocumentProviderOptions
{
    public string Tenant { get; set; } = string.Empty;

    public ImmutableArray<string> ProviderIds { get; set; } = ImmutableArray<string>.Empty;

    public ImmutableArray<VexClaimStatus> Statuses { get; set; } = ImmutableArray<VexClaimStatus>.Empty;

    public int MaxObservations { get; set; } = 25;
}

internal sealed class ExcititorVexDocumentProvider : IAdvisoryDocumentProvider
{
    private static readonly JsonSerializerOptions JsonOptions = new(JsonSerializerDefaults.Web)
    {
        WriteIndented = false,
    };

    private readonly IVexObservationQueryService _queryService;
    private readonly ExcititorVexDocumentProviderOptions _options;
    private readonly ILogger<ExcititorVexDocumentProvider>? _logger;

    public ExcititorVexDocumentProvider(
        IVexObservationQueryService queryService,
        IOptions<ExcititorVexDocumentProviderOptions> options,
        ILogger<ExcititorVexDocumentProvider>? logger = null)
    {
        _queryService = queryService ?? throw new ArgumentNullException(nameof(queryService));
        _options = options?.Value ?? throw new ArgumentNullException(nameof(options));
        _logger = logger;

        if (string.IsNullOrWhiteSpace(_options.Tenant))
        {
            throw new ArgumentException("Tenant must be configured.", nameof(options));
        }

        if (_options.MaxObservations <= 0)
        {
            throw new ArgumentOutOfRangeException(nameof(options), "MaxObservations must be positive.");
        }
    }

    public async Task<IReadOnlyList<AdvisoryDocument>> GetDocumentsAsync(string advisoryKey, CancellationToken cancellationToken)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(advisoryKey);

        var normalizedKey = advisoryKey.Trim();
        var lookup = ImmutableArray.Create(normalizedKey);

        var providerIds = _options.ProviderIds.IsDefaultOrEmpty
            ? ImmutableArray<string>.Empty
            : _options.ProviderIds;
        var statuses = _options.Statuses.IsDefaultOrEmpty
            ? ImmutableArray<VexClaimStatus>.Empty
            : _options.Statuses;

        var options = new VexObservationQueryOptions(
            _options.Tenant,
            observationIds: lookup,
            vulnerabilityIds: lookup,
            productKeys: lookup,
            purls: ImmutableArray<string>.Empty,
            cpes: ImmutableArray<string>.Empty,
            providerIds: providerIds,
            statuses: statuses,
            limit: _options.MaxObservations);

        var result = await _queryService.QueryAsync(options, cancellationToken).ConfigureAwait(false);
        if (result.Observations.IsDefaultOrEmpty)
        {
            _logger?.LogDebug("No VEX observations returned for advisory key {AdvisoryKey}", normalizedKey);
            return Array.Empty<AdvisoryDocument>();
        }

        var documents = new List<AdvisoryDocument>(result.Observations.Length);
        foreach (var observation in result.Observations)
        {
            var format = DocumentFormatMapper.Map(observation.Content.Format);
            if (format == DocumentFormat.Unknown)
            {
                _logger?.LogWarning(
                    "Unsupported VEX content format {Format} for observation {ObservationId}",
                    observation.Content.Format,
                    observation.ObservationId);
                continue;
            }

            var content = observation.Content.Raw.ToJsonString(JsonOptions);
            var metadata = BuildMetadata(observation);

            documents.Add(AdvisoryDocument.Create(
                observation.ObservationId,
                format,
                observation.ProviderId,
                content,
                metadata));
        }

        return documents;
    }

    private static IReadOnlyDictionary<string, string> BuildMetadata(VexObservation observation)
|
||||
{
|
||||
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
|
||||
{
|
||||
["tenant"] = observation.Tenant,
|
||||
["provider"] = observation.ProviderId,
|
||||
["stream"] = observation.StreamId,
|
||||
["created_at"] = observation.CreatedAt.ToString("O", CultureInfo.InvariantCulture),
|
||||
["statement_count"] = observation.Statements.Length.ToString(CultureInfo.InvariantCulture),
|
||||
["content_format"] = observation.Content.Format,
|
||||
["content_hash"] = observation.Upstream.ContentHash,
|
||||
};
|
||||
|
||||
if (!string.IsNullOrWhiteSpace(observation.Content.SpecVersion))
|
||||
{
|
||||
metadata["spec_version"] = observation.Content.SpecVersion!;
|
||||
}
|
||||
|
||||
if (!string.IsNullOrWhiteSpace(observation.Upstream.DocumentVersion))
|
||||
{
|
||||
metadata["document_version"] = observation.Upstream.DocumentVersion!;
|
||||
}
|
||||
|
||||
if (observation.Supersedes.Length > 0)
|
||||
{
|
||||
metadata["supersedes"] = string.Join(",", observation.Supersedes);
|
||||
}
|
||||
|
||||
if (observation.Linkset.Aliases.Length > 0)
|
||||
{
|
||||
metadata["aliases"] = string.Join(",", observation.Linkset.Aliases);
|
||||
}
|
||||
|
||||
if (observation.Linkset.Purls.Length > 0)
|
||||
{
|
||||
metadata["purls"] = string.Join(",", observation.Linkset.Purls);
|
||||
}
|
||||
|
||||
if (observation.Linkset.Cpes.Length > 0)
|
||||
{
|
||||
metadata["cpes"] = string.Join(",", observation.Linkset.Cpes);
|
||||
}
|
||||
|
||||
var statusSummary = BuildStatusSummary(observation.Statements);
|
||||
if (statusSummary.Length > 0)
|
||||
{
|
||||
metadata["status_counts"] = statusSummary;
|
||||
}
|
||||
|
||||
return metadata;
|
||||
}
|
||||
|
||||
private static string BuildStatusSummary(ImmutableArray<VexObservationStatement> statements)
|
||||
{
|
||||
if (statements.IsDefaultOrEmpty)
|
||||
{
|
||||
return string.Empty;
|
||||
}
|
||||
|
||||
var counts = new SortedDictionary<string, int>(StringComparer.Ordinal);
|
||||
foreach (var statement in statements)
|
||||
{
|
||||
var key = ToStatusKey(statement.Status);
|
||||
counts.TryGetValue(key, out var current);
|
||||
counts[key] = current + 1;
|
||||
}
|
||||
|
||||
if (counts.Count == 0)
|
||||
{
|
||||
return string.Empty;
|
||||
}
|
||||
|
||||
return string.Join(
|
||||
';',
|
||||
counts.Select(pair => $"{pair.Key}:{pair.Value.ToString(CultureInfo.InvariantCulture)}"));
|
||||
}
|
||||
|
||||
private static string ToStatusKey(VexClaimStatus status)
|
||||
=> status switch
|
||||
{
|
||||
VexClaimStatus.Affected => "affected",
|
||||
VexClaimStatus.NotAffected => "not_affected",
|
||||
VexClaimStatus.Fixed => "fixed",
|
||||
VexClaimStatus.UnderInvestigation => "under_investigation",
|
||||
_ => status.ToString().ToLowerInvariant(),
|
||||
};
|
||||
}
|
||||
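The `BuildStatusSummary` helper above yields a deterministic `status:count` string by bucketing statement statuses into an ordinally sorted map and joining with `;`. A minimal Python sketch of the same aggregation (the status labels are illustrative, not taken from real data):

```python
from collections import Counter


def build_status_summary(statuses):
    # Count each normalized status key, then emit keys in sorted
    # (ordinal) order so the summary string is deterministic.
    counts = Counter(statuses)
    return ";".join(f"{key}:{counts[key]}" for key in sorted(counts))


print(build_status_summary(["affected", "fixed", "affected", "not_affected"]))
# → affected:2;fixed:1;not_affected:1
```

The sorted-map traversal is what keeps the `status_counts` metadata stable across runs regardless of statement order.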
@@ -0,0 +1,196 @@
using System;
using System.Collections.Immutable;
using System.Threading;
using System.Threading.Tasks;

namespace StellaOps.AdvisoryAI.Providers;

public interface ISbomContextClient
{
    Task<SbomContextDocument?> GetContextAsync(SbomContextQuery query, CancellationToken cancellationToken);
}

public sealed class SbomContextQuery
{
    public SbomContextQuery(
        string artifactId,
        string? purl,
        int maxTimelineEntries,
        int maxDependencyPaths,
        bool includeEnvironmentFlags,
        bool includeBlastRadius)
    {
        if (string.IsNullOrWhiteSpace(artifactId))
        {
            throw new ArgumentException("ArtifactId must be provided.", nameof(artifactId));
        }

        ArtifactId = artifactId.Trim();
        Purl = string.IsNullOrWhiteSpace(purl) ? null : purl.Trim();
        MaxTimelineEntries = Math.Max(0, maxTimelineEntries);
        MaxDependencyPaths = Math.Max(0, maxDependencyPaths);
        IncludeEnvironmentFlags = includeEnvironmentFlags;
        IncludeBlastRadius = includeBlastRadius;
    }

    public string ArtifactId { get; }

    public string? Purl { get; }

    public int MaxTimelineEntries { get; }

    public int MaxDependencyPaths { get; }

    public bool IncludeEnvironmentFlags { get; }

    public bool IncludeBlastRadius { get; }
}

public sealed class SbomContextDocument
{
    public SbomContextDocument(
        string artifactId,
        string? purl,
        ImmutableArray<SbomVersionRecord> versions,
        ImmutableArray<SbomDependencyPathRecord> dependencyPaths,
        ImmutableDictionary<string, string> environmentFlags,
        SbomBlastRadiusRecord? blastRadius,
        ImmutableDictionary<string, string> metadata)
    {
        if (string.IsNullOrWhiteSpace(artifactId))
        {
            throw new ArgumentException("ArtifactId must be provided.", nameof(artifactId));
        }

        ArtifactId = artifactId.Trim();
        Purl = string.IsNullOrWhiteSpace(purl) ? null : purl.Trim();
        Versions = versions.IsDefault ? ImmutableArray<SbomVersionRecord>.Empty : versions;
        DependencyPaths = dependencyPaths.IsDefault ? ImmutableArray<SbomDependencyPathRecord>.Empty : dependencyPaths;
        EnvironmentFlags = environmentFlags == default ? ImmutableDictionary<string, string>.Empty : environmentFlags;
        BlastRadius = blastRadius;
        Metadata = metadata == default ? ImmutableDictionary<string, string>.Empty : metadata;
    }

    public string ArtifactId { get; }

    public string? Purl { get; }

    public ImmutableArray<SbomVersionRecord> Versions { get; }

    public ImmutableArray<SbomDependencyPathRecord> DependencyPaths { get; }

    public ImmutableDictionary<string, string> EnvironmentFlags { get; }

    public SbomBlastRadiusRecord? BlastRadius { get; }

    public ImmutableDictionary<string, string> Metadata { get; }
}

public sealed class SbomVersionRecord
{
    public SbomVersionRecord(
        string version,
        DateTimeOffset firstObserved,
        DateTimeOffset? lastObserved,
        string status,
        string source,
        bool isFixAvailable,
        ImmutableDictionary<string, string> metadata)
    {
        if (string.IsNullOrWhiteSpace(version))
        {
            throw new ArgumentException("Version must be provided.", nameof(version));
        }

        Version = version.Trim();
        FirstObserved = firstObserved;
        LastObserved = lastObserved;
        Status = string.IsNullOrWhiteSpace(status) ? "unknown" : status.Trim();
        Source = string.IsNullOrWhiteSpace(source) ? "unknown" : source.Trim();
        IsFixAvailable = isFixAvailable;
        Metadata = metadata == default ? ImmutableDictionary<string, string>.Empty : metadata;
    }

    public string Version { get; }

    public DateTimeOffset FirstObserved { get; }

    public DateTimeOffset? LastObserved { get; }

    public string Status { get; }

    public string Source { get; }

    public bool IsFixAvailable { get; }

    public ImmutableDictionary<string, string> Metadata { get; }
}

public sealed class SbomDependencyPathRecord
{
    public SbomDependencyPathRecord(
        ImmutableArray<SbomDependencyNodeRecord> nodes,
        bool isRuntime,
        string? source,
        ImmutableDictionary<string, string> metadata)
    {
        Nodes = nodes.IsDefault ? ImmutableArray<SbomDependencyNodeRecord>.Empty : nodes;
        IsRuntime = isRuntime;
        Source = string.IsNullOrWhiteSpace(source) ? null : source.Trim();
        Metadata = metadata == default ? ImmutableDictionary<string, string>.Empty : metadata;
    }

    public ImmutableArray<SbomDependencyNodeRecord> Nodes { get; }

    public bool IsRuntime { get; }

    public string? Source { get; }

    public ImmutableDictionary<string, string> Metadata { get; }
}

public sealed class SbomDependencyNodeRecord
{
    public SbomDependencyNodeRecord(string identifier, string? version)
    {
        if (string.IsNullOrWhiteSpace(identifier))
        {
            throw new ArgumentException("Identifier must be provided.", nameof(identifier));
        }

        Identifier = identifier.Trim();
        Version = string.IsNullOrWhiteSpace(version) ? null : version.Trim();
    }

    public string Identifier { get; }

    public string? Version { get; }
}

public sealed class SbomBlastRadiusRecord
{
    public SbomBlastRadiusRecord(
        int impactedAssets,
        int impactedWorkloads,
        int impactedNamespaces,
        double? impactedPercentage,
        ImmutableDictionary<string, string> metadata)
    {
        ImpactedAssets = impactedAssets;
        ImpactedWorkloads = impactedWorkloads;
        ImpactedNamespaces = impactedNamespaces;
        ImpactedPercentage = impactedPercentage;
        Metadata = metadata == default ? ImmutableDictionary<string, string>.Empty : metadata;
    }

    public int ImpactedAssets { get; }

    public int ImpactedWorkloads { get; }

    public int ImpactedNamespaces { get; }

    public double? ImpactedPercentage { get; }

    public ImmutableDictionary<string, string> Metadata { get; }
}
@@ -0,0 +1,71 @@
using System.Collections.Immutable;
using System.Linq;
using Microsoft.Extensions.Logging;
using StellaOps.AdvisoryAI.Abstractions;
using StellaOps.AdvisoryAI.Chunking;
using StellaOps.AdvisoryAI.Documents;

namespace StellaOps.AdvisoryAI.Retrievers;

internal sealed class AdvisoryStructuredRetriever : IAdvisoryStructuredRetriever
{
    private readonly IAdvisoryDocumentProvider _documentProvider;
    private readonly DocumentChunkerFactory _chunkerFactory;
    private readonly ILogger<AdvisoryStructuredRetriever>? _logger;

    public AdvisoryStructuredRetriever(
        IAdvisoryDocumentProvider documentProvider,
        IEnumerable<IDocumentChunker> chunkers,
        ILogger<AdvisoryStructuredRetriever>? logger = null)
    {
        _documentProvider = documentProvider ?? throw new ArgumentNullException(nameof(documentProvider));
        _chunkerFactory = new DocumentChunkerFactory(chunkers ?? throw new ArgumentNullException(nameof(chunkers)));
        _logger = logger;
    }

    public async Task<AdvisoryRetrievalResult> RetrieveAsync(AdvisoryRetrievalRequest request, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(request);

        var documents = await _documentProvider.GetDocumentsAsync(request.AdvisoryKey, cancellationToken).ConfigureAwait(false);
        if (documents.Count == 0)
        {
            _logger?.LogWarning("No documents returned for advisory {AdvisoryKey}", request.AdvisoryKey);
            return AdvisoryRetrievalResult.Create(request.AdvisoryKey, Array.Empty<AdvisoryChunk>());
        }

        var preferredSections = request.PreferredSections is null
            ? null
            : request.PreferredSections.Select(section => section.Trim()).Where(static s => s.Length > 0).ToHashSet(StringComparer.OrdinalIgnoreCase);

        var chunks = new List<AdvisoryChunk>();
        foreach (var document in documents)
        {
            var chunker = _chunkerFactory.Resolve(document.Format);
            foreach (var chunk in chunker.Chunk(document))
            {
                if (preferredSections is not null && !preferredSections.Contains(chunk.Section))
                {
                    continue;
                }

                chunks.Add(chunk);
            }
        }

        chunks.Sort(static (left, right) => string.CompareOrdinal(left.ChunkId, right.ChunkId));

        if (request.MaxChunks is int max && max > 0 && chunks.Count > max)
        {
            chunks = chunks.Take(max).ToList();
        }

        var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
        {
            ["documents"] = string.Join(",", documents.Select(d => d.DocumentId)),
            ["chunk_count"] = chunks.Count.ToString(),
        };

        return AdvisoryRetrievalResult.Create(request.AdvisoryKey, chunks, metadata);
    }
}
@@ -0,0 +1,73 @@
using StellaOps.AdvisoryAI.Abstractions;
using StellaOps.AdvisoryAI.Vectorization;

namespace StellaOps.AdvisoryAI.Retrievers;

internal sealed class AdvisoryVectorRetriever : IAdvisoryVectorRetriever
{
    private readonly IAdvisoryStructuredRetriever _structuredRetriever;
    private readonly IVectorEncoder _encoder;

    public AdvisoryVectorRetriever(IAdvisoryStructuredRetriever structuredRetriever, IVectorEncoder encoder)
    {
        _structuredRetriever = structuredRetriever ?? throw new ArgumentNullException(nameof(structuredRetriever));
        _encoder = encoder ?? throw new ArgumentNullException(nameof(encoder));
    }

    public async Task<IReadOnlyList<VectorRetrievalMatch>> SearchAsync(VectorRetrievalRequest request, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(request);
        if (request.TopK <= 0)
        {
            throw new ArgumentOutOfRangeException(nameof(request.TopK), "TopK must be a positive integer.");
        }

        var retrieval = await _structuredRetriever.RetrieveAsync(request.Retrieval, cancellationToken).ConfigureAwait(false);
        if (retrieval.Chunks.Count == 0)
        {
            return Array.Empty<VectorRetrievalMatch>();
        }

        var queryVector = _encoder.Encode(request.Query);
        var matches = new List<VectorRetrievalMatch>(retrieval.Chunks.Count);

        foreach (var chunk in retrieval.Chunks)
        {
            var vector = chunk.Embedding ?? _encoder.Encode(chunk.Text);
            var score = CosineSimilarity(queryVector, vector);
            matches.Add(new VectorRetrievalMatch(chunk.DocumentId, chunk.ChunkId, chunk.Text, score, chunk.Metadata));
        }

        matches.Sort(static (left, right) => right.Score.CompareTo(left.Score));
        if (matches.Count > request.TopK)
        {
            matches.RemoveRange(request.TopK, matches.Count - request.TopK);
        }

        return matches;
    }

    private static double CosineSimilarity(float[] left, float[] right)
    {
        var length = Math.Min(left.Length, right.Length);
        double dot = 0;
        double leftNorm = 0;
        double rightNorm = 0;

        for (var i = 0; i < length; i++)
        {
            var l = left[i];
            var r = right[i];
            dot += l * r;
            leftNorm += l * l;
            rightNorm += r * r;
        }

        if (leftNorm <= 0 || rightNorm <= 0)
        {
            return 0;
        }

        return dot / Math.Sqrt(leftNorm * rightNorm);
    }
}
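The vector retriever ranks chunks with a plain cosine similarity: dot product over the product of L2 norms, computed over the shorter of the two vectors and guarded against zero-norm inputs. The same scoring can be sketched in Python (an illustrative re-expression of the C# `CosineSimilarity`, not the shipped implementation):

```python
import math


def cosine_similarity(left, right):
    # Truncate to the shorter vector, then score as dot / (|l| * |r|).
    n = min(len(left), len(right))
    dot = sum(left[i] * right[i] for i in range(n))
    left_norm = sum(v * v for v in left[:n])
    right_norm = sum(v * v for v in right[:n])
    if left_norm <= 0 or right_norm <= 0:
        # Zero-norm guard: empty or all-zero vectors score 0 rather
        # than dividing by zero.
        return 0.0
    return dot / math.sqrt(left_norm * right_norm)
```

Note that because both norms are recomputed over the truncated prefix, mismatched dimensions degrade gracefully instead of throwing.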
@@ -0,0 +1,209 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using StellaOps.AdvisoryAI.Abstractions;
using StellaOps.AdvisoryAI.Context;
using StellaOps.AdvisoryAI.Providers;

namespace StellaOps.AdvisoryAI.Retrievers;

internal sealed class SbomContextRetriever : ISbomContextRetriever
{
    private readonly ISbomContextClient _client;
    private readonly ILogger<SbomContextRetriever>? _logger;

    public SbomContextRetriever(ISbomContextClient client, ILogger<SbomContextRetriever>? logger = null)
    {
        _client = client ?? throw new ArgumentNullException(nameof(client));
        _logger = logger;
    }

    public async Task<SbomContextResult> RetrieveAsync(SbomContextRequest request, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(request);

        var query = new SbomContextQuery(
            request.ArtifactId,
            request.Purl,
            request.MaxTimelineEntries,
            request.MaxDependencyPaths,
            request.IncludeEnvironmentFlags,
            request.IncludeBlastRadius);

        SbomContextDocument? document;
        try
        {
            document = await _client.GetContextAsync(query, cancellationToken).ConfigureAwait(false);
        }
        catch (Exception ex)
        {
            _logger?.LogError(ex, "Failed to retrieve SBOM context for artifact {ArtifactId}", request.ArtifactId);
            document = null;
        }

        if (document is null)
        {
            _logger?.LogWarning("No SBOM context returned for artifact {ArtifactId}", request.ArtifactId);
            return SbomContextResult.Empty(request.ArtifactId, request.Purl);
        }

        var timeline = ShapeTimeline(document.Versions, request.MaxTimelineEntries);
        var paths = ShapeDependencyPaths(document.DependencyPaths, request.MaxDependencyPaths);
        var environmentFlags = request.IncludeEnvironmentFlags
            ? NormalizeEnvironmentFlags(document.EnvironmentFlags)
            : ImmutableDictionary<string, string>.Empty;
        var blastRadius = request.IncludeBlastRadius ? ShapeBlastRadius(document.BlastRadius) : null;
        var metadata = BuildMetadata(document, timeline.Count, paths.Count, environmentFlags.Count, blastRadius is not null);

        return SbomContextResult.Create(
            document.ArtifactId,
            document.Purl,
            timeline,
            paths,
            environmentFlags,
            blastRadius,
            metadata);
    }

    private static IReadOnlyList<SbomVersionTimelineEntry> ShapeTimeline(ImmutableArray<SbomVersionRecord> versions, int max)
    {
        if (versions.IsDefaultOrEmpty || max == 0)
        {
            return Array.Empty<SbomVersionTimelineEntry>();
        }

        return versions
            .OrderBy(static v => v.FirstObserved)
            .ThenBy(static v => v.Version, StringComparer.Ordinal)
            .Take(max > 0 ? max : int.MaxValue)
            .Select(static v => new SbomVersionTimelineEntry(
                v.Version,
                v.FirstObserved,
                v.LastObserved,
                string.IsNullOrWhiteSpace(v.Status)
                    ? (v.IsFixAvailable ? "fixed" : "unknown")
                    : v.Status.Trim(),
                string.IsNullOrWhiteSpace(v.Source) ? "sbom" : v.Source.Trim()))
            .ToImmutableArray();
    }

    private static IReadOnlyList<SbomDependencyPath> ShapeDependencyPaths(ImmutableArray<SbomDependencyPathRecord> paths, int max)
    {
        if (paths.IsDefaultOrEmpty || max == 0)
        {
            return Array.Empty<SbomDependencyPath>();
        }

        var distinct = new SortedDictionary<string, SbomDependencyPath>(StringComparer.Ordinal);
        foreach (var path in paths)
        {
            if (path.Nodes.IsDefaultOrEmpty)
            {
                continue;
            }

            var nodeList = path.Nodes
                .Select(static node => new SbomDependencyNode(node.Identifier, node.Version))
                .ToImmutableArray();

            if (nodeList.IsDefaultOrEmpty)
            {
                continue;
            }

            var key = string.Join(
                "|",
                nodeList.Select(static n => string.Concat(n.Identifier, "@", n.Version ?? string.Empty)));

            if (distinct.ContainsKey(key))
            {
                continue;
            }

            var dependencyPath = new SbomDependencyPath(
                nodeList,
                path.IsRuntime,
                string.IsNullOrWhiteSpace(path.Source) ? null : path.Source.Trim());

            distinct[key] = dependencyPath;
        }

        return distinct.Values
            .OrderBy(p => p.IsRuntime ? 0 : 1)
            .ThenBy(p => p.Nodes.Length)
            .ThenBy(p => string.Join("|", p.Nodes.Select(n => n.Identifier)), StringComparer.Ordinal)
            .Take(max > 0 ? max : int.MaxValue)
            .ToImmutableArray();
    }

    private static IReadOnlyDictionary<string, string> NormalizeEnvironmentFlags(ImmutableDictionary<string, string> flags)
    {
        if (flags == default || flags.IsEmpty)
        {
            return ImmutableDictionary<string, string>.Empty;
        }

        var builder = ImmutableSortedDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
        foreach (var pair in flags)
        {
            if (string.IsNullOrWhiteSpace(pair.Key) || pair.Value is null)
            {
                continue;
            }

            builder[pair.Key.Trim()] = pair.Value.Trim();
        }

        return builder.ToImmutable();
    }

    private static SbomBlastRadiusSummary? ShapeBlastRadius(SbomBlastRadiusRecord? record)
    {
        if (record is null)
        {
            return null;
        }

        var metadata = record.Metadata == default
            ? ImmutableDictionary<string, string>.Empty
            : record.Metadata.ToImmutableDictionary(StringComparer.Ordinal);

        return new SbomBlastRadiusSummary(
            record.ImpactedAssets,
            record.ImpactedWorkloads,
            record.ImpactedNamespaces,
            record.ImpactedPercentage,
            metadata);
    }

    private static IReadOnlyDictionary<string, string> BuildMetadata(
        SbomContextDocument document,
        int timelineCount,
        int pathCount,
        int environmentFlagCount,
        bool hasBlastRadius)
    {
        var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
        foreach (var pair in document.Metadata)
        {
            if (string.IsNullOrWhiteSpace(pair.Key) || pair.Value is null)
            {
                continue;
            }

            builder[pair.Key.Trim()] = pair.Value.Trim();
        }

        builder["version_count"] = timelineCount.ToString();
        builder["dependency_path_count"] = pathCount.ToString();
        builder["environment_flag_count"] = environmentFlagCount.ToString();
        builder["blast_radius_present"] = hasBlastRadius.ToString();

        return builder.ToImmutable();
    }
}
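`ShapeDependencyPaths` deduplicates dependency paths by a canonical `identifier@version|…` key held in a sorted dictionary, so repeated paths collapse and the output order is deterministic. A Python sketch of that dedupe idea (the `(identifier, version)` tuple shape and sample package names are illustrative):

```python
def dedupe_paths(paths):
    # Build a canonical "id@version|id@version|..." key per path;
    # first-seen entries win and results come back in sorted-key order,
    # mirroring the SortedDictionary used by ShapeDependencyPaths.
    distinct = {}
    for nodes in paths:
        if not nodes:
            continue
        key = "|".join(f"{ident}@{version or ''}" for ident, version in nodes)
        distinct.setdefault(key, nodes)
    return [distinct[k] for k in sorted(distinct)]
```

Using the full node chain as the key means two paths that differ only in a transitive version are kept as distinct evidence rather than merged.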
@@ -10,6 +10,11 @@
  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="10.0.0-rc.2.25502.107" />
    <PackageReference Include="Microsoft.Extensions.Options" Version="10.0.0-rc.2.25502.107" />
    <PackageReference Include="System.Text.Json" Version="10.0.0-rc.2.25502.2" />
    <PackageReference Include="System.Text.Json" Version="10.0.0-rc.2.25502.107" />
  </ItemGroup>
  <ItemGroup>
    <ProjectReference Include="..\..\Concelier\__Libraries\StellaOps.Concelier.Core\StellaOps.Concelier.Core.csproj" />
    <ProjectReference Include="..\..\Concelier\__Libraries\StellaOps.Concelier.RawModels\StellaOps.Concelier.RawModels.csproj" />
    <ProjectReference Include="..\..\Excititor\__Libraries\StellaOps.Excititor.Core\StellaOps.Excititor.Core.csproj" />
  </ItemGroup>
</Project>
@@ -1,12 +1,16 @@
# Advisory AI Task Board — Epic 8

| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
|----|--------|----------|------------|-------------|---------------|
| AIAI-31-001 | DOING (2025-11-02) | Advisory AI Guild | CONCELIER-VULN-29-001, EXCITITOR-VULN-29-001 | Implement structured and vector retrievers for advisories/VEX with paragraph anchors and citation metadata. | Retrievers return deterministic chunks with source IDs/sections; unit tests cover CSAF/OSV/vendor formats. |
| AIAI-31-002 | TODO | Advisory AI Guild, SBOM Service Guild | SBOM-VULN-29-001 | Build SBOM context retriever (purl version timelines, dependency paths, env flags, blast radius estimator). | Retriever returns paths/metrics under SLA; tests cover ecosystems. |
| AIAI-31-003 | TODO | Advisory AI Guild | AIAI-31-001..002 | Implement deterministic toolset (version comparators, range checks, dependency analysis, policy lookup) exposed via orchestrator. | Tools validated with property tests; outputs cached; docs updated. |
| AIAI-31-004 | TODO | Advisory AI Guild | AIAI-31-001..003, AUTH-VULN-29-001 | Build orchestration pipeline for Summary/Conflict/Remediation tasks (prompt templates, tool calls, token budgets, caching). | Pipeline executes tasks deterministically; caches keyed by tuple+policy; integration tests cover tasks. |
| AIAI-31-005 | TODO | Advisory AI Guild, Security Guild | AIAI-31-004 | Implement guardrails (redaction, injection defense, output validation, citation enforcement) and fail-safe handling. | Guardrails block adversarial inputs; output validator enforces schemas; security tests pass. |
| AIAI-31-006 | TODO | Advisory AI Guild | AIAI-31-004..005 | Expose REST API endpoints (`/advisory/ai/*`) with RBAC, rate limits, OpenAPI schemas, and batching support. | Endpoints deployed with schema validation; rate limits enforced; integration tests cover error codes. |
| AIAI-31-007 | TODO | Advisory AI Guild, Observability Guild | AIAI-31-004..006 | Instrument metrics (`advisory_ai_latency`, `guardrail_blocks`, `validation_failures`, `citation_coverage`), logs, and traces; publish dashboards/alerts. | Telemetry live; dashboards approved; alerts configured. |
| AIAI-31-008 | TODO | Advisory AI Guild, DevOps Guild | AIAI-31-006..007 | Package inference on-prem container, remote inference toggle, Helm/Compose manifests, scaling guidance, offline kit instructions. | Deployment docs merged; smoke deploy executed; offline kit updated; feature flags documented. |
| AIAI-31-009 | TODO | Advisory AI Guild, QA Guild | AIAI-31-001..006 | Develop unit/golden/property/perf tests, injection harness, and regression suite; ensure determinism with seeded caches. | Test suite green; golden outputs stored; injection tests pass; perf targets documented. |

# Advisory AI Task Board — Epic 8

| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
|----|--------|----------|------------|-------------|---------------|
| AIAI-31-001 | DONE (2025-11-02) | Advisory AI Guild | CONCELIER-VULN-29-001, EXCITITOR-VULN-29-001 | Implement structured and vector retrievers for advisories/VEX with paragraph anchors and citation metadata. | Retrievers return deterministic chunks with source IDs/sections; unit tests cover CSAF/OSV/vendor formats. |
| AIAI-31-002 | DOING | Advisory AI Guild, SBOM Service Guild | SBOM-VULN-29-001 | Build SBOM context retriever (purl version timelines, dependency paths, env flags, blast radius estimator). | Retriever returns paths/metrics under SLA; tests cover ecosystems. |
| AIAI-31-003 | TODO | Advisory AI Guild | AIAI-31-001..002 | Implement deterministic toolset (version comparators, range checks, dependency analysis, policy lookup) exposed via orchestrator. | Tools validated with property tests; outputs cached; docs updated. |
| AIAI-31-004 | TODO | Advisory AI Guild | AIAI-31-001..003, AUTH-VULN-29-001 | Build orchestration pipeline for Summary/Conflict/Remediation tasks (prompt templates, tool calls, token budgets, caching). | Pipeline executes tasks deterministically; caches keyed by tuple+policy; integration tests cover tasks. |
| AIAI-31-005 | TODO | Advisory AI Guild, Security Guild | AIAI-31-004 | Implement guardrails (redaction, injection defense, output validation, citation enforcement) and fail-safe handling. | Guardrails block adversarial inputs; output validator enforces schemas; security tests pass. |
| AIAI-31-006 | TODO | Advisory AI Guild | AIAI-31-004..005 | Expose REST API endpoints (`/advisory/ai/*`) with RBAC, rate limits, OpenAPI schemas, and batching support. | Endpoints deployed with schema validation; rate limits enforced; integration tests cover error codes. |
| AIAI-31-007 | TODO | Advisory AI Guild, Observability Guild | AIAI-31-004..006 | Instrument metrics (`advisory_ai_latency`, `guardrail_blocks`, `validation_failures`, `citation_coverage`), logs, and traces; publish dashboards/alerts. | Telemetry live; dashboards approved; alerts configured. |
| AIAI-31-008 | TODO | Advisory AI Guild, DevOps Guild | AIAI-31-006..007 | Package inference on-prem container, remote inference toggle, Helm/Compose manifests, scaling guidance, offline kit instructions. | Deployment docs merged; smoke deploy executed; offline kit updated; feature flags documented. |
| AIAI-31-010 | DONE (2025-11-02) | Advisory AI Guild | CONCELIER-VULN-29-001, EXCITITOR-VULN-29-001 | Implement Concelier advisory raw document provider mapping CSAF/OSV payloads into structured chunks for retrieval. | Provider resolves content format, preserves metadata, and passes unit tests covering CSAF/OSV cases. |
| AIAI-31-011 | DONE (2025-11-02) | Advisory AI Guild | EXCITITOR-LNM-21-201, EXCITITOR-CORE-AOC-19-002 | Implement Excititor VEX document provider to surface structured VEX statements for vector retrieval. | Provider returns conflict-aware VEX chunks with deterministic metadata and tests for representative statements. |
| AIAI-31-009 | TODO | Advisory AI Guild, QA Guild | AIAI-31-001..006 | Develop unit/golden/property/perf tests, injection harness, and regression suite; ensure determinism with seeded caches. | Test suite green; golden outputs stored; injection tests pass; perf targets documented. |

> 2025-11-02: AIAI-31-002 – SBOM context domain models finalized with limiter guards; retriever tests now cover flag toggles and path dedupe. Service client integration with the SBOM guild is still pending.
@@ -0,0 +1,77 @@
using System.Linq;
using System.Security.Cryptography;
using System.Text;
using System.Text.RegularExpressions;

namespace StellaOps.AdvisoryAI.Vectorization;

internal interface IVectorEncoder
{
    float[] Encode(string text);
}

internal sealed class DeterministicHashVectorEncoder : IVectorEncoder, IDisposable
{
    private const int DefaultDimensions = 64;
    private static readonly Regex TokenRegex = new("[A-Za-z0-9]+", RegexOptions.Compiled | RegexOptions.CultureInvariant);
    private readonly IncrementalHash _hash;
    private readonly int _dimensions;

    public DeterministicHashVectorEncoder(int dimensions = DefaultDimensions)
    {
        if (dimensions <= 0)
        {
            throw new ArgumentOutOfRangeException(nameof(dimensions));
        }

        _dimensions = dimensions;
        _hash = IncrementalHash.CreateHash(HashAlgorithmName.SHA256);
    }

    public float[] Encode(string text)
    {
        ArgumentNullException.ThrowIfNull(text);

        var vector = new float[_dimensions];
        var tokenMatches = TokenRegex.Matches(text);
        if (tokenMatches.Count == 0)
        {
            return vector;
        }

        Span<byte> hash = stackalloc byte[32];

        foreach (Match match in tokenMatches)
        {
            var token = match.Value.ToLowerInvariant();
            var bytes = Encoding.UTF8.GetBytes(token);
            _hash.AppendData(bytes);
            _hash.GetHashAndReset(hash);
            var index = (int)(BitConverter.ToUInt32(hash[..4]) % (uint)_dimensions);
            vector[index] += 1f;
        }

        Normalize(vector);
        return vector;
    }

    private static void Normalize(float[] vector)
    {
        var sumSquares = vector.Sum(v => v * v);
        if (sumSquares <= 0f)
        {
            return;
        }

        var length = MathF.Sqrt(sumSquares);
        for (var i = 0; i < vector.Length; i++)
        {
            vector[i] /= length;
        }
    }

    public void Dispose()
    {
        _hash.Dispose();
    }
}
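The encoder above is a feature-hashing ("hashing trick") bag-of-words: each token is hashed independently with SHA-256, the first four digest bytes pick a bucket modulo the dimension count, and the bucket counts are L2-normalized. A minimal Python sketch of the same scheme, for illustration only (it mirrors the C# logic under the assumption that `BitConverter.ToUInt32` reads little-endian, as it does on common platforms):

```python
import hashlib
import math
import re

def encode(text: str, dimensions: int = 64) -> list[float]:
    """Hashing-trick bag-of-words: one SHA-256 bucket per token, L2-normalized."""
    vector = [0.0] * dimensions
    for token in re.findall(r"[A-Za-z0-9]+", text):
        # Hash the lowercased token and map the first 4 digest bytes to a bucket.
        digest = hashlib.sha256(token.lower().encode("utf-8")).digest()
        index = int.from_bytes(digest[:4], "little") % dimensions
        vector[index] += 1.0
    norm = math.sqrt(sum(v * v for v in vector))
    # Empty input stays the zero vector; otherwise scale to unit length.
    return [v / norm for v in vector] if norm > 0 else vector
```

The same text always yields the same unit-length vector, which is what makes retrieval results reproducible across runs.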
@@ -0,0 +1,142 @@
using System.Text.Json;
using FluentAssertions;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.AdvisoryAI.Abstractions;
using StellaOps.AdvisoryAI.Chunking;
using StellaOps.AdvisoryAI.Documents;
using StellaOps.AdvisoryAI.Retrievers;
using Xunit;

namespace StellaOps.AdvisoryAI.Tests;

public sealed class AdvisoryStructuredRetrieverTests
{
    [Fact]
    public async Task RetrieveAsync_ReturnsCsafChunksWithMetadata()
    {
        var provider = CreateProvider(
            "test-advisory",
            AdvisoryDocument.Create(
                "CSA-2024-0001",
                DocumentFormat.Csaf,
                "csaf",
                await LoadAsync("sample-csaf.json")));

        var retriever = CreateRetriever(provider);
        var result = await retriever.RetrieveAsync(new AdvisoryRetrievalRequest("test-advisory"), CancellationToken.None);

        result.Chunks.Should().NotBeEmpty();
        result.Chunks.Should().HaveCountGreaterThan(4);
        result.Chunks.Select(c => c.ChunkId).Should().BeInAscendingOrder();
        result.Chunks.All(c => c.Metadata["format"] == "csaf").Should().BeTrue();
        result.Chunks.Any(c => c.Section == "vulnerabilities.remediations").Should().BeTrue();
        result.Chunks.Any(c => c.Section == "document.notes").Should().BeTrue();
    }

    [Fact]
    public async Task RetrieveAsync_ReturnsOsvChunksWithAffectedMetadata()
    {
        var provider = CreateProvider(
            "osv-advisory",
            AdvisoryDocument.Create(
                "OSV-2024-0001",
                DocumentFormat.Osv,
                "osv",
                await LoadAsync("sample-osv.json")));

        var retriever = CreateRetriever(provider);
        var result = await retriever.RetrieveAsync(new AdvisoryRetrievalRequest("osv-advisory"), CancellationToken.None);

        result.Chunks.Should().NotBeEmpty();
        result.Chunks.Should().ContainSingle(c => c.Section == "summary");
        result.Chunks.Should().Contain(c => c.Section == "affected.ranges");
        result.Chunks.First(c => c.Section == "affected.ranges").Metadata.Should().ContainKey("package");
    }

    [Fact]
    public async Task RetrieveAsync_ReturnsOpenVexChunksWithStatusMetadata()
    {
        var provider = CreateProvider(
            "openvex-advisory",
            AdvisoryDocument.Create(
                "OPENVEX-2024-0001",
                DocumentFormat.OpenVex,
                "exc-provider",
                await LoadAsync("sample-openvex.json")));

        var retriever = CreateRetriever(provider);
        var result = await retriever.RetrieveAsync(new AdvisoryRetrievalRequest("openvex-advisory"), CancellationToken.None);

        result.Chunks.Should().HaveCount(2);
        result.Chunks.Select(c => c.Metadata["status"]).Should().Contain(new[] { "not_affected", "affected" });
        result.Chunks.First().Metadata.Should().ContainKey("justification");
        result.Chunks.Should().AllSatisfy(chunk => chunk.Section.Should().Be("vex.statements"));
    }

    [Fact]
    public async Task RetrieveAsync_FiltersToPreferredSections()
    {
        var provider = CreateProvider(
            "markdown-advisory",
            AdvisoryDocument.Create(
                "VENDOR-2024-0001",
                DocumentFormat.Markdown,
                "vendor",
                await LoadAsync("sample-vendor.md")));

        var retriever = CreateRetriever(provider);
        var request = new AdvisoryRetrievalRequest(
            "markdown-advisory",
            PreferredSections: new[] { "Impact" });

        var result = await retriever.RetrieveAsync(request, CancellationToken.None);

        result.Chunks.Should().NotBeEmpty();
        result.Chunks.Should().OnlyContain(chunk => chunk.Section.StartsWith("Impact", StringComparison.Ordinal));
    }

    private static AdvisoryStructuredRetriever CreateRetriever(IAdvisoryDocumentProvider provider)
    {
        var chunkers = new IDocumentChunker[]
        {
            new CsafDocumentChunker(),
            new OsvDocumentChunker(),
            new MarkdownDocumentChunker(),
            new OpenVexDocumentChunker(),
        };

        return new AdvisoryStructuredRetriever(provider, chunkers, NullLogger<AdvisoryStructuredRetriever>.Instance);
    }

    private static async Task<string> LoadAsync(string fileName)
    {
        var path = Path.Combine(AppContext.BaseDirectory, "TestData", fileName);
        return await File.ReadAllTextAsync(path);
    }

    private static IAdvisoryDocumentProvider CreateProvider(string key, params AdvisoryDocument[] documents)
        => new InMemoryAdvisoryDocumentProvider(new Dictionary<string, IReadOnlyList<AdvisoryDocument>>(StringComparer.Ordinal)
        {
            [key] = documents,
        });

    private sealed class InMemoryAdvisoryDocumentProvider : IAdvisoryDocumentProvider
    {
        private readonly IReadOnlyDictionary<string, IReadOnlyList<AdvisoryDocument>> _documents;

        public InMemoryAdvisoryDocumentProvider(IReadOnlyDictionary<string, IReadOnlyList<AdvisoryDocument>> documents)
        {
            _documents = documents;
        }

        public Task<IReadOnlyList<AdvisoryDocument>> GetDocumentsAsync(string advisoryKey, CancellationToken cancellationToken)
        {
            if (_documents.TryGetValue(advisoryKey, out var documents))
            {
                return Task.FromResult(documents);
            }

            return Task.FromResult<IReadOnlyList<AdvisoryDocument>>(Array.Empty<AdvisoryDocument>());
        }
    }
}
@@ -0,0 +1,82 @@
using FluentAssertions;
using StellaOps.AdvisoryAI.Abstractions;
using StellaOps.AdvisoryAI.Chunking;
using StellaOps.AdvisoryAI.Documents;
using StellaOps.AdvisoryAI.Retrievers;
using StellaOps.AdvisoryAI.Vectorization;
using Xunit;

namespace StellaOps.AdvisoryAI.Tests;

public sealed class AdvisoryVectorRetrieverTests
{
    [Fact]
    public async Task SearchAsync_ReturnsBestMatchingChunk()
    {
        var advisoryContent = """
            # Advisory

            ## Impact
            The vulnerability allows remote attackers to execute arbitrary code.

            ## Remediation
            Update to version 2.1.3 or later and restart the service.
            """;

        var provider = new InMemoryAdvisoryDocumentProvider(new Dictionary<string, IReadOnlyList<AdvisoryDocument>>(StringComparer.Ordinal)
        {
            ["adv"] = new[]
            {
                AdvisoryDocument.Create("VENDOR-1", DocumentFormat.Markdown, "vendor", advisoryContent)
            }
        });

        var structuredRetriever = new AdvisoryStructuredRetriever(
            provider,
            new IDocumentChunker[]
            {
                new CsafDocumentChunker(),
                new OsvDocumentChunker(),
                new MarkdownDocumentChunker(),
            });

        using var encoder = new DeterministicHashVectorEncoder();
        var vectorRetriever = new AdvisoryVectorRetriever(structuredRetriever, encoder);

        var matches = await vectorRetriever.SearchAsync(
            new VectorRetrievalRequest(
                new AdvisoryRetrievalRequest("adv"),
                Query: "How do I remediate the vulnerability?",
                TopK: 1),
            CancellationToken.None);

        matches.Should().HaveCount(1);
        matches[0].Section().Should().Be("Remediation");
    }
}

file static class VectorRetrievalMatchExtensions
{
    public static string Section(this VectorRetrievalMatch match)
        => match.Metadata.TryGetValue("section", out var value) ? value : string.Empty;
}

file sealed class InMemoryAdvisoryDocumentProvider : IAdvisoryDocumentProvider
{
    private readonly IReadOnlyDictionary<string, IReadOnlyList<AdvisoryDocument>> _documents;

    public InMemoryAdvisoryDocumentProvider(IReadOnlyDictionary<string, IReadOnlyList<AdvisoryDocument>> documents)
    {
        _documents = documents;
    }

    public Task<IReadOnlyList<AdvisoryDocument>> GetDocumentsAsync(string advisoryKey, CancellationToken cancellationToken)
    {
        if (_documents.TryGetValue(advisoryKey, out var documents))
        {
            return Task.FromResult(documents);
        }

        return Task.FromResult<IReadOnlyList<AdvisoryDocument>>(Array.Empty<AdvisoryDocument>());
    }
}
@@ -0,0 +1,75 @@
using System.Collections.Immutable;
using System.Text.Json;
using FluentAssertions;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using StellaOps.AdvisoryAI.Providers;
using StellaOps.Concelier.Core.Raw;
using StellaOps.Concelier.RawModels;
using Xunit;

namespace StellaOps.AdvisoryAI.Tests;

public sealed class ConcelierAdvisoryDocumentProviderTests
{
    [Fact]
    public async Task GetDocumentsAsync_ReturnsMappedDocuments()
    {
        var rawDocument = RawDocumentFactory.CreateAdvisory(
            tenant: "tenant-a",
            source: new RawSourceMetadata("vendor-a", "connector", "1.0"),
            upstream: new RawUpstreamMetadata(
                "UP-1",
                "1",
                DateTimeOffset.UtcNow,
                "hash-123",
                new RawSignatureMetadata(false),
                ImmutableDictionary<string, string>.Empty),
            content: new RawContent("csaf", "2.0", JsonDocument.Parse("{\"document\": {\"notes\": []}, \"vulnerabilities\": []}").RootElement),
            identifiers: new RawIdentifiers(ImmutableArray<string>.Empty, "UP-1"),
            linkset: new RawLinkset());

        var records = new[]
        {
            new AdvisoryRawRecord("id-1", rawDocument, DateTimeOffset.UtcNow, DateTimeOffset.UtcNow)
        };

        var service = new FakeAdvisoryRawService(records);
        var provider = new ConcelierAdvisoryDocumentProvider(
            service,
            Options.Create(new ConcelierAdvisoryDocumentProviderOptions
            {
                Tenant = "tenant-a",
                MaxDocuments = 5,
            }),
            NullLogger<ConcelierAdvisoryDocumentProvider>.Instance);

        var results = await provider.GetDocumentsAsync("UP-1", CancellationToken.None);

        results.Should().HaveCount(1);
        results[0].Format.Should().Be(Documents.DocumentFormat.Csaf);
        results[0].Source.Should().Be("vendor-a");
    }

    private sealed class FakeAdvisoryRawService : IAdvisoryRawService
    {
        private readonly IReadOnlyList<AdvisoryRawRecord> _records;

        public FakeAdvisoryRawService(IReadOnlyList<AdvisoryRawRecord> records)
        {
            _records = records;
        }

        public Task<AdvisoryRawRecord?> FindByIdAsync(string tenant, string id, CancellationToken cancellationToken)
            => Task.FromResult<AdvisoryRawRecord?>(null);

        public Task<AdvisoryRawUpsertResult> IngestAsync(AdvisoryRawDocument document, CancellationToken cancellationToken)
            => throw new NotImplementedException();

        public Task<AdvisoryRawQueryResult> QueryAsync(AdvisoryRawQueryOptions options, CancellationToken cancellationToken)
            => Task.FromResult(new AdvisoryRawQueryResult(_records, nextCursor: null, hasMore: false));

        public Task<AdvisoryRawVerificationResult> VerifyAsync(AdvisoryRawVerificationRequest request, CancellationToken cancellationToken)
            => throw new NotImplementedException();
    }
}
@@ -0,0 +1,148 @@
using System.Collections.Immutable;
using System.Text.Json.Nodes;
using FluentAssertions;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using StellaOps.AdvisoryAI.Documents;
using StellaOps.AdvisoryAI.Providers;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Core.Observations;
using Xunit;

namespace StellaOps.AdvisoryAI.Tests;

public sealed class ExcititorVexDocumentProviderTests
{
    [Fact]
    public async Task GetDocumentsAsync_ReturnsMappedObservation()
    {
        const string vulnerabilityId = "CVE-2024-9999";
        const string productKey = "product-key";
        const string packageUrl = "pkg:docker/sample@1.0.0";
        const string cpe = "cpe:/a:sample:service";
        const string providerId = "exc-provider";
        const string tenantId = "tenant-a";

        var observation = CreateObservation(vulnerabilityId, productKey, packageUrl, cpe, providerId, tenantId);
        var aggregate = new VexObservationAggregate(
            ImmutableArray.Create(vulnerabilityId),
            ImmutableArray.Create(productKey),
            ImmutableArray.Create(packageUrl),
            ImmutableArray.Create(cpe),
            ImmutableArray<VexObservationReference>.Empty,
            ImmutableArray.Create(providerId));

        var queryResult = new VexObservationQueryResult(
            ImmutableArray.Create(observation),
            aggregate,
            NextCursor: null,
            HasMore: false);

        var service = new FakeVexObservationQueryService(queryResult);
        var provider = new ExcititorVexDocumentProvider(
            service,
            Options.Create(new ExcititorVexDocumentProviderOptions
            {
                Tenant = tenantId,
                MaxObservations = 5,
                ProviderIds = ImmutableArray.Create(providerId),
                Statuses = ImmutableArray.Create(VexClaimStatus.NotAffected),
            }),
            NullLogger<ExcititorVexDocumentProvider>.Instance);

        var documents = await provider.GetDocumentsAsync(vulnerabilityId, CancellationToken.None);

        documents.Should().HaveCount(1);
        var document = documents[0];
        document.DocumentId.Should().Be("obs-1");
        document.Format.Should().Be(DocumentFormat.OpenVex);
        document.Source.Should().Be(providerId);
        document.Metadata.Should().ContainKey("status_counts");
        document.Metadata["status_counts"].Should().Be("not_affected:1");
        document.Metadata.Should().ContainKey("aliases");

        service.LastOptions.Should().NotBeNull();
        service.LastOptions!.Tenant.Should().Be(tenantId);
        service.LastOptions.ProviderIds.Should().ContainSingle().Which.Should().Be(providerId);
        service.LastOptions.Statuses.Should().ContainSingle().Which.Should().Be(VexClaimStatus.NotAffected);
        service.LastOptions.VulnerabilityIds.Should().Contain(vulnerabilityId);
        service.LastOptions.Limit.Should().Be(5);
    }

    private static VexObservation CreateObservation(
        string vulnerabilityId,
        string productKey,
        string packageUrl,
        string cpe,
        string providerId,
        string tenantId)
    {
        var upstream = new VexObservationUpstream(
            "VEX-1",
            1,
            DateTimeOffset.Parse("2025-10-10T08:00:00Z"),
            DateTimeOffset.Parse("2025-10-10T08:05:00Z"),
            "hash-abc123",
            new VexObservationSignature(true, "dsse", "key-1", "signature"));

        var evidence = ImmutableArray.Create<JsonNode>(JsonNode.Parse("{\"note\":\"deterministic\"}")!);

        var statement = new VexObservationStatement(
            vulnerabilityId,
            productKey,
            VexClaimStatus.NotAffected,
            DateTimeOffset.Parse("2025-10-10T08:00:00Z"),
            locator: "selector",
            justification: VexJustification.ComponentNotPresent,
            introducedVersion: null,
            fixedVersion: null,
            purl: packageUrl,
            cpe: cpe,
            evidence: evidence,
            metadata: ImmutableDictionary<string, string>.Empty);

        var content = new VexObservationContent(
            "openvex",
            "0.2",
            JsonNode.Parse("{\"statements\":[{\"status\":\"not_affected\"}]}")!);

        var linkset = new VexObservationLinkset(
            aliases: new[] { vulnerabilityId },
            purls: new[] { packageUrl },
            cpes: new[] { cpe },
            references: null);

        return new VexObservation(
            "obs-1",
            tenantId,
            providerId,
            "default",
            upstream,
            ImmutableArray.Create(statement),
            content,
            linkset,
            DateTimeOffset.Parse("2025-10-11T09:00:00Z"),
            supersedes: ImmutableArray<string>.Empty,
            attributes: ImmutableDictionary<string, string>.Empty);
    }

    private sealed class FakeVexObservationQueryService : IVexObservationQueryService
    {
        private readonly VexObservationQueryResult _result;

        public FakeVexObservationQueryService(VexObservationQueryResult result)
        {
            _result = result;
        }

        public VexObservationQueryOptions? LastOptions { get; private set; }

        public ValueTask<VexObservationQueryResult> QueryAsync(
            VexObservationQueryOptions options,
            CancellationToken cancellationToken)
        {
            LastOptions = options;
            return ValueTask.FromResult(_result);
        }
    }
}
@@ -0,0 +1,39 @@
using FluentAssertions;
using StellaOps.AdvisoryAI.Abstractions;
using Xunit;

namespace StellaOps.AdvisoryAI.Tests;

public sealed class SbomContextRequestTests
{
    [Fact]
    public void Constructor_NormalizesWhitespaceAndLimits()
    {
        var request = new SbomContextRequest(
            artifactId: " scan-42 ",
            purl: " pkg:docker/sample@1.2.3 ",
            maxTimelineEntries: 600,
            maxDependencyPaths: -5,
            includeEnvironmentFlags: false,
            includeBlastRadius: false);

        request.ArtifactId.Should().Be("scan-42");
        request.Purl.Should().Be("pkg:docker/sample@1.2.3");
        request.MaxTimelineEntries.Should().Be(SbomContextRequest.TimelineLimitCeiling);
        request.MaxDependencyPaths.Should().Be(0);
        request.IncludeEnvironmentFlags.Should().BeFalse();
        request.IncludeBlastRadius.Should().BeFalse();
    }

    [Fact]
    public void Constructor_AllowsNullPurlAndDefaults()
    {
        var request = new SbomContextRequest(artifactId: "scan-123", purl: null);

        request.Purl.Should().BeNull();
        request.MaxTimelineEntries.Should().BeGreaterThan(0);
        request.MaxDependencyPaths.Should().BeGreaterThan(0);
        request.IncludeEnvironmentFlags.Should().BeTrue();
        request.IncludeBlastRadius.Should().BeTrue();
    }
}
@@ -0,0 +1,212 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using FluentAssertions;
using StellaOps.AdvisoryAI.Abstractions;
using StellaOps.AdvisoryAI.Context;
using StellaOps.AdvisoryAI.Providers;
using StellaOps.AdvisoryAI.Retrievers;
using Xunit;

namespace StellaOps.AdvisoryAI.Tests;

public sealed class SbomContextRetrieverTests
{
    [Fact]
    public async Task RetrieveAsync_ReturnsDeterministicOrderingAndMetadata()
    {
        var document = new SbomContextDocument(
            "artifact-123",
            "pkg:docker/sample@1.0.0",
            ImmutableArray.Create(
                new SbomVersionRecord(
                    "1.0.1",
                    new DateTimeOffset(2025, 10, 15, 12, 0, 0, TimeSpan.Zero),
                    null,
                    "affected",
                    "scanner",
                    false,
                    ImmutableDictionary<string, string>.Empty),
                new SbomVersionRecord(
                    "1.0.0",
                    new DateTimeOffset(2025, 9, 10, 8, 0, 0, TimeSpan.Zero),
                    new DateTimeOffset(2025, 10, 15, 11, 0, 0, TimeSpan.Zero),
                    "fixed",
                    "inventory",
                    true,
                    ImmutableDictionary<string, string>.Empty)),
            ImmutableArray.Create(
                new SbomDependencyPathRecord(
                    ImmutableArray.Create(
                        new SbomDependencyNodeRecord("app", "1.0.0"),
                        new SbomDependencyNodeRecord("lib-a", "2.1.3"),
                        new SbomDependencyNodeRecord("lib-b", "3.4.5")),
                    true,
                    "runtime",
                    ImmutableDictionary<string, string>.Empty),
                new SbomDependencyPathRecord(
                    ImmutableArray.Create(
                        new SbomDependencyNodeRecord("app", "1.0.0"),
                        new SbomDependencyNodeRecord("test-helper", "0.9.0")),
                    false,
                    "dev",
                    ImmutableDictionary<string, string>.Empty),
                new SbomDependencyPathRecord(
                    ImmutableArray.Create(
                        new SbomDependencyNodeRecord("app", "1.0.0"),
                        new SbomDependencyNodeRecord("lib-a", "2.1.3"),
                        new SbomDependencyNodeRecord("lib-b", "3.4.5")),
                    true,
                    "runtime",
                    ImmutableDictionary<string, string>.Empty)),
            ImmutableDictionary.CreateRange(new[]
            {
                new KeyValuePair<string, string>("environment/prod", "true"),
                new KeyValuePair<string, string>("environment/dev", "false"),
            }),
            new SbomBlastRadiusRecord(
                12,
                8,
                4,
                0.25,
                ImmutableDictionary<string, string>.Empty),
            ImmutableDictionary<string, string>.Empty);

        var client = new FakeSbomContextClient(document);
        var retriever = new SbomContextRetriever(client);

        var request = new SbomContextRequest(
            artifactId: "artifact-123",
            purl: "pkg:docker/sample@1.0.0",
            maxTimelineEntries: 2,
            maxDependencyPaths: 2);

        var result = await retriever.RetrieveAsync(request, CancellationToken.None);

        result.ArtifactId.Should().Be("artifact-123");
        result.Purl.Should().Be("pkg:docker/sample@1.0.0");
        result.VersionTimeline.Select(v => v.Version).Should().ContainInOrder("1.0.0", "1.0.1");
        result.DependencyPaths.Should().HaveCount(2);
        result.DependencyPaths.First().IsRuntime.Should().BeTrue();
        result.DependencyPaths.First().Nodes.Select(n => n.Identifier).Should().Equal("app", "lib-a", "lib-b");
        result.EnvironmentFlags.Keys.Should().Equal(new[] { "environment/dev", "environment/prod" });
        result.EnvironmentFlags["environment/prod"].Should().Be("true");
        result.BlastRadius.Should().NotBeNull();
        result.BlastRadius!.ImpactedAssets.Should().Be(12);
        result.Metadata["version_count"].Should().Be("2");
        result.Metadata["dependency_path_count"].Should().Be("2");
        result.Metadata["environment_flag_count"].Should().Be("2");
        result.Metadata["blast_radius_present"].Should().Be(bool.TrueString);
    }

    [Fact]
    public async Task RetrieveAsync_ReturnsEmptyWhenNoDocument()
    {
        var client = new FakeSbomContextClient(null);
        var retriever = new SbomContextRetriever(client);

        var request = new SbomContextRequest("missing-artifact");
        var result = await retriever.RetrieveAsync(request, CancellationToken.None);

        result.ArtifactId.Should().Be("missing-artifact");
        result.VersionTimeline.Should().BeEmpty();
        result.DependencyPaths.Should().BeEmpty();
        result.EnvironmentFlags.Should().BeEmpty();
        result.BlastRadius.Should().BeNull();
    }

    [Fact]
    public async Task RetrieveAsync_HonorsEnvironmentFlagToggle()
    {
        var document = new SbomContextDocument(
            "artifact-flag",
            null,
            ImmutableArray<SbomVersionRecord>.Empty,
            ImmutableArray<SbomDependencyPathRecord>.Empty,
            ImmutableDictionary.CreateRange(new[]
            {
                new KeyValuePair<string, string>("environment/prod", "true"),
            }),
            blastRadius: null,
            metadata: ImmutableDictionary<string, string>.Empty);

        var client = new FakeSbomContextClient(document);
        var retriever = new SbomContextRetriever(client);

        var request = new SbomContextRequest(
            artifactId: "artifact-flag",
            includeEnvironmentFlags: false,
            includeBlastRadius: false);

        var result = await retriever.RetrieveAsync(request, CancellationToken.None);

        result.EnvironmentFlags.Should().BeEmpty();
        result.Metadata["environment_flag_count"].Should().Be("0");

        client.LastQuery.Should().NotBeNull();
        client.LastQuery!.IncludeEnvironmentFlags.Should().BeFalse();
        client.LastQuery.IncludeBlastRadius.Should().BeFalse();
    }

    [Fact]
    public async Task RetrieveAsync_DeduplicatesDependencyPaths()
    {
        var duplicatePath = ImmutableArray.Create(
            new SbomDependencyNodeRecord("app", "1.0.0"),
            new SbomDependencyNodeRecord("lib-a", "2.0.0"));

        var document = new SbomContextDocument(
            "artifact-paths",
            null,
            ImmutableArray<SbomVersionRecord>.Empty,
            ImmutableArray.Create(
                new SbomDependencyPathRecord(duplicatePath, true, "runtime", ImmutableDictionary<string, string>.Empty),
                new SbomDependencyPathRecord(duplicatePath, true, "runtime", ImmutableDictionary<string, string>.Empty),
                new SbomDependencyPathRecord(
                    ImmutableArray.Create(
                        new SbomDependencyNodeRecord("app", "1.0.0"),
                        new SbomDependencyNodeRecord("dev-tool", "0.1.0")),
                    false,
                    "dev",
                    ImmutableDictionary<string, string>.Empty)),
            ImmutableDictionary<string, string>.Empty,
            blastRadius: null,
            metadata: ImmutableDictionary<string, string>.Empty);

        var client = new FakeSbomContextClient(document);
        var retriever = new SbomContextRetriever(client);

        var request = new SbomContextRequest(
            artifactId: "artifact-paths",
            maxDependencyPaths: 5);

        var result = await retriever.RetrieveAsync(request, CancellationToken.None);

        result.DependencyPaths.Should().HaveCount(2);
        result.DependencyPaths.First().IsRuntime.Should().BeTrue();
        result.DependencyPaths.Last().IsRuntime.Should().BeFalse();
        result.Metadata["dependency_path_count"].Should().Be("2");
    }

    private sealed class FakeSbomContextClient : ISbomContextClient
    {
        private readonly SbomContextDocument? _document;

        public FakeSbomContextClient(SbomContextDocument? document)
        {
            _document = document;
        }

        public SbomContextQuery? LastQuery { get; private set; }

        public Task<SbomContextDocument?> GetContextAsync(SbomContextQuery query, CancellationToken cancellationToken)
        {
            LastQuery = query;
            return Task.FromResult(_document);
        }
    }
}
@@ -0,0 +1,31 @@
<?xml version="1.0" encoding="utf-8"?>
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <LangVersion>preview</LangVersion>
    <IsPackable>false</IsPackable>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.14.0" />
    <PackageReference Include="xunit" Version="2.9.2" />
    <PackageReference Include="xunit.runner.visualstudio" Version="2.8.2" />
    <PackageReference Include="FluentAssertions" Version="6.12.0" />
    <PackageReference Include="coverlet.collector" Version="6.0.4" />
  </ItemGroup>
  <ItemGroup>
    <ProjectReference Include="..\..\StellaOps.AdvisoryAI\StellaOps.AdvisoryAI.csproj" />
    <ProjectReference Include="..\..\Concelier\__Libraries\StellaOps.Concelier.Core\StellaOps.Concelier.Core.csproj" />
    <ProjectReference Include="..\..\Concelier\__Libraries\StellaOps.Concelier.RawModels\StellaOps.Concelier.RawModels.csproj" />
    <ProjectReference Include="..\..\Excititor\__Libraries\StellaOps.Excititor.Core\StellaOps.Excititor.Core.csproj" />
  </ItemGroup>
  <ItemGroup>
    <None Update="TestData/*.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
    <None Update="TestData/*.md">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
  </ItemGroup>
</Project>
@@ -0,0 +1,43 @@
{
  "document": {
    "tracking": {
      "id": "CSA-2024-0001"
    },
    "notes": [
      {
        "category": "summary",
        "text": "The vendor has published guidance for CVE-2024-1234."
      },
      {
        "category": "description",
        "text": "Additional context for operators."
      },
      {
        "category": "other",
        "text": "This note should be ignored."
      }
    ]
  },
  "vulnerabilities": [
    {
      "id": "CVE-2024-1234",
      "title": "Important vulnerability in component",
      "description": "Remote attackers may execute arbitrary code.",
      "notes": [
        {
          "category": "description",
          "text": "Applies to installations using default configuration."
        }
      ],
      "remediations": [
        {
          "category": "mitigation",
          "details": "Apply patch level QFE-2024-11 or later.",
          "product_ids": [
            "pkg:deb/debian/component@1.2.3"
          ]
        }
      ]
    }
  ]
}
@@ -0,0 +1,30 @@
{
  "openvex": "https://openvex.dev/ns/v0.2",
  "timestamp": "2025-10-15T12:00:00Z",
  "version": "1",
  "statements": [
    {
      "vulnerability": "CVE-2024-9999",
      "products": [
        {
          "product_id": "pkg:docker/sample@1.0.0"
        }
      ],
      "status": "not_affected",
      "justification": "component_not_present",
      "impact_statement": "Component not shipped",
      "status_notes": "Distribution excludes this component",
      "timestamp": "2025-10-10T08:00:00Z",
      "last_updated": "2025-10-11T09:00:00Z"
    },
    {
      "vulnerability": "CVE-2024-8888",
      "products": [
        "component://sample/service"
      ],
      "status": "affected",
      "status_notes": "Patch scheduled",
      "timestamp": "2025-10-12T13:30:00Z"
    }
  ]
}
@@ -0,0 +1,33 @@
{
  "id": "OSV-2024-0001",
  "summary": "Vulnerability in package affects multiple versions.",
  "details": "Remote attackers may exploit the issue under specific conditions.",
  "affected": [
    {
      "package": {
        "name": "example",
        "ecosystem": "npm"
      },
      "ranges": [
        {
          "type": "SEMVER",
          "events": [
            {
              "introduced": "0"
            },
            {
              "fixed": "1.2.3"
            }
          ]
        }
      ],
      "versions": ["0.9.0", "1.0.0"]
    }
  ],
  "references": [
    {
      "type": "ADVISORY",
      "url": "https://example.org/advisory"
    }
  ]
}
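The `ranges` block above follows the OSV event model: a version is affected once an `introduced` bound is reached and stops being affected at a `fixed` bound. A simplified sketch of that evaluation (numeric dotted versions only, no pre-release handling; helper names are hypothetical):

```python
# Simplified OSV SEMVER range evaluation: walk events in order, toggling the
# affected flag at "introduced" and "fixed" bounds. Not a full semver parser.
def parse(v: str) -> tuple[int, ...]:
    return tuple(int(part) for part in v.split("."))

def is_affected(version: str, events: list[dict]) -> bool:
    affected = False
    for event in events:
        if "introduced" in event:
            intro = event["introduced"]
            if intro == "0" or parse(version) >= parse(intro):
                affected = True
        elif "fixed" in event and parse(version) >= parse(event["fixed"]):
            affected = False
    return affected

events = [{"introduced": "0"}, {"fixed": "1.2.3"}]
print(is_affected("1.0.0", events))  # → True
print(is_affected("1.2.3", events))  # → False
```

Under this rule the fixture's listed `versions` (`0.9.0`, `1.0.0`) both fall inside the affected range.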
@@ -0,0 +1,12 @@
# Vendor Advisory 2024-01

Initial notice describing the vulnerability and affected platforms.

## Impact
End-users may experience remote compromise when default credentials are unchanged.

## Remediation
Apply hotfix package 2024.11.5 and rotate secrets within 24 hours.

## References
- https://vendor.example.com/advisories/2024-01
@@ -24,8 +24,10 @@ public class StellaOpsScopesTests
    [InlineData(StellaOpsScopes.PolicyAuthor)]
    [InlineData(StellaOpsScopes.PolicySubmit)]
    [InlineData(StellaOpsScopes.PolicyApprove)]
    [InlineData(StellaOpsScopes.PolicyReview)]
    [InlineData(StellaOpsScopes.PolicyOperate)]
    [InlineData(StellaOpsScopes.PolicyPublish)]
    [InlineData(StellaOpsScopes.PolicyPromote)]
    [InlineData(StellaOpsScopes.PolicyAudit)]
    [InlineData(StellaOpsScopes.PolicyRun)]
    [InlineData(StellaOpsScopes.PolicySimulate)]
@@ -72,6 +74,8 @@ public class StellaOpsScopesTests
    [InlineData(" Signals:Write ", StellaOpsScopes.SignalsWrite)]
    [InlineData("AIRGAP:SEAL", StellaOpsScopes.AirgapSeal)]
    [InlineData("Policy:Author", StellaOpsScopes.PolicyAuthor)]
    [InlineData("Policy:Publish", StellaOpsScopes.PolicyPublish)]
    [InlineData("Policy:PROMOTE", StellaOpsScopes.PolicyPromote)]
    [InlineData("Export.Admin", StellaOpsScopes.ExportAdmin)]
    [InlineData("Advisory-AI:Operate", StellaOpsScopes.AdvisoryAiOperate)]
    [InlineData("Notify.Admin", StellaOpsScopes.NotifyAdmin)]
@@ -85,6 +85,26 @@ public static class StellaOpsClaimTypes
    /// </summary>
    public const string BackfillTicket = "stellaops:backfill_ticket";

    /// <summary>
    /// Digest of the policy package being published or promoted.
    /// </summary>
    public const string PolicyDigest = "stellaops:policy_digest";

    /// <summary>
    /// Change management ticket supplied when issuing policy publish/promote tokens.
    /// </summary>
    public const string PolicyTicket = "stellaops:policy_ticket";

    /// <summary>
    /// Operator-provided justification supplied when issuing policy publish/promote tokens.
    /// </summary>
    public const string PolicyReason = "stellaops:policy_reason";

    /// <summary>
    /// Operation discriminator indicating whether the policy token was issued for publish or promote.
    /// </summary>
    public const string PolicyOperation = "stellaops:policy_operation";

    /// <summary>
    /// Incident activation reason recorded when issuing observability incident tokens.
    /// </summary>
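Taken together, the four new claims describe one attestation: which operation, why, under which ticket, and over which policy digest. A hypothetical sketch (plain Python; the claim names mirror the constants above, but the "all four required" rule and the literal `publish`/`promote` values are assumptions made for illustration):

```python
# Hypothetical sketch of assembling/validating policy attestation claims.
# Claim names mirror StellaOpsClaimTypes; the validation rule is an assumption.
POLICY_OPERATION = "stellaops:policy_operation"
POLICY_REASON = "stellaops:policy_reason"
POLICY_TICKET = "stellaops:policy_ticket"
POLICY_DIGEST = "stellaops:policy_digest"

def build_policy_claims(operation: str, reason: str, ticket: str, digest: str) -> dict:
    if operation not in ("publish", "promote"):
        raise ValueError(f"unknown policy operation: {operation}")
    return {
        POLICY_OPERATION: operation,
        POLICY_REASON: reason,
        POLICY_TICKET: ticket,
        POLICY_DIGEST: digest,
    }

def missing_policy_claims(claims: dict) -> list[str]:
    required = (POLICY_OPERATION, POLICY_REASON, POLICY_TICKET, POLICY_DIGEST)
    return [name for name in required if not claims.get(name)]

claims = build_policy_claims("publish", "Publish approved policy", "CR-2000", "sha256:abc123")
print(missing_policy_claims(claims))  # → []
```

This mirrors the validation-side tests later in the diff, where a token carrying only some of these claims is rejected.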
@@ -154,14 +154,24 @@ public static class StellaOpsScopes
    public const string PolicyApprove = "policy:approve";

    /// <summary>
    /// Scope granting permission to operate Policy Studio promotions and runs.
    /// </summary>
    public const string PolicyOperate = "policy:operate";

    /// <summary>
    /// Scope granting permission to publish approved policy versions with attested artefacts.
    /// </summary>
    public const string PolicyPublish = "policy:publish";

    /// <summary>
    /// Scope granting permission to promote policy attestations between environments.
    /// </summary>
    public const string PolicyPromote = "policy:promote";

    /// <summary>
    /// Scope granting permission to audit Policy Studio activity.
    /// </summary>
    public const string PolicyAudit = "policy:audit";

    /// <summary>
    /// Scope granting permission to trigger policy runs and activation workflows.
@@ -377,12 +387,14 @@ public static class StellaOpsScopes
        PolicyEdit,
        PolicyRead,
        PolicyReview,
        PolicySubmit,
        PolicyApprove,
        PolicyOperate,
        PolicyPublish,
        PolicyPromote,
        PolicyAudit,
        PolicyRun,
        PolicyActivate,
        PolicySimulate,
        FindingsRead,
        EffectiveWrite,
@@ -0,0 +1,6 @@
namespace StellaOps.Authority.Storage.Mongo;

internal static class AuthorityMongoCollectionNames
{
    public const string ServiceAccounts = "authority_service_accounts";
}
@@ -0,0 +1,46 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;

namespace StellaOps.Authority.Storage.Mongo.Documents;

/// <summary>
/// Represents a service account that can receive delegated tokens.
/// </summary>
[BsonIgnoreExtraElements]
public sealed class AuthorityServiceAccountDocument
{
    [BsonId]
    [BsonRepresentation(BsonType.ObjectId)]
    public string Id { get; set; } = ObjectId.GenerateNewId().ToString();

    [BsonElement("accountId")]
    public string AccountId { get; set; } = string.Empty;

    [BsonElement("tenant")]
    public string Tenant { get; set; } = string.Empty;

    [BsonElement("displayName")]
    [BsonIgnoreIfNull]
    public string? DisplayName { get; set; }

    [BsonElement("description")]
    [BsonIgnoreIfNull]
    public string? Description { get; set; }

    [BsonElement("enabled")]
    public bool Enabled { get; set; } = true;

    [BsonElement("allowedScopes")]
    public List<string> AllowedScopes { get; set; } = new();

    [BsonElement("authorizedClients")]
    public List<string> AuthorizedClients { get; set; } = new();

    [BsonElement("createdAt")]
    public DateTimeOffset CreatedAt { get; set; } = DateTimeOffset.UtcNow;

    [BsonElement("updatedAt")]
    public DateTimeOffset UpdatedAt { get; set; } = DateTimeOffset.UtcNow;
}
@@ -1,75 +1,79 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;

namespace StellaOps.Authority.Storage.Mongo.Documents;

/// <summary>
/// Represents an OAuth token issued by Authority.
/// </summary>
[BsonIgnoreExtraElements]
public sealed class AuthorityTokenDocument
{
    [BsonId]
    [BsonRepresentation(BsonType.ObjectId)]
    public string Id { get; set; } = ObjectId.GenerateNewId().ToString();

    [BsonElement("tokenId")]
    public string TokenId { get; set; } = Guid.NewGuid().ToString("N");

    [BsonElement("type")]
    public string Type { get; set; } = string.Empty;

    [BsonElement("tokenKind")]
    [BsonIgnoreIfNull]
    public string? TokenKind { get; set; }

    [BsonElement("subjectId")]
    [BsonIgnoreIfNull]
    public string? SubjectId { get; set; }

    [BsonElement("clientId")]
    [BsonIgnoreIfNull]
    public string? ClientId { get; set; }

    [BsonElement("scope")]
    public List<string> Scope { get; set; } = new();

    [BsonElement("referenceId")]
    [BsonIgnoreIfNull]
    public string? ReferenceId { get; set; }

    [BsonElement("status")]
    public string Status { get; set; } = "valid";

    [BsonElement("payload")]
    [BsonIgnoreIfNull]
    public string? Payload { get; set; }

    [BsonElement("createdAt")]
    public DateTimeOffset CreatedAt { get; set; } = DateTimeOffset.UtcNow;

    [BsonElement("expiresAt")]
    [BsonIgnoreIfNull]
    public DateTimeOffset? ExpiresAt { get; set; }

    [BsonElement("revokedAt")]
    [BsonIgnoreIfNull]
    public DateTimeOffset? RevokedAt { get; set; }

    [BsonElement("revokedReason")]
    [BsonIgnoreIfNull]
    public string? RevokedReason { get; set; }

    [BsonElement("revokedReasonDescription")]
    [BsonIgnoreIfNull]
    public string? RevokedReasonDescription { get; set; }

    [BsonElement("senderConstraint")]
    [BsonIgnoreIfNull]
    public string? SenderConstraint { get; set; }

    [BsonElement("senderKeyThumbprint")]
    [BsonIgnoreIfNull]
    public string? SenderKeyThumbprint { get; set; }

    [BsonElement("senderNonce")]
    [BsonIgnoreIfNull]
    public string? SenderNonce { get; set; }
@@ -81,16 +85,24 @@ public sealed class AuthorityTokenDocument
    [BsonElement("tenant")]
    [BsonIgnoreIfNull]
    public string? Tenant { get; set; }

    [BsonElement("project")]
    [BsonIgnoreIfNull]
    public string? Project { get; set; }

    [BsonElement("serviceAccountId")]
    [BsonIgnoreIfNull]
    public string? ServiceAccountId { get; set; }

    [BsonElement("actors")]
    [BsonIgnoreIfNull]
    public List<string>? ActorChain { get; set; }

    [BsonElement("devices")]
    [BsonIgnoreIfNull]
    public List<BsonDocument>? Devices { get; set; }

    [BsonElement("revokedMetadata")]
    [BsonIgnoreIfNull]
    public Dictionary<string, string?>? RevokedMetadata { get; set; }
}
@@ -2,7 +2,8 @@ using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using Microsoft.Extensions.Options;
using MongoDB.Driver;
using StellaOps.Authority.Storage.Mongo;
using StellaOps.Authority.Storage.Mongo.Documents;
using StellaOps.Authority.Storage.Mongo.Initialization;
using StellaOps.Authority.Storage.Mongo.Migrations;
using StellaOps.Authority.Storage.Mongo.Options;
@@ -112,6 +113,12 @@ public static class ServiceCollectionExtensions
            var database = sp.GetRequiredService<IMongoDatabase>();
            return database.GetCollection<AuthorityAirgapAuditDocument>(AuthorityMongoDefaults.Collections.AirgapAudit);
        });

        services.AddSingleton(static sp =>
        {
            var database = sp.GetRequiredService<IMongoDatabase>();
            return database.GetCollection<AuthorityServiceAccountDocument>(AuthorityMongoCollectionNames.ServiceAccounts);
        });

        services.TryAddSingleton<IAuthorityCollectionInitializer, AuthorityUserCollectionInitializer>();
        services.TryAddSingleton<IAuthorityCollectionInitializer, AuthorityClientCollectionInitializer>();
@@ -121,6 +128,7 @@ public static class ServiceCollectionExtensions
        services.TryAddSingleton<IAuthorityCollectionInitializer, AuthorityRevocationCollectionInitializer>();
        services.TryAddSingleton<IAuthorityCollectionInitializer, AuthorityBootstrapInviteCollectionInitializer>();
        services.TryAddSingleton<IAuthorityCollectionInitializer, AuthorityAirgapAuditCollectionInitializer>();
        services.TryAddSingleton<IAuthorityCollectionInitializer, AuthorityServiceAccountCollectionInitializer>();

        services.TryAddSingleton<IAuthorityUserStore, AuthorityUserStore>();
        services.TryAddSingleton<IAuthorityClientStore, AuthorityClientStore>();
@@ -131,6 +139,7 @@ public static class ServiceCollectionExtensions
        services.TryAddSingleton<IAuthorityRevocationExportStateStore, AuthorityRevocationExportStateStore>();
        services.TryAddSingleton<IAuthorityBootstrapInviteStore, AuthorityBootstrapInviteStore>();
        services.TryAddSingleton<IAuthorityAirgapAuditStore, AuthorityAirgapAuditStore>();
        services.TryAddSingleton<IAuthorityServiceAccountStore, AuthorityServiceAccountStore>();

        return services;
    }
@@ -0,0 +1,30 @@
using MongoDB.Driver;
using StellaOps.Authority.Storage.Mongo;
using StellaOps.Authority.Storage.Mongo.Documents;

namespace StellaOps.Authority.Storage.Mongo.Initialization;

internal sealed class AuthorityServiceAccountCollectionInitializer : IAuthorityCollectionInitializer
{
    public async ValueTask EnsureIndexesAsync(IMongoDatabase database, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(database);

        var collection = database.GetCollection<AuthorityServiceAccountDocument>(AuthorityMongoCollectionNames.ServiceAccounts);

        var indexModels = new[]
        {
            new CreateIndexModel<AuthorityServiceAccountDocument>(
                Builders<AuthorityServiceAccountDocument>.IndexKeys.Ascending(account => account.AccountId),
                new CreateIndexOptions { Name = "service_account_id_unique", Unique = true }),
            new CreateIndexModel<AuthorityServiceAccountDocument>(
                Builders<AuthorityServiceAccountDocument>.IndexKeys.Ascending(account => account.Tenant).Ascending(account => account.Enabled),
                new CreateIndexOptions { Name = "service_account_tenant_enabled" }),
            new CreateIndexModel<AuthorityServiceAccountDocument>(
                Builders<AuthorityServiceAccountDocument>.IndexKeys.Ascending("authorizedClients"),
                new CreateIndexOptions { Name = "service_account_authorized_clients" })
        };

        await collection.Indexes.CreateManyAsync(indexModels, cancellationToken).ConfigureAwait(false);
    }
}
@@ -1,6 +1,7 @@
using Microsoft.Extensions.Logging;
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Authority.Storage.Mongo;

namespace StellaOps.Authority.Storage.Mongo.Migrations;

@@ -16,7 +17,8 @@ internal sealed class EnsureAuthorityCollectionsMigration : IAuthorityMongoMigra
        AuthorityMongoDefaults.Collections.Scopes,
        AuthorityMongoDefaults.Collections.Tokens,
        AuthorityMongoDefaults.Collections.LoginAttempts,
        AuthorityMongoDefaults.Collections.AirgapAudit,
        AuthorityMongoCollectionNames.ServiceAccounts
    };

    private readonly ILogger<EnsureAuthorityCollectionsMigration> logger;
@@ -0,0 +1,184 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
using StellaOps.Authority.Storage.Mongo.Documents;

namespace StellaOps.Authority.Storage.Mongo.Stores;

internal sealed class AuthorityServiceAccountStore : IAuthorityServiceAccountStore
{
    private readonly IMongoCollection<AuthorityServiceAccountDocument> collection;
    private readonly TimeProvider clock;
    private readonly ILogger<AuthorityServiceAccountStore> logger;

    public AuthorityServiceAccountStore(
        IMongoCollection<AuthorityServiceAccountDocument> collection,
        TimeProvider clock,
        ILogger<AuthorityServiceAccountStore> logger)
    {
        this.collection = collection ?? throw new ArgumentNullException(nameof(collection));
        this.clock = clock ?? throw new ArgumentNullException(nameof(clock));
        this.logger = logger ?? throw new ArgumentNullException(nameof(logger));
    }

    public async ValueTask<AuthorityServiceAccountDocument?> FindByAccountIdAsync(
        string accountId,
        CancellationToken cancellationToken,
        IClientSessionHandle? session = null)
    {
        if (string.IsNullOrWhiteSpace(accountId))
        {
            return null;
        }

        var normalized = accountId.Trim();
        var filter = Builders<AuthorityServiceAccountDocument>.Filter.Eq(account => account.AccountId, normalized);
        var cursor = session is { }
            ? collection.Find(session, filter)
            : collection.Find(filter);

        return await cursor.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
    }

    public async ValueTask<IReadOnlyList<AuthorityServiceAccountDocument>> ListByTenantAsync(
        string tenant,
        CancellationToken cancellationToken,
        IClientSessionHandle? session = null)
    {
        if (string.IsNullOrWhiteSpace(tenant))
        {
            return Array.Empty<AuthorityServiceAccountDocument>();
        }

        var normalized = tenant.Trim().ToLowerInvariant();
        var filter = Builders<AuthorityServiceAccountDocument>.Filter.Eq(account => account.Tenant, normalized);
        var cursor = session is { }
            ? collection.Find(session, filter)
            : collection.Find(filter);

        var results = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
        return results;
    }

    public async ValueTask UpsertAsync(
        AuthorityServiceAccountDocument document,
        CancellationToken cancellationToken,
        IClientSessionHandle? session = null)
    {
        ArgumentNullException.ThrowIfNull(document);

        NormalizeDocument(document);
        var now = clock.GetUtcNow();
        document.UpdatedAt = now;
        document.CreatedAt = document.CreatedAt == default ? now : document.CreatedAt;

        var filter = Builders<AuthorityServiceAccountDocument>.Filter.Eq(account => account.AccountId, document.AccountId);
        var options = new ReplaceOptions { IsUpsert = true };

        ReplaceOneResult result;
        if (session is { })
        {
            result = await collection.ReplaceOneAsync(session, filter, document, options, cancellationToken).ConfigureAwait(false);
        }
        else
        {
            result = await collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
        }

        if (result.UpsertedId is not null)
        {
            logger.LogInformation("Inserted Authority service account {AccountId}.", document.AccountId);
        }
        else
        {
            logger.LogDebug("Updated Authority service account {AccountId}.", document.AccountId);
        }
    }

    public async ValueTask<bool> DeleteAsync(
        string accountId,
        CancellationToken cancellationToken,
        IClientSessionHandle? session = null)
    {
        if (string.IsNullOrWhiteSpace(accountId))
        {
            return false;
        }

        var normalized = accountId.Trim();
        var filter = Builders<AuthorityServiceAccountDocument>.Filter.Eq(account => account.AccountId, normalized);

        DeleteResult result;
        if (session is { })
        {
            result = await collection.DeleteOneAsync(session, filter, cancellationToken: cancellationToken).ConfigureAwait(false);
        }
        else
        {
            result = await collection.DeleteOneAsync(filter, cancellationToken).ConfigureAwait(false);
        }

        if (result.DeletedCount > 0)
        {
            logger.LogInformation("Deleted Authority service account {AccountId}.", normalized);
            return true;
        }

        return false;
    }

    private static void NormalizeDocument(AuthorityServiceAccountDocument document)
    {
        document.AccountId = string.IsNullOrWhiteSpace(document.AccountId)
            ? string.Empty
            : document.AccountId.Trim().ToLowerInvariant();

        document.Tenant = string.IsNullOrWhiteSpace(document.Tenant)
            ? string.Empty
            : document.Tenant.Trim().ToLowerInvariant();

        NormalizeList(document.AllowedScopes, static scope => scope.Trim().ToLowerInvariant(), StringComparer.Ordinal);
        NormalizeList(document.AuthorizedClients, static client => client.Trim().ToLowerInvariant(), StringComparer.OrdinalIgnoreCase);
    }

    private static void NormalizeList(IList<string> values, Func<string, string> normalizer, IEqualityComparer<string> comparer)
    {
        ArgumentNullException.ThrowIfNull(values);
        ArgumentNullException.ThrowIfNull(normalizer);
        comparer ??= StringComparer.Ordinal;

        if (values.Count == 0)
        {
            return;
        }

        var seen = new HashSet<string>(comparer);
        for (var index = values.Count - 1; index >= 0; index--)
        {
            var current = values[index];
            if (string.IsNullOrWhiteSpace(current))
            {
                values.RemoveAt(index);
                continue;
            }

            var normalized = normalizer(current);
            if (string.IsNullOrWhiteSpace(normalized))
            {
                values.RemoveAt(index);
                continue;
            }

            if (!seen.Add(normalized))
            {
                values.RemoveAt(index);
                continue;
            }

            values[index] = normalized;
        }
    }
}
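`NormalizeList` above walks the list backwards so that `RemoveAt` never shifts an index that is still to be visited, removing blanks and duplicates in place while normalizing survivors. The same pattern in plain Python (a sketch, not project code):

```python
# In-place normalize + dedupe, iterating backwards so removals do not shift
# the indices still to be visited. Like the C# version, duplicates keep the
# LAST occurrence's position.
def normalize_list(values: list[str], normalizer=lambda s: s.strip().lower()) -> None:
    seen = set()
    for index in range(len(values) - 1, -1, -1):
        normalized = normalizer(values[index])
        if not normalized or normalized in seen:
            del values[index]
            continue
        seen.add(normalized)
        values[index] = normalized

scopes = ["Policy:Publish", "policy:publish", "  ", "policy:promote"]
normalize_list(scopes)
print(scopes)  # → ['policy:publish', 'policy:promote']
```

Iterating forwards would either require index bookkeeping after every removal or a second list; the backwards walk keeps the operation O(n) with no extra allocation beyond the `seen` set.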
@@ -0,0 +1,18 @@
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;
using StellaOps.Authority.Storage.Mongo.Documents;

namespace StellaOps.Authority.Storage.Mongo.Stores;

internal interface IAuthorityServiceAccountStore
{
    ValueTask<AuthorityServiceAccountDocument?> FindByAccountIdAsync(string accountId, CancellationToken cancellationToken, IClientSessionHandle? session = null);

    ValueTask<IReadOnlyList<AuthorityServiceAccountDocument>> ListByTenantAsync(string tenant, CancellationToken cancellationToken, IClientSessionHandle? session = null);

    ValueTask UpsertAsync(AuthorityServiceAccountDocument document, CancellationToken cancellationToken, IClientSessionHandle? session = null);

    ValueTask<bool> DeleteAsync(string accountId, CancellationToken cancellationToken, IClientSessionHandle? session = null);
}
@@ -68,12 +68,13 @@ public sealed class NotifyAckTokenRotationEndpointTests : IClassFixture<Authorit
            services.RemoveAll<IAuthEventSink>();
            services.AddSingleton<IAuthEventSink>(sink);
            services.Replace(ServiceDescriptor.Singleton<TimeProvider>(timeProvider));
            var authBuilder = services.AddAuthentication(options =>
            {
                options.DefaultAuthenticateScheme = TestAuthHandler.SchemeName;
                options.DefaultChallengeScheme = TestAuthHandler.SchemeName;
            });
            authBuilder.AddScheme<AuthenticationSchemeOptions, TestAuthHandler>(TestAuthHandler.SchemeName, _ => { });
            authBuilder.AddScheme<AuthenticationSchemeOptions, TestAuthHandler>(StellaOpsAuthenticationDefaults.AuthenticationScheme, _ => { });
        });
    });

@@ -143,12 +144,13 @@ public sealed class NotifyAckTokenRotationEndpointTests : IClassFixture<Authorit
            services.RemoveAll<IAuthEventSink>();
            services.AddSingleton<IAuthEventSink>(sink);
            services.Replace(ServiceDescriptor.Singleton<TimeProvider>(timeProvider));
            var authBuilder = services.AddAuthentication(options =>
            {
                options.DefaultAuthenticateScheme = TestAuthHandler.SchemeName;
                options.DefaultChallengeScheme = TestAuthHandler.SchemeName;
            });
            authBuilder.AddScheme<AuthenticationSchemeOptions, TestAuthHandler>(TestAuthHandler.SchemeName, _ => { });
            authBuilder.AddScheme<AuthenticationSchemeOptions, TestAuthHandler>(StellaOpsAuthenticationDefaults.AuthenticationScheme, _ => { });
        });
    });
@@ -386,6 +386,74 @@ public class ClientCredentialsHandlersTests
        Assert.Equal(new[] { "policy:author" }, grantedScopes);
    }

    [Fact]
    public async Task ValidateClientCredentials_RejectsPolicyPublishForClientCredentials()
    {
        var clientDocument = CreateClient(
            secret: "s3cr3t!",
            allowedGrantTypes: "client_credentials",
            allowedScopes: "policy:publish",
            tenant: "tenant-alpha");

        var registry = CreateRegistry(withClientProvisioning: true, clientDescriptor: CreateDescriptor(clientDocument));
        var options = TestHelpers.CreateAuthorityOptions();
        var handler = new ValidateClientCredentialsHandler(
            new TestClientStore(clientDocument),
            registry,
            TestActivitySource,
            new TestAuthEventSink(),
            new TestRateLimiterMetadataAccessor(),
            TimeProvider.System,
            new NoopCertificateValidator(),
            new HttpContextAccessor(),
            options,
            NullLogger<ValidateClientCredentialsHandler>.Instance);

        var transaction = CreateTokenTransaction(clientDocument.ClientId, "s3cr3t!", scope: "policy:publish");
        var context = new OpenIddictServerEvents.ValidateTokenRequestContext(transaction);

        await handler.HandleAsync(context);

        Assert.True(context.IsRejected);
        Assert.Equal(OpenIddictConstants.Errors.InvalidScope, context.Error);
        Assert.Equal("Scope 'policy:publish' requires interactive authentication.", context.ErrorDescription);
        Assert.Equal(StellaOpsScopes.PolicyPublish, context.Transaction.Properties[AuthorityOpenIddictConstants.AuditInvalidScopeProperty]);
    }

    [Fact]
    public async Task ValidateClientCredentials_RejectsPolicyPromoteForClientCredentials()
    {
        var clientDocument = CreateClient(
            secret: "s3cr3t!",
            allowedGrantTypes: "client_credentials",
            allowedScopes: "policy:promote",
            tenant: "tenant-alpha");

        var registry = CreateRegistry(withClientProvisioning: true, clientDescriptor: CreateDescriptor(clientDocument));
        var options = TestHelpers.CreateAuthorityOptions();
        var handler = new ValidateClientCredentialsHandler(
            new TestClientStore(clientDocument),
            registry,
            TestActivitySource,
            new TestAuthEventSink(),
            new TestRateLimiterMetadataAccessor(),
            TimeProvider.System,
            new NoopCertificateValidator(),
            new HttpContextAccessor(),
            options,
            NullLogger<ValidateClientCredentialsHandler>.Instance);

        var transaction = CreateTokenTransaction(clientDocument.ClientId, "s3cr3t!", scope: "policy:promote");
        var context = new OpenIddictServerEvents.ValidateTokenRequestContext(transaction);

        await handler.HandleAsync(context);

        Assert.True(context.IsRejected);
        Assert.Equal(OpenIddictConstants.Errors.InvalidScope, context.Error);
        Assert.Equal("Scope 'policy:promote' requires interactive authentication.", context.ErrorDescription);
        Assert.Equal(StellaOpsScopes.PolicyPromote, context.Transaction.Properties[AuthorityOpenIddictConstants.AuditInvalidScopeProperty]);
    }

    [Fact]
    public async Task ValidateClientCredentials_AllowsAdvisoryReadWithAocVerify()
    {
@@ -3163,6 +3231,185 @@ public class ObservabilityIncidentTokenHandlerTests
        Assert.Equal(OpenIddictConstants.Errors.InvalidToken, context.Error);
    }

    [Fact]
    public async Task ValidateAccessTokenHandler_RejectsPolicyAttestationMissingClaims()
    {
        var clientDocument = CreateClient(tenant: "tenant-alpha");
        var tokenStore = new TestTokenStore
        {
            Inserted = new AuthorityTokenDocument
            {
                TokenId = "token-policy",
                Status = "valid",
                ClientId = clientDocument.ClientId,
                Tenant = "tenant-alpha",
                Scope = new List<string> { StellaOpsScopes.PolicyPublish },
                CreatedAt = DateTimeOffset.UtcNow
            }
        };

        var metadataAccessor = new TestRateLimiterMetadataAccessor();
        var auditSink = new TestAuthEventSink();
        var sessionAccessor = new NullMongoSessionAccessor();
        var handler = new ValidateAccessTokenHandler(
            tokenStore,
            sessionAccessor,
            new TestClientStore(clientDocument),
            CreateRegistry(withClientProvisioning: true, clientDescriptor: CreateDescriptor(clientDocument)),
            metadataAccessor,
            auditSink,
            TimeProvider.System,
            ActivitySource,
            NullLogger<ValidateAccessTokenHandler>.Instance);

        var transaction = new OpenIddictServerTransaction
        {
            Options = new OpenIddictServerOptions(),
            EndpointType = OpenIddictServerEndpointType.Token,
            Request = new OpenIddictRequest()
        };

        var principal = CreatePrincipal(clientDocument.ClientId, "token-policy", clientDocument.Plugin);
        principal.SetScopes(StellaOpsScopes.PolicyPublish);
        principal.SetClaim(StellaOpsClaimTypes.PolicyOperation, AuthorityOpenIddictConstants.PolicyOperationPublishValue);
        principal.SetClaim(StellaOpsClaimTypes.PolicyReason, "Publish approved policy");
        principal.SetClaim(StellaOpsClaimTypes.PolicyTicket, "CR-2000");
        principal.SetClaim(OpenIddictConstants.Claims.AuthenticationTime, DateTimeOffset.UtcNow.ToUnixTimeSeconds().ToString(CultureInfo.InvariantCulture));

        var context = new OpenIddictServerEvents.ValidateTokenContext(transaction)
        {
            Principal = principal,
            TokenId = "token-policy"
        };

        await handler.HandleAsync(context);

        Assert.True(context.IsRejected);
        Assert.Equal(OpenIddictConstants.Errors.InvalidToken, context.Error);
    }

    [Fact]
    public async Task ValidateAccessTokenHandler_RejectsPolicyAttestationNotFresh()
    {
        var clientDocument = CreateClient(tenant: "tenant-alpha");
        var tokenStore = new TestTokenStore
        {
            Inserted = new AuthorityTokenDocument
            {
                TokenId = "token-policy-stale",
                Status = "valid",
                ClientId = clientDocument.ClientId,
                Tenant = "tenant-alpha",
                Scope = new List<string> { StellaOpsScopes.PolicyPublish },
                CreatedAt = DateTimeOffset.UtcNow.AddMinutes(-20)
            }
        };

        var metadataAccessor = new TestRateLimiterMetadataAccessor();
        var auditSink = new TestAuthEventSink();
        var sessionAccessor = new NullMongoSessionAccessor();
        var handler = new ValidateAccessTokenHandler(
            tokenStore,
            sessionAccessor,
            new TestClientStore(clientDocument),
            CreateRegistry(withClientProvisioning: true, clientDescriptor: CreateDescriptor(clientDocument)),
            metadataAccessor,
            auditSink,
            TimeProvider.System,
            ActivitySource,
            NullLogger<ValidateAccessTokenHandler>.Instance);

        var transaction = new OpenIddictServerTransaction
        {
            Options = new OpenIddictServerOptions(),
            EndpointType = OpenIddictServerEndpointType.Token,
            Request = new OpenIddictRequest()
        };

        var principal = CreatePrincipal(clientDocument.ClientId, "token-policy-stale", clientDocument.Plugin);
        principal.SetScopes(StellaOpsScopes.PolicyPublish);
        principal.SetClaim(StellaOpsClaimTypes.PolicyOperation, AuthorityOpenIddictConstants.PolicyOperationPublishValue);
        principal.SetClaim(StellaOpsClaimTypes.PolicyDigest, new string('a', 64));
        principal.SetClaim(StellaOpsClaimTypes.PolicyReason, "Publish approved policy");
        principal.SetClaim(StellaOpsClaimTypes.PolicyTicket, "CR-2001");
        var staleAuth = DateTimeOffset.UtcNow.AddMinutes(-10);
        principal.SetClaim(OpenIddictConstants.Claims.AuthenticationTime, staleAuth.ToUnixTimeSeconds().ToString(CultureInfo.InvariantCulture));

        var context = new OpenIddictServerEvents.ValidateTokenContext(transaction)
        {
            Principal = principal,
            TokenId = "token-policy-stale"
        };

        await handler.HandleAsync(context);

        Assert.True(context.IsRejected);
        Assert.Equal(OpenIddictConstants.Errors.InvalidToken, context.Error);
    }

    [Theory]
    [InlineData(StellaOpsScopes.PolicyPublish, AuthorityOpenIddictConstants.PolicyOperationPublishValue)]
    [InlineData(StellaOpsScopes.PolicyPromote, AuthorityOpenIddictConstants.PolicyOperationPromoteValue)]
    public async Task ValidateAccessTokenHandler_AllowsPolicyAttestationWithMetadata(string scope, string expectedOperation)
    {
        var clientDocument = CreateClient(tenant: "tenant-alpha");
        var tokenStore = new TestTokenStore
        {
            Inserted = new AuthorityTokenDocument
            {
                TokenId = $"token-{expectedOperation}",
                Status = "valid",
                ClientId = clientDocument.ClientId,
                Tenant = "tenant-alpha",
                Scope = new List<string> { scope },
                CreatedAt = DateTimeOffset.UtcNow
            }
        };

        var metadataAccessor = new TestRateLimiterMetadataAccessor();
        var auditSink = new TestAuthEventSink();
        var sessionAccessor = new NullMongoSessionAccessor();
        var handler = new ValidateAccessTokenHandler(
            tokenStore,
            sessionAccessor,
            new TestClientStore(clientDocument),
            CreateRegistry(withClientProvisioning: true, clientDescriptor: CreateDescriptor(clientDocument)),
            metadataAccessor,
            auditSink,
            TimeProvider.System,
            ActivitySource,
            NullLogger<ValidateAccessTokenHandler>.Instance);

        var transaction = new OpenIddictServerTransaction
        {
            Options = new OpenIddictServerOptions(),
            EndpointType = OpenIddictServerEndpointType.Token,
            Request = new OpenIddictRequest()
        };

        var principal = CreatePrincipal(clientDocument.ClientId, $"token-{expectedOperation}", clientDocument.Plugin);
        principal.SetScopes(scope);
        principal.SetClaim(StellaOpsClaimTypes.PolicyOperation, expectedOperation);
        principal.SetClaim(StellaOpsClaimTypes.PolicyDigest, new string('b', 64));
        principal.SetClaim(StellaOpsClaimTypes.PolicyReason, "Promotion approved");
        principal.SetClaim(StellaOpsClaimTypes.PolicyTicket, "CR-2002");
        principal.SetClaim(OpenIddictConstants.Claims.AuthenticationTime, DateTimeOffset.UtcNow.ToUnixTimeSeconds().ToString(CultureInfo.InvariantCulture));

        var context = new OpenIddictServerEvents.ValidateTokenContext(transaction)
        {
            Principal = principal,
            TokenId = $"token-{expectedOperation}"
        };

        await handler.HandleAsync(context);

        Assert.False(context.IsRejected);
        var metadata = metadataAccessor.GetMetadata();
        Assert.NotNull(metadata);
        Assert.True(metadata!.Tags.TryGetValue("authority.policy_attestation_validated", out var tagValue));
        Assert.Equal(expectedOperation.ToLowerInvariant(), tagValue);
    }

    [Fact]
    public async Task ValidateRefreshTokenHandler_RejectsObsIncidentScope()
    {

@@ -203,6 +203,137 @@ public class PasswordGrantHandlersTests
        Assert.Contains(sink.Events, record => record.EventType == "authority.password.grant" && record.Outcome == AuthEventOutcome.Failure);
    }

    [Fact]
    public async Task ValidatePasswordGrant_RejectsPolicyPublishWithoutReason()
    {
        var sink = new TestAuthEventSink();
        var metadataAccessor = new TestRateLimiterMetadataAccessor();
        var registry = CreateRegistry(new SuccessCredentialStore());
        var clientStore = new StubClientStore(CreateClientDocument("policy:publish"));
        var validate = new ValidatePasswordGrantHandler(registry, TestActivitySource, sink, metadataAccessor, clientStore, TimeProvider.System, NullLogger<ValidatePasswordGrantHandler>.Instance);

        var transaction = CreatePasswordTransaction("alice", "Password1!", "policy:publish");
        transaction.Request.SetParameter("policy_ticket", "CR-1001");
        transaction.Request.SetParameter("policy_digest", new string('a', 64));
        var context = new OpenIddictServerEvents.ValidateTokenRequestContext(transaction);

        await validate.HandleAsync(context);

        Assert.True(context.IsRejected);
        Assert.Equal(OpenIddictConstants.Errors.InvalidRequest, context.Error);
        Assert.Equal("Policy attestation actions require 'policy_reason'.", context.ErrorDescription);
        Assert.Equal(StellaOpsScopes.PolicyPublish, context.Transaction.Properties[AuthorityOpenIddictConstants.AuditInvalidScopeProperty]);
        Assert.Contains(sink.Events, record =>
            record.EventType == "authority.password.grant" &&
            record.Outcome == AuthEventOutcome.Failure &&
            record.Properties.Any(property => property.Name == "policy.action"));
    }

    [Fact]
    public async Task ValidatePasswordGrant_RejectsPolicyPublishWithoutTicket()
    {
        var sink = new TestAuthEventSink();
        var metadataAccessor = new TestRateLimiterMetadataAccessor();
        var registry = CreateRegistry(new SuccessCredentialStore());
        var clientStore = new StubClientStore(CreateClientDocument("policy:publish"));
        var validate = new ValidatePasswordGrantHandler(registry, TestActivitySource, sink, metadataAccessor, clientStore, TimeProvider.System, NullLogger<ValidatePasswordGrantHandler>.Instance);

        var transaction = CreatePasswordTransaction("alice", "Password1!", "policy:publish");
        transaction.Request.SetParameter("policy_reason", "Publish approved policy");
        transaction.Request.SetParameter("policy_digest", new string('b', 64));
        var context = new OpenIddictServerEvents.ValidateTokenRequestContext(transaction);

        await validate.HandleAsync(context);

        Assert.True(context.IsRejected);
        Assert.Equal(OpenIddictConstants.Errors.InvalidRequest, context.Error);
        Assert.Equal("Policy attestation actions require 'policy_ticket'.", context.ErrorDescription);
        Assert.Equal(StellaOpsScopes.PolicyPublish, context.Transaction.Properties[AuthorityOpenIddictConstants.AuditInvalidScopeProperty]);
    }

    [Fact]
    public async Task ValidatePasswordGrant_RejectsPolicyPublishWithoutDigest()
    {
        var sink = new TestAuthEventSink();
        var metadataAccessor = new TestRateLimiterMetadataAccessor();
        var registry = CreateRegistry(new SuccessCredentialStore());
        var clientStore = new StubClientStore(CreateClientDocument("policy:publish"));
        var validate = new ValidatePasswordGrantHandler(registry, TestActivitySource, sink, metadataAccessor, clientStore, TimeProvider.System, NullLogger<ValidatePasswordGrantHandler>.Instance);

        var transaction = CreatePasswordTransaction("alice", "Password1!", "policy:publish");
        transaction.Request.SetParameter("policy_reason", "Publish approved policy");
        transaction.Request.SetParameter("policy_ticket", "CR-1002");
        var context = new OpenIddictServerEvents.ValidateTokenRequestContext(transaction);

        await validate.HandleAsync(context);

        Assert.True(context.IsRejected);
        Assert.Equal(OpenIddictConstants.Errors.InvalidRequest, context.Error);
        Assert.Equal("Policy attestation actions require 'policy_digest'.", context.ErrorDescription);
        Assert.Equal(StellaOpsScopes.PolicyPublish, context.Transaction.Properties[AuthorityOpenIddictConstants.AuditInvalidScopeProperty]);
    }

    [Fact]
    public async Task ValidatePasswordGrant_RejectsPolicyPublishWithInvalidDigest()
    {
        var sink = new TestAuthEventSink();
        var metadataAccessor = new TestRateLimiterMetadataAccessor();
        var registry = CreateRegistry(new SuccessCredentialStore());
        var clientStore = new StubClientStore(CreateClientDocument("policy:publish"));
        var validate = new ValidatePasswordGrantHandler(registry, TestActivitySource, sink, metadataAccessor, clientStore, TimeProvider.System, NullLogger<ValidatePasswordGrantHandler>.Instance);

        var transaction = CreatePasswordTransaction("alice", "Password1!", "policy:publish");
        transaction.Request.SetParameter("policy_reason", "Publish approved policy");
        transaction.Request.SetParameter("policy_ticket", "CR-1003");
        transaction.Request.SetParameter("policy_digest", "not-hex");
        var context = new OpenIddictServerEvents.ValidateTokenRequestContext(transaction);

        await validate.HandleAsync(context);

        Assert.True(context.IsRejected);
        Assert.Equal(OpenIddictConstants.Errors.InvalidRequest, context.Error);
        Assert.Equal("policy_digest must be a hexadecimal string between 32 and 128 characters.", context.ErrorDescription);
        Assert.Equal(StellaOpsScopes.PolicyPublish, context.Transaction.Properties[AuthorityOpenIddictConstants.AuditInvalidScopeProperty]);
    }

    [Theory]
    [InlineData("policy:publish", AuthorityOpenIddictConstants.PolicyOperationPublishValue)]
    [InlineData("policy:promote", AuthorityOpenIddictConstants.PolicyOperationPromoteValue)]
    public async Task HandlePasswordGrant_AddsPolicyAttestationClaims(string scope, string expectedOperation)
    {
        var sink = new TestAuthEventSink();
        var metadataAccessor = new TestRateLimiterMetadataAccessor();
        var registry = CreateRegistry(new SuccessCredentialStore());
        var clientDocument = CreateClientDocument(scope);
        var clientStore = new StubClientStore(clientDocument);

        var validate = new ValidatePasswordGrantHandler(registry, TestActivitySource, sink, metadataAccessor, clientStore, TimeProvider.System, NullLogger<ValidatePasswordGrantHandler>.Instance);
        var handle = new HandlePasswordGrantHandler(registry, clientStore, TestActivitySource, sink, metadataAccessor, TimeProvider.System, NullLogger<HandlePasswordGrantHandler>.Instance);

        var transaction = CreatePasswordTransaction("alice", "Password1!", scope);
        transaction.Request.SetParameter("policy_reason", "Promote approved policy");
        transaction.Request.SetParameter("policy_ticket", "CR-1004");
        transaction.Request.SetParameter("policy_digest", new string('c', 64));

        var validateContext = new OpenIddictServerEvents.ValidateTokenRequestContext(transaction);
        await validate.HandleAsync(validateContext);
        Assert.False(validateContext.IsRejected);

        var handleContext = new OpenIddictServerEvents.HandleTokenRequestContext(transaction);
        await handle.HandleAsync(handleContext);

        Assert.False(handleContext.IsRejected);
        var principal = Assert.IsType<ClaimsPrincipal>(handleContext.Principal);
        Assert.Equal(expectedOperation, principal.GetClaim(StellaOpsClaimTypes.PolicyOperation));
        Assert.Equal(new string('c', 64), principal.GetClaim(StellaOpsClaimTypes.PolicyDigest));
        Assert.Equal("Promote approved policy", principal.GetClaim(StellaOpsClaimTypes.PolicyReason));
        Assert.Equal("CR-1004", principal.GetClaim(StellaOpsClaimTypes.PolicyTicket));
        Assert.Contains(sink.Events, record =>
            record.EventType == "authority.password.grant" &&
            record.Outcome == AuthEventOutcome.Success &&
            record.Properties.Any(property => property.Name == "policy.action"));
    }

    [Fact]
    public async Task ValidatePasswordGrant_RejectsPolicyAuthorWithoutTenant()
    {

@@ -15,7 +15,7 @@
    <PackageReference Include="MongoDB.Driver" Version="3.5.0" />
  </ItemGroup>
  <ItemGroup>
    <Compile Include="../../../tests/shared/OpenSslLegacyShim.cs" Link="Infrastructure/OpenSslLegacyShim.cs" />
    <None Include="../../../tests/native/openssl-1.1/linux-x64/*" Link="native/linux-x64/%(Filename)%(Extension)" CopyToOutputDirectory="PreserveNewest" />
    <Compile Include="../../../../tests/shared/OpenSslLegacyShim.cs" Link="Infrastructure/OpenSslLegacyShim.cs" />
    <None Include="../../../../tests/native/openssl-1.1/linux-x64/*" Link="native/linux-x64/%(Filename)%(Extension)" CopyToOutputDirectory="PreserveNewest" />
  </ItemGroup>
</Project>

@@ -46,4 +46,14 @@ internal static class AuthorityOpenIddictConstants
    internal const string BackfillTicketProperty = "authority:backfill_ticket";
    internal const string BackfillReasonParameterName = "backfill_reason";
    internal const string BackfillTicketParameterName = "backfill_ticket";
    internal const string PolicyReasonProperty = "authority:policy_reason";
    internal const string PolicyTicketProperty = "authority:policy_ticket";
    internal const string PolicyDigestProperty = "authority:policy_digest";
    internal const string PolicyOperationProperty = "authority:policy_operation";
    internal const string PolicyAuditPropertiesProperty = "authority:policy_audit_properties";
    internal const string PolicyReasonParameterName = "policy_reason";
    internal const string PolicyTicketParameterName = "policy_ticket";
    internal const string PolicyDigestParameterName = "policy_digest";
    internal const string PolicyOperationPublishValue = "publish";
    internal const string PolicyOperationPromoteValue = "promote";
}

@@ -314,6 +314,8 @@ internal sealed class ValidateClientCredentialsHandler : IOpenIddictServerHandle
        var hasPolicyAuthor = grantedScopes.Length > 0 && Array.IndexOf(grantedScopes, StellaOpsScopes.PolicyAuthor) >= 0;
        var hasPolicyReview = grantedScopes.Length > 0 && Array.IndexOf(grantedScopes, StellaOpsScopes.PolicyReview) >= 0;
        var hasPolicyOperate = grantedScopes.Length > 0 && Array.IndexOf(grantedScopes, StellaOpsScopes.PolicyOperate) >= 0;
        var hasPolicyPublish = grantedScopes.Length > 0 && Array.IndexOf(grantedScopes, StellaOpsScopes.PolicyPublish) >= 0;
        var hasPolicyPromote = grantedScopes.Length > 0 && Array.IndexOf(grantedScopes, StellaOpsScopes.PolicyPromote) >= 0;
        var hasPolicyAudit = grantedScopes.Length > 0 && Array.IndexOf(grantedScopes, StellaOpsScopes.PolicyAudit) >= 0;
        var hasPolicyApprove = grantedScopes.Length > 0 && Array.IndexOf(grantedScopes, StellaOpsScopes.PolicyApprove) >= 0;
        var hasPolicyRun = grantedScopes.Length > 0 && Array.IndexOf(grantedScopes, StellaOpsScopes.PolicyRun) >= 0;
@@ -327,6 +329,8 @@ internal sealed class ValidateClientCredentialsHandler : IOpenIddictServerHandle
        var policyStudioScopesRequested = hasPolicyAuthor
            || hasPolicyReview
            || hasPolicyOperate
            || hasPolicyPublish
            || hasPolicyPromote
            || hasPolicyAudit
            || hasPolicyApprove
            || hasPolicyRun
@@ -662,6 +666,20 @@ internal sealed class ValidateClientCredentialsHandler : IOpenIddictServerHandle
            return;
        }

        if (hasPolicyPublish || hasPolicyPromote)
        {
            var restrictedScope = hasPolicyPublish ? StellaOpsScopes.PolicyPublish : StellaOpsScopes.PolicyPromote;
            context.Transaction.Properties[AuthorityOpenIddictConstants.AuditInvalidScopeProperty] = restrictedScope;
            activity?.SetTag("authority.policy_attestation_denied", restrictedScope);
            var message = $"Scope '{restrictedScope}' requires interactive authentication.";
            context.Reject(OpenIddictConstants.Errors.InvalidScope, message);
            logger.LogWarning(
                "Client credentials validation failed for {ClientId}: {Scope} is restricted to interactive authentication.",
                document.ClientId,
                restrictedScope);
            return;
        }

        if (policyStudioScopesRequested && !EnsureTenantAssigned())
        {
            var policyScopeForAudit =

@@ -202,7 +202,29 @@ internal sealed class ValidatePasswordGrantHandler : IOpenIddictServerHandler<Op
            => scopes.Length > 0 && Array.IndexOf(scopes, scope) >= 0;
        static string? Normalize(string? value) => string.IsNullOrWhiteSpace(value) ? null : value.Trim();

        static bool IsHexString(string value)
        {
            if (string.IsNullOrEmpty(value))
            {
                return false;
            }

            foreach (var ch in value)
            {
                if (!System.Uri.IsHexDigit(ch))
                {
                    return false;
                }
            }

            return true;
        }

        const int IncidentReasonMaxLength = 512;
        const int PolicyReasonMaxLength = 512;
        const int PolicyTicketMaxLength = 128;
        const int PolicyDigestMinLength = 32;
        const int PolicyDigestMaxLength = 128;

        var hasAdvisoryIngest = ContainsScope(grantedScopesArray, StellaOpsScopes.AdvisoryIngest);
        var hasAdvisoryRead = ContainsScope(grantedScopesArray, StellaOpsScopes.AdvisoryRead);
@@ -221,6 +243,8 @@ internal sealed class ValidatePasswordGrantHandler : IOpenIddictServerHandler<Op
        var hasPolicyAuthor = ContainsScope(grantedScopesArray, StellaOpsScopes.PolicyAuthor);
        var hasPolicyReview = ContainsScope(grantedScopesArray, StellaOpsScopes.PolicyReview);
        var hasPolicyOperate = ContainsScope(grantedScopesArray, StellaOpsScopes.PolicyOperate);
        var hasPolicyPublish = ContainsScope(grantedScopesArray, StellaOpsScopes.PolicyPublish);
        var hasPolicyPromote = ContainsScope(grantedScopesArray, StellaOpsScopes.PolicyPromote);
        var hasPolicyAudit = ContainsScope(grantedScopesArray, StellaOpsScopes.PolicyAudit);
        var hasPolicyApprove = ContainsScope(grantedScopesArray, StellaOpsScopes.PolicyApprove);
        var hasPolicyRun = ContainsScope(grantedScopesArray, StellaOpsScopes.PolicyRun);
@@ -230,6 +254,8 @@ internal sealed class ValidatePasswordGrantHandler : IOpenIddictServerHandler<Op
        var policyStudioScopesRequested = hasPolicyAuthor
            || hasPolicyReview
            || hasPolicyOperate
            || hasPolicyPublish
            || hasPolicyPromote
            || hasPolicyAudit
            || hasPolicyApprove
            || hasPolicyRun
@@ -501,6 +527,126 @@ internal sealed class ValidatePasswordGrantHandler : IOpenIddictServerHandler<Op
            return;
        }

        if (hasPolicyPublish || hasPolicyPromote)
        {
            var restrictedScope = hasPolicyPublish ? StellaOpsScopes.PolicyPublish : StellaOpsScopes.PolicyPromote;
            var policyOperation = hasPolicyPublish
                ? AuthorityOpenIddictConstants.PolicyOperationPublishValue
                : AuthorityOpenIddictConstants.PolicyOperationPromoteValue;

            context.Transaction.Properties[AuthorityOpenIddictConstants.PolicyOperationProperty] = policyOperation;
            activity?.SetTag("authority.policy_action", policyOperation);

            var digestRaw = Normalize(context.Request.GetParameter(AuthorityOpenIddictConstants.PolicyDigestParameterName)?.Value?.ToString());
            var reasonRaw = Normalize(context.Request.GetParameter(AuthorityOpenIddictConstants.PolicyReasonParameterName)?.Value?.ToString());
            var ticketRaw = Normalize(context.Request.GetParameter(AuthorityOpenIddictConstants.PolicyTicketParameterName)?.Value?.ToString());

            var policyAuditProperties = new List<AuthEventProperty>
            {
                new()
                {
                    Name = "policy.action",
                    Value = ClassifiedString.Public(policyOperation)
                }
            };

            async ValueTask RejectPolicyAsync(string message)
            {
                activity?.SetTag("authority.policy_attestation_denied", message);
                context.Transaction.Properties[AuthorityOpenIddictConstants.AuditInvalidScopeProperty] = restrictedScope;
                var record = PasswordGrantAuditHelper.CreatePasswordGrantRecord(
                    timeProvider,
                    context.Transaction,
                    metadata,
                    AuthEventOutcome.Failure,
                    message,
                    clientId,
                    providerName: null,
                    tenant,
                    user: null,
                    username: context.Request.Username,
                    scopes: grantedScopesArray,
                    retryAfter: null,
                    failureCode: AuthorityCredentialFailureCode.InvalidCredentials,
                    extraProperties: policyAuditProperties);

                await auditSink.WriteAsync(record, context.CancellationToken).ConfigureAwait(false);

                context.Reject(OpenIddictConstants.Errors.InvalidRequest, message);
                logger.LogWarning(
                    "Password grant validation failed for {Username}: {Message}.",
                    context.Request.Username,
                    message);
            }

            if (string.IsNullOrWhiteSpace(reasonRaw))
            {
                await RejectPolicyAsync("Policy attestation actions require 'policy_reason'.").ConfigureAwait(false);
                return;
            }

            if (reasonRaw.Length > PolicyReasonMaxLength)
            {
                await RejectPolicyAsync($"policy_reason must not exceed {PolicyReasonMaxLength} characters.").ConfigureAwait(false);
                return;
            }

            policyAuditProperties.Add(new AuthEventProperty
            {
                Name = "policy.reason",
                Value = ClassifiedString.Sensitive(reasonRaw)
            });

            if (string.IsNullOrWhiteSpace(ticketRaw))
            {
                await RejectPolicyAsync("Policy attestation actions require 'policy_ticket'.").ConfigureAwait(false);
                return;
            }

            if (ticketRaw.Length > PolicyTicketMaxLength)
            {
                await RejectPolicyAsync($"policy_ticket must not exceed {PolicyTicketMaxLength} characters.").ConfigureAwait(false);
                return;
            }

            policyAuditProperties.Add(new AuthEventProperty
            {
                Name = "policy.ticket",
                Value = ClassifiedString.Sensitive(ticketRaw)
            });

            if (string.IsNullOrWhiteSpace(digestRaw))
            {
                await RejectPolicyAsync("Policy attestation actions require 'policy_digest'.").ConfigureAwait(false);
                return;
            }

            var digestNormalized = digestRaw.ToLowerInvariant();
            if (digestNormalized.Length < PolicyDigestMinLength ||
                digestNormalized.Length > PolicyDigestMaxLength ||
                !IsHexString(digestNormalized))
            {
                await RejectPolicyAsync(
                        $"policy_digest must be a hexadecimal string between {PolicyDigestMinLength} and {PolicyDigestMaxLength} characters.")
                    .ConfigureAwait(false);
                return;
            }

            policyAuditProperties.Add(new AuthEventProperty
            {
                Name = "policy.digest",
                Value = ClassifiedString.Sensitive(digestNormalized)
            });

            context.Transaction.Properties[AuthorityOpenIddictConstants.PolicyReasonProperty] = reasonRaw;
            context.Transaction.Properties[AuthorityOpenIddictConstants.PolicyTicketProperty] = ticketRaw;
            context.Transaction.Properties[AuthorityOpenIddictConstants.PolicyDigestProperty] = digestNormalized;
            context.Transaction.Properties[AuthorityOpenIddictConstants.PolicyAuditPropertiesProperty] = policyAuditProperties;
            activity?.SetTag("authority.policy_reason_present", true);
            activity?.SetTag("authority.policy_ticket_present", true);
            activity?.SetTag("authority.policy_digest_present", true);
        }

        var unexpectedParameters = TokenRequestTamperInspector.GetUnexpectedPasswordGrantParameters(context.Request);
        if (unexpectedParameters.Count > 0)
        {
@@ -926,6 +1072,34 @@ internal sealed class HandlePasswordGrantHandler : IOpenIddictServerHandler<Open
            activity?.SetTag("authority.incident_reason_present", true);
        }

        if (context.Transaction.Properties.TryGetValue(AuthorityOpenIddictConstants.PolicyOperationProperty, out var policyOperationObj) &&
            policyOperationObj is string policyOperationValue &&
            !string.IsNullOrWhiteSpace(policyOperationValue))
        {
            identity.SetClaim(StellaOpsClaimTypes.PolicyOperation, policyOperationValue);
        }

        if (context.Transaction.Properties.TryGetValue(AuthorityOpenIddictConstants.PolicyDigestProperty, out var policyDigestObj) &&
            policyDigestObj is string policyDigestValue &&
            !string.IsNullOrWhiteSpace(policyDigestValue))
        {
            identity.SetClaim(StellaOpsClaimTypes.PolicyDigest, policyDigestValue);
        }

        if (context.Transaction.Properties.TryGetValue(AuthorityOpenIddictConstants.PolicyReasonProperty, out var policyReasonObj) &&
            policyReasonObj is string policyReasonValue &&
            !string.IsNullOrWhiteSpace(policyReasonValue))
        {
            identity.SetClaim(StellaOpsClaimTypes.PolicyReason, policyReasonValue);
        }

        if (context.Transaction.Properties.TryGetValue(AuthorityOpenIddictConstants.PolicyTicketProperty, out var policyTicketObj) &&
            policyTicketObj is string policyTicketValue &&
            !string.IsNullOrWhiteSpace(policyTicketValue))
        {
            identity.SetClaim(StellaOpsClaimTypes.PolicyTicket, policyTicketValue);
        }

        var issuedAt = timeProvider.GetUtcNow();
        identity.SetClaim(OpenIddictConstants.Claims.AuthenticationTime, issuedAt.ToUnixTimeSeconds().ToString(CultureInfo.InvariantCulture));

@@ -968,6 +1142,69 @@ internal sealed class HandlePasswordGrantHandler : IOpenIddictServerHandler<Open
            });
        }

        if (context.Transaction.Properties.TryGetValue(AuthorityOpenIddictConstants.PolicyAuditPropertiesProperty, out var successPolicyAuditObj) &&
            successPolicyAuditObj is List<AuthEventProperty> policyAuditProps &&
            policyAuditProps.Count > 0)
        {
            successProperties ??= new List<AuthEventProperty>();

            foreach (var property in policyAuditProps)
            {
                if (property is null || string.IsNullOrWhiteSpace(property.Name))
                {
                    continue;
                }

                successProperties.Add(property);
            }

            context.Transaction.Properties.Remove(AuthorityOpenIddictConstants.PolicyAuditPropertiesProperty);
        }

        var principalPolicyOperation = principal.GetClaim(StellaOpsClaimTypes.PolicyOperation);
        if (!string.IsNullOrWhiteSpace(principalPolicyOperation))
        {
            successProperties ??= new List<AuthEventProperty>();
            successProperties.Add(new AuthEventProperty
            {
                Name = "policy.action",
                Value = ClassifiedString.Public(principalPolicyOperation)
            });
        }

        var principalPolicyDigest = principal.GetClaim(StellaOpsClaimTypes.PolicyDigest);
        if (!string.IsNullOrWhiteSpace(principalPolicyDigest))
        {
            successProperties ??= new List<AuthEventProperty>();
            successProperties.Add(new AuthEventProperty
            {
                Name = "policy.digest",
                Value = ClassifiedString.Sensitive(principalPolicyDigest)
            });
        }

        var principalPolicyReason = principal.GetClaim(StellaOpsClaimTypes.PolicyReason);
        if (!string.IsNullOrWhiteSpace(principalPolicyReason))
        {
            successProperties ??= new List<AuthEventProperty>();
            successProperties.Add(new AuthEventProperty
            {
                Name = "policy.reason",
                Value = ClassifiedString.Sensitive(principalPolicyReason)
            });
        }

        var principalPolicyTicket = principal.GetClaim(StellaOpsClaimTypes.PolicyTicket);
        if (!string.IsNullOrWhiteSpace(principalPolicyTicket))
        {
            successProperties ??= new List<AuthEventProperty>();
            successProperties.Add(new AuthEventProperty
            {
                Name = "policy.ticket",
                Value = ClassifiedString.Sensitive(principalPolicyTicket)
            });
        }

        var successRecord = PasswordGrantAuditHelper.CreatePasswordGrantRecord(
            timeProvider,
            context.Transaction,
@@ -1039,23 +1276,27 @@ internal static class PasswordGrantAuditHelper
        var subject = BuildSubject(user, username, providerName);
        var client = BuildClient(clientId, providerName);
        var network = BuildNetwork(metadata);
        var properties = BuildProperties(user, retryAfter, failureCode, extraProperties);

        return new AuthEventRecord
        {
            EventType = string.IsNullOrWhiteSpace(eventType) ? "authority.password.grant" : eventType,
            OccurredAt = timeProvider.GetUtcNow(),
        var properties = BuildProperties(user, retryAfter, failureCode, extraProperties);
        var mutableProperties = properties.Count == 0
            ? new List<AuthEventProperty>()
            : new List<AuthEventProperty>(properties);
        AppendPolicyMetadata(transaction, mutableProperties);

        return new AuthEventRecord
        {
            EventType = string.IsNullOrWhiteSpace(eventType) ? "authority.password.grant" : eventType,
            OccurredAt = timeProvider.GetUtcNow(),
            CorrelationId = correlationId,
            Outcome = outcome,
            Reason = Normalize(reason),
            Subject = subject,
            Client = client,
            Scopes = normalizedScopes,
            Network = network,
            Tenant = ClassifiedString.Public(normalizedTenant),
            Properties = properties
        };
    }
            Subject = subject,
            Client = client,
            Scopes = normalizedScopes,
            Network = network,
            Tenant = ClassifiedString.Public(normalizedTenant),
            Properties = mutableProperties.Count == 0 ? Array.Empty<AuthEventProperty>() : mutableProperties
        };
    }

    private static AuthEventSubject? BuildSubject(AuthorityUserDescriptor? user, string? username, string? providerName)
    {
@@ -1193,16 +1434,63 @@ internal static class PasswordGrantAuditHelper

             properties.Add(property);
         }
     }

-        return properties.Count == 0 ? Array.Empty<AuthEventProperty>() : properties;
-    }
-
-    private static IReadOnlyList<string> NormalizeScopes(IEnumerable<string>? scopes)
-    {
-        if (scopes is null)
-        {
-            return Array.Empty<string>();
-        }
-
+        return properties.Count == 0 ? Array.Empty<AuthEventProperty>() : properties;
+    }
+
+    private static void AppendPolicyMetadata(OpenIddictServerTransaction transaction, List<AuthEventProperty> properties)
+    {
+        if (transaction.Properties.TryGetValue(AuthorityOpenIddictConstants.PolicyOperationProperty, out var operationObj) &&
+            operationObj is string operationValue &&
+            !string.IsNullOrWhiteSpace(operationValue))
+        {
+            properties.Add(new AuthEventProperty
+            {
+                Name = "policy.action",
+                Value = ClassifiedString.Public(operationValue)
+            });
+        }
+
+        if (transaction.Properties.TryGetValue(AuthorityOpenIddictConstants.PolicyDigestProperty, out var digestObj) &&
+            digestObj is string digestValue &&
+            !string.IsNullOrWhiteSpace(digestValue))
+        {
+            properties.Add(new AuthEventProperty
+            {
+                Name = "policy.digest",
+                Value = ClassifiedString.Sensitive(digestValue)
+            });
+        }
+
+        if (transaction.Properties.TryGetValue(AuthorityOpenIddictConstants.PolicyReasonProperty, out var reasonObj) &&
+            reasonObj is string reasonValue &&
+            !string.IsNullOrWhiteSpace(reasonValue))
+        {
+            properties.Add(new AuthEventProperty
+            {
+                Name = "policy.reason",
+                Value = ClassifiedString.Sensitive(reasonValue)
+            });
+        }
+
+        if (transaction.Properties.TryGetValue(AuthorityOpenIddictConstants.PolicyTicketProperty, out var ticketObj) &&
+            ticketObj is string ticketValue &&
+            !string.IsNullOrWhiteSpace(ticketValue))
+        {
+            properties.Add(new AuthEventProperty
+            {
+                Name = "policy.ticket",
+                Value = ClassifiedString.Sensitive(ticketValue)
+            });
+        }
+    }
+
+    private static IReadOnlyList<string> NormalizeScopes(IEnumerable<string>? scopes)
+    {
+        if (scopes is null)
+        {
+            return Array.Empty<string>();
+        }
+
         var normalized = scopes
@@ -1216,11 +1504,11 @@ internal static class PasswordGrantAuditHelper
         return normalized.Length == 0 ? Array.Empty<string>() : normalized;
     }

-    private static string? Normalize(string? value)
-        => string.IsNullOrWhiteSpace(value) ? null : value.Trim();
-
-    internal static string? NormalizeTenant(string? value)
-        => string.IsNullOrWhiteSpace(value) ? null : value.Trim().ToLowerInvariant();
+    private static string? Normalize(string? value)
+        => string.IsNullOrWhiteSpace(value) ? null : value.Trim();
+
+    internal static string? NormalizeTenant(string? value)
+        => string.IsNullOrWhiteSpace(value) ? null : value.Trim().ToLowerInvariant();

     internal static AuthEventRecord CreateTamperRecord(
         TimeProvider timeProvider,
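The `AppendPolicyMetadata` helper above copies policy metadata from the OpenIddict transaction into the audit record only when each property is present as a non-blank string, classifying the operation name as public and the digest/reason/ticket values as sensitive. A minimal Python sketch of that presence-checked, append-only pattern (the dict-based API and key names here are illustrative, not the real Authority types):

```python
# Illustrative sketch: copy policy metadata into audit properties only
# when present and non-blank; classifications mirror the C# above.
MAPPING = {
    "policy_operation": ("policy.action", "public"),
    "policy_digest": ("policy.digest", "sensitive"),
    "policy_reason": ("policy.reason", "sensitive"),
    "policy_ticket": ("policy.ticket", "sensitive"),
}

def append_policy_metadata(transaction_props, audit_props):
    for key, (name, classification) in MAPPING.items():
        value = transaction_props.get(key)
        if isinstance(value, str) and value.strip():
            audit_props.append({"name": name, "value": value, "class": classification})
    return audit_props
```

Blank or whitespace-only values (like the reason below) are skipped rather than recorded as empty audit properties.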
@@ -33,6 +33,7 @@ internal sealed class ValidateAccessTokenHandler : IOpenIddictServerHandler<Open
     private readonly ActivitySource activitySource;
     private readonly ILogger<ValidateAccessTokenHandler> logger;
     private static readonly TimeSpan IncidentFreshAuthWindow = TimeSpan.FromMinutes(5);
+    private static readonly TimeSpan PolicyAttestationFreshAuthWindow = TimeSpan.FromMinutes(5);

     public ValidateAccessTokenHandler(
         IAuthorityTokenStore tokenStore,
@@ -363,6 +364,69 @@ internal sealed class ValidateAccessTokenHandler : IOpenIddictServerHandler<Open
             metadataAccessor.SetTag("authority.incident_scope_validated", "true");
         }

+        if (context.Principal.HasScope(StellaOpsScopes.PolicyPublish) ||
+            context.Principal.HasScope(StellaOpsScopes.PolicyPromote))
+        {
+            var policyOperation = identity.GetClaim(StellaOpsClaimTypes.PolicyOperation);
+            if (string.IsNullOrWhiteSpace(policyOperation) ||
+                (!string.Equals(policyOperation, AuthorityOpenIddictConstants.PolicyOperationPublishValue, StringComparison.OrdinalIgnoreCase) &&
+                 !string.Equals(policyOperation, AuthorityOpenIddictConstants.PolicyOperationPromoteValue, StringComparison.OrdinalIgnoreCase)))
+            {
+                context.Reject(OpenIddictConstants.Errors.InvalidToken, "policy attestation tokens require a valid policy_operation claim.");
+                logger.LogWarning("Access token validation failed: policy attestation token missing/invalid operation. ClientId={ClientId}", clientId ?? "<unknown>");
+                return;
+            }
+
+            var policyDigest = identity.GetClaim(StellaOpsClaimTypes.PolicyDigest);
+            if (string.IsNullOrWhiteSpace(policyDigest))
+            {
+                context.Reject(OpenIddictConstants.Errors.InvalidToken, "policy attestation tokens require policy_digest claim.");
+                logger.LogWarning("Access token validation failed: policy attestation token missing digest. ClientId={ClientId}", clientId ?? "<unknown>");
+                return;
+            }
+
+            var policyReason = identity.GetClaim(StellaOpsClaimTypes.PolicyReason);
+            if (string.IsNullOrWhiteSpace(policyReason))
+            {
+                context.Reject(OpenIddictConstants.Errors.InvalidToken, "policy attestation tokens require policy_reason claim.");
+                logger.LogWarning("Access token validation failed: policy attestation token missing reason. ClientId={ClientId}", clientId ?? "<unknown>");
+                return;
+            }
+
+            var policyTicket = identity.GetClaim(StellaOpsClaimTypes.PolicyTicket);
+            if (string.IsNullOrWhiteSpace(policyTicket))
+            {
+                context.Reject(OpenIddictConstants.Errors.InvalidToken, "policy attestation tokens require policy_ticket claim.");
+                logger.LogWarning("Access token validation failed: policy attestation token missing ticket. ClientId={ClientId}", clientId ?? "<unknown>");
+                return;
+            }
+
+            var authTimeClaim = context.Principal.GetClaim(OpenIddictConstants.Claims.AuthenticationTime);
+            if (string.IsNullOrWhiteSpace(authTimeClaim) ||
+                !long.TryParse(authTimeClaim, NumberStyles.Integer, CultureInfo.InvariantCulture, out var attestationAuthTimeSeconds))
+            {
+                context.Reject(OpenIddictConstants.Errors.InvalidToken, "policy attestation tokens require authentication_time claim.");
+                logger.LogWarning("Access token validation failed: policy attestation token missing auth_time. ClientId={ClientId}", clientId ?? "<unknown>");
+                return;
+            }
+
+            var attestationAuthTime = DateTimeOffset.FromUnixTimeSeconds(attestationAuthTimeSeconds);
+            var now = clock.GetUtcNow();
+            if (now - attestationAuthTime > PolicyAttestationFreshAuthWindow)
+            {
+                context.Reject(OpenIddictConstants.Errors.InvalidToken, "policy attestation tokens require fresh authentication.");
+                logger.LogWarning(
+                    "Access token validation failed: policy attestation token stale. ClientId={ClientId}; AuthTime={AuthTime:o}; Now={Now:o}; Window={Window}",
+                    clientId ?? "<unknown>",
+                    attestationAuthTime,
+                    now,
+                    PolicyAttestationFreshAuthWindow);
+                return;
+            }
+
+            metadataAccessor.SetTag("authority.policy_attestation_validated", policyOperation.ToLowerInvariant());
+        }
+
         var enrichmentContext = new AuthorityClaimsEnrichmentContext(provider.Context, user, client);
         await provider.ClaimsEnricher.EnrichAsync(identity, enrichmentContext, context.CancellationToken).ConfigureAwait(false);
         logger.LogInformation("Access token validated for subject {Subject} and client {ClientId}.",
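The staleness check above compares the token's `auth_time` (Unix seconds) against the five-minute `PolicyAttestationFreshAuthWindow`; anything older is rejected so publish/promote attestations can only be minted right after an interactive login. The arithmetic can be sketched as follows (Python, illustrative only):

```python
from datetime import datetime, timedelta, timezone

# Mirrors PolicyAttestationFreshAuthWindow in the handler above.
FRESH_AUTH_WINDOW = timedelta(minutes=5)

def is_fresh(auth_time_unix_seconds, now=None):
    """True when the principal authenticated within the fresh-auth window."""
    now = now or datetime.now(timezone.utc)
    auth_time = datetime.fromtimestamp(auth_time_unix_seconds, tz=timezone.utc)
    return now - auth_time <= FRESH_AUTH_WINDOW
```

Note the comparison is inclusive (`<=`), matching the C# rejection condition `now - attestationAuthTime > PolicyAttestationFreshAuthWindow`.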
@@ -36,12 +36,15 @@ internal static class TokenRequestTamperInspector
         OpenIddictConstants.Parameters.IdToken
     };

-    private static readonly HashSet<string> PasswordGrantParameters = new(StringComparer.OrdinalIgnoreCase)
-    {
-        OpenIddictConstants.Parameters.Username,
-        OpenIddictConstants.Parameters.Password,
-        AuthorityOpenIddictConstants.ProviderParameterName
-    };
+    private static readonly HashSet<string> PasswordGrantParameters = new(StringComparer.OrdinalIgnoreCase)
+    {
+        OpenIddictConstants.Parameters.Username,
+        OpenIddictConstants.Parameters.Password,
+        AuthorityOpenIddictConstants.ProviderParameterName,
+        AuthorityOpenIddictConstants.PolicyReasonParameterName,
+        AuthorityOpenIddictConstants.PolicyTicketParameterName,
+        AuthorityOpenIddictConstants.PolicyDigestParameterName
+    };

     private static readonly HashSet<string> ClientCredentialsParameters = new(StringComparer.OrdinalIgnoreCase)
     {
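Widening `PasswordGrantParameters` means the tamper inspector now treats the three policy metadata parameters as legitimate on password-grant requests instead of flagging them as unexpected. The allowlist idea, sketched in Python (parameter names for the OAuth basics are assumptions here, not the real constants):

```python
# Case-insensitive allowlist check in the spirit of TokenRequestTamperInspector.
PASSWORD_GRANT_PARAMETERS = {
    "username", "password", "provider",
    "policy_reason", "policy_ticket", "policy_digest",
}
COMMON_PARAMETERS = {"grant_type", "scope", "client_id", "client_secret"}

def unexpected_parameters(request_parameters):
    """Return request parameters that fall outside the grant's allowlist."""
    allowed = PASSWORD_GRANT_PARAMETERS | COMMON_PARAMETERS
    return sorted(p for p in request_parameters if p.lower() not in allowed)
```

Anything returned by `unexpected_parameters` would be treated as a tampered request.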
@@ -50,7 +50,8 @@ using System.Text;
 using StellaOps.Authority.Signing;
 using StellaOps.Cryptography;
 using StellaOps.Cryptography.Kms;
-using StellaOps.Authority.Storage.Mongo.Documents;
+using StellaOps.Authority.Storage.Mongo.Documents;
 using StellaOps.Authority.Storage.Mongo.Stores;
 using StellaOps.Authority.Security;
 using StellaOps.Authority.OpenApi;
 using StellaOps.Auth.Abstractions;
@@ -383,9 +384,29 @@ builder.Services.Configure<OpenIddictServerOptions>(options =>

 var app = builder.Build();

-var mongoInitializer = app.Services.GetRequiredService<AuthorityMongoInitializer>();
-var mongoDatabase = app.Services.GetRequiredService<IMongoDatabase>();
-await mongoInitializer.InitialiseAsync(mongoDatabase, CancellationToken.None);
+var mongoInitializer = app.Services.GetRequiredService<AuthorityMongoInitializer>();
+var mongoDatabase = app.Services.GetRequiredService<IMongoDatabase>();
+await mongoInitializer.InitialiseAsync(mongoDatabase, CancellationToken.None);
+
+var serviceAccountStore = app.Services.GetRequiredService<IAuthorityServiceAccountStore>();
+if (authorityOptions.Delegation.ServiceAccounts.Count > 0)
+{
+    foreach (var seed in authorityOptions.Delegation.ServiceAccounts)
+    {
+        var document = new AuthorityServiceAccountDocument
+        {
+            AccountId = seed.AccountId,
+            Tenant = seed.Tenant,
+            DisplayName = string.IsNullOrWhiteSpace(seed.DisplayName) ? seed.AccountId : seed.DisplayName,
+            Description = seed.Description,
+            Enabled = seed.Enabled,
+            AllowedScopes = seed.AllowedScopes.ToList(),
+            AuthorizedClients = seed.AuthorizedClients.ToList()
+        };
+
+        await serviceAccountStore.UpsertAsync(document, CancellationToken.None).ConfigureAwait(false);
+    }
+}

 var registrationSummary = app.Services.GetRequiredService<AuthorityPluginRegistrationSummary>();
 if (registrationSummary.RegisteredPlugins.Count > 0)
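The seeding loop above runs at startup and upserts each configured service account, so repeated boots converge on the configuration instead of accumulating duplicate documents. A minimal Python sketch of that idempotent upsert, with a plain dict standing in for the Mongo-backed `IAuthorityServiceAccountStore` (the seed shape here is an assumption):

```python
def seed_service_accounts(store, seeds):
    """Upsert configured service accounts keyed by account id (idempotent)."""
    for seed in seeds:
        document = {
            "account_id": seed["account_id"],
            "tenant": seed["tenant"],
            # Fall back to the account id when no display name is configured,
            # mirroring the string.IsNullOrWhiteSpace check above.
            "display_name": seed.get("display_name") or seed["account_id"],
            "description": seed.get("description"),
            "enabled": seed.get("enabled", True),
            "allowed_scopes": list(seed.get("allowed_scopes", [])),
            "authorized_clients": list(seed.get("authorized_clients", [])),
        }
        store[document["account_id"]] = document  # stands in for UpsertAsync
    return store
```

Running it twice with the same seeds leaves exactly one document per account id.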
@@ -1252,36 +1273,68 @@ if (authorityOptions.Bootstrap.Enabled)
         }
     });

-    bootstrapGroup.MapPost("/notifications/ack-tokens/rotate", (
-        SigningRotationRequest? request,
-        AuthorityAckTokenKeyManager ackManager,
-        ILogger<AuthorityAckTokenKeyManager> ackLogger) =>
-    {
-        if (request is null)
-        {
-            ackLogger.LogWarning("Ack token rotation request payload missing.");
-            return Results.BadRequest(new { error = "invalid_request", message = "Request payload is required." });
-        }
-
-        try
-        {
-            request.KeyId = trimmedKeyId;
-            request.Location = trimmedLocation;
-
-            logger.LogDebug(
-                "Attempting ack token rotation with keyId='{KeyId}', location='{Location}', provider='{Provider}', source='{Source}', algorithm='{Algorithm}'",
-                trimmedKeyId,
-                trimmedLocation,
-                request.Provider ?? ackOptions.Provider,
-                request.Source ?? ackOptions.KeySource,
-                request.Algorithm ?? ackOptions.Algorithm);
-
-            var result = ackManager.Rotate(request);
-            ackLogger.LogInformation("Ack token key rotation completed. Active key {KeyId}.", result.ActiveKeyId);
-
-            return Results.Ok(new
-            {
-                activeKeyId = result.ActiveKeyId,
+    bootstrapGroup.MapPost("/notifications/ack-tokens/rotate", (
+        SigningRotationRequest? request,
+        AuthorityAckTokenKeyManager ackManager,
+        IOptions<StellaOpsAuthorityOptions> optionsAccessor,
+        ILogger<AuthorityAckTokenKeyManager> ackLogger) =>
+    {
+        if (request is null)
+        {
+            ackLogger.LogWarning("Ack token rotation request payload missing.");
+            return Results.BadRequest(new { error = "invalid_request", message = "Request payload is required." });
+        }
+
+        try
+        {
+            var notifications = optionsAccessor.Value.Notifications ?? throw new InvalidOperationException("Authority notifications configuration is missing.");
+            var ackOptions = notifications.AckTokens ?? throw new InvalidOperationException("Ack token configuration is missing.");
+
+            var trimmedKeyId = request.KeyId?.Trim();
+            if (string.IsNullOrWhiteSpace(trimmedKeyId))
+            {
+                ackLogger.LogWarning("Ack token rotation rejected: missing keyId.");
+                return Results.BadRequest(new { error = "invalid_request", message = "Ack token key rotation requires a keyId." });
+            }
+
+            var trimmedLocation = request.Location?.Trim();
+            if (string.IsNullOrWhiteSpace(trimmedLocation))
+            {
+                ackLogger.LogWarning("Ack token rotation rejected: missing key path/location.");
+                return Results.BadRequest(new { error = "invalid_request", message = "Ack token key rotation requires a key path/location." });
+            }
+
+            if (!ackOptions.Enabled)
+            {
+                ackLogger.LogWarning("Ack token rotation attempted while ack tokens are disabled.");
+                return Results.BadRequest(new { error = "ack_tokens_disabled", message = "Ack tokens are disabled. Enable notifications.ackTokens before rotating keys." });
+            }
+
+            request.KeyId = trimmedKeyId;
+            request.Location = trimmedLocation;
+
+            var provider = request.Provider ?? ackOptions.Provider;
+            var source = request.Source ?? ackOptions.KeySource;
+            var algorithm = request.Algorithm ?? ackOptions.Algorithm;
+
+            ackLogger.LogDebug(
+                "Attempting ack token rotation with keyId='{KeyId}', location='{Location}', provider='{Provider}', source='{Source}', algorithm='{Algorithm}'",
+                trimmedKeyId,
+                trimmedLocation,
+                provider,
+                source,
+                algorithm);
+
+            request.Provider = provider;
+            request.Source = source;
+            request.Algorithm = algorithm;
+
+            var result = ackManager.Rotate(request);
+            ackLogger.LogInformation("Ack token key rotation completed. Active key {KeyId}.", result.ActiveKeyId);
+
+            return Results.Ok(new
+            {
+                activeKeyId = result.ActiveKeyId,
                 provider = result.ActiveProvider,
                 source = result.ActiveSource,
                 location = result.ActiveLocation,
@@ -69,9 +69,10 @@
 | ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
 |----|--------|----------|------------|-------------|---------------|
 > 2025-10-31: Added Policy Studio scope family (`policy:author/review/operate/audit`), updated OpenAPI + discovery headers, enforced tenant requirements in grant handlers, seeded new roles in Authority config/offline kit docs, and refreshed CLI/Console documentation + tests to validate the new catalogue.
-| AUTH-POLICY-27-002 | TODO | Authority Core & Security Guild | AUTH-POLICY-27-001, REGISTRY-API-27-007 | Provide attestation signing service bindings (OIDC token exchange, cosign integration) and enforce publish/promote scope checks, fresh-auth requirements, and audit logging. | Publish/promote requests require fresh auth + correct scopes; attestations signed with validated identity; audit logs enriched with digest + tenant; integration tests pass. |
+| AUTH-POLICY-27-002 | DONE (2025-11-02) | Authority Core & Security Guild | AUTH-POLICY-27-001, REGISTRY-API-27-007 | Provide attestation signing service bindings (OIDC token exchange, cosign integration) and enforce publish/promote scope checks, fresh-auth requirements, and audit logging. | Publish/promote requests require fresh auth + correct scopes; attestations signed with validated identity; audit logs enriched with digest + tenant; integration tests pass. |
 > Docs dependency: `DOCS-POLICY-27-009` awaiting signing guidance from this work.
-| AUTH-POLICY-27-003 | TODO | Authority Core & Docs Guild | AUTH-POLICY-27-001, AUTH-POLICY-27-002 | Update Authority configuration/docs for Policy Studio roles, signing policies, approval workflows, and CLI integration; include compliance checklist. | Docs merged; samples validated; governance checklist appended; release notes updated. |
+> 2025-11-02: Added `policy:publish`/`policy:promote` scopes with interactive-only enforcement, metadata parameters (`policy_reason`, `policy_ticket`, `policy_digest`), fresh-auth token validation, audit augmentations, and updated config/docs references.
+| AUTH-POLICY-27-003 | DOING (2025-11-02) | Authority Core & Docs Guild | AUTH-POLICY-27-001, AUTH-POLICY-27-002 | Update Authority configuration/docs for Policy Studio roles, signing policies, approval workflows, and CLI integration; include compliance checklist. | Docs merged; samples validated; governance checklist appended; release notes updated. |

 ## Exceptions v1

@@ -111,6 +112,9 @@
 | AUTH-NOTIFY-38-001 | DONE (2025-11-01) | Authority Core & Security Guild | — | Define `Notify.Viewer`, `Notify.Operator`, `Notify.Admin` scopes/roles, update discovery metadata, offline defaults, and issuer templates. | Scopes available; metadata updated; tests ensure enforcement; offline kit defaults refreshed. |
 | AUTH-NOTIFY-40-001 | DONE (2025-11-02) | Authority Core & Security Guild | AUTH-NOTIFY-38-001, WEB-NOTIFY-40-001 | Implement signed ack token key rotation, webhook allowlists, admin-only escalation settings, and audit logging of ack actions. | Ack tokens signed/rotated; webhook allowlists enforced; admin enforcement validated; audit logs capture ack/resolution. |
+> 2025-11-02: `/notify/ack-tokens/rotate` exposed (notify.admin), emits `notify.ack.key_rotated|notify.ack.key_rotation_failed`, and DSSE rotation tests cover allowlist + scope enforcement.
+| AUTH-NOTIFY-42-001 | DONE (2025-11-02) | Authority Core & Security Guild | AUTH-NOTIFY-40-001 | Investigate ack token rotation 500 errors (test Rotate_ReturnsBadRequest_WhenKeyIdMissing_AndAuditsFailure still failing). Capture logs, identify root cause, and patch handler. | Failure mode understood; fix merged; regression test passes. |
+> 2025-11-02: Aliased `StellaOpsBearer` to the test auth handler, corrected bootstrap `/notifications/ack-tokens/rotate` defaults, and validated `Rotate_ReturnsBadRequest_WhenKeyIdMissing_AndAuditsFailure` via targeted `dotnet test`.

 ## CLI Parity & Task Packs
 | ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
@@ -125,7 +129,7 @@
 | ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
 |----|--------|----------|------------|-------------|---------------|
 > 2025-10-28: Tidied advisory raw idempotency migration to avoid LINQ-on-`BsonValue` (explicit array copy) while continuing duplicate guardrail validation; scoped scanner/policy token call sites updated to honor new metadata parameter.
-| AUTH-TEN-49-001 | TODO | Authority Core & Security Guild | AUTH-TEN-47-001 | Implement service accounts & delegation tokens (`act` chain), per-tenant quotas, audit stream of auth decisions, and revocation APIs. | Service tokens minted with scopes/TTL; delegation logged; quotas configurable; audit stream live; docs updated. |
+| AUTH-TEN-49-001 | DOING (2025-11-02) | Authority Core & Security Guild | AUTH-TEN-47-001 | Implement service accounts & delegation tokens (`act` chain), per-tenant quotas, audit stream of auth decisions, and revocation APIs. | Service tokens minted with scopes/TTL; delegation logged; quotas configurable; audit stream live; docs updated. |

 ## Observability & Forensics (Epic 15)
@@ -4,6 +4,7 @@ using System.IO;
 using System.Linq;
 using System.Text.RegularExpressions;
 using System.Threading.RateLimiting;
+using StellaOps.Auth.Abstractions;
 using StellaOps.Authority.Plugins.Abstractions;
 using StellaOps.Cryptography;

@@ -113,6 +114,11 @@ public sealed class StellaOpsAuthorityOptions
     /// </summary>
     public AuthoritySigningOptions Signing { get; } = new();

+    /// <summary>
+    /// Delegation and service account configuration.
+    /// </summary>
+    public AuthorityDelegationOptions Delegation { get; } = new();
+
     /// <summary>
     /// Validates configured values and normalises collections.
     /// </summary>
@@ -154,6 +160,7 @@ public sealed class StellaOpsAuthorityOptions
         Notifications.Validate();
         ApiLifecycle.Validate();
         Signing.Validate();
+        Delegation.NormalizeAndValidate(tenants);
         Plugins.NormalizeAndValidate();
         Storage.Validate();
         Exceptions.Validate();
@@ -164,8 +171,8 @@ public sealed class StellaOpsAuthorityOptions
         var identifiers = new HashSet<string>(StringComparer.Ordinal);
         foreach (var tenant in tenants)
         {
-            tenant.Normalize(AdvisoryAi);
-            tenant.Validate(AdvisoryAi);
+            tenant.Normalize(AdvisoryAi, Delegation);
+            tenant.Validate(AdvisoryAi, Delegation);
             if (!identifiers.Add(tenant.Id))
             {
                 throw new InvalidOperationException($"Authority configuration contains duplicate tenant identifier '{tenant.Id}'.");
@@ -767,7 +774,9 @@ public sealed class AuthorityTenantOptions

     public AuthorityTenantAdvisoryAiOptions AdvisoryAi { get; } = new();

-    internal void Normalize(AuthorityAdvisoryAiOptions? advisoryAiOptions)
+    public AuthorityTenantDelegationOptions Delegation { get; } = new();
+
+    internal void Normalize(AuthorityAdvisoryAiOptions? advisoryAiOptions, AuthorityDelegationOptions delegationOptions)
     {
         Id = (Id ?? string.Empty).Trim();
         DisplayName = (DisplayName ?? string.Empty).Trim();
@@ -810,9 +819,10 @@ public sealed class AuthorityTenantOptions
         }

         AdvisoryAi.Normalize(advisoryAiOptions);
+        Delegation.Normalize(delegationOptions);
     }

-    internal void Validate(AuthorityAdvisoryAiOptions? advisoryAiOptions)
+    internal void Validate(AuthorityAdvisoryAiOptions? advisoryAiOptions, AuthorityDelegationOptions delegationOptions)
     {
         if (string.IsNullOrWhiteSpace(Id))
         {
@@ -841,12 +851,186 @@ public sealed class AuthorityTenantOptions
         }

         AdvisoryAi.Validate(advisoryAiOptions);
+        Delegation.Validate(delegationOptions, Id);
     }

     private static readonly Regex TenantSlugRegex = new("^[a-z0-9-]+$", RegexOptions.Compiled | RegexOptions.CultureInvariant);
     private static readonly Regex ProjectSlugRegex = new("^[a-z0-9-]+$", RegexOptions.Compiled | RegexOptions.CultureInvariant);
 }

+public sealed class AuthorityDelegationOptions
+{
+    private readonly IList<AuthorityServiceAccountSeedOptions> serviceAccounts = new List<AuthorityServiceAccountSeedOptions>();
+
+    public AuthorityDelegationQuotaOptions Quotas { get; } = new();
+
+    public IList<AuthorityServiceAccountSeedOptions> ServiceAccounts => (IList<AuthorityServiceAccountSeedOptions>)serviceAccounts;
+
+    internal void NormalizeAndValidate(IList<AuthorityTenantOptions> tenants)
+    {
+        Quotas.Validate(nameof(Quotas));
+
+        var tenantIds = tenants is { Count: > 0 }
+            ? tenants
+                .Where(static tenant => !string.IsNullOrWhiteSpace(tenant.Id))
+                .Select(static tenant => tenant.Id.Trim())
+                .ToHashSet(StringComparer.OrdinalIgnoreCase)
+            : new HashSet<string>(StringComparer.OrdinalIgnoreCase);
+
+        var seenAccounts = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
+
+        foreach (var account in serviceAccounts)
+        {
+            account.Normalize();
+            account.Validate(tenantIds);
+
+            if (!seenAccounts.Add(account.AccountId))
+            {
+                throw new InvalidOperationException($"Delegation configuration contains duplicate service account id '{account.AccountId}'.");
+            }
+        }
+    }
+}
+
+public sealed class AuthorityDelegationQuotaOptions
+{
+    public int MaxActiveTokens { get; set; } = 50;
+
+    internal void Validate(string propertyName)
+    {
+        if (MaxActiveTokens <= 0)
+        {
+            throw new InvalidOperationException($"Authority delegation configuration requires {propertyName}.{nameof(MaxActiveTokens)} to be greater than zero.");
+        }
+    }
+}

+public sealed class AuthorityTenantDelegationOptions
+{
+    public int? MaxActiveTokens { get; set; }
+
+    internal void Normalize(AuthorityDelegationOptions defaults)
+    {
+        _ = defaults ?? throw new ArgumentNullException(nameof(defaults));
+    }
+
+    internal void Validate(AuthorityDelegationOptions defaults, string tenantId)
+    {
+        _ = defaults ?? throw new ArgumentNullException(nameof(defaults));
+
+        if (MaxActiveTokens is { } value && value <= 0)
+        {
+            throw new InvalidOperationException($"Tenant '{tenantId}' delegation maxActiveTokens must be greater than zero when specified.");
+        }
+    }
+
+    public int ResolveMaxActiveTokens(AuthorityDelegationOptions defaults)
+    {
+        _ = defaults ?? throw new ArgumentNullException(nameof(defaults));
+        return MaxActiveTokens ?? defaults.Quotas.MaxActiveTokens;
+    }
+}

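`ResolveMaxActiveTokens` lets a tenant override the global delegation quota while validation rejects non-positive overrides; an unset override falls back to `Quotas.MaxActiveTokens` (default 50). The fallback logic, sketched in Python:

```python
def resolve_max_active_tokens(tenant_override, global_default=50):
    """Tenant override wins when set; otherwise fall back to the global quota.
    Non-positive overrides are configuration errors, as in Validate above."""
    if tenant_override is not None and tenant_override <= 0:
        raise ValueError("delegation maxActiveTokens must be greater than zero when specified")
    return tenant_override if tenant_override is not None else global_default
```

Note the distinction between "unset" (`None`, fall back) and "set to zero" (rejected), which `MaxActiveTokens ?? defaults.Quotas.MaxActiveTokens` preserves in the C#.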
+public sealed class AuthorityServiceAccountSeedOptions
+{
+    private static readonly Regex AccountIdRegex = new("^[a-z0-9][a-z0-9:_-]{2,63}$", RegexOptions.Compiled | RegexOptions.CultureInvariant);
+
+    public string AccountId { get; set; } = string.Empty;
+
+    public string Tenant { get; set; } = string.Empty;
+
+    public string DisplayName { get; set; } = string.Empty;
+
+    public string? Description { get; set; }
+
+    public bool Enabled { get; set; } = true;
+
+    public IList<string> AuthorizedClients { get; } = new List<string>();
+
+    public IList<string> AllowedScopes { get; } = new List<string>();
+
+    internal void Normalize()
+    {
+        AccountId = (AccountId ?? string.Empty).Trim();
+        Tenant = string.IsNullOrWhiteSpace(Tenant) ? string.Empty : Tenant.Trim().ToLowerInvariant();
+        DisplayName = (DisplayName ?? string.Empty).Trim();
+        Description = string.IsNullOrWhiteSpace(Description) ? null : Description.Trim();
+
+        NormalizeList(AuthorizedClients, static client => client.Trim().ToLowerInvariant(), StringComparer.OrdinalIgnoreCase);
+        NormalizeList(AllowedScopes, scope =>
+        {
+            var normalized = StellaOpsScopes.Normalize(scope);
+            return normalized ?? scope.Trim().ToLowerInvariant();
+        }, StringComparer.Ordinal);
+    }
+
+    internal void Validate(ISet<string> tenantIds)
+    {
+        if (string.IsNullOrWhiteSpace(AccountId))
+        {
+            throw new InvalidOperationException("Delegation service account seeds require an accountId.");
+        }
+
+        if (!AccountIdRegex.IsMatch(AccountId))
+        {
+            throw new InvalidOperationException($"Service account id '{AccountId}' must contain lowercase letters, digits, colon, underscore, or hyphen.");
+        }
+
+        if (string.IsNullOrWhiteSpace(Tenant))
+        {
+            throw new InvalidOperationException($"Service account '{AccountId}' requires a tenant assignment.");
+        }
+
+        if (tenantIds.Count > 0 && !tenantIds.Contains(Tenant))
+        {
+            throw new InvalidOperationException($"Service account '{AccountId}' references unknown tenant '{Tenant}'.");
+        }
+
+        if (AllowedScopes.Count == 0)
+        {
+            throw new InvalidOperationException($"Service account '{AccountId}' must specify at least one allowed scope.");
+        }
+    }
+
+    private static void NormalizeList(IList<string> values, Func<string, string> normalize, IEqualityComparer<string> comparer)
+    {
+        ArgumentNullException.ThrowIfNull(values);
+        ArgumentNullException.ThrowIfNull(normalize);
+        comparer ??= StringComparer.Ordinal;
+
+        if (values.Count == 0)
+        {
+            return;
+        }
+
+        var seen = new HashSet<string>(comparer);
+        for (var index = values.Count - 1; index >= 0; index--)
+        {
+            var current = values[index];
+            if (string.IsNullOrWhiteSpace(current))
+            {
+                values.RemoveAt(index);
+                continue;
+            }
+
+            var normalized = normalize(current);
+            if (string.IsNullOrWhiteSpace(normalized))
+            {
+                values.RemoveAt(index);
+                continue;
+            }
+
+            if (!seen.Add(normalized))
+            {
+                values.RemoveAt(index);
+                continue;
+            }
+
+            values[index] = normalized;
+        }
+    }
+}

 public sealed class AuthorityPluginSettings
 {
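`NormalizeList` walks the list backwards so `RemoveAt` never shifts an index it still has to visit; a side effect worth noting is that when duplicates collide, the occurrence nearest the end of the list is the one that survives. An equivalent sketch in Python:

```python
def normalize_list(values, normalize):
    """In-place cleanup: drop blanks, normalize entries, de-duplicate.
    Iterating from the end keeps deletion indexes valid, so the last
    occurrence of a duplicate is the one that is kept (as in the C#)."""
    seen = set()
    for index in range(len(values) - 1, -1, -1):
        current = values[index]
        if not current or not current.strip():
            del values[index]
            continue
        normalized = normalize(current)
        if not normalized or not normalized.strip() or normalized in seen:
            del values[index]
            continue
        seen.add(normalized)
        values[index] = normalized
    return values
```

With a lower-casing normalizer, `"Policy:Read"` and `"policy:read"` collide and the later entry wins.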