# Automated Test-Suite Overview

This document enumerates **every automated check** executed by the Stella Ops
CI pipeline, from unit level to chaos experiments. It is intended for
contributors who need to extend coverage or diagnose failures.

> **Build parameters** – values such as `{{ dotnet }}` (runtime) and

---

## Test Philosophy

### Core Principles

1. **Determinism as Contract**: Scan verdicts must be reproducible. Same inputs → byte-identical outputs.
2. **Offline by Default**: Every test (except those explicitly tagged "online") runs without network access.
3. **Evidence-First Validation**: Assertions verify the complete evidence chain, not just pass/fail.
4. **Interop is Required**: Compatibility with ecosystem tools (Syft, Grype, Trivy, cosign) blocks releases.
5. **Coverage by Risk**: Prioritize testing high-risk paths over line-coverage metrics.

### Test Boundaries

- **Lattice/policy merge** algorithms run in `scanner.webservice`
- **Concelier/Excititor** preserve per-source data (no conflict resolution)
- Tests enforce these boundaries explicitly

---

## Layer Map

| Layer | Tooling | Entry-point | Frequency |
|-------|---------|-------------|-----------|
| **1. Unit** | `xUnit` (<code>dotnet test</code>) | `*.Tests.csproj` | per PR / push |
| **2. Property-based** | `FsCheck` | `SbomPropertyTests`, `Canonicalization` | per PR |
| **3. Integration (API)** | `Testcontainers` suite | `test/Api.Integration` | per PR + nightly |
| **4. Integration (DB-merge)** | Testcontainers PostgreSQL + Valkey | `Concelier.Integration` | per PR |
| **5. Contract (OpenAPI)** | Schema validation | `docs/api/*.yaml` | per PR |
| **6. Front-end unit** | `Jest` | `ui/src/**/*.spec.ts` | per PR |
| **7. Front-end E2E** | `Playwright` | `ui/e2e/**` | nightly |
| **8. Lighthouse perf / a11y** | `lighthouse-ci` (Chrome headless) | `ui/dist/index.html` | nightly |
| **9. Load** | `k6` scripted scenarios | `tests/load/*.js` | nightly |
| **10. Chaos** | `pumba`, custom harness | `tests/chaos/` | weekly |
| **11. Interop** | Syft/Grype/cosign | `tests/interop/` | nightly |
| **12. Offline E2E** | Network-isolated containers | `tests/offline/` | nightly |
| **13. Replay Verification** | Golden corpus replay | `bench/golden-corpus/` | per PR |
| **14. Dependency scanning** | `Trivy fs` + `dotnet list package --vuln` | root | per PR |
| **15. License compliance** | `LicenceFinder` | root | per PR |
| **16. SBOM reproducibility** | `in-toto attestation` diff | GitLab job | release tags |

---

## Test Categories (xUnit Traits)

```csharp
[Trait("Category", "Unit")]        // Fast, isolated unit tests
[Trait("Category", "Integration")] // Tests requiring infrastructure
[Trait("Category", "E2E")]         // Full end-to-end workflows
[Trait("Category", "AirGap")]      // Must work without network
[Trait("Category", "Interop")]     // Third-party tool compatibility
[Trait("Category", "Performance")] // Performance benchmarks
[Trait("Category", "Chaos")]       // Failure injection tests
[Trait("Category", "Security")]    // Security-focused tests
```

---

## Quality Gates

| Metric | Budget | Gate |
|--------|--------|------|
| API unit coverage | ≥ 85% lines | PR merge |
| API response P95 | ≤ 120 ms | nightly alert |
| Δ-SBOM warm scan P95 (4 vCPU) | ≤ 5 s | nightly alert |
| Lighthouse performance score | ≥ 90 | nightly alert |
| Lighthouse accessibility score | ≥ 95 | nightly alert |
| k6 sustained RPS drop | < 5% vs baseline | nightly alert |
| **Replay determinism** | 0 byte diff | **Release** |
| **Interop findings parity** | ≥ 95% | **Release** |
| **Offline E2E** | All pass with no network | **Release** |
| **Unknowns budget (prod)** | ≤ configured limit | **Release** |
| **Router Retry-After compliance** | 100% | nightly |

---

## Local Runner

```bash
# minimal run: unit + property + frontend tests

# full stack incl. Playwright and lighthouse
./scripts/dev-test.sh --full

# category-specific
dotnet test --filter "Category=Unit"
dotnet test --filter "Category=AirGap"
dotnet test --filter "Category=Interop"
```

The script spins up PostgreSQL/Valkey via Testcontainers and requires:

* Docker ≥ 25
* Node 20 (for Jest/Playwright)

### PostgreSQL Testcontainers

Multiple suites (Concelier connectors, Excititor worker/WebService, Scheduler)
use Testcontainers with PostgreSQL for integration tests. If you don't have
Docker available, tests can also run against a local PostgreSQL instance
listening on `127.0.0.1:5432`.

### Local PostgreSQL Helper

Some suites (Concelier WebService/Core, Exporter JSON) need a full
PostgreSQL instance when you want to debug or inspect data with `psql`.

---

## New Test Infrastructure (Epic 5100)

### Run Manifest & Replay

Every scan captures a **Run Manifest** containing all inputs (artifact digests, feed versions, policy versions, PRNG seed). This enables deterministic replay:

```bash
# Replay a scan from manifest
stella replay --manifest run-manifest.json --output verdict.json

# Verify determinism
stella replay verify --manifest run-manifest.json
```
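
Stripped of tooling, the "0 byte diff" determinism gate is just a byte comparison of two replays. A minimal Python sketch (illustrative only, not the `stella` implementation — `replay_is_deterministic` is a hypothetical helper name):

```python
import hashlib

def digest(blob: bytes) -> str:
    """Hex SHA-256 of a serialized verdict."""
    return hashlib.sha256(blob).hexdigest()

def replay_is_deterministic(run_a: bytes, run_b: bytes) -> bool:
    """Two replays of the same Run Manifest must be byte-identical."""
    return digest(run_a) == digest(run_b)

verdict = b'{"cve":"CVE-2024-0001","status":"affected"}'
assert replay_is_deterministic(verdict, verdict)
assert not replay_is_deterministic(verdict, verdict + b"\n")  # one stray byte fails the gate
```

Comparing digests rather than raw bytes keeps the check cheap even for large verdict documents.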

### Evidence Index

The **Evidence Index** links verdicts to their supporting evidence chain:

- Verdict → SBOM digests → Attestation IDs → Tool versions
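
As a sketch, that chain can be modeled as a walk over three lookup tables. The record shapes and field names below are illustrative assumptions, not the actual index schema:

```python
# Hypothetical, simplified evidence index: verdicts link to SBOM digests,
# SBOM digests to attestation IDs, attestations to a tool version.
index = {
    "verdicts": {"verdict-1": {"sbom_digests": ["sha256:aaa"]}},
    "sboms": {"sha256:aaa": {"attestation_ids": ["att-9"]}},
    "attestations": {"att-9": {"tool": "syft", "tool_version": "1.0.0"}},
}

def evidence_chain(verdict_id: str) -> list[str]:
    """Walk verdict -> SBOM digests -> attestation IDs -> tool versions."""
    chain = [verdict_id]
    for digest in index["verdicts"][verdict_id]["sbom_digests"]:
        chain.append(digest)
        for att_id in index["sboms"][digest]["attestation_ids"]:
            att = index["attestations"][att_id]
            chain.append(att_id)
            chain.append(f'{att["tool"]}@{att["tool_version"]}')
    return chain

assert evidence_chain("verdict-1") == ["verdict-1", "sha256:aaa", "att-9", "syft@1.0.0"]
```

A broken link anywhere in the walk (a missing digest or attestation) raises, which is exactly the failure mode evidence-first assertions are meant to surface.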

### Golden Corpus

Located at `bench/golden-corpus/`, the corpus contains 50+ test cases:

- Severity levels (Critical, High, Medium, Low)
- VEX scenarios (Not Affected, Affected, Conflicting)
- Reachability cases (Reachable, Not Reachable, Inconclusive)
- Unknowns scenarios
- Scale tests (200 to 50k+ packages)
- Multi-distro (Alpine, Debian, RHEL, SUSE, Ubuntu)
- Interop fixtures (Syft-generated, Trivy-generated)
- Negative cases (malformed inputs)

### Offline Testing

Inherit from `NetworkIsolatedTestBase` for air-gap compliance:

```csharp
[Trait("Category", "AirGap")]
public class OfflineTests : NetworkIsolatedTestBase
{
    [Fact]
    public async Task Test_WorksOffline()
    {
        // Test implementation
        AssertNoNetworkCalls(); // Fails if network accessed
    }
}
```
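
The guard can be approximated in any language by failing the moment a socket tries to connect. A Python analogue (a sketch that assumes nothing about `NetworkIsolatedTestBase` internals):

```python
import socket

class NetworkIsolation:
    """Approximates an air-gapped test: any connect() attempt raises."""
    def __enter__(self):
        def blocked(sock, addr):
            raise RuntimeError(f"network access attempted: {addr}")
        socket.socket.connect = blocked  # shadow the inherited C implementation
        return self

    def __exit__(self, *exc):
        del socket.socket.connect  # restore the inherited implementation

with NetworkIsolation():
    try:
        # 203.0.113.1 is a reserved TEST-NET address, so no DNS lookup is needed.
        socket.create_connection(("203.0.113.1", 443), timeout=1)
        escaped = True
    except RuntimeError:
        escaped = False
assert not escaped  # any network call is a test failure
```

Patching at the class level means every socket created inside the `with` block is covered, including ones opened indirectly by libraries under test.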

---

## Concelier OSV↔GHSA Parity Fixtures

The Concelier connector suite includes a regression test (`OsvGhsaParityRegressionTests`)
that checks a curated set of GHSA identifiers against OSV responses. The fixture

---

## CI Job Layout

```mermaid
flowchart LR
    I1 --> FE[Jest]
    FE --> E2E[Playwright]
    E2E --> Lighthouse

    subgraph release-gates
        REPLAY[Replay Verify]
        INTEROP[Interop E2E]
        OFFLINE[Offline E2E]
        BUDGET[Unknowns Gate]
    end

    Lighthouse --> INTEG2[Concelier]
    INTEG2 --> LOAD[k6]
    LOAD --> CHAOS[Chaos Suite]
    CHAOS --> RELEASE[Attestation diff]

    RELEASE --> release-gates
```

---

## Adding a New Test Layer

1. Extend `scripts/dev-test.sh` so local contributors get the layer by default.
2. Add a dedicated workflow in `.gitea/workflows/` (or a GitLab job in `.gitlab-ci.yml`).
3. Register the job in `docs/19_TEST_SUITE_OVERVIEW.md` *and* list its metric
   in `docs/metrics/README.md`.
4. If the test requires network isolation, inherit from `NetworkIsolatedTestBase`.
5. If the test uses the golden corpus, add cases to `bench/golden-corpus/`.

---

## Related Documentation

- [Sprint Epic 5100 - Testing Strategy](implplan/SPRINT_5100_SUMMARY.md)
- [tests/AGENTS.md](../tests/AGENTS.md)
- [Offline Operation Guide](24_OFFLINE_KIT.md)
- [Module Architecture Dossiers](modules/)

---

*Last updated 2025-12-21*

# Binaries Schema Specification

**Version:** 1.0.0
**Status:** DRAFT
**Owner:** BinaryIndex Module
**Last Updated:** 2025-12-21

---

## 1. Overview

The `binaries` schema stores binary identity, vulnerability mappings, fingerprints, and patch-aware fix status for the BinaryIndex module. This enables detection of vulnerable binaries independent of package metadata.

## 2. Schema Definition

```sql
-- ============================================================================
-- BINARIES SCHEMA
-- ============================================================================
-- Purpose: Binary identity, fingerprint, and vulnerability mapping for
--          the BinaryIndex module (vulnerable binaries database).
-- ============================================================================

CREATE SCHEMA IF NOT EXISTS binaries;
CREATE SCHEMA IF NOT EXISTS binaries_app;

-- ----------------------------------------------------------------------------
-- RLS Helper Function
-- ----------------------------------------------------------------------------

CREATE OR REPLACE FUNCTION binaries_app.require_current_tenant()
RETURNS TEXT
LANGUAGE plpgsql STABLE SECURITY DEFINER
AS $$
DECLARE
    v_tenant TEXT;
BEGIN
    v_tenant := current_setting('app.tenant_id', true);
    IF v_tenant IS NULL OR v_tenant = '' THEN
        RAISE EXCEPTION 'app.tenant_id session variable not set';
    END IF;
    RETURN v_tenant;
END;
$$;
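
-- Illustrative usage (a sketch, not part of this specification): the helper
-- above is intended for row-level security policies over tenant_id, e.g.:
--
--   ALTER TABLE binaries.binary_identity ENABLE ROW LEVEL SECURITY;
--   CREATE POLICY tenant_isolation ON binaries.binary_identity
--       USING (tenant_id::text = binaries_app.require_current_tenant());
--
-- A session then selects its tenant before querying:
--
--   SET app.tenant_id = '00000000-0000-0000-0000-000000000001';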

-- ============================================================================
-- CORE IDENTITY TABLES
-- ============================================================================

-- ----------------------------------------------------------------------------
-- Table: binary_identity
-- Purpose: Known binary identities extracted from packages
-- ----------------------------------------------------------------------------

CREATE TABLE binaries.binary_identity (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,

    -- Primary identity (Build-ID preferred for ELF)
    binary_key TEXT NOT NULL,      -- build_id || file_sha256 (normalized)
    build_id TEXT,                 -- ELF GNU Build-ID (hex)
    build_id_type TEXT CHECK (build_id_type IN ('gnu-build-id', 'pe-cv', 'macho-uuid')),

    -- Hashes
    file_sha256 TEXT NOT NULL,     -- sha256 of entire file
    text_sha256 TEXT,              -- sha256 of .text section (ELF)
    blake3_hash TEXT,              -- Optional faster hash

    -- Binary metadata
    format TEXT NOT NULL CHECK (format IN ('elf', 'pe', 'macho')),
    architecture TEXT NOT NULL,    -- x86-64, aarch64, arm, etc.
    osabi TEXT,                    -- linux, windows, darwin
    binary_type TEXT CHECK (binary_type IN ('executable', 'shared_library', 'static_library', 'object')),
    is_stripped BOOLEAN DEFAULT FALSE,

    -- Tracking
    first_seen_snapshot_id UUID,
    last_seen_snapshot_id UUID,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),

    CONSTRAINT binary_identity_key_unique UNIQUE (tenant_id, binary_key)
);
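
-- Illustrative row (hypothetical values, following the column comments above):
-- a stripped ELF shared library identified by its GNU Build-ID:
--
--   INSERT INTO binaries.binary_identity
--       (tenant_id, binary_key, build_id, build_id_type,
--        file_sha256, format, architecture, binary_type, is_stripped)
--   VALUES ('00000000-0000-0000-0000-000000000001', 'b4ff11c0aa12',
--           'b4ff11c0aa12', 'gnu-build-id', '9f86d081884c7d65',
--           'elf', 'x86-64', 'shared_library', TRUE);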

-- ----------------------------------------------------------------------------
-- Table: binary_package_map
-- Purpose: Maps binaries to source packages (per snapshot)
-- ----------------------------------------------------------------------------

CREATE TABLE binaries.binary_package_map (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,

    -- Binary reference
    binary_identity_id UUID NOT NULL REFERENCES binaries.binary_identity(id) ON DELETE CASCADE,
    binary_key TEXT NOT NULL,

    -- Package info
    distro TEXT NOT NULL,          -- debian, ubuntu, rhel, alpine
    release TEXT NOT NULL,         -- bookworm, jammy, 9, 3.19
    source_pkg TEXT NOT NULL,      -- Source package name (e.g., openssl)
    binary_pkg TEXT NOT NULL,      -- Binary package name (e.g., libssl3)
    pkg_version TEXT NOT NULL,     -- Full distro version (e.g., 1.1.1n-0+deb11u5)
    pkg_purl TEXT,                 -- PURL if derivable
    architecture TEXT NOT NULL,

    -- File location
    file_path_in_pkg TEXT NOT NULL, -- /usr/lib/x86_64-linux-gnu/libssl.so.3

    -- Snapshot reference
    snapshot_id UUID NOT NULL,

    -- Metadata
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),

    CONSTRAINT binary_package_map_unique UNIQUE (binary_identity_id, snapshot_id, file_path_in_pkg)
);

-- ----------------------------------------------------------------------------
-- Table: corpus_snapshots
-- Purpose: Tracks corpus ingestion snapshots
-- ----------------------------------------------------------------------------

CREATE TABLE binaries.corpus_snapshots (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,

    -- Snapshot identification
    distro TEXT NOT NULL,
    release TEXT NOT NULL,
    architecture TEXT NOT NULL,
    snapshot_id TEXT NOT NULL,     -- Unique snapshot identifier

    -- Content tracking
    packages_processed INT NOT NULL DEFAULT 0,
    binaries_indexed INT NOT NULL DEFAULT 0,
    repo_metadata_digest TEXT,     -- SHA-256 of repo metadata

    -- Signing
    signing_key_id TEXT,
    dsse_envelope_ref TEXT,        -- RustFS reference to DSSE envelope

    -- Status
    status TEXT NOT NULL DEFAULT 'pending' CHECK (status IN ('pending', 'processing', 'completed', 'failed')),
    error TEXT,

    -- Timestamps
    started_at TIMESTAMPTZ,
    completed_at TIMESTAMPTZ,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),

    CONSTRAINT corpus_snapshots_unique UNIQUE (tenant_id, distro, release, architecture, snapshot_id)
);

-- ============================================================================
-- VULNERABILITY MAPPING TABLES
-- ============================================================================

-- ----------------------------------------------------------------------------
-- Table: vulnerable_buildids
-- Purpose: Build-IDs known to be associated with vulnerable packages
-- ----------------------------------------------------------------------------

CREATE TABLE binaries.vulnerable_buildids (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,

    -- Build-ID reference
    buildid_type TEXT NOT NULL CHECK (buildid_type IN ('gnu-build-id', 'pe-cv', 'macho-uuid')),
    buildid_value TEXT NOT NULL,   -- Hex string

    -- Package info
    purl TEXT NOT NULL,            -- Package URL
    pkg_version TEXT NOT NULL,
    distro TEXT,
    release TEXT,

    -- Confidence
    confidence TEXT NOT NULL DEFAULT 'exact' CHECK (confidence IN ('exact', 'inferred', 'heuristic')),

    -- Provenance
    provenance JSONB DEFAULT '{}',
    snapshot_id UUID REFERENCES binaries.corpus_snapshots(id),

    -- Tracking
    indexed_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),

    CONSTRAINT vulnerable_buildids_unique UNIQUE (tenant_id, buildid_value, buildid_type, purl, pkg_version)
);

-- ----------------------------------------------------------------------------
-- Table: binary_vuln_assertion
-- Purpose: CVE status assertions for specific binaries
-- ----------------------------------------------------------------------------

CREATE TABLE binaries.binary_vuln_assertion (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,

    -- Binary reference
    binary_key TEXT NOT NULL,
    binary_identity_id UUID REFERENCES binaries.binary_identity(id),

    -- CVE reference
    cve_id TEXT NOT NULL,
    advisory_id UUID,              -- Reference to vuln.advisories

    -- Status
    status TEXT NOT NULL CHECK (status IN ('affected', 'not_affected', 'fixed', 'unknown')),

    -- Method used to determine status
    method TEXT NOT NULL CHECK (method IN ('range_match', 'buildid_catalog', 'fingerprint_match', 'fix_index')),
    confidence NUMERIC(3,2) CHECK (confidence >= 0 AND confidence <= 1),

    -- Evidence
    evidence_ref TEXT,             -- RustFS reference to evidence bundle
    evidence_digest TEXT,          -- SHA-256 of evidence

    -- Tracking
    evaluated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),

    CONSTRAINT binary_vuln_assertion_unique UNIQUE (tenant_id, binary_key, cve_id)
);

-- ============================================================================
-- FIX INDEX TABLES (Patch-Aware Backport Handling)
-- ============================================================================

-- ----------------------------------------------------------------------------
-- Table: cve_fix_evidence
-- Purpose: Raw evidence of CVE fixes (append-only)
-- ----------------------------------------------------------------------------

CREATE TABLE binaries.cve_fix_evidence (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,

    -- Key fields
    distro TEXT NOT NULL,
    release TEXT NOT NULL,
    source_pkg TEXT NOT NULL,
    cve_id TEXT NOT NULL,

    -- Fix information
    state TEXT NOT NULL CHECK (state IN ('fixed', 'vulnerable', 'not_affected', 'wontfix', 'unknown')),
    fixed_version TEXT,            -- Distro version string (nullable for not_affected)

    -- Method and confidence
    method TEXT NOT NULL CHECK (method IN ('security_feed', 'changelog', 'patch_header', 'upstream_patch_match')),
    confidence NUMERIC(3,2) NOT NULL CHECK (confidence >= 0 AND confidence <= 1),

    -- Evidence details
    evidence JSONB NOT NULL,       -- Method-specific evidence payload

    -- Snapshot reference
    snapshot_id UUID REFERENCES binaries.corpus_snapshots(id),

    -- Tracking
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- ----------------------------------------------------------------------------
-- Table: cve_fix_index
-- Purpose: Merged best-record for CVE fix status per distro/package
-- ----------------------------------------------------------------------------

CREATE TABLE binaries.cve_fix_index (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,

    -- Key fields
    distro TEXT NOT NULL,
    release TEXT NOT NULL,
    source_pkg TEXT NOT NULL,
    cve_id TEXT NOT NULL,
    architecture TEXT,             -- NULL means all architectures

    -- Fix status
    state TEXT NOT NULL CHECK (state IN ('fixed', 'vulnerable', 'not_affected', 'wontfix', 'unknown')),
    fixed_version TEXT,

    -- Merge metadata
    primary_method TEXT NOT NULL,  -- Method of highest-confidence evidence
    confidence NUMERIC(3,2) NOT NULL,
    evidence_ids UUID[],           -- References to cve_fix_evidence

    -- Tracking
    computed_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),

    CONSTRAINT cve_fix_index_unique UNIQUE (tenant_id, distro, release, source_pkg, cve_id, architecture)
);
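
-- Illustrative lookup (a hypothetical query, not part of the schema): resolve
-- the merged fix status for one distro/package/CVE, preferring an
-- architecture-specific row over the catch-all (architecture IS NULL) row:
--
--   SELECT state, fixed_version, primary_method, confidence
--   FROM binaries.cve_fix_index
--   WHERE distro = 'debian' AND release = 'bookworm'
--     AND source_pkg = 'openssl' AND cve_id = 'CVE-2023-0464'
--     AND (architecture = 'amd64' OR architecture IS NULL)
--   ORDER BY architecture NULLS LAST
--   LIMIT 1;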

-- ============================================================================
-- FINGERPRINT TABLES
-- ============================================================================

-- ----------------------------------------------------------------------------
-- Table: vulnerable_fingerprints
-- Purpose: Function fingerprints for CVE detection
-- ----------------------------------------------------------------------------

CREATE TABLE binaries.vulnerable_fingerprints (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,

    -- CVE and component
    cve_id TEXT NOT NULL,
    component TEXT NOT NULL,       -- e.g., openssl, glibc
    purl TEXT,                     -- Package URL if applicable

    -- Fingerprint data
    algorithm TEXT NOT NULL CHECK (algorithm IN ('basic_block', 'control_flow_graph', 'string_refs', 'combined')),
    fingerprint_id TEXT NOT NULL,  -- Unique ID (e.g., "bb-abc123...")
    fingerprint_hash BYTEA NOT NULL, -- Raw fingerprint bytes (16-32 bytes)
    architecture TEXT NOT NULL,    -- x86-64, aarch64

    -- Function hints
    function_name TEXT,            -- Original function name if known
    source_file TEXT,              -- Source file path
    source_line INT,

    -- Confidence and validation
    similarity_threshold NUMERIC(3,2) DEFAULT 0.95,
    confidence NUMERIC(3,2) CHECK (confidence >= 0 AND confidence <= 1),
    validated BOOLEAN DEFAULT FALSE,
    validation_stats JSONB DEFAULT '{}', -- precision, recall, etc.

    -- Reference builds
    vuln_build_ref TEXT,           -- RustFS ref to vulnerable reference build
    fixed_build_ref TEXT,          -- RustFS ref to fixed reference build

    -- Metadata
    notes TEXT,
    evidence_ref TEXT,             -- RustFS ref to evidence bundle

    -- Tracking
    indexed_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),

    CONSTRAINT vulnerable_fingerprints_unique UNIQUE (tenant_id, cve_id, algorithm, fingerprint_id, architecture)
);
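
-- Illustrative match flow (a sketch of hypothetical scan-side usage): fetch
-- candidate fingerprints for the scanned architecture, then accept a match
-- only when the computed similarity meets the row's own threshold:
--
--   SELECT id, cve_id, fingerprint_hash, similarity_threshold
--   FROM binaries.vulnerable_fingerprints
--   WHERE architecture = 'x86-64'
--     AND algorithm = 'basic_block'
--     AND validated;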
|
||||||
|
|
||||||
|
-- ----------------------------------------------------------------------------
|
||||||
|
-- Table: fingerprint_corpus_metadata
|
||||||
|
-- Purpose: Tracks which packages have been fingerprinted
|
||||||
|
-- ----------------------------------------------------------------------------
|
||||||
|
|
||||||
|
CREATE TABLE binaries.fingerprint_corpus_metadata (
|
||||||
|
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
|
||||||
|
tenant_id UUID NOT NULL,
|
||||||
|
|
||||||
|
-- Package identification
|
||||||
|
purl TEXT NOT NULL,
|
||||||
|
version TEXT NOT NULL,
|
||||||
|
|
||||||
|
-- Fingerprinting info
|
||||||
|
algorithm TEXT NOT NULL,
|
||||||
|
binary_digest TEXT, -- sha256 of the binary analyzed
|
||||||
|
|
||||||
|
-- Statistics
|
||||||
|
function_count INT NOT NULL DEFAULT 0,
|
||||||
|
fingerprints_indexed INT NOT NULL DEFAULT 0,
|
||||||
|
|
||||||
|
-- Provenance
|
||||||
|
indexed_by TEXT, -- Service/user that indexed
|
||||||
|
indexed_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
|
||||||
|
|
||||||
|
-- Tracking
|
||||||
|
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
|
||||||
|
|
||||||
|
CONSTRAINT fingerprint_corpus_metadata_unique UNIQUE (tenant_id, purl, version, algorithm)
|
||||||
|
);
|
||||||
|
|
||||||
|
-- ============================================================================
-- MATCH RESULTS TABLES
-- ============================================================================

-- ----------------------------------------------------------------------------
-- Table: fingerprint_matches
-- Purpose: Records fingerprint matches during scans
-- ----------------------------------------------------------------------------

CREATE TABLE binaries.fingerprint_matches (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,

    -- Scan reference
    scan_id UUID NOT NULL,                 -- Reference to scanner.scan_manifest

    -- Match details
    match_type TEXT NOT NULL CHECK (match_type IN ('fingerprint', 'buildid', 'hash_exact')),
    binary_key TEXT NOT NULL,
    binary_identity_id UUID REFERENCES binaries.binary_identity(id),

    -- Vulnerable package
    vulnerable_purl TEXT NOT NULL,
    vulnerable_version TEXT NOT NULL,

    -- Fingerprint match specifics (nullable for non-fingerprint matches)
    matched_fingerprint_id UUID REFERENCES binaries.vulnerable_fingerprints(id),
    matched_function TEXT,
    similarity NUMERIC(3,2),               -- 0.00-1.00

    -- CVE linkage
    advisory_ids TEXT[],                   -- Linked CVE/GHSA IDs

    -- Reachability (populated later by Scanner)
    reachability_status TEXT CHECK (reachability_status IN ('reachable', 'unreachable', 'unknown', 'partial')),

    -- Evidence
    evidence JSONB DEFAULT '{}',

    -- Tracking
    matched_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- ============================================================================
-- INDEXES
-- ============================================================================

-- binary_identity indexes
CREATE INDEX idx_binary_identity_tenant ON binaries.binary_identity(tenant_id);
CREATE INDEX idx_binary_identity_buildid ON binaries.binary_identity(build_id) WHERE build_id IS NOT NULL;
CREATE INDEX idx_binary_identity_sha256 ON binaries.binary_identity(file_sha256);
CREATE INDEX idx_binary_identity_key ON binaries.binary_identity(binary_key);

-- binary_package_map indexes
CREATE INDEX idx_binary_package_map_tenant ON binaries.binary_package_map(tenant_id);
CREATE INDEX idx_binary_package_map_binary ON binaries.binary_package_map(binary_identity_id);
CREATE INDEX idx_binary_package_map_distro ON binaries.binary_package_map(distro, release, source_pkg);
CREATE INDEX idx_binary_package_map_snapshot ON binaries.binary_package_map(snapshot_id);
CREATE INDEX idx_binary_package_map_purl ON binaries.binary_package_map(pkg_purl) WHERE pkg_purl IS NOT NULL;

-- corpus_snapshots indexes
CREATE INDEX idx_corpus_snapshots_tenant ON binaries.corpus_snapshots(tenant_id);
CREATE INDEX idx_corpus_snapshots_distro ON binaries.corpus_snapshots(distro, release, architecture);
CREATE INDEX idx_corpus_snapshots_status ON binaries.corpus_snapshots(status) WHERE status IN ('pending', 'processing');

-- vulnerable_buildids indexes
CREATE INDEX idx_vulnerable_buildids_tenant ON binaries.vulnerable_buildids(tenant_id);
CREATE INDEX idx_vulnerable_buildids_value ON binaries.vulnerable_buildids(buildid_type, buildid_value);
CREATE INDEX idx_vulnerable_buildids_purl ON binaries.vulnerable_buildids(purl);

-- binary_vuln_assertion indexes
CREATE INDEX idx_binary_vuln_assertion_tenant ON binaries.binary_vuln_assertion(tenant_id);
CREATE INDEX idx_binary_vuln_assertion_binary ON binaries.binary_vuln_assertion(binary_key);
CREATE INDEX idx_binary_vuln_assertion_cve ON binaries.binary_vuln_assertion(cve_id);
CREATE INDEX idx_binary_vuln_assertion_status ON binaries.binary_vuln_assertion(status) WHERE status = 'affected';

-- cve_fix_evidence indexes
CREATE INDEX idx_cve_fix_evidence_tenant ON binaries.cve_fix_evidence(tenant_id);
CREATE INDEX idx_cve_fix_evidence_key ON binaries.cve_fix_evidence(distro, release, source_pkg, cve_id);

-- cve_fix_index indexes
CREATE INDEX idx_cve_fix_index_tenant ON binaries.cve_fix_index(tenant_id);
CREATE INDEX idx_cve_fix_index_lookup ON binaries.cve_fix_index(distro, release, source_pkg, cve_id);
CREATE INDEX idx_cve_fix_index_state ON binaries.cve_fix_index(state) WHERE state = 'fixed';

-- vulnerable_fingerprints indexes
CREATE INDEX idx_vulnerable_fingerprints_tenant ON binaries.vulnerable_fingerprints(tenant_id);
CREATE INDEX idx_vulnerable_fingerprints_cve ON binaries.vulnerable_fingerprints(cve_id);
CREATE INDEX idx_vulnerable_fingerprints_component ON binaries.vulnerable_fingerprints(component, architecture);
CREATE INDEX idx_vulnerable_fingerprints_hash ON binaries.vulnerable_fingerprints USING hash (fingerprint_hash);
CREATE INDEX idx_vulnerable_fingerprints_validated ON binaries.vulnerable_fingerprints(validated) WHERE validated = TRUE;

-- fingerprint_corpus_metadata indexes
CREATE INDEX idx_fingerprint_corpus_tenant ON binaries.fingerprint_corpus_metadata(tenant_id);
CREATE INDEX idx_fingerprint_corpus_purl ON binaries.fingerprint_corpus_metadata(purl, version);

-- fingerprint_matches indexes
CREATE INDEX idx_fingerprint_matches_tenant ON binaries.fingerprint_matches(tenant_id);
CREATE INDEX idx_fingerprint_matches_scan ON binaries.fingerprint_matches(scan_id);
CREATE INDEX idx_fingerprint_matches_type ON binaries.fingerprint_matches(match_type);
CREATE INDEX idx_fingerprint_matches_purl ON binaries.fingerprint_matches(vulnerable_purl);

-- ============================================================================
-- ROW-LEVEL SECURITY
-- ============================================================================

-- Enable RLS on all tenant-scoped tables
ALTER TABLE binaries.binary_identity ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.binary_identity FORCE ROW LEVEL SECURITY;
CREATE POLICY binary_identity_tenant_isolation ON binaries.binary_identity
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.binary_package_map ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.binary_package_map FORCE ROW LEVEL SECURITY;
CREATE POLICY binary_package_map_tenant_isolation ON binaries.binary_package_map
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.corpus_snapshots ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.corpus_snapshots FORCE ROW LEVEL SECURITY;
CREATE POLICY corpus_snapshots_tenant_isolation ON binaries.corpus_snapshots
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.vulnerable_buildids ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.vulnerable_buildids FORCE ROW LEVEL SECURITY;
CREATE POLICY vulnerable_buildids_tenant_isolation ON binaries.vulnerable_buildids
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.binary_vuln_assertion ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.binary_vuln_assertion FORCE ROW LEVEL SECURITY;
CREATE POLICY binary_vuln_assertion_tenant_isolation ON binaries.binary_vuln_assertion
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.cve_fix_evidence ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.cve_fix_evidence FORCE ROW LEVEL SECURITY;
CREATE POLICY cve_fix_evidence_tenant_isolation ON binaries.cve_fix_evidence
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.cve_fix_index ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.cve_fix_index FORCE ROW LEVEL SECURITY;
CREATE POLICY cve_fix_index_tenant_isolation ON binaries.cve_fix_index
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.vulnerable_fingerprints ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.vulnerable_fingerprints FORCE ROW LEVEL SECURITY;
CREATE POLICY vulnerable_fingerprints_tenant_isolation ON binaries.vulnerable_fingerprints
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.fingerprint_corpus_metadata ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.fingerprint_corpus_metadata FORCE ROW LEVEL SECURITY;
CREATE POLICY fingerprint_corpus_metadata_tenant_isolation ON binaries.fingerprint_corpus_metadata
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.fingerprint_matches ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.fingerprint_matches FORCE ROW LEVEL SECURITY;
CREATE POLICY fingerprint_matches_tenant_isolation ON binaries.fingerprint_matches
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());
```
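
Every connection must establish its tenant context before touching these tables, or `require_current_tenant()` will reject the statement. A minimal sketch, assuming `binaries_app.require_current_tenant()` reads a session setting such as `app.tenant_id` (the actual GUC name is defined alongside that function and may differ):

```sql
-- Sketch only: 'app.tenant_id' is an assumed GUC name, not confirmed by this schema.
BEGIN;
SET LOCAL app.tenant_id = '11111111-1111-1111-1111-111111111111';

-- With the tenant set, RLS restricts every statement to that tenant's rows.
SELECT count(*) FROM binaries.fingerprint_matches;
COMMIT;
```

`SET LOCAL` scopes the setting to the transaction, so pooled connections cannot leak one tenant's context into the next request.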

---

## 3. Table Relationships

```
┌──────────────────────────────────────────────────────────────────────────────┐
│                               BINARIES SCHEMA                                │
│                                                                              │
│  ┌────────────────────┐          ┌────────────────────┐                      │
│  │ corpus_snapshots   │<─────────│ binary_package_map │                      │
│  │ (ingestion state)  │          │ (binary→pkg)       │                      │
│  └─────────┬──────────┘          └────────┬───────────┘                      │
│            │                              │                                  │
│            │                              ▼                                  │
│            │                     ┌────────────────────┐                      │
│            └────────────────────>│ binary_identity    │<──────────────────┐  │
│                                  │ (Build-ID, hashes) │                   │  │
│                                  └────────┬───────────┘                   │  │
│                                           │                               │  │
│           ┌───────────────────────────────┼───────────────────────────────┤  │
│           │                               │                               │  │
│           ▼                               ▼                               ▼  │
│  ┌────────────────────┐        ┌─────────────────────┐        ┌──────────────┐
│  │ vulnerable_buildids│        │ binary_vuln_        │        │ fingerprint_ │
│  │ (known vuln builds)│        │ assertion           │        │ matches      │
│  └────────────────────┘        │ (CVE status)        │        │(scan results)│
│                                └─────────────────────┘        └──────────────┘
│                                                                              │
│  ┌────────────────────────────────────────────────────────────────────────┐  │
│  │                         FIX INDEX (Patch-Aware)                        │  │
│  │  ┌────────────────────┐          ┌────────────────────┐                │  │
│  │  │ cve_fix_evidence   │─────────>│ cve_fix_index      │                │  │
│  │  │ (raw evidence)     │  merge   │ (merged best)      │                │  │
│  │  └────────────────────┘          └────────────────────┘                │  │
│  └────────────────────────────────────────────────────────────────────────┘  │
│                                                                              │
│  ┌────────────────────────────────────────────────────────────────────────┐  │
│  │                              FINGERPRINTS                              │  │
│  │  ┌────────────────────┐          ┌──────────────────────┐              │  │
│  │  │ vulnerable_        │          │ fingerprint_corpus_  │              │  │
│  │  │ fingerprints       │          │ metadata             │              │  │
│  │  │ (CVE fingerprints) │          │ (what's indexed)     │              │  │
│  │  └────────────────────┘          └──────────────────────┘              │  │
│  └────────────────────────────────────────────────────────────────────────┘  │
└──────────────────────────────────────────────────────────────────────────────┘
```

---

## 4. Query Patterns

### 4.1 Lookup by Build-ID

```sql
-- Find vulnerabilities for a specific Build-ID
SELECT ba.cve_id, ba.status, ba.confidence, ba.method
FROM binaries.binary_vuln_assertion ba
JOIN binaries.binary_identity bi ON bi.binary_key = ba.binary_key
WHERE bi.build_id = :build_id
  AND bi.build_id_type = 'gnu-build-id'
  AND ba.status = 'affected';
```

### 4.2 Check Fix Status (Patch-Aware)

```sql
-- Check if a CVE is fixed for a specific distro/package
SELECT cfi.state, cfi.fixed_version, cfi.confidence, cfi.primary_method
FROM binaries.cve_fix_index cfi
WHERE cfi.distro = :distro
  AND cfi.release = :release
  AND cfi.source_pkg = :source_pkg
  AND cfi.cve_id = :cve_id;
```

### 4.3 Fingerprint Similarity Search

```sql
-- Find fingerprints with similar hash (requires application-level similarity)
SELECT vf.cve_id, vf.component, vf.function_name, vf.confidence
FROM binaries.vulnerable_fingerprints vf
WHERE vf.algorithm = :algorithm
  AND vf.architecture = :architecture
  AND vf.validated = TRUE
-- Application performs similarity comparison on fingerprint_hash
```
---

## 5. Migration Strategy

### 5.1 Initial Migration

```sql
-- V001__create_binaries_schema.sql
-- Creates all tables, indexes, and RLS policies
```

### 5.2 Seed Data

```sql
-- S001__seed_reference_fingerprints.sql
-- Seeds fingerprints for high-impact CVEs from golden corpus
```

---

## 6. Performance Considerations

### 6.1 Table Sizing Estimates

| Table | Expected Rows | Growth Rate |
|-------|---------------|-------------|
| binary_identity | 10M | 1M/month |
| binary_package_map | 50M | 5M/month |
| vulnerable_buildids | 1M | 100K/month |
| cve_fix_index | 500K | 50K/month |
| vulnerable_fingerprints | 100K | 10K/month |
| fingerprint_matches | 10M | 1M/month |

### 6.2 Partitioning Candidates

- `fingerprint_matches` - Partition by `matched_at` (monthly)
- `cve_fix_evidence` - Partition by `created_at` (monthly)
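
A minimal sketch of the first candidate, assuming PostgreSQL declarative range partitioning. Note that PostgreSQL requires the partition key to be part of every unique constraint, so the primary key would become `(id, matched_at)` rather than `(id)` as in the table definition above:

```sql
-- Hypothetical partitioned variant of binaries.fingerprint_matches (sketch only).
CREATE TABLE binaries.fingerprint_matches_partitioned (
    id UUID NOT NULL DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,
    matched_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    -- ... remaining columns as in binaries.fingerprint_matches ...
    PRIMARY KEY (id, matched_at)
) PARTITION BY RANGE (matched_at);

-- One partition per month, created ahead of time by a maintenance job:
CREATE TABLE binaries.fingerprint_matches_2025_12
    PARTITION OF binaries.fingerprint_matches_partitioned
    FOR VALUES FROM ('2025-12-01') TO ('2026-01-01');
```

Monthly partitions keep the hot scan-result data small and make retention a cheap `DROP TABLE` of old partitions instead of a bulk `DELETE`.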

### 6.3 Index Maintenance

- Hash index on `fingerprint_hash` for exact matches
- Consider bloom filter for fingerprint similarity pre-filtering

---

*Document Version: 1.0.0*
*Last Updated: 2025-12-21*

378  docs/implplan/SPRINT_3600_0001_0001_gateway_webservice.md  Normal file
@@ -0,0 +1,378 @@

# Sprint 3600.0001.0001 · Gateway WebService — HTTP Ingress Implementation

## Topic & Scope
- Implement the missing `StellaOps.Gateway.WebService` HTTP ingress service.
- This is the single entry point for all external HTTP traffic, routing to microservices via the Router binary protocol.
- Connects the existing `StellaOps.Router.Gateway` library to a production-ready ASP.NET Core host.
- **Working directory:** `src/Gateway/StellaOps.Gateway.WebService/`

## Dependencies & Concurrency
- **Upstream**: `StellaOps.Router.Gateway`, `StellaOps.Router.Transport.*`, `StellaOps.Auth.ServerIntegration`
- **Downstream**: All external API consumers, CLI, UI
- **Safe to parallelize with**: Sprints 3600.0002.*, 4200.*, 5200.*

## Documentation Prerequisites
- `docs/modules/router/architecture.md` (canonical Router specification)
- `docs/modules/gateway/openapi.md` (OpenAPI aggregation)
- `docs/product-advisories/archived/2025-12-21-reference-architecture/20-Dec-2025 - Stella Ops Reference Architecture.md`
- `docs/07_HIGH_LEVEL_ARCHITECTURE.md` Section 7 (APIs)

---

## Tasks

### T1: Project Scaffolding

**Assignee**: Platform Team
**Story Points**: 3
**Status**: TODO

**Description**:
Create the Gateway.WebService project with proper structure and dependencies.

**Implementation Path**: `src/Gateway/StellaOps.Gateway.WebService/`

**Acceptance Criteria**:
- [ ] `StellaOps.Gateway.WebService.csproj` targeting `net10.0`
- [ ] References: `StellaOps.Router.Gateway`, `StellaOps.Auth.ServerIntegration`, `StellaOps.Router.Transport.Tcp`, `StellaOps.Router.Transport.Tls`
- [ ] `Program.cs` with minimal viable bootstrap
- [ ] `appsettings.json` and `appsettings.Development.json`
- [ ] Dockerfile for containerized deployment
- [ ] Added to `StellaOps.sln`

**Project Structure**:
```
src/Gateway/
├── StellaOps.Gateway.WebService/
│   ├── StellaOps.Gateway.WebService.csproj
│   ├── Program.cs
│   ├── Dockerfile
│   ├── appsettings.json
│   ├── appsettings.Development.json
│   ├── Configuration/
│   │   └── GatewayOptions.cs
│   ├── Middleware/
│   │   ├── TenantMiddleware.cs
│   │   ├── RequestRoutingMiddleware.cs
│   │   └── HealthCheckMiddleware.cs
│   └── Services/
│       ├── GatewayHostedService.cs
│       └── OpenApiAggregationService.cs
```

---

### T2: Gateway Host Service

**Assignee**: Platform Team
**Story Points**: 5
**Status**: TODO

**Description**:
Implement the hosted service that manages Router transport connections and microservice registration.

**Acceptance Criteria**:
- [ ] `GatewayHostedService` : `IHostedService`
- [ ] Starts TCP/TLS transport servers on configured ports
- [ ] Handles HELLO frames from microservices
- [ ] Maintains connection health via heartbeats
- [ ] Graceful shutdown with DRAINING state propagation
- [ ] Metrics: active_connections, registered_endpoints

**Code Spec**:
```csharp
public sealed class GatewayHostedService : IHostedService, IDisposable
{
    private readonly ITransportServer _tcpServer;
    private readonly ITransportServer _tlsServer;
    private readonly IRoutingStateManager _routingState;
    private readonly GatewayOptions _options;
    private readonly ILogger<GatewayHostedService> _logger;

    public async Task StartAsync(CancellationToken ct)
    {
        // Both transports share the same lifecycle handlers.
        _tcpServer.OnHelloReceived += HandleHelloAsync;
        _tcpServer.OnHeartbeatReceived += HandleHeartbeatAsync;
        _tcpServer.OnConnectionClosed += HandleDisconnectAsync;
        _tlsServer.OnHelloReceived += HandleHelloAsync;
        _tlsServer.OnHeartbeatReceived += HandleHeartbeatAsync;
        _tlsServer.OnConnectionClosed += HandleDisconnectAsync;

        await _tcpServer.StartAsync(ct);
        await _tlsServer.StartAsync(ct);

        _logger.LogInformation("Gateway started on TCP:{TcpPort} TLS:{TlsPort}",
            _options.TcpPort, _options.TlsPort);
    }

    public async Task StopAsync(CancellationToken ct)
    {
        await _routingState.DrainAllConnectionsAsync(ct);
        await _tcpServer.StopAsync(ct);
        await _tlsServer.StopAsync(ct);
    }
}
```

---

### T3: Request Routing Middleware

**Assignee**: Platform Team
**Story Points**: 5
**Status**: TODO

**Description**:
Implement the core HTTP-to-binary routing middleware.

**Acceptance Criteria**:
- [ ] `RequestRoutingMiddleware` intercepts all non-system routes
- [ ] Extracts `(Method, Path)` from HTTP request
- [ ] Looks up endpoint in routing state
- [ ] Serializes HTTP request to binary frame
- [ ] Sends to selected microservice instance
- [ ] Deserializes binary response to HTTP response
- [ ] Supports streaming responses (chunked transfer)
- [ ] Propagates cancellation on client disconnect
- [ ] Request correlation ID in X-Correlation-Id header

**Routing Flow**:
```
HTTP Request → Middleware → RoutingState.SelectInstance()
                                      ↓
                      TransportClient.SendRequestAsync()
                                      ↓
                        Microservice processes
                                      ↓
                    TransportClient.ReceiveResponseAsync()
                                      ↓
HTTP Response ← Middleware ← Response Frame
```
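
The flow above can be sketched as ASP.NET Core middleware. This is illustrative only: `SelectInstance`, `RequestFrame.FromHttpAsync`, and `CopyToHttpAsync` are assumed member names, not final `Router.Gateway` APIs.

```csharp
// Sketch under assumptions: IRoutingStateManager, RequestFrame, and the
// transport client surface are placeholders for the real Router types.
public sealed class RequestRoutingMiddleware
{
    private readonly RequestDelegate _next;
    private readonly IRoutingStateManager _routingState;

    public RequestRoutingMiddleware(RequestDelegate next, IRoutingStateManager routingState)
    {
        _next = next;
        _routingState = routingState;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        // System routes (health, metrics, openapi) bypass binary routing.
        if (context.Request.Path.StartsWithSegments("/health") ||
            context.Request.Path.StartsWithSegments("/metrics") ||
            context.Request.Path.StartsWithSegments("/openapi"))
        {
            await _next(context);
            return;
        }

        var instance = _routingState.SelectInstance(context.Request.Method, context.Request.Path);
        if (instance is null)
        {
            context.Response.StatusCode = StatusCodes.Status503ServiceUnavailable;
            return;
        }

        // Serialize request → binary frame, forward, and relay the response.
        // context.RequestAborted propagates client disconnects downstream.
        var frame = await RequestFrame.FromHttpAsync(context.Request, context.RequestAborted);
        var response = await instance.Client.SendRequestAsync(frame, context.RequestAborted);
        await response.CopyToHttpAsync(context.Response, context.RequestAborted);
    }
}
```

Passing `context.RequestAborted` through every async call is what satisfies the "propagates cancellation on client disconnect" criterion.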

---

### T4: Authentication & Authorization Integration

**Assignee**: Platform Team
**Story Points**: 5
**Status**: TODO

**Description**:
Integrate Authority DPoP/mTLS validation and claims-based authorization.

**Acceptance Criteria**:
- [ ] DPoP token validation via `StellaOps.Auth.ServerIntegration`
- [ ] mTLS certificate binding validation
- [ ] Claims extraction and propagation to microservices
- [ ] Endpoint-level authorization based on `RequiringClaims`
- [ ] Tenant context extraction from `tid` claim
- [ ] Rate limiting per tenant/identity
- [ ] Audit logging of auth failures

**Claims Propagation**:
```csharp
// Claims are serialized into request frame headers
var claims = new Dictionary<string, string>
{
    ["sub"] = principal.FindFirst("sub")?.Value ?? "",
    ["tid"] = principal.FindFirst("tid")?.Value ?? "",
    ["scope"] = string.Join(" ", principal.FindAll("scope").Select(c => c.Value)),
    ["cnf.jkt"] = principal.FindFirst("cnf.jkt")?.Value ?? ""
};
requestFrame.Headers = claims;
```

---

### T5: OpenAPI Aggregation Endpoint

**Assignee**: Platform Team
**Story Points**: 3
**Status**: TODO

**Description**:
Implement aggregated OpenAPI 3.1.0 spec generation from registered endpoints.

**Acceptance Criteria**:
- [ ] `GET /openapi.json` returns aggregated spec
- [ ] `GET /openapi.yaml` returns YAML format
- [ ] TTL-based caching (5 min default)
- [ ] ETag generation for conditional requests
- [ ] Schema validation before aggregation
- [ ] Includes all registered endpoints with their schemas
- [ ] Info section populated from gateway config
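
The TTL-plus-ETag behaviour in the criteria above can be sketched as a minimal endpoint. `GetAggregatedSpecAsync` is an assumed method on `OpenApiAggregationService` for illustration:

```csharp
// Sketch only: GetAggregatedSpecAsync (returning cached JSON plus its ETag)
// is a hypothetical method name for this sprint's aggregation service.
app.MapGet("/openapi.json", async (HttpContext ctx, OpenApiAggregationService svc) =>
{
    var (json, etag) = await svc.GetAggregatedSpecAsync(ctx.RequestAborted);

    // Conditional request: client already holds the current spec.
    if (ctx.Request.Headers.IfNoneMatch == etag)
    {
        ctx.Response.StatusCode = StatusCodes.Status304NotModified;
        return;
    }

    ctx.Response.Headers.ETag = etag;
    ctx.Response.Headers.CacheControl = "max-age=300"; // mirrors cacheTtlSeconds
    ctx.Response.ContentType = "application/json";
    await ctx.Response.WriteAsync(json, ctx.RequestAborted);
});
```

The 304 path lets clients poll cheaply while the 5-minute server-side TTL bounds how stale an aggregated spec can be.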

---

### T6: Health & Readiness Endpoints

**Assignee**: Platform Team
**Story Points**: 2
**Status**: TODO

**Description**:
Implement health check endpoints for orchestration platforms.

**Acceptance Criteria**:
- [ ] `GET /health/live` - Liveness probe (process alive)
- [ ] `GET /health/ready` - Readiness probe (accepting traffic)
- [ ] `GET /health/startup` - Startup probe (initialization complete)
- [ ] Downstream health aggregation from connected microservices
- [ ] Metrics endpoint at `/metrics` (Prometheus format)

---

### T7: Configuration & Options

**Assignee**: Platform Team
**Story Points**: 3
**Status**: TODO

**Description**:
Define comprehensive gateway configuration model.

**Acceptance Criteria**:
- [ ] `GatewayOptions` with all configurable settings
- [ ] YAML configuration support
- [ ] Environment variable overrides
- [ ] Configuration validation on startup
- [ ] Hot-reload for non-transport settings

**Configuration Spec**:
```yaml
gateway:
  node:
    region: "eu1"
    nodeId: "gw-eu1-01"
    environment: "prod"

  transports:
    tcp:
      enabled: true
      port: 9100
      maxConnections: 1000
    tls:
      enabled: true
      port: 9443
      certificatePath: "/certs/gateway.pfx"
      clientCertificateMode: "RequireCertificate"

  routing:
    defaultTimeout: "30s"
    maxRequestBodySize: "100MB"
    streamingEnabled: true
    neighborRegions: ["eu2", "us1"]

  auth:
    dpopEnabled: true
    mtlsEnabled: true
    rateLimiting:
      enabled: true
      requestsPerMinute: 1000
      burstSize: 100

  openapi:
    enabled: true
    cacheTtlSeconds: 300
```

---

### T8: Unit Tests

**Assignee**: Platform Team
**Story Points**: 3
**Status**: TODO

**Description**:
Comprehensive unit tests for gateway components.

**Acceptance Criteria**:
- [ ] Routing middleware tests (happy path, errors, timeouts)
- [ ] Instance selection algorithm tests
- [ ] Claims extraction tests
- [ ] Configuration validation tests
- [ ] OpenAPI aggregation tests
- [ ] 90%+ code coverage

---

### T9: Integration Tests

**Assignee**: Platform Team
**Story Points**: 5
**Status**: TODO

**Description**:
End-to-end integration tests with in-memory transport.

**Acceptance Criteria**:
- [ ] Request routing through gateway to mock microservice
- [ ] Streaming response handling
- [ ] Cancellation propagation
- [ ] Auth flow integration
- [ ] Multi-instance load balancing
- [ ] Health check aggregation
- [ ] Uses `StellaOps.Router.Transport.InMemory` for testing

---

### T10: Documentation

**Assignee**: Platform Team
**Story Points**: 2
**Status**: TODO

**Description**:
Create gateway architecture documentation.

**Acceptance Criteria**:
- [ ] `docs/modules/gateway/architecture.md` - Full architecture card
- [ ] Update `docs/07_HIGH_LEVEL_ARCHITECTURE.md` with gateway details
- [ ] Operator runbook for deployment and troubleshooting
- [ ] Configuration reference

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Platform Team | Project Scaffolding |
| 2 | T2 | TODO | T1 | Platform Team | Gateway Host Service |
| 3 | T3 | TODO | T2 | Platform Team | Request Routing Middleware |
| 4 | T4 | TODO | T1 | Platform Team | Auth & Authorization Integration |
| 5 | T5 | TODO | T2 | Platform Team | OpenAPI Aggregation Endpoint |
| 6 | T6 | TODO | T1 | Platform Team | Health & Readiness Endpoints |
| 7 | T7 | TODO | T1 | Platform Team | Configuration & Options |
| 8 | T8 | TODO | T1-T7 | Platform Team | Unit Tests |
| 9 | T9 | TODO | T8 | Platform Team | Integration Tests |
| 10 | T10 | TODO | T1-T9 | Platform Team | Documentation |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Reference Architecture advisory gap analysis. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Single ingress point | Decision | Platform Team | All HTTP traffic goes through Gateway.WebService |
| Binary protocol only for internal | Decision | Platform Team | No HTTP between Gateway and microservices |
| TLS required for production | Decision | Platform Team | TCP transport only for development/testing |
| DPoP + mTLS dual support | Decision | Platform Team | Both auth mechanisms supported concurrently |

---

## Success Criteria

- [ ] Gateway accepts HTTP requests and routes to microservices via binary protocol
- [ ] All existing Router.Gateway tests pass
- [ ] `tests/StellaOps.Gateway.WebService.Tests/` project references work (no longer orphaned)
- [ ] OpenAPI spec aggregation functional
- [ ] Auth integration with Authority validated
- [ ] Performance: <5ms routing overhead at P99

**Sprint Status**: TODO (0/10 tasks complete)

309  docs/implplan/SPRINT_3600_0002_0001_cyclonedx_1_7_upgrade.md  Normal file
@@ -0,0 +1,309 @@

# Sprint 3600.0002.0001 · CycloneDX 1.7 Upgrade — SBOM Format Migration

## Topic & Scope
- Upgrade all CycloneDX SBOM generation from version 1.6 to version 1.7.
- Update serialization, parsing, and validation to CycloneDX 1.7 specification.
- Maintain backward compatibility for reading CycloneDX 1.6 documents.
- **Working directory:** `src/Scanner/__Libraries/StellaOps.Scanner.Emit/`, `src/SbomService/`, `src/Excititor/`

## Dependencies & Concurrency
- **Upstream**: CycloneDX Core NuGet package update
- **Downstream**: All SBOM consumers (Policy, Excititor, ExportCenter)
- **Safe to parallelize with**: Sprints 3600.0003.*, 4200.*, 5200.*

## Documentation Prerequisites
- CycloneDX 1.7 Specification: https://cyclonedx.org/docs/1.7/
- `docs/modules/scanner/architecture.md`
- `docs/modules/sbomservice/architecture.md`

---

## Tasks

### T1: CycloneDX NuGet Package Update

**Assignee**: Scanner Team
**Story Points**: 2
**Status**: TODO

**Description**:
Update CycloneDX.Core and related packages to versions supporting 1.7.

**Acceptance Criteria**:
- [ ] Update `CycloneDX.Core` to latest version with 1.7 support
- [ ] Update `CycloneDX.Json` if separate
- [ ] Update `CycloneDX.Protobuf` if separate
- [ ] Verify all dependent projects build
- [ ] No breaking API changes (or document migration path)

**Package Updates**:
```xml
<!-- Before -->
<PackageReference Include="CycloneDX.Core" Version="10.0.2" />

<!-- After -->
<PackageReference Include="CycloneDX.Core" Version="11.0.0" /> <!-- or appropriate 1.7-supporting version -->
```

---

### T2: CycloneDxComposer Update
|
||||||
|
|
||||||
|
**Assignee**: Scanner Team
|
||||||
|
**Story Points**: 5
|
||||||
|
**Status**: TODO
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Update the SBOM composer to emit CycloneDX 1.7 format.
|
||||||
|
|
||||||
|
**Implementation Path**: `src/Scanner/__Libraries/StellaOps.Scanner.Emit/Composition/CycloneDxComposer.cs`
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] Spec version set to "1.7"
|
||||||
|
- [ ] Media type updated to `application/vnd.cyclonedx+json; version=1.7`
|
||||||
|
- [ ] New 1.7 fields populated where applicable:
|
||||||
|
- [ ] `declarations` for attestations
|
||||||
|
- [ ] `definitions` for standards/requirements
|
||||||
|
- [ ] Enhanced `formulation` for build environment
|
||||||
|
- [ ] `modelCard` for ML components (if applicable)
|
||||||
|
- [ ] `cryptography` properties (if applicable)
|
||||||
|
- [ ] Existing fields remain populated correctly
|
||||||
|
- [ ] Deterministic output maintained
|
||||||
|
|
||||||
|
**Key 1.7 Additions**:
|
||||||
|
```csharp
|
||||||
|
// CycloneDX 1.7 new features
|
||||||
|
public sealed record CycloneDx17Enhancements
|
||||||
|
{
|
||||||
|
// Attestations - link to in-toto/DSSE
|
||||||
|
public ImmutableArray<Declaration> Declarations { get; init; }
|
||||||
|
|
||||||
|
// Standards compliance (e.g., NIST, ISO)
|
||||||
|
public ImmutableArray<Definition> Definitions { get; init; }
|
||||||
|
|
||||||
|
// Enhanced formulation for reproducibility
|
||||||
|
public Formulation? Formulation { get; init; }
|
||||||
|
|
||||||
|
// Cryptography bill of materials
|
||||||
|
public CryptographyProperties? Cryptography { get; init; }
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### T3: SBOM Serialization Updates
|
||||||
|
|
||||||
|
**Assignee**: Scanner Team
|
||||||
|
**Story Points**: 3
|
||||||
|
**Status**: TODO
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Update JSON and Protobuf serialization for 1.7 schema.
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] JSON serialization outputs valid CycloneDX 1.7
|
||||||
|
- [ ] Protobuf serialization updated for 1.7 schema
|
||||||
|
- [ ] Schema validation against official 1.7 JSON schema
|
||||||
|
- [ ] Canonical JSON ordering preserved (determinism)
|
||||||
|
- [ ] Empty collections omitted (spec compliance)
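
The canonical-ordering criterion can be sketched as a recursive key sort applied before writing. This is a minimal illustration, not the actual serializer; it assumes the BOM has already been materialized as a `System.Text.Json` node tree:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Text.Json.Nodes;

// Hypothetical sketch: sort object keys recursively so two semantically
// equal BOMs serialize to byte-identical JSON.
public static class CanonicalJson
{
    public static string Serialize(JsonNode node)
        => Canonicalize(node).ToJsonString(new JsonSerializerOptions { WriteIndented = false });

    private static JsonNode Canonicalize(JsonNode node) => node switch
    {
        JsonObject obj => new JsonObject(
            obj.OrderBy(p => p.Key, StringComparer.Ordinal)
               .Select(p => new KeyValuePair<string, JsonNode?>(
                   p.Key, p.Value is null ? null : Canonicalize(p.Value)))),
        JsonArray arr => new JsonArray(
            arr.Select(e => e is null ? null : Canonicalize(e)).ToArray()),
        // Leaf values are cloned so they can be re-parented into the new tree.
        _ => node.DeepClone(),
    };
}
```

Combined with the "empty collections omitted" rule, this is what makes the golden-corpus determinism tests in T7 meaningful.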

---

### T4: SBOM Parsing Backward Compatibility

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO

**Description**:
Ensure parsers can read both 1.6 and 1.7 CycloneDX documents.

**Implementation Path**: `src/Excititor/__Libraries/StellaOps.Excititor.Formats.CycloneDX/`

**Acceptance Criteria**:

- [ ] Parser auto-detects the spec version from the document
- [ ] 1.6 documents parsed without errors
- [ ] 1.7 documents parsed with new fields
- [ ] Unknown fields in future versions ignored gracefully
- [ ] Version-specific validation applied

**Parsing Logic**:

```csharp
public CycloneDxBom Parse(string json)
{
    var specVersion = ExtractSpecVersion(json);
    return specVersion switch
    {
        "1.6" => ParseV16(json),
        "1.7" => ParseV17(json),
        _ when specVersion.StartsWith("1.") => ParseV17(json), // forward compat
        _ => throw new UnsupportedSpecVersionException(specVersion)
    };
}
```

---

### T5: VEX Format Updates

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO

**Description**:
Update VEX document generation to leverage CycloneDX 1.7 improvements.

**Acceptance Criteria**:

- [ ] VEX documents reference the 1.7 spec
- [ ] Enhanced `vulnerability.ratings` with CVSS 4.0 vectors
- [ ] `vulnerability.affects[].versions` range expressions
- [ ] `vulnerability.source` with PURL references
- [ ] Backward-compatible with 1.6 VEX consumers

---

### T6: Media Type Updates

**Assignee**: Scanner Team
**Story Points**: 2
**Status**: TODO

**Description**:
Update all media type references throughout the codebase.

**Acceptance Criteria**:

- [ ] Constants updated: `application/vnd.cyclonedx+json; version=1.7`
- [ ] OCI artifact type updated for SBOM referrers
- [ ] Content-Type headers in API responses updated
- [ ] Accept header handling supports both 1.6 and 1.7

**Media Type Constants**:

```csharp
public static class CycloneDxMediaTypes
{
    public const string JsonV17 = "application/vnd.cyclonedx+json; version=1.7";
    public const string JsonV16 = "application/vnd.cyclonedx+json; version=1.6";
    public const string Json = JsonV17; // Default to latest

    public const string ProtobufV17 = "application/vnd.cyclonedx+protobuf; version=1.7";
    public const string XmlV17 = "application/vnd.cyclonedx+xml; version=1.7";
}
```
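
The Accept-header criterion implies a small negotiation step in front of the composer. A minimal sketch using the constants above (the helper name is illustrative, not existing code, and real parsing should honor quality values):

```csharp
using System;

// Hypothetical sketch of Accept-header negotiation between 1.6 and 1.7.
public static class CycloneDxContentNegotiation
{
    public static string Negotiate(string? acceptHeader)
    {
        if (string.IsNullOrWhiteSpace(acceptHeader))
            return CycloneDxMediaTypes.Json; // no preference -> latest (1.7)

        foreach (var candidate in acceptHeader.Split(','))
        {
            var value = candidate.Trim();
            if (value.Contains("version=1.6", StringComparison.OrdinalIgnoreCase))
                return CycloneDxMediaTypes.JsonV16;
            if (value.Contains("version=1.7", StringComparison.OrdinalIgnoreCase))
                return CycloneDxMediaTypes.JsonV17;
        }

        // An unversioned "application/vnd.cyclonedx+json" also falls through
        // to the default.
        return CycloneDxMediaTypes.Json;
    }
}
```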

---

### T7: Golden Corpus Update

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO

**Description**:
Update golden test corpus with CycloneDX 1.7 expected outputs.

**Acceptance Criteria**:

- [ ] Regenerate all golden SBOM files in 1.7 format
- [ ] Verify determinism: same inputs produce identical outputs
- [ ] Add 1.7-specific test cases (declarations, formulation)
- [ ] Retain 1.6 golden files for backward compat testing
- [ ] CI/CD determinism tests pass
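
The determinism criterion typically reduces to two xUnit-style checks; a sketch under the assumption of a fixture loader (`GoldenFixtures` and the exact `Compose` signature are illustrative, not the actual test suite):

```csharp
using System.Text;
using Xunit;

public class CycloneDx17DeterminismTests
{
    [Fact]
    public void SameInput_ProducesByteIdenticalSbom()
    {
        var scanResult = GoldenFixtures.LoadScanResult("container-abc123");
        var composer = new CycloneDxComposer();

        // Compose twice from identical input; any nondeterminism
        // (dictionary ordering, timestamps, GUIDs) fails here.
        var first = composer.Compose(scanResult);
        var second = composer.Compose(scanResult);

        Assert.Equal(Encoding.UTF8.GetBytes(first), Encoding.UTF8.GetBytes(second));
    }

    [Fact]
    public void Output_MatchesGoldenFile()
    {
        var scanResult = GoldenFixtures.LoadScanResult("container-abc123");
        var expected = GoldenFixtures.LoadText("container-abc123.cdx-1.7.json");

        Assert.Equal(expected, new CycloneDxComposer().Compose(scanResult));
    }
}
```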

---

### T8: Unit Tests

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO

**Description**:
Update and expand unit tests for 1.7 support.

**Acceptance Criteria**:

- [ ] Composer tests for 1.7 output
- [ ] Parser tests for 1.6 and 1.7 input
- [ ] Serialization round-trip tests
- [ ] Schema validation tests
- [ ] Media type handling tests

---

### T9: Integration Tests

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO

**Description**:
End-to-end integration tests with 1.7 SBOMs.

**Acceptance Criteria**:

- [ ] Full scan → SBOM → Policy evaluation flow
- [ ] SBOM export to OCI registry as referrer
- [ ] Cross-module SBOM consumption (Excititor, Policy)
- [ ] Air-gap bundle with 1.7 SBOMs

---

### T10: Documentation Updates

**Assignee**: Scanner Team
**Story Points**: 2
**Status**: TODO

**Description**:
Update documentation to reflect the 1.7 upgrade.

**Acceptance Criteria**:

- [ ] Update `docs/modules/scanner/architecture.md` with 1.7 references
- [ ] Update `docs/modules/sbomservice/architecture.md`
- [ ] Update API documentation with new media types
- [ ] Migration guide for 1.6 → 1.7

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Scanner Team | NuGet Package Update |
| 2 | T2 | TODO | T1 | Scanner Team | CycloneDxComposer Update |
| 3 | T3 | TODO | T1 | Scanner Team | Serialization Updates |
| 4 | T4 | TODO | T1 | Scanner Team | Parsing Backward Compatibility |
| 5 | T5 | TODO | T2 | Scanner Team | VEX Format Updates |
| 6 | T6 | TODO | T2 | Scanner Team | Media Type Updates |
| 7 | T7 | TODO | T2-T6 | Scanner Team | Golden Corpus Update |
| 8 | T8 | TODO | T2-T6 | Scanner Team | Unit Tests |
| 9 | T9 | TODO | T8 | Scanner Team | Integration Tests |
| 10 | T10 | TODO | T1-T9 | Scanner Team | Documentation Updates |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Reference Architecture advisory - upgrading from 1.6 to 1.7. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Default to 1.7 | Decision | Scanner Team | New SBOMs default to 1.7; 1.6 available via config |
| Backward compat | Decision | Scanner Team | Parsers support 1.5, 1.6, 1.7 for ingestion |
| Protobuf sync | Risk | Scanner Team | Protobuf schema may lag JSON; prioritize JSON |
| NuGet availability | Risk | Scanner Team | CycloneDX.Core 1.7 support timing unclear |

---

## Success Criteria

- [ ] All SBOM generation outputs valid CycloneDX 1.7
- [ ] All parsers read 1.6 and 1.7 without errors
- [ ] Determinism tests pass with 1.7 output
- [ ] No regression in the scan-to-policy flow
- [ ] Media types correctly reflect 1.7

**Sprint Status**: TODO (0/10 tasks complete)
387 docs/implplan/SPRINT_3600_0003_0001_spdx_3_0_1_generation.md Normal file
@@ -0,0 +1,387 @@
# Sprint 3600.0003.0001 · SPDX 3.0.1 Native Generation — Full SBOM Format Support

## Topic & Scope

- Implement native SPDX 3.0.1 SBOM generation capability.
- Currently only license normalization and import parsing exist; this sprint adds full generation.
- Provide SPDX 3.0.1 as an alternative output format alongside CycloneDX 1.7.
- **Working directory:** `src/Scanner/__Libraries/StellaOps.Scanner.Emit/`, `src/SbomService/`

## Dependencies & Concurrency

- **Upstream**: Sprint 3600.0002.0001 (CycloneDX 1.7 - establishes patterns)
- **Downstream**: ExportCenter, air-gap bundles, Policy (optional SPDX support)
- **Safe to parallelize with**: Sprints 4200.*, 5200.*

## Documentation Prerequisites

- SPDX 3.0.1 Specification: https://spdx.github.io/spdx-spec/v3.0.1/
- `docs/modules/scanner/architecture.md`
- Existing: `src/AirGap/StellaOps.AirGap.Importer/Reconciliation/Parsers/SpdxParser.cs`

---

## Tasks

### T1: SPDX 3.0.1 Domain Model

**Assignee**: Scanner Team
**Story Points**: 5
**Status**: TODO

**Description**:
Create a comprehensive C# domain model for SPDX 3.0.1 elements.

**Implementation Path**: `src/Scanner/__Libraries/StellaOps.Scanner.Emit/Spdx/Models/`

**Acceptance Criteria**:

- [ ] Core classes: `SpdxDocument`, `SpdxElement`, `SpdxRelationship`
- [ ] Package model: `SpdxPackage` with all 3.0.1 fields
- [ ] File model: `SpdxFile` with checksums and annotations
- [ ] Snippet model: `SpdxSnippet` for partial file references
- [ ] Licensing: `SpdxLicense`, `SpdxLicenseExpression`, `SpdxExtractedLicense`
- [ ] Security: `SpdxVulnerability`, `SpdxVulnAssessment`
- [ ] Annotations and relationships per spec
- [ ] Immutable records with init-only properties

**Core Model**:

```csharp
namespace StellaOps.Scanner.Emit.Spdx.Models;

public sealed record SpdxDocument
{
    public required string SpdxVersion { get; init; } // "SPDX-3.0.1"
    public required string DocumentNamespace { get; init; }
    public required string Name { get; init; }
    public required SpdxCreationInfo CreationInfo { get; init; }
    public ImmutableArray<SpdxElement> Elements { get; init; }
    public ImmutableArray<SpdxRelationship> Relationships { get; init; }
    public ImmutableArray<SpdxAnnotation> Annotations { get; init; }
}

public abstract record SpdxElement
{
    public required string SpdxId { get; init; }
    public string? Name { get; init; }
    public string? Comment { get; init; }
}

public sealed record SpdxPackage : SpdxElement
{
    public string? Version { get; init; }
    public string? PackageUrl { get; init; } // PURL
    public string? DownloadLocation { get; init; }
    public SpdxLicenseExpression? DeclaredLicense { get; init; }
    public SpdxLicenseExpression? ConcludedLicense { get; init; }
    public string? CopyrightText { get; init; }
    public ImmutableArray<SpdxChecksum> Checksums { get; init; }
    public ImmutableArray<SpdxExternalRef> ExternalRefs { get; init; }
    public SpdxPackageVerificationCode? VerificationCode { get; init; }
}

public sealed record SpdxRelationship
{
    public required string FromElement { get; init; }
    public required SpdxRelationshipType Type { get; init; }
    public required string ToElement { get; init; }
}
```

---

### T2: SPDX 3.0.1 Composer

**Assignee**: Scanner Team
**Story Points**: 5
**Status**: TODO

**Description**:
Implement SBOM composer that generates SPDX 3.0.1 documents from scan results.

**Implementation Path**: `src/Scanner/__Libraries/StellaOps.Scanner.Emit/Composition/SpdxComposer.cs`

**Acceptance Criteria**:

- [ ] `ISpdxComposer` interface with `Compose()` method
- [ ] `SpdxComposer` implementation
- [ ] Maps internal package model to SPDX packages
- [ ] Generates DESCRIBES relationships for root packages
- [ ] Generates DEPENDENCY_OF relationships for dependencies
- [ ] Populates license expressions from detected licenses
- [ ] Deterministic SPDX ID generation (content-addressed)
- [ ] Document namespace follows URI pattern

**Composer Interface**:

```csharp
public interface ISpdxComposer
{
    SpdxDocument Compose(
        ScanResult scanResult,
        SpdxCompositionOptions options,
        CancellationToken cancellationToken = default);

    ValueTask<SpdxDocument> ComposeAsync(
        ScanResult scanResult,
        SpdxCompositionOptions options,
        CancellationToken cancellationToken = default);
}

public sealed record SpdxCompositionOptions
{
    public string CreatorTool { get; init; } = "StellaOps-Scanner";
    public string? CreatorOrganization { get; init; }
    public string NamespaceBase { get; init; } = "https://stellaops.io/spdx";
    public bool IncludeFiles { get; init; } = false;
    public bool IncludeSnippets { get; init; } = false;
    public SpdxLicenseListVersion LicenseListVersion { get; init; } = SpdxLicenseListVersion.V3_21;
}
```

---

### T3: SPDX JSON-LD Serialization

**Assignee**: Scanner Team
**Story Points**: 5
**Status**: TODO

**Description**:
Implement JSON-LD serialization per the SPDX 3.0.1 specification.

**Acceptance Criteria**:

- [ ] JSON-LD output with proper `@context`
- [ ] `@type` annotations for all elements
- [ ] `@id` for element references
- [ ] Canonical JSON ordering (deterministic)
- [ ] Schema validation against the official SPDX 3.0.1 JSON schema
- [ ] Compact JSON-LD form (not expanded)

**JSON-LD Output Example**:

```json
{
  "@context": "https://spdx.org/rdf/3.0.1/spdx-context.jsonld",
  "@type": "SpdxDocument",
  "spdxVersion": "SPDX-3.0.1",
  "name": "SBOM for container:sha256:abc123",
  "documentNamespace": "https://stellaops.io/spdx/container/sha256:abc123",
  "creationInfo": {
    "@type": "CreationInfo",
    "created": "2025-12-21T10:00:00Z",
    "createdBy": ["Tool: StellaOps-Scanner-1.0.0"]
  },
  "rootElement": ["SPDXRef-Package-root"],
  "element": [
    {
      "@type": "Package",
      "@id": "SPDXRef-Package-root",
      "name": "myapp",
      "packageVersion": "1.0.0",
      "packageUrl": "pkg:oci/myapp@sha256:abc123"
    }
  ]
}
```

---

### T4: SPDX Tag-Value Serialization (Optional)

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO

**Description**:
Implement the legacy tag-value format for backward compatibility.

**Acceptance Criteria**:

- [ ] Tag-value output matching the SPDX 2.3 format
- [ ] Deterministic field ordering
- [ ] Proper escaping of multi-line text
- [ ] Relationship serialization
- [ ] Can be disabled via configuration

**Tag-Value Example**:

```
SPDXVersion: SPDX-2.3
DataLicense: CC0-1.0
SPDXID: SPDXRef-DOCUMENT
DocumentName: SBOM for container:sha256:abc123
DocumentNamespace: https://stellaops.io/spdx/container/sha256:abc123

PackageName: myapp
SPDXID: SPDXRef-Package-root
PackageVersion: 1.0.0
PackageDownloadLocation: NOASSERTION
```

---

### T5: License Expression Handling

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO

**Description**:
Implement SPDX license expression parsing and generation.

**Acceptance Criteria**:

- [ ] Parse SPDX license expressions (AND, OR, WITH)
- [ ] Generate valid license expressions
- [ ] Handle `LicenseRef-` for custom licenses
- [ ] Validate against the SPDX license list
- [ ] Support SPDX license list version 3.21

**License Expression Model**:

```csharp
public abstract record SpdxLicenseExpression;

public sealed record SpdxSimpleLicense(string LicenseId) : SpdxLicenseExpression;

public sealed record SpdxConjunctiveLicense(
    SpdxLicenseExpression Left,
    SpdxLicenseExpression Right) : SpdxLicenseExpression; // AND

public sealed record SpdxDisjunctiveLicense(
    SpdxLicenseExpression Left,
    SpdxLicenseExpression Right) : SpdxLicenseExpression; // OR

public sealed record SpdxWithException(
    SpdxLicenseExpression License,
    string Exception) : SpdxLicenseExpression;
```
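
Given this AST, the "generate valid license expressions" criterion can be sketched as a recursive render. This is a minimal sketch: it parenthesizes composite operands conservatively, whereas the real implementation should follow SPDX operator precedence:

```csharp
using System;

// Hypothetical sketch: render the license-expression AST back to an SPDX
// expression string.
public static class SpdxLicenseExpressionRenderer
{
    public static string Render(SpdxLicenseExpression expression) => expression switch
    {
        SpdxSimpleLicense simple => simple.LicenseId,
        SpdxConjunctiveLicense conj => $"{Operand(conj.Left)} AND {Operand(conj.Right)}",
        SpdxDisjunctiveLicense disj => $"{Operand(disj.Left)} OR {Operand(disj.Right)}",
        SpdxWithException withEx => $"{Operand(withEx.License)} WITH {withEx.Exception}",
        _ => throw new ArgumentOutOfRangeException(nameof(expression)),
    };

    // Wrap composite sub-expressions in parentheses; simple ids stay bare.
    private static string Operand(SpdxLicenseExpression expression)
        => expression is SpdxSimpleLicense ? Render(expression) : $"({Render(expression)})";
}
```

For example, `Render(new SpdxWithException(new SpdxSimpleLicense("GPL-2.0-only"), "Classpath-exception-2.0"))` yields `GPL-2.0-only WITH Classpath-exception-2.0`.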

---

### T6: SPDX-CycloneDX Conversion

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO

**Description**:
Implement bidirectional conversion between SPDX and CycloneDX.

**Acceptance Criteria**:

- [ ] CycloneDX → SPDX conversion
- [ ] SPDX → CycloneDX conversion
- [ ] Preserve all common fields
- [ ] Handle format-specific fields gracefully
- [ ] Conversion loss documented
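
The "preserve all common fields" criterion can be sketched for one direction. `CdxComponent` below is an illustrative input shape, not an existing class; `SpdxPackage` and `SpdxSimpleLicense` come from the T1/T5 models above:

```csharp
// Hypothetical sketch of the CycloneDX -> SPDX direction: map the fields
// the two formats share, and document where information is lost.
public static class SbomFormatConverter
{
    public static SpdxPackage ToSpdxPackage(CdxComponent component) => new()
    {
        SpdxId = $"SPDXRef-Package-{component.BomRef}",
        Name = component.Name,
        Version = component.Version,
        PackageUrl = component.Purl, // PURL carries over verbatim
        // Assumes a single license id; compound expressions need the T5 parser.
        DeclaredLicense = component.LicenseExpression is null
            ? null
            : new SpdxSimpleLicense(component.LicenseExpression),
        DownloadLocation = "NOASSERTION", // no direct CycloneDX equivalent
    };
}
```

The reverse direction and the per-field loss table belong in the conversion-loss documentation this task calls for.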

---

### T7: SBOM Service Integration

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO

**Description**:
Integrate SPDX generation into SBOM service endpoints.

**Implementation Path**: `src/SbomService/`

**Acceptance Criteria**:

- [ ] `Accept: application/spdx+json` returns SPDX 3.0.1
- [ ] `Accept: text/spdx` returns tag-value format
- [ ] Query parameter `?format=spdx` as alternative
- [ ] Default remains CycloneDX 1.7
- [ ] Caching works for both formats

---

### T8: OCI Artifact Type Registration

**Assignee**: Scanner Team
**Story Points**: 2
**Status**: TODO

**Description**:
Register SPDX SBOMs as OCI referrers with the proper artifact type.

**Acceptance Criteria**:

- [ ] Artifact type: `application/spdx+json`
- [ ] Push to registry alongside CycloneDX
- [ ] Configurable: push one or both formats
- [ ] Referrer index lists both when available

---

### T9: Unit Tests

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO

**Description**:
Comprehensive unit tests for SPDX generation.

**Acceptance Criteria**:

- [ ] Model construction tests
- [ ] Composer tests for various scan results
- [ ] JSON-LD serialization tests
- [ ] Tag-value serialization tests
- [ ] License expression tests
- [ ] Conversion tests

---

### T10: Integration Tests & Golden Corpus

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO

**Description**:
End-to-end tests and a golden file corpus for SPDX.

**Acceptance Criteria**:

- [ ] Full scan → SPDX flow
- [ ] Golden SPDX files for determinism testing
- [ ] SPDX validation against official tooling
- [ ] Air-gap bundle with SPDX SBOMs

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Scanner Team | SPDX 3.0.1 Domain Model |
| 2 | T2 | TODO | T1 | Scanner Team | SPDX 3.0.1 Composer |
| 3 | T3 | TODO | T1 | Scanner Team | JSON-LD Serialization |
| 4 | T4 | TODO | T1 | Scanner Team | Tag-Value Serialization |
| 5 | T5 | TODO | — | Scanner Team | License Expression Handling |
| 6 | T6 | TODO | T1, T3 | Scanner Team | SPDX-CycloneDX Conversion |
| 7 | T7 | TODO | T2, T3 | Scanner Team | SBOM Service Integration |
| 8 | T8 | TODO | T7 | Scanner Team | OCI Artifact Type Registration |
| 9 | T9 | TODO | T1-T6 | Scanner Team | Unit Tests |
| 10 | T10 | TODO | T7-T8 | Scanner Team | Integration Tests |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Reference Architecture advisory - adding SPDX 3.0.1 generation. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| JSON-LD primary | Decision | Scanner Team | JSON-LD is primary format; tag-value for legacy |
| CycloneDX default | Decision | Scanner Team | CycloneDX remains default; SPDX opt-in |
| SPDX 3.0.1 only | Decision | Scanner Team | No support for SPDX 2.x generation (only parsing) |
| License list sync | Risk | Scanner Team | SPDX license list updates may require periodic sync |

---

## Success Criteria

- [ ] Valid SPDX 3.0.1 JSON-LD output from scans
- [ ] Passes official SPDX validation tools
- [ ] Deterministic output (same input = same output)
- [ ] Can export both CycloneDX and SPDX for the same scan
- [ ] Documentation complete

**Sprint Status**: TODO (0/10 tasks complete)
87 docs/implplan/SPRINT_3600_SUMMARY.md Normal file
@@ -0,0 +1,87 @@
# Sprint Series 3600 · Reference Architecture Gap Closure

## Overview

This sprint series addresses gaps identified from the **20-Dec-2025 Reference Architecture Advisory** analysis. These sprints complete the implementation of the Stella Ops reference architecture vision.

## Sprint Index

| Sprint | Title | Priority | Status | Dependencies |
|--------|-------|----------|--------|--------------|
| 3600.0001.0001 | Gateway WebService | HIGH | TODO | Router infrastructure (complete) |
| 3600.0002.0001 | CycloneDX 1.7 Upgrade | HIGH | TODO | None |
| 3600.0003.0001 | SPDX 3.0.1 Generation | MEDIUM | TODO | 3600.0002.0001 |

## Related Sprints (Other Series)

| Sprint | Title | Priority | Status | Series |
|--------|-------|----------|--------|--------|
| 4200.0001.0001 | Proof Chain Verification UI | HIGH | TODO | 4200 (UI) |
| 5200.0001.0001 | Starter Policy Template | HIGH | TODO | 5200 (Docs) |

## Gap Analysis Source

**Advisory**: `docs/product-advisories/archived/2025-12-21-reference-architecture/20-Dec-2025 - Stella Ops Reference Architecture.md`

### Gaps Addressed

| Gap | Sprint | Description |
|-----|--------|-------------|
| Gateway WebService Missing | 3600.0001.0001 | HTTP ingress service not implemented |
| CycloneDX 1.6 → 1.7 | 3600.0002.0001 | Upgrade to latest CycloneDX spec |
| SPDX 3.0.1 Generation | 3600.0003.0001 | Native SPDX SBOM generation |
| Proof Chain UI | 4200.0001.0001 | Evidence transparency dashboard |
| Starter Policy | 5200.0001.0001 | Day-1 policy pack for onboarding |

### Already Implemented (No Action Required)

| Component | Status | Notes |
|-----------|--------|-------|
| Scheduler | Complete | Full implementation with PostgreSQL, Redis |
| Policy Engine | Complete | Signed verdicts, deterministic IR, exceptions |
| Authority | Complete | DPoP/mTLS, OpToks, JWKS rotation |
| Attestor | Complete | DSSE/in-toto, Rekor v2, proof chains |
| Timeline/Notify | Complete | TimelineIndexer + Notify with 4 channels |
| Excititor | Complete | VEX ingestion, CycloneDX, OpenVEX |
| Concelier | Complete | 31+ connectors, Link-Not-Merge |
| Reachability/Signals | Complete | 5-factor scoring, lattice logic |
| OCI Referrers | Complete | ExportCenter + Excititor |
| Tenant Isolation | Complete | RLS, per-tenant keys, namespaces |

## Execution Order

```mermaid
graph LR
    A[3600.0002.0001<br/>CycloneDX 1.7] --> B[3600.0003.0001<br/>SPDX 3.0.1]
    C[3600.0001.0001<br/>Gateway WebService] --> D[Production Ready]
    B --> D
    E[4200.0001.0001<br/>Proof Chain UI] --> D
    F[5200.0001.0001<br/>Starter Policy] --> D
```

## Success Criteria for Series

- [ ] Gateway WebService accepts HTTP and routes to microservices
- [ ] All SBOMs generated in CycloneDX 1.7 format
- [ ] SPDX 3.0.1 available as alternative SBOM format
- [ ] Auditors can view complete evidence chains in UI
- [ ] New customers can deploy starter policy in <5 minutes

## Created

- **Date**: 2025-12-21
- **Source**: Reference Architecture Advisory Gap Analysis
- **Author**: Agent

---

## Sprint Status Summary

| Sprint | Tasks | Completed | Status |
|--------|-------|-----------|--------|
| 3600.0001.0001 | 10 | 0 | TODO |
| 3600.0002.0001 | 10 | 0 | TODO |
| 3600.0003.0001 | 10 | 0 | TODO |
| 4200.0001.0001 | 11 | 0 | TODO |
| 5200.0001.0001 | 10 | 0 | TODO |
| **Total** | **51** | **0** | **TODO** |
384 docs/implplan/SPRINT_4000_0001_0001_unknowns_decay_algorithm.md Normal file
@@ -0,0 +1,384 @@
# Sprint 4000.0001.0001 · Unknowns Decay Algorithm

## Topic & Scope

- Add a time-based decay factor to the UnknownRanker scoring algorithm
- Implements bucket-based freshness decay following the existing `FreshnessModels` pattern
- Ensures older unknowns gradually reduce in priority unless re-evaluated

**Working directory:** `src/Policy/__Libraries/StellaOps.Policy.Unknowns/`

## Dependencies & Concurrency

- **Upstream**: None (first sprint in batch)
- **Downstream**: Sprint 4000.0001.0002 (BlastRadius/Containment)
- **Safe to parallelize with**: Sprint 4000.0002.0001 (EPSS Connector)

## Documentation Prerequisites

- `src/Policy/__Libraries/StellaOps.Policy.Unknowns/AGENTS.md`
- `src/Policy/__Libraries/StellaOps.Policy/Scoring/FreshnessModels.cs` (pattern reference)
- `docs/product-advisories/14-Dec-2025 - Triage and Unknowns Technical Reference.md`

---
## Tasks

### T1: Extend UnknownRankInput with Timestamps

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Add timestamp fields to the `UnknownRankInput` record to support decay calculation.

**Implementation Path**: `Services/UnknownRanker.cs` (lines 16-23)

**Changes**:

```csharp
public sealed record UnknownRankInput(
    bool HasVexStatement,
    bool HasReachabilityData,
    bool HasConflictingSources,
    bool IsStaleAdvisory,
    bool IsInKev,
    decimal EpssScore,
    decimal CvssScore,
    // NEW: Time-based decay inputs
    DateTimeOffset? FirstSeenAt,
    DateTimeOffset? LastEvaluatedAt,
    DateTimeOffset AsOfDateTime);
```

**Acceptance Criteria**:

- [ ] `FirstSeenAt` nullable timestamp added (when the unknown was first detected)
- [ ] `LastEvaluatedAt` nullable timestamp added (last ranking recalculation)
- [ ] `AsOfDateTime` required timestamp added (reference time for decay)
- [ ] Backward compatible: existing callers can pass null for the new optional fields
- [ ] All existing tests still pass

---
### T2: Implement DecayCalculator

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Implement bucket-based decay calculation following the `FreshnessModels` pattern in `StellaOps.Policy.Scoring`.

**Implementation Path**: `Services/UnknownRanker.cs`

**Decay Buckets** (from the FreshnessModels pattern):

```csharp
/// <summary>
/// Computes decay factor based on days since last evaluation.
/// Returns 1.0 for fresh, decreasing to 0.2 for very old.
/// </summary>
private static decimal ComputeDecayFactor(UnknownRankInput input)
{
    if (input.LastEvaluatedAt is null)
        return 1.0m; // No history = no decay

    var ageDays = (int)(input.AsOfDateTime - input.LastEvaluatedAt.Value).TotalDays;

    return ageDays switch
    {
        <= 7 => 1.00m,   // Fresh (7d): 100%
        <= 30 => 0.90m,  // 30d: 90%
        <= 90 => 0.75m,  // 90d: 75%
        <= 180 => 0.60m, // 180d: 60%
        <= 365 => 0.40m, // 365d: 40%
        _ => 0.20m       // >365d: 20%
    };
}
```

**Acceptance Criteria**:

- [ ] `ComputeDecayFactor` method implemented with bucket logic
- [ ] Returns `1.0m` when `LastEvaluatedAt` is null (no decay)
- [ ] All arithmetic uses `decimal` for determinism
- [ ] Buckets match the FreshnessModels pattern (7/30/90/180/365 days)

---
### T3: Extend UnknownRankerOptions

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T2

**Description**:
Add decay configuration options to allow customization of decay behavior.

**Implementation Path**: `Services/UnknownRanker.cs` (lines 162-172)

**Changes**:

```csharp
public sealed class UnknownRankerOptions
{
    // Existing band thresholds
    public decimal HotThreshold { get; set; } = 75m;
    public decimal WarmThreshold { get; set; } = 50m;
    public decimal ColdThreshold { get; set; } = 25m;

    // NEW: Decay configuration
    public bool EnableDecay { get; set; } = true;
    public IReadOnlyList<DecayBucket> DecayBuckets { get; set; } = DefaultDecayBuckets;

    public static IReadOnlyList<DecayBucket> DefaultDecayBuckets { get; } =
    [
        new DecayBucket(7, 10000),          // 7d: 100%
        new DecayBucket(30, 9000),          // 30d: 90%
        new DecayBucket(90, 7500),          // 90d: 75%
        new DecayBucket(180, 6000),         // 180d: 60%
        new DecayBucket(365, 4000),         // 365d: 40%
        new DecayBucket(int.MaxValue, 2000) // >365d: 20%
    ];
}

public sealed record DecayBucket(int MaxAgeDays, int MultiplierBps);
```

**Acceptance Criteria**:

- [ ] `EnableDecay` toggle added (default: true)
- [ ] `DecayBuckets` configurable list added
- [ ] Uses basis points (10000 = 100%) for integer math
- [ ] Default buckets match the T2 implementation
- [ ] DI configuration via `services.Configure<UnknownRankerOptions>()` works

---
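The basis-point bucket table can be hard to read at a glance, so here is a minimal, self-contained sketch of how an age would resolve against it. The `Resolve` helper is hypothetical (illustration only, not part of the planned `UnknownRanker` API); the `DecayBucket` record and default values are copied from T3:

```csharp
using System;
using System.Collections.Generic;

public sealed record DecayBucket(int MaxAgeDays, int MultiplierBps); // from T3

public static class DecayDemo
{
    // Hypothetical helper: the first bucket whose MaxAgeDays covers the age wins;
    // basis points convert to a decimal multiplier (e.g. 9000 bps -> 0.90m).
    public static decimal Resolve(IReadOnlyList<DecayBucket> buckets, int ageDays)
    {
        foreach (var bucket in buckets)
        {
            if (ageDays <= bucket.MaxAgeDays)
                return bucket.MultiplierBps / 10000m;
        }
        return buckets[^1].MultiplierBps / 10000m; // fallback: oldest bucket
    }

    public static IReadOnlyList<DecayBucket> Defaults { get; } =
    [
        new DecayBucket(7, 10000),
        new DecayBucket(30, 9000),
        new DecayBucket(90, 7500),
        new DecayBucket(180, 6000),
        new DecayBucket(365, 4000),
        new DecayBucket(int.MaxValue, 2000)
    ];
}

// DecayDemo.Resolve(DecayDemo.Defaults, 45) -> 0.75m (falls in the 90-day bucket)
```

Per the DI acceptance criterion, operators would override `DecayBuckets` through `services.Configure<UnknownRankerOptions>(...)` rather than by subclassing the ranker.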
### T4: Integrate Decay into Rank()

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T2, T3

**Description**:
Apply the decay factor to the final score calculation in the `Rank()` method.

**Implementation Path**: `Services/UnknownRanker.cs` (lines 87-95)

**Updated Rank Method**:

```csharp
public UnknownRankResult Rank(UnknownRankInput input)
{
    var uncertainty = ComputeUncertainty(input);
    var pressure = ComputeExploitPressure(input);
    var rawScore = Math.Round((uncertainty * 50m) + (pressure * 50m), 2);

    // Apply decay factor if enabled
    decimal decayFactor = 1.0m;
    if (_options.EnableDecay)
    {
        decayFactor = ComputeDecayFactor(input);
    }

    var score = Math.Round(rawScore * decayFactor, 2);
    var band = AssignBand(score);

    return new UnknownRankResult(score, uncertainty, pressure, band, decayFactor);
}
```

**Updated Result Record**:

```csharp
public sealed record UnknownRankResult(
    decimal Score,
    decimal UncertaintyFactor,
    decimal ExploitPressure,
    UnknownBand Band,
    decimal DecayFactor = 1.0m); // NEW field
```

**Acceptance Criteria**:

- [ ] Decay factor applied as a multiplier to the raw score
- [ ] `DecayFactor` added to `UnknownRankResult`
- [ ] Score still rounded to 2 decimal places
- [ ] Band assignment uses the decayed score
- [ ] When `EnableDecay = false`, the decay factor is 1.0

---
### T5: Add Decay Tests

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T4

**Description**:
Add comprehensive tests for decay calculation covering all buckets and edge cases.

**Implementation Path**: `src/Policy/__Tests/StellaOps.Policy.Unknowns.Tests/Services/UnknownRankerTests.cs`

**Test Cases**:

```csharp
#region Decay Factor Tests

[Fact]
public void ComputeDecay_NullLastEvaluated_Returns100Percent()
{
    var input = CreateInputWithAge(lastEvaluatedAt: null);
    var result = _ranker.Rank(input);
    result.DecayFactor.Should().Be(1.00m);
}

[Theory]
[InlineData(0, 1.00)]    // Today
[InlineData(7, 1.00)]    // 7 days
[InlineData(8, 0.90)]    // 8 days (next bucket)
[InlineData(30, 0.90)]   // 30 days
[InlineData(31, 0.75)]   // 31 days
[InlineData(90, 0.75)]   // 90 days
[InlineData(91, 0.60)]   // 91 days
[InlineData(180, 0.60)]  // 180 days
[InlineData(181, 0.40)]  // 181 days
[InlineData(365, 0.40)]  // 365 days
[InlineData(366, 0.20)]  // 366 days
[InlineData(1000, 0.20)] // Very old
public void ComputeDecay_AgeBuckets_ReturnsCorrectMultiplier(int ageDays, decimal expected)
{
    var asOf = DateTimeOffset.UtcNow;
    var input = CreateInputWithAge(
        lastEvaluatedAt: asOf.AddDays(-ageDays),
        asOfDateTime: asOf);

    var result = _ranker.Rank(input);
    result.DecayFactor.Should().Be(expected);
}

[Fact]
public void Rank_WithDecay_AppliesMultiplierToScore()
{
    // Arrange: Create input that would score 50 without decay
    var input = CreateHighScoreInput(ageDays: 100); // 75% decay

    // Act
    var result = _ranker.Rank(input);

    // Assert: Score should be 50 * 0.75 = 37.50
    result.Score.Should().Be(37.50m);
    result.DecayFactor.Should().Be(0.75m);
}

[Fact]
public void Rank_DecayDisabled_ReturnsFullScore()
{
    // Arrange
    var options = new UnknownRankerOptions { EnableDecay = false };
    var ranker = new UnknownRanker(Options.Create(options));
    var input = CreateHighScoreInput(ageDays: 100);

    // Act
    var result = ranker.Rank(input);

    // Assert
    result.DecayFactor.Should().Be(1.0m);
}

[Fact]
public void Rank_Determinism_SameInputSameOutput()
{
    var input = CreateInputWithAge(ageDays: 45);

    var results = Enumerable.Range(0, 100)
        .Select(_ => _ranker.Rank(input))
        .ToList();

    results.Should().AllBeEquivalentTo(results[0]);
}

#endregion
```

**Acceptance Criteria**:

- [ ] Test for null `LastEvaluatedAt` returns 1.0
- [ ] Theory test covers all bucket boundaries (0, 7, 8, 30, 31, 90, 91, 180, 181, 365, 366)
- [ ] Test verifies the decay multiplier is applied to the score
- [ ] Test verifies `EnableDecay = false` bypasses decay
- [ ] Determinism test confirms reproducibility
- [ ] All 6+ new tests pass

---
### T6: Update UnknownsRepository

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Ensure repository queries populate the `first_seen_at` and `last_evaluated_at` columns.

**Implementation Path**: `Repositories/UnknownsRepository.cs`

**SQL Updates**:

```sql
-- Verify columns exist in the policy.unknowns table:
-- first_seen_at should already exist per schema;
-- last_evaluated_at needs to be updated on each ranking.

UPDATE policy.unknowns
SET last_evaluated_at = @now,
    score = @score,
    band = @band,
    uncertainty_factor = @uncertainty,
    exploit_pressure = @pressure
WHERE id = @id AND tenant_id = @tenantId;
```

**Acceptance Criteria**:

- [ ] `first_seen_at` column is set on INSERT (if not already)
- [ ] `last_evaluated_at` column updated on every re-ranking
- [ ] Repository methods return timestamps for decay calculation
- [ ] RLS (tenant isolation) still enforced

---
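As a sketch of how the UPDATE above could be issued from the repository (connection handling and local names like `unknownId` are illustrative, not the repository's actual API), a parameterized Npgsql command stamps `last_evaluated_at` atomically with the new score:

```csharp
using Npgsql;

// Illustrative only: assumes an open NpgsqlConnection and a ranked result in scope.
await using var cmd = new NpgsqlCommand(
    """
    UPDATE policy.unknowns
       SET last_evaluated_at = @now,
           score = @score,
           band = @band
     WHERE id = @id AND tenant_id = @tenantId
    """, connection);

cmd.Parameters.AddWithValue("now", DateTimeOffset.UtcNow);
cmd.Parameters.AddWithValue("score", result.Score);
cmd.Parameters.AddWithValue("band", result.Band.ToString());
cmd.Parameters.AddWithValue("id", unknownId);
cmd.Parameters.AddWithValue("tenantId", tenantId);

await cmd.ExecuteNonQueryAsync();
```

Keeping `tenant_id` in the WHERE clause preserves the RLS-style tenant isolation the acceptance criteria call out, even if row-level security were misconfigured.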
## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Policy Team | Extend UnknownRankInput with timestamps |
| 2 | T2 | TODO | T1 | Policy Team | Implement DecayCalculator |
| 3 | T3 | TODO | T2 | Policy Team | Extend UnknownRankerOptions |
| 4 | T4 | TODO | T2, T3 | Policy Team | Integrate decay into Rank() |
| 5 | T5 | TODO | T4 | Policy Team | Add decay tests |
| 6 | T6 | TODO | T1 | Policy Team | Update UnknownsRepository |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from MOAT gap analysis. Decay logic identified as a gap in the Triage & Unknowns advisory. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Decay as multiplier vs deduction | Decision | Policy Team | Using a multiplier (score × decay) preserves relative ordering |
| Bucket boundaries | Decision | Policy Team | Following the FreshnessModels pattern (7/30/90/180/365 days) |
| Nullable timestamps | Decision | Policy Team | Allow null for backward compatibility; null = no decay |

---

## Success Criteria

- [ ] All 6 tasks marked DONE
- [ ] 6+ decay-related tests passing
- [ ] Existing 29 tests still passing
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds for `StellaOps.Policy.Unknowns.Tests`
@@ -0,0 +1,500 @@
# Sprint 4000.0001.0002 · Unknowns BlastRadius & Containment Signals

## Topic & Scope

- Add BlastRadius scoring (dependency graph impact) to UnknownRanker
- Add ContainmentSignals scoring (runtime isolation posture) to UnknownRanker
- Extends the ranking formula with a containment reduction factor

**Working directory:** `src/Policy/__Libraries/StellaOps.Policy.Unknowns/`

## Dependencies & Concurrency

- **Upstream**: Sprint 4000.0001.0001 (Decay Algorithm) — MUST BE DONE
- **Downstream**: None
- **Safe to parallelize with**: Sprint 4000.0002.0001 (EPSS Connector)

## Documentation Prerequisites

- Sprint 4000.0001.0001 completion
- `src/Policy/__Libraries/StellaOps.Policy.Unknowns/AGENTS.md`
- `docs/product-advisories/14-Dec-2025 - Triage and Unknowns Technical Reference.md`

---

## Tasks
### T1: Define BlastRadius Model

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create a new model for blast radius data representing dependency graph impact.

**Implementation Path**: `Models/BlastRadius.cs` (new file)

**Model Definition**:

```csharp
namespace StellaOps.Policy.Unknowns.Models;

/// <summary>
/// Represents the dependency graph impact of an unknown package.
/// Data sourced from Scanner/Signals module call graph analysis.
/// </summary>
public sealed record BlastRadius
{
    /// <summary>
    /// Number of packages that directly or transitively depend on this package.
    /// 0 = isolated, higher = more impact if exploited.
    /// </summary>
    public int Dependents { get; init; }

    /// <summary>
    /// Whether this package is reachable from network-facing entrypoints.
    /// True = higher risk, False = reduced risk.
    /// </summary>
    public bool NetFacing { get; init; }

    /// <summary>
    /// Privilege level under which this package typically runs.
    /// "root" = highest risk, "user" = normal, "none" = lowest.
    /// </summary>
    public string? Privilege { get; init; }
}
```

**Acceptance Criteria**:

- [ ] `BlastRadius.cs` file created in the `Models/` directory
- [ ] Record is immutable with init-only properties
- [ ] XML documentation describes each property
- [ ] Namespace is `StellaOps.Policy.Unknowns.Models`

---
### T2: Define ContainmentSignals Model

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create a new model for runtime containment posture signals.

**Implementation Path**: `Models/ContainmentSignals.cs` (new file)

**Model Definition**:

```csharp
namespace StellaOps.Policy.Unknowns.Models;

/// <summary>
/// Represents runtime isolation and containment posture signals.
/// Data sourced from runtime probes (Seccomp, eBPF, container config).
/// </summary>
public sealed record ContainmentSignals
{
    /// <summary>
    /// Seccomp profile status: "enforced", "permissive", "disabled", null if unknown.
    /// "enforced" = reduced risk (limits syscalls).
    /// </summary>
    public string? Seccomp { get; init; }

    /// <summary>
    /// Filesystem mount mode: "ro" (read-only), "rw" (read-write), null if unknown.
    /// "ro" = reduced risk (limits persistence).
    /// </summary>
    public string? FileSystem { get; init; }

    /// <summary>
    /// Network policy status: "isolated", "restricted", "open", null if unknown.
    /// "isolated" = reduced risk (no egress).
    /// </summary>
    public string? NetworkPolicy { get; init; }
}
```

**Acceptance Criteria**:

- [ ] `ContainmentSignals.cs` file created in the `Models/` directory
- [ ] Record is immutable with init-only properties
- [ ] All properties nullable (unknown state allowed)
- [ ] XML documentation describes each property

---
### T3: Extend UnknownRankInput

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Add blast radius and containment fields to `UnknownRankInput`.

**Implementation Path**: `Services/UnknownRanker.cs`

**Updated Record**:

```csharp
public sealed record UnknownRankInput(
    // Existing fields
    bool HasVexStatement,
    bool HasReachabilityData,
    bool HasConflictingSources,
    bool IsStaleAdvisory,
    bool IsInKev,
    decimal EpssScore,
    decimal CvssScore,
    // From Sprint 4000.0001.0001 (Decay)
    DateTimeOffset? FirstSeenAt,
    DateTimeOffset? LastEvaluatedAt,
    DateTimeOffset AsOfDateTime,
    // NEW: BlastRadius & Containment
    BlastRadius? BlastRadius,
    ContainmentSignals? Containment);
```

**Acceptance Criteria**:

- [ ] `BlastRadius` nullable field added
- [ ] `Containment` nullable field added
- [ ] Both fields default to null (backward compatible)
- [ ] Existing tests still pass with null values

---
### T4: Implement ComputeContainmentReduction

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T3

**Description**:
Implement containment-based score reduction logic.

**Implementation Path**: `Services/UnknownRanker.cs`

**Reduction Formula**:

```csharp
/// <summary>
/// Computes a reduction factor based on containment posture.
/// Better containment = lower effective risk = score reduction.
/// Maximum reduction capped at 40%.
/// </summary>
private decimal ComputeContainmentReduction(UnknownRankInput input)
{
    decimal reduction = 0m;

    // BlastRadius reductions
    if (input.BlastRadius is { } blast)
    {
        // Isolated package (no dependents) reduces risk
        if (blast.Dependents == 0)
            reduction += _options.IsolatedReduction; // default: 0.15

        // Not network-facing reduces risk
        if (!blast.NetFacing)
            reduction += _options.NotNetFacingReduction; // default: 0.05

        // Non-root privilege reduces risk
        if (blast.Privilege is "user" or "none")
            reduction += _options.NonRootReduction; // default: 0.05
    }

    // ContainmentSignals reductions
    if (input.Containment is { } contain)
    {
        // Enforced Seccomp reduces risk
        if (contain.Seccomp == "enforced")
            reduction += _options.SeccompEnforcedReduction; // default: 0.10

        // Read-only filesystem reduces risk
        if (contain.FileSystem == "ro")
            reduction += _options.FsReadOnlyReduction; // default: 0.10

        // Network isolation reduces risk
        if (contain.NetworkPolicy == "isolated")
            reduction += _options.NetworkIsolatedReduction; // default: 0.05
    }

    // Cap at maximum reduction
    return Math.Min(reduction, _options.MaxContainmentReduction); // default: 0.40
}
```

**Score Application**:

```csharp
// In the Rank() method, after decay:
var containmentReduction = ComputeContainmentReduction(input);
var finalScore = Math.Max(0m, decayedScore * (1m - containmentReduction));
```

**Acceptance Criteria**:

- [ ] Method computes the reduction from BlastRadius and ContainmentSignals
- [ ] Null inputs contribute 0 reduction
- [ ] Reduction capped at a configurable maximum (default 40%)
- [ ] All arithmetic uses `decimal`
- [ ] Reduction applied as a multiplier: `score * (1 - reduction)`

---
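The reduction arithmetic above can be illustrated with one concrete (hypothetical) case using the default weights: an isolated (0.15), non-network-facing (0.05), user-privilege (0.05) package behind enforced seccomp (0.10) accumulates a 0.35 reduction, which stays under the 0.40 cap:

```csharp
using System;

// Worked example of the containment reduction formula with default weights.
decimal reduction = 0.15m + 0.05m + 0.05m + 0.10m; // = 0.35m, below the 0.40m cap
reduction = Math.Min(reduction, 0.40m);

decimal decayedScore = 60m; // hypothetical post-decay score
decimal finalScore = Math.Max(0m, decayedScore * (1m - reduction));

Console.WriteLine(finalScore); // 39.00 (60 × 0.65)
```

Because the reduction is applied as a multiplier rather than a flat deduction, two packages with the same containment posture keep their relative ordering, mirroring the decay decision in Sprint 4000.0001.0001.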
### T5: Extend UnknownRankerOptions

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T4

**Description**:
Add containment reduction weight configuration.

**Implementation Path**: `Services/UnknownRanker.cs`

**Updated Options**:

```csharp
public sealed class UnknownRankerOptions
{
    // Existing band thresholds
    public decimal HotThreshold { get; set; } = 75m;
    public decimal WarmThreshold { get; set; } = 50m;
    public decimal ColdThreshold { get; set; } = 25m;

    // Decay (from Sprint 4000.0001.0001)
    public bool EnableDecay { get; set; } = true;
    public IReadOnlyList<DecayBucket> DecayBuckets { get; set; } = DefaultDecayBuckets;

    // NEW: Containment reduction weights
    public bool EnableContainmentReduction { get; set; } = true;
    public decimal IsolatedReduction { get; set; } = 0.15m;
    public decimal NotNetFacingReduction { get; set; } = 0.05m;
    public decimal NonRootReduction { get; set; } = 0.05m;
    public decimal SeccompEnforcedReduction { get; set; } = 0.10m;
    public decimal FsReadOnlyReduction { get; set; } = 0.10m;
    public decimal NetworkIsolatedReduction { get; set; } = 0.05m;
    public decimal MaxContainmentReduction { get; set; } = 0.40m;
}
```

**Acceptance Criteria**:

- [ ] `EnableContainmentReduction` toggle added
- [ ] Individual reduction weights configurable
- [ ] `MaxContainmentReduction` cap configurable
- [ ] Defaults match the T4 implementation
- [ ] DI configuration works

---
### T6: Add DB Migration

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Add columns to the `policy.unknowns` table for blast radius and containment data.

**Implementation Path**: `src/Policy/__Libraries/StellaOps.Policy.Storage.Postgres/migrations/`

**Migration SQL**:

```sql
-- Migration: Add blast radius and containment columns to policy.unknowns

ALTER TABLE policy.unknowns
    ADD COLUMN IF NOT EXISTS blast_radius_dependents INT,
    ADD COLUMN IF NOT EXISTS blast_radius_net_facing BOOLEAN,
    ADD COLUMN IF NOT EXISTS blast_radius_privilege TEXT,
    ADD COLUMN IF NOT EXISTS containment_seccomp TEXT,
    ADD COLUMN IF NOT EXISTS containment_fs_mode TEXT,
    ADD COLUMN IF NOT EXISTS containment_network_policy TEXT;

COMMENT ON COLUMN policy.unknowns.blast_radius_dependents IS 'Number of packages depending on this package';
COMMENT ON COLUMN policy.unknowns.blast_radius_net_facing IS 'Whether reachable from network entrypoints';
COMMENT ON COLUMN policy.unknowns.blast_radius_privilege IS 'Privilege level: root, user, none';
COMMENT ON COLUMN policy.unknowns.containment_seccomp IS 'Seccomp status: enforced, permissive, disabled';
COMMENT ON COLUMN policy.unknowns.containment_fs_mode IS 'Filesystem mode: ro, rw';
COMMENT ON COLUMN policy.unknowns.containment_network_policy IS 'Network policy: isolated, restricted, open';
```

**Acceptance Criteria**:

- [ ] Migration file created with sequential number
- [ ] All 6 columns added with appropriate types
- [ ] Column comments added for documentation
- [ ] Migration is idempotent (IF NOT EXISTS)
- [ ] RLS policies still apply

---
### T7: Add Containment Tests

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T4, T5

**Description**:
Add comprehensive tests for containment reduction logic.

**Implementation Path**: `src/Policy/__Tests/StellaOps.Policy.Unknowns.Tests/Services/UnknownRankerTests.cs`

**Test Cases**:

```csharp
#region Containment Reduction Tests

[Fact]
public void ComputeContainmentReduction_NullInputs_ReturnsZero()
{
    var input = CreateInputWithContainment(blastRadius: null, containment: null);
    var result = _ranker.Rank(input);
    result.ContainmentReduction.Should().Be(0m);
}

[Fact]
public void ComputeContainmentReduction_IsolatedPackage_Returns15Percent()
{
    var blast = new BlastRadius { Dependents = 0, NetFacing = true };
    var input = CreateInputWithContainment(blastRadius: blast);

    var result = _ranker.Rank(input);
    result.ContainmentReduction.Should().Be(0.15m);
}

[Fact]
public void ComputeContainmentReduction_AllContainmentFactors_CapsAt40Percent()
{
    var blast = new BlastRadius { Dependents = 0, NetFacing = false, Privilege = "none" };
    var contain = new ContainmentSignals { Seccomp = "enforced", FileSystem = "ro", NetworkPolicy = "isolated" };
    var input = CreateInputWithContainment(blastRadius: blast, containment: contain);

    // Total would be: 0.15 + 0.05 + 0.05 + 0.10 + 0.10 + 0.05 = 0.50
    // But capped at 0.40
    var result = _ranker.Rank(input);
    result.ContainmentReduction.Should().Be(0.40m);
}

[Fact]
public void Rank_WithContainment_AppliesReductionToScore()
{
    // Arrange: Create input that would score 60 before containment
    var blast = new BlastRadius { Dependents = 0 }; // 15% reduction
    var input = CreateHighScoreInputWithContainment(blast);

    // Act
    var result = _ranker.Rank(input);

    // Assert: Score reduced by 15%: 60 * 0.85 = 51
    result.Score.Should().Be(51.00m);
}

[Fact]
public void Rank_ContainmentDisabled_NoReduction()
{
    var options = new UnknownRankerOptions { EnableContainmentReduction = false };
    var ranker = new UnknownRanker(Options.Create(options));
    var blast = new BlastRadius { Dependents = 0 };
    var input = CreateHighScoreInputWithContainment(blast);

    var result = ranker.Rank(input);
    result.ContainmentReduction.Should().Be(0m);
}

#endregion
```

**Acceptance Criteria**:

- [ ] Test for null BlastRadius/Containment returns 0 reduction
- [ ] Test for isolated package (Dependents=0)
- [ ] Test for the cap at 40% maximum
- [ ] Test verifies the reduction is applied to the final score
- [ ] Test for `EnableContainmentReduction = false`
- [ ] All 5+ new tests pass

---
### T8: Document Signal Sources
|
||||||
|
|
||||||
|
**Assignee**: Policy Team
|
||||||
|
**Story Points**: 2
|
||||||
|
**Status**: TODO
|
||||||
|
**Dependencies**: T1, T2
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Update AGENTS.md with signal provenance for blast radius and containment.
|
||||||
|
|
||||||
|
**Implementation Path**: `src/Policy/__Libraries/StellaOps.Policy.Unknowns/AGENTS.md`
|
||||||
|
|
||||||
|
**Documentation to Add**:
|
||||||
|
```markdown
|
||||||
|
## Signal Sources
|
||||||
|
|
||||||
|
### BlastRadius
|
||||||
|
- **Source**: Scanner/Signals module call graph analysis
|
||||||
|
- **Dependents**: Count of packages in dependency tree
|
||||||
|
- **NetFacing**: Reachability from network entrypoints (ASP.NET controllers, gRPC, etc.)
|
||||||
|
- **Privilege**: Extracted from container config or runtime probes
|
||||||
|
|
||||||
|
### ContainmentSignals
|
||||||
|
- **Source**: Runtime probes (eBPF, Seccomp profiles, container inspection)
|
||||||
|
- **Seccomp**: Seccomp profile enforcement status
|
||||||
|
- **FileSystem**: Mount mode from container spec or /proc/mounts
|
||||||
|
- **NetworkPolicy**: Kubernetes NetworkPolicy or firewall rules
|
||||||
|
|
||||||
|
### Data Flow
|
||||||
|
1. Scanner generates BlastRadius during SBOM analysis
|
||||||
|
2. Runtime probes collect ContainmentSignals
|
||||||
|
3. Signals stored in `policy.unknowns` table
|
||||||
|
4. UnknownRanker reads signals for scoring
|
||||||
|
```
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] AGENTS.md updated with Signal Sources section
|
||||||
|
- [ ] BlastRadius provenance documented
|
||||||
|
- [ ] ContainmentSignals provenance documented
|
||||||
|
- [ ] Data flow explained
|
||||||
|
|
||||||
|
---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Policy Team | Define BlastRadius model |
| 2 | T2 | TODO | — | Policy Team | Define ContainmentSignals model |
| 3 | T3 | TODO | T1, T2 | Policy Team | Extend UnknownRankInput |
| 4 | T4 | TODO | T3 | Policy Team | Implement ComputeContainmentReduction |
| 5 | T5 | TODO | T4 | Policy Team | Extend UnknownRankerOptions |
| 6 | T6 | TODO | T1, T2 | Policy Team | Add DB migration |
| 7 | T7 | TODO | T4, T5 | Policy Team | Add containment tests |
| 8 | T8 | TODO | T1, T2 | Policy Team | Document signal sources |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from MOAT gap analysis. BlastRadius/ContainmentSignals identified as gap in Triage & Unknowns advisory. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Reduction vs multiplier | Decision | Policy Team | Using reduction (score × (1 - reduction)) allows additive containment factors |
| Maximum cap at 40% | Decision | Policy Team | Prevents well-contained packages from dropping to 0; preserves signal |
| Nullable signals | Decision | Policy Team | Allow null for unknown containment state; null = no reduction |
| JSONB vs columns | Decision | Policy Team | Using columns for queryability and indexing |
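The first three decisions compose into one formula: score × (1 - min(reduction, 0.40)), with a null signal meaning no reduction. A minimal sketch (the `ContainmentMath` type and `Apply` method are illustrative only; the real logic lives in `UnknownRanker.ComputeContainmentReduction` per T4):

```csharp
using System;

// Illustrative helper only; names are not the actual UnknownRanker API.
public static class ContainmentMath
{
    private const decimal MaxReduction = 0.40m; // cap decided above

    // Applies a containment reduction to a final score: score × (1 - reduction),
    // with the reduction capped at 40% and null treated as "no reduction".
    public static decimal Apply(decimal score, decimal? rawReduction)
    {
        var reduction = Math.Min(rawReduction ?? 0m, MaxReduction);
        return score * (1m - reduction);
    }
}
```

For example, a raw reduction of 0.60 is capped to 0.40, so a score of 0.80 drops to 0.48 rather than 0.32, and a null signal leaves the score untouched.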

---

## Success Criteria

- [ ] All 8 tasks marked DONE
- [ ] 5+ containment-related tests passing
- [ ] Existing tests still passing (including decay tests from Sprint 1)
- [ ] Migration applies cleanly
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds

---

**New file:** `docs/implplan/SPRINT_4000_0002_0001_epss_feed_connector.md` (+866 lines)

# Sprint 4000.0002.0001 · EPSS Feed Connector

## Topic & Scope

- Create Concelier connector for EPSS (Exploit Prediction Scoring System) feed ingestion
- Follows three-stage connector pattern: Fetch → Parse → Map
- Leverages existing `EpssCsvStreamParser` from Scanner module for CSV parsing
- Integrates with orchestrator for scheduled, rate-limited, airgap-capable ingestion

**Working directory:** `src/Concelier/__Libraries/StellaOps.Concelier.Connector.Epss/`

## Dependencies & Concurrency

- **Upstream**: None (first sprint in batch 0002)
- **Downstream**: None
- **Safe to parallelize with**: Sprint 4000.0001.0001 (Decay), Sprint 4000.0001.0002 (Containment)

## Documentation Prerequisites

- `src/Concelier/__Libraries/StellaOps.Concelier.Core/Orchestration/ConnectorMetadata.cs`
- `src/Scanner/__Libraries/StellaOps.Scanner.Storage/Epss/EpssCsvStreamParser.cs` (reuse pattern)
- Existing connector examples: `StellaOps.Concelier.Connector.CertFr`, `StellaOps.Concelier.Connector.Osv`

---

## Tasks

### T1: Create Project Structure

**Assignee**: Concelier Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create new connector project following established Concelier patterns.

**Implementation Path**: `src/Concelier/__Libraries/StellaOps.Concelier.Connector.Epss/`

**Project Structure**:

```
StellaOps.Concelier.Connector.Epss/
├── StellaOps.Concelier.Connector.Epss.csproj
├── EpssConnectorPlugin.cs
├── EpssDependencyInjectionRoutine.cs
├── EpssServiceCollectionExtensions.cs
├── Jobs.cs
├── Configuration/
│   └── EpssOptions.cs
└── Internal/
    ├── EpssConnector.cs
    ├── EpssCursor.cs
    ├── EpssMapper.cs
    └── EpssDiagnostics.cs
```

**csproj Definition**:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <RootNamespace>StellaOps.Concelier.Connector.Epss</RootNamespace>
    <AssemblyName>StellaOps.Concelier.Connector.Epss</AssemblyName>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <LangVersion>preview</LangVersion>
  </PropertyGroup>

  <ItemGroup>
    <ProjectReference Include="..\StellaOps.Concelier.Core\StellaOps.Concelier.Core.csproj" />
    <ProjectReference Include="..\..\..\Scanner\__Libraries\StellaOps.Scanner.Storage\StellaOps.Scanner.Storage.csproj" />
  </ItemGroup>
</Project>
```

**Acceptance Criteria**:

- [ ] Project created with correct structure
- [ ] References to Concelier.Core and Scanner.Storage added
- [ ] Compiles successfully
- [ ] Follows naming conventions

---

### T2: Implement EpssConnectorPlugin

**Assignee**: Concelier Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Implement the plugin entry point for connector registration.

**Implementation Path**: `EpssConnectorPlugin.cs`

**Plugin Definition**:

```csharp
using Microsoft.Extensions.DependencyInjection;
using StellaOps.Concelier.Connector.Epss.Internal;
using StellaOps.Plugin;

namespace StellaOps.Concelier.Connector.Epss;

/// <summary>
/// Plugin entry point for EPSS feed connector.
/// Provides EPSS probability scores for CVE exploitation.
/// </summary>
public sealed class EpssConnectorPlugin : IConnectorPlugin
{
    public const string SourceName = "epss";

    public string Name => SourceName;

    public bool IsAvailable(IServiceProvider services)
        => services.GetService<EpssConnector>() is not null;

    public IFeedConnector Create(IServiceProvider services)
    {
        ArgumentNullException.ThrowIfNull(services);
        return services.GetRequiredService<EpssConnector>();
    }
}
```

**Acceptance Criteria**:

- [ ] Implements `IConnectorPlugin`
- [ ] Source name is `"epss"`
- [ ] Factory method resolves connector from DI
- [ ] Availability check works correctly

---

### T3: Implement EpssOptions

**Assignee**: Concelier Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Create configuration options for EPSS connector.

**Implementation Path**: `Configuration/EpssOptions.cs`

**Options Definition**:

```csharp
namespace StellaOps.Concelier.Connector.Epss.Configuration;

/// <summary>
/// Configuration options for EPSS feed connector.
/// </summary>
public sealed class EpssOptions
{
    /// <summary>
    /// Configuration section name.
    /// </summary>
    public const string SectionName = "Concelier:Epss";

    /// <summary>
    /// Base URL for EPSS API/feed.
    /// Default: https://epss.empiricalsecurity.com/
    /// </summary>
    public string BaseUrl { get; set; } = "https://epss.empiricalsecurity.com/";

    /// <summary>
    /// Whether to fetch the current day's snapshot or historical.
    /// Default: true (fetch current).
    /// </summary>
    public bool FetchCurrent { get; set; } = true;

    /// <summary>
    /// Number of days to look back for initial catch-up.
    /// Default: 7 days.
    /// </summary>
    public int CatchUpDays { get; set; } = 7;

    /// <summary>
    /// Request timeout in seconds.
    /// Default: 120 (2 minutes for large CSV files).
    /// </summary>
    public int TimeoutSeconds { get; set; } = 120;

    /// <summary>
    /// Maximum retries on transient failure.
    /// Default: 3.
    /// </summary>
    public int MaxRetries { get; set; } = 3;

    /// <summary>
    /// Whether to enable offline/airgap mode using bundled data.
    /// Default: false.
    /// </summary>
    public bool AirgapMode { get; set; } = false;

    /// <summary>
    /// Path to offline bundle directory (when AirgapMode=true).
    /// </summary>
    public string? BundlePath { get; set; }
}
```

**Acceptance Criteria**:

- [ ] All configuration options documented
- [ ] Sensible defaults provided
- [ ] Airgap mode flag present
- [ ] Timeout and retry settings included

---

### T4: Implement EpssCursor

**Assignee**: Concelier Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Create cursor model for resumable state tracking.

**Implementation Path**: `Internal/EpssCursor.cs`

**Cursor Definition**:

```csharp
namespace StellaOps.Concelier.Connector.Epss.Internal;

/// <summary>
/// Resumable cursor state for EPSS connector.
/// Tracks model version and last processed date for incremental sync.
/// </summary>
public sealed record EpssCursor
{
    /// <summary>
    /// EPSS model version tag (e.g., "v2024.12.21").
    /// </summary>
    public string? ModelVersion { get; init; }

    /// <summary>
    /// Date of the last successfully processed snapshot.
    /// </summary>
    public DateOnly? LastProcessedDate { get; init; }

    /// <summary>
    /// HTTP ETag of last fetched resource (for conditional requests).
    /// </summary>
    public string? ETag { get; init; }

    /// <summary>
    /// SHA-256 hash of the last processed CSV content.
    /// </summary>
    public string? ContentHash { get; init; }

    /// <summary>
    /// Number of CVE scores in the last snapshot.
    /// </summary>
    public int? LastRowCount { get; init; }

    /// <summary>
    /// Timestamp when cursor was last updated.
    /// </summary>
    public DateTimeOffset UpdatedAt { get; init; }

    /// <summary>
    /// Creates initial empty cursor.
    /// </summary>
    public static EpssCursor Empty => new() { UpdatedAt = DateTimeOffset.MinValue };
}
```

**Acceptance Criteria**:

- [ ] Record is immutable
- [ ] Tracks model version for EPSS updates
- [ ] Tracks content hash for change detection
- [ ] Includes ETag for conditional HTTP requests
- [ ] Has static `Empty` factory

---

### T5: Implement EpssConnector.FetchAsync

**Assignee**: Concelier Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T3, T4

**Description**:
Implement HTTP fetch stage with ETag/gzip support.

**Implementation Path**: `Internal/EpssConnector.cs`

**Fetch Implementation**:

```csharp
using System.Net;
using System.Net.Http;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.Concelier.Connector.Epss.Configuration;
using StellaOps.Concelier.Core.Feeds;

namespace StellaOps.Concelier.Connector.Epss.Internal;

/// <summary>
/// EPSS feed connector implementing three-stage Fetch/Parse/Map pattern.
/// </summary>
public sealed partial class EpssConnector : IFeedConnector
{
    private readonly HttpClient _httpClient;
    private readonly EpssOptions _options;
    private readonly ILogger<EpssConnector> _logger;

    public EpssConnector(
        HttpClient httpClient,
        IOptions<EpssOptions> options,
        ILogger<EpssConnector> logger)
    {
        _httpClient = httpClient ?? throw new ArgumentNullException(nameof(httpClient));
        _options = options?.Value ?? throw new ArgumentNullException(nameof(options));
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));
    }

    /// <summary>
    /// Fetches EPSS CSV snapshot from remote or bundle source.
    /// </summary>
    public async Task<FetchResult> FetchAsync(
        EpssCursor cursor,
        CancellationToken cancellationToken)
    {
        var targetDate = DateOnly.FromDateTime(DateTime.UtcNow);
        var fileName = $"epss_scores-{targetDate:yyyy-MM-dd}.csv.gz";

        if (_options.AirgapMode && !string.IsNullOrEmpty(_options.BundlePath))
        {
            return FetchFromBundle(fileName);
        }

        var uri = new Uri(new Uri(_options.BaseUrl), fileName);

        using var request = new HttpRequestMessage(HttpMethod.Get, uri);

        // Conditional fetch if we have an ETag.
        if (!string.IsNullOrEmpty(cursor.ETag))
        {
            request.Headers.IfNoneMatch.ParseAdd(cursor.ETag);
        }

        // The response is deliberately not wrapped in a using: disposing it would
        // also dispose the content stream handed to FetchResult. Ownership of the
        // stream transfers to the caller, which disposes it after parsing.
        var response = await _httpClient.SendAsync(
            request,
            HttpCompletionOption.ResponseHeadersRead,
            cancellationToken).ConfigureAwait(false);

        if (response.StatusCode == HttpStatusCode.NotModified)
        {
            response.Dispose();
            _logger.LogInformation("EPSS snapshot unchanged (304 Not Modified)");
            return FetchResult.NotModified(cursor);
        }

        response.EnsureSuccessStatusCode();

        var stream = await response.Content.ReadAsStreamAsync(cancellationToken).ConfigureAwait(false);
        var etag = response.Headers.ETag?.Tag;

        return FetchResult.Success(stream, targetDate, etag);
    }

    private FetchResult FetchFromBundle(string fileName)
    {
        var bundlePath = Path.Combine(_options.BundlePath!, fileName);
        if (!File.Exists(bundlePath))
        {
            _logger.LogWarning("EPSS bundle file not found: {Path}", bundlePath);
            return FetchResult.NotFound(bundlePath);
        }

        var stream = File.OpenRead(bundlePath);
        return FetchResult.Success(stream, DateOnly.FromDateTime(DateTime.UtcNow), etag: null);
    }
}
```

**Acceptance Criteria**:

- [ ] HTTP GET with gzip streaming
- [ ] Conditional requests using ETag (If-None-Match)
- [ ] Handles 304 Not Modified response
- [ ] Airgap mode falls back to bundle
- [ ] Proper error handling and logging

---

### T6: Implement EpssConnector.ParseAsync

**Assignee**: Concelier Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T5

**Description**:
Implement CSV parsing stage reusing Scanner's `EpssCsvStreamParser`.

**Implementation Path**: `Internal/EpssConnector.cs` (continued)

**Parse Implementation**:

```csharp
using System.Runtime.CompilerServices;
using StellaOps.Scanner.Storage.Epss;

public sealed partial class EpssConnector
{
    private readonly EpssCsvStreamParser _parser = new();

    /// <summary>
    /// Parses gzip CSV stream into EPSS score rows.
    /// Reuses Scanner's EpssCsvStreamParser for deterministic parsing.
    /// </summary>
    public async IAsyncEnumerable<EpssScoreRow> ParseAsync(
        Stream gzipStream,
        [EnumeratorCancellation] CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(gzipStream);

        await using var session = _parser.ParseGzip(gzipStream);

        await foreach (var row in session.WithCancellation(cancellationToken))
        {
            yield return row;
        }

        // Log session metadata once enumeration completes.
        _logger.LogInformation(
            "Parsed EPSS snapshot: ModelVersion={ModelVersion}, Date={Date}, Rows={Rows}, Hash={Hash}",
            session.ModelVersionTag,
            session.PublishedDate,
            session.RowCount,
            session.DecompressedSha256);
    }

    /// <summary>
    /// Builds a cursor from parse session metadata after enumeration.
    /// </summary>
    public EpssCursor CreateCursorFromSession(
        EpssCsvStreamParser.EpssCsvParseSession session,
        string? etag)
    {
        return new EpssCursor
        {
            ModelVersion = session.ModelVersionTag,
            LastProcessedDate = session.PublishedDate,
            ETag = etag,
            ContentHash = session.DecompressedSha256,
            LastRowCount = session.RowCount,
            UpdatedAt = DateTimeOffset.UtcNow
        };
    }
}
```

**Acceptance Criteria**:

- [ ] Reuses `EpssCsvStreamParser` from Scanner module
- [ ] Async enumerable streaming (no full materialization)
- [ ] Captures session metadata (model version, date, hash)
- [ ] Creates cursor from parse session
- [ ] Proper cancellation support

---

### T7: Implement EpssConnector.MapAsync

**Assignee**: Concelier Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T6

**Description**:
Map parsed EPSS rows to canonical observation records.

**Implementation Path**: `Internal/EpssMapper.cs`

**Mapper Definition**:

```csharp
using StellaOps.Concelier.Core.Observations;
using StellaOps.Scanner.Storage.Epss;

namespace StellaOps.Concelier.Connector.Epss.Internal;

/// <summary>
/// Maps EPSS score rows to canonical observation records.
/// </summary>
public static class EpssMapper
{
    /// <summary>
    /// Maps a single EPSS score row to an observation.
    /// </summary>
    public static EpssObservation ToObservation(
        EpssScoreRow row,
        string modelVersion,
        DateOnly publishedDate)
    {
        ArgumentNullException.ThrowIfNull(row);

        return new EpssObservation
        {
            CveId = row.CveId,
            Score = (decimal)row.EpssScore,
            Percentile = (decimal)row.Percentile,
            ModelVersion = modelVersion,
            PublishedDate = publishedDate,
            Band = DetermineBand((decimal)row.EpssScore)
        };
    }

    /// <summary>
    /// Determines priority band based on EPSS score.
    /// </summary>
    private static EpssBand DetermineBand(decimal score) => score switch
    {
        >= 0.70m => EpssBand.Critical, // score >= 0.70: Critical priority
        >= 0.40m => EpssBand.High,     // score 0.40-0.70: High priority
        >= 0.10m => EpssBand.Medium,   // score 0.10-0.40: Medium priority
        _ => EpssBand.Low              // score < 0.10: Low priority
    };
}

/// <summary>
/// EPSS observation record.
/// </summary>
public sealed record EpssObservation
{
    public required string CveId { get; init; }
    public required decimal Score { get; init; }
    public required decimal Percentile { get; init; }
    public required string ModelVersion { get; init; }
    public required DateOnly PublishedDate { get; init; }
    public required EpssBand Band { get; init; }
}

/// <summary>
/// EPSS priority bands.
/// </summary>
public enum EpssBand
{
    Low = 0,
    Medium = 1,
    High = 2,
    Critical = 3
}
```

**Acceptance Criteria**:

- [ ] Maps `EpssScoreRow` to `EpssObservation`
- [ ] Score values converted to `decimal` for consistency
- [ ] Priority bands assigned based on score thresholds
- [ ] Model version and date preserved
- [ ] Immutable record output

---

### T8: Register with WellKnownConnectors

**Assignee**: Concelier Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T2

**Description**:
Add EPSS to the well-known connectors registry.

**Implementation Path**: `src/Concelier/__Libraries/StellaOps.Concelier.Core/Orchestration/ConnectorRegistrationService.cs`

**Updated WellKnownConnectors**:

```csharp
/// <summary>
/// EPSS (Exploit Prediction Scoring System) connector metadata.
/// </summary>
public static ConnectorMetadata Epss => new()
{
    ConnectorId = "epss",
    Source = "epss",
    DisplayName = "EPSS",
    Description = "FIRST.org Exploit Prediction Scoring System",
    Capabilities = ["observations"],
    ArtifactKinds = ["raw-scores", "normalized"],
    DefaultCron = "0 10 * * *", // Daily at 10:00 UTC (after EPSS publishes ~08:00 UTC)
    DefaultRpm = 100,           // No rate limiting on EPSS feed
    MaxLagMinutes = 1440,       // 24 hours (daily feed)
    EgressAllowlist = ["epss.empiricalsecurity.com"]
};

/// <summary>
/// Gets metadata for all well-known connectors.
/// </summary>
public static IReadOnlyList<ConnectorMetadata> All => [Nvd, Ghsa, Osv, Kev, IcsCisa, Epss];
```

**Acceptance Criteria**:

- [ ] `Epss` static property added to `WellKnownConnectors`
- [ ] ConnectorId is `"epss"`
- [ ] Default cron set to daily 10:00 UTC
- [ ] Egress allowlist includes `epss.empiricalsecurity.com`
- [ ] Added to `All` collection

---

### T9: Add Connector Tests

**Assignee**: Concelier Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T5, T6, T7

**Description**:
Add integration tests with mock HTTP for EPSS connector.

**Implementation Path**: `src/Concelier/__Tests/StellaOps.Concelier.Connector.Epss.Tests/`

**Test Cases**:

```csharp
using System.Net;
using FluentAssertions;
using Microsoft.Extensions.Options;
using StellaOps.Concelier.Connector.Epss.Configuration;
using StellaOps.Concelier.Connector.Epss.Internal;
using Xunit;

namespace StellaOps.Concelier.Connector.Epss.Tests;

public class EpssConnectorTests
{
    // Test helpers (GetEmbeddedResource, GetSampleGzipStream, CreateConnector,
    // MockHttpMessageHandler) omitted for brevity.
    private static readonly string SampleCsvGz = GetEmbeddedResource("sample_epss.csv.gz");

    [Fact]
    public async Task FetchAsync_ReturnsStream_OnSuccess()
    {
        // Arrange
        var handler = new MockHttpMessageHandler(SampleCsvGz, HttpStatusCode.OK);
        var httpClient = new HttpClient(handler);
        var connector = CreateConnector(httpClient);
        var cursor = EpssCursor.Empty;

        // Act
        var result = await connector.FetchAsync(cursor, CancellationToken.None);

        // Assert
        result.IsSuccess.Should().BeTrue();
        result.Stream.Should().NotBeNull();
    }

    [Fact]
    public async Task FetchAsync_ReturnsNotModified_OnETagMatch()
    {
        // Arrange
        var handler = new MockHttpMessageHandler(status: HttpStatusCode.NotModified);
        var httpClient = new HttpClient(handler);
        var connector = CreateConnector(httpClient);
        var cursor = new EpssCursor { ETag = "\"abc123\"" };

        // Act
        var result = await connector.FetchAsync(cursor, CancellationToken.None);

        // Assert
        result.IsNotModified.Should().BeTrue();
    }

    [Fact]
    public async Task ParseAsync_YieldsAllRows()
    {
        // Arrange
        await using var stream = GetSampleGzipStream();
        var connector = CreateConnector();

        // Act
        var rows = await connector.ParseAsync(stream, CancellationToken.None).ToListAsync();

        // Assert
        rows.Should().HaveCountGreaterThan(0);
        rows.Should().AllSatisfy(r =>
        {
            r.CveId.Should().StartWith("CVE-");
            r.EpssScore.Should().BeInRange(0.0, 1.0);
            r.Percentile.Should().BeInRange(0.0, 1.0);
        });
    }

    [Theory]
    [InlineData(0.75, EpssBand.Critical)]
    [InlineData(0.50, EpssBand.High)]
    [InlineData(0.20, EpssBand.Medium)]
    [InlineData(0.05, EpssBand.Low)]
    public void ToObservation_AssignsCorrectBand(double score, EpssBand expectedBand)
    {
        // Arrange
        var row = new EpssScoreRow("CVE-2024-12345", score, 0.5);

        // Act
        var observation = EpssMapper.ToObservation(row, "v2024.12.21", DateOnly.FromDateTime(DateTime.UtcNow));

        // Assert
        observation.Band.Should().Be(expectedBand);
    }

    [Fact]
    public void EpssCursor_Empty_HasMinValue()
    {
        // Act
        var cursor = EpssCursor.Empty;

        // Assert
        cursor.UpdatedAt.Should().Be(DateTimeOffset.MinValue);
        cursor.ModelVersion.Should().BeNull();
        cursor.ContentHash.Should().BeNull();
    }
}
```

**Acceptance Criteria**:

- [ ] Test for successful fetch with mock HTTP
- [ ] Test for 304 Not Modified handling
- [ ] Test for parse yielding all rows
- [ ] Test for band assignment logic
- [ ] Test for cursor creation
- [ ] All 5+ tests pass

---

### T10: Add Airgap Bundle Support

**Assignee**: Concelier Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T5

**Description**:
Implement offline bundle fallback for airgap deployments.

**Implementation Path**: `Internal/EpssConnector.cs` (update FetchAsync)

**Bundle Convention**:

```
/var/stellaops/bundles/epss/
├── epss_scores-2024-12-21.csv.gz
├── epss_scores-2024-12-20.csv.gz
└── manifest.json
```
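Because the file names carry ISO-8601 dates, the newest snapshot in a bundle directory can be picked with a plain lexical sort. A sketch under that naming convention (the `EpssBundleFiles` helper is illustrative, not part of the connector):

```csharp
using System;
using System.IO;
using System.Linq;

// Illustrative helper: pick the newest date-stamped snapshot in a bundle dir.
public static class EpssBundleFiles
{
    // Returns the path of the newest epss_scores-YYYY-MM-DD.csv.gz in the
    // bundle directory, or null if none exist. An ordinal descending sort is
    // enough because ISO-8601 dates order lexicographically.
    public static string? Latest(string bundleDir)
        => Directory.EnumerateFiles(bundleDir, "epss_scores-*.csv.gz")
            .OrderByDescending(f => Path.GetFileName(f), StringComparer.Ordinal)
            .FirstOrDefault();
}
```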

**Manifest Schema**:

```json
{
  "source": "epss",
  "created": "2024-12-21T10:00:00Z",
  "files": [
    {
      "name": "epss_scores-2024-12-21.csv.gz",
      "modelVersion": "v2024.12.21",
      "sha256": "sha256:abc123...",
      "rowCount": 245000
    }
  ]
}
```
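Manifest validation is left optional by this sprint; one way to sketch it, with hypothetical record names mirroring the schema above (the real connector may choose different types or validate more strictly, e.g. verifying each file's SHA-256):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical types mirroring the manifest schema above.
public sealed record EpssBundleFile(
    [property: JsonPropertyName("name")] string Name,
    [property: JsonPropertyName("modelVersion")] string ModelVersion,
    [property: JsonPropertyName("sha256")] string Sha256,
    [property: JsonPropertyName("rowCount")] int RowCount);

public sealed record EpssBundleManifest(
    [property: JsonPropertyName("source")] string Source,
    [property: JsonPropertyName("created")] DateTimeOffset Created,
    [property: JsonPropertyName("files")] IReadOnlyList<EpssBundleFile> Files);

public static class EpssManifestReader
{
    // Parses manifest.json and rejects bundles built for a different source.
    public static EpssBundleManifest Parse(string json)
    {
        var manifest = JsonSerializer.Deserialize<EpssBundleManifest>(json)
            ?? throw new InvalidDataException("Empty EPSS bundle manifest.");
        if (manifest.Source != "epss")
            throw new InvalidDataException($"Unexpected bundle source: {manifest.Source}");
        return manifest;
    }
}
```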

**Acceptance Criteria**:

- [ ] Bundle path configurable via `EpssOptions.BundlePath`
- [ ] Falls back to bundle when `AirgapMode = true`
- [ ] Reads files from bundle directory
- [ ] Logs warning if bundle file missing
- [ ] Manifest.json validation optional but recommended

---
|
||||||
|
|
||||||
|
### T11: Update Documentation
|
||||||
|
|
||||||
|
**Assignee**: Concelier Team
|
||||||
|
**Story Points**: 1
|
||||||
|
**Status**: TODO
|
||||||
|
**Dependencies**: T8
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Add EPSS connector to documentation and create AGENTS.md.
|
||||||
|
|
||||||
|
**Implementation Path**:
|
||||||
|
- `src/Concelier/__Libraries/StellaOps.Concelier.Connector.Epss/AGENTS.md` (new)
|
||||||
|
- `docs/modules/concelier/connectors.md` (update)
|
||||||
|
|
||||||
|
**AGENTS.md Content**:
|
||||||
|
```markdown
|
||||||
|
# AGENTS.md - EPSS Connector
|
||||||
|
|
||||||
|
## Purpose
|
||||||
|
Ingests EPSS (Exploit Prediction Scoring System) scores from FIRST.org.
|
||||||
|
Provides exploitation probability estimates for CVE prioritization.
|
||||||
|
|
||||||
|
## Data Source
|
||||||
|
- **URL**: https://epss.empiricalsecurity.com/
|
||||||
|
- **Format**: CSV.gz (gzip-compressed CSV)
|
||||||
|
- **Update Frequency**: Daily (~08:00 UTC)
|
||||||
|
- **Coverage**: All CVEs with exploitation telemetry
|
||||||
|
|
||||||
|
## Data Flow
|
||||||
|
1. Connector fetches daily snapshot (epss_scores-YYYY-MM-DD.csv.gz)
|
||||||
|
2. Parses using EpssCsvStreamParser (reused from Scanner)
|
||||||
|
3. Maps to EpssObservation records with band classification
|
||||||
|
4. Stores in concelier.epss_observations table
|
||||||
|
5. Publishes EpssUpdatedEvent for downstream consumers
|
||||||
|
|
||||||
|
## Configuration
|
||||||
|
```yaml
|
||||||
|
Concelier:
|
||||||
|
Epss:
|
||||||
|
BaseUrl: "https://epss.empiricalsecurity.com/"
|
||||||
|
AirgapMode: false
|
||||||
|
BundlePath: "/var/stellaops/bundles/epss"
|
||||||
|
```
|
||||||
|
|
||||||
|
## Orchestrator Registration
|
||||||
|
- ConnectorId: `epss`
|
||||||
|
- Default Schedule: Daily 10:00 UTC
|
||||||
|
- Egress Allowlist: `epss.empiricalsecurity.com`
|
||||||
|
```
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] AGENTS.md created in connector directory
|
||||||
|
- [ ] Connector added to docs/modules/concelier/connectors.md
|
||||||
|
- [ ] Data flow documented
|
||||||
|
- [ ] Configuration examples provided

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Concelier Team | Create project structure |
| 2 | T2 | TODO | T1 | Concelier Team | Implement EpssConnectorPlugin |
| 3 | T3 | TODO | T1 | Concelier Team | Implement EpssOptions |
| 4 | T4 | TODO | T1 | Concelier Team | Implement EpssCursor |
| 5 | T5 | TODO | T3, T4 | Concelier Team | Implement FetchAsync |
| 6 | T6 | TODO | T5 | Concelier Team | Implement ParseAsync |
| 7 | T7 | TODO | T6 | Concelier Team | Implement MapAsync |
| 8 | T8 | TODO | T2 | Concelier Team | Register with WellKnownConnectors |
| 9 | T9 | TODO | T5, T6, T7 | Concelier Team | Add connector tests |
| 10 | T10 | TODO | T5 | Concelier Team | Add airgap bundle support |
| 11 | T11 | TODO | T8 | Concelier Team | Update documentation |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from MOAT gap analysis. EPSS connector identified as gap in orchestrated feed ingestion. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Reuse EpssCsvStreamParser | Decision | Concelier Team | Avoids duplication; Scanner parser already tested and optimized |
| Separate project vs Scanner extension | Decision | Concelier Team | New Concelier connector aligns with orchestrator pattern |
| Daily vs hourly schedule | Decision | Concelier Team | EPSS publishes daily; no benefit to more frequent polling |
| Band thresholds | Decision | Concelier Team | 0.70/0.40/0.10 aligned with EPSS community recommendations |

---

## Success Criteria

- [ ] All 11 tasks marked DONE
- [ ] 5+ connector tests passing
- [ ] `dotnet build` succeeds for connector project
- [ ] Connector registered in WellKnownConnectors
- [ ] Airgap bundle fallback works
- [ ] AGENTS.md created

---

`docs/implplan/SPRINT_4100_0001_0001_reason_coded_unknowns.md` (new file, 489 lines)

# Sprint 4100.0001.0001 · Reason-Coded Unknowns

## Topic & Scope

- Define structured reason codes for why a component is marked "unknown"
- Add remediation hints that map to each reason code
- Enable actionable triage by categorizing uncertainty sources

**Working directory:** `src/Policy/__Libraries/StellaOps.Policy.Unknowns/`

## Dependencies & Concurrency

- **Upstream**: None (first sprint in batch)
- **Downstream**: Sprint 4100.0001.0002 (Unknown Budgets), Sprint 4100.0001.0003 (Unknowns in Attestations)
- **Safe to parallelize with**: Sprint 4100.0002.0001, Sprint 4100.0003.0001, Sprint 4100.0004.0002

## Documentation Prerequisites

- `src/Policy/__Libraries/StellaOps.Policy.Unknowns/AGENTS.md`
- `docs/product-advisories/19-Dec-2025 - Moat #5.md` (Unknowns as First-Class Risk)
- `docs/product-advisories/archived/2025-12-21-moat-gap-closure/14-Dec-2025 - Triage and Unknowns Technical Reference.md`

---

## Tasks

### T1: Define UnknownReasonCode Enum

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create an enumeration defining the canonical reason codes for unknowns.

**Implementation Path**: `Models/UnknownReasonCode.cs` (new file)

**Model Definition**:
```csharp
namespace StellaOps.Policy.Unknowns.Models;

/// <summary>
/// Canonical reason codes explaining why a component is marked as "unknown".
/// Each code maps to a specific remediation action.
/// </summary>
public enum UnknownReasonCode
{
    /// <summary>
    /// U-RCH: Call path analysis is indeterminate.
    /// The reachability analyzer cannot confirm or deny exploitability.
    /// </summary>
    Reachability,

    /// <summary>
    /// U-ID: Ambiguous package identity or missing digest.
    /// Cannot uniquely identify the component (e.g., missing PURL, no checksum).
    /// </summary>
    Identity,

    /// <summary>
    /// U-PROV: Cannot map binary artifact to source repository.
    /// Provenance chain is broken or unavailable.
    /// </summary>
    Provenance,

    /// <summary>
    /// U-VEX: VEX statements conflict or missing applicability data.
    /// Multiple VEX sources disagree or no VEX coverage exists.
    /// </summary>
    VexConflict,

    /// <summary>
    /// U-FEED: Required knowledge source is missing or stale.
    /// Advisory feed gap (e.g., no NVD/OSV data for this package).
    /// </summary>
    FeedGap,

    /// <summary>
    /// U-CONFIG: Feature flag or configuration not observable.
    /// Cannot determine if vulnerable code path is enabled at runtime.
    /// </summary>
    ConfigUnknown,

    /// <summary>
    /// U-ANALYZER: Language or framework not supported by analyzer.
    /// Static analysis tools do not cover this ecosystem.
    /// </summary>
    AnalyzerLimit
}
```

**Acceptance Criteria**:
- [ ] `UnknownReasonCode.cs` file created in `Models/` directory
- [ ] 7 reason codes defined with XML documentation
- [ ] Each code has a short prefix (U-RCH, U-ID, etc.) documented
- [ ] Namespace is `StellaOps.Policy.Unknowns.Models`

---

### T2: Extend Unknown Model

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Add reason code, remediation hint, evidence references, and assumptions to the Unknown model.

**Implementation Path**: `Models/Unknown.cs`

**Updated Model**:
```csharp
public sealed record Unknown
{
    // Existing fields
    public Guid Id { get; init; }
    public string PackageUrl { get; init; }
    public string? CveId { get; init; }
    public decimal Score { get; init; }
    public UnknownBand Band { get; init; }

    // NEW: Reason code explaining why this is unknown
    public UnknownReasonCode ReasonCode { get; init; }

    // NEW: Human-readable remediation guidance
    public string? RemediationHint { get; init; }

    // NEW: References to evidence that led to unknown classification
    public IReadOnlyList<EvidenceRef> EvidenceRefs { get; init; } = [];

    // NEW: Assumptions made during analysis (for audit trail)
    public IReadOnlyList<string> Assumptions { get; init; } = [];
}

/// <summary>
/// Reference to evidence supporting unknown classification.
/// </summary>
public sealed record EvidenceRef(
    string Type,     // "reachability", "vex", "sbom", "feed"
    string Uri,      // Location of evidence
    string? Digest); // Content hash if applicable
```

**Acceptance Criteria**:
- [ ] `ReasonCode` field added to `Unknown` record
- [ ] `RemediationHint` nullable string field added
- [ ] `EvidenceRefs` collection added with `EvidenceRef` record
- [ ] `Assumptions` string collection added
- [ ] All new fields have XML documentation
- [ ] Existing tests still pass with default values

---

### T3: Create RemediationHintsRegistry

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Create a registry that maps reason codes to actionable remediation hints.

**Implementation Path**: `Services/RemediationHintsRegistry.cs` (new file)

**Implementation**:
```csharp
namespace StellaOps.Policy.Unknowns.Services;

/// <summary>
/// Registry of remediation hints for each unknown reason code.
/// Provides actionable guidance for resolving unknowns.
/// </summary>
public sealed class RemediationHintsRegistry : IRemediationHintsRegistry
{
    private static readonly IReadOnlyDictionary<UnknownReasonCode, RemediationHint> _hints =
        new Dictionary<UnknownReasonCode, RemediationHint>
        {
            [UnknownReasonCode.Reachability] = new(
                ShortHint: "Run reachability analysis",
                DetailedHint: "Execute call-graph analysis to determine if vulnerable code paths are reachable from application entrypoints.",
                AutomationRef: "stella analyze --reachability"),

            [UnknownReasonCode.Identity] = new(
                ShortHint: "Add package digest",
                DetailedHint: "Ensure SBOM includes package checksums (SHA-256) and valid PURL coordinates.",
                AutomationRef: "stella sbom --include-digests"),

            [UnknownReasonCode.Provenance] = new(
                ShortHint: "Add provenance attestation",
                DetailedHint: "Generate SLSA provenance linking binary artifact to source repository and build.",
                AutomationRef: "stella attest --provenance"),

            [UnknownReasonCode.VexConflict] = new(
                ShortHint: "Publish authoritative VEX",
                DetailedHint: "Create or update VEX document with applicability assessment for your deployment context.",
                AutomationRef: "stella vex create"),

            [UnknownReasonCode.FeedGap] = new(
                ShortHint: "Add advisory source",
                DetailedHint: "Configure additional advisory feeds (OSV, vendor-specific) or request coverage from upstream.",
                AutomationRef: "stella feed add"),

            [UnknownReasonCode.ConfigUnknown] = new(
                ShortHint: "Document feature flags",
                DetailedHint: "Export runtime configuration showing which features are enabled/disabled in this deployment.",
                AutomationRef: "stella config export"),

            [UnknownReasonCode.AnalyzerLimit] = new(
                ShortHint: "Request analyzer support",
                DetailedHint: "This language/framework is not yet supported. File an issue or use manual assessment.",
                AutomationRef: null)
        };

    public RemediationHint GetHint(UnknownReasonCode code) =>
        _hints.TryGetValue(code, out var hint) ? hint : RemediationHint.Empty;

    public IEnumerable<(UnknownReasonCode Code, RemediationHint Hint)> GetAllHints() =>
        _hints.Select(kv => (kv.Key, kv.Value));
}

public sealed record RemediationHint(
    string ShortHint,
    string DetailedHint,
    string? AutomationRef)
{
    public static RemediationHint Empty { get; } = new("No remediation available", "", null);
}

public interface IRemediationHintsRegistry
{
    RemediationHint GetHint(UnknownReasonCode code);
    IEnumerable<(UnknownReasonCode Code, RemediationHint Hint)> GetAllHints();
}
```

**Acceptance Criteria**:
- [ ] `RemediationHintsRegistry.cs` created in `Services/`
- [ ] All 7 reason codes have mapped hints
- [ ] Each hint includes short hint, detailed hint, and optional automation reference
- [ ] Interface `IRemediationHintsRegistry` defined for DI
- [ ] Registry is thread-safe (immutable dictionary)
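The consumer-facing contract above is that `GetHint` never throws for an unmapped code; it falls back to an empty hint. A compressed, self-contained sketch of that behavior, using string keys and tuples in place of the enum and record (illustrative only):

```csharp
using System;
using System.Collections.Generic;

// Minimal stand-in for the registry: known codes map to hints, unknown
// codes fall back to a "No remediation available" placeholder.
var hints = new Dictionary<string, (string Short, string? Automation)>
{
    ["Reachability"] = ("Run reachability analysis", "stella analyze --reachability"),
    ["Identity"] = ("Add package digest", "stella sbom --include-digests"),
};

(string Short, string? Automation) GetHint(string code) =>
    hints.TryGetValue(code, out var hint) ? hint : ("No remediation available", null);

Console.WriteLine(GetHint("Reachability").Short); // Run reachability analysis
Console.WriteLine(GetHint("Provenance").Short);   // No remediation available
```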

---

### T4: Update UnknownRanker

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T2, T3

**Description**:
Update the UnknownRanker to emit reason codes and remediation hints on ranking.

**Implementation Path**: `Services/UnknownRanker.cs`

**Updated Input**:
```csharp
public sealed record UnknownRankInput(
    // Existing fields
    bool HasVexStatement,
    bool HasReachabilityData,
    bool HasConflictingSources,
    bool IsStaleAdvisory,
    bool IsInKev,
    decimal EpssScore,
    decimal CvssScore,
    DateTimeOffset? FirstSeenAt,
    DateTimeOffset? LastEvaluatedAt,
    DateTimeOffset AsOfDateTime,
    BlastRadius? BlastRadius,
    ContainmentSignals? Containment,
    // NEW: Reason classification inputs
    bool HasPackageDigest,
    bool HasProvenanceAttestation,
    bool HasVexConflicts,
    bool HasFeedCoverage,
    bool HasConfigVisibility,
    bool IsAnalyzerSupported);
```

**Reason Code Assignment Logic**:
```csharp
/// <summary>
/// Determines the primary reason code for unknown classification.
/// Returns the most actionable/resolvable reason.
/// </summary>
private UnknownReasonCode DetermineReasonCode(UnknownRankInput input)
{
    // Priority order: most actionable first
    if (!input.IsAnalyzerSupported)
        return UnknownReasonCode.AnalyzerLimit;

    if (!input.HasReachabilityData)
        return UnknownReasonCode.Reachability;

    if (!input.HasPackageDigest)
        return UnknownReasonCode.Identity;

    if (!input.HasProvenanceAttestation)
        return UnknownReasonCode.Provenance;

    if (input.HasVexConflicts || !input.HasVexStatement)
        return UnknownReasonCode.VexConflict;

    if (!input.HasFeedCoverage)
        return UnknownReasonCode.FeedGap;

    if (!input.HasConfigVisibility)
        return UnknownReasonCode.ConfigUnknown;

    // Default to reachability if no specific reason
    return UnknownReasonCode.Reachability;
}
```

**Updated Result**:
```csharp
public sealed record UnknownRankResult(
    decimal Score,
    decimal UncertaintyFactor,
    decimal ExploitPressure,
    UnknownBand Band,
    decimal DecayFactor = 1.0m,
    decimal ContainmentReduction = 0m,
    // NEW: Reason code and hint
    UnknownReasonCode ReasonCode = UnknownReasonCode.Reachability,
    string? RemediationHint = null);
```

**Acceptance Criteria**:
- [ ] `UnknownRankInput` extended with reason classification inputs
- [ ] `DetermineReasonCode` method implemented with priority logic
- [ ] `UnknownRankResult` extended with `ReasonCode` and `RemediationHint`
- [ ] Ranker uses `IRemediationHintsRegistry` to populate hints
- [ ] Existing tests updated for new input/output fields
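The priority ordering in `DetermineReasonCode` means that when several signals are missing at once, the earliest check wins. A self-contained sketch of that behavior, simplified to string results and only the fields the priority logic reads:

```csharp
using System;

// Mirrors the documented priority order: analyzer support, reachability,
// digest, provenance, VEX, feed coverage, config visibility.
string Classify(bool analyzerSupported, bool hasReachability, bool hasDigest,
                bool hasProvenance, bool hasVex, bool vexConflicts,
                bool hasFeed, bool hasConfig)
{
    if (!analyzerSupported) return "AnalyzerLimit";
    if (!hasReachability) return "Reachability";
    if (!hasDigest) return "Identity";
    if (!hasProvenance) return "Provenance";
    if (vexConflicts || !hasVex) return "VexConflict";
    if (!hasFeed) return "FeedGap";
    if (!hasConfig) return "ConfigUnknown";
    return "Reachability"; // default when no specific reason applies
}

// Missing reachability data AND missing digest: reachability wins.
Console.WriteLine(Classify(true, false, false, true, true, false, true, true)); // Reachability
// Everything present except the provenance attestation.
Console.WriteLine(Classify(true, true, true, false, true, false, true, true)); // Provenance
```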

---

### T5: Add DB Migration

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Add columns to `policy.unknowns` table for reason code and remediation hint.

**Implementation Path**: `src/Policy/__Libraries/StellaOps.Policy.Storage.Postgres/migrations/`

**Migration SQL**:
```sql
-- Migration: Add reason code and remediation columns to policy.unknowns

ALTER TABLE policy.unknowns
    ADD COLUMN IF NOT EXISTS reason_code TEXT,
    ADD COLUMN IF NOT EXISTS remediation_hint TEXT,
    ADD COLUMN IF NOT EXISTS evidence_refs JSONB DEFAULT '[]',
    ADD COLUMN IF NOT EXISTS assumptions JSONB DEFAULT '[]';

-- Create index for querying by reason code
CREATE INDEX IF NOT EXISTS idx_unknowns_reason_code
    ON policy.unknowns(reason_code)
    WHERE reason_code IS NOT NULL;

COMMENT ON COLUMN policy.unknowns.reason_code IS 'Canonical reason code: Reachability, Identity, Provenance, VexConflict, FeedGap, ConfigUnknown, AnalyzerLimit';
COMMENT ON COLUMN policy.unknowns.remediation_hint IS 'Actionable guidance for resolving this unknown';
COMMENT ON COLUMN policy.unknowns.evidence_refs IS 'JSON array of evidence references supporting classification';
COMMENT ON COLUMN policy.unknowns.assumptions IS 'JSON array of assumptions made during analysis';
```

**Acceptance Criteria**:
- [ ] Migration file created with sequential number
- [ ] `reason_code` TEXT column added
- [ ] `remediation_hint` TEXT column added
- [ ] `evidence_refs` JSONB column added with default
- [ ] `assumptions` JSONB column added with default
- [ ] Index created for reason_code queries
- [ ] Column comments added for documentation
- [ ] Migration is idempotent (IF NOT EXISTS)
- [ ] RLS policies still apply

---

### T6: Update API DTOs

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T4

**Description**:
Include reason codes and remediation hints in API response DTOs.

**Implementation Path**: `src/Policy/StellaOps.Policy.WebService/Controllers/UnknownsController.cs`

**Updated DTO**:
```csharp
public sealed record UnknownDto
{
    public Guid Id { get; init; }
    public string PackageUrl { get; init; }
    public string? CveId { get; init; }
    public decimal Score { get; init; }
    public string Band { get; init; }
    // NEW fields
    public string ReasonCode { get; init; }
    public string ReasonCodeShort { get; init; } // e.g., "U-RCH"
    public string? RemediationHint { get; init; }
    public string? DetailedHint { get; init; }
    public string? AutomationCommand { get; init; }
    public IReadOnlyList<EvidenceRefDto> EvidenceRefs { get; init; }
}

public sealed record EvidenceRefDto(
    string Type,
    string Uri,
    string? Digest);
```

**Short Code Mapping**:
```csharp
private static readonly IReadOnlyDictionary<UnknownReasonCode, string> ShortCodes = new Dictionary<UnknownReasonCode, string>
{
    [UnknownReasonCode.Reachability] = "U-RCH",
    [UnknownReasonCode.Identity] = "U-ID",
    [UnknownReasonCode.Provenance] = "U-PROV",
    [UnknownReasonCode.VexConflict] = "U-VEX",
    [UnknownReasonCode.FeedGap] = "U-FEED",
    [UnknownReasonCode.ConfigUnknown] = "U-CONFIG",
    [UnknownReasonCode.AnalyzerLimit] = "U-ANALYZER"
};
```

**Acceptance Criteria**:
- [ ] `UnknownDto` extended with reason code fields
- [ ] Short code (U-RCH, U-ID, etc.) included in response
- [ ] Remediation hint fields included
- [ ] Evidence references included as array
- [ ] OpenAPI spec updated
- [ ] Response schema validated
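The short-code mapping above can be exercised as a plain lookup when building the DTO. A self-contained sketch with string keys standing in for the enum; the fallback to the long name is an assumption for illustration, not specified in this sprint:

```csharp
using System;
using System.Collections.Generic;

// Stable "U-*" prefixes for triage dashboards, keyed by reason-code name.
var shortCodes = new Dictionary<string, string>
{
    ["Reachability"] = "U-RCH", ["Identity"] = "U-ID", ["Provenance"] = "U-PROV",
    ["VexConflict"] = "U-VEX", ["FeedGap"] = "U-FEED",
    ["ConfigUnknown"] = "U-CONFIG", ["AnalyzerLimit"] = "U-ANALYZER",
};

// Hypothetical fallback: echo the long name if a code is ever unmapped.
string ToShortCode(string reasonCode) =>
    shortCodes.TryGetValue(reasonCode, out var s) ? s : reasonCode;

Console.WriteLine(ToShortCode("VexConflict")); // U-VEX
```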

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Policy Team | Define UnknownReasonCode enum |
| 2 | T2 | TODO | T1 | Policy Team | Extend Unknown model |
| 3 | T3 | TODO | T1 | Policy Team | Create RemediationHintsRegistry |
| 4 | T4 | TODO | T2, T3 | Policy Team | Update UnknownRanker |
| 5 | T5 | TODO | T1, T2 | Policy Team | Add DB migration |
| 6 | T6 | TODO | T4 | Policy Team | Update API DTOs |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from MOAT Phase 2 gap analysis. Reason-coded unknowns identified as requirement from Moat #5 advisory. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| 7 reason codes | Decision | Policy Team | Covers all identified uncertainty sources; extensible if needed |
| Priority ordering | Decision | Policy Team | Most actionable/resolvable reasons assigned first |
| Short codes (U-*) | Decision | Policy Team | Human-readable prefixes for triage dashboards |
| JSONB for arrays | Decision | Policy Team | Flexible schema for evidence refs and assumptions |

---

## Success Criteria

- [ ] All 6 tasks marked DONE
- [ ] 7 reason codes defined and documented
- [ ] Remediation hints mapped for all codes
- [ ] API returns reason codes in responses
- [ ] Migration applies cleanly
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds for `StellaOps.Policy.Unknowns.Tests`

---

`docs/implplan/SPRINT_4100_0001_0002_unknown_budgets.md` (new file, 659 lines)

# Sprint 4100.0001.0002 · Unknown Budgets & Environment Thresholds

## Topic & Scope

- Define environment-aware unknown budgets (prod: strict, stage: moderate, dev: permissive)
- Implement budget enforcement with block/warn actions
- Enable policy-driven control over acceptable unknown counts

**Working directory:** `src/Policy/__Libraries/StellaOps.Policy.Unknowns/`

## Dependencies & Concurrency

- **Upstream**: Sprint 4100.0001.0001 (Reason-Coded Unknowns) — MUST BE DONE
- **Downstream**: Sprint 4100.0001.0003 (Unknowns in Attestations)
- **Safe to parallelize with**: Sprint 4100.0002.0002, Sprint 4100.0003.0002

## Documentation Prerequisites

- Sprint 4100.0001.0001 completion
- `src/Policy/__Libraries/StellaOps.Policy.Unknowns/AGENTS.md`
- `docs/product-advisories/19-Dec-2025 - Moat #5.md` (Unknowns as First-Class Risk)

---

## Tasks

### T1: Define UnknownBudget Model

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create a model representing unknown budgets with environment-specific thresholds.

**Implementation Path**: `Models/UnknownBudget.cs` (new file)

**Model Definition**:
```csharp
namespace StellaOps.Policy.Unknowns.Models;

/// <summary>
/// Represents an unknown budget for a specific environment.
/// Budgets define maximum acceptable unknown counts by reason code.
/// </summary>
public sealed record UnknownBudget
{
    /// <summary>
    /// Environment name: "prod", "stage", "dev", or custom.
    /// </summary>
    public required string Environment { get; init; }

    /// <summary>
    /// Maximum total unknowns allowed across all reason codes.
    /// </summary>
    public int? TotalLimit { get; init; }

    /// <summary>
    /// Per-reason-code limits. Missing codes inherit from TotalLimit.
    /// </summary>
    public IReadOnlyDictionary<UnknownReasonCode, int> ReasonLimits { get; init; }
        = new Dictionary<UnknownReasonCode, int>();

    /// <summary>
    /// Action when budget is exceeded.
    /// </summary>
    public BudgetAction Action { get; init; } = BudgetAction.Warn;

    /// <summary>
    /// Custom message to display when budget is exceeded.
    /// </summary>
    public string? ExceededMessage { get; init; }
}

/// <summary>
/// Action to take when unknown budget is exceeded.
/// </summary>
public enum BudgetAction
{
    /// <summary>
    /// Log warning only, do not block.
    /// </summary>
    Warn,

    /// <summary>
    /// Block the operation (fail policy evaluation).
    /// </summary>
    Block,

    /// <summary>
    /// Warn but allow if exception is applied.
    /// </summary>
    WarnUnlessException
}

/// <summary>
/// Result of checking unknowns against a budget.
/// </summary>
public sealed record BudgetCheckResult
{
    public required bool IsWithinBudget { get; init; }
    public required BudgetAction RecommendedAction { get; init; }
    public required int TotalUnknowns { get; init; }
    public int? TotalLimit { get; init; }
    public IReadOnlyDictionary<UnknownReasonCode, BudgetViolation> Violations { get; init; }
        = new Dictionary<UnknownReasonCode, BudgetViolation>();
    public string? Message { get; init; }
}

/// <summary>
/// Details of a specific budget violation.
/// </summary>
public sealed record BudgetViolation(
    UnknownReasonCode ReasonCode,
    int Count,
    int Limit);
```

**Acceptance Criteria**:
- [ ] `UnknownBudget.cs` file created in `Models/` directory
- [ ] Budget supports total and per-reason limits
- [ ] `BudgetAction` enum with Warn, Block, WarnUnlessException
- [ ] `BudgetCheckResult` captures violation details
- [ ] XML documentation on all types

---

### T2: Create UnknownBudgetService

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Implement a service for retrieving budgets and checking compliance.

**Implementation Path**: `Services/UnknownBudgetService.cs` (new file)

**Implementation**:
|
||||||
|
```csharp
|
||||||
|
namespace StellaOps.Policy.Unknowns.Services;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Service for managing and checking unknown budgets.
|
||||||
|
/// </summary>
|
||||||
|
public sealed class UnknownBudgetService : IUnknownBudgetService
|
||||||
|
{
|
||||||
|
private readonly IOptionsMonitor<UnknownBudgetOptions> _options;
|
||||||
|
    private readonly ILogger<UnknownBudgetService> _logger;

    public UnknownBudgetService(
        IOptionsMonitor<UnknownBudgetOptions> options,
        ILogger<UnknownBudgetService> logger)
    {
        _options = options;
        _logger = logger;
    }

    /// <summary>
    /// Gets the budget configuration for a specific environment.
    /// Falls back to the default budget if the environment is not found.
    /// </summary>
    public UnknownBudget GetBudgetForEnvironment(string environment)
    {
        var budgets = _options.CurrentValue.Budgets;

        if (budgets.TryGetValue(environment, out var budget))
            return budget;

        if (budgets.TryGetValue("default", out var defaultBudget))
            return defaultBudget with { Environment = environment };

        // Permissive fallback if no configuration exists
        return new UnknownBudget
        {
            Environment = environment,
            TotalLimit = null,
            Action = BudgetAction.Warn
        };
    }

    /// <summary>
    /// Checks a collection of unknowns against the budget for an environment.
    /// </summary>
    public BudgetCheckResult CheckBudget(
        string environment,
        IReadOnlyList<Unknown> unknowns)
    {
        var budget = GetBudgetForEnvironment(environment);
        var violations = new Dictionary<UnknownReasonCode, BudgetViolation>();
        var total = unknowns.Count;

        // Check per-reason-code limits
        var byReason = unknowns
            .GroupBy(u => u.ReasonCode)
            .ToDictionary(g => g.Key, g => g.Count());

        foreach (var (code, limit) in budget.ReasonLimits)
        {
            if (byReason.TryGetValue(code, out var count) && count > limit)
            {
                violations[code] = new BudgetViolation(code, count, limit);
            }
        }

        // Check total limit
        var isWithinBudget = violations.Count == 0 &&
            (!budget.TotalLimit.HasValue || total <= budget.TotalLimit.Value);

        var message = isWithinBudget
            ? null
            : budget.ExceededMessage ?? $"Unknown budget exceeded: {total} unknowns in {environment}";

        return new BudgetCheckResult
        {
            IsWithinBudget = isWithinBudget,
            RecommendedAction = isWithinBudget ? BudgetAction.Warn : budget.Action,
            TotalUnknowns = total,
            TotalLimit = budget.TotalLimit,
            Violations = violations,
            Message = message
        };
    }

    /// <summary>
    /// Checks whether an operation should be blocked based on the budget result.
    /// </summary>
    public bool ShouldBlock(BudgetCheckResult result) =>
        !result.IsWithinBudget && result.RecommendedAction == BudgetAction.Block;
}

public interface IUnknownBudgetService
{
    UnknownBudget GetBudgetForEnvironment(string environment);
    BudgetCheckResult CheckBudget(string environment, IReadOnlyList<Unknown> unknowns);
    bool ShouldBlock(BudgetCheckResult result);
}
```
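
A minimal consumer sketch of the service above. The `ScanGate` class and its method names are hypothetical; only `IUnknownBudgetService` comes from this sprint, and the `Unknown` record is assumed from Sprint 4100.0001.0001:

```csharp
// Hypothetical consumer; everything except IUnknownBudgetService is an assumption.
public sealed class ScanGate
{
    private readonly IUnknownBudgetService _budgets;

    public ScanGate(IUnknownBudgetService budgets) => _budgets = budgets;

    public bool Admit(string environment, IReadOnlyList<Unknown> unknowns)
    {
        var result = _budgets.CheckBudget(environment, unknowns);
        // Block only when the environment's budget action demands it;
        // Warn / WarnUnlessException environments still pass here.
        return !_budgets.ShouldBlock(result);
    }
}
```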

**Acceptance Criteria**:

- [ ] `UnknownBudgetService.cs` created in `Services/`
- [ ] `GetBudgetForEnvironment` with fallback logic
- [ ] `CheckBudget` aggregates violations by reason code
- [ ] `ShouldBlock` helper method
- [ ] Interface defined for DI

---

### T3: Implement Budget Checking Logic

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T2

**Description**:
Implement the detailed budget checking with block/warn decision logic.

**Implementation Path**: `Services/UnknownBudgetService.cs`

**Extended Logic**:
```csharp
    /// <summary>
    /// Performs a comprehensive budget check with environment escalation.
    /// </summary>
    public BudgetCheckResult CheckBudgetWithEscalation(
        string environment,
        IReadOnlyList<Unknown> unknowns,
        IReadOnlyList<ExceptionObject>? exceptions = null)
    {
        var baseResult = CheckBudget(environment, unknowns);

        if (baseResult.IsWithinBudget)
            return baseResult;

        // Check if exceptions cover the violations
        if (exceptions?.Count > 0)
        {
            var coveredReasons = exceptions
                .Where(e => e.Status == ExceptionStatus.Approved)
                .SelectMany(e => e.CoveredReasonCodes)
                .ToHashSet();

            var uncoveredViolations = baseResult.Violations
                .Where(v => !coveredReasons.Contains(v.Key))
                .ToDictionary(v => v.Key, v => v.Value);

            if (uncoveredViolations.Count == 0)
            {
                return baseResult with
                {
                    IsWithinBudget = true,
                    RecommendedAction = BudgetAction.Warn,
                    Message = "Budget exceeded but covered by approved exceptions"
                };
            }
        }

        // Log the violation for observability
        _logger.LogWarning(
            "Unknown budget exceeded for environment {Environment}: {Total}/{Limit}",
            environment, baseResult.TotalUnknowns, baseResult.TotalLimit);

        return baseResult;
    }

    /// <summary>
    /// Gets a summary of budget status for reporting.
    /// </summary>
    public BudgetStatusSummary GetBudgetStatus(
        string environment,
        IReadOnlyList<Unknown> unknowns)
    {
        var budget = GetBudgetForEnvironment(environment);
        var result = CheckBudget(environment, unknowns);

        return new BudgetStatusSummary
        {
            Environment = environment,
            TotalUnknowns = unknowns.Count,
            TotalLimit = budget.TotalLimit,
            PercentageUsed = budget.TotalLimit is > 0 // guard: a limit of 0 would divide by zero
                ? (decimal)unknowns.Count / budget.TotalLimit.Value * 100
                : 0m,
            IsExceeded = !result.IsWithinBudget,
            ViolationCount = result.Violations.Count,
            ByReasonCode = unknowns
                .GroupBy(u => u.ReasonCode)
                .ToDictionary(g => g.Key, g => g.Count())
        };
    }

public sealed record BudgetStatusSummary
{
    public required string Environment { get; init; }
    public required int TotalUnknowns { get; init; }
    public int? TotalLimit { get; init; }
    public decimal PercentageUsed { get; init; }
    public bool IsExceeded { get; init; }
    public int ViolationCount { get; init; }
    public IReadOnlyDictionary<UnknownReasonCode, int> ByReasonCode { get; init; }
        = new Dictionary<UnknownReasonCode, int>();
}
```

**Acceptance Criteria**:

- [ ] `CheckBudgetWithEscalation` supports exception coverage
- [ ] Approved exceptions can cover specific reason codes
- [ ] Violations logged for observability
- [ ] `GetBudgetStatus` returns summary for dashboards
- [ ] Percentage calculation for budget utilization

---

### T4: Add Policy Configuration

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Define the YAML configuration schema for unknown budgets.

**Implementation Path**: `Configuration/UnknownBudgetOptions.cs` (new file)

**Options Class**:
```csharp
namespace StellaOps.Policy.Unknowns.Configuration;

/// <summary>
/// Configuration options for unknown budgets.
/// </summary>
public sealed class UnknownBudgetOptions
{
    public const string SectionName = "UnknownBudgets";

    /// <summary>
    /// Budget configurations keyed by environment name.
    /// </summary>
    public Dictionary<string, UnknownBudget> Budgets { get; set; } = new();

    /// <summary>
    /// Whether to enforce budgets (false = warn only).
    /// </summary>
    public bool EnforceBudgets { get; set; } = true;
}
```

**Sample YAML Configuration**:
```yaml
# etc/policy.unknowns.yaml
unknownBudgets:
  enforceBudgets: true
  budgets:
    prod:
      environment: prod
      totalLimit: 3
      reasonLimits:
        Reachability: 0
        Provenance: 0
        VexConflict: 1
      action: Block
      exceededMessage: "Production requires zero reachability unknowns"

    stage:
      environment: stage
      totalLimit: 10
      reasonLimits:
        Reachability: 1
      action: WarnUnlessException

    dev:
      environment: dev
      totalLimit: null # No limit
      action: Warn

    default:
      environment: default
      totalLimit: 5
      action: Warn
```
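
Given the `prod` entry above, a scan carrying two `Reachability` unknowns violates the per-reason limit of 0, so `CheckBudget` should recommend `Block` regardless of the total limit. A minimal sketch; the `Unknown` constructor shape is an assumption, since that record is defined in Sprint 4100.0001.0001:

```csharp
// Constructor shape of Unknown is assumed for illustration.
var unknowns = new List<Unknown>
{
    new(ReasonCode: UnknownReasonCode.Reachability),
    new(ReasonCode: UnknownReasonCode.Reachability)
};

var result = budgetService.CheckBudget("prod", unknowns);
// Per the service logic above: IsWithinBudget is false,
// RecommendedAction is BudgetAction.Block, and Message carries
// the configured exceededMessage for prod.
```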

**DI Registration**:
```csharp
// In startup/DI configuration
services.Configure<UnknownBudgetOptions>(
    configuration.GetSection(UnknownBudgetOptions.SectionName));
services.AddSingleton<IUnknownBudgetService, UnknownBudgetService>();
```

**Acceptance Criteria**:

- [ ] `UnknownBudgetOptions.cs` created in `Configuration/`
- [ ] Options bind from YAML configuration
- [ ] Sample configuration documented
- [ ] `EnforceBudgets` toggle for global enable/disable
- [ ] Default budget fallback defined

---

### T5: Integrate with PolicyEvaluator

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T2, T3

**Description**:
Integrate unknown budget checking into the policy evaluation pipeline.

**Implementation Path**: `src/Policy/StellaOps.Policy.Engine/Services/PolicyEvaluator.cs`

**Integration Points**:
```csharp
public sealed class PolicyEvaluator
{
    private readonly IUnknownBudgetService _budgetService;

    public async Task<PolicyEvaluationResult> EvaluateAsync(
        PolicyEvaluationRequest request,
        CancellationToken ct = default)
    {
        // ... existing evaluation logic ...

        // Check unknown budgets
        var budgetResult = _budgetService.CheckBudgetWithEscalation(
            request.Environment,
            unknowns,
            request.AppliedExceptions);

        if (_budgetService.ShouldBlock(budgetResult))
        {
            return PolicyEvaluationResult.Fail(
                PolicyFailureReason.UnknownBudgetExceeded,
                budgetResult.Message,
                new UnknownBudgetViolation(budgetResult));
        }

        // Include budget status in the result
        return result with
        {
            UnknownBudgetStatus = new BudgetStatusSummary
            {
                Environment = request.Environment,
                IsExceeded = !budgetResult.IsWithinBudget,
                TotalUnknowns = budgetResult.TotalUnknowns,
                TotalLimit = budgetResult.TotalLimit,
                Violations = budgetResult.Violations
            }
        };
    }
}

/// <summary>
/// Failure reason for policy evaluation.
/// </summary>
public enum PolicyFailureReason
{
    // Existing reasons...
    CveExceedsThreshold,
    LicenseViolation,
    // NEW
    UnknownBudgetExceeded
}
```

**Acceptance Criteria**:

- [ ] `PolicyEvaluator` checks unknown budgets
- [ ] Budgets configured with a `Block` action fail evaluation when exceeded
- [ ] `UnknownBudgetExceeded` failure reason added
- [ ] Budget status included in evaluation result
- [ ] Exception coverage respected

---

### T6: Add Tests

**Assignee**: Policy Team
**Story Points**: 1
**Status**: TODO
**Dependencies**: T5

**Description**:
Add comprehensive tests for budget enforcement.

**Implementation Path**: `src/Policy/__Tests/StellaOps.Policy.Unknowns.Tests/Services/UnknownBudgetServiceTests.cs`

**Test Cases**:
```csharp
public class UnknownBudgetServiceTests
{
    [Fact]
    public void GetBudgetForEnvironment_KnownEnv_ReturnsBudget()
    {
        // Arrange
        var options = CreateOptions(prod: new UnknownBudget
        {
            Environment = "prod",
            TotalLimit = 3
        });
        var service = new UnknownBudgetService(
            options, NullLogger<UnknownBudgetService>.Instance);

        // Act
        var budget = service.GetBudgetForEnvironment("prod");

        // Assert
        budget.TotalLimit.Should().Be(3);
    }

    [Fact]
    public void CheckBudget_WithinLimit_ReturnsSuccess()
    {
        var unknowns = CreateUnknowns(count: 2);
        var result = _service.CheckBudget("prod", unknowns);

        result.IsWithinBudget.Should().BeTrue();
    }

    [Fact]
    public void CheckBudget_ExceedsTotal_ReturnsViolation()
    {
        var unknowns = CreateUnknowns(count: 5); // limit is 3
        var result = _service.CheckBudget("prod", unknowns);

        result.IsWithinBudget.Should().BeFalse();
        result.RecommendedAction.Should().Be(BudgetAction.Block);
    }

    [Fact]
    public void CheckBudget_ExceedsReasonLimit_ReturnsSpecificViolation()
    {
        var unknowns = CreateUnknowns(
            reachability: 2, // limit is 0
            identity: 1);
        var result = _service.CheckBudget("prod", unknowns);

        result.Violations.Should().ContainKey(UnknownReasonCode.Reachability);
        result.Violations[UnknownReasonCode.Reachability].Count.Should().Be(2);
    }

    [Fact]
    public void CheckBudgetWithEscalation_ExceptionCovers_AllowsOperation()
    {
        var unknowns = CreateUnknowns(reachability: 1);
        var exceptions = new[] { CreateException(UnknownReasonCode.Reachability) };

        var result = _service.CheckBudgetWithEscalation("prod", unknowns, exceptions);

        result.IsWithinBudget.Should().BeTrue();
        result.Message.Should().Contain("covered by approved exceptions");
    }

    [Fact]
    public void ShouldBlock_BlockAction_ReturnsTrue()
    {
        var result = new BudgetCheckResult
        {
            IsWithinBudget = false,
            RecommendedAction = BudgetAction.Block
        };

        _service.ShouldBlock(result).Should().BeTrue();
    }
}
```

**Acceptance Criteria**:

- [ ] Test for budget retrieval with fallback
- [ ] Test for within-budget success
- [ ] Test for total limit violation
- [ ] Test for per-reason limit violation
- [ ] Test for exception coverage
- [ ] Test for block action decision
- [ ] All tests pass

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Policy Team | Define UnknownBudget model |
| 2 | T2 | TODO | T1 | Policy Team | Create UnknownBudgetService |
| 3 | T3 | TODO | T2 | Policy Team | Implement budget checking logic |
| 4 | T4 | TODO | T1 | Policy Team | Add policy configuration |
| 5 | T5 | TODO | T2, T3 | Policy Team | Integrate with PolicyEvaluator |
| 6 | T6 | TODO | T5 | Policy Team | Add tests |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from MOAT Phase 2 gap analysis. Unknown budgets identified as requirement from Moat #5 advisory. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Environment-keyed budgets | Decision | Policy Team | Allows prod/stage/dev differentiation |
| BudgetAction enum | Decision | Policy Team | Block, Warn, WarnUnlessException provides flexibility |
| Exception coverage | Decision | Policy Team | Approved exceptions can override budget violations |
| Null totalLimit | Decision | Policy Team | Null means unlimited (no budget enforcement) |

---

## Success Criteria

- [ ] All 6 tasks marked DONE
- [ ] Budget configuration loads from YAML
- [ ] Policy evaluator respects budget limits
- [ ] Exceptions can cover violations
- [ ] 6+ budget-related tests passing
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds
||||||
675
docs/implplan/SPRINT_4100_0001_0003_unknowns_attestations.md
Normal file
675
docs/implplan/SPRINT_4100_0001_0003_unknowns_attestations.md
Normal file
@@ -0,0 +1,675 @@
|
|||||||
|
# Sprint 4100.0001.0003 · Unknowns in Attestations

## Topic & Scope

- Include unknown summaries in signed attestations
- Aggregate unknowns by reason code for policy predicates
- Enable attestation consumers to verify unknown handling

**Working directory:** `src/Attestor/__Libraries/StellaOps.Attestor.ProofChain/`

## Dependencies & Concurrency

- **Upstream**: Sprint 4100.0001.0001 (Reason-Coded Unknowns), Sprint 4100.0001.0002 (Unknown Budgets) — MUST BE DONE
- **Downstream**: Sprint 4100.0003.0001 (Risk Verdict Attestation)
- **Safe to parallelize with**: Sprint 4100.0002.0003, Sprint 4100.0004.0001

## Documentation Prerequisites

- Sprint 4100.0001.0001 completion (UnknownReasonCode enum)
- Sprint 4100.0001.0002 completion (UnknownBudget model)
- `src/Attestor/__Libraries/StellaOps.Attestor.ProofChain/AGENTS.md`
- `docs/product-advisories/19-Dec-2025 - Moat #5.md`

---

## Tasks

### T1: Define UnknownsSummary Model

**Assignee**: Attestor Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create a model for aggregated unknowns data to include in attestations.

**Implementation Path**: `Models/UnknownsSummary.cs` (new file)

**Model Definition**:
```csharp
namespace StellaOps.Attestor.ProofChain.Models;

/// <summary>
/// Aggregated summary of unknowns for inclusion in attestations.
/// Provides verifiable data about unknown risk handled during evaluation.
/// </summary>
public sealed record UnknownsSummary
{
    /// <summary>
    /// Total count of unknowns encountered.
    /// </summary>
    public int Total { get; init; }

    /// <summary>
    /// Count of unknowns by reason code.
    /// </summary>
    public IReadOnlyDictionary<string, int> ByReasonCode { get; init; }
        = new Dictionary<string, int>();

    /// <summary>
    /// Count of unknowns that would block if not excepted.
    /// </summary>
    public int BlockingCount { get; init; }

    /// <summary>
    /// Count of unknowns that are covered by approved exceptions.
    /// </summary>
    public int ExceptedCount { get; init; }

    /// <summary>
    /// Policy thresholds that were evaluated.
    /// </summary>
    public IReadOnlyList<string> PolicyThresholdsApplied { get; init; } = [];

    /// <summary>
    /// Exception IDs that were applied to cover unknowns.
    /// </summary>
    public IReadOnlyList<string> ExceptionsApplied { get; init; } = [];

    /// <summary>
    /// Hash of the unknowns list for integrity verification.
    /// </summary>
    public string? UnknownsDigest { get; init; }

    /// <summary>
    /// An empty summary for cases with no unknowns.
    /// </summary>
    public static UnknownsSummary Empty { get; } = new()
    {
        Total = 0,
        ByReasonCode = new Dictionary<string, int>(),
        BlockingCount = 0,
        ExceptedCount = 0
    };
}
```

**Acceptance Criteria**:

- [ ] `UnknownsSummary.cs` file created in `Models/` directory
- [ ] Total and per-reason-code counts included
- [ ] Blocking and excepted counts tracked
- [ ] Policy thresholds and exception IDs recorded
- [ ] Digest field for integrity verification
- [ ] Static `Empty` instance for convenience

---

### T2: Extend VerdictReceiptPayload

**Assignee**: Attestor Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Add an unknowns summary field to the verdict receipt statement payload.

**Implementation Path**: `Statements/VerdictReceiptStatement.cs`

**Updated Payload**:
```csharp
/// <summary>
/// Payload for verdict receipt attestation statement.
/// </summary>
public sealed record VerdictReceiptPayload
{
    // Existing fields
    public required string VerdictId { get; init; }
    public required string ArtifactDigest { get; init; }
    public required string PolicyRef { get; init; }
    public required VerdictStatus Status { get; init; }
    public required DateTimeOffset EvaluatedAt { get; init; }
    public IReadOnlyList<Finding> Findings { get; init; } = [];
    public IReadOnlyList<string> AppliedExceptions { get; init; } = [];

    // NEW: Unknowns summary
    /// <summary>
    /// Summary of unknowns encountered during evaluation.
    /// Included for transparency about uncertainty in the verdict.
    /// </summary>
    public UnknownsSummary? Unknowns { get; init; }

    // NEW: Knowledge snapshot reference
    /// <summary>
    /// Reference to the knowledge snapshot used for evaluation.
    /// Enables replay and verification of inputs.
    /// </summary>
    public string? KnowledgeSnapshotId { get; init; }
}
```

**JSON Schema Update**:
```json
{
  "type": "object",
  "properties": {
    "verdictId": { "type": "string" },
    "artifactDigest": { "type": "string" },
    "unknowns": {
      "type": "object",
      "properties": {
        "total": { "type": "integer" },
        "byReasonCode": {
          "type": "object",
          "additionalProperties": { "type": "integer" }
        },
        "blockingCount": { "type": "integer" },
        "exceptedCount": { "type": "integer" },
        "policyThresholdsApplied": {
          "type": "array",
          "items": { "type": "string" }
        },
        "exceptionsApplied": {
          "type": "array",
          "items": { "type": "string" }
        },
        "unknownsDigest": { "type": "string" }
      }
    },
    "knowledgeSnapshotId": { "type": "string" }
  }
}
```
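
For illustration, a receipt conforming to the schema above might carry an `unknowns` object like the following. All values are invented for the example, and the digest and snapshot ID are left as placeholders:

```json
{
  "unknowns": {
    "total": 4,
    "byReasonCode": { "Reachability": 2, "VexConflict": 2 },
    "blockingCount": 2,
    "exceptedCount": 2,
    "policyThresholdsApplied": ["Reachability:0"],
    "exceptionsApplied": ["EXC-2025-0042"],
    "unknownsDigest": "sha256:..."
  },
  "knowledgeSnapshotId": "..."
}
```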

**Acceptance Criteria**:

- [ ] `Unknowns` field added to `VerdictReceiptPayload`
- [ ] `KnowledgeSnapshotId` field added for replay support
- [ ] JSON schema updated with unknowns structure
- [ ] Field is nullable for backward compatibility
- [ ] Existing attestation tests still pass

---

### T3: Create UnknownsAggregator

**Assignee**: Attestor Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Implement a service to aggregate unknowns into summary format for attestations.

**Implementation Path**: `Services/UnknownsAggregator.cs` (new file)

**Implementation**:
```csharp
namespace StellaOps.Attestor.ProofChain.Services;

/// <summary>
/// Aggregates unknowns data into summary format for attestations.
/// </summary>
public sealed class UnknownsAggregator : IUnknownsAggregator
{
    private readonly IHasher _hasher;

    public UnknownsAggregator(IHasher hasher)
    {
        _hasher = hasher;
    }

    /// <summary>
    /// Creates an unknowns summary from evaluation results.
    /// </summary>
    public UnknownsSummary Aggregate(
        IReadOnlyList<UnknownItem> unknowns,
        BudgetCheckResult? budgetResult = null,
        IReadOnlyList<ExceptionRef>? exceptions = null)
    {
        if (unknowns.Count == 0)
            return UnknownsSummary.Empty;

        // Count by reason code
        var byReasonCode = unknowns
            .GroupBy(u => u.ReasonCode)
            .ToDictionary(g => g.Key, g => g.Count());

        // Calculate blocking count (would block without exceptions)
        var blockingCount = budgetResult?.Violations.Values.Sum(v => v.Count) ?? 0;

        // Calculate excepted count
        var exceptedCount = exceptions?.Count ?? 0;

        // Compute digest of the unknowns list for integrity
        var unknownsDigest = ComputeUnknownsDigest(unknowns);

        // Extract policy thresholds that were checked
        var thresholds = budgetResult?.Violations.Keys
            .Select(k => $"{k}:{budgetResult.Violations[k].Limit}")
            .ToList() ?? [];

        // Extract applied exception IDs
        var exceptionIds = exceptions?
            .Select(e => e.ExceptionId)
            .ToList() ?? [];

        return new UnknownsSummary
        {
            Total = unknowns.Count,
            ByReasonCode = byReasonCode,
            BlockingCount = blockingCount,
            ExceptedCount = exceptedCount,
            PolicyThresholdsApplied = thresholds,
            ExceptionsApplied = exceptionIds,
            UnknownsDigest = unknownsDigest
        };
    }

    /// <summary>
    /// Computes a deterministic digest of the unknowns list.
    /// </summary>
    private string ComputeUnknownsDigest(IReadOnlyList<UnknownItem> unknowns)
    {
        // Sort for determinism
        var sorted = unknowns
            .OrderBy(u => u.PackageUrl)
            .ThenBy(u => u.CveId)
            .ThenBy(u => u.ReasonCode)
            .ToList();

        // Serialize to canonical JSON
        var json = JsonSerializer.Serialize(sorted, new JsonSerializerOptions
        {
            PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
            WriteIndented = false
        });

        // Hash the serialized data
        return _hasher.ComputeSha256(json);
    }
}

/// <summary>
/// Input item for unknowns aggregation.
/// </summary>
public sealed record UnknownItem(
    string PackageUrl,
    string? CveId,
    string ReasonCode,
    string? RemediationHint);

/// <summary>
/// Reference to an applied exception.
/// </summary>
public sealed record ExceptionRef(
    string ExceptionId,
    string Status,
    IReadOnlyList<string> CoveredReasonCodes);

public interface IUnknownsAggregator
{
    UnknownsSummary Aggregate(
        IReadOnlyList<UnknownItem> unknowns,
        BudgetCheckResult? budgetResult = null,
        IReadOnlyList<ExceptionRef>? exceptions = null);
}
```
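
A minimal call sketch for the aggregator. It assumes an `IHasher` implementation (`hasher`) is available; the PURL and CVE values are invented for illustration:

```csharp
// hasher is an assumed IHasher implementation from DI.
var aggregator = new UnknownsAggregator(hasher);

var summary = aggregator.Aggregate(
    unknowns: new[]
    {
        // Example values; not taken from a real scan.
        new UnknownItem("pkg:npm/example@1.0.0", "CVE-2024-0001", "Reachability", null)
    },
    budgetResult: null,
    exceptions: null);

// summary.Total is 1 and summary.ByReasonCode["Reachability"] is 1;
// summary.UnknownsDigest is stable for the same items regardless of input order,
// because the aggregator sorts before hashing.
```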

**Acceptance Criteria**:

- [ ] `UnknownsAggregator.cs` created in `Services/`
- [ ] Aggregates unknowns by reason code
- [ ] Computes blocking and excepted counts
- [ ] Generates deterministic digest of unknowns
- [ ] Records policy thresholds and exception IDs
- [ ] Interface defined for DI

---

### T4: Update PolicyDecisionPredicate

**Assignee**: Attestor Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T2, T3

**Description**:
Include unknowns data in the policy decision predicate for attestation verification.

**Implementation Path**: `Predicates/PolicyDecisionPredicate.cs`

**Updated Predicate**:
```csharp
namespace StellaOps.Attestor.ProofChain.Predicates;

/// <summary>
/// Predicate type for policy decision attestations.
/// </summary>
public sealed record PolicyDecisionPredicate
{
    public const string PredicateType = "https://stella.ops/predicates/policy-decision@v2";

    // Existing fields
    public required string PolicyRef { get; init; }
    public required PolicyDecision Decision { get; init; }
    public required DateTimeOffset EvaluatedAt { get; init; }
    public IReadOnlyList<FindingSummary> Findings { get; init; } = [];

    // NEW: Unknowns handling
    /// <summary>
    /// Summary of unknowns and how they were handled.
    /// </summary>
    public UnknownsSummary? Unknowns { get; init; }

    /// <summary>
    /// Whether unknowns were a factor in the decision.
    /// </summary>
    public bool UnknownsAffectedDecision { get; init; }

    /// <summary>
    /// Reason codes that caused blocking (if any).
    /// </summary>
    public IReadOnlyList<string> BlockingReasonCodes { get; init; } = [];

    // NEW: Knowledge snapshot reference
    /// <summary>
    /// Content-addressed ID of the knowledge snapshot used.
    /// </summary>
    public string? KnowledgeSnapshotId { get; init; }
}

/// <summary>
/// Policy decision outcome.
/// </summary>
public enum PolicyDecision
{
    Pass,
    Fail,
    PassWithExceptions,
    Indeterminate
}
```

**Predicate Builder Update**:
```csharp
public PolicyDecisionPredicate Build(PolicyEvaluationResult result)
{
    var unknownsAffected = result.UnknownBudgetStatus?.IsExceeded == true ||
        result.FailureReason == PolicyFailureReason.UnknownBudgetExceeded;

    var blockingCodes = result.UnknownBudgetStatus?.Violations.Keys
        .Select(k => k.ToString())
        .ToList() ?? [];

    return new PolicyDecisionPredicate
    {
        PolicyRef = result.PolicyRef,
        Decision = MapDecision(result),
        EvaluatedAt = result.EvaluatedAt,
        Findings = result.Findings.Select(MapFinding).ToList(),
        Unknowns = _aggregator.Aggregate(result.Unknowns, result.UnknownBudgetStatus),
        UnknownsAffectedDecision = unknownsAffected,
        BlockingReasonCodes = blockingCodes,
        KnowledgeSnapshotId = result.KnowledgeSnapshotId
    };
}
```
|
||||||
|
|
||||||
|
**Acceptance Criteria**:

- [ ] Predicate version bumped to v2
- [ ] `Unknowns` field added with summary
- [ ] `UnknownsAffectedDecision` boolean flag
- [ ] `BlockingReasonCodes` list for failed verdicts
- [ ] `KnowledgeSnapshotId` for replay support
- [ ] Predicate builder uses aggregator

---

### T5: Add Attestation Tests

**Assignee**: Attestor Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T4

**Description**:
Add tests verifying unknowns are correctly included in signed attestations.

**Implementation Path**: `src/Attestor/__Tests/StellaOps.Attestor.ProofChain.Tests/`

**Test Cases**:

```csharp
public class UnknownsSummaryTests
{
    [Fact]
    public void Empty_ReturnsZeroCounts()
    {
        var summary = UnknownsSummary.Empty;

        summary.Total.Should().Be(0);
        summary.ByReasonCode.Should().BeEmpty();
        summary.BlockingCount.Should().Be(0);
    }
}

public class UnknownsAggregatorTests
{
    [Fact]
    public void Aggregate_GroupsByReasonCode()
    {
        var unknowns = new[]
        {
            new UnknownItem("pkg:npm/foo@1.0", null, "Reachability", null),
            new UnknownItem("pkg:npm/bar@1.0", null, "Reachability", null),
            new UnknownItem("pkg:npm/baz@1.0", null, "Identity", null)
        };

        var summary = _aggregator.Aggregate(unknowns);

        summary.Total.Should().Be(3);
        summary.ByReasonCode["Reachability"].Should().Be(2);
        summary.ByReasonCode["Identity"].Should().Be(1);
    }

    [Fact]
    public void Aggregate_ComputesDeterministicDigest()
    {
        var unknowns = CreateUnknowns();

        var summary1 = _aggregator.Aggregate(unknowns);
        var summary2 = _aggregator.Aggregate(unknowns.Reverse().ToList());

        summary1.UnknownsDigest.Should().Be(summary2.UnknownsDigest);
    }

    [Fact]
    public void Aggregate_IncludesExceptionIds()
    {
        var unknowns = CreateUnknowns();
        var exceptions = new[]
        {
            new ExceptionRef("EXC-001", "Approved", new[] { "Reachability" })
        };

        var summary = _aggregator.Aggregate(unknowns, null, exceptions);

        summary.ExceptionsApplied.Should().Contain("EXC-001");
        summary.ExceptedCount.Should().Be(1);
    }
}

public class VerdictReceiptStatementTests
{
    [Fact]
    public void CreateStatement_IncludesUnknownsSummary()
    {
        var result = CreateEvaluationResult(unknownsCount: 5);

        var statement = _builder.Build(result);

        statement.Predicate.Unknowns.Should().NotBeNull();
        statement.Predicate.Unknowns.Total.Should().Be(5);
    }

    [Fact]
    public void CreateStatement_SignatureCoversUnknowns()
    {
        var result = CreateEvaluationResult(unknownsCount: 5);

        var envelope = _signer.SignStatement(result);

        // Modify unknowns and verify signature fails
        var tampered = envelope with
        {
            Payload = ModifyUnknownsCount(envelope.Payload, 0)
        };

        _verifier.Verify(tampered).Should().BeFalse();
    }
}
```

**Acceptance Criteria**:

- [ ] Test for empty summary creation
- [ ] Test for reason code grouping
- [ ] Test for deterministic digest computation
- [ ] Test for exception ID inclusion
- [ ] Test for unknowns in statement payload
- [ ] Test that signature covers unknowns data
- [ ] All 6+ tests pass
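
The determinism property exercised by `Aggregate_ComputesDeterministicDigest` can be sketched language-agnostically: if the aggregator sorts unknowns by their canonical serialization before hashing, input order cannot change the digest. A minimal Python stand-in (field names and the SHA-256 choice mirror the sprint's conventions, but this is an illustration, not the aggregator's actual code):

```python
import hashlib
import json

def unknowns_digest(unknowns):
    """Order-independent SHA-256 digest of an unknowns list.

    Each unknown is a dict (e.g. {"purl": ..., "reasonCode": ...}).
    Canonicalizing each entry and sorting before hashing makes the
    digest independent of the order the unknowns arrived in.
    """
    canonical = sorted(
        json.dumps(u, sort_keys=True, separators=(",", ":")) for u in unknowns
    )
    return hashlib.sha256("\n".join(canonical).encode("utf-8")).hexdigest()

items = [
    {"purl": "pkg:npm/foo@1.0", "reasonCode": "Reachability"},
    {"purl": "pkg:npm/baz@1.0", "reasonCode": "Identity"},
]
# Reversing the input must not change the digest.
assert unknowns_digest(items) == unknowns_digest(list(reversed(items)))
```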

---

### T6: Update Predicate Schema

**Assignee**: Attestor Team
**Story Points**: 1
**Status**: TODO
**Dependencies**: T4

**Description**:
Update the JSON schema documentation for the policy decision predicate.

**Implementation Path**: `docs/api/predicates/policy-decision-v2.schema.json`

**Schema Documentation**:

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "$id": "https://stella.ops/predicates/policy-decision@v2",
  "title": "Policy Decision Predicate v2",
  "description": "Attestation predicate for policy evaluation decisions, including unknowns handling.",
  "type": "object",
  "required": ["policyRef", "decision", "evaluatedAt"],
  "properties": {
    "policyRef": {
      "type": "string",
      "description": "Reference to the policy that was evaluated"
    },
    "decision": {
      "type": "string",
      "enum": ["Pass", "Fail", "PassWithExceptions", "Indeterminate"],
      "description": "Final policy decision"
    },
    "evaluatedAt": {
      "type": "string",
      "format": "date-time",
      "description": "ISO-8601 timestamp of evaluation"
    },
    "unknowns": {
      "type": "object",
      "description": "Summary of unknowns encountered during evaluation",
      "properties": {
        "total": {
          "type": "integer",
          "minimum": 0,
          "description": "Total count of unknowns"
        },
        "byReasonCode": {
          "type": "object",
          "additionalProperties": { "type": "integer" },
          "description": "Count per reason code (Reachability, Identity, etc.)"
        },
        "blockingCount": {
          "type": "integer",
          "minimum": 0,
          "description": "Count that would block without exceptions"
        },
        "exceptedCount": {
          "type": "integer",
          "minimum": 0,
          "description": "Count covered by approved exceptions"
        },
        "unknownsDigest": {
          "type": "string",
          "description": "SHA-256 digest of unknowns list"
        }
      }
    },
    "unknownsAffectedDecision": {
      "type": "boolean",
      "description": "Whether unknowns influenced the decision"
    },
    "blockingReasonCodes": {
      "type": "array",
      "items": { "type": "string" },
      "description": "Reason codes that caused blocking"
    },
    "knowledgeSnapshotId": {
      "type": "string",
      "description": "Content-addressed ID of knowledge snapshot"
    }
  }
}
```

**Acceptance Criteria**:

- [ ] Schema file created at `docs/api/predicates/`
- [ ] All new fields documented
- [ ] Schema validates against sample payloads
- [ ] Version bump to v2 documented
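
The "schema validates against sample payloads" criterion can be smoke-tested without tooling. Below is a minimal Python stand-in for the schema's required fields, decision enum, and `unknowns.total >= 0` constraints (a real check should run the actual schema file through a Draft 2020-12 JSON Schema validator; the payload values here are illustrative):

```python
REQUIRED = ("policyRef", "decision", "evaluatedAt")
DECISIONS = {"Pass", "Fail", "PassWithExceptions", "Indeterminate"}

def check_predicate(payload: dict) -> list:
    """Return a list of violations of the v2 predicate constraints."""
    errors = [f"missing required field: {f}" for f in REQUIRED if f not in payload]
    if "decision" in payload and payload["decision"] not in DECISIONS:
        errors.append(f"invalid decision: {payload['decision']}")
    total = payload.get("unknowns", {}).get("total", 0)
    if not isinstance(total, int) or total < 0:
        errors.append("unknowns.total must be a non-negative integer")
    return errors

valid = {
    "policyRef": "policy:default",
    "decision": "PassWithExceptions",
    "evaluatedAt": "2025-12-21T00:00:00Z",
    "unknowns": {"total": 5},
    "unknownsAffectedDecision": True,
}
assert check_predicate(valid) == []
assert check_predicate({"decision": "Maybe"})  # missing fields, bad enum
```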

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Attestor Team | Define UnknownsSummary model |
| 2 | T2 | TODO | T1 | Attestor Team | Extend VerdictReceiptPayload |
| 3 | T3 | TODO | T1 | Attestor Team | Create UnknownsAggregator |
| 4 | T4 | TODO | T2, T3 | Attestor Team | Update PolicyDecisionPredicate |
| 5 | T5 | TODO | T4 | Attestor Team | Add attestation tests |
| 6 | T6 | TODO | T4 | Attestor Team | Update predicate schema |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from MOAT Phase 2 gap analysis. Unknowns in attestations identified as requirement from Moat #5 advisory. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Predicate version bump | Decision | Attestor Team | v1 → v2 for backward compatibility tracking |
| Deterministic digest | Decision | Attestor Team | Enables tamper detection of unknowns list |
| String reason codes | Decision | Attestor Team | Using strings instead of enums for JSON flexibility |
| Nullable unknowns | Decision | Attestor Team | Allows backward compatibility with v1 payloads |

---

## Success Criteria

- [ ] All 6 tasks marked DONE
- [ ] Unknowns summary included in attestations
- [ ] Predicate schema v2 documented
- [ ] Aggregator computes deterministic digests
- [ ] 6+ attestation tests passing
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds
# Sprint 4100.0002.0001 · Knowledge Snapshot Manifest

## Topic & Scope

- Define unified content-addressed manifest for knowledge snapshots
- Enable deterministic capture of all evaluation inputs
- Support time-travel replay by freezing knowledge state

**Working directory:** `src/Policy/__Libraries/StellaOps.Policy/Snapshots/`

## Dependencies & Concurrency

- **Upstream**: None (first sprint in batch)
- **Downstream**: Sprint 4100.0002.0002 (Replay Engine), Sprint 4100.0002.0003 (Snapshot Export/Import), Sprint 4100.0004.0001 (Security State Delta)
- **Safe to parallelize with**: Sprint 4100.0001.0001, Sprint 4100.0003.0001, Sprint 4100.0004.0002

## Documentation Prerequisites

- `src/Policy/__Libraries/StellaOps.Policy/AGENTS.md`
- `docs/product-advisories/20-Dec-2025 - Moat Explanation - Knowledge Snapshots and Time‑Travel Replay.md`
- `docs/product-advisories/19-Dec-2025 - Moat #2.md` (Risk Verdict Attestation)

---

## Tasks

### T1: Define KnowledgeSnapshotManifest

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: —

**Description**:
Create the unified manifest structure for knowledge snapshots.

**Implementation Path**: `Snapshots/KnowledgeSnapshotManifest.cs` (new file)

**Model Definition**:

```csharp
namespace StellaOps.Policy.Snapshots;

/// <summary>
/// Unified manifest for a knowledge snapshot.
/// Content-addressed bundle capturing all inputs to a policy evaluation.
/// </summary>
public sealed record KnowledgeSnapshotManifest
{
    /// <summary>
    /// Content-addressed snapshot ID: ksm:sha256:{hash}
    /// </summary>
    public required string SnapshotId { get; init; }

    /// <summary>
    /// When this snapshot was created (UTC).
    /// </summary>
    public required DateTimeOffset CreatedAt { get; init; }

    /// <summary>
    /// Engine version that created this snapshot.
    /// </summary>
    public required EngineInfo Engine { get; init; }

    /// <summary>
    /// Plugins/analyzers active during snapshot creation.
    /// </summary>
    public IReadOnlyList<PluginInfo> Plugins { get; init; } = [];

    /// <summary>
    /// Reference to the policy bundle used.
    /// </summary>
    public required PolicyBundleRef Policy { get; init; }

    /// <summary>
    /// Reference to the scoring rules used.
    /// </summary>
    public required ScoringRulesRef Scoring { get; init; }

    /// <summary>
    /// Reference to the trust bundle (root certificates, VEX publishers).
    /// </summary>
    public TrustBundleRef? Trust { get; init; }

    /// <summary>
    /// Knowledge sources included in this snapshot.
    /// </summary>
    public required IReadOnlyList<KnowledgeSourceDescriptor> Sources { get; init; }

    /// <summary>
    /// Determinism profile for environment reproducibility.
    /// </summary>
    public DeterminismProfile? Environment { get; init; }

    /// <summary>
    /// Optional DSSE signature over the manifest.
    /// </summary>
    public string? Signature { get; init; }

    /// <summary>
    /// Manifest format version.
    /// </summary>
    public string ManifestVersion { get; init; } = "1.0";
}

/// <summary>
/// Engine version information.
/// </summary>
public sealed record EngineInfo(
    string Name,
    string Version,
    string Commit);

/// <summary>
/// Plugin/analyzer information.
/// </summary>
public sealed record PluginInfo(
    string Name,
    string Version,
    string Type);

/// <summary>
/// Reference to a policy bundle.
/// </summary>
public sealed record PolicyBundleRef(
    string PolicyId,
    string Digest,
    string? Uri);

/// <summary>
/// Reference to scoring rules.
/// </summary>
public sealed record ScoringRulesRef(
    string RulesId,
    string Digest,
    string? Uri);

/// <summary>
/// Reference to trust bundle.
/// </summary>
public sealed record TrustBundleRef(
    string BundleId,
    string Digest,
    string? Uri);

/// <summary>
/// Determinism profile for environment capture.
/// </summary>
public sealed record DeterminismProfile(
    string TimezoneOffset,
    string Locale,
    string Platform,
    IReadOnlyDictionary<string, string> EnvironmentVars);
```

**Acceptance Criteria**:

- [ ] `KnowledgeSnapshotManifest.cs` file created in `Snapshots/` directory
- [ ] All component records defined (EngineInfo, PluginInfo, etc.)
- [ ] SnapshotId uses content-addressed format `ksm:sha256:{hash}`
- [ ] Manifest is immutable (all init-only properties)
- [ ] XML documentation on all types
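
For orientation, a hypothetical serialized manifest matching the model above, using the camelCase naming the serializer options elsewhere in this batch apply. All values are illustrative placeholders (the `<…>` digests are not real hashes):

```json
{
  "snapshotId": "ksm:sha256:<64-hex-digest>",
  "createdAt": "2025-12-21T00:00:00Z",
  "engine": { "name": "stellaops-policy", "version": "4.1.0", "commit": "<git-sha>" },
  "plugins": [ { "name": "reachability-analyzer", "version": "1.2.0", "type": "analyzer" } ],
  "policy": { "policyId": "policy:default", "digest": "sha256:<digest>", "uri": null },
  "scoring": { "rulesId": "scoring:default", "digest": "sha256:<digest>", "uri": null },
  "sources": [
    {
      "name": "nvd",
      "type": "advisory-feed",
      "epoch": "2025-12-20",
      "digest": "sha256:<digest>",
      "origin": "https://nvd.nist.gov",
      "inclusionMode": "Referenced"
    }
  ],
  "manifestVersion": "1.0"
}
```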

---

### T2: Define KnowledgeSourceDescriptor

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create a model describing each knowledge source in the snapshot.

**Implementation Path**: `Snapshots/KnowledgeSourceDescriptor.cs` (new file)

**Model Definition**:

```csharp
namespace StellaOps.Policy.Snapshots;

/// <summary>
/// Descriptor for a knowledge source included in a snapshot.
/// </summary>
public sealed record KnowledgeSourceDescriptor
{
    /// <summary>
    /// Unique name of the source (e.g., "nvd", "osv", "vendor-vex").
    /// </summary>
    public required string Name { get; init; }

    /// <summary>
    /// Type of source: "advisory-feed", "vex", "sbom", "reachability", "policy".
    /// </summary>
    public required string Type { get; init; }

    /// <summary>
    /// Epoch or version of the source data.
    /// </summary>
    public required string Epoch { get; init; }

    /// <summary>
    /// Content digest of the source data.
    /// </summary>
    public required string Digest { get; init; }

    /// <summary>
    /// Origin URI where this source was fetched from.
    /// </summary>
    public string? Origin { get; init; }

    /// <summary>
    /// When this source was last updated.
    /// </summary>
    public DateTimeOffset? LastUpdatedAt { get; init; }

    /// <summary>
    /// Record count or entry count in this source.
    /// </summary>
    public int? RecordCount { get; init; }

    /// <summary>
    /// Whether this source is bundled (embedded) or referenced.
    /// </summary>
    public SourceInclusionMode InclusionMode { get; init; } = SourceInclusionMode.Referenced;

    /// <summary>
    /// Relative path within the snapshot bundle (if bundled).
    /// </summary>
    public string? BundlePath { get; init; }
}

/// <summary>
/// How a source is included in the snapshot.
/// </summary>
public enum SourceInclusionMode
{
    /// <summary>
    /// Source is referenced by digest only (requires external fetch for replay).
    /// </summary>
    Referenced,

    /// <summary>
    /// Source content is embedded in the snapshot bundle.
    /// </summary>
    Bundled,

    /// <summary>
    /// Source is bundled and compressed.
    /// </summary>
    BundledCompressed
}
```

**Acceptance Criteria**:

- [ ] `KnowledgeSourceDescriptor.cs` file created
- [ ] Source types defined: advisory-feed, vex, sbom, reachability, policy
- [ ] Inclusion modes defined: Referenced, Bundled, BundledCompressed
- [ ] Digest and epoch for content addressing
- [ ] Optional bundle path for embedded sources

---

### T3: Create SnapshotBuilder

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Implement a fluent API for constructing snapshot manifests.

**Implementation Path**: `Snapshots/SnapshotBuilder.cs` (new file)

**Implementation**:

```csharp
using System.Globalization;
using System.Text.Json;
using System.Text.Json.Serialization;

namespace StellaOps.Policy.Snapshots;

/// <summary>
/// Fluent builder for constructing knowledge snapshot manifests.
/// </summary>
public sealed class SnapshotBuilder
{
    private readonly List<KnowledgeSourceDescriptor> _sources = [];
    private readonly List<PluginInfo> _plugins = [];
    private EngineInfo? _engine;
    private PolicyBundleRef? _policy;
    private ScoringRulesRef? _scoring;
    private TrustBundleRef? _trust;
    private DeterminismProfile? _environment;
    private readonly IHasher _hasher;

    public SnapshotBuilder(IHasher hasher)
    {
        _hasher = hasher;
    }

    public SnapshotBuilder WithEngine(string name, string version, string commit)
    {
        _engine = new EngineInfo(name, version, commit);
        return this;
    }

    public SnapshotBuilder WithPlugin(string name, string version, string type)
    {
        _plugins.Add(new PluginInfo(name, version, type));
        return this;
    }

    public SnapshotBuilder WithPolicy(string policyId, string digest, string? uri = null)
    {
        _policy = new PolicyBundleRef(policyId, digest, uri);
        return this;
    }

    public SnapshotBuilder WithScoring(string rulesId, string digest, string? uri = null)
    {
        _scoring = new ScoringRulesRef(rulesId, digest, uri);
        return this;
    }

    public SnapshotBuilder WithTrust(string bundleId, string digest, string? uri = null)
    {
        _trust = new TrustBundleRef(bundleId, digest, uri);
        return this;
    }

    public SnapshotBuilder WithSource(KnowledgeSourceDescriptor source)
    {
        _sources.Add(source);
        return this;
    }

    public SnapshotBuilder WithAdvisoryFeed(
        string name, string epoch, string digest, string? origin = null)
    {
        _sources.Add(new KnowledgeSourceDescriptor
        {
            Name = name,
            Type = "advisory-feed",
            Epoch = epoch,
            Digest = digest,
            Origin = origin
        });
        return this;
    }

    public SnapshotBuilder WithVex(string name, string digest, string? origin = null)
    {
        _sources.Add(new KnowledgeSourceDescriptor
        {
            Name = name,
            Type = "vex",
            Epoch = DateTimeOffset.UtcNow.ToString("o"),
            Digest = digest,
            Origin = origin
        });
        return this;
    }

    public SnapshotBuilder WithEnvironment(DeterminismProfile environment)
    {
        _environment = environment;
        return this;
    }

    public SnapshotBuilder CaptureCurrentEnvironment()
    {
        _environment = new DeterminismProfile(
            TimezoneOffset: TimeZoneInfo.Local.BaseUtcOffset.ToString(),
            Locale: CultureInfo.CurrentCulture.Name,
            Platform: Environment.OSVersion.ToString(),
            EnvironmentVars: new Dictionary<string, string>());
        return this;
    }

    /// <summary>
    /// Builds the manifest and computes the content-addressed ID.
    /// </summary>
    public KnowledgeSnapshotManifest Build()
    {
        if (_engine is null)
            throw new InvalidOperationException("Engine info is required");
        if (_policy is null)
            throw new InvalidOperationException("Policy reference is required");
        if (_scoring is null)
            throw new InvalidOperationException("Scoring reference is required");
        if (_sources.Count == 0)
            throw new InvalidOperationException("At least one source is required");

        // Create manifest without ID first
        var manifest = new KnowledgeSnapshotManifest
        {
            SnapshotId = "", // Placeholder
            CreatedAt = DateTimeOffset.UtcNow,
            Engine = _engine,
            Plugins = _plugins.ToList(),
            Policy = _policy,
            Scoring = _scoring,
            Trust = _trust,
            // Ordinal comparison keeps the sort culture-independent
            Sources = _sources.OrderBy(s => s.Name, StringComparer.Ordinal).ToList(),
            Environment = _environment
        };

        // Compute content-addressed ID
        var snapshotId = ComputeSnapshotId(manifest);

        return manifest with { SnapshotId = snapshotId };
    }

    private string ComputeSnapshotId(KnowledgeSnapshotManifest manifest)
    {
        // Serialize to a stable JSON form (fixed property order, no whitespace);
        // options must match SnapshotIdGenerator so IDs validate consistently
        var json = JsonSerializer.Serialize(manifest with { SnapshotId = "" },
            new JsonSerializerOptions
            {
                PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
                WriteIndented = false,
                DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
                Encoder = System.Text.Encodings.Web.JavaScriptEncoder.UnsafeRelaxedJsonEscaping
            });

        var hash = _hasher.ComputeSha256(json);
        return $"ksm:sha256:{hash}";
    }
}
```

**Acceptance Criteria**:

- [ ] `SnapshotBuilder.cs` created in `Snapshots/`
- [ ] Fluent API for all manifest components
- [ ] Validation on Build() for required fields
- [ ] Content-addressed ID computed from manifest hash
- [ ] Sources sorted for determinism
- [ ] Environment capture helper method

---

### T4: Implement Content-Addressed ID

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T3

**Description**:
Ensure snapshot ID is deterministically computed from manifest content.

**Implementation Path**: `Snapshots/SnapshotIdGenerator.cs` (new file)

**Implementation**:

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

namespace StellaOps.Policy.Snapshots;

/// <summary>
/// Generates and validates content-addressed snapshot IDs.
/// </summary>
public sealed class SnapshotIdGenerator : ISnapshotIdGenerator
{
    private const string Prefix = "ksm:sha256:";
    private readonly IHasher _hasher;

    public SnapshotIdGenerator(IHasher hasher)
    {
        _hasher = hasher;
    }

    /// <summary>
    /// Generates a content-addressed ID for a manifest.
    /// </summary>
    public string GenerateId(KnowledgeSnapshotManifest manifest)
    {
        var canonicalJson = ToCanonicalJson(manifest with { SnapshotId = "", Signature = null });
        var hash = _hasher.ComputeSha256(canonicalJson);
        return $"{Prefix}{hash}";
    }

    /// <summary>
    /// Validates that a manifest's ID matches its content.
    /// </summary>
    public bool ValidateId(KnowledgeSnapshotManifest manifest)
    {
        var expectedId = GenerateId(manifest);
        return manifest.SnapshotId == expectedId;
    }

    /// <summary>
    /// Parses a snapshot ID into its components.
    /// </summary>
    public SnapshotIdComponents? ParseId(string snapshotId)
    {
        if (!snapshotId.StartsWith(Prefix, StringComparison.Ordinal))
            return null;

        var hash = snapshotId[Prefix.Length..];
        if (hash.Length != 64) // SHA-256 hex length
            return null;

        return new SnapshotIdComponents("sha256", hash);
    }

    private static string ToCanonicalJson(KnowledgeSnapshotManifest manifest)
    {
        return JsonSerializer.Serialize(manifest, new JsonSerializerOptions
        {
            PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
            WriteIndented = false,
            DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
            Encoder = System.Text.Encodings.Web.JavaScriptEncoder.UnsafeRelaxedJsonEscaping
        });
    }
}

public sealed record SnapshotIdComponents(string Algorithm, string Hash);

public interface ISnapshotIdGenerator
{
    string GenerateId(KnowledgeSnapshotManifest manifest);
    bool ValidateId(KnowledgeSnapshotManifest manifest);
    SnapshotIdComponents? ParseId(string snapshotId);
}
```

**Acceptance Criteria**:

- [ ] `SnapshotIdGenerator.cs` created
- [ ] ID format: `ksm:sha256:{64-char-hex}`
- [ ] ID excludes signature field from hash
- [ ] Validation method confirms ID matches content
- [ ] Parse method extracts algorithm and hash
- [ ] Interface defined for DI
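
The generate/validate/parse round trip above can be sketched compactly in Python. This is a simplified stand-in (Python's `json.dumps` with sorted keys is not the same canonical form as the C# serializer options), but it shows the same contract: hash the manifest with its ID and signature blanked, prefix with `ksm:sha256:`, and validate by recomputing:

```python
import hashlib
import json

PREFIX = "ksm:sha256:"

def generate_id(manifest: dict) -> str:
    """Content-addressed ID over canonical JSON, excluding the
    snapshotId and signature fields and dropping null values."""
    body = {k: v for k, v in manifest.items()
            if k not in ("snapshotId", "signature") and v is not None}
    canonical = json.dumps(body, sort_keys=True, separators=(",", ":"))
    return PREFIX + hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def validate_id(manifest: dict) -> bool:
    """A manifest is valid when its stored ID matches its content."""
    return manifest.get("snapshotId") == generate_id(manifest)

def parse_id(snapshot_id: str):
    """Split 'ksm:sha256:<64-hex>' into (algorithm, hash), or None."""
    if not snapshot_id.startswith(PREFIX):
        return None
    digest = snapshot_id[len(PREFIX):]
    return ("sha256", digest) if len(digest) == 64 else None

m = {"snapshotId": "", "policy": {"policyId": "p", "digest": "d"}}
m["snapshotId"] = generate_id(m)
assert validate_id(m)  # ID matches content
```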

---

### T5: Create SnapshotService

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T3, T4

**Description**:
Implement service for creating, sealing, and verifying snapshots.

**Implementation Path**: `Snapshots/SnapshotService.cs` (new file)

**Implementation**:

```csharp
using System.Text.Json;
using Microsoft.Extensions.Logging;

namespace StellaOps.Policy.Snapshots;

/// <summary>
/// Service for managing knowledge snapshots.
/// </summary>
public sealed class SnapshotService : ISnapshotService
{
    private readonly ISnapshotIdGenerator _idGenerator;
    private readonly ISigner _signer;
    private readonly ISnapshotStore _store;
    private readonly ILogger<SnapshotService> _logger;

    public SnapshotService(
        ISnapshotIdGenerator idGenerator,
        ISigner signer,
        ISnapshotStore store,
        ILogger<SnapshotService> logger)
    {
        _idGenerator = idGenerator;
        _signer = signer;
        _store = store;
        _logger = logger;
    }

    /// <summary>
    /// Creates and persists a new snapshot.
    /// </summary>
    public async Task<KnowledgeSnapshotManifest> CreateSnapshotAsync(
        SnapshotBuilder builder,
        CancellationToken ct = default)
    {
        var manifest = builder.Build();

        // Validate ID before storing
        if (!_idGenerator.ValidateId(manifest))
            throw new InvalidOperationException("Snapshot ID validation failed");

        await _store.SaveAsync(manifest, ct);

        _logger.LogInformation("Created snapshot {SnapshotId}", manifest.SnapshotId);

        return manifest;
    }

    /// <summary>
    /// Seals a snapshot with a DSSE signature.
    /// </summary>
    public async Task<KnowledgeSnapshotManifest> SealSnapshotAsync(
        KnowledgeSnapshotManifest manifest,
        CancellationToken ct = default)
    {
        var payload = JsonSerializer.SerializeToUtf8Bytes(manifest with { Signature = null });
        var signature = await _signer.SignAsync(payload, ct);

        // "sealed" is a C# keyword, so use a different local name
        var sealedManifest = manifest with { Signature = signature };

        await _store.SaveAsync(sealedManifest, ct);

        _logger.LogInformation("Sealed snapshot {SnapshotId}", manifest.SnapshotId);

        return sealedManifest;
    }

    /// <summary>
    /// Verifies a snapshot's integrity and signature.
    /// </summary>
    public async Task<SnapshotVerificationResult> VerifySnapshotAsync(
        KnowledgeSnapshotManifest manifest,
        CancellationToken ct = default)
    {
        // Verify content-addressed ID
        if (!_idGenerator.ValidateId(manifest))
        {
            return SnapshotVerificationResult.Fail("Snapshot ID does not match content");
        }

        // Verify signature if present
        if (manifest.Signature is not null)
        {
            var payload = JsonSerializer.SerializeToUtf8Bytes(manifest with { Signature = null });
            var sigValid = await _signer.VerifyAsync(payload, manifest.Signature, ct);

            if (!sigValid)
            {
                return SnapshotVerificationResult.Fail("Signature verification failed");
            }
        }

        return SnapshotVerificationResult.Success();
    }

    /// <summary>
    /// Retrieves a snapshot by ID.
    /// </summary>
    public async Task<KnowledgeSnapshotManifest?> GetSnapshotAsync(
        string snapshotId,
        CancellationToken ct = default)
    {
        return await _store.GetAsync(snapshotId, ct);
    }
}

public sealed record SnapshotVerificationResult(bool IsValid, string? Error)
{
    public static SnapshotVerificationResult Success() => new(true, null);
    public static SnapshotVerificationResult Fail(string error) => new(false, error);
}

public interface ISnapshotService
{
    Task<KnowledgeSnapshotManifest> CreateSnapshotAsync(SnapshotBuilder builder, CancellationToken ct = default);
    Task<KnowledgeSnapshotManifest> SealSnapshotAsync(KnowledgeSnapshotManifest manifest, CancellationToken ct = default);
    Task<SnapshotVerificationResult> VerifySnapshotAsync(KnowledgeSnapshotManifest manifest, CancellationToken ct = default);
    Task<KnowledgeSnapshotManifest?> GetSnapshotAsync(string snapshotId, CancellationToken ct = default);
}

public interface ISnapshotStore
|
||||||
|
{
|
||||||
|
Task SaveAsync(KnowledgeSnapshotManifest manifest, CancellationToken ct = default);
|
||||||
|
Task<KnowledgeSnapshotManifest?> GetAsync(string snapshotId, CancellationToken ct = default);
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] `SnapshotService.cs` created in `Snapshots/`
|
||||||
|
- [ ] Create, seal, verify, and get operations
|
||||||
|
- [ ] Sealing adds DSSE signature
|
||||||
|
- [ ] Verification checks ID and signature
|
||||||
|
- [ ] Store interface for persistence abstraction
|
||||||
|
- [ ] Logging for observability
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### T6: Integrate with PolicyEvaluator

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T5

**Description**:
Bind policy evaluation to a knowledge snapshot for reproducibility.

**Implementation Path**: `src/Policy/StellaOps.Policy.Engine/Services/PolicyEvaluator.cs`

**Integration**:
```csharp
public sealed class PolicyEvaluator
{
    private readonly ISnapshotService _snapshotService;

    /// <summary>
    /// Evaluates policy with an explicit knowledge snapshot.
    /// </summary>
    public async Task<PolicyEvaluationResult> EvaluateWithSnapshotAsync(
        PolicyEvaluationRequest request,
        KnowledgeSnapshotManifest snapshot,
        CancellationToken ct = default)
    {
        // Verify snapshot before use
        var verification = await _snapshotService.VerifySnapshotAsync(snapshot, ct);
        if (!verification.IsValid)
        {
            return PolicyEvaluationResult.Fail(
                PolicyFailureReason.InvalidSnapshot,
                verification.Error);
        }

        // Bind evaluation to snapshot sources
        var context = await CreateEvaluationContext(request, snapshot, ct);

        // Perform evaluation with frozen inputs
        var result = await EvaluateInternalAsync(context, ct);

        // Include snapshot reference in result
        return result with
        {
            KnowledgeSnapshotId = snapshot.SnapshotId,
            SnapshotCreatedAt = snapshot.CreatedAt
        };
    }

    /// <summary>
    /// Creates a snapshot capturing current knowledge state.
    /// </summary>
    public async Task<KnowledgeSnapshotManifest> CaptureCurrentSnapshotAsync(
        CancellationToken ct = default)
    {
        var builder = new SnapshotBuilder(_hasher)
            .WithEngine("StellaOps.Policy", _version, _commit)
            .WithPolicy(_policyRef.Id, _policyRef.Digest)
            .WithScoring(_scoringRef.Id, _scoringRef.Digest);

        // Add all active knowledge sources
        foreach (var source in await _knowledgeSourceProvider.GetActiveSourcesAsync(ct))
        {
            builder.WithSource(source);
        }

        builder.CaptureCurrentEnvironment();

        return await _snapshotService.CreateSnapshotAsync(builder, ct);
    }
}

// Extended result
public sealed record PolicyEvaluationResult
{
    // Existing fields...
    public string? KnowledgeSnapshotId { get; init; }
    public DateTimeOffset? SnapshotCreatedAt { get; init; }
}
```

**Acceptance Criteria**:
- [ ] `EvaluateWithSnapshotAsync` method added
- [ ] Snapshot verification before evaluation
- [ ] Evaluation bound to snapshot sources
- [ ] `CaptureCurrentSnapshotAsync` for snapshot creation
- [ ] Result includes snapshot reference
- [ ] `InvalidSnapshot` failure reason added

---

### T7: Add Tests

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T6

**Description**:
Add comprehensive tests for snapshot determinism and integrity.

**Implementation Path**: `src/Policy/__Tests/StellaOps.Policy.Tests/Snapshots/`

**Test Cases**:
```csharp
public class SnapshotBuilderTests
{
    [Fact]
    public void Build_ValidInputs_CreatesManifest()
    {
        var builder = new SnapshotBuilder(_hasher)
            .WithEngine("test", "1.0", "abc123")
            .WithPolicy("policy-1", "sha256:xxx")
            .WithScoring("scoring-1", "sha256:yyy")
            .WithAdvisoryFeed("nvd", "2025-12-21", "sha256:zzz");

        var manifest = builder.Build();

        manifest.SnapshotId.Should().StartWith("ksm:sha256:");
        manifest.Sources.Should().HaveCount(1);
    }

    [Fact]
    public void Build_MissingEngine_Throws()
    {
        var builder = new SnapshotBuilder(_hasher)
            .WithPolicy("policy-1", "sha256:xxx")
            .WithScoring("scoring-1", "sha256:yyy");

        var act = () => builder.Build();

        act.Should().Throw<InvalidOperationException>();
    }
}

public class SnapshotIdGeneratorTests
{
    [Fact]
    public void GenerateId_DeterministicForSameContent()
    {
        var manifest = CreateTestManifest();

        var id1 = _generator.GenerateId(manifest);
        var id2 = _generator.GenerateId(manifest);

        id1.Should().Be(id2);
    }

    [Fact]
    public void GenerateId_DifferentForDifferentContent()
    {
        var manifest1 = CreateTestManifest() with { CreatedAt = DateTimeOffset.UtcNow };
        var manifest2 = CreateTestManifest() with { CreatedAt = DateTimeOffset.UtcNow.AddSeconds(1) };

        var id1 = _generator.GenerateId(manifest1);
        var id2 = _generator.GenerateId(manifest2);

        id1.Should().NotBe(id2);
    }

    [Fact]
    public void ValidateId_ValidManifest_ReturnsTrue()
    {
        var manifest = new SnapshotBuilder(_hasher)
            .WithEngine("test", "1.0", "abc")
            .WithPolicy("p", "sha256:x")
            .WithScoring("s", "sha256:y")
            .WithAdvisoryFeed("nvd", "2025", "sha256:z")
            .Build();

        _generator.ValidateId(manifest).Should().BeTrue();
    }

    [Fact]
    public void ValidateId_TamperedManifest_ReturnsFalse()
    {
        var manifest = CreateTestManifest();
        var tampered = manifest with { Policy = manifest.Policy with { Digest = "sha256:tampered" } };

        _generator.ValidateId(tampered).Should().BeFalse();
    }
}

public class SnapshotServiceTests
{
    [Fact]
    public async Task CreateSnapshot_PersistsManifest()
    {
        var builder = CreateBuilder();

        var manifest = await _service.CreateSnapshotAsync(builder);

        var retrieved = await _service.GetSnapshotAsync(manifest.SnapshotId);
        retrieved.Should().NotBeNull();
    }

    [Fact]
    public async Task SealSnapshot_AddsSignature()
    {
        var manifest = await _service.CreateSnapshotAsync(CreateBuilder());

        // "sealed" is a reserved keyword in C#, so use a descriptive name.
        var sealedManifest = await _service.SealSnapshotAsync(manifest);

        sealedManifest.Signature.Should().NotBeNullOrEmpty();
    }

    [Fact]
    public async Task VerifySnapshot_ValidSealed_ReturnsSuccess()
    {
        var manifest = await _service.CreateSnapshotAsync(CreateBuilder());
        var sealedManifest = await _service.SealSnapshotAsync(manifest);

        var result = await _service.VerifySnapshotAsync(sealedManifest);

        result.IsValid.Should().BeTrue();
    }
}
```

**Acceptance Criteria**:
- [ ] Builder tests for valid/invalid inputs
- [ ] ID generator determinism tests
- [ ] ID validation tests (valid and tampered)
- [ ] Service create/seal/verify tests
- [ ] All 8+ tests pass

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Policy Team | Define KnowledgeSnapshotManifest |
| 2 | T2 | TODO | — | Policy Team | Define KnowledgeSourceDescriptor |
| 3 | T3 | TODO | T1, T2 | Policy Team | Create SnapshotBuilder |
| 4 | T4 | TODO | T3 | Policy Team | Implement content-addressed ID |
| 5 | T5 | TODO | T3, T4 | Policy Team | Create SnapshotService |
| 6 | T6 | TODO | T5 | Policy Team | Integrate with PolicyEvaluator |
| 7 | T7 | TODO | T6 | Policy Team | Add tests |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from MOAT Phase 2 gap analysis. Knowledge snapshots identified as a requirement from the Knowledge Snapshots advisory. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Content-addressed ID | Decision | Policy Team | `ksm:sha256:{hash}` format ensures immutability |
| Canonical JSON | Decision | Policy Team | Sorted keys, no whitespace for determinism |
| Signature exclusion | Decision | Policy Team | ID computed without signature field |
| Source ordering | Decision | Policy Team | Sources sorted by name for determinism |
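
Taken together, the first three decisions pin down the ID algorithm: canonicalize with sorted keys and no whitespace, drop the signature field, hash, and prefix. A minimal sketch of that intended behavior (TypeScript for illustration; `canonicalize` and `snapshotId` are hypothetical names, not the C# implementation):

```typescript
import { createHash } from "node:crypto";

// Canonical JSON: recursively sort object keys, no whitespace.
function canonicalize(value: unknown): string {
  if (Array.isArray(value)) {
    return "[" + value.map(canonicalize).join(",") + "]";
  }
  if (value !== null && typeof value === "object") {
    const entries = Object.entries(value as Record<string, unknown>)
      .sort(([a], [b]) => (a < b ? -1 : 1))
      .map(([k, v]) => JSON.stringify(k) + ":" + canonicalize(v));
    return "{" + entries.join(",") + "}";
  }
  return JSON.stringify(value);
}

// Content-addressed ID: hash the manifest with the signature field removed,
// so sealing a snapshot never changes its identity.
function snapshotId(manifest: Record<string, unknown>): string {
  const { signature, ...unsigned } = manifest;
  const digest = createHash("sha256")
    .update(canonicalize(unsigned), "utf8")
    .digest("hex");
  return `ksm:sha256:${digest}`;
}
```

Note the sketch only canonicalizes key order; the real implementation must also fix canonical encodings for numbers and timestamps to keep IDs byte-stable.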

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] Snapshot IDs are content-addressed
- [ ] Manifests are deterministically serializable
- [ ] Sealing adds verifiable signatures
- [ ] Policy evaluator integrates snapshots
- [ ] 8+ snapshot tests passing
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds

docs/implplan/SPRINT_4100_0002_0002_replay_engine.md (new file, 1589 lines; diff suppressed because it is too large)
docs/implplan/SPRINT_4100_0002_0003_snapshot_export_import.md (new file, 1180 lines; diff suppressed because it is too large)
docs/implplan/SPRINT_4100_0003_0001_risk_verdict_attestation.md (new file, 1325 lines; diff suppressed because it is too large)
docs/implplan/SPRINT_4100_0003_0002_oci_referrer_push.md (new file, 1344 lines; diff suppressed because it is too large)
docs/implplan/SPRINT_4100_0004_0001_security_state_delta.md (new file, 1434 lines; diff suppressed because it is too large)
docs/implplan/SPRINT_4100_0004_0002_risk_budgets_gates.md (new file, 1460 lines; diff suppressed because it is too large)

@@ -0,0 +1,377 @@
# Sprint 4200.0001.0001 · Proof Chain Verification UI — Evidence Transparency Dashboard

## Topic & Scope

- Implement a "Show Me The Proof" UI component that visualizes the evidence chain from finding to verdict.
- Enable auditors to point at an image digest and see all linked SBOMs, VEX claims, attestations, and verdicts.
- Connect existing Attestor verification APIs to Angular UI components.
- **Working directory:** `src/Web/StellaOps.Web/`, `src/Attestor/`

## Dependencies & Concurrency

- **Upstream**: Attestor ProofChain APIs (implemented), TimelineIndexer (implemented)
- **Downstream**: Audit workflows, compliance reporting
- **Safe to parallelize with**: Sprints 5200.*, 3600.*

## Documentation Prerequisites

- `docs/modules/attestor/architecture.md`
- `docs/modules/ui/architecture.md`
- `docs/product-advisories/archived/2025-12-21-reference-architecture/20-Dec-2025 - Stella Ops Reference Architecture.md`

---

## Tasks

### T1: Proof Chain API Endpoints

**Assignee**: Attestor Team
**Story Points**: 3
**Status**: TODO

**Description**:
Expose REST endpoints for proof chain visualization data.

**Implementation Path**: `src/Attestor/StellaOps.Attestor.WebService/`

**Acceptance Criteria**:
- [ ] `GET /api/v1/proofs/{subjectDigest}` - Get all proofs for an artifact
- [ ] `GET /api/v1/proofs/{subjectDigest}/chain` - Get linked evidence chain
- [ ] `GET /api/v1/proofs/{proofId}` - Get specific proof details
- [ ] `GET /api/v1/proofs/{proofId}/verify` - Verify proof integrity
- [ ] Response includes: SBOM refs, VEX refs, verdict refs, attestation refs
- [ ] Pagination for large proof sets
- [ ] Tenant isolation enforced

**API Response Model**:
```csharp
public sealed record ProofChainResponse
{
    public required string SubjectDigest { get; init; }
    public required string SubjectType { get; init; } // "oci-image", "file", etc.
    public required DateTimeOffset QueryTime { get; init; }

    public ImmutableArray<ProofNode> Nodes { get; init; }
    public ImmutableArray<ProofEdge> Edges { get; init; }

    public ProofSummary Summary { get; init; }
}

public sealed record ProofNode
{
    public required string NodeId { get; init; }
    public required ProofNodeType Type { get; init; } // Sbom, Vex, Verdict, Attestation
    public required string Digest { get; init; }
    public required DateTimeOffset CreatedAt { get; init; }
    public string? RekorLogIndex { get; init; }
    public ImmutableDictionary<string, string> Metadata { get; init; }
}

public sealed record ProofEdge
{
    public required string FromNode { get; init; }
    public required string ToNode { get; init; }
    public required string Relationship { get; init; } // "attests", "references", "supersedes"
}

public sealed record ProofSummary
{
    public int TotalProofs { get; init; }
    public int VerifiedCount { get; init; }
    public int UnverifiedCount { get; init; }
    public DateTimeOffset? OldestProof { get; init; }
    public DateTimeOffset? NewestProof { get; init; }
    public bool HasRekorAnchoring { get; init; }
}
```

---

### T2: Proof Verification Service

**Assignee**: Attestor Team
**Story Points**: 3
**Status**: TODO

**Description**:
Implement on-demand proof verification with detailed results.

**Acceptance Criteria**:
- [ ] DSSE signature verification
- [ ] Payload hash verification
- [ ] Rekor inclusion proof verification
- [ ] Key ID validation against Authority
- [ ] Expiration checking
- [ ] Returns detailed verification result with failure reasons

**Verification Result**:
```csharp
public sealed record ProofVerificationResult
{
    public required string ProofId { get; init; }
    public required bool IsValid { get; init; }
    public required ProofVerificationStatus Status { get; init; }

    public SignatureVerification? Signature { get; init; }
    public RekorVerification? Rekor { get; init; }
    public PayloadVerification? Payload { get; init; }

    public ImmutableArray<string> Warnings { get; init; }
    public ImmutableArray<string> Errors { get; init; }
}

public enum ProofVerificationStatus
{
    Valid,
    SignatureInvalid,
    PayloadTampered,
    KeyNotTrusted,
    Expired,
    RekorNotAnchored,
    RekorInclusionFailed
}
```
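
One detail behind the "DSSE signature verification" criterion: DSSE signatures cover the Pre-Authentication Encoding (PAE) of the payload, not the raw payload bytes, so the verifier must rebuild that encoding before checking the signature. A minimal sketch of PAE (TypeScript for illustration, following the DSSE v1 envelope spec):

```typescript
// DSSE Pre-Authentication Encoding: the signature is computed over
// "DSSEv1 <len(type)> <type> <len(payload)> <payload>", which binds the
// payload type into the signed bytes and prevents type-confusion attacks.
function pae(payloadType: string, payload: Uint8Array): Uint8Array {
  const enc = new TextEncoder();
  const typeBytes = enc.encode(payloadType);
  const header = enc.encode(
    `DSSEv1 ${typeBytes.length} ${payloadType} ${payload.length} `
  );
  const out = new Uint8Array(header.length + payload.length);
  out.set(header, 0);
  out.set(payload, header.length);
  return out;
}
```

A failed comparison here surfaces as `PayloadTampered` or `SignatureInvalid` in the result model above.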

---

### T3: Angular Proof Chain Component

**Assignee**: UI Team
**Story Points**: 5
**Status**: TODO

**Description**:
Create the main proof chain visualization component.

**Implementation Path**: `src/Web/StellaOps.Web/src/app/components/proof-chain/`

**Acceptance Criteria**:
- [ ] `<stella-proof-chain>` component
- [ ] Input: subject digest or artifact reference
- [ ] Fetches proof chain from API
- [ ] Renders interactive graph visualization
- [ ] Node click shows detail panel
- [ ] Color coding by proof type
- [ ] Verification status indicators
- [ ] Loading and error states

**Component Structure**:
```typescript
@Component({
  selector: 'stella-proof-chain',
  templateUrl: './proof-chain.component.html',
  styleUrls: ['./proof-chain.component.scss'],
  changeDetection: ChangeDetectionStrategy.OnPush
})
export class ProofChainComponent implements OnInit {
  // Definite-assignment assertions: these are populated by Angular bindings
  // and ngOnInit, so strict-mode TypeScript needs the "!".
  @Input() subjectDigest!: string;
  @Input() showVerification = true;
  @Input() expandedView = false;

  @Output() nodeSelected = new EventEmitter<ProofNode>();
  @Output() verificationRequested = new EventEmitter<string>();

  proofChain$!: Observable<ProofChainResponse>;
  selectedNode$ = new BehaviorSubject<ProofNode | null>(null);
}
```
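
If the summary card ever needs to be recomputed client-side, most `ProofSummary` fields are derivable from the node list alone. A sketch of that derivation (camelCase field names are an assumed JSON binding of the C# records in T1, and `summarize` is a hypothetical helper):

```typescript
interface ProofNodeView {
  nodeId: string;
  createdAt: string;       // ISO-8601 timestamp from the API
  rekorLogIndex?: string;  // present only for Rekor-anchored proofs
}

interface ProofSummaryView {
  totalProofs: number;
  hasRekorAnchoring: boolean;
  oldestProof?: string;
  newestProof?: string;
}

// Derive the summary card fields from the node list. ISO-8601 timestamps
// sort correctly as strings, so a lexicographic sort suffices here.
function summarize(nodes: ProofNodeView[]): ProofSummaryView {
  const times = nodes.map(n => n.createdAt).sort();
  return {
    totalProofs: nodes.length,
    hasRekorAnchoring: nodes.some(n => n.rekorLogIndex != null),
    oldestProof: times[0],
    newestProof: times[times.length - 1],
  };
}
```

Verified/unverified counts are deliberately left out: per the T2 design, verification is on-demand, so those counts come from the server.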

---

### T4: Graph Visualization Library Integration

**Assignee**: UI Team
**Story Points**: 3
**Status**: TODO

**Description**:
Integrate a graph visualization library for proof chain rendering.

**Acceptance Criteria**:
- [ ] Choose library: D3.js, Cytoscape.js, or vis.js
- [ ] Directed graph rendering
- [ ] Node icons by type (SBOM, VEX, Verdict, Attestation)
- [ ] Edge labels for relationships
- [ ] Zoom and pan controls
- [ ] Responsive layout
- [ ] Accessibility support (keyboard navigation, screen reader)

**Layout Options**:
```typescript
interface ProofChainLayout {
  type: 'hierarchical' | 'force-directed' | 'dagre';
  direction: 'TB' | 'LR' | 'BT' | 'RL';
  nodeSpacing: number;
  rankSpacing: number;
}
```
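
For the `'hierarchical'` and `'dagre'` options, the core step any such library performs is rank assignment: each node is placed at the length of its longest edge-path from a root, and `rankSpacing` then controls the distance between those layers. A minimal sketch of that step (illustrative only; the chosen library does this internally):

```typescript
interface Edge { from: string; to: string; }

// Longest-path rank assignment, the backbone of a hierarchical ('TB'/'LR')
// layout. Proof chains are acyclic, so relaxing every edge |nodes| times
// is guaranteed to converge.
function assignRanks(nodes: string[], edges: Edge[]): Map<string, number> {
  const rank = new Map(nodes.map(n => [n, 0]));
  for (let i = 0; i < nodes.length; i++) {
    for (const e of edges) {
      const candidate = (rank.get(e.from) ?? 0) + 1;
      if (candidate > (rank.get(e.to) ?? 0)) rank.set(e.to, candidate);
    }
  }
  return rank;
}
```

Roots (SBOMs, typically) land at rank 0 and verdicts at the deepest rank, which matches the top-to-bottom reading auditors expect.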

---

### T5: Proof Detail Panel

**Assignee**: UI Team
**Story Points**: 3
**Status**: TODO

**Description**:
Create a detail panel showing full proof information.

**Acceptance Criteria**:
- [ ] Slide-out panel on node selection
- [ ] Shows proof metadata
- [ ] Shows DSSE envelope summary
- [ ] Shows Rekor log entry if available
- [ ] "Verify Now" button triggers verification
- [ ] Download raw proof option
- [ ] Copy digest to clipboard

---

### T6: Verification Status Badge

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO

**Description**:
Create a reusable verification status indicator.

**Acceptance Criteria**:
- [ ] `<stella-verification-badge>` component
- [ ] States: verified, unverified, failed, pending
- [ ] Tooltip with verification details
- [ ] Consistent styling with design system

---

### T7: Timeline Integration

**Assignee**: UI Team
**Story Points**: 3
**Status**: TODO

**Description**:
Integrate the proof chain with the timeline/audit log view.

**Acceptance Criteria**:
- [ ] "View Proofs" action from timeline events
- [ ] Deep link to specific proof from timeline
- [ ] Timeline entry shows proof count badge
- [ ] Filter timeline by proof-related events

---

### T8: Image/Artifact Page Integration

**Assignee**: UI Team
**Story Points**: 3
**Status**: TODO

**Description**:
Add a proof chain tab to image/artifact detail pages.

**Acceptance Criteria**:
- [ ] New "Evidence Chain" tab on artifact details
- [ ] Summary card showing proof count and status
- [ ] "Audit This Artifact" button opens full chain
- [ ] Export proof bundle (for offline verification)

---

### T9: Unit Tests

**Assignee**: UI Team
**Story Points**: 3
**Status**: TODO

**Description**:
Comprehensive unit tests for proof chain components.

**Acceptance Criteria**:
- [ ] Component rendering tests
- [ ] API service tests with mocks
- [ ] Graph layout tests
- [ ] Verification flow tests
- [ ] Accessibility tests

---

### T10: E2E Tests

**Assignee**: UI Team
**Story Points**: 3
**Status**: TODO

**Description**:
End-to-end tests for the proof chain workflow.

**Acceptance Criteria**:
- [ ] Navigate to artifact → view proof chain
- [ ] Click node → view details
- [ ] Verify proof → see result
- [ ] Export proof bundle
- [ ] Timeline → proof chain navigation

---

### T11: Documentation

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO

**Description**:
User and developer documentation for the proof chain UI.

**Acceptance Criteria**:
- [ ] User guide: "How to Audit an Artifact"
- [ ] Developer guide: component API
- [ ] Accessibility documentation
- [ ] Screenshots for documentation

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Attestor Team | Proof Chain API Endpoints |
| 2 | T2 | TODO | T1 | Attestor Team | Proof Verification Service |
| 3 | T3 | TODO | T1 | UI Team | Angular Proof Chain Component |
| 4 | T4 | TODO | — | UI Team | Graph Visualization Integration |
| 5 | T5 | TODO | T3, T4 | UI Team | Proof Detail Panel |
| 6 | T6 | TODO | — | UI Team | Verification Status Badge |
| 7 | T7 | TODO | T3 | UI Team | Timeline Integration |
| 8 | T8 | TODO | T3 | UI Team | Artifact Page Integration |
| 9 | T9 | TODO | T3-T8 | UI Team | Unit Tests |
| 10 | T10 | TODO | T9 | UI Team | E2E Tests |
| 11 | T11 | TODO | T3-T8 | UI Team | Documentation |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from the Reference Architecture advisory (proof chain UI gap). | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Graph library | Decision | UI Team | Evaluate D3.js vs Cytoscape.js for complexity vs features |
| Verification on-demand | Decision | Attestor Team | Verify on user request, not pre-computed |
| Proof export format | Decision | Attestor Team | JSON bundle with all DSSE envelopes |
| Large graph handling | Risk | UI Team | May need virtualization for 1000+ nodes |

---

## Success Criteria

- [ ] Auditors can view the complete evidence chain for any artifact
- [ ] One-click verification of any proof in the chain
- [ ] Rekor anchoring visible when available
- [ ] Export proof bundle for offline verification
- [ ] Performance: <2s load time for typical proof chains (<100 nodes)

**Sprint Status**: TODO (0/11 tasks complete)
docs/implplan/SPRINT_4200_0001_0001_triage_rest_api.md (new file, 1032 lines; diff suppressed because it is too large)
docs/implplan/SPRINT_4200_0001_0002_excititor_policy_lattice.md (new file, 994 lines)

@@ -0,0 +1,994 @@
# Sprint 4200.0001.0002 · Wire Excititor to Policy K4 Lattice

## Topic & Scope

- Replace hardcoded VEX precedence in Excititor with Policy's K4 TrustLatticeEngine
- Enable trust weight propagation in VEX merge decisions
- Add structured merge trace for explainability

**Working directory:** `src/Excititor/__Libraries/StellaOps.Excititor.Core/`

## Dependencies & Concurrency

- **Upstream**: None
- **Downstream**: Sprint 4500.0002.0001 (VEX Conflict Studio)
- **Safe to parallelize with**: Sprint 4200.0001.0001 (Triage REST API), Sprint 4100.0002.0001 (Knowledge Snapshots)

## Documentation Prerequisites

- `src/Excititor/__Libraries/StellaOps.Excititor.Core/AGENTS.md`
- `src/Policy/__Libraries/StellaOps.Policy/Lattice/AGENTS.md`
- `docs/product-advisories/21-Dec-2025 - How Top Scanners Shape Evidence‑First UX.md` (VEX Conflict Studio)
- Existing files: `OpenVexStatementMerger.cs`, `TrustLatticeEngine.cs`

---

## Problem Analysis

The Policy module has a sophisticated `TrustLatticeEngine` implementing K4 logic (True, False, Both, Neither), but Excititor's `OpenVexStatementMerger` uses hardcoded precedence:

```csharp
// CURRENT (hardcoded in OpenVexStatementMerger.cs):
private static int GetStatusPrecedence(VexStatus status) => status switch
{
    VexStatus.Affected => 3,
    VexStatus.UnderInvestigation => 2,
    VexStatus.Fixed => 1,
    VexStatus.NotAffected => 0,
    _ => -1
};
```

This disconnect means VEX merge outcomes are inconsistent with policy intent and cannot take trust weights into account.
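
For orientation, K4 here refers to Belnap's four-valued logic, where merging evidence is a knowledge-order join: agreement keeps the value, disagreement between True and False yields Both (a conflict to surface, not to silently rank away), and Neither (no information) is the identity. A sketch of that join (TypeScript for illustration; the real implementation is `TrustLatticeEngine` in C#):

```typescript
type K4 = "True" | "False" | "Both" | "Neither";

// Knowledge-order join of Belnap's four-valued logic. Unlike the hardcoded
// precedence above, a True/False disagreement does not pick a winner by
// rank; it produces Both, which the trust-weight layer then resolves.
function k4Join(a: K4, b: K4): K4 {
  if (a === b) return a;
  if (a === "Neither") return b;
  if (b === "Neither") return a;
  return "Both"; // True vs False, or anything joined with Both
}
```

This is exactly the behavioral difference the sprint targets: the precedence switch erases conflicts, while the lattice preserves them for `ResolveConflict` and the merge trace.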
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Tasks
|
||||||
|
|
||||||
|
### T1: Create IVexLatticeProvider Interface

**Assignee**: Excititor Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Define an abstraction over Policy's TrustLatticeEngine for Excititor consumption.

**Implementation Path**: `Lattice/IVexLatticeProvider.cs` (new file)

**Implementation**:

```csharp
namespace StellaOps.Excititor.Core.Lattice;

/// <summary>
/// Abstraction for VEX status lattice operations.
/// </summary>
public interface IVexLatticeProvider
{
    /// <summary>
    /// Computes the lattice join (least upper bound) of two VEX statuses.
    /// </summary>
    VexLatticeResult Join(VexStatement left, VexStatement right);

    /// <summary>
    /// Computes the lattice meet (greatest lower bound) of two VEX statuses.
    /// </summary>
    VexLatticeResult Meet(VexStatement left, VexStatement right);

    /// <summary>
    /// Determines whether one status is higher in the lattice than another.
    /// </summary>
    bool IsHigher(VexStatus a, VexStatus b);

    /// <summary>
    /// Gets the trust weight for a VEX statement based on its source.
    /// </summary>
    decimal GetTrustWeight(VexStatement statement);

    /// <summary>
    /// Resolves a conflict using trust weights and lattice logic.
    /// </summary>
    VexConflictResolution ResolveConflict(VexStatement left, VexStatement right);
}

/// <summary>
/// Result of a lattice operation.
/// </summary>
public sealed record VexLatticeResult(
    VexStatus ResultStatus,
    VexStatement? WinningStatement,
    string Reason,
    decimal? TrustDelta);

/// <summary>
/// Detailed conflict resolution result.
/// </summary>
public sealed record VexConflictResolution(
    VexStatement Winner,
    VexStatement Loser,
    ConflictResolutionReason Reason,
    MergeTrace Trace);

/// <summary>
/// Why one statement won over another.
/// </summary>
public enum ConflictResolutionReason
{
    /// <summary>Higher trust weight from source.</summary>
    TrustWeight,

    /// <summary>More recent timestamp.</summary>
    Freshness,

    /// <summary>Lattice position (e.g., Affected > NotAffected).</summary>
    LatticePosition,

    /// <summary>Both equal; the first statement is used.</summary>
    Tie
}

/// <summary>
/// Structured trace of a merge decision.
/// </summary>
public sealed record MergeTrace
{
    public required string LeftSource { get; init; }
    public required string RightSource { get; init; }
    public required VexStatus LeftStatus { get; init; }
    public required VexStatus RightStatus { get; init; }
    public required decimal LeftTrust { get; init; }
    public required decimal RightTrust { get; init; }
    public required VexStatus ResultStatus { get; init; }
    public required string Explanation { get; init; }
    public DateTimeOffset EvaluatedAt { get; init; } = DateTimeOffset.UtcNow;
}
```

**Acceptance Criteria**:
- [ ] `IVexLatticeProvider.cs` file created in `Lattice/`
- [ ] Join and Meet operations defined
- [ ] Trust weight method defined
- [ ] Conflict resolution with trace
- [ ] MergeTrace for explainability

---
### T2: Implement PolicyLatticeAdapter

**Assignee**: Excititor Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Adapt the Policy module's TrustLatticeEngine for Excititor consumption.

**Implementation Path**: `Lattice/PolicyLatticeAdapter.cs` (new file)

**Implementation**:

```csharp
namespace StellaOps.Excititor.Core.Lattice;

/// <summary>
/// Adapts Policy's TrustLatticeEngine for VEX operations.
/// </summary>
public sealed class PolicyLatticeAdapter : IVexLatticeProvider
{
    private readonly ITrustLatticeEngine _lattice;
    private readonly ITrustWeightRegistry _trustRegistry;
    private readonly ILogger<PolicyLatticeAdapter> _logger;

    // K4 lattice: Affected(Both) > UnderInvestigation(Unknown) > {Fixed, NotAffected}
    private static readonly Dictionary<VexStatus, TrustLabel> StatusToLabel = new()
    {
        [VexStatus.Affected] = TrustLabel.Both,              // Conflict - must address
        [VexStatus.UnderInvestigation] = TrustLabel.Neither, // Unknown
        [VexStatus.Fixed] = TrustLabel.True,                 // Known good
        [VexStatus.NotAffected] = TrustLabel.False           // Known not affected
    };

    public PolicyLatticeAdapter(
        ITrustLatticeEngine lattice,
        ITrustWeightRegistry trustRegistry,
        ILogger<PolicyLatticeAdapter> logger)
    {
        _lattice = lattice;
        _trustRegistry = trustRegistry;
        _logger = logger;
    }

    public VexLatticeResult Join(VexStatement left, VexStatement right)
    {
        var leftLabel = StatusToLabel.GetValueOrDefault(left.Status, TrustLabel.Neither);
        var rightLabel = StatusToLabel.GetValueOrDefault(right.Status, TrustLabel.Neither);

        var joinResult = _lattice.Join(leftLabel, rightLabel);
        var resultStatus = LabelToStatus(joinResult);

        var winner = DetermineWinner(left, right, resultStatus);

        return new VexLatticeResult(
            ResultStatus: resultStatus,
            WinningStatement: winner,
            Reason: $"K4 join: {leftLabel} ∨ {rightLabel} = {joinResult}",
            TrustDelta: Math.Abs(GetTrustWeight(left) - GetTrustWeight(right)));
    }

    public VexLatticeResult Meet(VexStatement left, VexStatement right)
    {
        var leftLabel = StatusToLabel.GetValueOrDefault(left.Status, TrustLabel.Neither);
        var rightLabel = StatusToLabel.GetValueOrDefault(right.Status, TrustLabel.Neither);

        var meetResult = _lattice.Meet(leftLabel, rightLabel);
        var resultStatus = LabelToStatus(meetResult);

        var winner = DetermineWinner(left, right, resultStatus);

        return new VexLatticeResult(
            ResultStatus: resultStatus,
            WinningStatement: winner,
            Reason: $"K4 meet: {leftLabel} ∧ {rightLabel} = {meetResult}",
            TrustDelta: Math.Abs(GetTrustWeight(left) - GetTrustWeight(right)));
    }

    public bool IsHigher(VexStatus a, VexStatus b)
    {
        var labelA = StatusToLabel.GetValueOrDefault(a, TrustLabel.Neither);
        var labelB = StatusToLabel.GetValueOrDefault(b, TrustLabel.Neither);
        return _lattice.IsAbove(labelA, labelB);
    }

    public decimal GetTrustWeight(VexStatement statement)
    {
        // Get trust weight from registry based on source
        var sourceKey = ExtractSourceKey(statement);
        return _trustRegistry.GetWeight(sourceKey);
    }

    public VexConflictResolution ResolveConflict(VexStatement left, VexStatement right)
    {
        var leftWeight = GetTrustWeight(left);
        var rightWeight = GetTrustWeight(right);

        VexStatement winner;
        VexStatement loser;
        ConflictResolutionReason reason;

        // 1. Trust weight takes precedence
        if (Math.Abs(leftWeight - rightWeight) > 0.01m)
        {
            if (leftWeight > rightWeight)
            {
                winner = left;
                loser = right;
            }
            else
            {
                winner = right;
                loser = left;
            }
            reason = ConflictResolutionReason.TrustWeight;
        }
        // 2. Lattice position as tiebreaker
        else if (IsHigher(left.Status, right.Status))
        {
            winner = left;
            loser = right;
            reason = ConflictResolutionReason.LatticePosition;
        }
        else if (IsHigher(right.Status, left.Status))
        {
            winner = right;
            loser = left;
            reason = ConflictResolutionReason.LatticePosition;
        }
        // 3. Freshness as final tiebreaker
        else if (left.Timestamp > right.Timestamp)
        {
            winner = left;
            loser = right;
            reason = ConflictResolutionReason.Freshness;
        }
        else if (right.Timestamp > left.Timestamp)
        {
            winner = right;
            loser = left;
            reason = ConflictResolutionReason.Freshness;
        }
        // 4. True tie - use first
        else
        {
            winner = left;
            loser = right;
            reason = ConflictResolutionReason.Tie;
        }

        var trace = new MergeTrace
        {
            LeftSource = left.Source ?? "unknown",
            RightSource = right.Source ?? "unknown",
            LeftStatus = left.Status,
            RightStatus = right.Status,
            LeftTrust = leftWeight,
            RightTrust = rightWeight,
            ResultStatus = winner.Status,
            Explanation = BuildExplanation(winner, loser, reason)
        };

        _logger.LogDebug(
            "VEX conflict resolved: {Winner} ({WinnerStatus}) won over {Loser} ({LoserStatus}) by {Reason}",
            winner.Source, winner.Status, loser.Source, loser.Status, reason);

        return new VexConflictResolution(winner, loser, reason, trace);
    }

    private static VexStatus LabelToStatus(TrustLabel label) => label switch
    {
        TrustLabel.Both => VexStatus.Affected,
        TrustLabel.Neither => VexStatus.UnderInvestigation,
        TrustLabel.True => VexStatus.Fixed,
        TrustLabel.False => VexStatus.NotAffected,
        _ => VexStatus.UnderInvestigation
    };

    private VexStatement? DetermineWinner(VexStatement left, VexStatement right, VexStatus resultStatus)
    {
        if (left.Status == resultStatus) return left;
        if (right.Status == resultStatus) return right;

        // Result is computed from the lattice and neither input matches exactly;
        // return the statement with higher trust.
        return GetTrustWeight(left) >= GetTrustWeight(right) ? left : right;
    }

    private static string ExtractSourceKey(VexStatement statement)
    {
        // Extract publisher/issuer from the statement for trust lookup
        return statement.Source?.ToLowerInvariant() ?? "unknown";
    }

    private static string BuildExplanation(
        VexStatement winner, VexStatement loser, ConflictResolutionReason reason)
    {
        return reason switch
        {
            ConflictResolutionReason.TrustWeight =>
                $"'{winner.Source}' has higher trust weight than '{loser.Source}'",
            ConflictResolutionReason.Freshness =>
                $"'{winner.Source}' is more recent ({winner.Timestamp:O}) than '{loser.Source}' ({loser.Timestamp:O})",
            ConflictResolutionReason.LatticePosition =>
                $"'{winner.Status}' is higher in the K4 lattice than '{loser.Status}'",
            ConflictResolutionReason.Tie =>
                $"Tie between '{winner.Source}' and '{loser.Source}', using first",
            _ => "Unknown resolution"
        };
    }
}
```

**Acceptance Criteria**:
- [ ] `PolicyLatticeAdapter.cs` file created
- [ ] K4 status mapping defined
- [ ] Join/Meet use TrustLatticeEngine
- [ ] Trust weights from registry
- [ ] Conflict resolution with precedence: trust > lattice > freshness > tie
- [ ] Structured explanation in MergeTrace

---
### T3: Refactor OpenVexStatementMerger

**Assignee**: Excititor Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Replace the hardcoded precedence with lattice-based merge logic.

**Implementation Path**: `Formats/OpenVEX/OpenVexStatementMerger.cs` (modify existing)

**Changes**:

```csharp
namespace StellaOps.Excititor.Formats.OpenVEX;

/// <summary>
/// Merges OpenVEX statements using K4 lattice logic.
/// </summary>
public sealed class OpenVexStatementMerger : IVexStatementMerger
{
    private readonly IVexLatticeProvider _lattice;
    private readonly ILogger<OpenVexStatementMerger> _logger;

    public OpenVexStatementMerger(
        IVexLatticeProvider lattice,
        ILogger<OpenVexStatementMerger> logger)
    {
        _lattice = lattice;
        _logger = logger;
    }

    /// <summary>
    /// Merges multiple VEX statements for the same product/vulnerability pair.
    /// </summary>
    public VexMergeResult Merge(IEnumerable<VexStatement> statements)
    {
        var statementList = statements.ToList();

        if (statementList.Count == 0)
            return VexMergeResult.Empty();

        if (statementList.Count == 1)
            return VexMergeResult.Single(statementList[0]);

        // Sort by trust weight descending for stable merge order
        var sorted = statementList
            .OrderByDescending(s => _lattice.GetTrustWeight(s))
            .ThenByDescending(s => s.Timestamp)
            .ToList();

        var traces = new List<MergeTrace>();
        var current = sorted[0];

        for (int i = 1; i < sorted.Count; i++)
        {
            var next = sorted[i];

            // Check for conflict
            if (current.Status != next.Status)
            {
                var resolution = _lattice.ResolveConflict(current, next);
                traces.Add(resolution.Trace);
                current = resolution.Winner;

                _logger.LogDebug(
                    "Merged VEX statement: {Status} from {Source} (reason: {Reason})",
                    current.Status, current.Source, resolution.Reason);
            }
            else
            {
                // Same status - prefer higher trust
                if (_lattice.GetTrustWeight(next) > _lattice.GetTrustWeight(current))
                {
                    current = next;
                }
            }
        }

        return new VexMergeResult(
            ResultStatement: current,
            InputCount: statementList.Count,
            HadConflicts: traces.Count > 0,
            Traces: traces);
    }

    // REMOVED: Hardcoded precedence method
    // private static int GetStatusPrecedence(VexStatus status) => ...
}

/// <summary>
/// Result of a VEX statement merge.
/// </summary>
public sealed record VexMergeResult(
    VexStatement ResultStatement,
    int InputCount,
    bool HadConflicts,
    IReadOnlyList<MergeTrace> Traces)
{
    public static VexMergeResult Empty() =>
        new(default!, 0, false, []);

    public static VexMergeResult Single(VexStatement statement) =>
        new(statement, 1, false, []);
}
```

**Acceptance Criteria**:
- [ ] Hardcoded `GetStatusPrecedence` removed
- [ ] Constructor takes `IVexLatticeProvider`
- [ ] Merge uses lattice conflict resolution
- [ ] MergeTraces collected for all conflicts
- [ ] Result includes conflict information
- [ ] Logging for observability

---
### T4: Add Trust Weight Propagation

**Assignee**: Excititor Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T2

**Description**:
Implement a trust weight registry for VEX sources.

**Implementation Path**: `Lattice/TrustWeightRegistry.cs` (new file)

**Implementation**:

```csharp
namespace StellaOps.Excititor.Core.Lattice;

/// <summary>
/// Registry for VEX source trust weights.
/// </summary>
public interface ITrustWeightRegistry
{
    decimal GetWeight(string sourceKey);
    void RegisterWeight(string sourceKey, decimal weight);
    IReadOnlyDictionary<string, decimal> GetAllWeights();
}

/// <summary>
/// Default implementation with configurable weights.
/// </summary>
public sealed class TrustWeightRegistry : ITrustWeightRegistry
{
    private readonly Dictionary<string, decimal> _weights = new(StringComparer.OrdinalIgnoreCase);
    private readonly TrustWeightOptions _options;
    private readonly ILogger<TrustWeightRegistry> _logger;

    // Default trust hierarchy
    private static readonly Dictionary<string, decimal> DefaultWeights = new()
    {
        ["vendor"] = 1.0m,       // Vendor statements highest trust
        ["distro"] = 0.9m,       // Distribution maintainers
        ["nvd"] = 0.8m,          // NVD/NIST
        ["ghsa"] = 0.75m,        // GitHub Security Advisories
        ["osv"] = 0.7m,          // Open Source Vulnerabilities
        ["cisa"] = 0.85m,        // CISA advisories
        ["first-party"] = 0.95m, // First-party (internal) statements
        ["community"] = 0.5m,    // Community reports
        ["unknown"] = 0.3m       // Unknown sources
    };

    public TrustWeightRegistry(
        IOptions<TrustWeightOptions> options,
        ILogger<TrustWeightRegistry> logger)
    {
        _options = options.Value;
        _logger = logger;

        // Initialize with defaults
        foreach (var (key, weight) in DefaultWeights)
        {
            _weights[key] = weight;
        }

        // Override with configured weights
        foreach (var (key, weight) in _options.SourceWeights)
        {
            _weights[key] = weight;
            _logger.LogDebug("Configured trust weight: {Source} = {Weight}", key, weight);
        }
    }

    public decimal GetWeight(string sourceKey)
    {
        // Try exact match
        if (_weights.TryGetValue(sourceKey, out var weight))
            return weight;

        // Try category match (e.g., "red-hat-vendor" -> "vendor")
        foreach (var category in DefaultWeights.Keys)
        {
            if (sourceKey.Contains(category, StringComparison.OrdinalIgnoreCase))
            {
                return _weights[category];
            }
        }

        return _weights["unknown"];
    }

    public void RegisterWeight(string sourceKey, decimal weight)
    {
        _weights[sourceKey] = Math.Clamp(weight, 0m, 1m);
        _logger.LogInformation("Registered trust weight: {Source} = {Weight}", sourceKey, weight);
    }

    public IReadOnlyDictionary<string, decimal> GetAllWeights() =>
        new Dictionary<string, decimal>(_weights);
}

/// <summary>
/// Configuration options for trust weights.
/// </summary>
public sealed class TrustWeightOptions
{
    public Dictionary<string, decimal> SourceWeights { get; set; } = [];
}
```

**Acceptance Criteria**:
- [ ] `ITrustWeightRegistry` interface defined
- [ ] `TrustWeightRegistry` implementation created
- [ ] Default weights for common sources
- [ ] Configuration override support
- [ ] Category fallback matching
- [ ] Weight clamping to [0, 1]
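Operators would override the defaults through `TrustWeightOptions.SourceWeights`. Assuming standard `IOptions<T>` binding (the configuration section names below are hypothetical), an override could look like:

```json
{
  "Excititor": {
    "TrustWeights": {
      "SourceWeights": {
        "red-hat-vendor": 0.98,
        "internal-scanner": 0.65
      }
    }
  }
}
```

Note that `RegisterWeight` clamps to [0, 1] at runtime, but configured overrides bypass that clamp in the constructor sketch above; unifying the two paths may be worth calling out during implementation.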
---
### T5: Add Merge Trace Output

**Assignee**: Excititor Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T3

**Description**:
Add structured trace output for merge decisions.

**Implementation Path**: `Formats/OpenVEX/MergeTraceWriter.cs` (new file)

**Implementation**:

```csharp
namespace StellaOps.Excititor.Formats.OpenVEX;

/// <summary>
/// Writes merge traces in various formats.
/// </summary>
public sealed class MergeTraceWriter
{
    /// <summary>
    /// Formats a trace as a human-readable explanation.
    /// </summary>
    public static string ToExplanation(VexMergeResult result)
    {
        if (!result.HadConflicts)
        {
            return result.InputCount switch
            {
                0 => "No VEX statements to merge.",
                1 => $"Single statement from '{result.ResultStatement.Source}': {result.ResultStatement.Status}",
                _ => $"All {result.InputCount} statements agreed: {result.ResultStatement.Status}"
            };
        }

        var sb = new StringBuilder();
        sb.AppendLine($"Merged {result.InputCount} statements with {result.Traces.Count} conflicts:");
        sb.AppendLine();

        foreach (var trace in result.Traces)
        {
            sb.AppendLine($"  Conflict: {trace.LeftSource} ({trace.LeftStatus}) vs {trace.RightSource} ({trace.RightStatus})");
            sb.AppendLine($"    Trust: {trace.LeftTrust:P0} vs {trace.RightTrust:P0}");
            sb.AppendLine($"    Resolution: {trace.Explanation}");
            sb.AppendLine();
        }

        sb.AppendLine($"Final result: {result.ResultStatement.Status} from '{result.ResultStatement.Source}'");
        return sb.ToString();
    }

    /// <summary>
    /// Formats a trace as structured JSON.
    /// </summary>
    public static string ToJson(VexMergeResult result)
    {
        var trace = new
        {
            inputCount = result.InputCount,
            hadConflicts = result.HadConflicts,
            result = new
            {
                status = result.ResultStatement.Status.ToString(),
                source = result.ResultStatement.Source,
                timestamp = result.ResultStatement.Timestamp
            },
            conflicts = result.Traces.Select(t => new
            {
                left = new { source = t.LeftSource, status = t.LeftStatus.ToString(), trust = t.LeftTrust },
                right = new { source = t.RightSource, status = t.RightStatus.ToString(), trust = t.RightTrust },
                outcome = t.ResultStatus.ToString(),
                explanation = t.Explanation,
                evaluatedAt = t.EvaluatedAt
            })
        };

        return JsonSerializer.Serialize(trace, new JsonSerializerOptions
        {
            WriteIndented = true,
            PropertyNamingPolicy = JsonNamingPolicy.CamelCase
        });
    }

    /// <summary>
    /// Creates a VEX annotation with merge provenance.
    /// </summary>
    public static VexAnnotation ToAnnotation(VexMergeResult result)
    {
        return new VexAnnotation
        {
            Type = "merge-provenance",
            Text = result.HadConflicts
                ? $"Merged from {result.InputCount} sources with {result.Traces.Count} conflicts"
                : $"Merged from {result.InputCount} sources (no conflicts)",
            Details = new Dictionary<string, object>
            {
                ["inputCount"] = result.InputCount,
                ["hadConflicts"] = result.HadConflicts,
                ["conflictCount"] = result.Traces.Count,
                ["traces"] = result.Traces
            }
        };
    }
}
```

**Acceptance Criteria**:
- [ ] `MergeTraceWriter.cs` file created
- [ ] Human-readable explanation format
- [ ] Structured JSON format
- [ ] VEX annotation for provenance
- [ ] Conflict count and details included
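Tracing `ToExplanation` by hand, a two-statement merge with one conflict would render roughly like this (sources and trust values illustrative; `:P0` percent formatting is culture-dependent, and the `conflicts` noun is not pluralized by the sketch):

```text
Merged 2 statements with 1 conflicts:

  Conflict: vendor (NotAffected) vs nvd (Affected)
    Trust: 100% vs 80%
    Resolution: 'vendor' has higher trust weight than 'nvd'

Final result: NotAffected from 'vendor'
```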
---
### T6: Deprecate VexConsensusResolver

**Assignee**: Excititor Team
**Story Points**: 1
**Status**: TODO
**Dependencies**: T3

**Description**:
Remove the deprecated VexConsensusResolver per AOC-19.

**Implementation Path**: `Resolvers/VexConsensusResolver.cs` (delete or mark obsolete)

**Changes**:

```csharp
// Option 1: Mark obsolete with error
[Obsolete("Use OpenVexStatementMerger with IVexLatticeProvider instead. Will be removed in v2.0.", error: true)]
public sealed class VexConsensusResolver
{
    // Existing implementation...
}

// Option 2: Delete the file entirely if there are no external consumers
// Delete: src/Excititor/__Libraries/StellaOps.Excititor.Core/Resolvers/VexConsensusResolver.cs
```

**Acceptance Criteria**:
- [ ] `VexConsensusResolver` marked obsolete with error OR deleted
- [ ] All internal references updated to use `OpenVexStatementMerger`
- [ ] No compile errors
- [ ] AOC-19 compliance noted

---
### T7: Tests for Lattice Merge
|
||||||
|
|
||||||
|
**Assignee**: Excititor Team
|
||||||
|
**Story Points**: 3
|
||||||
|
**Status**: TODO
|
||||||
|
**Dependencies**: T1, T2, T3, T4, T5
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Comprehensive tests for lattice-based VEX merge.
|
||||||
|
|
||||||
|
**Implementation Path**: `src/Excititor/__Tests/StellaOps.Excititor.Core.Tests/Lattice/`
|
||||||
|
|
||||||
|
**Test Cases**:
|
||||||
|
```csharp
|
||||||
|
public class PolicyLatticeAdapterTests
|
||||||
|
{
|
||||||
|
[Theory]
|
||||||
|
[InlineData(VexStatus.Affected, VexStatus.NotAffected, VexStatus.Affected)]
|
||||||
|
[InlineData(VexStatus.Fixed, VexStatus.NotAffected, VexStatus.Fixed)]
|
||||||
|
[InlineData(VexStatus.UnderInvestigation, VexStatus.Fixed, VexStatus.UnderInvestigation)]
|
||||||
|
public void Join_ReturnsExpectedK4Result(VexStatus left, VexStatus right, VexStatus expected)
|
||||||
|
{
|
||||||
|
var leftStmt = CreateStatement(left, "source1");
|
||||||
|
var rightStmt = CreateStatement(right, "source2");
|
||||||
|
|
||||||
|
var result = _adapter.Join(leftStmt, rightStmt);
|
||||||
|
|
||||||
|
result.ResultStatus.Should().Be(expected);
|
||||||
|
}
|
||||||
|
|
||||||
|
[Fact]
|
||||||
|
public void ResolveConflict_TrustWeightWins()
|
||||||
|
{
|
||||||
|
// Arrange
|
||||||
|
var vendor = CreateStatement(VexStatus.NotAffected, "vendor");
|
||||||
|
var community = CreateStatement(VexStatus.Affected, "community");
|
||||||
|
// vendor has weight 1.0, community has 0.5
|
||||||
|
|
||||||
|
// Act
|
||||||
|
var result = _adapter.ResolveConflict(vendor, community);
|
||||||
|
|
||||||
|
// Assert
|
||||||
|
result.Winner.Should().Be(vendor);
|
||||||
|
result.Reason.Should().Be(ConflictResolutionReason.TrustWeight);
|
||||||
|
}
|
||||||
|
|
||||||
|
[Fact]
|
||||||
|
public void ResolveConflict_EqualTrust_UsesLatticePosition()
|
||||||
|
{
|
||||||
|
// Arrange - both from vendor (same trust)
|
||||||
|
var affected = CreateStatement(VexStatus.Affected, "vendor-a");
|
||||||
|
var notAffected = CreateStatement(VexStatus.NotAffected, "vendor-b");
|
||||||
|
_registry.RegisterWeight("vendor-a", 0.9m);
|
||||||
|
_registry.RegisterWeight("vendor-b", 0.9m);
|
||||||
|
|
||||||
|
// Act
|
||||||
|
var result = _adapter.ResolveConflict(affected, notAffected);
|
||||||
|
|
||||||
|
// Assert - Affected is higher in K4
|
||||||
|
result.Winner.Status.Should().Be(VexStatus.Affected);
|
||||||
|
result.Reason.Should().Be(ConflictResolutionReason.LatticePosition);
|
||||||
|
}
|
||||||
|
|
||||||
|
[Fact]
|
||||||
|
public void ResolveConflict_EqualTrustAndStatus_UsesFreshness()
|
||||||
|
{
|
||||||
|
// Arrange
|
||||||
|
var older = CreateStatement(VexStatus.Affected, "vendor", DateTimeOffset.UtcNow.AddDays(-1));
|
||||||
|
var newer = CreateStatement(VexStatus.Affected, "vendor", DateTimeOffset.UtcNow);
|
||||||
|
|
||||||
|
// Act
|
||||||
|
var result = _adapter.ResolveConflict(older, newer);
|
||||||
|
|
||||||
|
// Assert
|
||||||
|
result.Winner.Should().Be(newer);
|
||||||
|
result.Reason.Should().Be(ConflictResolutionReason.Freshness);
|
||||||
|
}
|
||||||
|
|
||||||
|
[Fact]
|
||||||
|
public void ResolveConflict_GeneratesTrace()
|
||||||
|
{
|
||||||
|
var left = CreateStatement(VexStatus.Affected, "vendor");
|
||||||
|
var right = CreateStatement(VexStatus.NotAffected, "distro");
|
||||||
|
|
||||||
|
var result = _adapter.ResolveConflict(left, right);
|
||||||
|
|
||||||
|
result.Trace.Should().NotBeNull();
|
||||||
|
result.Trace.LeftSource.Should().Be("vendor");
|
||||||
|
result.Trace.RightSource.Should().Be("distro");
|
||||||
|
result.Trace.Explanation.Should().NotBeNullOrEmpty();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
public class OpenVexStatementMergerTests
|
||||||
|
{
|
||||||
|
[Fact]
|
||||||
|
public void Merge_NoStatements_ReturnsEmpty()
|
||||||
|
{
|
||||||
|
var result = _merger.Merge([]);
|
||||||
|
|
||||||
|
result.InputCount.Should().Be(0);
|
||||||
|
result.HadConflicts.Should().BeFalse();
|
||||||
|
}
|
||||||
|
|
||||||
|
[Fact]
|
||||||
|
    public void Merge_SingleStatement_ReturnsSingle()
    {
        var statement = CreateStatement(VexStatus.NotAffected, "vendor");

        var result = _merger.Merge([statement]);

        result.InputCount.Should().Be(1);
        result.ResultStatement.Should().Be(statement);
        result.HadConflicts.Should().BeFalse();
    }

    [Fact]
    public void Merge_ConflictingStatements_UsesLattice()
    {
        var vendor = CreateStatement(VexStatus.NotAffected, "vendor");
        var nvd = CreateStatement(VexStatus.Affected, "nvd");

        var result = _merger.Merge([vendor, nvd]);

        result.InputCount.Should().Be(2);
        result.HadConflicts.Should().BeTrue();
        result.Traces.Should().HaveCount(1);
        // Vendor has higher trust, wins
        result.ResultStatement.Status.Should().Be(VexStatus.NotAffected);
    }

    [Fact]
    public void Merge_MultipleStatements_CollectsAllTraces()
    {
        var statements = new[]
        {
            CreateStatement(VexStatus.Affected, "source1"),
            CreateStatement(VexStatus.NotAffected, "source2"),
            CreateStatement(VexStatus.Fixed, "source3")
        };

        var result = _merger.Merge(statements);

        result.InputCount.Should().Be(3);
        result.Traces.Should().HaveCountGreaterThan(0);
    }
}

public class TrustWeightRegistryTests
{
    [Fact]
    public void GetWeight_KnownSource_ReturnsConfiguredWeight()
    {
        var weight = _registry.GetWeight("vendor");

        weight.Should().Be(1.0m);
    }

    [Fact]
    public void GetWeight_UnknownSource_ReturnsFallback()
    {
        var weight = _registry.GetWeight("random-source");

        weight.Should().Be(0.3m); // "unknown" default
    }

    [Fact]
    public void GetWeight_CategoryMatch_ReturnsCategory()
    {
        var weight = _registry.GetWeight("red-hat-vendor-advisory");

        weight.Should().Be(1.0m); // Contains "vendor"
    }
}
```

**Acceptance Criteria**:

- [ ] K4 join/meet tests
- [ ] Trust weight precedence tests
- [ ] Lattice position tiebreaker tests
- [ ] Freshness tiebreaker tests
- [ ] Merge trace generation tests
- [ ] Empty/single/multiple merge tests
- [ ] Trust registry tests
- [ ] All tests pass

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Excititor Team | Create IVexLatticeProvider interface |
| 2 | T2 | TODO | T1 | Excititor Team | Implement PolicyLatticeAdapter |
| 3 | T3 | TODO | T1, T2 | Excititor Team | Refactor OpenVexStatementMerger |
| 4 | T4 | TODO | T2 | Excititor Team | Add trust weight propagation |
| 5 | T5 | TODO | T3 | Excititor Team | Add merge trace output |
| 6 | T6 | TODO | T3 | Excititor Team | Deprecate VexConsensusResolver |
| 7 | T7 | TODO | T1-T5 | Excititor Team | Tests for lattice merge |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from UX Gap Analysis. K4 lattice disconnect identified between Policy and Excititor modules. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| K4 mapping | Decision | Excititor Team | Affected=Both, UnderInvestigation=Neither, Fixed=True, NotAffected=False |
| Trust precedence | Decision | Excititor Team | Trust > Lattice > Freshness > Tie |
| Default weights | Decision | Excititor Team | vendor=1.0, distro=0.9, nvd=0.8, etc. |
| AOC-19 compliance | Risk | Excititor Team | Must remove VexConsensusResolver |
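
The K4 mapping and the Trust > Lattice > Freshness precedence decided above can be sketched as a small comparator. This is an illustrative sketch only: the type names, fields, and weights below are assumptions, not the actual Excititor API.

```typescript
// Hypothetical sketch of the Trust > Lattice > Freshness precedence chain.
type VexStatus = 'affected' | 'not_affected' | 'fixed' | 'under_investigation';

// K4 mapping from the decision table: Affected=Both, Fixed=True,
// NotAffected=False, UnderInvestigation=Neither. Rank is used only as a
// tiebreaker, ordering statements by information content (Both highest).
const latticeRank: Record<VexStatus, number> = {
  affected: 3,            // Both
  fixed: 2,               // True
  not_affected: 2,        // False
  under_investigation: 1, // Neither
};

interface Statement {
  status: VexStatus;
  source: string;
  trust: number;    // e.g. vendor=1.0, distro=0.9, nvd=0.8
  issuedAt: number; // epoch millis, freshness tiebreaker
}

function merge(statements: Statement[]): Statement {
  if (statements.length === 0) throw new Error('no statements');
  return [...statements].sort((a, b) =>
    b.trust - a.trust ||                             // 1. trust weight
    latticeRank[b.status] - latticeRank[a.status] || // 2. lattice position
    b.issuedAt - a.issuedAt                          // 3. freshness
  )[0];
}

const winner = merge([
  { status: 'affected', source: 'nvd', trust: 0.8, issuedAt: 2 },
  { status: 'not_affected', source: 'vendor', trust: 1.0, issuedAt: 1 },
]);
// vendor wins on trust alone, even though its statement is older and
// sits lower in the K4 lattice than "affected"
```

Note that freshness is only consulted when both trust and lattice position tie, matching the "Trust > Lattice > Freshness > Tie" decision.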

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] No hardcoded VEX precedence values
- [ ] Merge uses K4 lattice logic
- [ ] Trust weights influence outcomes
- [ ] MergeTrace explains decisions
- [ ] VexConsensusResolver deprecated
- [ ] All tests pass
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds
839  docs/implplan/SPRINT_4200_0002_0001_can_i_ship_header.md  Normal file
@@ -0,0 +1,839 @@
# Sprint 4200.0002.0001 · "Can I Ship?" Case Header

## Topic & Scope

- Create above-the-fold verdict display for triage cases
- Show primary verdict (SHIP/BLOCK/EXCEPTION) prominently
- Display risk delta from baseline and actionable counts
- Link to signed attestation and knowledge snapshot

**Working directory:** `src/Web/StellaOps.Web/src/app/features/triage/`

## Dependencies & Concurrency

- **Upstream**: Sprint 4200.0001.0001 (Triage REST API)
- **Downstream**: None
- **Safe to parallelize with**: Sprint 4200.0002.0002 (Verdict Ladder), Sprint 4200.0002.0003 (Delta/Compare View)

## Documentation Prerequisites

- `src/Web/StellaOps.Web/AGENTS.md`
- `docs/product-advisories/21-Dec-2025 - How Top Scanners Shape Evidence‑First UX.md`
- `docs/product-advisories/16-Dec-2025 - Reimagining Proof‑Linked UX in Security Workflows.md`

---

## Tasks

### T1: Create case-header.component.ts

**Assignee**: UI Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: —

**Description**:
Create the primary "Can I Ship?" verdict header component.

**Implementation Path**: `components/case-header/case-header.component.ts` (new file)

**Implementation**:
```typescript
import { Component, Input, Output, EventEmitter, ChangeDetectionStrategy } from '@angular/core';
import { CommonModule } from '@angular/common';
import { MatChipsModule } from '@angular/material/chips';
import { MatIconModule } from '@angular/material/icon';
import { MatTooltipModule } from '@angular/material/tooltip';
import { MatButtonModule } from '@angular/material/button';

export type Verdict = 'ship' | 'block' | 'exception';

export interface CaseHeaderData {
  verdict: Verdict;
  findingCount: number;
  criticalCount: number;
  highCount: number;
  actionableCount: number;
  deltaFromBaseline?: DeltaInfo;
  attestationId?: string;
  snapshotId?: string;
  evaluatedAt: Date;
}

export interface DeltaInfo {
  newBlockers: number;
  resolvedBlockers: number;
  newFindings: number;
  resolvedFindings: number;
  baselineName: string;
}

@Component({
  selector: 'stella-case-header',
  standalone: true,
  imports: [
    CommonModule,
    MatChipsModule,
    MatIconModule,
    MatTooltipModule,
    MatButtonModule
  ],
  templateUrl: './case-header.component.html',
  styleUrls: ['./case-header.component.scss'],
  changeDetection: ChangeDetectionStrategy.OnPush
})
export class CaseHeaderComponent {
  @Input({ required: true }) data!: CaseHeaderData;
  @Output() verdictClick = new EventEmitter<void>();
  @Output() attestationClick = new EventEmitter<string>();
  @Output() snapshotClick = new EventEmitter<string>();

  get verdictLabel(): string {
    switch (this.data.verdict) {
      case 'ship': return 'CAN SHIP';
      case 'block': return 'BLOCKED';
      case 'exception': return 'EXCEPTION';
    }
  }

  get verdictIcon(): string {
    switch (this.data.verdict) {
      case 'ship': return 'check_circle';
      case 'block': return 'block';
      case 'exception': return 'warning';
    }
  }

  get verdictClass(): string {
    return `verdict-chip verdict-${this.data.verdict}`;
  }

  get hasNewBlockers(): boolean {
    return (this.data.deltaFromBaseline?.newBlockers ?? 0) > 0;
  }

  get deltaText(): string {
    if (!this.data.deltaFromBaseline) return '';
    const d = this.data.deltaFromBaseline;

    const parts: string[] = [];
    if (d.newBlockers > 0) parts.push(`+${d.newBlockers} blockers`);
    if (d.resolvedBlockers > 0) parts.push(`-${d.resolvedBlockers} resolved`);
    if (d.newFindings > 0) parts.push(`+${d.newFindings} new`);

    return parts.join(', ') + ` since ${d.baselineName}`;
  }

  get shortSnapshotId(): string {
    if (!this.data.snapshotId) return '';
    // ksm:sha256:abc123... -> ksm:abc123
    const parts = this.data.snapshotId.split(':');
    if (parts.length >= 3) {
      return `ksm:${parts[2].substring(0, 8)}`;
    }
    return this.data.snapshotId.substring(0, 16);
  }

  onVerdictClick(): void {
    this.verdictClick.emit();
  }

  onAttestationClick(): void {
    if (this.data.attestationId) {
      this.attestationClick.emit(this.data.attestationId);
    }
  }

  onSnapshotClick(): void {
    if (this.data.snapshotId) {
      this.snapshotClick.emit(this.data.snapshotId);
    }
  }
}
```

**Template** (`case-header.component.html`):
```html
<div class="case-header">
  <!-- Primary Verdict Chip -->
  <div class="verdict-section">
    <button
      mat-flat-button
      [class]="verdictClass"
      (click)="onVerdictClick()"
      matTooltip="Click to view verdict details"
    >
      <mat-icon>{{ verdictIcon }}</mat-icon>
      <span class="verdict-label">{{ verdictLabel }}</span>
    </button>

    <!-- Signed Badge -->
    <button
      *ngIf="data.attestationId"
      mat-icon-button
      class="signed-badge"
      (click)="onAttestationClick()"
      matTooltip="View DSSE attestation"
    >
      <mat-icon>verified</mat-icon>
    </button>
  </div>

  <!-- Risk Delta -->
  <div class="delta-section" *ngIf="data.deltaFromBaseline">
    <span [class.has-blockers]="hasNewBlockers">
      {{ deltaText }}
    </span>
  </div>

  <!-- Actionables Count -->
  <div class="actionables-section">
    <mat-chip-set>
      <mat-chip *ngIf="data.criticalCount > 0" class="chip-critical">
        {{ data.criticalCount }} Critical
      </mat-chip>
      <mat-chip *ngIf="data.highCount > 0" class="chip-high">
        {{ data.highCount }} High
      </mat-chip>
      <mat-chip *ngIf="data.actionableCount > 0" class="chip-actionable">
        {{ data.actionableCount }} need attention
      </mat-chip>
    </mat-chip-set>
  </div>

  <!-- Knowledge Snapshot Badge -->
  <div class="snapshot-section" *ngIf="data.snapshotId">
    <button
      mat-stroked-button
      class="snapshot-badge"
      (click)="onSnapshotClick()"
      matTooltip="View knowledge snapshot: {{ data.snapshotId }}"
    >
      <mat-icon>history</mat-icon>
      <span>{{ shortSnapshotId }}</span>
    </button>
    <span class="evaluated-at">
      Evaluated {{ data.evaluatedAt | date:'short' }}
    </span>
  </div>
</div>
```

**Styles** (`case-header.component.scss`):
```scss
.case-header {
  display: flex;
  flex-wrap: wrap;
  align-items: center;
  gap: 16px;
  padding: 16px 24px;
  background: var(--surface-container);
  border-radius: 8px;
  margin-bottom: 16px;
}

.verdict-section {
  display: flex;
  align-items: center;
  gap: 8px;
}

.verdict-chip {
  font-size: 1.25rem;
  font-weight: 600;
  padding: 12px 24px;
  border-radius: 24px;

  mat-icon {
    margin-right: 8px;
  }

  &.verdict-ship {
    background-color: var(--success-container);
    color: var(--on-success-container);
  }

  &.verdict-block {
    background-color: var(--error-container);
    color: var(--on-error-container);
  }

  &.verdict-exception {
    background-color: var(--warning-container);
    color: var(--on-warning-container);
  }
}

.signed-badge {
  color: var(--primary);
}

.delta-section {
  flex: 1;
  min-width: 200px;

  .has-blockers {
    color: var(--error);
    font-weight: 500;
  }
}

.actionables-section {
  .chip-critical {
    background-color: var(--error);
    color: var(--on-error);
  }

  .chip-high {
    background-color: var(--warning);
    color: var(--on-warning);
  }

  .chip-actionable {
    background-color: var(--tertiary-container);
    color: var(--on-tertiary-container);
  }
}

.snapshot-section {
  display: flex;
  align-items: center;
  gap: 8px;

  .snapshot-badge {
    font-family: monospace;
    font-size: 0.875rem;
  }

  .evaluated-at {
    font-size: 0.75rem;
    color: var(--on-surface-variant);
  }
}

// Responsive
@media (max-width: 768px) {
  .case-header {
    flex-direction: column;
    align-items: flex-start;
  }

  .verdict-section {
    width: 100%;
    justify-content: center;
  }

  .delta-section,
  .actionables-section,
  .snapshot-section {
    width: 100%;
  }
}
```

**Acceptance Criteria**:
- [ ] `case-header.component.ts` file created
- [ ] Primary verdict chip (SHIP/BLOCK/EXCEPTION) with icon
- [ ] Color coding for each verdict state
- [ ] Signed attestation badge with click handler
- [ ] Standalone component with modern Angular features

---

### T2: Add Risk Delta Display

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Add a delta display showing changes since the baseline.

**Implementation**: Included in the T1 template via the `DeltaInfo` interface.

**Additional Styles** (add to `case-header.component.scss`):
```scss
.delta-breakdown {
  display: flex;
  gap: 16px;
  margin-top: 8px;

  .delta-item {
    display: flex;
    align-items: center;
    gap: 4px;
    font-size: 0.875rem;

    &.positive {
      color: var(--error);
    }

    &.negative {
      color: var(--success);
    }

    mat-icon {
      font-size: 16px;
      width: 16px;
      height: 16px;
    }
  }
}
```

**Acceptance Criteria**:
- [ ] Delta from baseline shown: "+3 new blockers since baseline"
- [ ] Red highlighting for new blockers
- [ ] Green highlighting for resolved issues
- [ ] Baseline name displayed

---

### T3: Add Actionables Count

**Assignee**: UI Team
**Story Points**: 1
**Status**: TODO
**Dependencies**: T1

**Description**:
Display the count of items needing attention.

**Implementation**: Included in T1 via `mat-chip-set`.

**Acceptance Criteria**:
- [ ] Critical count chip with red color
- [ ] High count chip with orange color
- [ ] "X items need attention" chip
- [ ] Chips clickable to filter the list

---

### T4: Add Signed Gate Link

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Link the verdict to a DSSE attestation viewer.

**Implementation Path**: Add an attestation dialog/drawer.

```typescript
// attestation-viewer.component.ts
import { Component, Inject } from '@angular/core';
import { CommonModule } from '@angular/common';
import { Clipboard } from '@angular/cdk/clipboard';
import { MatButtonModule } from '@angular/material/button';
import { MAT_DIALOG_DATA, MatDialogModule } from '@angular/material/dialog';

@Component({
  selector: 'stella-attestation-viewer',
  standalone: true,
  imports: [CommonModule, MatDialogModule, MatButtonModule],
  template: `
    <h2 mat-dialog-title>DSSE Attestation</h2>
    <mat-dialog-content>
      <div class="attestation-content">
        <div class="field">
          <label>Attestation ID</label>
          <code>{{ data.attestationId }}</code>
        </div>
        <div class="field">
          <label>Subject</label>
          <code>{{ data.subject }}</code>
        </div>
        <div class="field">
          <label>Predicate Type</label>
          <code>{{ data.predicateType }}</code>
        </div>
        <div class="field">
          <label>Signed By</label>
          <span>{{ data.signedBy }}</span>
        </div>
        <div class="field">
          <label>Timestamp</label>
          <span>{{ data.timestamp | date:'medium' }}</span>
        </div>
        <div class="field" *ngIf="data.rekorEntry">
          <label>Transparency Log</label>
          <a [href]="data.rekorEntry" target="_blank">View in Rekor</a>
        </div>
        <div class="signature-section">
          <label>DSSE Envelope</label>
          <pre>{{ data.envelope | json }}</pre>
        </div>
      </div>
    </mat-dialog-content>
    <mat-dialog-actions align="end">
      <button mat-button (click)="copyEnvelope()">Copy Envelope</button>
      <button mat-button mat-dialog-close>Close</button>
    </mat-dialog-actions>
  `
})
export class AttestationViewerComponent {
  constructor(
    @Inject(MAT_DIALOG_DATA) public data: AttestationData,
    private clipboard: Clipboard
  ) {}

  copyEnvelope(): void {
    this.clipboard.copy(JSON.stringify(this.data.envelope, null, 2));
  }
}
```

**Acceptance Criteria**:
- [ ] "Verified" badge next to verdict
- [ ] Click opens attestation viewer dialog
- [ ] Shows DSSE envelope details
- [ ] Link to Rekor if available
- [ ] Copy envelope button

---

### T5: Add Knowledge Snapshot Badge

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Display the knowledge snapshot ID with a link to snapshot details.

**Implementation**: Included in T1 via the snapshot-section.

**Additional Component** - Snapshot Viewer:
```typescript
// snapshot-viewer.component.ts
@Component({
  selector: 'stella-snapshot-viewer',
  standalone: true,
  template: `
    <div class="snapshot-viewer">
      <h3>Knowledge Snapshot</h3>
      <div class="snapshot-id">
        <code>{{ snapshot.snapshotId }}</code>
        <button mat-icon-button (click)="copyId()">
          <mat-icon>content_copy</mat-icon>
        </button>
      </div>

      <h4>Sources</h4>
      <mat-list>
        <mat-list-item *ngFor="let source of snapshot.sources">
          <mat-icon matListItemIcon>{{ getSourceIcon(source.type) }}</mat-icon>
          <span matListItemTitle>{{ source.name }}</span>
          <span matListItemLine>{{ source.epoch }} • {{ source.digest | slice:0:16 }}</span>
        </mat-list-item>
      </mat-list>

      <h4>Environment</h4>
      <div class="environment" *ngIf="snapshot.environment">
        <span>Platform: {{ snapshot.environment.platform }}</span>
        <span>Engine: {{ snapshot.engine.version }}</span>
      </div>

      <div class="actions">
        <button mat-stroked-button (click)="exportSnapshot()">
          <mat-icon>download</mat-icon> Export Bundle
        </button>
        <button mat-stroked-button (click)="replayWithSnapshot()">
          <mat-icon>replay</mat-icon> Replay
        </button>
      </div>
    </div>
  `
})
export class SnapshotViewerComponent {
  @Input({ required: true }) snapshot!: KnowledgeSnapshot;
  @Output() export = new EventEmitter<string>();
  @Output() replay = new EventEmitter<string>();

  copyId(): void {
    navigator.clipboard.writeText(this.snapshot.snapshotId);
  }

  getSourceIcon(type: string): string {
    // Placeholder mapping; final icon set to be decided during implementation.
    return type === 'vex' ? 'rule' : 'storage';
  }

  exportSnapshot(): void {
    this.export.emit(this.snapshot.snapshotId);
  }

  replayWithSnapshot(): void {
    this.replay.emit(this.snapshot.snapshotId);
  }
}
```

**Acceptance Criteria**:
- [ ] Snapshot ID badge: "ksm:abc123..."
- [ ] Truncated display with full ID on hover
- [ ] Click opens snapshot details panel
- [ ] Shows sources included in snapshot
- [ ] Export and replay buttons

---

### T6: Responsive Design

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1, T2, T3, T4, T5

**Description**:
Ensure the header works on mobile and tablet.

**Implementation**: Included in the T1 SCSS via media queries.

**Additional Breakpoints**:
```scss
// Tablet
@media (min-width: 769px) and (max-width: 1024px) {
  .case-header {
    .verdict-section {
      flex: 0 0 auto;
    }

    .delta-section {
      flex: 1;
      text-align: center;
    }

    .actionables-section {
      flex: 0 0 auto;
    }

    .snapshot-section {
      width: 100%;
      justify-content: flex-end;
    }
  }
}

// Mobile
@media (max-width: 480px) {
  .case-header {
    padding: 12px 16px;
    gap: 12px;
  }

  .verdict-chip {
    width: 100%;
    justify-content: center;
    font-size: 1.1rem;
    padding: 10px 20px;
  }

  .actionables-section mat-chip-set {
    flex-wrap: wrap;
    justify-content: center;
  }
}
```

**Acceptance Criteria**:
- [ ] Stacks vertically on mobile (<768px)
- [ ] Verdict centered on mobile
- [ ] Chips wrap appropriately
- [ ] Touch-friendly tap targets (min 44px)
- [ ] No horizontal scroll

---

### T7: Tests

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1-T6

**Description**:
Component tests with mocks.

**Implementation Path**: `components/case-header/case-header.component.spec.ts`

**Test Cases**:
```typescript
import { ComponentFixture, TestBed } from '@angular/core/testing';
import { NoopAnimationsModule } from '@angular/platform-browser/animations';

import { CaseHeaderComponent, CaseHeaderData, Verdict } from './case-header.component';

describe('CaseHeaderComponent', () => {
  let component: CaseHeaderComponent;
  let fixture: ComponentFixture<CaseHeaderComponent>;

  beforeEach(async () => {
    await TestBed.configureTestingModule({
      imports: [CaseHeaderComponent, NoopAnimationsModule]
    }).compileComponents();

    fixture = TestBed.createComponent(CaseHeaderComponent);
    component = fixture.componentInstance;
  });

  it('should create', () => {
    component.data = createMockData('ship');
    fixture.detectChanges();
    expect(component).toBeTruthy();
  });

  describe('Verdict Display', () => {
    it('should show CAN SHIP for ship verdict', () => {
      component.data = createMockData('ship');
      fixture.detectChanges();

      const label = fixture.nativeElement.querySelector('.verdict-label');
      expect(label.textContent).toContain('CAN SHIP');
    });

    it('should show BLOCKED for block verdict', () => {
      component.data = createMockData('block');
      fixture.detectChanges();

      const label = fixture.nativeElement.querySelector('.verdict-label');
      expect(label.textContent).toContain('BLOCKED');
    });

    it('should show EXCEPTION for exception verdict', () => {
      component.data = createMockData('exception');
      fixture.detectChanges();

      const label = fixture.nativeElement.querySelector('.verdict-label');
      expect(label.textContent).toContain('EXCEPTION');
    });

    it('should apply correct CSS class for verdict', () => {
      component.data = createMockData('block');
      fixture.detectChanges();

      const chip = fixture.nativeElement.querySelector('.verdict-chip');
      expect(chip.classList).toContain('verdict-block');
    });
  });

  describe('Delta Display', () => {
    it('should show delta when present', () => {
      component.data = {
        ...createMockData('block'),
        deltaFromBaseline: {
          newBlockers: 3,
          resolvedBlockers: 1,
          newFindings: 5,
          resolvedFindings: 2,
          baselineName: 'v1.2.0'
        }
      };
      fixture.detectChanges();

      const delta = fixture.nativeElement.querySelector('.delta-section');
      expect(delta.textContent).toContain('+3 blockers');
      expect(delta.textContent).toContain('v1.2.0');
    });

    it('should highlight new blockers', () => {
      component.data = {
        ...createMockData('block'),
        deltaFromBaseline: {
          newBlockers: 3,
          resolvedBlockers: 0,
          newFindings: 0,
          resolvedFindings: 0,
          baselineName: 'main'
        }
      };
      fixture.detectChanges();

      const delta = fixture.nativeElement.querySelector('.delta-section span');
      expect(delta.classList).toContain('has-blockers');
    });
  });

  describe('Attestation Badge', () => {
    it('should show signed badge when attestation present', () => {
      component.data = {
        ...createMockData('ship'),
        attestationId: 'att-123'
      };
      fixture.detectChanges();

      const badge = fixture.nativeElement.querySelector('.signed-badge');
      expect(badge).toBeTruthy();
    });

    it('should emit attestationClick on badge click', () => {
      component.data = {
        ...createMockData('ship'),
        attestationId: 'att-123'
      };
      fixture.detectChanges();

      spyOn(component.attestationClick, 'emit');
      const badge = fixture.nativeElement.querySelector('.signed-badge');
      badge.click();

      expect(component.attestationClick.emit).toHaveBeenCalledWith('att-123');
    });
  });

  describe('Snapshot Badge', () => {
    it('should show truncated snapshot ID', () => {
      component.data = {
        ...createMockData('ship'),
        snapshotId: 'ksm:sha256:abcdef1234567890'
      };
      fixture.detectChanges();

      const badge = fixture.nativeElement.querySelector('.snapshot-badge');
      expect(badge.textContent).toContain('ksm:abcdef12');
    });
  });

  function createMockData(verdict: Verdict): CaseHeaderData {
    return {
      verdict,
      findingCount: 10,
      criticalCount: 2,
      highCount: 5,
      actionableCount: 7,
      evaluatedAt: new Date()
    };
  }
});
```

**Acceptance Criteria**:
- [ ] Test for each verdict state
- [ ] Test for delta display
- [ ] Test for attestation badge
- [ ] Test for snapshot badge
- [ ] Test event emissions
- [ ] All tests pass

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | UI Team | Create case-header.component.ts |
| 2 | T2 | TODO | T1 | UI Team | Add risk delta display |
| 3 | T3 | TODO | T1 | UI Team | Add actionables count |
| 4 | T4 | TODO | T1 | UI Team | Add signed gate link |
| 5 | T5 | TODO | T1 | UI Team | Add knowledge snapshot badge |
| 6 | T6 | TODO | T1-T5 | UI Team | Responsive design |
| 7 | T7 | TODO | T1-T6 | UI Team | Tests |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from UX Gap Analysis. "Can I Ship?" header identified as core UX pattern. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Standalone component | Decision | UI Team | Use Angular 17 standalone components |
| Material Design | Decision | UI Team | Use Angular Material for consistency |
| Verdict colors | Decision | UI Team | Ship=success, Block=error, Exception=warning |
| Snapshot truncation | Decision | UI Team | Show first 8 chars of hash |

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] Verdict visible without scrolling
- [ ] Delta from baseline shown
- [ ] Clicking verdict chip shows attestation
- [ ] Snapshot ID visible with link
- [ ] Responsive on mobile/tablet
- [ ] All component tests pass
- [ ] `ng build` succeeds
- [ ] `ng test` succeeds
979  docs/implplan/SPRINT_4200_0002_0002_verdict_ladder.md  Normal file
@@ -0,0 +1,979 @@
# Sprint 4200.0002.0002 · Verdict Ladder UI

## Topic & Scope

- Create vertical timeline visualization showing 8 steps from detection to verdict
- Enable click-to-expand evidence at each step
- Show the complete audit trail for how a finding became a verdict

**Working directory:** `src/Web/StellaOps.Web/src/app/features/triage/components/`

## Dependencies & Concurrency

- **Upstream**: Sprint 4200.0001.0001 (Triage REST API)
- **Downstream**: None
- **Safe to parallelize with**: Sprint 4200.0002.0001 ("Can I Ship?" Header), Sprint 4200.0002.0003 (Delta/Compare View)

## Documentation Prerequisites

- `src/Web/StellaOps.Web/AGENTS.md`
- `docs/product-advisories/16-Dec-2025 - Reimagining Proof‑Linked UX in Security Workflows.md`
- `docs/product-advisories/21-Dec-2025 - Designing Explainable Triage Workflows.md`

---

## The 8-Step Verdict Ladder

```
Step 1: Detection     → CVE source, SBOM match
Step 2: Component ID  → PURL, version, location
Step 3: Applicability → OVAL/version range match
Step 4: Reachability  → Static analysis path
Step 5: Runtime       → Process trace, signal
Step 6: VEX Merge     → Lattice outcome with trust weights
Step 7: Policy Trace  → Rule → verdict mapping
Step 8: Attestation   → Signature, transparency log
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|

## Tasks

### T1: Create verdict-ladder.component.ts

**Assignee**: UI Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: —

**Description**:
Create the main vertical timeline component.

**Implementation Path**: `verdict-ladder/verdict-ladder.component.ts` (new file)

**Implementation**:
```typescript
import { Component, Input, ChangeDetectionStrategy } from '@angular/core';
import { CommonModule } from '@angular/common';
import { MatExpansionModule } from '@angular/material/expansion';
import { MatIconModule } from '@angular/material/icon';
import { MatChipsModule } from '@angular/material/chips';
import { MatButtonModule } from '@angular/material/button';
import { MatTooltipModule } from '@angular/material/tooltip';

export interface VerdictLadderStep {
  step: number;
  name: string;
  status: 'complete' | 'partial' | 'missing' | 'na';
  summary: string;
  evidence?: EvidenceItem[];
  expandable: boolean;
}

export interface EvidenceItem {
  type: string;
  title: string;
  source?: string;
  hash?: string;
  signed?: boolean;
  signedBy?: string;
  uri?: string;
  preview?: string;
}

export interface VerdictLadderData {
  findingId: string;
  steps: VerdictLadderStep[];
  finalVerdict: 'ship' | 'block' | 'exception';
}

@Component({
  selector: 'stella-verdict-ladder',
  standalone: true,
  imports: [
    CommonModule,
    MatExpansionModule,
    MatIconModule,
    MatChipsModule,
    MatButtonModule,
    MatTooltipModule
  ],
  templateUrl: './verdict-ladder.component.html',
  styleUrls: ['./verdict-ladder.component.scss'],
  changeDetection: ChangeDetectionStrategy.OnPush
})
export class VerdictLadderComponent {
  @Input({ required: true }) data!: VerdictLadderData;

  getStepIcon(step: VerdictLadderStep): string {
    switch (step.status) {
      case 'complete': return 'check_circle';
      case 'partial': return 'radio_button_checked';
      case 'missing': return 'error';
      case 'na': return 'remove_circle_outline';
    }
  }

  getStepClass(step: VerdictLadderStep): string {
    return `step-${step.status}`;
  }

  getStepLabel(stepNumber: number): string {
    switch (stepNumber) {
      case 1: return 'Detection';
      case 2: return 'Component';
      case 3: return 'Applicability';
      case 4: return 'Reachability';
      case 5: return 'Runtime';
      case 6: return 'VEX Merge';
      case 7: return 'Policy';
      case 8: return 'Attestation';
      default: return `Step ${stepNumber}`;
    }
  }

  getEvidenceIcon(type: string): string {
    switch (type) {
      case 'sbom_slice': return 'inventory_2';
      case 'vex_doc': return 'description';
      case 'provenance': return 'verified';
      case 'callstack_slice': return 'account_tree';
      case 'reachability_proof': return 'route';
      case 'replay_manifest': return 'replay';
      case 'policy': return 'policy';
      case 'scan_log': return 'article';
      default: return 'attachment';
    }
  }

  trackByStep(index: number, step: VerdictLadderStep): number {
    return step.step;
  }

  trackByEvidence(index: number, evidence: EvidenceItem): string {
    return evidence.hash ?? evidence.title;
  }
}
```

**Template** (`verdict-ladder.component.html`):
```html
<div class="verdict-ladder">
  <div class="ladder-header">
    <h3>Verdict Trail</h3>
    <mat-chip [class]="'verdict-' + data.finalVerdict">
      {{ data.finalVerdict | uppercase }}
    </mat-chip>
  </div>

  <div class="ladder-timeline">
    <mat-accordion multi>
      <mat-expansion-panel
        *ngFor="let step of data.steps; trackBy: trackByStep"
        [expanded]="false"
        [disabled]="!step.expandable"
        [class]="getStepClass(step)"
      >
        <mat-expansion-panel-header>
          <mat-panel-title>
            <div class="step-header">
              <div class="step-number">{{ step.step }}</div>
              <mat-icon [class]="'status-icon ' + getStepClass(step)">
                {{ getStepIcon(step) }}
              </mat-icon>
              <span class="step-name">{{ step.name }}</span>
            </div>
          </mat-panel-title>
          <mat-panel-description>
            {{ step.summary }}
          </mat-panel-description>
        </mat-expansion-panel-header>

        <!-- Expanded Content -->
        <div class="step-content" *ngIf="step.evidence?.length">
          <div class="evidence-list">
            <div
              class="evidence-item"
              *ngFor="let ev of step.evidence; trackBy: trackByEvidence"
            >
              <div class="evidence-header">
                <mat-icon>{{ getEvidenceIcon(ev.type) }}</mat-icon>
                <span class="evidence-title">{{ ev.title }}</span>
                <mat-icon *ngIf="ev.signed" class="signed-icon" matTooltip="Signed by {{ ev.signedBy }}">
                  verified
                </mat-icon>
              </div>

              <div class="evidence-details">
                <span class="evidence-source" *ngIf="ev.source">
                  Source: {{ ev.source }}
                </span>
                <code class="evidence-hash" *ngIf="ev.hash">
                  {{ ev.hash | slice:0:16 }}...
                </code>
              </div>

              <div class="evidence-preview" *ngIf="ev.preview">
                <pre>{{ ev.preview }}</pre>
              </div>

              <div class="evidence-actions">
                <button mat-stroked-button size="small" *ngIf="ev.uri">
                  <mat-icon>download</mat-icon>
                  Download
                </button>
                <button mat-stroked-button size="small">
                  <mat-icon>visibility</mat-icon>
                  View Full
                </button>
              </div>
            </div>
          </div>
        </div>

        <div class="step-content no-evidence" *ngIf="!step.evidence?.length && step.status !== 'na'">
          <p>No evidence artifacts attached to this step.</p>
        </div>
      </mat-expansion-panel>
    </mat-accordion>
  </div>

  <!-- Timeline connector line -->
  <div class="timeline-connector"></div>
</div>
```

**Styles** (`verdict-ladder.component.scss`):
```scss
.verdict-ladder {
  position: relative;
  padding: 16px;
  background: var(--surface);
  border-radius: 8px;
}

.ladder-header {
  display: flex;
  justify-content: space-between;
  align-items: center;
  margin-bottom: 24px;

  h3 {
    margin: 0;
    font-size: 1.125rem;
    font-weight: 500;
  }

  .verdict-ship { background-color: var(--success); color: white; }
  .verdict-block { background-color: var(--error); color: white; }
  .verdict-exception { background-color: var(--warning); color: black; }
}

.ladder-timeline {
  position: relative;
  z-index: 1;

  mat-expansion-panel {
    margin-bottom: 8px;
    border-left: 3px solid var(--outline);

    &.step-complete {
      border-left-color: var(--success);
    }

    &.step-partial {
      border-left-color: var(--warning);
    }

    &.step-missing {
      border-left-color: var(--error);
    }

    &.step-na {
      border-left-color: var(--outline-variant);
      opacity: 0.7;
    }
  }
}

.step-header {
  display: flex;
  align-items: center;
  gap: 12px;

  .step-number {
    width: 24px;
    height: 24px;
    border-radius: 50%;
    background: var(--primary-container);
    color: var(--on-primary-container);
    display: flex;
    align-items: center;
    justify-content: center;
    font-size: 0.75rem;
    font-weight: 600;
  }

  .status-icon {
    font-size: 20px;
    width: 20px;
    height: 20px;

    &.step-complete { color: var(--success); }
    &.step-partial { color: var(--warning); }
    &.step-missing { color: var(--error); }
    &.step-na { color: var(--outline); }
  }

  .step-name {
    font-weight: 500;
  }
}

.step-content {
  padding: 16px 0;
}

.evidence-list {
  display: flex;
  flex-direction: column;
  gap: 16px;
}

.evidence-item {
  padding: 12px;
  background: var(--surface-variant);
  border-radius: 8px;

  .evidence-header {
    display: flex;
    align-items: center;
    gap: 8px;
    margin-bottom: 8px;

    mat-icon {
      color: var(--primary);
    }

    .evidence-title {
      flex: 1;
      font-weight: 500;
    }

    .signed-icon {
      color: var(--success);
    }
  }

  .evidence-details {
    display: flex;
    gap: 16px;
    font-size: 0.875rem;
    color: var(--on-surface-variant);
    margin-bottom: 8px;

    .evidence-hash {
      font-family: monospace;
      background: var(--surface);
      padding: 2px 6px;
      border-radius: 4px;
    }
  }

  .evidence-preview {
    pre {
      background: var(--surface);
      padding: 12px;
      border-radius: 4px;
      overflow-x: auto;
      font-size: 0.75rem;
      max-height: 200px;
    }
  }

  .evidence-actions {
    display: flex;
    gap: 8px;
    margin-top: 12px;
  }
}

.no-evidence {
  color: var(--on-surface-variant);
  font-style: italic;
}

// Timeline connector
.timeline-connector {
  position: absolute;
  left: 36px;
  top: 80px;
  bottom: 20px;
  width: 2px;
  background: linear-gradient(
    to bottom,
    var(--success) 0%,
    var(--warning) 50%,
    var(--error) 100%
  );
  z-index: 0;
  opacity: 0.3;
}
```

**Acceptance Criteria**:
- [ ] `verdict-ladder.component.ts` file created
- [ ] Vertical timeline with 8 steps
- [ ] Accordion expansion for each step
- [ ] Status icons (complete/partial/missing/na)
- [ ] Color-coded border by status

---

### T2: Step 1 - Detection Sources

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Implement detection step showing CVE sources and SBOM match.

**Implementation** - Detection Step Data:
```typescript
// detection-step.service.ts
export interface DetectionEvidence {
  cveId: string;
  sources: {
    name: string;
    publishedAt: Date;
    url?: string;
  }[];
  sbomMatch: {
    purl: string;
    matchedVersion: string;
    location: string;
    sbomDigest: string;
  };
}

export function buildDetectionStep(evidence: DetectionEvidence): VerdictLadderStep {
  return {
    step: 1,
    name: 'Detection',
    status: evidence.sources.length > 0 ? 'complete' : 'missing',
    summary: `${evidence.cveId} from ${evidence.sources.length} source(s)`,
    expandable: true,
    evidence: [
      {
        type: 'scan_log',
        title: `CVE Sources for ${evidence.cveId}`,
        preview: evidence.sources.map(s => `${s.name}: ${s.publishedAt.toISOString()}`).join('\n')
      },
      {
        type: 'sbom_slice',
        title: 'SBOM Match',
        source: evidence.sbomMatch.purl,
        hash: evidence.sbomMatch.sbomDigest,
        preview: `Package: ${evidence.sbomMatch.purl}\nVersion: ${evidence.sbomMatch.matchedVersion}\nLocation: ${evidence.sbomMatch.location}`
      }
    ]
  };
}
```

**Acceptance Criteria**:
- [ ] Shows CVE ID and source count
- [ ] Lists all CVE sources with timestamps
- [ ] Shows SBOM match details
- [ ] Links to CVE source URLs

---

### T3: Step 2 - Component Identification

**Assignee**: UI Team
**Story Points**: 1
**Status**: TODO
**Dependencies**: T1

**Description**:
Show PURL, version, and location.

**Implementation**:
```typescript
export interface ComponentEvidence {
  purl: string;
  version: string;
  location: string;
  ecosystem: string;
  name: string;
  namespace?: string;
}

export function buildComponentStep(evidence: ComponentEvidence): VerdictLadderStep {
  return {
    step: 2,
    name: 'Component',
    status: 'complete',
    summary: `${evidence.name}@${evidence.version}`,
    expandable: true,
    evidence: [
      {
        type: 'sbom_slice',
        title: 'Component Identity',
        preview: `PURL: ${evidence.purl}\nEcosystem: ${evidence.ecosystem}\nName: ${evidence.name}\nVersion: ${evidence.version}\nLocation: ${evidence.location}`
      }
    ]
  };
}
```

**Acceptance Criteria**:
- [ ] Shows component PURL
- [ ] Displays version
- [ ] Shows file location in container
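
The `ecosystem`/`namespace`/`name`/`version` fields of `ComponentEvidence` are all derivable from the PURL string itself. A naive parser sketch for illustration (it handles only the common `pkg:type/namespace/name@version` shape; a real implementation should follow the full purl spec, including qualifiers, subpath, and percent-decoding):

```typescript
// Naive PURL parser sketch — common "pkg:type/namespace/name@version" shape only.
interface ParsedPurl {
  ecosystem: string;
  namespace?: string;
  name: string;
  version?: string;
}

function parsePurl(purl: string): ParsedPurl | null {
  // ecosystem / optional namespace / name, then optional @version.
  const m = purl.match(/^pkg:([^/]+)\/(?:([^/@]+)\/)?([^/@]+)(?:@([^?#]+))?/);
  if (!m) return null;
  return { ecosystem: m[1], namespace: m[2], name: m[3], version: m[4] };
}

console.log(parsePurl('pkg:npm/lodash@4.17.21')?.name);          // lodash
console.log(parsePurl('pkg:deb/debian/openssl@3.0.11-1')?.namespace); // debian
```

In practice the API from Sprint 4200.0001.0001 would return these fields pre-parsed; the sketch only shows the relationship between them.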

---

### T4: Step 3 - Applicability

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Show OVAL or version range match.

**Implementation**:
```typescript
export interface ApplicabilityEvidence {
  matchType: 'oval' | 'version_range' | 'exact';
  ovalDefinition?: string;
  versionRange?: string;
  installedVersion: string;
  result: 'applicable' | 'not_applicable' | 'unknown';
}

export function buildApplicabilityStep(evidence: ApplicabilityEvidence): VerdictLadderStep {
  return {
    step: 3,
    name: 'Applicability',
    status: evidence.result === 'applicable' ? 'complete'
      : evidence.result === 'not_applicable' ? 'na'
      : 'partial',
    summary: evidence.result === 'applicable'
      ? `Version ${evidence.installedVersion} is in affected range`
      : evidence.result === 'not_applicable'
        ? 'Version not in affected range'
        : 'Could not determine applicability',
    expandable: true,
    evidence: [
      {
        type: 'policy',
        title: 'Applicability Check',
        preview: evidence.matchType === 'oval'
          ? `OVAL Definition: ${evidence.ovalDefinition}`
          : `Version Range: ${evidence.versionRange}\nInstalled: ${evidence.installedVersion}`
      }
    ]
  };
}
```

**Acceptance Criteria**:
- [ ] Shows version range match
- [ ] OVAL definition if used
- [ ] Clear applicable/not-applicable status

---

### T5: Step 4 - Reachability Evidence

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Show static analysis call path.

**Implementation**:
```typescript
export interface ReachabilityEvidence {
  result: 'reachable' | 'not_reachable' | 'unknown';
  analysisType: 'static' | 'dynamic' | 'both';
  callPath?: string[];
  confidence: number;
  proofHash?: string;
  proofSigned?: boolean;
}

export function buildReachabilityStep(evidence: ReachabilityEvidence): VerdictLadderStep {
  return {
    step: 4,
    name: 'Reachability',
    status: evidence.result === 'reachable' ? 'complete'
      : evidence.result === 'not_reachable' ? 'na'
      : 'missing',
    summary: evidence.result === 'reachable'
      ? `Reachable (${(evidence.confidence * 100).toFixed(0)}% confidence)`
      : evidence.result === 'not_reachable'
        ? 'Not reachable from entry points'
        : 'Reachability unknown',
    expandable: evidence.callPath !== undefined,
    evidence: evidence.callPath ? [
      {
        type: 'reachability_proof',
        title: 'Call Path',
        hash: evidence.proofHash,
        signed: evidence.proofSigned,
        preview: evidence.callPath.map((fn, i) => `${' '.repeat(i)}→ ${fn}`).join('\n')
      }
    ] : undefined
  };
}
```

**Acceptance Criteria**:
- [ ] Shows reachability result
- [ ] Displays call path if reachable
- [ ] Shows confidence percentage
- [ ] Indicates if proof is signed

---

### T6: Step 5 - Runtime Confirmation

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Show process trace or runtime signal.

**Implementation**:
```typescript
export interface RuntimeEvidence {
  observed: boolean;
  signalType?: 'process_trace' | 'memory_access' | 'network_call';
  timestamp?: Date;
  processInfo?: {
    pid: number;
    name: string;
    container: string;
  };
  stackTrace?: string;
}

export function buildRuntimeStep(evidence: RuntimeEvidence | null): VerdictLadderStep {
  if (!evidence || !evidence.observed) {
    return {
      step: 5,
      name: 'Runtime',
      status: 'na',
      summary: 'No runtime observation',
      expandable: false
    };
  }

  return {
    step: 5,
    name: 'Runtime',
    status: 'complete',
    summary: `Observed via ${evidence.signalType} at ${evidence.timestamp?.toISOString()}`,
    expandable: true,
    evidence: [
      {
        type: 'scan_log',
        title: 'Runtime Observation',
        preview: evidence.stackTrace ?? `Process: ${evidence.processInfo?.name} (PID ${evidence.processInfo?.pid})\nContainer: ${evidence.processInfo?.container}`
      }
    ]
  };
}
```

**Acceptance Criteria**:
- [ ] Shows runtime observation if present
- [ ] Process/container info displayed
- [ ] Stack trace if available
- [ ] N/A status if no runtime data

---

### T7: Step 6 - VEX Merge

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Show lattice merge outcome with trust weights.

**Implementation**:
```typescript
export interface VexMergeEvidence {
  resultStatus: 'affected' | 'not_affected' | 'fixed' | 'under_investigation';
  inputStatements: {
    source: string;
    status: string;
    trustWeight: number;
  }[];
  hadConflicts: boolean;
  winningSource?: string;
  mergeTrace?: string;
}

export function buildVexStep(evidence: VexMergeEvidence): VerdictLadderStep {
  const statusLabel = evidence.resultStatus.replace('_', ' ');

  return {
    step: 6,
    name: 'VEX Merge',
    status: evidence.resultStatus === 'not_affected' ? 'na'
      : evidence.resultStatus === 'affected' ? 'complete'
      : 'partial',
    summary: evidence.hadConflicts
      ? `${statusLabel} (resolved from ${evidence.inputStatements.length} sources)`
      : statusLabel,
    expandable: true,
    evidence: [
      {
        type: 'vex_doc',
        title: 'VEX Merge Result',
        source: evidence.winningSource,
        preview: evidence.inputStatements.map(s =>
          `${s.source}: ${s.status} (trust: ${(s.trustWeight * 100).toFixed(0)}%)`
        ).join('\n') + (evidence.mergeTrace ? `\n\n${evidence.mergeTrace}` : '')
      }
    ]
  };
}
```

**Acceptance Criteria**:
- [ ] Shows merged VEX status
- [ ] Lists all input statements
- [ ] Shows trust weights
- [ ] Displays merge trace if conflicts
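
The merge itself happens server-side; this step only visualizes the outcome. As an illustration of the trust-weighted resolution it renders, a minimal sketch (a hypothetical helper, not the production lattice implementation):

```typescript
// Illustrative trust-weighted VEX resolution: the statement with the highest
// trust weight wins; a conflict is flagged when inputs disagree on status.
interface VexStatement {
  source: string;
  status: string;
  trustWeight: number;
}

function mergeVex(statements: VexStatement[]): {
  resultStatus: string;
  winningSource: string;
  hadConflicts: boolean;
} {
  if (statements.length === 0) throw new Error('no VEX statements to merge');
  // Highest trust weight wins (first occurrence wins ties).
  const winner = statements.reduce((a, b) => (b.trustWeight > a.trustWeight ? b : a));
  const hadConflicts = new Set(statements.map(s => s.status)).size > 1;
  return { resultStatus: winner.status, winningSource: winner.source, hadConflicts };
}

const merged = mergeVex([
  { source: 'vendor', status: 'not_affected', trustWeight: 0.9 },
  { source: 'distro', status: 'affected', trustWeight: 0.6 },
]);
console.log(merged.resultStatus, merged.hadConflicts); // not_affected true
```

This is exactly the shape of data the step's evidence preview prints: each input with its trust percentage, plus the winning source.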

---

### T8: Step 7 - Policy Trace

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Show policy rule to verdict mapping.

**Implementation**:
```typescript
export interface PolicyTraceEvidence {
  policyId: string;
  policyVersion: string;
  matchedRules: {
    ruleId: string;
    ruleName: string;
    effect: 'allow' | 'deny' | 'warn';
    condition: string;
  }[];
  finalDecision: 'ship' | 'block' | 'exception';
  explanation: string;
}

export function buildPolicyStep(evidence: PolicyTraceEvidence): VerdictLadderStep {
  return {
    step: 7,
    name: 'Policy',
    status: 'complete',
    summary: `${evidence.matchedRules.length} rule(s) → ${evidence.finalDecision}`,
    expandable: true,
    evidence: [
      {
        type: 'policy',
        title: `Policy ${evidence.policyId} v${evidence.policyVersion}`,
        preview: evidence.matchedRules.map(r =>
          `${r.ruleId}: ${r.ruleName}\n Effect: ${r.effect}\n Condition: ${r.condition}`
        ).join('\n\n') + `\n\n${evidence.explanation}`
      }
    ]
  };
}
```

**Acceptance Criteria**:
- [ ] Shows policy ID and version
- [ ] Lists matched rules
- [ ] Shows rule conditions
- [ ] Explains final decision

---

### T9: Step 8 - Attestation

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Show signature and transparency log entry.

**Implementation**:
```typescript
export interface AttestationEvidence {
  attestationId: string;
  predicateType: string;
  signedBy: string;
  signedAt: Date;
  signatureAlgorithm: string;
  rekorEntry?: {
    logId: string;
    logIndex: number;
    url: string;
  };
  envelope?: object;
}

export function buildAttestationStep(evidence: AttestationEvidence | null): VerdictLadderStep {
  if (!evidence) {
    return {
      step: 8,
      name: 'Attestation',
      status: 'missing',
      summary: 'Not attested',
      expandable: false
    };
  }

  return {
    step: 8,
    name: 'Attestation',
    status: 'complete',
    summary: `Signed by ${evidence.signedBy}${evidence.rekorEntry ? ' (in Rekor)' : ''}`,
    expandable: true,
    evidence: [
      {
        type: 'provenance',
        title: 'DSSE Attestation',
        signed: true,
        signedBy: evidence.signedBy,
        hash: evidence.attestationId,
        preview: `Type: ${evidence.predicateType}\nSigned: ${evidence.signedAt.toISOString()}\nAlgorithm: ${evidence.signatureAlgorithm}${evidence.rekorEntry ? `\n\nRekor Log Index: ${evidence.rekorEntry.logIndex}` : ''}`
      }
    ]
  };
}
```

**Acceptance Criteria**:
- [ ] Shows signer identity
- [ ] Displays signature timestamp
- [ ] Links to Rekor entry if available
- [ ] Shows predicate type
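
With all eight step builders in place, assembling the ladder is a straight composition. A sketch of the orchestration (simplified local types for illustration; the real component uses the T1 interfaces and the builders above):

```typescript
// Sketch: order the built steps and compute a completeness ratio for the header.
interface Step {
  step: number;
  name: string;
  status: 'complete' | 'partial' | 'missing' | 'na';
}

function assembleLadder(steps: Step[]): { steps: Step[]; completeness: number } {
  const ordered = [...steps].sort((a, b) => a.step - b.step);
  // Treat 'na' steps as satisfied: they were evaluated and found not applicable.
  const satisfied = ordered.filter(s => s.status === 'complete' || s.status === 'na').length;
  return { steps: ordered, completeness: satisfied / ordered.length };
}

const ladder = assembleLadder([
  { step: 2, name: 'Component', status: 'complete' },
  { step: 1, name: 'Detection', status: 'complete' },
  { step: 8, name: 'Attestation', status: 'missing' },
  { step: 5, name: 'Runtime', status: 'na' },
]);
console.log(ladder.steps[0].name, ladder.completeness); // Detection 0.75
```

Sorting defensively by `step` means the builders can run in any order (e.g. in parallel API calls) without breaking the visual top-to-bottom trail.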

---

### T10: Expand/Collapse Steps

**Assignee**: UI Team
**Story Points**: 1
**Status**: TODO
**Dependencies**: T1

**Description**:
Add expand all / collapse all controls.

**Implementation** - Add to component:
```typescript
// Add to verdict-ladder.component.ts
// (requires importing ViewChildren and QueryList from '@angular/core',
// and MatExpansionPanel from '@angular/material/expansion')
@ViewChildren(MatExpansionPanel) panels!: QueryList<MatExpansionPanel>;

expandAll(): void {
  this.panels.forEach(panel => {
    if (!panel.disabled) {
      panel.open();
    }
  });
}

collapseAll(): void {
  this.panels.forEach(panel => panel.close());
}
```

**Add to template**:
```html
<div class="ladder-controls">
  <button mat-button (click)="expandAll()">
    <mat-icon>unfold_more</mat-icon>
    Expand All
  </button>
  <button mat-button (click)="collapseAll()">
    <mat-icon>unfold_less</mat-icon>
    Collapse All
  </button>
</div>
```

**Acceptance Criteria**:
- [ ] Expand all button works
- [ ] Collapse all button works
- [ ] Disabled panels skipped

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | UI Team | Create verdict-ladder.component.ts |
| 2 | T2 | TODO | T1 | UI Team | Step 1: Detection sources |
| 3 | T3 | TODO | T1 | UI Team | Step 2: Component identification |
| 4 | T4 | TODO | T1 | UI Team | Step 3: Applicability |
| 5 | T5 | TODO | T1 | UI Team | Step 4: Reachability evidence |
| 6 | T6 | TODO | T1 | UI Team | Step 5: Runtime confirmation |
| 7 | T7 | TODO | T1 | UI Team | Step 6: VEX merge |
| 8 | T8 | TODO | T1 | UI Team | Step 7: Policy trace |
| 9 | T9 | TODO | T1 | UI Team | Step 8: Attestation |
| 10 | T10 | TODO | T1 | UI Team | Expand/collapse steps |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from UX Gap Analysis. Verdict Ladder identified as key explainability pattern. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| 8 steps | Decision | UI Team | Based on advisory: Detection→Attestation |
| Accordion UI | Decision | UI Team | Use Material expansion panels |
| Status colors | Decision | UI Team | complete=green, partial=yellow, missing=red, na=gray |
| Evidence types | Decision | UI Team | Map to existing TriageEvidenceType enum |

---

## Success Criteria

- [ ] All 10 tasks marked DONE
- [ ] All 8 steps visible in vertical ladder
- [ ] Each step shows evidence type and source
- [ ] Clicking step expands to show proof artifact
- [ ] Final attestation link at bottom
- [ ] Expand/collapse all works
- [ ] `ng build` succeeds
- [ ] `ng test` succeeds

799
docs/implplan/SPRINT_4200_0002_0003_delta_compare_view.md
Normal file
@@ -0,0 +1,799 @@
# Sprint 4200.0002.0003 · Delta/Compare View UI

## Topic & Scope

- Create three-pane layout for comparing artifacts/verdicts
- Enable baseline selection (last green, previous release, custom)
- Show delta summary and categorized changes with evidence

**Working directory:** `src/Web/StellaOps.Web/src/app/features/compare/`

## Dependencies & Concurrency

- **Upstream**: Sprint 4100.0002.0001 (Knowledge Snapshot Manifest)
- **Downstream**: None
- **Safe to parallelize with**: Sprint 4200.0002.0001 ("Can I Ship?" Header), Sprint 4200.0002.0004 (CLI Compare)

## Documentation Prerequisites

- `src/Web/StellaOps.Web/AGENTS.md`
- `docs/product-advisories/21-Dec-2025 - Smart Diff - Reproducibility as a Feature.md`
- `docs/product-advisories/21-Dec-2025 - How Top Scanners Shape Evidence‑First UX.md`

---

## Tasks

### T1: Create compare-view.component.ts

**Assignee**: UI Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: —

**Description**:
Create the main three-pane comparison layout.

**Implementation Path**: `compare-view/compare-view.component.ts` (new file)

**Implementation**:
```typescript
import { Component, OnInit, ChangeDetectionStrategy, signal, computed } from '@angular/core';
import { CommonModule } from '@angular/common';
import { MatSelectModule } from '@angular/material/select';
import { MatButtonModule } from '@angular/material/button';
import { MatIconModule } from '@angular/material/icon';
import { MatListModule } from '@angular/material/list';
import { MatChipsModule } from '@angular/material/chips';
import { MatSidenavModule } from '@angular/material/sidenav';
import { MatToolbarModule } from '@angular/material/toolbar';
import { ActivatedRoute } from '@angular/router';
import { CompareService } from '../services/compare.service'; // import was missing; path assumed

export interface CompareTarget {
  id: string;
  type: 'artifact' | 'snapshot' | 'verdict';
  label: string;
  digest?: string;
  timestamp: Date;
}

export interface DeltaCategory {
  id: string;
  name: string;
  icon: string;
  added: number;
  removed: number;
  changed: number;
}

export interface DeltaItem {
  id: string;
  category: string;
  changeType: 'added' | 'removed' | 'changed';
  title: string;
  severity?: 'critical' | 'high' | 'medium' | 'low';
  beforeValue?: string;
  afterValue?: string;
}

export interface EvidencePane {
  itemId: string;
  title: string;
  beforeEvidence?: object;
  afterEvidence?: object;
}

@Component({
  selector: 'stella-compare-view',
  standalone: true,
  imports: [
    CommonModule,
    MatSelectModule,
    MatButtonModule,
    MatIconModule,
    MatListModule,
    MatChipsModule,
    MatSidenavModule,
    MatToolbarModule
  ],
  templateUrl: './compare-view.component.html',
  styleUrls: ['./compare-view.component.scss'],
  changeDetection: ChangeDetectionStrategy.OnPush
})
export class CompareViewComponent implements OnInit {
  // State
  currentTarget = signal<CompareTarget | null>(null);
  baselineTarget = signal<CompareTarget | null>(null);
  categories = signal<DeltaCategory[]>([]);
  selectedCategory = signal<string | null>(null);
  items = signal<DeltaItem[]>([]);
  selectedItem = signal<DeltaItem | null>(null);
  evidence = signal<EvidencePane | null>(null);
  viewMode = signal<'side-by-side' | 'unified'>('side-by-side');

  // Computed
  filteredItems = computed(() => {
    const cat = this.selectedCategory();
    if (!cat) return this.items();
    return this.items().filter(i => i.category === cat);
  });

  deltaSummary = computed(() => {
    const cats = this.categories();
    return {
      totalAdded: cats.reduce((sum, c) => sum + c.added, 0),
      totalRemoved: cats.reduce((sum, c) => sum + c.removed, 0),
      totalChanged: cats.reduce((sum, c) => sum + c.changed, 0)
    };
  });

  // Baseline presets
  baselinePresets = [
    { id: 'last-green', label: 'Last Green Build' },
    { id: 'previous-release', label: 'Previous Release' },
    { id: 'main-branch', label: 'Main Branch' },
    { id: 'custom', label: 'Custom...' }
  ];

  constructor(
    private route: ActivatedRoute,
    private compareService: CompareService
  ) {}

  ngOnInit(): void {
    // Load from route params
    const currentId = this.route.snapshot.paramMap.get('current');
    const baselineId = this.route.snapshot.queryParamMap.get('baseline');

    if (currentId) {
      this.loadTarget(currentId, 'current');
    }
    if (baselineId) {
      this.loadTarget(baselineId, 'baseline');
    }
  }

  async loadTarget(id: string, type: 'current' | 'baseline'): Promise<void> {
    const target = await this.compareService.getTarget(id);
    if (type === 'current') {
      this.currentTarget.set(target);
    } else {
      this.baselineTarget.set(target);
    }
    this.loadDelta();
  }

  async loadDelta(): Promise<void> {
    const current = this.currentTarget();
    const baseline = this.baselineTarget();
    if (!current || !baseline) return;

    const delta = await this.compareService.computeDelta(current.id, baseline.id);
    this.categories.set(delta.categories);
    this.items.set(delta.items);
  }

  selectCategory(categoryId: string): void {
    this.selectedCategory.set(
      this.selectedCategory() === categoryId ? null : categoryId
    );
  }

  selectItem(item: DeltaItem): void {
    this.selectedItem.set(item);
    this.loadEvidence(item);
  }

  async loadEvidence(item: DeltaItem): Promise<void> {
    const current = this.currentTarget();
    const baseline = this.baselineTarget();
    if (!current || !baseline) return;

    const evidence = await this.compareService.getItemEvidence(
      item.id,
      baseline.id,
      current.id
    );
    this.evidence.set(evidence);
  }

  toggleViewMode(): void {
    this.viewMode.set(
      this.viewMode() === 'side-by-side' ? 'unified' : 'side-by-side'
    );
  }

  exportReport(): void {
    // Referenced by the toolbar template; delegates to the export service (see T8).
  }

  getChangeIcon(changeType: 'added' | 'removed' | 'changed'): string {
    switch (changeType) {
      case 'added': return 'add_circle';
      case 'removed': return 'remove_circle';
      case 'changed': return 'change_circle';
    }
  }

  getChangeClass(changeType: 'added' | 'removed' | 'changed'): string {
    return `change-${changeType}`;
  }
}
```
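The component injects a `CompareService` that this sprint does not define. The sketch below infers its contract purely from the component's call sites (`getTarget`, `computeDelta`, `getItemEvidence` are inferred names, not an existing API) and pairs it with an in-memory stand-in so the component can be unit-tested without a backend:

```typescript
// Hypothetical delta shape inferred from CompareViewComponent's call sites.
export interface DeltaResult {
  categories: { id: string; name: string; icon: string; added: number; removed: number; changed: number }[];
  items: { id: string; category: string; changeType: 'added' | 'removed' | 'changed'; title: string }[];
}

// In-memory stand-in for unit tests; the real service would call the backend.
export class InMemoryCompareService {
  // Deltas are staged keyed by "currentId|baselineId".
  constructor(private readonly deltas: Map<string, DeltaResult>) {}

  async getTarget(id: string) {
    return { id, type: 'artifact' as const, label: id, timestamp: new Date(0) };
  }

  async computeDelta(currentId: string, baselineId: string): Promise<DeltaResult> {
    return this.deltas.get(`${currentId}|${baselineId}`) ?? { categories: [], items: [] };
  }

  async getItemEvidence(itemId: string, baselineId: string, currentId: string) {
    return { itemId, title: itemId, beforeEvidence: {}, afterEvidence: {} };
  }
}
```

In a spec file the stand-in can be provided in place of the real service, keeping T1's tests independent of the Scanner/Snapshot backends.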

**Template** (`compare-view.component.html`):

```html
<div class="compare-view">
  <!-- Header with baseline selector -->
  <mat-toolbar class="compare-toolbar">
    <div class="target-selector">
      <span class="label">Comparing:</span>
      <span class="target current">{{ currentTarget()?.label }}</span>
      <mat-icon>arrow_forward</mat-icon>
      <mat-select
        [value]="baselineTarget()?.id"
        (selectionChange)="loadTarget($event.value, 'baseline')"
        placeholder="Select baseline"
      >
        <mat-option *ngFor="let preset of baselinePresets" [value]="preset.id">
          {{ preset.label }}
        </mat-option>
      </mat-select>
    </div>

    <div class="toolbar-actions">
      <button mat-icon-button (click)="toggleViewMode()" matTooltip="Toggle view mode">
        <mat-icon>{{ viewMode() === 'side-by-side' ? 'view_agenda' : 'view_column' }}</mat-icon>
      </button>
      <button mat-stroked-button (click)="exportReport()">
        <mat-icon>download</mat-icon>
        Export
      </button>
    </div>
  </mat-toolbar>

  <!-- Delta Summary Strip -->
  <div class="delta-summary" *ngIf="deltaSummary() as summary">
    <div class="summary-chip added">
      <mat-icon>add</mat-icon>
      +{{ summary.totalAdded }} added
    </div>
    <div class="summary-chip removed">
      <mat-icon>remove</mat-icon>
      -{{ summary.totalRemoved }} removed
    </div>
    <div class="summary-chip changed">
      <mat-icon>swap_horiz</mat-icon>
      {{ summary.totalChanged }} changed
    </div>
  </div>

  <!-- Three-pane layout -->
  <div class="panes-container">
    <!-- Pane 1: Categories -->
    <div class="pane categories-pane">
      <h4>Categories</h4>
      <mat-nav-list>
        <mat-list-item
          *ngFor="let cat of categories()"
          [class.selected]="selectedCategory() === cat.id"
          (click)="selectCategory(cat.id)"
        >
          <mat-icon matListItemIcon>{{ cat.icon }}</mat-icon>
          <span matListItemTitle>{{ cat.name }}</span>
          <span matListItemLine class="category-counts">
            <span class="added" *ngIf="cat.added">+{{ cat.added }}</span>
            <span class="removed" *ngIf="cat.removed">-{{ cat.removed }}</span>
            <span class="changed" *ngIf="cat.changed">~{{ cat.changed }}</span>
          </span>
        </mat-list-item>
      </mat-nav-list>
    </div>

    <!-- Pane 2: Items -->
    <div class="pane items-pane">
      <h4>Changes</h4>
      <mat-nav-list>
        <mat-list-item
          *ngFor="let item of filteredItems()"
          [class.selected]="selectedItem()?.id === item.id"
          (click)="selectItem(item)"
        >
          <mat-icon matListItemIcon [class]="getChangeClass(item.changeType)">
            {{ getChangeIcon(item.changeType) }}
          </mat-icon>
          <span matListItemTitle>{{ item.title }}</span>
          <mat-chip *ngIf="item.severity" [class]="'severity-' + item.severity">
            {{ item.severity }}
          </mat-chip>
        </mat-list-item>
      </mat-nav-list>

      <div class="empty-state" *ngIf="filteredItems().length === 0">
        <mat-icon>check_circle</mat-icon>
        <p>No changes in this category</p>
      </div>
    </div>

    <!-- Pane 3: Evidence -->
    <div class="pane evidence-pane">
      <h4>Evidence</h4>

      <div *ngIf="evidence() as ev; else noEvidence">
        <div class="evidence-header">
          <span>{{ ev.title }}</span>
        </div>

        <div class="evidence-content" [ngSwitch]="viewMode()">
          <!-- Side-by-side view -->
          <div *ngSwitchCase="'side-by-side'" class="side-by-side">
            <div class="before">
              <h5>Baseline</h5>
              <pre>{{ ev.beforeEvidence | json }}</pre>
            </div>
            <div class="after">
              <h5>Current</h5>
              <pre>{{ ev.afterEvidence | json }}</pre>
            </div>
          </div>

          <!-- Unified view -->
          <div *ngSwitchCase="'unified'" class="unified">
            <pre class="diff-view">
              <!-- Diff highlighting would go here -->
            </pre>
          </div>
        </div>
      </div>

      <ng-template #noEvidence>
        <div class="empty-state">
          <mat-icon>touch_app</mat-icon>
          <p>Select an item to view evidence</p>
        </div>
      </ng-template>
    </div>
  </div>
</div>
```
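The unified view's `<pre class="diff-view">` is still a placeholder. One cheap way to feed its `.added`/`.removed` styles is a naive set-based line classifier (a sketch, not a real Myers diff — it ignores ordering and duplicate lines):

```typescript
// Classify lines for the unified view: lines only in `before` are removed,
// lines only in `after` are added, shared lines are unchanged.
type DiffRow = { kind: 'same' | 'added' | 'removed'; text: string };

function unifiedRows(before: string[], after: string[]): DiffRow[] {
  const beforeSet = new Set(before);
  const afterSet = new Set(after);
  const rows: DiffRow[] = [];
  for (const line of before) {
    rows.push({ kind: afterSet.has(line) ? 'same' : 'removed', text: line });
  }
  for (const line of after) {
    if (!beforeSet.has(line)) rows.push({ kind: 'added', text: line });
  }
  return rows;
}
```

Each row's `kind` maps directly onto the `.added`/`.removed` classes already defined in the SCSS; a proper diff library can replace the classifier later without touching the template.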

**Styles** (`compare-view.component.scss`):

```scss
.compare-view {
  display: flex;
  flex-direction: column;
  height: 100%;
}

.compare-toolbar {
  display: flex;
  justify-content: space-between;
  padding: 8px 16px;
  background: var(--surface-container);

  .target-selector {
    display: flex;
    align-items: center;
    gap: 12px;

    .label {
      color: var(--on-surface-variant);
    }

    .target {
      font-weight: 500;
      padding: 4px 12px;
      background: var(--primary-container);
      border-radius: 16px;
    }
  }

  .toolbar-actions {
    display: flex;
    gap: 8px;
  }
}

.delta-summary {
  display: flex;
  gap: 16px;
  padding: 12px 16px;
  background: var(--surface);
  border-bottom: 1px solid var(--outline-variant);

  .summary-chip {
    display: flex;
    align-items: center;
    gap: 4px;
    padding: 4px 12px;
    border-radius: 16px;
    font-weight: 500;

    &.added {
      background: var(--success-container);
      color: var(--on-success-container);
    }

    &.removed {
      background: var(--error-container);
      color: var(--on-error-container);
    }

    &.changed {
      background: var(--warning-container);
      color: var(--on-warning-container);
    }
  }
}

.panes-container {
  display: flex;
  flex: 1;
  overflow: hidden;
}

.pane {
  display: flex;
  flex-direction: column;
  border-right: 1px solid var(--outline-variant);
  overflow-y: auto;

  h4 {
    padding: 12px 16px;
    margin: 0;
    background: var(--surface-variant);
    font-size: 0.875rem;
    font-weight: 600;
    text-transform: uppercase;
    letter-spacing: 0.5px;
  }

  &:last-child {
    border-right: none;
  }
}

.categories-pane {
  width: 220px;
  flex-shrink: 0;

  .category-counts {
    display: flex;
    gap: 8px;
    font-size: 0.75rem;

    .added { color: var(--success); }
    .removed { color: var(--error); }
    .changed { color: var(--warning); }
  }
}

.items-pane {
  width: 320px;
  flex-shrink: 0;

  .change-added { color: var(--success); }
  .change-removed { color: var(--error); }
  .change-changed { color: var(--warning); }

  .severity-critical { background: var(--error); color: white; }
  .severity-high { background: var(--warning); color: black; }
  .severity-medium { background: var(--tertiary); color: white; }
  .severity-low { background: var(--outline); color: white; }
}

.evidence-pane {
  flex: 1;

  .evidence-content {
    padding: 16px;
  }

  .side-by-side {
    display: grid;
    grid-template-columns: 1fr 1fr;
    gap: 16px;

    .before, .after {
      h5 {
        margin: 0 0 8px;
        font-size: 0.875rem;
        color: var(--on-surface-variant);
      }

      pre {
        background: var(--surface-variant);
        padding: 12px;
        border-radius: 8px;
        overflow-x: auto;
        font-size: 0.75rem;
      }
    }

    .before pre {
      border-left: 3px solid var(--error);
    }

    .after pre {
      border-left: 3px solid var(--success);
    }
  }

  .unified {
    .diff-view {
      background: var(--surface-variant);
      padding: 12px;
      border-radius: 8px;

      .added { background: rgba(var(--success-rgb), 0.2); }
      .removed { background: rgba(var(--error-rgb), 0.2); }
    }
  }
}

.empty-state {
  display: flex;
  flex-direction: column;
  align-items: center;
  justify-content: center;
  padding: 48px;
  color: var(--on-surface-variant);

  mat-icon {
    font-size: 48px;
    width: 48px;
    height: 48px;
    margin-bottom: 16px;
  }
}

mat-list-item.selected {
  background: var(--primary-container);
}
```

**Acceptance Criteria**:
- [ ] Three-pane layout implemented
- [ ] Responsive to screen size
- [ ] Categories, items, evidence panes work
- [ ] Selection highlighting works

---

### T2: Baseline Selector

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Implement baseline selection with presets.

**Implementation**: Included in T1 with `baselinePresets` and mat-select.

**Acceptance Criteria**:
- [ ] "Last Green" preset
- [ ] "Previous Release" preset
- [ ] "Main Branch" preset
- [ ] Custom selection option
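Resolving a preset to a concrete baseline still needs a lookup against build history. A minimal sketch of that mapping follows; the `BuildRecord` shape and its field names are illustrative assumptions, not an existing model:

```typescript
// Hypothetical build-history record; field names are illustrative assumptions.
interface BuildRecord { id: string; green: boolean; release: boolean; }

// Resolve a preset to a baseline id; `history` is ordered newest-first.
function resolveBaseline(
  preset: 'last-green' | 'previous-release' | 'main-branch' | 'custom',
  history: BuildRecord[]
): string | null {
  switch (preset) {
    case 'last-green': return history.find(h => h.green)?.id ?? null;
    case 'previous-release': return history.find(h => h.release)?.id ?? null;
    case 'main-branch': return history[0]?.id ?? null; // newest main build
    case 'custom': return null; // caller opens a picker instead
  }
}
```

Keeping resolution in one pure function means the mat-select handler only ever deals in concrete target ids.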

---

### T3: Delta Summary Strip

**Assignee**: UI Team
**Story Points**: 1
**Status**: TODO
**Dependencies**: T1

**Description**:
Show added/removed/changed counts.

**Implementation**: Included in T1 template with `.delta-summary`.

**Acceptance Criteria**:
- [ ] Shows total added count
- [ ] Shows total removed count
- [ ] Shows total changed count
- [ ] Color coded chips

---

### T4: Categories Pane

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Left pane showing change categories.

**Implementation**:
```typescript
// Category definitions
const DELTA_CATEGORIES: DeltaCategory[] = [
  { id: 'sbom', name: 'SBOM Changes', icon: 'inventory_2', added: 0, removed: 0, changed: 0 },
  { id: 'reachability', name: 'Reachability', icon: 'route', added: 0, removed: 0, changed: 0 },
  { id: 'vex', name: 'VEX Status', icon: 'description', added: 0, removed: 0, changed: 0 },
  { id: 'policy', name: 'Policy', icon: 'policy', added: 0, removed: 0, changed: 0 },
  { id: 'findings', name: 'Findings', icon: 'bug_report', added: 0, removed: 0, changed: 0 },
  { id: 'unknowns', name: 'Unknowns', icon: 'help', added: 0, removed: 0, changed: 0 }
];
```

**Acceptance Criteria**:
- [ ] SBOM, Reachability, VEX, Policy categories
- [ ] Counts per category
- [ ] Click to filter items
- [ ] Selection highlighting

---

### T5: Items Pane

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1, T4

**Description**:
Middle pane showing list of changes.

**Implementation**: Included in T1 template.

**Acceptance Criteria**:
- [ ] List of changes filtered by category
- [ ] Add/remove/change icons
- [ ] Severity chips
- [ ] Click to select

---

### T6: Proof Pane

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1, T5

**Description**:
Right pane showing evidence for selected item.

**Implementation**: Included in T1 template with side-by-side and unified views.

**Acceptance Criteria**:
- [ ] Shows before/after evidence
- [ ] Side-by-side view
- [ ] Unified diff view
- [ ] Empty state when no selection

---

### T7: Before/After Toggle

**Assignee**: UI Team
**Story Points**: 1
**Status**: TODO
**Dependencies**: T6

**Description**:
Toggle between side-by-side and unified view.

**Implementation**: Included in T1 with `viewMode` signal.

**Acceptance Criteria**:
- [ ] Toggle button in toolbar
- [ ] Side-by-side shows two columns
- [ ] Unified shows inline diff
- [ ] State preserved during navigation

---

### T8: Export Delta Report

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Export comparison as JSON or PDF.

**Implementation Path**: Add export service

```typescript
// compare-export.service.ts
import { Injectable } from '@angular/core';

@Injectable({ providedIn: 'root' })
export class CompareExportService {
  async exportJson(
    current: CompareTarget,
    baseline: CompareTarget,
    categories: DeltaCategory[],
    items: DeltaItem[]
  ): Promise<void> {
    const report = {
      exportedAt: new Date().toISOString(),
      comparison: {
        current: { id: current.id, label: current.label, digest: current.digest },
        baseline: { id: baseline.id, label: baseline.label, digest: baseline.digest }
      },
      summary: {
        added: categories.reduce((sum, c) => sum + c.added, 0),
        removed: categories.reduce((sum, c) => sum + c.removed, 0),
        changed: categories.reduce((sum, c) => sum + c.changed, 0)
      },
      categories,
      items
    };

    const blob = new Blob([JSON.stringify(report, null, 2)], { type: 'application/json' });
    const url = URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = url;
    a.download = `delta-report-${current.id}-vs-${baseline.id}.json`;
    a.click();
    URL.revokeObjectURL(url);
  }

  async exportPdf(
    current: CompareTarget,
    baseline: CompareTarget,
    categories: DeltaCategory[],
    items: DeltaItem[]
  ): Promise<void> {
    // PDF generation using jsPDF or server-side
    // Implementation depends on PDF library choice
  }
}
```
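`exportPdf` is deliberately left open until a PDF library is chosen. One way to keep that choice cheap is to assemble the report as plain text lines first, so whichever renderer wins (jsPDF, server-side) only has to draw strings. A sketch, with `buildReportLines` as an assumed helper name:

```typescript
// Flatten the delta into printable lines; renderer-agnostic on purpose.
function buildReportLines(
  summary: { added: number; removed: number; changed: number },
  items: { title: string; changeType: string }[]
): string[] {
  return [
    'Delta report',
    `+${summary.added} added / -${summary.removed} removed / ~${summary.changed} changed`,
    ...items.map(i => `[${i.changeType}] ${i.title}`)
  ];
}
```

`exportPdf` would then iterate the lines and emit one text draw call per line, and `exportJson` and the PDF path stay guaranteed to agree on content.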

**Acceptance Criteria**:
- [ ] Export button in toolbar
- [ ] JSON export works
- [ ] PDF export works
- [ ] Filename includes comparison IDs

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | UI Team | Create compare-view.component.ts |
| 2 | T2 | TODO | T1 | UI Team | Baseline selector |
| 3 | T3 | TODO | T1 | UI Team | Delta summary strip |
| 4 | T4 | TODO | T1 | UI Team | Categories pane |
| 5 | T5 | TODO | T1, T4 | UI Team | Items pane |
| 6 | T6 | TODO | T1, T5 | UI Team | Proof pane |
| 7 | T7 | TODO | T6 | UI Team | Before/After toggle |
| 8 | T8 | TODO | T1 | UI Team | Export delta report |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from UX Gap Analysis. Smart-Diff UI identified as key comparison feature. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Three-pane layout | Decision | UI Team | Categories → Items → Evidence |
| Baseline presets | Decision | UI Team | Last green, previous release, main, custom |
| View modes | Decision | UI Team | Side-by-side and unified diff |
| Categories | Decision | UI Team | SBOM, Reachability, VEX, Policy, Findings, Unknowns |

---

## Success Criteria

- [ ] All 8 tasks marked DONE
- [ ] Baseline can be selected
- [ ] Delta summary shows counts
- [ ] Three-pane layout works
- [ ] Evidence accessible for each change
- [ ] Export works (JSON/PDF)
- [ ] `ng build` succeeds
- [ ] `ng test` succeeds
930
docs/implplan/SPRINT_4200_0002_0004_cli_compare.md
Normal file
@@ -0,0 +1,930 @@

# Sprint 4200.0002.0004 · CLI `stella compare` Command

## Topic & Scope

- Implement CLI commands for comparing artifacts, snapshots, and verdicts
- Support multiple output formats (table, JSON, SARIF)
- Enable baseline options for CI/CD integration

**Working directory:** `src/Cli/StellaOps.Cli/Commands/`

## Dependencies & Concurrency

- **Upstream**: Sprint 4100.0002.0001 (Knowledge Snapshot Manifest)
- **Downstream**: None
- **Safe to parallelize with**: Sprint 4200.0002.0003 (Delta/Compare View UI)

## Documentation Prerequisites

- `src/Cli/StellaOps.Cli/AGENTS.md`
- `docs/product-advisories/21-Dec-2025 - Smart Diff - Reproducibility as a Feature.md`
- Existing CLI patterns in `src/Cli/StellaOps.Cli/Commands/`

---

## Tasks

### T1: Create CompareCommandGroup.cs

**Assignee**: CLI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create the parent command group for `stella compare`.

**Implementation Path**: `Commands/Compare/CompareCommandGroup.cs` (new file)

**Implementation**:
```csharp
using System.CommandLine;

namespace StellaOps.Cli.Commands.Compare;

/// <summary>
/// Parent command group for comparison operations.
/// </summary>
public sealed class CompareCommandGroup : Command
{
    public CompareCommandGroup() : base("compare", "Compare artifacts, snapshots, or verdicts")
    {
        AddCommand(new CompareArtifactsCommand());
        AddCommand(new CompareSnapshotsCommand());
        AddCommand(new CompareVerdictsCommand());
    }
}
```
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] `CompareCommandGroup.cs` file created
|
||||||
|
- [ ] Parent command `stella compare` works
|
||||||
|
- [ ] Help text displayed for subcommands
|
||||||
|
- [ ] Registered in root command
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### T2: Add `compare artifacts` Command
|
||||||
|
|
||||||
|
**Assignee**: CLI Team
|
||||||
|
**Story Points**: 3
|
||||||
|
**Status**: TODO
|
||||||
|
**Dependencies**: T1
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Compare two container image digests.
|
||||||
|
|
||||||
|
**Implementation Path**: `Commands/Compare/CompareArtifactsCommand.cs` (new file)
|
||||||
|
|
||||||
|
**Implementation**:
|
||||||
|
```csharp
|
||||||
|
using System.CommandLine;
|
||||||
|
using System.CommandLine.Invocation;
|
||||||
|
|
||||||
|
namespace StellaOps.Cli.Commands.Compare;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Compares two container artifacts by digest.
|
||||||
|
/// </summary>
public sealed class CompareArtifactsCommand : Command
{
    public CompareArtifactsCommand() : base("artifacts", "Compare two container image artifacts")
    {
        var currentArg = new Argument<string>("current", "Current artifact reference (image@sha256:...)");
        var baselineArg = new Argument<string>("baseline", "Baseline artifact reference");

        var formatOption = new Option<OutputFormat>(
            ["--format", "-f"],
            () => OutputFormat.Table,
            "Output format (table, json, sarif)");

        var outputOption = new Option<FileInfo?>(
            ["--output", "-o"],
            "Output file path (stdout if not specified)");

        var categoriesOption = new Option<string[]>(
            ["--categories", "-c"],
            () => Array.Empty<string>(),
            "Filter to specific categories (sbom, vex, reachability, policy)");

        var severityOption = new Option<string?>(
            "--min-severity",
            "Minimum severity to include (critical, high, medium, low)");

        AddArgument(currentArg);
        AddArgument(baselineArg);
        AddOption(formatOption);
        AddOption(outputOption);
        AddOption(categoriesOption);
        AddOption(severityOption);

        this.SetHandler(ExecuteAsync,
            currentArg, baselineArg, formatOption, outputOption, categoriesOption, severityOption);
    }

    private async Task ExecuteAsync(
        string current,
        string baseline,
        OutputFormat format,
        FileInfo? output,
        string[] categories,
        string? minSeverity)
    {
        var console = AnsiConsole.Create(new AnsiConsoleSettings());

        console.MarkupLine("[blue]Comparing artifacts...[/]");
        console.MarkupLine($"  Current:  [green]{current}[/]");
        console.MarkupLine($"  Baseline: [yellow]{baseline}[/]");

        // Parse artifact references
        var currentRef = ArtifactReference.Parse(current);
        var baselineRef = ArtifactReference.Parse(baseline);

        // Compute delta
        var comparer = new ArtifactComparer(_scannerClient, _snapshotService);
        var delta = await comparer.CompareAsync(currentRef, baselineRef);

        // Apply filters
        if (categories.Length > 0)
        {
            delta = delta.FilterByCategories(categories);
        }

        if (!string.IsNullOrEmpty(minSeverity))
        {
            delta = delta.FilterBySeverity(Enum.Parse<Severity>(minSeverity, ignoreCase: true));
        }

        // Format output
        var formatter = GetFormatter(format);
        var result = formatter.Format(delta);

        // Write output
        if (output is not null)
        {
            await File.WriteAllTextAsync(output.FullName, result);
            console.MarkupLine($"[green]Output written to {output.FullName}[/]");
        }
        else
        {
            console.WriteLine(result);
        }

        // Exit code based on delta
        if (delta.HasBlockingChanges)
        {
            Environment.ExitCode = 1;
        }
    }
}

public enum OutputFormat
{
    Table,
    Json,
    Sarif
}
```

**Acceptance Criteria**:
- [ ] `stella compare artifacts img1@sha256:a img2@sha256:b` works
- [ ] Table output by default
- [ ] JSON output with `--format json`
- [ ] SARIF output with `--format sarif`
- [ ] Category filtering works
- [ ] Severity filtering works
- [ ] Exit code 1 if blocking changes
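Since the JSON output is intended for machine consumption in CI, a short sketch of reading the summary counts may help; the field names follow the `JsonFormatter` shape defined in T5, but the input here is a hand-written stand-in, not real tool output:

```csharp
using System;
using System.Text.Json;

// Stand-in for `stella compare artifacts ... --format json` output (illustrative only).
const string json = """
{
  "summary": { "added": 2, "removed": 1, "changed": 0 }
}
""";

using var doc = JsonDocument.Parse(json);
var summary = doc.RootElement.GetProperty("summary");
int added = summary.GetProperty("added").GetInt32();
Console.WriteLine($"New findings introduced: {added}");
```

A CI step could fail the build whenever `added` is non-zero, independently of the tool's own exit code.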

---

### T3: Add `compare snapshots` Command

**Assignee**: CLI Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Compare two knowledge snapshots.

**Implementation Path**: `Commands/Compare/CompareSnapshotsCommand.cs` (new file)

**Implementation**:
```csharp
namespace StellaOps.Cli.Commands.Compare;

/// <summary>
/// Compares two knowledge snapshots.
/// </summary>
public sealed class CompareSnapshotsCommand : Command
{
    public CompareSnapshotsCommand() : base("snapshots", "Compare two knowledge snapshots")
    {
        var currentArg = new Argument<string>("current", "Current snapshot ID (ksm:sha256:...)");
        var baselineArg = new Argument<string>("baseline", "Baseline snapshot ID");

        var formatOption = new Option<OutputFormat>(
            ["--format", "-f"],
            () => OutputFormat.Table,
            "Output format");

        var outputOption = new Option<FileInfo?>(
            ["--output", "-o"],
            "Output file path");

        var showSourcesOption = new Option<bool>(
            "--show-sources",
            () => false,
            "Show detailed source changes");

        AddArgument(currentArg);
        AddArgument(baselineArg);
        AddOption(formatOption);
        AddOption(outputOption);
        AddOption(showSourcesOption);

        this.SetHandler(ExecuteAsync,
            currentArg, baselineArg, formatOption, outputOption, showSourcesOption);
    }

    private async Task ExecuteAsync(
        string current,
        string baseline,
        OutputFormat format,
        FileInfo? output,
        bool showSources)
    {
        var console = AnsiConsole.Create(new AnsiConsoleSettings());

        // Validate snapshot IDs
        if (!current.StartsWith("ksm:"))
        {
            console.MarkupLine("[red]Error: Current must be a snapshot ID (ksm:sha256:...)[/]");
            Environment.ExitCode = 1;
            return;
        }

        console.MarkupLine("[blue]Comparing snapshots...[/]");
        console.MarkupLine($"  Current:  [green]{current}[/]");
        console.MarkupLine($"  Baseline: [yellow]{baseline}[/]");

        // Load snapshots
        var currentSnapshot = await _snapshotService.GetSnapshotAsync(current);
        var baselineSnapshot = await _snapshotService.GetSnapshotAsync(baseline);

        if (currentSnapshot is null || baselineSnapshot is null)
        {
            console.MarkupLine("[red]Error: One or both snapshots not found[/]");
            Environment.ExitCode = 1;
            return;
        }

        // Compute delta
        var delta = ComputeSnapshotDelta(currentSnapshot, baselineSnapshot);

        // Format output
        if (format == OutputFormat.Table)
        {
            RenderSnapshotDeltaTable(console, delta, showSources);
        }
        else
        {
            var formatter = GetFormatter(format);
            var result = formatter.Format(delta);

            if (output is not null)
            {
                await File.WriteAllTextAsync(output.FullName, result);
            }
            else
            {
                console.WriteLine(result);
            }
        }
    }

    private static void RenderSnapshotDeltaTable(
        IAnsiConsole console,
        SnapshotDelta delta,
        bool showSources)
    {
        var table = new Table();
        table.AddColumn("Category");
        table.AddColumn("Added");
        table.AddColumn("Removed");
        table.AddColumn("Changed");

        table.AddRow("Advisory Feeds",
            delta.AddedFeeds.Count.ToString(),
            delta.RemovedFeeds.Count.ToString(),
            delta.ChangedFeeds.Count.ToString());

        table.AddRow("VEX Documents",
            delta.AddedVex.Count.ToString(),
            delta.RemovedVex.Count.ToString(),
            delta.ChangedVex.Count.ToString());

        table.AddRow("Policy Rules",
            delta.AddedPolicies.Count.ToString(),
            delta.RemovedPolicies.Count.ToString(),
            delta.ChangedPolicies.Count.ToString());

        table.AddRow("Trust Roots",
            delta.AddedTrust.Count.ToString(),
            delta.RemovedTrust.Count.ToString(),
            delta.ChangedTrust.Count.ToString());

        console.Write(table);

        if (showSources)
        {
            console.WriteLine();
            console.MarkupLine("[bold]Source Details:[/]");

            foreach (var source in delta.AllChangedSources)
            {
                console.MarkupLine($"  {source.ChangeType}: {source.Name} ({source.Type})");
                console.MarkupLine($"    Before: {source.BeforeDigest ?? "N/A"}");
                console.MarkupLine($"    After:  {source.AfterDigest ?? "N/A"}");
            }
        }
    }
}
```

**Acceptance Criteria**:
- [ ] `stella compare snapshots ksm:abc ksm:def` works
- [ ] Shows delta by source type
- [ ] `--show-sources` shows detailed changes
- [ ] JSON/SARIF output works
- [ ] Validates snapshot ID format

---

### T4: Add `compare verdicts` Command

**Assignee**: CLI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Compare two verdict IDs.

**Implementation Path**: `Commands/Compare/CompareVerdictsCommand.cs` (new file)

**Implementation**:
```csharp
namespace StellaOps.Cli.Commands.Compare;

/// <summary>
/// Compares two verdicts.
/// </summary>
public sealed class CompareVerdictsCommand : Command
{
    public CompareVerdictsCommand() : base("verdicts", "Compare two verdicts")
    {
        var currentArg = new Argument<string>("current", "Current verdict ID");
        var baselineArg = new Argument<string>("baseline", "Baseline verdict ID");

        var formatOption = new Option<OutputFormat>(
            ["--format", "-f"],
            () => OutputFormat.Table,
            "Output format");

        var showFindingsOption = new Option<bool>(
            "--show-findings",
            () => false,
            "Show individual finding changes");

        AddArgument(currentArg);
        AddArgument(baselineArg);
        AddOption(formatOption);
        AddOption(showFindingsOption);

        this.SetHandler(ExecuteAsync,
            currentArg, baselineArg, formatOption, showFindingsOption);
    }

    private async Task ExecuteAsync(
        string current,
        string baseline,
        OutputFormat format,
        bool showFindings)
    {
        var console = AnsiConsole.Create(new AnsiConsoleSettings());

        console.MarkupLine("[blue]Comparing verdicts...[/]");

        var currentVerdict = await _verdictService.GetVerdictAsync(current);
        var baselineVerdict = await _verdictService.GetVerdictAsync(baseline);

        if (currentVerdict is null || baselineVerdict is null)
        {
            console.MarkupLine("[red]Error: One or both verdicts not found[/]");
            Environment.ExitCode = 1;
            return;
        }

        // Show verdict comparison
        var table = new Table();
        table.AddColumn("");
        table.AddColumn("Baseline");
        table.AddColumn("Current");

        table.AddRow("Decision",
            baselineVerdict.Decision.ToString(),
            currentVerdict.Decision.ToString());

        table.AddRow("Total Findings",
            baselineVerdict.FindingCount.ToString(),
            currentVerdict.FindingCount.ToString());

        table.AddRow("Critical",
            baselineVerdict.CriticalCount.ToString(),
            currentVerdict.CriticalCount.ToString());

        table.AddRow("High",
            baselineVerdict.HighCount.ToString(),
            currentVerdict.HighCount.ToString());

        table.AddRow("Blocked By",
            baselineVerdict.BlockedBy?.ToString() ?? "N/A",
            currentVerdict.BlockedBy?.ToString() ?? "N/A");

        table.AddRow("Snapshot ID",
            baselineVerdict.SnapshotId ?? "N/A",
            currentVerdict.SnapshotId ?? "N/A");

        console.Write(table);

        // Show decision change
        if (baselineVerdict.Decision != currentVerdict.Decision)
        {
            console.WriteLine();
            console.MarkupLine($"[bold yellow]Decision changed: {baselineVerdict.Decision} → {currentVerdict.Decision}[/]");
        }

        // Show findings delta if requested
        if (showFindings)
        {
            var findingsDelta = ComputeFindingsDelta(
                baselineVerdict.Findings,
                currentVerdict.Findings);

            console.WriteLine();
            console.MarkupLine("[bold]Finding Changes:[/]");

            foreach (var added in findingsDelta.Added)
            {
                console.MarkupLine($"  [green]+[/] {added.VulnId} in {added.Purl}");
            }

            foreach (var removed in findingsDelta.Removed)
            {
                console.MarkupLine($"  [red]-[/] {removed.VulnId} in {removed.Purl}");
            }

            foreach (var changed in findingsDelta.Changed)
            {
                console.MarkupLine($"  [yellow]~[/] {changed.VulnId}: {changed.BeforeStatus} → {changed.AfterStatus}");
            }
        }
    }
}
```

**Acceptance Criteria**:
- [ ] `stella compare verdicts v1 v2` works
- [ ] Shows decision comparison
- [ ] Shows count changes
- [ ] `--show-findings` shows individual changes
- [ ] Highlights decision changes

---

### T5: Output Formatters

**Assignee**: CLI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T2, T3, T4

**Description**:
Implement table, JSON, and SARIF formatters.

**Implementation Path**: `Commands/Compare/Formatters/` (new directory)

**Implementation**:
```csharp
// ICompareFormatter.cs
public interface ICompareFormatter
{
    string Format(ComparisonDelta delta);
}

// TableFormatter.cs
public sealed class TableFormatter : ICompareFormatter
{
    public string Format(ComparisonDelta delta)
    {
        var sb = new StringBuilder();

        // Summary
        sb.AppendLine("Comparison Summary:");
        sb.AppendLine($"  Added:   {delta.AddedCount}");
        sb.AppendLine($"  Removed: {delta.RemovedCount}");
        sb.AppendLine($"  Changed: {delta.ChangedCount}");
        sb.AppendLine();

        // Categories
        foreach (var category in delta.Categories)
        {
            sb.AppendLine($"{category.Name}:");
            foreach (var item in category.Items)
            {
                var prefix = item.ChangeType switch
                {
                    ChangeType.Added => "+",
                    ChangeType.Removed => "-",
                    ChangeType.Changed => "~",
                    _ => " "
                };
                sb.AppendLine($"  {prefix} {item.Title}");
            }
        }

        return sb.ToString();
    }
}

// JsonFormatter.cs
public sealed class JsonFormatter : ICompareFormatter
{
    public string Format(ComparisonDelta delta)
    {
        var output = new
        {
            comparison = new
            {
                current = delta.Current,
                baseline = delta.Baseline,
                computedAt = DateTimeOffset.UtcNow
            },
            summary = new
            {
                added = delta.AddedCount,
                removed = delta.RemovedCount,
                changed = delta.ChangedCount
            },
            categories = delta.Categories.Select(c => new
            {
                name = c.Name,
                items = c.Items.Select(i => new
                {
                    changeType = i.ChangeType.ToString().ToLower(),
                    title = i.Title,
                    severity = i.Severity?.ToString().ToLower(),
                    before = i.BeforeValue,
                    after = i.AfterValue
                })
            })
        };

        return JsonSerializer.Serialize(output, new JsonSerializerOptions
        {
            WriteIndented = true,
            PropertyNamingPolicy = JsonNamingPolicy.CamelCase
        });
    }
}

// SarifFormatter.cs
public sealed class SarifFormatter : ICompareFormatter
{
    public string Format(ComparisonDelta delta)
    {
        var sarif = new
        {
            version = "2.1.0",
            // NOTE: SARIF expects the property name "$schema"; anonymous types cannot
            // express a leading '$', so the real implementation should post-process
            // the JSON or build it with JsonObject instead.
            schema = "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json",
            runs = new[]
            {
                new
                {
                    tool = new
                    {
                        driver = new
                        {
                            name = "stella-compare",
                            version = "1.0.0",
                            informationUri = "https://stellaops.io"
                        }
                    },
                    results = delta.AllItems.Select(item => new
                    {
                        ruleId = $"DELTA-{item.ChangeType.ToString().ToUpper()}",
                        level = item.Severity switch
                        {
                            Severity.Critical => "error",
                            Severity.High => "error",
                            Severity.Medium => "warning",
                            _ => "note"
                        },
                        message = new { text = item.Title },
                        properties = new
                        {
                            changeType = item.ChangeType.ToString(),
                            category = item.Category,
                            before = item.BeforeValue,
                            after = item.AfterValue
                        }
                    })
                }
            }
        };

        return JsonSerializer.Serialize(sarif, new JsonSerializerOptions
        {
            WriteIndented = true,
            PropertyNamingPolicy = JsonNamingPolicy.CamelCase
        });
    }
}
```

**Acceptance Criteria**:
- [ ] Table formatter produces readable output
- [ ] JSON formatter produces valid JSON
- [ ] SARIF formatter produces valid SARIF 2.1.0
- [ ] All formatters handle empty deltas
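The commands in T2-T4 call `GetFormatter(format)` without defining it anywhere in this sprint. Assuming it simply dispatches on `OutputFormat`, a self-contained sketch (with stubbed formatter types so it compiles on its own; the factory name is ours, not the sprint's) might look like:

```csharp
using System;

public enum OutputFormat { Table, Json, Sarif }

// Stubs standing in for the T5 formatters, for illustration only.
public interface ICompareFormatter { }
public sealed class TableFormatter : ICompareFormatter { }
public sealed class JsonFormatter : ICompareFormatter { }
public sealed class SarifFormatter : ICompareFormatter { }

public static class FormatterFactory
{
    // Hypothetical implementation of the GetFormatter helper used by the commands.
    public static ICompareFormatter GetFormatter(OutputFormat format) => format switch
    {
        OutputFormat.Table => new TableFormatter(),
        OutputFormat.Json => new JsonFormatter(),
        OutputFormat.Sarif => new SarifFormatter(),
        _ => throw new ArgumentOutOfRangeException(nameof(format))
    };
}
```

A switch expression keeps the mapping exhaustive and makes an unknown format fail loudly rather than fall back silently.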

---

### T6: Baseline Option

**Assignee**: CLI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T2

**Description**:
Implement `--baseline=last-green` and similar presets.

**Implementation Path**: Add to `CompareArtifactsCommand.cs`

**Implementation**:
```csharp
// Add to CompareArtifactsCommand
var baselinePresetOption = new Option<string?>(
    "--baseline",
    "Baseline preset: last-green, previous-release, main-branch, or artifact reference");

// In ExecuteAsync
string resolvedBaseline;
if (!string.IsNullOrEmpty(baselinePreset))
{
    resolvedBaseline = baselinePreset switch
    {
        "last-green" => await _baselineResolver.GetLastGreenAsync(currentRef),
        "previous-release" => await _baselineResolver.GetPreviousReleaseAsync(currentRef),
        "main-branch" => await _baselineResolver.GetMainBranchAsync(currentRef),
        _ => baselinePreset // Assume it's an artifact reference
    };
}
else
{
    resolvedBaseline = baseline;
}

// BaselineResolver.cs
public sealed class BaselineResolver
{
    private readonly IScannerClient _scanner;
    private readonly IGitService _git;

    public async Task<string> GetLastGreenAsync(ArtifactReference current)
    {
        // Find most recent artifact with passing verdict
        var history = await _scanner.GetArtifactHistoryAsync(current.Repository);
        var lastGreen = history
            .Where(a => a.Verdict == VerdictDecision.Ship)
            .OrderByDescending(a => a.ScannedAt)
            .FirstOrDefault();

        return lastGreen?.Reference ?? throw new InvalidOperationException("No green builds found");
    }

    public async Task<string> GetPreviousReleaseAsync(ArtifactReference current)
    {
        // Find artifact tagged with previous semver release
        var tags = await _git.GetTagsAsync(current.Repository);
        var previousTag = tags
            .Where(t => SemVersion.TryParse(t.Name, out _))
            .OrderByDescending(t => SemVersion.Parse(t.Name))
            .Skip(1) // Skip current release
            .FirstOrDefault();

        return previousTag?.ArtifactRef ?? throw new InvalidOperationException("No previous release found");
    }

    public async Task<string> GetMainBranchAsync(ArtifactReference current)
    {
        // Find latest artifact from main branch
        var mainArtifact = await _scanner.GetLatestArtifactAsync(
            current.Repository,
            branch: "main");

        return mainArtifact?.Reference ?? throw new InvalidOperationException("No main branch artifact found");
    }
}
```

**Acceptance Criteria**:
- [ ] `--baseline=last-green` resolves to last passing build
- [ ] `--baseline=previous-release` resolves to previous semver tag
- [ ] `--baseline=main-branch` resolves to latest main
- [ ] Falls back to treating value as artifact reference

---

### T7: Tests

**Assignee**: CLI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1-T6

**Description**:
Integration tests for compare commands.

**Implementation Path**: `src/Cli/__Tests/StellaOps.Cli.Tests/Commands/Compare/`

**Test Cases**:
```csharp
public class CompareArtifactsCommandTests
{
    [Fact]
    public async Task Execute_TwoArtifacts_ShowsDelta()
    {
        // Arrange
        var cmd = new CompareArtifactsCommand();
        var console = new TestConsole();

        // Act
        var result = await cmd.InvokeAsync(
            new[] { "image@sha256:aaa", "image@sha256:bbb" },
            console);

        // Assert
        result.Should().Be(0);
        console.Output.Should().Contain("Added");
        console.Output.Should().Contain("Removed");
    }

    [Fact]
    public async Task Execute_JsonFormat_ValidJson()
    {
        var cmd = new CompareArtifactsCommand();
        var console = new TestConsole();

        var result = await cmd.InvokeAsync(
            new[] { "img@sha256:a", "img@sha256:b", "--format", "json" },
            console);

        result.Should().Be(0);
        var json = console.Output;
        var parsed = JsonDocument.Parse(json);
        parsed.RootElement.TryGetProperty("summary", out _).Should().BeTrue();
    }

    [Fact]
    public async Task Execute_SarifFormat_ValidSarif()
    {
        var cmd = new CompareArtifactsCommand();
        var console = new TestConsole();

        var result = await cmd.InvokeAsync(
            new[] { "img@sha256:a", "img@sha256:b", "--format", "sarif" },
            console);

        result.Should().Be(0);
        var sarif = JsonDocument.Parse(console.Output);
        sarif.RootElement.GetProperty("version").GetString().Should().Be("2.1.0");
    }

    [Fact]
    public async Task Execute_BlockingChanges_ExitCode1()
    {
        var cmd = new CompareArtifactsCommand();
        var console = new TestConsole();
        // Mock: Delta with blocking changes

        var result = await cmd.InvokeAsync(
            new[] { "img@sha256:a", "img@sha256:b" },
            console);

        result.Should().Be(1);
    }
}

public class CompareSnapshotsCommandTests
{
    [Fact]
    public async Task Execute_ValidSnapshots_ShowsDelta()
    {
        var cmd = new CompareSnapshotsCommand();
        var console = new TestConsole();

        var result = await cmd.InvokeAsync(
            new[] { "ksm:sha256:aaa", "ksm:sha256:bbb" },
            console);

        result.Should().Be(0);
        console.Output.Should().Contain("Advisory Feeds");
        console.Output.Should().Contain("VEX Documents");
    }

    [Fact]
    public async Task Execute_InvalidSnapshotId_Error()
    {
        var cmd = new CompareSnapshotsCommand();
        var console = new TestConsole();

        var result = await cmd.InvokeAsync(
            new[] { "invalid", "ksm:sha256:bbb" },
            console);

        result.Should().Be(1);
        console.Output.Should().Contain("Error");
    }
}

public class BaselineResolverTests
{
    [Fact]
    public async Task GetLastGreen_ReturnsPassingBuild()
    {
        var resolver = new BaselineResolver(_mockScanner, _mockGit);

        var result = await resolver.GetLastGreenAsync(
            ArtifactReference.Parse("myapp@sha256:current"));

        result.Should().Contain("sha256");
    }
}
```

**Acceptance Criteria**:
- [ ] Test for table output
- [ ] Test for JSON output validity
- [ ] Test for SARIF output validity
- [ ] Test for exit codes
- [ ] Test for baseline resolution
- [ ] All tests pass

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | CLI Team | Create CompareCommandGroup.cs |
| 2 | T2 | TODO | T1 | CLI Team | Add `compare artifacts` |
| 3 | T3 | TODO | T1 | CLI Team | Add `compare snapshots` |
| 4 | T4 | TODO | T1 | CLI Team | Add `compare verdicts` |
| 5 | T5 | TODO | T2-T4 | CLI Team | Output formatters |
| 6 | T6 | TODO | T2 | CLI Team | Baseline option |
| 7 | T7 | TODO | T1-T6 | CLI Team | Tests |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from UX Gap Analysis. CLI compare commands for CI/CD integration. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| System.CommandLine | Decision | CLI Team | Use for argument parsing |
| SARIF 2.1.0 | Decision | CLI Team | Standard for security findings |
| Exit codes | Decision | CLI Team | 0 = success, 1 = blocking changes |
| Baseline presets | Decision | CLI Team | last-green, previous-release, main-branch |
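The exit-code decision is small but worth pinning down explicitly, since CI systems will branch on it; a minimal sketch of the contract (the helper and constant names are ours, not the sprint's):

```csharp
// 0 = success / no blocking changes, 1 = blocking changes, per the Decisions table.
public static class CompareExitCodes
{
    public const int Success = 0;
    public const int BlockingChanges = 1;

    // Centralizing the mapping keeps every compare subcommand consistent.
    public static int ForDelta(bool hasBlockingChanges) =>
        hasBlockingChanges ? BlockingChanges : Success;
}
```

Keeping the mapping in one place avoids the subcommands drifting apart as more exit codes (e.g. usage errors) are added later.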

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] `stella compare artifacts img1@sha256:a img2@sha256:b` works
- [ ] `stella compare snapshots ksm:abc ksm:def` shows delta
- [ ] `stella compare verdicts v1 v2` works
- [ ] Output shows introduced/fixed/changed
- [ ] JSON output is machine-readable
- [ ] Exit code 1 for blocking changes
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds

docs/implplan/SPRINT_4200_0002_0005_counterfactuals.md (new file, 1046 lines; diff suppressed because it is too large)

docs/implplan/SPRINT_4500_0001_0001_binary_evidence_db.md (new file, 995 lines)

@@ -0,0 +1,995 @@
# Sprint 4500.0001.0001 · Binary Evidence Database

## Topic & Scope

- Persist binary identity evidence (Build-ID, text hash) to PostgreSQL
- Create binary-to-package mapping store
- Support binary-level vulnerability assertions

**Working directory:** `src/Scanner/__Libraries/StellaOps.Scanner.Storage/Postgres/`

## Dependencies & Concurrency

- **Upstream**: None
- **Downstream**: None
- **Safe to parallelize with**: All other sprints

## Documentation Prerequisites

- `src/Scanner/__Libraries/StellaOps.Scanner.Storage/AGENTS.md`
- `docs/db/SPECIFICATION.md`
- `docs/product-advisories/21-Dec-2025 - Mapping Evidence Within Compiled Binaries.md`
- Existing: `BuildIdLookupResult`

---

## Problem Statement

Build-ID indexing exists in memory (`BuildIdLookupResult`) but there's no persistent storage. This means:
- Build-ID matches are lost between scans
- Cannot query historical binary evidence
- No binary-level vulnerability status tracking
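To make the gap concrete, here is a minimal sketch of the cross-scan lookup that persistence would enable; the types and names are illustrative (loosely mirroring the `binary_identity` columns defined in T1), not the sprint's actual API:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative row shape mirroring a few binary_identity columns from T1.
public sealed record BinaryIdentityRow(
    Guid ScanId,
    string FilePath,
    string FileSha256,
    string? BuildId,
    string Architecture);

public static class BuildIdIndex
{
    // Once rows persist across scans, a Build-ID query spans scan history
    // instead of a single in-memory BuildIdLookupResult.
    public static IReadOnlyList<BinaryIdentityRow> FindByBuildId(
        IEnumerable<BinaryIdentityRow> rows, string buildId) =>
        rows.Where(r => string.Equals(r.BuildId, buildId, StringComparison.OrdinalIgnoreCase))
            .ToList();
}
```

In production this would be a SQL query against the `ix_binary_identity_build_id` index rather than an in-memory filter.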
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Tasks
|
||||||
|
|
||||||
|
### T1: Migration - binary_identity Table
|
||||||
|
|
||||||
|
**Assignee**: Scanner Team
|
||||||
|
**Story Points**: 2
|
||||||
|
**Status**: TODO
|
||||||
|
**Dependencies**: —
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Create migration for binary identity storage.
|
||||||
|
|
||||||
|
**Implementation Path**: `Migrations/YYYYMMDDHHMMSS_AddBinaryIdentityTable.cs`
|
||||||
|
|
||||||
|
**Migration**:
|
||||||
|
```csharp
|
||||||
|
public partial class AddBinaryIdentityTable : Migration
|
||||||
|
{
|
||||||
|
protected override void Up(MigrationBuilder migrationBuilder)
|
||||||
|
{
|
||||||
|
migrationBuilder.CreateTable(
|
||||||
|
name: "binary_identity",
|
||||||
|
columns: table => new
|
||||||
|
{
|
||||||
|
                    id = table.Column<Guid>(nullable: false, defaultValueSql: "gen_random_uuid()"),
                    scan_id = table.Column<Guid>(nullable: false),
                    file_path = table.Column<string>(maxLength: 1024, nullable: false),
                    file_sha256 = table.Column<string>(maxLength: 64, nullable: false),
                    text_sha256 = table.Column<string>(maxLength: 64, nullable: true),
                    build_id = table.Column<string>(maxLength: 128, nullable: true),
                    build_id_type = table.Column<string>(maxLength: 32, nullable: true),
                    architecture = table.Column<string>(maxLength: 32, nullable: false),
                    binary_format = table.Column<string>(maxLength: 16, nullable: false),
                    file_size = table.Column<long>(nullable: false),
                    is_stripped = table.Column<bool>(nullable: false, defaultValue: false),
                    has_debug_info = table.Column<bool>(nullable: false, defaultValue: false),
                    created_at = table.Column<DateTimeOffset>(nullable: false, defaultValueSql: "now()")
                },
                constraints: table =>
                {
                    table.PrimaryKey("pk_binary_identity", x => x.id);
                    table.ForeignKey(
                        name: "fk_binary_identity_scan",
                        column: x => x.scan_id,
                        principalTable: "scan",
                        principalColumn: "id",
                        onDelete: ReferentialAction.Cascade);
                });

            // Indexes for lookups
            migrationBuilder.CreateIndex(
                name: "ix_binary_identity_build_id",
                table: "binary_identity",
                column: "build_id");

            migrationBuilder.CreateIndex(
                name: "ix_binary_identity_file_sha256",
                table: "binary_identity",
                column: "file_sha256");

            migrationBuilder.CreateIndex(
                name: "ix_binary_identity_text_sha256",
                table: "binary_identity",
                column: "text_sha256");

            migrationBuilder.CreateIndex(
                name: "ix_binary_identity_scan_id",
                table: "binary_identity",
                column: "scan_id");
        }

        protected override void Down(MigrationBuilder migrationBuilder)
        {
            migrationBuilder.DropTable(name: "binary_identity");
        }
    }
```

**Acceptance Criteria**:
- [ ] Migration creates binary_identity table
- [ ] Columns for build_id, file_sha256, text_sha256, architecture
- [ ] Indexes on lookup columns
- [ ] Foreign key to scan table

---

### T2: Migration - binary_package_map Table

**Assignee**: Scanner Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Create table mapping binaries to packages (PURLs).

**Migration**:
```csharp
public partial class AddBinaryPackageMapTable : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.CreateTable(
            name: "binary_package_map",
            columns: table => new
            {
                id = table.Column<Guid>(nullable: false, defaultValueSql: "gen_random_uuid()"),
                binary_identity_id = table.Column<Guid>(nullable: false),
                purl = table.Column<string>(maxLength: 512, nullable: false),
                match_type = table.Column<string>(maxLength: 32, nullable: false),
                confidence = table.Column<decimal>(precision: 3, scale: 2, nullable: false),
                match_source = table.Column<string>(maxLength: 64, nullable: false),
                evidence_json = table.Column<string>(type: "jsonb", nullable: true),
                created_at = table.Column<DateTimeOffset>(nullable: false, defaultValueSql: "now()")
            },
            constraints: table =>
            {
                table.PrimaryKey("pk_binary_package_map", x => x.id);
                table.ForeignKey(
                    name: "fk_binary_package_map_identity",
                    column: x => x.binary_identity_id,
                    principalTable: "binary_identity",
                    principalColumn: "id",
                    onDelete: ReferentialAction.Cascade);
            });

        migrationBuilder.CreateIndex(
            name: "ix_binary_package_map_purl",
            table: "binary_package_map",
            column: "purl");

        migrationBuilder.CreateIndex(
            name: "ix_binary_package_map_binary_identity_id",
            table: "binary_package_map",
            column: "binary_identity_id");

        // Unique constraint: one mapping per binary per PURL
        migrationBuilder.CreateIndex(
            name: "ix_binary_package_map_unique",
            table: "binary_package_map",
            columns: new[] { "binary_identity_id", "purl" },
            unique: true);
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.DropTable(name: "binary_package_map");
    }
}
```

**Acceptance Criteria**:
- [ ] Migration creates binary_package_map table
- [ ] Links binary identity to PURL
- [ ] Match type and confidence stored
- [ ] Evidence JSON for detailed proof
- [ ] Unique constraint on binary+purl

---

### T3: Migration - binary_vuln_assertion Table

**Assignee**: Scanner Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Create table for binary-level vulnerability assertions.

**Migration**:
```csharp
public partial class AddBinaryVulnAssertionTable : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.CreateTable(
            name: "binary_vuln_assertion",
            columns: table => new
            {
                id = table.Column<Guid>(nullable: false, defaultValueSql: "gen_random_uuid()"),
                binary_identity_id = table.Column<Guid>(nullable: false),
                vuln_id = table.Column<string>(maxLength: 64, nullable: false),
                status = table.Column<string>(maxLength: 32, nullable: false),
                source = table.Column<string>(maxLength: 64, nullable: false),
                assertion_type = table.Column<string>(maxLength: 32, nullable: false),
                confidence = table.Column<decimal>(precision: 3, scale: 2, nullable: false),
                evidence_json = table.Column<string>(type: "jsonb", nullable: true),
                valid_from = table.Column<DateTimeOffset>(nullable: false),
                valid_until = table.Column<DateTimeOffset>(nullable: true),
                signature_ref = table.Column<string>(maxLength: 256, nullable: true),
                created_at = table.Column<DateTimeOffset>(nullable: false, defaultValueSql: "now()")
            },
            constraints: table =>
            {
                table.PrimaryKey("pk_binary_vuln_assertion", x => x.id);
                table.ForeignKey(
                    name: "fk_binary_vuln_assertion_identity",
                    column: x => x.binary_identity_id,
                    principalTable: "binary_identity",
                    principalColumn: "id",
                    onDelete: ReferentialAction.Cascade);
            });

        migrationBuilder.CreateIndex(
            name: "ix_binary_vuln_assertion_vuln_id",
            table: "binary_vuln_assertion",
            column: "vuln_id");

        migrationBuilder.CreateIndex(
            name: "ix_binary_vuln_assertion_binary_identity_id",
            table: "binary_vuln_assertion",
            column: "binary_identity_id");

        migrationBuilder.CreateIndex(
            name: "ix_binary_vuln_assertion_status",
            table: "binary_vuln_assertion",
            column: "status");
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.DropTable(name: "binary_vuln_assertion");
    }
}
```

**Acceptance Criteria**:
- [ ] Migration creates binary_vuln_assertion table
- [ ] Links to binary identity
- [ ] Status (affected/not_affected/fixed)
- [ ] Assertion type (static_analysis, symbol_match, etc.)
- [ ] Validity period
- [ ] Optional signature reference

---

### T4: Create IBinaryEvidenceRepository

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1, T2, T3

**Description**:
Create repository interface and entities.

**Implementation Path**: `Entities/` and `Repositories/`

**Entities**:
```csharp
// Entities/BinaryIdentity.cs
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

namespace StellaOps.Scanner.Storage.Postgres.Entities;

[Table("binary_identity")]
public sealed class BinaryIdentity
{
    [Key]
    [Column("id")]
    public Guid Id { get; init; } = Guid.NewGuid();

    [Column("scan_id")]
    public Guid ScanId { get; init; }

    [Required]
    [MaxLength(1024)]
    [Column("file_path")]
    public required string FilePath { get; init; }

    [Required]
    [MaxLength(64)]
    [Column("file_sha256")]
    public required string FileSha256 { get; init; }

    [MaxLength(64)]
    [Column("text_sha256")]
    public string? TextSha256 { get; init; }

    [MaxLength(128)]
    [Column("build_id")]
    public string? BuildId { get; init; }

    [MaxLength(32)]
    [Column("build_id_type")]
    public string? BuildIdType { get; init; }

    [Required]
    [MaxLength(32)]
    [Column("architecture")]
    public required string Architecture { get; init; }

    [Required]
    [MaxLength(16)]
    [Column("binary_format")]
    public required string BinaryFormat { get; init; }

    [Column("file_size")]
    public long FileSize { get; init; }

    [Column("is_stripped")]
    public bool IsStripped { get; init; }

    [Column("has_debug_info")]
    public bool HasDebugInfo { get; init; }

    [Column("created_at")]
    public DateTimeOffset CreatedAt { get; init; } = DateTimeOffset.UtcNow;

    // Navigation
    public ICollection<BinaryPackageMap> PackageMaps { get; init; } = [];
    public ICollection<BinaryVulnAssertion> VulnAssertions { get; init; } = [];
}

// Entities/BinaryPackageMap.cs
[Table("binary_package_map")]
public sealed class BinaryPackageMap
{
    [Key]
    [Column("id")]
    public Guid Id { get; init; } = Guid.NewGuid();

    [Column("binary_identity_id")]
    public Guid BinaryIdentityId { get; init; }

    [Required]
    [MaxLength(512)]
    [Column("purl")]
    public required string Purl { get; init; }

    [Required]
    [MaxLength(32)]
    [Column("match_type")]
    public required string MatchType { get; init; }

    [Column("confidence")]
    public decimal Confidence { get; init; }

    [Required]
    [MaxLength(64)]
    [Column("match_source")]
    public required string MatchSource { get; init; }

    [Column("evidence_json", TypeName = "jsonb")]
    public string? EvidenceJson { get; init; }

    [Column("created_at")]
    public DateTimeOffset CreatedAt { get; init; } = DateTimeOffset.UtcNow;

    // Navigation
    [ForeignKey(nameof(BinaryIdentityId))]
    public BinaryIdentity? BinaryIdentity { get; init; }
}

// Entities/BinaryVulnAssertion.cs
[Table("binary_vuln_assertion")]
public sealed class BinaryVulnAssertion
{
    [Key]
    [Column("id")]
    public Guid Id { get; init; } = Guid.NewGuid();

    [Column("binary_identity_id")]
    public Guid BinaryIdentityId { get; init; }

    [Required]
    [MaxLength(64)]
    [Column("vuln_id")]
    public required string VulnId { get; init; }

    [Required]
    [MaxLength(32)]
    [Column("status")]
    public required string Status { get; init; }

    [Required]
    [MaxLength(64)]
    [Column("source")]
    public required string Source { get; init; }

    [Required]
    [MaxLength(32)]
    [Column("assertion_type")]
    public required string AssertionType { get; init; }

    [Column("confidence")]
    public decimal Confidence { get; init; }

    [Column("evidence_json", TypeName = "jsonb")]
    public string? EvidenceJson { get; init; }

    [Column("valid_from")]
    public DateTimeOffset ValidFrom { get; init; }

    [Column("valid_until")]
    public DateTimeOffset? ValidUntil { get; init; }

    [MaxLength(256)]
    [Column("signature_ref")]
    public string? SignatureRef { get; init; }

    [Column("created_at")]
    public DateTimeOffset CreatedAt { get; init; } = DateTimeOffset.UtcNow;

    // Navigation
    [ForeignKey(nameof(BinaryIdentityId))]
    public BinaryIdentity? BinaryIdentity { get; init; }
}
```

**Repository Interface**:
```csharp
// Repositories/IBinaryEvidenceRepository.cs
public interface IBinaryEvidenceRepository
{
    // Identity operations
    Task<BinaryIdentity?> GetByIdAsync(Guid id, CancellationToken ct = default);
    Task<BinaryIdentity?> GetByBuildIdAsync(string buildId, CancellationToken ct = default);
    Task<BinaryIdentity?> GetByFileSha256Async(string sha256, CancellationToken ct = default);
    Task<BinaryIdentity?> GetByTextSha256Async(string sha256, CancellationToken ct = default);
    Task<IReadOnlyList<BinaryIdentity>> GetByScanIdAsync(Guid scanId, CancellationToken ct = default);
    Task<BinaryIdentity> AddAsync(BinaryIdentity identity, CancellationToken ct = default);

    // Package map operations
    Task<IReadOnlyList<BinaryPackageMap>> GetPackageMapsAsync(Guid binaryId, CancellationToken ct = default);
    Task<BinaryPackageMap> AddPackageMapAsync(BinaryPackageMap map, CancellationToken ct = default);

    // Vuln assertion operations
    Task<IReadOnlyList<BinaryVulnAssertion>> GetVulnAssertionsAsync(Guid binaryId, CancellationToken ct = default);
    Task<IReadOnlyList<BinaryVulnAssertion>> GetVulnAssertionsByVulnIdAsync(string vulnId, CancellationToken ct = default);
    Task<BinaryVulnAssertion> AddVulnAssertionAsync(BinaryVulnAssertion assertion, CancellationToken ct = default);
}
```

**Acceptance Criteria**:
- [ ] Entity classes created
- [ ] Repository interface defined
- [ ] CRUD operations for all three tables
- [ ] Lookup by build_id, file_sha256, text_sha256
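
The interface above leaves the implementation open. As a minimal sketch of how one lookup and one write could be backed by EF Core — assuming a `ScannerDbContext` that exposes these entities (the context and class names here are illustrative, not part of the sprint scope):

```csharp
// Sketch only: BinaryEvidenceRepository and ScannerDbContext are assumed names.
public sealed class BinaryEvidenceRepository : IBinaryEvidenceRepository
{
    private readonly ScannerDbContext _db;

    public BinaryEvidenceRepository(ScannerDbContext db) => _db = db;

    public Task<BinaryIdentity?> GetByBuildIdAsync(string buildId, CancellationToken ct = default) =>
        _db.Set<BinaryIdentity>()
            .AsNoTracking()                                   // read-only lookup
            .FirstOrDefaultAsync(b => b.BuildId == buildId, ct); // served by ix_binary_identity_build_id

    public async Task<BinaryIdentity> AddAsync(BinaryIdentity identity, CancellationToken ct = default)
    {
        _db.Add(identity);
        await _db.SaveChangesAsync(ct);
        return identity;
    }

    // The remaining interface members follow the same pattern
    // (simple indexed lookups and inserts) and are omitted here.
}
```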

---

### T5: Create BinaryEvidenceService

**Assignee**: Scanner Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T4

**Description**:
Business logic layer for binary evidence.

**Implementation Path**: `Services/BinaryEvidenceService.cs`

**Implementation**:
```csharp
using System.Text.Json;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;
using Npgsql;

namespace StellaOps.Scanner.Storage.Services;

public interface IBinaryEvidenceService
{
    Task<BinaryIdentity> RecordBinaryAsync(
        Guid scanId,
        BinaryInfo binary,
        CancellationToken ct = default);

    Task<BinaryPackageMap?> MatchToPackageAsync(
        Guid binaryId,
        string purl,
        PackageMatchEvidence evidence,
        CancellationToken ct = default);

    Task<BinaryVulnAssertion> RecordAssertionAsync(
        Guid binaryId,
        string vulnId,
        AssertionInfo assertion,
        CancellationToken ct = default);

    Task<BinaryEvidence?> GetEvidenceForBinaryAsync(
        string buildIdOrHash,
        CancellationToken ct = default);
}

public sealed class BinaryEvidenceService : IBinaryEvidenceService
{
    private readonly IBinaryEvidenceRepository _repository;
    private readonly ILogger<BinaryEvidenceService> _logger;

    public BinaryEvidenceService(
        IBinaryEvidenceRepository repository,
        ILogger<BinaryEvidenceService> logger)
    {
        _repository = repository;
        _logger = logger;
    }

    public async Task<BinaryIdentity> RecordBinaryAsync(
        Guid scanId,
        BinaryInfo binary,
        CancellationToken ct = default)
    {
        // Check if we've seen this binary before (by hash)
        var existing = await _repository.GetByFileSha256Async(binary.FileSha256, ct);
        if (existing is not null)
        {
            _logger.LogDebug(
                "Binary {Path} already recorded as {Id}",
                binary.FilePath, existing.Id);
            return existing;
        }

        var identity = new BinaryIdentity
        {
            ScanId = scanId,
            FilePath = binary.FilePath,
            FileSha256 = binary.FileSha256,
            TextSha256 = binary.TextSha256,
            BuildId = binary.BuildId,
            BuildIdType = binary.BuildIdType,
            Architecture = binary.Architecture,
            BinaryFormat = binary.Format,
            FileSize = binary.FileSize,
            IsStripped = binary.IsStripped,
            HasDebugInfo = binary.HasDebugInfo
        };

        return await _repository.AddAsync(identity, ct);
    }

    public async Task<BinaryPackageMap?> MatchToPackageAsync(
        Guid binaryId,
        string purl,
        PackageMatchEvidence evidence,
        CancellationToken ct = default)
    {
        var map = new BinaryPackageMap
        {
            BinaryIdentityId = binaryId,
            Purl = purl,
            MatchType = evidence.MatchType,
            Confidence = evidence.Confidence,
            MatchSource = evidence.Source,
            EvidenceJson = JsonSerializer.Serialize(evidence.Details)
        };

        try
        {
            return await _repository.AddPackageMapAsync(map, ct);
        }
        catch (DbUpdateException ex) when (ex.InnerException is PostgresException { SqlState: "23505" })
        {
            // Unique constraint violation - mapping already exists
            _logger.LogDebug("Package map already exists for {Binary} -> {Purl}", binaryId, purl);
            return null;
        }
    }

    public async Task<BinaryVulnAssertion> RecordAssertionAsync(
        Guid binaryId,
        string vulnId,
        AssertionInfo assertion,
        CancellationToken ct = default)
    {
        var vulnAssertion = new BinaryVulnAssertion
        {
            BinaryIdentityId = binaryId,
            VulnId = vulnId,
            Status = assertion.Status,
            Source = assertion.Source,
            AssertionType = assertion.Type,
            Confidence = assertion.Confidence,
            EvidenceJson = JsonSerializer.Serialize(assertion.Evidence),
            ValidFrom = assertion.ValidFrom,
            ValidUntil = assertion.ValidUntil,
            SignatureRef = assertion.SignatureRef
        };

        return await _repository.AddVulnAssertionAsync(vulnAssertion, ct);
    }

    public async Task<BinaryEvidence?> GetEvidenceForBinaryAsync(
        string buildIdOrHash,
        CancellationToken ct = default)
    {
        // Try build ID first
        var identity = await _repository.GetByBuildIdAsync(buildIdOrHash, ct);

        // Fallback to SHA256
        identity ??= await _repository.GetByFileSha256Async(buildIdOrHash, ct);
        identity ??= await _repository.GetByTextSha256Async(buildIdOrHash, ct);

        if (identity is null)
            return null;

        var packages = await _repository.GetPackageMapsAsync(identity.Id, ct);
        var assertions = await _repository.GetVulnAssertionsAsync(identity.Id, ct);

        return new BinaryEvidence
        {
            Identity = identity,
            PackageMaps = packages,
            VulnAssertions = assertions
        };
    }
}

// DTOs
public sealed record BinaryInfo(
    string FilePath,
    string FileSha256,
    string? TextSha256,
    string? BuildId,
    string? BuildIdType,
    string Architecture,
    string Format,
    long FileSize,
    bool IsStripped,
    bool HasDebugInfo);

public sealed record PackageMatchEvidence(
    string MatchType,
    decimal Confidence,
    string Source,
    object? Details);

public sealed record AssertionInfo(
    string Status,
    string Source,
    string Type,
    decimal Confidence,
    object? Evidence,
    DateTimeOffset ValidFrom,
    DateTimeOffset? ValidUntil,
    string? SignatureRef);

public sealed record BinaryEvidence(
    BinaryIdentity Identity,
    IReadOnlyList<BinaryPackageMap> PackageMaps,
    IReadOnlyList<BinaryVulnAssertion> VulnAssertions);
```

**Acceptance Criteria**:
- [ ] Service interface defined
- [ ] Record binary with dedup check
- [ ] Package mapping with constraint handling
- [ ] Vulnerability assertion recording
- [ ] Evidence retrieval by ID or hash

---

### T6: Integrate with Scanner

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T5

**Description**:
Wire binary evidence service into scanner workflow.

**Implementation Path**: Modify scanner analyzer

**Integration Points**:
```csharp
// In BinaryAnalyzer or similar
public sealed class BinaryAnalyzer : IAnalyzer
{
    private readonly IBinaryEvidenceService _evidenceService;

    public async Task<AnalysisResult> AnalyzeAsync(
        ScanContext context,
        CancellationToken ct = default)
    {
        foreach (var binary in context.Binaries)
        {
            // Parse binary headers
            var info = ParseBinaryInfo(binary);

            // Record in evidence store
            var identity = await _evidenceService.RecordBinaryAsync(
                context.ScanId,
                info,
                ct);

            // Attempt package matching
            var matchResult = await MatchBinaryToPackageAsync(identity, context, ct);
            if (matchResult is not null)
            {
                await _evidenceService.MatchToPackageAsync(
                    identity.Id,
                    matchResult.Purl,
                    new PackageMatchEvidence(
                        MatchType: matchResult.MatchType,
                        Confidence: matchResult.Confidence,
                        Source: "build-id-index",
                        Details: matchResult.Evidence),
                    ct);
            }

            // Record any vuln assertions from static analysis
            // (staticAnalysisResults is produced by the analyzer's existing static-analysis pass)
            foreach (var assertion in staticAnalysisResults)
            {
                await _evidenceService.RecordAssertionAsync(
                    identity.Id,
                    assertion.VulnId,
                    new AssertionInfo(
                        Status: assertion.Status,
                        Source: "static-analysis",
                        Type: assertion.Type,
                        Confidence: assertion.Confidence,
                        Evidence: assertion.Evidence,
                        ValidFrom: DateTimeOffset.UtcNow,
                        ValidUntil: null,
                        SignatureRef: null),
                    ct);
            }
        }

        return results; // aggregated AnalysisResult built by the existing analyzer pipeline
    }
}
```

**Acceptance Criteria**:
- [ ] Scanner calls evidence service
- [ ] Binaries recorded during scan
- [ ] Package matches persisted
- [ ] Vuln assertions stored
- [ ] No performance regression

---

### T7: API Endpoints

**Assignee**: Scanner Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T5

**Description**:
Create API for binary evidence queries.

**Implementation Path**: `src/Scanner/StellaOps.Scanner.WebService/Endpoints/BinaryEvidenceEndpoints.cs`

**Implementation**:
```csharp
public static class BinaryEvidenceEndpoints
{
    public static void MapBinaryEvidenceEndpoints(this WebApplication app)
    {
        var group = app.MapGroup("/api/v1/binaries")
            .WithTags("Binary Evidence")
            .RequireAuthorization();

        // GET /binaries/{id}
        // NOTE: GetEvidenceForBinaryAsync resolves by build-id or hash, so a
        // dedicated primary-key lookup (e.g. via GetByIdAsync) is still needed here.
        group.MapGet("/{id:guid}", async (
            Guid id,
            IBinaryEvidenceService service,
            CancellationToken ct) =>
        {
            var evidence = await service.GetEvidenceForBinaryAsync(id.ToString(), ct);
            return evidence is null ? Results.NotFound() : Results.Ok(evidence);
        })
        .WithName("GetBinaryEvidence")
        .WithDescription("Get evidence for a binary by ID");

        // GET /binaries/by-build-id/{buildId}
        group.MapGet("/by-build-id/{buildId}", async (
            string buildId,
            IBinaryEvidenceService service,
            CancellationToken ct) =>
        {
            var evidence = await service.GetEvidenceForBinaryAsync(buildId, ct);
            return evidence is null ? Results.NotFound() : Results.Ok(evidence);
        })
        .WithName("GetBinaryEvidenceByBuildId")
        .WithDescription("Get evidence for a binary by Build-ID");

        // GET /binaries/by-hash/{hash}
        group.MapGet("/by-hash/{hash}", async (
            string hash,
            IBinaryEvidenceService service,
            CancellationToken ct) =>
        {
            var evidence = await service.GetEvidenceForBinaryAsync(hash, ct);
            return evidence is null ? Results.NotFound() : Results.Ok(evidence);
        })
        .WithName("GetBinaryEvidenceByHash")
        .WithDescription("Get evidence for a binary by SHA256 hash");

        // GET /binaries/scans/{scanId} (route is relative to the /api/v1/binaries group)
        group.MapGet("/scans/{scanId:guid}", async (
            Guid scanId,
            IBinaryEvidenceRepository repository,
            CancellationToken ct) =>
        {
            var binaries = await repository.GetByScanIdAsync(scanId, ct);
            return Results.Ok(binaries);
        })
        .WithName("GetBinariesByScan")
        .WithDescription("Get all binaries from a scan");
    }
}
```

**Acceptance Criteria**:
- [ ] GET /binaries/{id} works
- [ ] GET /binaries/by-build-id/{buildId} works
- [ ] GET /binaries/by-hash/{hash} works
- [ ] GET /binaries/scans/{scanId} works
- [ ] Authorization required
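
For the endpoints to resolve their dependencies, the repository and service also need DI registrations. A minimal sketch, assuming `BinaryEvidenceRepository` is the EF Core-backed implementation of `IBinaryEvidenceRepository` (that class name is an assumption, not part of this sprint's code):

```csharp
// Hypothetical Program.cs wiring for the Scanner web service.
builder.Services.AddScoped<IBinaryEvidenceRepository, BinaryEvidenceRepository>();
builder.Services.AddScoped<IBinaryEvidenceService, BinaryEvidenceService>();

var app = builder.Build();
app.MapBinaryEvidenceEndpoints(); // extension defined in T7
```

Scoped lifetimes keep the repository tied to the per-request DbContext scope.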

---

### T8: Tests

**Assignee**: Scanner Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1-T7

**Description**:
Tests for binary evidence functionality.

**Test Cases**:
```csharp
public class BinaryEvidenceServiceTests : IClassFixture<PostgresFixture>
{
    [Fact]
    public async Task RecordBinary_NewBinary_CreatesRecord()
    {
        var info = new BinaryInfo(
            FilePath: "/usr/lib/libc.so.6",
            FileSha256: "abc123...",
            TextSha256: "def456...",
            BuildId: "aabbccdd",
            BuildIdType: "gnu",
            Architecture: "x86_64",
            Format: "ELF",
            FileSize: 1024000,
            IsStripped: false,
            HasDebugInfo: true);

        var identity = await _service.RecordBinaryAsync(Guid.NewGuid(), info);

        identity.Should().NotBeNull();
        identity.BuildId.Should().Be("aabbccdd");
    }

    [Fact]
    public async Task RecordBinary_DuplicateHash_ReturnsExisting()
    {
        var info = CreateBinaryInfo();
        var first = await _service.RecordBinaryAsync(Guid.NewGuid(), info);
        var second = await _service.RecordBinaryAsync(Guid.NewGuid(), info);

        first.Id.Should().Be(second.Id);
    }

    [Fact]
    public async Task MatchToPackage_Valid_CreatesMapping()
    {
        var identity = await CreateBinaryAsync();
        var evidence = new PackageMatchEvidence(
            MatchType: "build-id",
            Confidence: 0.95m,
            Source: "build-id-index",
            Details: new { debugInfo = "/usr/lib/debug/..." });

        var map = await _service.MatchToPackageAsync(
            identity.Id,
            "pkg:rpm/glibc@2.28",
            evidence);

        map.Should().NotBeNull();
        map!.Confidence.Should().Be(0.95m);
    }

    [Fact]
    public async Task RecordAssertion_Valid_CreatesAssertion()
    {
        var identity = await CreateBinaryAsync();
        var assertion = new AssertionInfo(
            Status: "not_affected",
            Source: "static-analysis",
            Type: "symbol_absence",
            Confidence: 0.8m,
            Evidence: new { checkedSymbols = new[] { "vulnerable_func" } },
            ValidFrom: DateTimeOffset.UtcNow,
            ValidUntil: null,
            SignatureRef: null);

        var result = await _service.RecordAssertionAsync(
            identity.Id,
            "CVE-2024-1234",
            assertion);

        result.Should().NotBeNull();
        result.Status.Should().Be("not_affected");
    }

    [Fact]
    public async Task GetEvidence_ByBuildId_ReturnsComplete()
    {
        var identity = await CreateBinaryWithMappingsAndAssertionsAsync();

        var evidence = await _service.GetEvidenceForBinaryAsync(identity.BuildId!);

        evidence.Should().NotBeNull();
        evidence!.Identity.Id.Should().Be(identity.Id);
        evidence.PackageMaps.Should().NotBeEmpty();
        evidence.VulnAssertions.Should().NotBeEmpty();
    }
}
```

**Acceptance Criteria**:
- [ ] Record binary tests
- [ ] Duplicate handling tests
- [ ] Package mapping tests
- [ ] Vuln assertion tests
- [ ] Evidence retrieval tests
- [ ] Tests use Testcontainers PostgreSQL
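
The `PostgresFixture` referenced above is not defined in this sprint. A minimal sketch using the Testcontainers.PostgreSql package and xUnit's `IAsyncLifetime` — the fixture name matches the tests, but the setup details (migration application, context wiring) are assumptions:

```csharp
// Sketch only: spins up a throwaway PostgreSQL container per test class.
public sealed class PostgresFixture : IAsyncLifetime
{
    private readonly PostgreSqlContainer _container = new PostgreSqlBuilder().Build();

    public string ConnectionString => _container.GetConnectionString();

    public async Task InitializeAsync()
    {
        await _container.StartAsync();
        // Apply EF Core migrations here so binary_identity,
        // binary_package_map, and binary_vuln_assertion exist before tests run.
    }

    public Task DisposeAsync() => _container.DisposeAsync().AsTask();
}
```

Running these tests requires a local Docker daemon; CI must provide one or the suite should be skipped.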

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Scanner Team | Migration: binary_identity table |
| 2 | T2 | TODO | T1 | Scanner Team | Migration: binary_package_map table |
| 3 | T3 | TODO | T1 | Scanner Team | Migration: binary_vuln_assertion table |
| 4 | T4 | TODO | T1-T3 | Scanner Team | Create IBinaryEvidenceRepository |
| 5 | T5 | TODO | T4 | Scanner Team | Create BinaryEvidenceService |
| 6 | T6 | TODO | T5 | Scanner Team | Integrate with scanner |
| 7 | T7 | TODO | T5 | Scanner Team | API endpoints |
| 8 | T8 | TODO | T1-T7 | Scanner Team | Tests |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from UX Gap Analysis. Binary evidence persistence identified as required feature. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Schema design | Decision | Scanner Team | Three tables: identity, package_map, vuln_assertion |
| Dedup by hash | Decision | Scanner Team | Use file_sha256 for deduplication |
| Build-ID index | Decision | Scanner Team | Primary lookup by build-id when available |
| Validity period | Decision | Scanner Team | Assertions can have expiry |

---

## Success Criteria

- [ ] All 8 tasks marked DONE
- [ ] Binary identities persisted to PostgreSQL
- [ ] Package mapping queryable by digest
- [ ] Vulnerability assertions stored
- [ ] Build-ID lookups use persistent store
- [ ] API endpoints work
- [ ] All tests pass
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds

---

**`docs/implplan/SPRINT_4500_0002_0001_vex_conflict_studio.md`** (new file, 1281 lines; diff suppressed because it is too large)

**`docs/implplan/SPRINT_4500_0003_0001_operator_auditor_mode.md`** (new file, 749 lines)

# Sprint 4500.0003.0001 · Operator/Auditor Mode Toggle

## Topic & Scope

- Add a UI mode toggle for operators vs auditors
- Operators see minimal, action-focused views
- Auditors see full provenance, signatures, and evidence
- Persist the preference across sessions

**Working directory:** `src/Web/StellaOps.Web/src/app/core/`

## Dependencies & Concurrency

- **Upstream**: None
- **Downstream**: None
- **Safe to parallelize with**: All other sprints

## Documentation Prerequisites

- `src/Web/StellaOps.Web/AGENTS.md`
- `docs/product-advisories/21-Dec-2025 - How Top Scanners Shape Evidence‑First UX.md`
- Angular service patterns

---

## Problem Statement

The same UI serves two audiences with different needs:

- **Operators**: Need speed, want quick answers ("Can I ship?"), minimal detail
- **Auditors**: Need completeness, want full provenance, signatures, evidence chains

Currently, there is no way to toggle between these views.

---

## Tasks

### T1: Create ViewModeService

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create a service to manage operator/auditor view state.

**Implementation Path**: `services/view-mode.service.ts` (new file)

**Implementation**:
```typescript
import { Injectable, signal, computed, effect } from '@angular/core';

export type ViewMode = 'operator' | 'auditor';

export interface ViewModeConfig {
  showSignatures: boolean;
  showProvenance: boolean;
  showEvidenceDetails: boolean;
  showSnapshots: boolean;
  showMergeTraces: boolean;
  showPolicyDetails: boolean;
  compactFindings: boolean;
  autoExpandEvidence: boolean;
}

const OPERATOR_CONFIG: ViewModeConfig = {
  showSignatures: false,
  showProvenance: false,
  showEvidenceDetails: false,
  showSnapshots: false,
  showMergeTraces: false,
  showPolicyDetails: false,
  compactFindings: true,
  autoExpandEvidence: false
};

const AUDITOR_CONFIG: ViewModeConfig = {
  showSignatures: true,
  showProvenance: true,
  showEvidenceDetails: true,
  showSnapshots: true,
  showMergeTraces: true,
  showPolicyDetails: true,
  compactFindings: false,
  autoExpandEvidence: true
};

const STORAGE_KEY = 'stella-view-mode';

@Injectable({ providedIn: 'root' })
export class ViewModeService {
  // Current mode
  private readonly _mode = signal<ViewMode>(this.loadFromStorage());

  // Public readonly signals
  readonly mode = this._mode.asReadonly();

  // Computed config based on mode
  readonly config = computed<ViewModeConfig>(() => {
    return this._mode() === 'operator' ? OPERATOR_CONFIG : AUDITOR_CONFIG;
  });

  // Convenience computed properties
  readonly isOperator = computed(() => this._mode() === 'operator');
  readonly isAuditor = computed(() => this._mode() === 'auditor');
  readonly showSignatures = computed(() => this.config().showSignatures);
  readonly showProvenance = computed(() => this.config().showProvenance);
  readonly showEvidenceDetails = computed(() => this.config().showEvidenceDetails);
  readonly showSnapshots = computed(() => this.config().showSnapshots);
  readonly compactFindings = computed(() => this.config().compactFindings);

  constructor() {
    // Persist changes to storage
    effect(() => {
      const mode = this._mode();
      localStorage.setItem(STORAGE_KEY, mode);
    });
  }

  /**
   * Toggle between operator and auditor mode.
   */
  toggle(): void {
    this._mode.set(this._mode() === 'operator' ? 'auditor' : 'operator');
  }

  /**
   * Set a specific mode.
   */
  setMode(mode: ViewMode): void {
    this._mode.set(mode);
  }

  /**
   * Check if a specific feature should be shown.
   */
  shouldShow(feature: keyof ViewModeConfig): boolean {
    return this.config()[feature] as boolean;
  }

  private loadFromStorage(): ViewMode {
    const stored = localStorage.getItem(STORAGE_KEY);
    if (stored === 'operator' || stored === 'auditor') {
      return stored;
    }
    return 'operator'; // Default to operator mode
  }
}
```
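
The mode-to-config mapping above is a pure function at heart. A framework-free sketch (no Angular; `toggleMode` and `configFor` are hypothetical stand-ins for `ViewModeService.toggle()` and `config()`) shows the invariant the service relies on: toggling flips every auditor-only flag, and toggling twice returns to the starting mode.

```typescript
type ViewMode = 'operator' | 'auditor';

interface ViewModeConfig {
  showSignatures: boolean;
  compactFindings: boolean;
  autoExpandEvidence: boolean;
}

// Trimmed-down versions of the OPERATOR_CONFIG / AUDITOR_CONFIG constants.
const CONFIGS: Record<ViewMode, ViewModeConfig> = {
  operator: { showSignatures: false, compactFindings: true, autoExpandEvidence: false },
  auditor: { showSignatures: true, compactFindings: false, autoExpandEvidence: true },
};

// Pure equivalents of ViewModeService.toggle() and config().
const toggleMode = (m: ViewMode): ViewMode => (m === 'operator' ? 'auditor' : 'operator');
const configFor = (m: ViewMode): ViewModeConfig => CONFIGS[m];

let mode: ViewMode = 'operator';
mode = toggleMode(mode);
console.log(mode);                           // 'auditor'
console.log(configFor(mode).showSignatures); // true
```

Because the service's signals are just this mapping plus persistence, the configs can be unit-tested without a TestBed at all.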

**Acceptance Criteria**:

- [ ] `ViewModeService` file created
- [ ] Signal-based reactive state
- [ ] Config objects for each mode
- [ ] LocalStorage persistence
- [ ] Toggle and setMode methods

---

### T2: Add Mode Toggle Component

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Create a toggle switch for the header.

**Implementation Path**: `components/view-mode-toggle/view-mode-toggle.component.ts`

**Implementation**:
```typescript
import { Component, ChangeDetectionStrategy, inject } from '@angular/core';
import { CommonModule } from '@angular/common';
import { MatSlideToggleModule } from '@angular/material/slide-toggle';
import { MatIconModule } from '@angular/material/icon';
import { MatTooltipModule } from '@angular/material/tooltip';
import { ViewModeService } from '../../services/view-mode.service';

@Component({
  selector: 'stella-view-mode-toggle',
  standalone: true,
  imports: [CommonModule, MatSlideToggleModule, MatIconModule, MatTooltipModule],
  template: `
    <div class="view-mode-toggle" [matTooltip]="tooltipText()">
      <mat-icon class="mode-icon">{{ isAuditor() ? 'verified_user' : 'speed' }}</mat-icon>
      <mat-slide-toggle
        [checked]="isAuditor()"
        (change)="onToggle()"
        color="primary"
      ></mat-slide-toggle>
      <span class="mode-label">{{ modeLabel() }}</span>
    </div>
  `,
  styles: [`
    .view-mode-toggle {
      display: flex;
      align-items: center;
      gap: 8px;
      padding: 4px 12px;
      background: var(--surface-variant);
      border-radius: 20px;

      .mode-icon {
        font-size: 18px;
        width: 18px;
        height: 18px;
      }

      .mode-label {
        font-size: 0.875rem;
        font-weight: 500;
        min-width: 60px;
      }
    }
  `],
  changeDetection: ChangeDetectionStrategy.OnPush
})
export class ViewModeToggleComponent {
  // inject() avoids field-initializer ordering issues with constructor injection
  private readonly viewModeService = inject(ViewModeService);

  readonly isAuditor = this.viewModeService.isAuditor;

  modeLabel(): string {
    return this.viewModeService.isAuditor() ? 'Auditor' : 'Operator';
  }

  tooltipText(): string {
    return this.viewModeService.isAuditor()
      ? 'Full provenance and evidence details. Switch to Operator for streamlined view.'
      : 'Streamlined action-focused view. Switch to Auditor for full details.';
  }

  onToggle(): void {
    this.viewModeService.toggle();
  }
}
```

**Add to Header**:

```typescript
// In app-header.component.ts
import { ViewModeToggleComponent } from '../view-mode-toggle/view-mode-toggle.component';

@Component({
  // ...
  imports: [
    // ...
    ViewModeToggleComponent
  ],
  template: `
    <mat-toolbar>
      <span class="logo">Stella Ops</span>
      <span class="spacer"></span>

      <!-- View Mode Toggle -->
      <stella-view-mode-toggle></stella-view-mode-toggle>

      <!-- User menu etc -->
    </mat-toolbar>
  `
})
export class AppHeaderComponent {}
```

**Acceptance Criteria**:

- [ ] Toggle component created
- [ ] Shows in header
- [ ] Icon changes per mode
- [ ] Label shows current mode
- [ ] Tooltip explains modes

---

### T3: Operator Mode Defaults

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Define and implement operator mode display rules.

**Implementation** - Operator Mode Directive:
```typescript
// directives/auditor-only.directive.ts
import { Directive, TemplateRef, ViewContainerRef, effect } from '@angular/core';
import { ViewModeService } from '../services/view-mode.service';

/**
 * Shows content only in auditor mode.
 * Usage: <div *stellaAuditorOnly>Full provenance details...</div>
 */
@Directive({
  selector: '[stellaAuditorOnly]',
  standalone: true
})
export class AuditorOnlyDirective {
  constructor(
    private templateRef: TemplateRef<any>,
    private viewContainer: ViewContainerRef,
    private viewModeService: ViewModeService
  ) {
    effect(() => {
      if (this.viewModeService.isAuditor()) {
        this.viewContainer.createEmbeddedView(this.templateRef);
      } else {
        this.viewContainer.clear();
      }
    });
  }
}

/**
 * Shows content only in operator mode.
 * Usage: <div *stellaOperatorOnly>Quick action buttons...</div>
 */
@Directive({
  selector: '[stellaOperatorOnly]',
  standalone: true
})
export class OperatorOnlyDirective {
  constructor(
    private templateRef: TemplateRef<any>,
    private viewContainer: ViewContainerRef,
    private viewModeService: ViewModeService
  ) {
    effect(() => {
      if (this.viewModeService.isOperator()) {
        this.viewContainer.createEmbeddedView(this.templateRef);
      } else {
        this.viewContainer.clear();
      }
    });
  }
}
```

**Operator Mode Features**:

- Compact finding cards
- Hide signature details
- Hide merge traces
- Hide snapshot info
- Show only verdict, not reasoning
- Quick action buttons prominent

**Acceptance Criteria**:

- [ ] AuditorOnly directive created
- [ ] OperatorOnly directive created
- [ ] Operator mode shows minimal UI
- [ ] No signature details in operator mode
- [ ] Quick actions prominent

---

### T4: Auditor Mode Defaults

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Define and implement auditor mode display rules.

**Auditor Mode Features**:

- Expanded finding cards by default
- Full signature verification display
- Complete merge traces
- Snapshot IDs and links
- Policy rule details
- Evidence chains
- DSSE envelope viewer
- Rekor transparency log links

**Implementation** - Auditor-specific components:
```typescript
// components/signature-badge/signature-badge.component.ts
import { Component, Input, inject } from '@angular/core';
import { CommonModule } from '@angular/common';
import { MatIconModule } from '@angular/material/icon';
import { ViewModeService } from '../../services/view-mode.service';

@Component({
  selector: 'stella-signature-badge',
  standalone: true,
  imports: [CommonModule, MatIconModule],
  template: `
    <div class="signature-badge" *ngIf="viewMode.showSignatures()">
      <mat-icon [class.valid]="signature.valid">
        {{ signature.valid ? 'verified' : 'dangerous' }}
      </mat-icon>
      <div class="details">
        <span class="signer">{{ signature.signedBy }}</span>
        <span class="timestamp">{{ signature.signedAt | date:'medium' }}</span>
        <a *ngIf="signature.rekorLogIndex" [href]="rekorUrl" target="_blank">
          Rekor #{{ signature.rekorLogIndex }}
        </a>
      </div>
    </div>
  `
})
export class SignatureBadgeComponent {
  @Input() signature!: SignatureInfo;

  viewMode = inject(ViewModeService);

  get rekorUrl(): string {
    return `https://search.sigstore.dev/?logIndex=${this.signature.rekorLogIndex}`;
  }
}
```

**Acceptance Criteria**:

- [ ] Auditor mode shows full details
- [ ] Signature badges with verification
- [ ] Rekor links when available
- [ ] Merge traces visible
- [ ] Snapshot references shown

---

### T5: Component Conditionals

**Assignee**: UI Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1, T3, T4

**Description**:
Update existing components to respect the view mode.

**Implementation** - Update case-header.component.ts:
```typescript
// Update CaseHeaderComponent
@Component({
  // ...
  template: `
    <div class="case-header">
      <!-- Always visible -->
      <div class="verdict-section">
        <button mat-flat-button [class]="verdictClass">
          <mat-icon>{{ verdictIcon }}</mat-icon>
          {{ verdictLabel }}
        </button>

        <!-- Auditor only: signed badge -->
        <button
          *stellaAuditorOnly
          mat-icon-button
          class="signed-badge"
          (click)="onAttestationClick()"
          matTooltip="View DSSE attestation"
        >
          <mat-icon>verified</mat-icon>
        </button>
      </div>

      <!-- Operator: compact summary -->
      <div *stellaOperatorOnly class="quick-summary">
        {{ data.actionableCount }} items need attention
      </div>

      <!-- Auditor: detailed breakdown -->
      <div *stellaAuditorOnly class="detailed-breakdown">
        <div class="delta-section">{{ deltaText }}</div>
        <div class="snapshot-section">
          Snapshot: {{ shortSnapshotId }}
        </div>
      </div>
    </div>
  `
})
export class CaseHeaderComponent {
  viewMode = inject(ViewModeService);
  // ...
}
```

**Implementation** - Update verdict-ladder.component.ts:

```typescript
@Component({
  template: `
    <div class="verdict-ladder">
      <!-- Step details conditional on mode -->
      <mat-expansion-panel
        *ngFor="let step of steps"
        [expanded]="viewMode.config().autoExpandEvidence"
      >
        <mat-expansion-panel-header>
          <span>{{ step.name }}</span>
          <!-- Operator: just status icon -->
          <mat-icon *stellaOperatorOnly>{{ getStepIcon(step) }}</mat-icon>
          <!-- Auditor: full summary -->
          <span *stellaAuditorOnly>{{ step.summary }}</span>
        </mat-expansion-panel-header>

        <!-- Evidence details only in auditor mode -->
        <div *stellaAuditorOnly class="step-evidence">
          <!-- Full evidence display -->
        </div>
      </mat-expansion-panel>
    </div>
  `
})
export class VerdictLadderComponent {
  viewMode = inject(ViewModeService);
}
```

**Files to Update**:

- `case-header.component.ts`
- `verdict-ladder.component.ts`
- `triage-finding-card.component.ts`
- `evidence-chip.component.ts`
- `decision-card.component.ts`
- `compare-view.component.ts`

**Acceptance Criteria**:

- [ ] Case header respects view mode
- [ ] Verdict ladder respects view mode
- [ ] Finding cards compact in operator mode
- [ ] Evidence details hidden in operator mode
- [ ] All affected components updated

---

### T6: Persist Preference

**Assignee**: UI Team
**Story Points**: 1
**Status**: TODO
**Dependencies**: T1

**Description**:
Save the preference to LocalStorage and the user settings API.

**Implementation** - Already in ViewModeService (T1); add user settings sync:
```typescript
// Update ViewModeService
@Injectable({ providedIn: 'root' })
export class ViewModeService {
  constructor(private userSettingsService: UserSettingsService) {
    // Load from user settings if logged in, otherwise localStorage
    this.loadPreference();

    // Sync to server when changed
    effect(() => {
      const mode = this._mode();
      localStorage.setItem(STORAGE_KEY, mode);

      // Also sync to user settings API if authenticated
      if (this.userSettingsService.isAuthenticated()) {
        this.userSettingsService.updateSetting('viewMode', mode);
      }
    });
  }

  private async loadPreference(): Promise<void> {
    // Try user settings first
    if (this.userSettingsService.isAuthenticated()) {
      const settings = await this.userSettingsService.getSettings();
      if (settings?.viewMode) {
        this._mode.set(settings.viewMode);
        return;
      }
    }

    // Fall back to localStorage
    const stored = localStorage.getItem(STORAGE_KEY);
    if (stored === 'operator' || stored === 'auditor') {
      this._mode.set(stored);
    }
  }
}
```
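
The resolution order buried in `loadPreference()` is worth pinning down as a pure function: server-side user setting wins, then a valid localStorage value, then the operator default. A framework-free sketch (`resolveMode` is a hypothetical helper, not part of the service) makes the precedence testable without Angular or storage mocks:

```typescript
type ViewMode = 'operator' | 'auditor';

// Type guard: only the two known mode strings are accepted.
function isViewMode(v: unknown): v is ViewMode {
  return v === 'operator' || v === 'auditor';
}

// Precedence: server-side user setting, then localStorage value, then default.
function resolveMode(serverSetting: unknown, stored: unknown): ViewMode {
  if (isViewMode(serverSetting)) return serverSetting;
  if (isViewMode(stored)) return stored;
  return 'operator';
}

console.log(resolveMode(undefined, 'auditor')); // 'auditor'  (localStorage fallback)
console.log(resolveMode('operator', 'auditor')); // 'operator' (server wins)
console.log(resolveMode(null, 'bogus'));         // 'operator' (invalid values ignored)
```

Validating untrusted inputs through the type guard also protects against corrupted or hand-edited localStorage values.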

**Acceptance Criteria**:

- [ ] LocalStorage persistence works
- [ ] User settings API sync (if authenticated)
- [ ] Preference loaded on app init
- [ ] Survives page refresh

---

### T7: Tests

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1-T6

**Description**:
Test view mode switching behavior.

**Test Cases**:
```typescript
describe('ViewModeService', () => {
  let service: ViewModeService;

  beforeEach(() => {
    localStorage.clear();
    TestBed.configureTestingModule({});
    service = TestBed.inject(ViewModeService);
  });

  it('should default to operator mode', () => {
    expect(service.mode()).toBe('operator');
  });

  it('should toggle between modes', () => {
    expect(service.mode()).toBe('operator');

    service.toggle();
    expect(service.mode()).toBe('auditor');

    service.toggle();
    expect(service.mode()).toBe('operator');
  });

  it('should persist to localStorage', () => {
    service.setMode('auditor');

    expect(localStorage.getItem('stella-view-mode')).toBe('auditor');
  });

  it('should load from localStorage', () => {
    localStorage.setItem('stella-view-mode', 'auditor');

    // Reset the testing module so a fresh service instance re-reads storage
    TestBed.resetTestingModule();
    TestBed.configureTestingModule({});
    const newService = TestBed.inject(ViewModeService);
    expect(newService.mode()).toBe('auditor');
  });

  it('should return operator config', () => {
    service.setMode('operator');

    expect(service.config().showSignatures).toBe(false);
    expect(service.config().compactFindings).toBe(true);
  });

  it('should return auditor config', () => {
    service.setMode('auditor');

    expect(service.config().showSignatures).toBe(true);
    expect(service.config().compactFindings).toBe(false);
  });
});

describe('ViewModeToggleComponent', () => {
  it('should show operator label by default', () => {
    const fixture = TestBed.createComponent(ViewModeToggleComponent);
    fixture.detectChanges();

    expect(fixture.nativeElement.textContent).toContain('Operator');
  });

  it('should toggle on click', () => {
    const fixture = TestBed.createComponent(ViewModeToggleComponent);
    const service = TestBed.inject(ViewModeService);
    fixture.detectChanges();

    // MDC slide toggle renders an inner button that receives the click
    const toggle = fixture.nativeElement.querySelector('mat-slide-toggle button');
    toggle.click();

    expect(service.mode()).toBe('auditor');
  });
});

describe('AuditorOnlyDirective', () => {
  @Component({
    standalone: true,
    imports: [AuditorOnlyDirective],
    template: `<div *stellaAuditorOnly>Auditor content</div>`
  })
  class TestComponent {}

  it('should hide content in operator mode', () => {
    const service = TestBed.inject(ViewModeService);
    service.setMode('operator');

    const fixture = TestBed.createComponent(TestComponent);
    fixture.detectChanges();

    expect(fixture.nativeElement.textContent).not.toContain('Auditor content');
  });

  it('should show content in auditor mode', () => {
    const service = TestBed.inject(ViewModeService);
    service.setMode('auditor');

    const fixture = TestBed.createComponent(TestComponent);
    fixture.detectChanges();

    expect(fixture.nativeElement.textContent).toContain('Auditor content');
  });
});
```

**Acceptance Criteria**:

- [ ] Service tests for toggle
- [ ] Service tests for config
- [ ] Service tests for persistence
- [ ] Toggle component tests
- [ ] Directive tests
- [ ] All tests pass

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | UI Team | Create ViewModeService |
| 2 | T2 | TODO | T1 | UI Team | Add mode toggle component |
| 3 | T3 | TODO | T1 | UI Team | Operator mode defaults |
| 4 | T4 | TODO | T1 | UI Team | Auditor mode defaults |
| 5 | T5 | TODO | T1, T3, T4 | UI Team | Component conditionals |
| 6 | T6 | TODO | T1 | UI Team | Persist preference |
| 7 | T7 | TODO | T1-T6 | UI Team | Tests |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from UX Gap Analysis. Operator/Auditor mode toggle identified as key UX differentiator. | Claude |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Default mode | Decision | UI Team | Default to Operator (most common use case) |
| Signal-based | Decision | UI Team | Use Angular signals for reactivity |
| Persistence | Decision | UI Team | LocalStorage + user settings API |
| Directives | Decision | UI Team | Use structural directives for show/hide |

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] Toggle visible in header
- [ ] Operator mode shows minimal info
- [ ] Auditor mode shows full provenance
- [ ] Preference persists across sessions
- [ ] All affected components updated
- [ ] All tests pass
- [ ] `ng build` succeeds
- [ ] `ng test` succeeds

---

**`docs/implplan/SPRINT_5100_0001_0001_run_manifest_schema.md`** (new file, 581 lines)

# Sprint 5100.0001.0001 · Run Manifest Schema

## Topic & Scope

- Define the Run Manifest schema as the foundational artifact for deterministic replay.
- Capture all inputs required to reproduce a scan verdict: artifact digests, feed versions, policy versions, tool versions, PRNG seed, and canonicalization version.
- Implement C# models, JSON schema, serialization utilities, and validation.
- **Working directory:** `src/__Libraries/StellaOps.Testing.Manifests/`

## Dependencies & Concurrency

- **Upstream**: None (foundational sprint)
- **Downstream**: Sprint 5100.0002.0002 (Replay Runner) depends on this
- **Safe to parallelize with**: Sprints 5100.0001.0002, 5100.0001.0003, 5100.0001.0004

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- `docs/07_HIGH_LEVEL_ARCHITECTURE.md`
- `docs/modules/scanner/architecture.md`

---

## Tasks

### T1: Define RunManifest Domain Model

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Create the core RunManifest domain model that captures all inputs for a reproducible scan.

**Implementation Path**: `src/__Libraries/StellaOps.Testing.Manifests/Models/RunManifest.cs`

**Model Definition**:
```csharp
using System.Collections.Immutable;

namespace StellaOps.Testing.Manifests.Models;

/// <summary>
/// Captures all inputs required to reproduce a scan verdict deterministically.
/// This is the "replay key" that enables time-travel verification.
/// </summary>
public sealed record RunManifest
{
    /// <summary>
    /// Unique identifier for this run.
    /// </summary>
    public required string RunId { get; init; }

    /// <summary>
    /// Schema version for forward compatibility.
    /// Not marked required so the "1.0.0" default applies when unset.
    /// </summary>
    public string SchemaVersion { get; init; } = "1.0.0";

    /// <summary>
    /// Artifact digests being scanned (image layers, binaries, etc.).
    /// </summary>
    public required ImmutableArray<ArtifactDigest> ArtifactDigests { get; init; }

    /// <summary>
    /// SBOM digests produced or consumed during the run.
    /// </summary>
    public ImmutableArray<SbomReference> SbomDigests { get; init; } = [];

    /// <summary>
    /// Vulnerability feed snapshot used for matching.
    /// </summary>
    public required FeedSnapshot FeedSnapshot { get; init; }

    /// <summary>
    /// Policy version and lattice rules digest.
    /// </summary>
    public required PolicySnapshot PolicySnapshot { get; init; }

    /// <summary>
    /// Tool versions used in the scan pipeline.
    /// </summary>
    public required ToolVersions ToolVersions { get; init; }

    /// <summary>
    /// Cryptographic profile: trust roots, key IDs, algorithm set.
    /// </summary>
    public required CryptoProfile CryptoProfile { get; init; }

    /// <summary>
    /// Environment profile: postgres-only vs postgres+valkey.
    /// </summary>
    public required EnvironmentProfile EnvironmentProfile { get; init; }

    /// <summary>
    /// PRNG seed for any randomized operations (ensures reproducibility).
    /// </summary>
    public long? PrngSeed { get; init; }

    /// <summary>
    /// Canonicalization algorithm version for stable JSON output.
    /// </summary>
    public required string CanonicalizationVersion { get; init; }

    /// <summary>
    /// UTC timestamp when the run was initiated.
    /// </summary>
    public required DateTimeOffset InitiatedAt { get; init; }

    /// <summary>
    /// SHA-256 hash of this manifest (excluding this field).
    /// </summary>
    public string? ManifestDigest { get; init; }
}

public sealed record ArtifactDigest(
    string Algorithm,   // sha256, sha512
    string Digest,
    string? MediaType,
    string? Reference); // image ref, file path

public sealed record SbomReference(
    string Format,      // cyclonedx-1.6, spdx-3.0.1
    string Digest,
    string? Uri);

public sealed record FeedSnapshot(
    string FeedId,
    string Version,
    string Digest,
    DateTimeOffset SnapshotAt);

public sealed record PolicySnapshot(
    string PolicyVersion,
    string LatticeRulesDigest,
    ImmutableArray<string> EnabledRules);

public sealed record ToolVersions(
    string ScannerVersion,
    string SbomGeneratorVersion,
    string ReachabilityEngineVersion,
    string AttestorVersion,
    ImmutableDictionary<string, string> AdditionalTools);

public sealed record CryptoProfile(
    string ProfileName, // fips, eidas, gost, sm, default
    ImmutableArray<string> TrustRootIds,
    ImmutableArray<string> AllowedAlgorithms);

public sealed record EnvironmentProfile(
    string Name,        // postgres-only, postgres-valkey
    bool ValkeyEnabled,
    string? PostgresVersion,
    string? ValkeyVersion);
```

**Acceptance Criteria**:

- [ ] `RunManifest.cs` created with all fields
- [ ] Supporting records for each component (ArtifactDigest, FeedSnapshot, etc.)
- [ ] ImmutableArray/ImmutableDictionary for collections
- [ ] XML documentation on all types and properties
- [ ] Nullable fields use `?` appropriately

---

### T2: JSON Schema Definition

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Create JSON Schema for RunManifest validation and documentation.

**Implementation Path**: `src/__Libraries/StellaOps.Testing.Manifests/Schemas/run-manifest.schema.json`

**Schema Outline**:

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "$id": "https://stellaops.io/schemas/run-manifest/v1",
  "title": "StellaOps Run Manifest",
  "description": "Captures all inputs for deterministic scan replay",
  "type": "object",
  "required": [
    "runId", "schemaVersion", "artifactDigests", "feedSnapshot",
    "policySnapshot", "toolVersions", "cryptoProfile",
    "environmentProfile", "canonicalizationVersion", "initiatedAt"
  ],
  "properties": {
    "runId": { "type": "string", "format": "uuid" },
    "schemaVersion": { "type": "string", "pattern": "^\\d+\\.\\d+\\.\\d+$" },
    "artifactDigests": {
      "type": "array",
      "items": { "$ref": "#/$defs/artifactDigest" },
      "minItems": 1
    }
  },
  "$defs": {
    "artifactDigest": {
      "type": "object",
      "required": ["algorithm", "digest"],
      "properties": {
        "algorithm": { "enum": ["sha256", "sha512"] },
        "digest": { "type": "string", "pattern": "^([a-f0-9]{64}|[a-f0-9]{128})$" }
      }
    }
  }
}
```

**Acceptance Criteria**:

- [ ] Complete JSON Schema covering all fields
- [ ] Schema validates sample manifests correctly
- [ ] Schema rejects invalid manifests
- [ ] Embedded as resource in assembly

---

### T3: Serialization Utilities

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Implement serialization/deserialization with canonical JSON output.

**Implementation Path**: `src/__Libraries/StellaOps.Testing.Manifests/Serialization/RunManifestSerializer.cs`

**Implementation**:

```csharp
using System.Security.Cryptography;
using System.Text;
using System.Text.Encodings.Web;
using System.Text.Json;
using System.Text.Json.Serialization;

namespace StellaOps.Testing.Manifests.Serialization;

public static class RunManifestSerializer
{
    private static readonly JsonSerializerOptions CanonicalOptions = new()
    {
        WriteIndented = false,
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
        Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping,
        // Custom converter for stable key ordering
        Converters = { new StableOrderDictionaryConverter() }
    };

    public static string Serialize(RunManifest manifest) =>
        JsonSerializer.Serialize(manifest, CanonicalOptions);

    public static RunManifest Deserialize(string json) =>
        JsonSerializer.Deserialize<RunManifest>(json, CanonicalOptions)
            ?? throw new InvalidOperationException("Failed to deserialize manifest");

    public static string ComputeDigest(RunManifest manifest)
    {
        var withoutDigest = manifest with { ManifestDigest = null };
        var json = Serialize(withoutDigest);
        return Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(json))).ToLowerInvariant();
    }

    public static RunManifest WithDigest(RunManifest manifest) =>
        manifest with { ManifestDigest = ComputeDigest(manifest) };
}
```

**Acceptance Criteria**:

- [ ] Canonical JSON output (stable key ordering)
- [ ] Round-trip serialization preserves data
- [ ] Digest computation excludes ManifestDigest field
- [ ] UTF-8 encoding consistently applied
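
The serializer above references `StableOrderDictionaryConverter` without defining it. A minimal sketch of one possible implementation, assuming string-keyed immutable dictionaries (as used by `ToolVersions.AdditionalTools`) and ordinal key ordering; the class name comes from the snippet, everything else here is illustrative:

```csharp
using System.Collections.Immutable;
using System.Linq;
using System.Text.Json;
using System.Text.Json.Serialization;

/// <summary>
/// Writes string-keyed dictionaries with keys sorted ordinally so that
/// serialization output is byte-stable regardless of insertion order.
/// </summary>
public sealed class StableOrderDictionaryConverter
    : JsonConverter<ImmutableDictionary<string, string>>
{
    public override ImmutableDictionary<string, string> Read(
        ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        // Ordering only matters on write; read into a plain dictionary.
        var dict = JsonSerializer.Deserialize<Dictionary<string, string>>(ref reader)
            ?? throw new JsonException("Expected a JSON object");
        return dict.ToImmutableDictionary();
    }

    public override void Write(
        Utf8JsonWriter writer, ImmutableDictionary<string, string> value, JsonSerializerOptions options)
    {
        writer.WriteStartObject();
        foreach (var pair in value.OrderBy(p => p.Key, StringComparer.Ordinal))
        {
            writer.WriteString(pair.Key, pair.Value);
        }
        writer.WriteEndObject();
    }
}
```

A production version would likely be a `JsonConverterFactory` covering arbitrary value types; the single closed type above keeps the sketch short.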

---

### T4: Manifest Validation Service

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T2, T3

**Description**:
Validate manifests against schema and business rules.

**Implementation Path**: `src/__Libraries/StellaOps.Testing.Manifests/Validation/RunManifestValidator.cs`

**Implementation**:

```csharp
using System.Text.Json;
using Json.Schema;

namespace StellaOps.Testing.Manifests.Validation;

public sealed class RunManifestValidator : IRunManifestValidator
{
    private readonly JsonSchema _schema;

    public RunManifestValidator()
    {
        var schemaJson = EmbeddedResources.GetSchema("run-manifest.schema.json");
        _schema = JsonSchema.FromText(schemaJson);
    }

    public ValidationResult Validate(RunManifest manifest)
    {
        var errors = new List<ValidationError>();

        // Schema validation
        var json = RunManifestSerializer.Serialize(manifest);
        var schemaResult = _schema.Evaluate(JsonDocument.Parse(json));
        if (!schemaResult.IsValid)
        {
            errors.AddRange(schemaResult.Errors.Select(e =>
                new ValidationError("Schema", e.Message)));
        }

        // Business rules
        if (manifest.ArtifactDigests.Length == 0)
            errors.Add(new ValidationError("ArtifactDigests", "At least one artifact required"));

        if (manifest.FeedSnapshot.SnapshotAt > manifest.InitiatedAt)
            errors.Add(new ValidationError("FeedSnapshot", "Feed snapshot cannot be after run initiation"));

        // Digest verification
        if (manifest.ManifestDigest != null)
        {
            var computed = RunManifestSerializer.ComputeDigest(manifest);
            if (computed != manifest.ManifestDigest)
                errors.Add(new ValidationError("ManifestDigest", "Digest mismatch"));
        }

        return new ValidationResult(errors.Count == 0, errors);
    }
}

public sealed record ValidationResult(bool IsValid, IReadOnlyList<ValidationError> Errors);
public sealed record ValidationError(string Field, string Message);
```

**Acceptance Criteria**:

- [ ] Schema validation integrated
- [ ] Business rule validation (non-empty artifacts, timestamp ordering)
- [ ] Digest verification
- [ ] Clear error messages

---

### T5: Manifest Capture Service

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1, T3

**Description**:
Service to capture run manifests during scan execution.

**Implementation Path**: `src/__Libraries/StellaOps.Testing.Manifests/Services/ManifestCaptureService.cs`

**Implementation**:

```csharp
namespace StellaOps.Testing.Manifests.Services;

public sealed class ManifestCaptureService : IManifestCaptureService
{
    private readonly IFeedVersionProvider _feedProvider;
    private readonly IPolicyVersionProvider _policyProvider;
    private readonly TimeProvider _timeProvider;

    public ManifestCaptureService(
        IFeedVersionProvider feedProvider,
        IPolicyVersionProvider policyProvider,
        TimeProvider timeProvider)
    {
        _feedProvider = feedProvider;
        _policyProvider = policyProvider;
        _timeProvider = timeProvider;
    }

    public async Task<RunManifest> CaptureAsync(
        ScanContext context,
        CancellationToken ct = default)
    {
        var feedSnapshot = await _feedProvider.GetCurrentSnapshotAsync(ct);
        var policySnapshot = await _policyProvider.GetCurrentSnapshotAsync(ct);

        var manifest = new RunManifest
        {
            RunId = context.RunId,
            SchemaVersion = "1.0.0",
            ArtifactDigests = context.ArtifactDigests,
            SbomDigests = context.GeneratedSboms,
            FeedSnapshot = feedSnapshot,
            PolicySnapshot = policySnapshot,
            ToolVersions = GetToolVersions(),
            CryptoProfile = context.CryptoProfile,
            EnvironmentProfile = GetEnvironmentProfile(),
            PrngSeed = context.PrngSeed,
            CanonicalizationVersion = "1.0.0",
            InitiatedAt = _timeProvider.GetUtcNow()
        };

        return RunManifestSerializer.WithDigest(manifest);
    }

    private static ToolVersions GetToolVersions() => new(
        ScannerVersion: typeof(Scanner).Assembly.GetName().Version?.ToString() ?? "unknown",
        SbomGeneratorVersion: "1.0.0",
        ReachabilityEngineVersion: "1.0.0",
        AttestorVersion: "1.0.0",
        AdditionalTools: ImmutableDictionary<string, string>.Empty);

    private EnvironmentProfile GetEnvironmentProfile() => new(
        Name: Environment.GetEnvironmentVariable("STELLAOPS_ENV_PROFILE") ?? "postgres-only",
        ValkeyEnabled: Environment.GetEnvironmentVariable("STELLAOPS_VALKEY_ENABLED") == "true",
        PostgresVersion: "16",
        ValkeyVersion: null);
}
```

**Acceptance Criteria**:

- [ ] Captures all required fields during scan
- [ ] Integrates with feed and policy version providers
- [ ] Computes digest automatically
- [ ] Environment detection for profile
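
For orientation, a manifest captured by this service might serialize roughly as follows (pretty-printed here; canonical output is compact). Every value is illustrative, not taken from a real run:

```json
{
  "runId": "3f2b8a6e-4c1d-4f0a-9b7e-2d5c8e1a6f90",
  "schemaVersion": "1.0.0",
  "artifactDigests": [
    {
      "algorithm": "sha256",
      "digest": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
      "mediaType": "application/vnd.oci.image.manifest.v1+json",
      "reference": "registry.example/app:1.4.2"
    }
  ],
  "feedSnapshot": {
    "feedId": "nvd",
    "version": "2025.12.20",
    "digest": "c775e7b757ede630cd0aa1113bd102661ab38829ca52a6422ab782862f268646",
    "snapshotAt": "2025-12-20T23:00:00+00:00"
  },
  "policySnapshot": {
    "policyVersion": "3.1.0",
    "latticeRulesDigest": "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad",
    "enabledRules": ["reachability-gate", "vex-precedence"]
  },
  "toolVersions": {
    "scannerVersion": "2.5.0",
    "sbomGeneratorVersion": "1.0.0",
    "reachabilityEngineVersion": "1.0.0",
    "attestorVersion": "1.0.0",
    "additionalTools": {}
  },
  "cryptoProfile": {
    "profileName": "default",
    "trustRootIds": ["root-2025"],
    "allowedAlgorithms": ["ed25519", "ecdsa-p256"]
  },
  "environmentProfile": {
    "name": "postgres-only",
    "valkeyEnabled": false,
    "postgresVersion": "16"
  },
  "canonicalizationVersion": "1.0.0",
  "initiatedAt": "2025-12-21T08:15:00+00:00",
  "manifestDigest": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
}
```

Note how `ValkeyVersion` is absent rather than `null`, per the `WhenWritingNull` ignore condition in T3.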

---

### T6: Unit Tests

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T5

**Description**:
Comprehensive unit tests for manifest models and utilities.

**Implementation Path**: `src/__Libraries/__Tests/StellaOps.Testing.Manifests.Tests/`

**Test Cases**:

```csharp
public class RunManifestTests
{
    [Fact]
    public void Serialize_ValidManifest_ProducesCanonicalJson()
    {
        var manifest = CreateTestManifest();
        var json1 = RunManifestSerializer.Serialize(manifest);
        var json2 = RunManifestSerializer.Serialize(manifest);
        json1.Should().Be(json2);
    }

    [Fact]
    public void ComputeDigest_SameManifest_ProducesSameDigest()
    {
        var manifest = CreateTestManifest();
        var digest1 = RunManifestSerializer.ComputeDigest(manifest);
        var digest2 = RunManifestSerializer.ComputeDigest(manifest);
        digest1.Should().Be(digest2);
    }

    [Fact]
    public void ComputeDigest_DifferentManifest_ProducesDifferentDigest()
    {
        var manifest1 = CreateTestManifest();
        var manifest2 = manifest1 with { RunId = Guid.NewGuid().ToString() };
        var digest1 = RunManifestSerializer.ComputeDigest(manifest1);
        var digest2 = RunManifestSerializer.ComputeDigest(manifest2);
        digest1.Should().NotBe(digest2);
    }

    [Fact]
    public void Validate_ValidManifest_ReturnsSuccess()
    {
        var manifest = CreateTestManifest();
        var validator = new RunManifestValidator();
        var result = validator.Validate(manifest);
        result.IsValid.Should().BeTrue();
    }

    [Fact]
    public void Validate_EmptyArtifacts_ReturnsFalse()
    {
        var manifest = CreateTestManifest() with
        {
            ArtifactDigests = []
        };
        var validator = new RunManifestValidator();
        var result = validator.Validate(manifest);
        result.IsValid.Should().BeFalse();
    }

    [Fact]
    public void RoundTrip_PreservesAllFields()
    {
        var manifest = CreateTestManifest();
        var json = RunManifestSerializer.Serialize(manifest);
        var deserialized = RunManifestSerializer.Deserialize(json);
        deserialized.Should().BeEquivalentTo(manifest);
    }
}
```

**Acceptance Criteria**:

- [ ] Serialization determinism tests
- [ ] Digest computation tests
- [ ] Validation tests (positive and negative)
- [ ] Round-trip tests
- [ ] All tests pass

---

### T7: Project Setup

**Assignee**: QA Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create the project structure and dependencies.

**Implementation Path**: `src/__Libraries/StellaOps.Testing.Manifests/StellaOps.Testing.Manifests.csproj`

**Project File**:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <LangVersion>preview</LangVersion>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="System.Text.Json" Version="9.0.0" />
    <PackageReference Include="JsonSchema.Net" Version="7.0.0" />
  </ItemGroup>

  <ItemGroup>
    <EmbeddedResource Include="Schemas\*.json" />
  </ItemGroup>
</Project>
```

**Acceptance Criteria**:

- [ ] Project compiles
- [ ] Dependencies resolved
- [ ] Schema embedded as resource

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | QA Team | Define RunManifest Domain Model |
| 2 | T2 | TODO | T1 | QA Team | JSON Schema Definition |
| 3 | T3 | TODO | T1 | QA Team | Serialization Utilities |
| 4 | T4 | TODO | T2, T3 | QA Team | Manifest Validation Service |
| 5 | T5 | TODO | T1, T3 | QA Team | Manifest Capture Service |
| 6 | T6 | TODO | T1-T5 | QA Team | Unit Tests |
| 7 | T7 | TODO | — | QA Team | Project Setup |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. Run Manifest identified as foundational artifact for deterministic replay. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Schema version strategy | Decision | QA Team | Semantic versioning with backward compatibility |
| Digest algorithm | Decision | QA Team | SHA-256 for manifest digest |
| Canonical JSON | Decision | QA Team | Stable key ordering, camelCase, no whitespace |
| PRNG seed storage | Decision | QA Team | Optional field, used when reproducibility requires randomness control |

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] RunManifest model captures all inputs for replay
- [ ] JSON schema validates manifests
- [ ] Serialization produces canonical, deterministic output
- [ ] Digest computation is stable across platforms
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds with 100% pass rate

`docs/implplan/SPRINT_5100_0001_0002_evidence_index_schema.md` (new file, 527 lines)

# Sprint 5100.0001.0002 · Evidence Index Schema

## Topic & Scope

- Define the Evidence Index schema that links verdicts to their supporting evidence chain.
- Create the machine-readable graph: verdict -> SBOM digest -> attestation IDs -> tool versions -> reachability proofs.
- Implement C# models, JSON schema, and linking utilities.
- **Working directory:** `src/__Libraries/StellaOps.Evidence/`

## Dependencies & Concurrency

- **Upstream**: None (foundational sprint)
- **Downstream**: Sprint 5100.0003.0001 (SBOM Interop) uses evidence linking
- **Safe to parallelize with**: Sprint 5100.0001.0001, 5100.0001.0003, 5100.0001.0004

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- `docs/modules/attestor/architecture.md`
- `docs/product-advisories/18-Dec-2025 - Designing Explainable Triage and Proof-Linked Evidence.md`

---

## Tasks

### T1: Define Evidence Index Domain Model

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Create the Evidence Index model that establishes the complete provenance chain for a verdict.

**Implementation Path**: `src/__Libraries/StellaOps.Evidence/Models/EvidenceIndex.cs`

**Model Definition**:

```csharp
namespace StellaOps.Evidence.Models;

/// <summary>
/// Machine-readable index linking a verdict to all supporting evidence.
/// The product is not the verdict; the product is verdict + evidence graph.
/// </summary>
public sealed record EvidenceIndex
{
    /// <summary>
    /// Unique identifier for this evidence index.
    /// </summary>
    public required string IndexId { get; init; }

    /// <summary>
    /// Schema version for forward compatibility.
    /// </summary>
    public required string SchemaVersion { get; init; } = "1.0.0";

    /// <summary>
    /// Reference to the verdict this evidence supports.
    /// </summary>
    public required VerdictReference Verdict { get; init; }

    /// <summary>
    /// SBOM references used to produce the verdict.
    /// </summary>
    public required ImmutableArray<SbomEvidence> Sboms { get; init; }

    /// <summary>
    /// Attestations in the evidence chain.
    /// </summary>
    public required ImmutableArray<AttestationEvidence> Attestations { get; init; }

    /// <summary>
    /// VEX documents applied to the verdict.
    /// </summary>
    public ImmutableArray<VexEvidence> VexDocuments { get; init; } = [];

    /// <summary>
    /// Reachability proofs for vulnerability correlation.
    /// </summary>
    public ImmutableArray<ReachabilityEvidence> ReachabilityProofs { get; init; } = [];

    /// <summary>
    /// Unknowns encountered during analysis.
    /// </summary>
    public ImmutableArray<UnknownEvidence> Unknowns { get; init; } = [];

    /// <summary>
    /// Tool versions used to produce evidence.
    /// </summary>
    public required ToolChainEvidence ToolChain { get; init; }

    /// <summary>
    /// Run manifest reference for replay capability.
    /// </summary>
    public required string RunManifestDigest { get; init; }

    /// <summary>
    /// UTC timestamp when index was created.
    /// </summary>
    public required DateTimeOffset CreatedAt { get; init; }

    /// <summary>
    /// SHA-256 digest of this index (excluding this field).
    /// </summary>
    public string? IndexDigest { get; init; }
}

public sealed record VerdictReference(
    string VerdictId,
    string Digest,
    VerdictOutcome Outcome,
    string? PolicyVersion);

public enum VerdictOutcome
{
    Pass,
    Fail,
    Warn,
    Unknown
}

public sealed record SbomEvidence(
    string SbomId,
    string Format,      // cyclonedx-1.6, spdx-3.0.1
    string Digest,
    string? Uri,
    int ComponentCount,
    DateTimeOffset GeneratedAt);

public sealed record AttestationEvidence(
    string AttestationId,
    string Type,        // sbom, vex, build-provenance, verdict
    string Digest,
    string SignerKeyId,
    bool SignatureValid,
    DateTimeOffset SignedAt,
    string? RekorLogIndex);

public sealed record VexEvidence(
    string VexId,
    string Format,      // openvex, csaf, cyclonedx
    string Digest,
    string Source,      // vendor, distro, internal
    int StatementCount,
    ImmutableArray<string> AffectedVulnerabilities);

public sealed record ReachabilityEvidence(
    string ProofId,
    string VulnerabilityId,
    string ComponentPurl,
    ReachabilityStatus Status,
    string? EntryPoint,
    ImmutableArray<string> CallPath,
    string Digest);

public enum ReachabilityStatus
{
    Reachable,
    NotReachable,
    Inconclusive,
    NotAnalyzed
}

public sealed record UnknownEvidence(
    string UnknownId,
    string ReasonCode,
    string Description,
    string? ComponentPurl,
    string? VulnerabilityId,
    UnknownSeverity Severity);

public enum UnknownSeverity
{
    Low,
    Medium,
    High,
    Critical
}

public sealed record ToolChainEvidence(
    string ScannerVersion,
    string SbomGeneratorVersion,
    string ReachabilityEngineVersion,
    string AttestorVersion,
    string PolicyEngineVersion,
    ImmutableDictionary<string, string> AdditionalTools);
```

**Acceptance Criteria**:

- [ ] `EvidenceIndex.cs` created with all fields
- [ ] Supporting records for each evidence type
- [ ] Outcome enum covers all verdict states
- [ ] ReachabilityStatus captures analysis result
- [ ] XML documentation on all types

---

### T2: JSON Schema Definition

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Create JSON Schema for Evidence Index validation.

**Implementation Path**: `src/__Libraries/StellaOps.Evidence/Schemas/evidence-index.schema.json`
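
As a starting point, an outline comparable to the run-manifest schema in Sprint 5100.0001.0001 might look like this. Field names mirror the C# model in T1; the `$id` URL is assumed to follow the same convention and is illustrative:

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "$id": "https://stellaops.io/schemas/evidence-index/v1",
  "title": "StellaOps Evidence Index",
  "description": "Links a verdict to its supporting evidence chain",
  "type": "object",
  "required": [
    "indexId", "schemaVersion", "verdict", "sboms",
    "attestations", "toolChain", "runManifestDigest", "createdAt"
  ],
  "properties": {
    "indexId": { "type": "string" },
    "schemaVersion": { "type": "string", "pattern": "^\\d+\\.\\d+\\.\\d+$" },
    "verdict": { "$ref": "#/$defs/verdictReference" },
    "sboms": {
      "type": "array",
      "items": { "$ref": "#/$defs/sbomEvidence" },
      "minItems": 1
    },
    "runManifestDigest": { "type": "string", "pattern": "^[a-f0-9]{64}$" },
    "createdAt": { "type": "string", "format": "date-time" }
  },
  "$defs": {
    "verdictReference": {
      "type": "object",
      "required": ["verdictId", "digest", "outcome"],
      "properties": {
        "outcome": { "enum": ["pass", "fail", "warn", "unknown"] }
      }
    },
    "sbomEvidence": {
      "type": "object",
      "required": ["sbomId", "format", "digest", "componentCount", "generatedAt"]
    }
  }
}
```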

**Acceptance Criteria**:

- [ ] Complete JSON Schema for all evidence types
- [ ] Schema validates sample indexes correctly
- [ ] Schema rejects malformed indexes
- [ ] Embedded as resource in assembly

---

### T3: Evidence Linker Service

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Service that builds the evidence index by collecting references during scan execution.

**Implementation Path**: `src/__Libraries/StellaOps.Evidence/Services/EvidenceLinker.cs`

**Implementation**:

```csharp
namespace StellaOps.Evidence.Services;

public sealed class EvidenceLinker : IEvidenceLinker
{
    // NOTE: plain List<T> collectors are not thread-safe; guard with a lock
    // (or use concurrent collections) to satisfy the acceptance criterion below.
    private readonly List<SbomEvidence> _sboms = [];
    private readonly List<AttestationEvidence> _attestations = [];
    private readonly List<VexEvidence> _vexDocuments = [];
    private readonly List<ReachabilityEvidence> _reachabilityProofs = [];
    private readonly List<UnknownEvidence> _unknowns = [];
    private ToolChainEvidence? _toolChain;

    public void AddSbom(SbomEvidence sbom) => _sboms.Add(sbom);
    public void AddAttestation(AttestationEvidence attestation) => _attestations.Add(attestation);
    public void AddVex(VexEvidence vex) => _vexDocuments.Add(vex);
    public void AddReachabilityProof(ReachabilityEvidence proof) => _reachabilityProofs.Add(proof);
    public void AddUnknown(UnknownEvidence unknown) => _unknowns.Add(unknown);
    public void SetToolChain(ToolChainEvidence toolChain) => _toolChain = toolChain;

    public EvidenceIndex Build(VerdictReference verdict, string runManifestDigest)
    {
        if (_toolChain == null)
            throw new InvalidOperationException("ToolChain must be set before building index");

        var index = new EvidenceIndex
        {
            IndexId = Guid.NewGuid().ToString(),
            SchemaVersion = "1.0.0",
            Verdict = verdict,
            Sboms = [.. _sboms],
            Attestations = [.. _attestations],
            VexDocuments = [.. _vexDocuments],
            ReachabilityProofs = [.. _reachabilityProofs],
            Unknowns = [.. _unknowns],
            ToolChain = _toolChain,
            RunManifestDigest = runManifestDigest,
            CreatedAt = DateTimeOffset.UtcNow
        };

        return EvidenceIndexSerializer.WithDigest(index);
    }
}

public interface IEvidenceLinker
{
    void AddSbom(SbomEvidence sbom);
    void AddAttestation(AttestationEvidence attestation);
    void AddVex(VexEvidence vex);
    void AddReachabilityProof(ReachabilityEvidence proof);
    void AddUnknown(UnknownEvidence unknown);
    void SetToolChain(ToolChainEvidence toolChain);
    EvidenceIndex Build(VerdictReference verdict, string runManifestDigest);
}
```

**Acceptance Criteria**:

- [ ] Collects all evidence types during scan
- [ ] Builds complete index with digest
- [ ] Validates required fields before build
- [ ] Thread-safe collection
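
To make the intended call pattern concrete, a usage sketch; all values are illustrative, and the digests are placeholder SHA-256 strings:

```csharp
var linker = new EvidenceLinker();

linker.AddSbom(new SbomEvidence(
    SbomId: "sbom-001",
    Format: "cyclonedx-1.6",
    Digest: "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    Uri: null,
    ComponentCount: 214,
    GeneratedAt: DateTimeOffset.UtcNow));

linker.SetToolChain(new ToolChainEvidence(
    ScannerVersion: "2.5.0",
    SbomGeneratorVersion: "1.0.0",
    ReachabilityEngineVersion: "1.0.0",
    AttestorVersion: "1.0.0",
    PolicyEngineVersion: "3.1.0",
    AdditionalTools: ImmutableDictionary<string, string>.Empty));

var verdict = new VerdictReference(
    VerdictId: "verdict-001",
    Digest: "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad",
    Outcome: VerdictOutcome.Pass,
    PolicyVersion: "3.1.0");

// Build stamps CreatedAt and computes IndexDigest via the serializer.
EvidenceIndex index = linker.Build(
    verdict,
    runManifestDigest: "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855");
```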

---

### T4: Evidence Validator
|
||||||
|
|
||||||
|
**Assignee**: QA Team
|
||||||
|
**Story Points**: 3
|
||||||
|
**Status**: TODO
|
||||||
|
**Dependencies**: T1, T2
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Validate evidence indexes for completeness and correctness.
|
||||||
|
|
||||||
|
**Implementation Path**: `src/__Libraries/StellaOps.Evidence/Validation/EvidenceIndexValidator.cs`
|
**Validation Rules**:

```csharp
public sealed class EvidenceIndexValidator : IEvidenceIndexValidator
{
    public ValidationResult Validate(EvidenceIndex index)
    {
        var errors = new List<ValidationError>();

        // Required evidence checks
        if (index.Sboms.Length == 0)
            errors.Add(new ValidationError("Sboms", "At least one SBOM required"));

        // Verdict-SBOM linkage:
        // every "not affected" claim must have evidence hooks per policy
        foreach (var vex in index.VexDocuments)
        {
            if (vex.StatementCount == 0)
                errors.Add(new ValidationError("VexDocuments",
                    $"VEX {vex.VexId} has no statements"));
        }

        // Inconclusive reachability must be recorded as an unknown
        foreach (var proof in index.ReachabilityProofs)
        {
            if (proof.Status == ReachabilityStatus.Inconclusive &&
                !index.Unknowns.Any(u => u.VulnerabilityId == proof.VulnerabilityId))
            {
                errors.Add(new ValidationError("ReachabilityProofs",
                    $"Inconclusive reachability for {proof.VulnerabilityId} not recorded as unknown"));
            }
        }

        // Attestation signature validity
        foreach (var att in index.Attestations)
        {
            if (!att.SignatureValid)
                errors.Add(new ValidationError("Attestations",
                    $"Attestation {att.AttestationId} has invalid signature"));
        }

        // Digest verification
        if (index.IndexDigest != null)
        {
            var computed = EvidenceIndexSerializer.ComputeDigest(index);
            if (computed != index.IndexDigest)
                errors.Add(new ValidationError("IndexDigest", "Digest mismatch"));
        }

        return new ValidationResult(errors.Count == 0, errors);
    }
}
```

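The cross-check rules above are language-agnostic; a minimal Python sketch of the same invariants (the dict shapes and key names are illustrative assumptions, not the serialized schema):

```python
# Illustrative sketch of the evidence-index cross-checks above; an index is a
# plain dict here, not the C# model.
def validate_index(index: dict) -> list[str]:
    errors = []

    # At least one SBOM must be present.
    if not index.get("sboms"):
        errors.append("Sboms: at least one SBOM required")

    # Every VEX document must carry at least one statement.
    for vex in index.get("vexDocuments", []):
        if vex.get("statementCount", 0) == 0:
            errors.append(f"VexDocuments: VEX {vex['vexId']} has no statements")

    # An inconclusive reachability proof must be mirrored in the unknowns list.
    unknown_ids = {u["vulnerabilityId"] for u in index.get("unknowns", [])}
    for proof in index.get("reachabilityProofs", []):
        if proof["status"] == "inconclusive" and proof["vulnerabilityId"] not in unknown_ids:
            errors.append(
                f"ReachabilityProofs: inconclusive {proof['vulnerabilityId']} not recorded as unknown")

    return errors

index = {
    "sboms": [{"digest": "sha256:abc"}],
    "vexDocuments": [],
    "reachabilityProofs": [{"vulnerabilityId": "CVE-2021-44228", "status": "inconclusive"}],
    "unknowns": [],
}
print(validate_index(index))
# → ['ReachabilityProofs: inconclusive CVE-2021-44228 not recorded as unknown']
```

The same pattern extends to the attestation-signature and digest checks once those fields are available.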
**Acceptance Criteria**:

- [ ] Validates required evidence presence
- [ ] Checks SBOM linkage
- [ ] Validates attestation signatures
- [ ] Verifies digest integrity
- [ ] Reports all errors with context

---

### T5: Evidence Query Service

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1, T3

**Description**:
Query service for navigating evidence chains.

**Implementation Path**: `src/__Libraries/StellaOps.Evidence/Services/EvidenceQueryService.cs`

**Implementation**:

```csharp
namespace StellaOps.Evidence.Services;

public sealed class EvidenceQueryService : IEvidenceQueryService
{
    public IEnumerable<AttestationEvidence> GetAttestationsForSbom(
        EvidenceIndex index, string sbomDigest)
    {
        // SBOM attestations are only meaningful when the digest matches a
        // known SBOM in the index; otherwise return an empty sequence.
        if (!index.Sboms.Any(s => s.Digest == sbomDigest))
            return Enumerable.Empty<AttestationEvidence>();

        return index.Attestations.Where(a => a.Type == "sbom");
    }

    public IEnumerable<ReachabilityEvidence> GetReachabilityForVulnerability(
        EvidenceIndex index, string vulnerabilityId)
    {
        return index.ReachabilityProofs
            .Where(r => r.VulnerabilityId == vulnerabilityId);
    }

    public IEnumerable<VexEvidence> GetVexForVulnerability(
        EvidenceIndex index, string vulnerabilityId)
    {
        return index.VexDocuments
            .Where(v => v.AffectedVulnerabilities.Contains(vulnerabilityId));
    }

    public EvidenceChainReport BuildChainReport(EvidenceIndex index)
    {
        return new EvidenceChainReport
        {
            VerdictDigest = index.Verdict.Digest,
            SbomCount = index.Sboms.Length,
            AttestationCount = index.Attestations.Length,
            VexCount = index.VexDocuments.Length,
            ReachabilityProofCount = index.ReachabilityProofs.Length,
            UnknownCount = index.Unknowns.Length,
            AllSignaturesValid = index.Attestations.All(a => a.SignatureValid),
            HasRekorEntries = index.Attestations.Any(a => a.RekorLogIndex != null),
            ToolChainComplete = index.ToolChain != null
        };
    }
}

public sealed record EvidenceChainReport
{
    public required string VerdictDigest { get; init; }
    public int SbomCount { get; init; }
    public int AttestationCount { get; init; }
    public int VexCount { get; init; }
    public int ReachabilityProofCount { get; init; }
    public int UnknownCount { get; init; }
    public bool AllSignaturesValid { get; init; }
    public bool HasRekorEntries { get; init; }
    public bool ToolChainComplete { get; init; }
}
```

**Acceptance Criteria**:

- [ ] Query attestations by SBOM
- [ ] Query reachability by vulnerability
- [ ] Query VEX by vulnerability
- [ ] Build summary chain report

---

### T6: Unit Tests

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T5

**Description**:
Comprehensive tests for evidence index functionality.

**Implementation Path**: `src/__Libraries/__Tests/StellaOps.Evidence.Tests/`

**Acceptance Criteria**:

- [ ] EvidenceLinker build tests
- [ ] Validation tests (positive and negative)
- [ ] Query service tests
- [ ] Serialization round-trip tests
- [ ] Digest computation tests

---

### T7: Project Setup

**Assignee**: QA Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create the project structure and dependencies.

**Implementation Path**: `src/__Libraries/StellaOps.Evidence/StellaOps.Evidence.csproj`

**Acceptance Criteria**:

- [ ] Project compiles
- [ ] Dependencies resolved
- [ ] Schema embedded as resource

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | QA Team | Define Evidence Index Domain Model |
| 2 | T2 | TODO | T1 | QA Team | JSON Schema Definition |
| 3 | T3 | TODO | T1 | QA Team | Evidence Linker Service |
| 4 | T4 | TODO | T1, T2 | QA Team | Evidence Validator |
| 5 | T5 | TODO | T1, T3 | QA Team | Evidence Query Service |
| 6 | T6 | TODO | T1-T5 | QA Team | Unit Tests |
| 7 | T7 | TODO | — | QA Team | Project Setup |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. Evidence Index identified as key artifact for proof-linked UX. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Evidence chain depth | Decision | QA Team | Link to immediate evidence only; transitive links via query |
| Unknown tracking | Decision | QA Team | All unknowns recorded in evidence for audit |
| Rekor integration | Decision | QA Team | Optional; RekorLogIndex null when offline |

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] Evidence Index links verdict to all evidence
- [ ] Validation catches incomplete chains
- [ ] Query service enables chain navigation
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds

530 docs/implplan/SPRINT_5100_0001_0003_offline_bundle_manifest.md (new file)
@@ -0,0 +1,530 @@
# Sprint 5100.0001.0003 · Offline Bundle Manifest

## Topic & Scope

- Define the Offline Bundle Manifest schema for air-gapped operation.
- Capture all components required for offline scanning: feeds, policies, keys, certificates, Rekor mirror snapshots.
- Implement bundle validation, integrity checking, and content-addressed storage.
- **Working directory:** `src/AirGap/__Libraries/StellaOps.AirGap.Bundle/`

## Dependencies & Concurrency

- **Upstream**: None (foundational sprint)
- **Downstream**: Sprint 5100.0003.0002 (No-Egress Enforcement) uses bundle validation
- **Safe to parallelize with**: Sprints 5100.0001.0001, 5100.0001.0002, 5100.0001.0004

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- `docs/24_OFFLINE_KIT.md`
- `docs/modules/airgap/architecture.md`

---

## Tasks

### T1: Define Bundle Manifest Model

**Assignee**: AirGap Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Create the Offline Bundle Manifest model that inventories all bundle contents with digests.

**Implementation Path**: `src/AirGap/__Libraries/StellaOps.AirGap.Bundle/Models/BundleManifest.cs`

**Model Definition**:

```csharp
using System.Collections.Immutable;

namespace StellaOps.AirGap.Bundle.Models;

/// <summary>
/// Manifest for an offline bundle, inventorying all components with content digests.
/// Used for integrity verification and completeness checking in air-gapped environments.
/// </summary>
public sealed record BundleManifest
{
    /// <summary>
    /// Unique identifier for this bundle.
    /// </summary>
    public required string BundleId { get; init; }

    /// <summary>
    /// Schema version for forward compatibility.
    /// </summary>
    public string SchemaVersion { get; init; } = "1.0.0";

    /// <summary>
    /// Human-readable bundle name.
    /// </summary>
    public required string Name { get; init; }

    /// <summary>
    /// Bundle version.
    /// </summary>
    public required string Version { get; init; }

    /// <summary>
    /// UTC timestamp when bundle was created.
    /// </summary>
    public required DateTimeOffset CreatedAt { get; init; }

    /// <summary>
    /// Bundle expiry (feeds/policies may become stale).
    /// </summary>
    public DateTimeOffset? ExpiresAt { get; init; }

    /// <summary>
    /// Vulnerability feed components.
    /// </summary>
    public required ImmutableArray<FeedComponent> Feeds { get; init; }

    /// <summary>
    /// Policy and lattice rule components.
    /// </summary>
    public required ImmutableArray<PolicyComponent> Policies { get; init; }

    /// <summary>
    /// Trust roots and certificates.
    /// </summary>
    public required ImmutableArray<CryptoComponent> CryptoMaterials { get; init; }

    /// <summary>
    /// Package catalogs for ecosystem matching.
    /// </summary>
    public ImmutableArray<CatalogComponent> Catalogs { get; init; } = [];

    /// <summary>
    /// Rekor mirror snapshot for offline transparency.
    /// </summary>
    public RekorSnapshot? RekorSnapshot { get; init; }

    /// <summary>
    /// Crypto provider modules for sovereign crypto.
    /// </summary>
    public ImmutableArray<CryptoProviderComponent> CryptoProviders { get; init; } = [];

    /// <summary>
    /// Total size in bytes.
    /// </summary>
    public long TotalSizeBytes { get; init; }

    /// <summary>
    /// SHA-256 digest of the entire bundle (excluding this field).
    /// </summary>
    public string? BundleDigest { get; init; }
}

public sealed record FeedComponent(
    string FeedId,
    string Name,
    string Version,
    string RelativePath,
    string Digest,
    long SizeBytes,
    DateTimeOffset SnapshotAt,
    FeedFormat Format);

public enum FeedFormat
{
    StellaOpsNative,
    TrivyDb,
    GrypeDb,
    OsvJson
}

public sealed record PolicyComponent(
    string PolicyId,
    string Name,
    string Version,
    string RelativePath,
    string Digest,
    long SizeBytes,
    PolicyType Type);

public enum PolicyType
{
    OpaRego,
    LatticeRules,
    UnknownBudgets,
    ScoringWeights
}

public sealed record CryptoComponent(
    string ComponentId,
    string Name,
    string RelativePath,
    string Digest,
    long SizeBytes,
    CryptoComponentType Type,
    DateTimeOffset? ExpiresAt);

public enum CryptoComponentType
{
    TrustRoot,
    IntermediateCa,
    TimestampRoot,
    SigningKey,
    FulcioRoot
}

public sealed record CatalogComponent(
    string CatalogId,
    string Ecosystem,       // npm, pypi, maven, nuget
    string Version,
    string RelativePath,
    string Digest,
    long SizeBytes,
    DateTimeOffset SnapshotAt);

public sealed record RekorSnapshot(
    string TreeId,
    long TreeSize,
    string RootHash,
    string RelativePath,
    string Digest,
    DateTimeOffset SnapshotAt);

public sealed record CryptoProviderComponent(
    string ProviderId,
    string Name,            // CryptoPro, OpenSSL-GOST, SM-Crypto
    string Version,
    string RelativePath,
    string Digest,
    long SizeBytes,
    ImmutableArray<string> SupportedAlgorithms);
```

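For orientation, a minimal serialized manifest might look as follows. This is an illustrative sketch: camelCase property names, the example values, and the trimmed-empty component arrays are assumptions, not the canonical serializer output (a real bundle must carry at least one trust root).

```json
{
  "bundleId": "b7e4c2d0-5a1f-4e9b-9c3d-0f6a8e2d4c11",
  "schemaVersion": "1.0.0",
  "name": "offline-2025-Q1",
  "version": "1.0.0",
  "createdAt": "2025-12-21T00:00:00+00:00",
  "expiresAt": "2026-01-04T00:00:00+00:00",
  "feeds": [
    {
      "feedId": "nvd-snapshot",
      "name": "NVD mirror",
      "version": "2025-12-20",
      "relativePath": "feeds/nvd.json.zst",
      "digest": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
      "sizeBytes": 104857600,
      "snapshotAt": "2025-12-20T12:00:00+00:00",
      "format": "StellaOpsNative"
    }
  ],
  "policies": [],
  "cryptoMaterials": [],
  "totalSizeBytes": 104857600,
  "bundleDigest": null
}
```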
**Acceptance Criteria**:

- [ ] `BundleManifest.cs` with all component types
- [ ] Feed, Policy, Crypto, Catalog components defined
- [ ] RekorSnapshot for offline transparency
- [ ] CryptoProvider for sovereign crypto support
- [ ] All fields documented

---

### T2: Bundle Validator

**Assignee**: AirGap Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Validate bundle manifest and verify content integrity.

**Implementation Path**: `src/AirGap/__Libraries/StellaOps.AirGap.Bundle/Validation/BundleValidator.cs`

**Implementation**:

```csharp
using System.Security.Cryptography;
using System.Text;

namespace StellaOps.AirGap.Bundle.Validation;

public sealed class BundleValidator : IBundleValidator
{
    public async Task<BundleValidationResult> ValidateAsync(
        BundleManifest manifest,
        string bundlePath,
        CancellationToken ct = default)
    {
        var errors = new List<BundleValidationError>();
        var warnings = new List<BundleValidationWarning>();

        // Check required components
        if (manifest.Feeds.Length == 0)
            errors.Add(new BundleValidationError("Feeds", "At least one feed required"));

        if (manifest.CryptoMaterials.Length == 0)
            errors.Add(new BundleValidationError("CryptoMaterials", "Trust roots required"));

        // Verify all file digests
        foreach (var feed in manifest.Feeds)
        {
            var filePath = Path.Combine(bundlePath, feed.RelativePath);
            var result = await VerifyFileDigestAsync(filePath, feed.Digest, ct);
            if (!result.IsValid)
                errors.Add(new BundleValidationError("Feeds",
                    $"Feed {feed.FeedId} digest mismatch: expected {feed.Digest}, got {result.ActualDigest}"));
        }

        // Check expiry
        if (manifest.ExpiresAt.HasValue && manifest.ExpiresAt.Value < DateTimeOffset.UtcNow)
            warnings.Add(new BundleValidationWarning("ExpiresAt", "Bundle has expired"));

        // Check feed freshness
        foreach (var feed in manifest.Feeds)
        {
            var age = DateTimeOffset.UtcNow - feed.SnapshotAt;
            if (age.TotalDays > 7)
                warnings.Add(new BundleValidationWarning("Feeds",
                    $"Feed {feed.FeedId} is {age.TotalDays:F0} days old"));
        }

        // Verify bundle digest
        if (manifest.BundleDigest != null)
        {
            var computed = ComputeBundleDigest(manifest);
            if (computed != manifest.BundleDigest)
                errors.Add(new BundleValidationError("BundleDigest", "Bundle digest mismatch"));
        }

        return new BundleValidationResult(
            errors.Count == 0,
            errors,
            warnings,
            manifest.TotalSizeBytes);
    }

    private static async Task<(bool IsValid, string ActualDigest)> VerifyFileDigestAsync(
        string filePath, string expectedDigest, CancellationToken ct)
    {
        if (!File.Exists(filePath))
            return (false, "FILE_NOT_FOUND");

        await using var stream = File.OpenRead(filePath);
        var hash = await SHA256.HashDataAsync(stream, ct);
        var actualDigest = Convert.ToHexString(hash).ToLowerInvariant();
        return (actualDigest == expectedDigest.ToLowerInvariant(), actualDigest);
    }

    private static string ComputeBundleDigest(BundleManifest manifest)
    {
        // Hash the manifest serialized with its own digest field cleared,
        // so the digest can be stored inside the manifest it covers.
        var withoutDigest = manifest with { BundleDigest = null };
        var json = BundleManifestSerializer.Serialize(withoutDigest);
        return Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(json))).ToLowerInvariant();
    }
}

public sealed record BundleValidationResult(
    bool IsValid,
    IReadOnlyList<BundleValidationError> Errors,
    IReadOnlyList<BundleValidationWarning> Warnings,
    long TotalSizeBytes);

public sealed record BundleValidationError(string Component, string Message);
public sealed record BundleValidationWarning(string Component, string Message);
```
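The self-excluding digest scheme above (hash the manifest with its own `BundleDigest` cleared, then store the result in that field) can be sketched compactly; the Python below is illustrative, with serialization details standing in for `BundleManifestSerializer`:

```python
import hashlib
import json

def compute_bundle_digest(manifest: dict) -> str:
    # Hash canonical JSON of the manifest with its own digest field removed,
    # mirroring `manifest with { BundleDigest = null }` in the validator above.
    stripped = {k: v for k, v in manifest.items() if k != "bundleDigest"}
    canonical = json.dumps(stripped, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

def verify(manifest: dict) -> bool:
    # A manifest verifies when its stored digest matches the recomputation.
    return manifest.get("bundleDigest") == compute_bundle_digest(manifest)

m = {"bundleId": "demo", "feeds": []}
m["bundleDigest"] = compute_bundle_digest(m)
print(verify(m))                      # True
m["feeds"].append({"feedId": "osv"})  # tamper with the manifest
print(verify(m))                      # False
```

Any change to the manifest after sealing, however small, invalidates the stored digest, which is what makes the field usable as an integrity anchor for the whole bundle.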

**Acceptance Criteria**:

- [ ] Validates required components present
- [ ] Verifies all file digests
- [ ] Checks expiry and freshness
- [ ] Reports errors and warnings separately
- [ ] Async file operations

---

### T3: Bundle Builder

**Assignee**: AirGap Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Service to build offline bundles from online sources.

**Implementation Path**: `src/AirGap/__Libraries/StellaOps.AirGap.Bundle/Services/BundleBuilder.cs`

**Implementation**:

```csharp
namespace StellaOps.AirGap.Bundle.Services;

public sealed class BundleBuilder : IBundleBuilder
{
    public async Task<BundleManifest> BuildAsync(
        BundleBuildRequest request,
        string outputPath,
        CancellationToken ct = default)
    {
        var feeds = new List<FeedComponent>();
        var policies = new List<PolicyComponent>();
        var cryptoMaterials = new List<CryptoComponent>();

        // Download and hash feeds
        foreach (var feedConfig in request.Feeds)
        {
            var component = await DownloadFeedAsync(feedConfig, outputPath, ct);
            feeds.Add(component);
        }

        // Export policies
        foreach (var policyConfig in request.Policies)
        {
            var component = await ExportPolicyAsync(policyConfig, outputPath, ct);
            policies.Add(component);
        }

        // Export crypto materials
        foreach (var cryptoConfig in request.CryptoMaterials)
        {
            var component = await ExportCryptoAsync(cryptoConfig, outputPath, ct);
            cryptoMaterials.Add(component);
        }

        var totalSize = feeds.Sum(f => f.SizeBytes) +
                        policies.Sum(p => p.SizeBytes) +
                        cryptoMaterials.Sum(c => c.SizeBytes);

        var manifest = new BundleManifest
        {
            BundleId = Guid.NewGuid().ToString(),
            SchemaVersion = "1.0.0",
            Name = request.Name,
            Version = request.Version,
            CreatedAt = DateTimeOffset.UtcNow,
            ExpiresAt = request.ExpiresAt,
            Feeds = [.. feeds],
            Policies = [.. policies],
            CryptoMaterials = [.. cryptoMaterials],
            TotalSizeBytes = totalSize
        };

        // Seal the manifest with its own digest before returning.
        return BundleManifestSerializer.WithDigest(manifest);
    }

    // DownloadFeedAsync, ExportPolicyAsync, and ExportCryptoAsync are elided
    // here; each stages an artifact under outputPath and returns a component
    // record with relative path, SHA-256 digest, and size.
}

public sealed record BundleBuildRequest(
    string Name,
    string Version,
    DateTimeOffset? ExpiresAt,
    IReadOnlyList<FeedBuildConfig> Feeds,
    IReadOnlyList<PolicyBuildConfig> Policies,
    IReadOnlyList<CryptoBuildConfig> CryptoMaterials);
```

**Acceptance Criteria**:

- [ ] Downloads feeds with integrity verification
- [ ] Exports policies and lattice rules
- [ ] Includes crypto materials
- [ ] Computes total size and digest
- [ ] Progress reporting

---

### T4: Bundle Loader

**Assignee**: AirGap Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Load and mount a validated bundle for offline scanning.

**Implementation Path**: `src/AirGap/__Libraries/StellaOps.AirGap.Bundle/Services/BundleLoader.cs`

**Acceptance Criteria**:

- [ ] Validates bundle before loading
- [ ] Registers feeds with scanner
- [ ] Loads policies into policy engine
- [ ] Configures crypto providers
- [ ] Fails explicitly on validation errors

---

### T5: CLI Integration

**Assignee**: AirGap Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T3, T4

**Description**:
Add CLI commands for bundle management.

**Commands**:

```bash
stella bundle create --name "offline-2025-Q1" --output bundle.tar.gz
stella bundle validate bundle.tar.gz
stella bundle info bundle.tar.gz
stella bundle load bundle.tar.gz
```

**Acceptance Criteria**:

- [ ] `bundle create` command
- [ ] `bundle validate` command
- [ ] `bundle info` command
- [ ] `bundle load` command
- [ ] JSON output option

---

### T6: Unit and Integration Tests

**Assignee**: AirGap Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T5

**Description**:
Comprehensive tests for bundle functionality.

**Acceptance Criteria**:

- [ ] Manifest serialization tests
- [ ] Validation tests with fixtures
- [ ] Digest verification tests
- [ ] Builder integration tests
- [ ] Loader integration tests

---

### T7: Project Setup

**Assignee**: AirGap Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create the project structure.

**Implementation Path**: `src/AirGap/__Libraries/StellaOps.AirGap.Bundle/StellaOps.AirGap.Bundle.csproj`

**Acceptance Criteria**:

- [ ] Project compiles
- [ ] Dependencies resolved

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | AirGap Team | Define Bundle Manifest Model |
| 2 | T2 | TODO | T1 | AirGap Team | Bundle Validator |
| 3 | T3 | TODO | T1 | AirGap Team | Bundle Builder |
| 4 | T4 | TODO | T1, T2 | AirGap Team | Bundle Loader |
| 5 | T5 | TODO | T3, T4 | AirGap Team | CLI Integration |
| 6 | T6 | TODO | T1-T5 | AirGap Team | Unit and Integration Tests |
| 7 | T7 | TODO | — | AirGap Team | Project Setup |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. Offline bundle manifest is critical for air-gap compliance testing. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Bundle format | Decision | AirGap Team | tar.gz with manifest.json at root |
| Expiry enforcement | Decision | AirGap Team | Warn on expired, block configurable |
| Freshness threshold | Decision | AirGap Team | 7 days default, configurable |

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] Bundle manifest captures all offline components
- [ ] Validation verifies integrity
- [ ] CLI commands functional
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds

444 docs/implplan/SPRINT_5100_0001_0004_golden_corpus_expansion.md (new file)
@@ -0,0 +1,444 @@
# Sprint 5100.0001.0004 · Golden Corpus Expansion

## Topic & Scope

- Expand the golden test corpus with comprehensive test cases covering all testing scenarios.
- Add negative fixtures, multi-distro coverage, large SBOM cases, and interop fixtures.
- Create corpus versioning and management utilities.
- **Working directory:** `bench/golden-corpus/` and `tests/fixtures/`

## Dependencies & Concurrency

- **Upstream**: Sprints 5100.0001.0001, 5100.0001.0002, 5100.0001.0003 (schemas for manifest format)
- **Downstream**: All E2E test sprints use corpus fixtures
- **Safe to parallelize with**: Phase 1 sprints (can use existing corpus during expansion)

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- `docs/implplan/SPRINT_3500_0004_0003_integration_tests_corpus.md` (existing corpus)
- `bench/golden-corpus/README.md`

---

## Tasks

### T1: Corpus Structure Redesign

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: —

**Description**:
Redesign the corpus structure for comprehensive test coverage and easy navigation.

**Implementation Path**: `bench/golden-corpus/`

**Proposed Structure**:

```
bench/golden-corpus/
├── corpus-manifest.json         # Master index with all cases
├── corpus-version.json          # Versioning metadata
├── README.md                    # Documentation
├── categories/
│   ├── severity/                # CVE severity level cases
│   │   ├── critical/
│   │   ├── high/
│   │   ├── medium/
│   │   └── low/
│   ├── vex/                     # VEX scenario cases
│   │   ├── not-affected/
│   │   ├── affected/
│   │   ├── under-investigation/
│   │   └── conflicting/
│   ├── reachability/            # Reachability analysis cases
│   │   ├── reachable/
│   │   ├── not-reachable/
│   │   └── inconclusive/
│   ├── unknowns/                # Unknowns scenarios
│   │   ├── pkg-source-unknown/
│   │   ├── cpe-ambiguous/
│   │   ├── version-unparseable/
│   │   └── mixed-unknowns/
│   ├── scale/                   # Large SBOM cases
│   │   ├── small-200/
│   │   ├── medium-2k/
│   │   ├── large-20k/
│   │   └── xlarge-50k/
│   ├── distro/                  # Multi-distro cases
│   │   ├── alpine/
│   │   ├── debian/
│   │   ├── rhel/
│   │   ├── suse/
│   │   └── ubuntu/
│   ├── interop/                 # Interop test cases
│   │   ├── syft-generated/
│   │   ├── trivy-generated/
│   │   └── grype-consumed/
│   └── negative/                # Negative/error cases
│       ├── malformed-spdx/
│       ├── corrupted-dsse/
│       ├── missing-digests/
│       └── unsupported-distro/
└── shared/
    ├── policies/                # Shared policy fixtures
    ├── feeds/                   # Feed snapshots
    └── keys/                    # Test signing keys
```

**Each Case Structure**:

```
case-name/
├── case-manifest.json           # Case metadata
├── input/
│   ├── image.tar.gz             # Container image (or reference)
│   ├── sbom-cyclonedx.json      # SBOM (CycloneDX format)
│   └── sbom-spdx.json           # SBOM (SPDX format)
├── expected/
│   ├── verdict.json             # Expected verdict
│   ├── evidence-index.json      # Expected evidence
│   ├── unknowns.json            # Expected unknowns
│   └── delta-verdict.json       # Expected delta (if applicable)
└── run-manifest.json            # Run manifest for replay
```

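A case directory can be checked against this layout mechanically; a minimal Python sketch (the required-artifact subset and the helper itself are illustrative assumptions, not part of the corpus tooling):

```python
import tempfile
from pathlib import Path

# A subset of the per-case layout above; delta-verdict.json is excluded
# because it only applies to delta cases.
REQUIRED = [
    "case-manifest.json",
    "input/sbom-cyclonedx.json",
    "input/sbom-spdx.json",
    "expected/verdict.json",
    "run-manifest.json",
]

def missing_artifacts(case_dir: Path) -> list[str]:
    # Return the required artifacts absent from a case directory.
    return [rel for rel in REQUIRED if not (case_dir / rel).exists()]

# Build a skeleton case in a temp dir and report what is still missing.
root = Path(tempfile.mkdtemp()) / "SEV-001"
(root / "input").mkdir(parents=True)
(root / "expected").mkdir()
(root / "case-manifest.json").write_text("{}")
(root / "input/sbom-cyclonedx.json").write_text("{}")
print(missing_artifacts(root))
# → ['input/sbom-spdx.json', 'expected/verdict.json', 'run-manifest.json']
```

Running such a check in CI over every case directory keeps the corpus honest as categories grow.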
**Acceptance Criteria**:
|
||||||
|
- [ ] Directory structure created
|
||||||
|
- [ ] All category directories exist
|
||||||
|
- [ ] Template case structure documented
|
||||||
|
- [ ] Existing cases migrated to new structure
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### T2: Severity Level Cases
|
||||||
|
|
||||||
|
**Assignee**: QA Team
|
||||||
|
**Story Points**: 5
|
||||||
|
**Status**: TODO
|
||||||
|
**Dependencies**: T1
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Create comprehensive test cases for each CVE severity level.
|
||||||
|
|
||||||
|
**Cases to Create**:
|
||||||
|
|
||||||
|
| Case ID | Severity | Description |
|
||||||
|
|---------|----------|-------------|
|
||||||
|
| SEV-001 | Critical | Log4Shell (CVE-2021-44228) in Java app |
|
||||||
|
| SEV-002 | Critical | Spring4Shell (CVE-2022-22965) in Spring Boot |
|
||||||
|
| SEV-003 | High | OpenSSL CVE-2022-3602 in Alpine |
|
||||||
|
| SEV-004 | High | Multiple high CVEs in npm packages |
|
||||||
|
| SEV-005 | Medium | Medium-severity in Python dependencies |
|
||||||
|
| SEV-006 | Medium | Medium with VEX mitigation |
|
||||||
|
| SEV-007 | Low | Low-severity informational |
|
||||||
|
| SEV-008 | Low | Low with compensating control |
|
||||||
|
|
||||||
|
**Each Case Includes**:
|
||||||
|
- Minimal container image with vulnerable package
|
||||||
|
- SBOM in both CycloneDX and SPDX formats
|
||||||
|
- Expected verdict with scoring breakdown
|
||||||
|
- Run manifest for replay
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] 8 severity cases created
|
||||||
|
- [ ] Each case has all required artifacts
|
||||||
|
- [ ] Cases validate against schemas
|
||||||
|
- [ ] Cases pass determinism tests
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### T3: VEX Scenario Cases

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Create test cases for VEX document handling and precedence.

**Cases to Create**:

| Case ID | Scenario | Description |
|---------|----------|-------------|
| VEX-001 | Not Affected | Vendor VEX marks CVE not affected |
| VEX-002 | Not Affected | Feature flag disables vulnerable code |
| VEX-003 | Affected | VEX confirms affected with fix available |
| VEX-004 | Under Investigation | Status pending vendor analysis |
| VEX-005 | Conflicting | Vendor vs distro VEX conflict |
| VEX-006 | Conflicting | Multiple vendor VEX with different status |
| VEX-007 | Precedence | Vendor > distro > internal precedence test |
| VEX-008 | Expiry | VEX with expiration date |

**Acceptance Criteria**:

- [ ] 8 VEX cases created
- [ ] VEX documents in OpenVEX, CSAF, CycloneDX formats
- [ ] Precedence rules exercised
- [ ] Expected evidence includes VEX references

---
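
The precedence behaviour exercised by VEX-007 can be sketched as a small resolver. The source ranking and the statement shape below are illustrative assumptions for this sprint, not the product's actual data model:

```python
# Illustrative sketch of vendor > distro > internal VEX precedence (assumed ranking).
VEX_SOURCE_RANK = {"vendor": 0, "distro": 1, "internal": 2}  # lower rank wins

def resolve_vex_status(statements):
    """Pick the status from the highest-precedence source.

    `statements` is a list of dicts like {"source": "vendor", "status": "not_affected"}.
    Unknown sources rank last; ties resolve to the first statement seen,
    so a deterministic input order yields a deterministic result.
    """
    if not statements:
        return None
    best = min(statements, key=lambda s: VEX_SOURCE_RANK.get(s["source"], 99))
    return best["status"]

statements = [
    {"source": "internal", "status": "affected"},
    {"source": "distro", "status": "under_investigation"},
    {"source": "vendor", "status": "not_affected"},
]
print(resolve_vex_status(statements))  # not_affected (vendor statement wins)
```

The VEX-005/VEX-006 conflict cases would then pin down what happens when two statements share the winning rank.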
### T4: Reachability Cases

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Create test cases for reachability analysis outcomes.

**Cases to Create**:

| Case ID | Status | Description |
|---------|--------|-------------|
| REACH-001 | Reachable | Direct call to vulnerable function |
| REACH-002 | Reachable | Transitive call path (3 hops) |
| REACH-003 | Not Reachable | Vulnerable code never invoked |
| REACH-004 | Not Reachable | Dead code path |
| REACH-005 | Inconclusive | Dynamic dispatch prevents analysis |
| REACH-006 | Inconclusive | Reflection-based invocation |
| REACH-007 | Binary | Binary-level reachability (Go) |
| REACH-008 | Binary | Binary-level reachability (Rust) |

**Each Case Includes**:

- Source code demonstrating call path
- Call graph in expected output
- Reachability evidence with paths

**Acceptance Criteria**:

- [ ] 8 reachability cases created
- [ ] Call paths documented
- [ ] Evidence includes entry points
- [ ] Both source and binary cases

---

### T5: Unknowns Cases

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Create test cases for unknowns detection and budgeting.

**Cases to Create**:

| Case ID | Unknown Type | Description |
|---------|--------------|-------------|
| UNK-001 | PKG_SOURCE_UNKNOWN | Package with no identifiable source |
| UNK-002 | CPE_AMBIG | Multiple CPE candidates |
| UNK-003 | VERSION_UNPARSEABLE | Non-standard version string |
| UNK-004 | DISTRO_UNRECOGNIZED | Unknown Linux distribution |
| UNK-005 | REACHABILITY_INCONCLUSIVE | Analysis cannot determine |
| UNK-006 | Mixed | Multiple unknown types combined |

**Acceptance Criteria**:

- [ ] 6 unknowns cases created
- [ ] Each unknown type represented
- [ ] Expected unknowns list in evidence
- [ ] Budget violation case included

---

### T6: Scale Cases

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Create large SBOM cases for performance testing.

**Cases to Create**:

| Case ID | Size | Components | Description |
|---------|------|------------|-------------|
| SCALE-001 | Small | 200 | Minimal Node.js app |
| SCALE-002 | Medium | 2,000 | Enterprise Java app |
| SCALE-003 | Large | 20,000 | Monorepo with many deps |
| SCALE-004 | XLarge | 50,000 | Worst-case container |

**Each Case Includes**:

- Synthetic SBOM with realistic structure
- Expected performance metrics
- Memory usage baselines
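
Synthetic SBOMs for these cases should be generated deterministically so the scale fixtures can double as determinism tests. The sketch below is a minimal, seeded generator for a CycloneDX-like component list; field names beyond `name`/`version`/`purl` and the generation scheme are assumptions, not the agreed fixture format:

```python
import hashlib

def synth_sbom(component_count, seed="scale-case"):
    """Generate a deterministic, synthetic CycloneDX-like component list.

    The same (count, seed) pair always yields byte-identical output, so the
    fixture never needs to be stored -- only its parameters do.
    """
    components = []
    for i in range(component_count):
        # Derive a stable pseudo-version from the seed so output never varies.
        h = hashlib.sha256(f"{seed}:{i}".encode()).hexdigest()
        version = f"1.{int(h[:4], 16) % 100}.{int(h[4:8], 16) % 100}"
        components.append({
            "name": f"pkg-{i:05d}",
            "version": version,
            "purl": f"pkg:npm/pkg-{i:05d}@{version}",
        })
    return {"bomFormat": "CycloneDX", "specVersion": "1.5", "components": components}

sbom = synth_sbom(200)
print(len(sbom["components"]))  # 200
```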

**Acceptance Criteria**:

- [ ] 4 scale cases created
- [ ] SBOMs pass schema validation
- [ ] Performance baselines documented
- [ ] Determinism verified at scale

---

### T7: Distro Cases

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Create multi-distro test cases for OS package matching.

**Cases to Create**:

| Case ID | Distro | Description |
|---------|--------|-------------|
| DISTRO-001 | Alpine 3.18 | musl-based, apk packages |
| DISTRO-002 | Debian 12 | dpkg-based, apt packages |
| DISTRO-003 | RHEL 9 | rpm-based, dnf packages |
| DISTRO-004 | SUSE 15 | rpm-based, zypper packages |
| DISTRO-005 | Ubuntu 22.04 | dpkg-based, snap support |

**Each Case Includes**:

- Real container image digest
- OS-specific CVEs
- NEVRA/EVR matching tests
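
EVR matching reduces to comparing (epoch, version, release) with epoch taking priority. The comparator below is a deliberately simplified illustration for test design only; real rpm/dpkg version ordering has additional rules (tilde handling, mixed alpha/numeric segment precedence) that these cases must exercise:

```python
import re

def parse_evr(evr):
    """Split 'epoch:version-release' into (epoch, version, release); epoch defaults to 0."""
    epoch, sep, rest = evr.partition(":")
    if not sep:
        epoch, rest = "0", evr
    version, _, release = rest.partition("-")
    return int(epoch), version, release

def _segments(s):
    # Split into digit runs (compared numerically) and letter runs (compared lexically).
    parts = re.findall(r"\d+|[a-zA-Z]+", s)
    # Tag each segment by type so heterogeneous lists stay comparable (simplification).
    return [(0, int(p)) if p.isdigit() else (1, p) for p in parts]

def evr_less(a, b):
    """True if EVR a sorts strictly before EVR b (simplified ordering)."""
    ea, va, ra = parse_evr(a)
    eb, vb, rb = parse_evr(b)
    if ea != eb:
        return ea < eb  # epoch always dominates
    return (_segments(va), _segments(ra)) < (_segments(vb), _segments(rb))

print(evr_less("1.1.1k-7.el8", "1:1.1.1g-1.el8"))  # True: epoch 1 dominates
```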

**Acceptance Criteria**:

- [ ] 5 distro cases created
- [ ] Each uses real CVEs for that distro
- [ ] Package version matching tested
- [ ] Security tracker references included

---

### T8: Interop Cases

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Create interop test cases with third-party tools.

**Cases to Create**:

| Case ID | Tool | Description |
|---------|------|-------------|
| INTEROP-001 | Syft | SBOM generated by Syft (CycloneDX) |
| INTEROP-002 | Syft | SBOM generated by Syft (SPDX) |
| INTEROP-003 | Trivy | SBOM generated by Trivy |
| INTEROP-004 | Grype | Findings from Grype scan |
| INTEROP-005 | cosign | Attestation signed with cosign |

**Acceptance Criteria**:

- [ ] 5 interop cases created
- [ ] Real tool outputs captured
- [ ] Findings parity documented
- [ ] Round-trip verification

---

### T9: Negative Cases

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Create negative test cases for error handling.

**Cases to Create**:

| Case ID | Error Type | Description |
|---------|------------|-------------|
| NEG-001 | Malformed SPDX | Invalid SPDX JSON structure |
| NEG-002 | Malformed CycloneDX | Invalid CycloneDX schema |
| NEG-003 | Corrupted DSSE | DSSE envelope with bad signature |
| NEG-004 | Missing Digests | SBOM without component hashes |
| NEG-005 | Unsupported Distro | Unknown Linux distribution |
| NEG-006 | Zip Bomb | Malicious compressed artifact |

**Acceptance Criteria**:

- [ ] 6 negative cases created
- [ ] Each triggers specific error
- [ ] Error messages documented
- [ ] No crashes on malformed input

---

### T10: Corpus Management Tooling

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T9

**Description**:
Create tooling for corpus management and validation.

**Tools to Create**:

```bash
# Validate all corpus cases
python3 scripts/corpus/validate-corpus.py

# Generate corpus manifest
python3 scripts/corpus/generate-manifest.py

# Run determinism check on all cases
python3 scripts/corpus/check-determinism.py

# Add new case from template
python3 scripts/corpus/add-case.py --category severity --name "new-case"
```
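
As a starting point, the core of the validation script could be a required-artifact check per case directory. The file names below mirror the per-case layout sketched in T1 but are assumptions until the tooling lands:

```python
from pathlib import Path

# Assumed per-case artifacts; align with the actual T1 case layout once fixed.
REQUIRED_FILES = ["run-manifest.json", "expected-verdict.json"]

def validate_case(case_dir: Path) -> list[str]:
    """Return a list of validation errors for one corpus case (empty = valid)."""
    errors = []
    for name in REQUIRED_FILES:
        if not (case_dir / name).is_file():
            errors.append(f"{case_dir.name}: missing {name}")
    # At least one SBOM (e.g. sbom.cyclonedx.json / sbom.spdx.json) must exist.
    if not list(case_dir.glob("sbom.*.json")):
        errors.append(f"{case_dir.name}: no SBOM found")
    return errors

def validate_corpus(root: Path) -> list[str]:
    """Validate every case directory under `root`, in sorted (deterministic) order."""
    errors = []
    for case_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        errors.extend(validate_case(case_dir))
    return errors
```

Schema validation of the JSON contents would layer on top of this structural check.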

**Acceptance Criteria**:

- [ ] Validation script validates all cases
- [ ] Manifest generation script
- [ ] Determinism check script
- [ ] Case template generator

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | QA Team | Corpus Structure Redesign |
| 2 | T2 | TODO | T1 | QA Team | Severity Level Cases |
| 3 | T3 | TODO | T1 | QA Team | VEX Scenario Cases |
| 4 | T4 | TODO | T1 | QA Team | Reachability Cases |
| 5 | T5 | TODO | T1 | QA Team | Unknowns Cases |
| 6 | T6 | TODO | T1 | QA Team | Scale Cases |
| 7 | T7 | TODO | T1 | QA Team | Distro Cases |
| 8 | T8 | TODO | T1 | QA Team | Interop Cases |
| 9 | T9 | TODO | T1 | QA Team | Negative Cases |
| 10 | T10 | TODO | T1-T9 | QA Team | Corpus Management Tooling |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. Golden corpus expansion required for comprehensive E2E testing. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Case naming convention | Decision | QA Team | CATEGORY-NNN format |
| Image storage | Decision | QA Team | Reference digests, not full images |
| Corpus versioning | Decision | QA Team | Semantic versioning tied to algorithm changes |

---

## Success Criteria

- [ ] All 10 tasks marked DONE
- [ ] 50+ test cases in corpus
- [ ] All categories have representative cases
- [ ] Corpus passes validation
- [ ] Determinism verified across all cases
- [ ] Management tooling functional

@@ -0,0 +1,742 @@
# Sprint 5100.0002.0001 · Canonicalization Utilities

## Topic & Scope

- Implement canonical JSON serialization for deterministic output.
- Create stable ordering utilities for packages, vulnerabilities, edges, and evidence lists.
- Ensure UTF-8/invariant culture enforcement across all outputs.
- Add property-based tests for ordering invariants.
- **Working directory:** `src/__Libraries/StellaOps.Canonicalization/`

## Dependencies & Concurrency

- **Upstream**: Sprint 5100.0001.0001 (Run Manifest Schema) uses canonicalization
- **Downstream**: Sprint 5100.0002.0002 (Replay Runner) depends on deterministic output
- **Safe to parallelize with**: Sprints 5100.0001.0002 and 5100.0001.0003

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- `docs/product-advisories/21-Dec-2025 - Smart Diff - Reproducibility as a Feature.md`

---

## Tasks

### T1: Canonical JSON Serializer

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Implement canonical JSON serialization with stable key ordering and consistent formatting.

**Implementation Path**: `src/__Libraries/StellaOps.Canonicalization/Json/CanonicalJsonSerializer.cs`

**Implementation**:
```csharp
using System.Globalization;
using System.Security.Cryptography;
using System.Text;
using System.Text.Encodings.Web;
using System.Text.Json;
using System.Text.Json.Serialization;

namespace StellaOps.Canonicalization.Json;

/// <summary>
/// Produces canonical JSON output with deterministic ordering.
/// Implements RFC 8785 (JSON Canonicalization Scheme) principles.
/// </summary>
public static class CanonicalJsonSerializer
{
    private static readonly JsonSerializerOptions Options = new()
    {
        // Deterministic settings
        WriteIndented = false,
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        DictionaryKeyPolicy = JsonNamingPolicy.CamelCase,
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
        Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping,

        // Ordering converters
        Converters =
        {
            new StableDictionaryConverter(),
            new StableArrayConverter(),   // defined alongside in this namespace (not shown here)
            new Iso8601DateTimeConverter()
        },

        // Number handling for cross-platform consistency
        NumberHandling = JsonNumberHandling.Strict
    };

    /// <summary>
    /// Serializes an object to canonical JSON.
    /// </summary>
    public static string Serialize<T>(T value)
    {
        return JsonSerializer.Serialize(value, Options);
    }

    /// <summary>
    /// Serializes and computes a SHA-256 digest.
    /// </summary>
    public static (string Json, string Digest) SerializeWithDigest<T>(T value)
    {
        var json = Serialize(value);
        var hash = SHA256.HashData(Encoding.UTF8.GetBytes(json));
        var digest = Convert.ToHexString(hash).ToLowerInvariant();
        return (json, digest);
    }

    /// <summary>
    /// Deserializes from canonical JSON.
    /// </summary>
    public static T Deserialize<T>(string json)
    {
        return JsonSerializer.Deserialize<T>(json, Options)
            ?? throw new InvalidOperationException($"Failed to deserialize {typeof(T).Name}");
    }
}

/// <summary>
/// Converter factory that orders dictionary keys alphabetically.
/// Restricted to Dictionary&lt;,&gt;: ImmutableDictionary&lt;,&gt; needs its own converter
/// because the generic converter below reads and writes Dictionary instances.
/// </summary>
public sealed class StableDictionaryConverter : JsonConverterFactory
{
    public override bool CanConvert(Type typeToConvert) =>
        typeToConvert.IsGenericType &&
        typeToConvert.GetGenericTypeDefinition() == typeof(Dictionary<,>);

    public override JsonConverter CreateConverter(Type typeToConvert, JsonSerializerOptions options)
    {
        var keyType = typeToConvert.GetGenericArguments()[0];
        var valueType = typeToConvert.GetGenericArguments()[1];
        var converterType = typeof(StableDictionaryConverter<,>).MakeGenericType(keyType, valueType);
        return (JsonConverter)Activator.CreateInstance(converterType)!;
    }
}

public sealed class StableDictionaryConverter<TKey, TValue> : JsonConverter<Dictionary<TKey, TValue>>
    where TKey : notnull
{
    public override Dictionary<TKey, TValue>? Read(
        ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        // Read entries manually: calling Deserialize<Dictionary<TKey, TValue>> with the
        // same options would re-enter this converter and recurse forever.
        if (reader.TokenType != JsonTokenType.StartObject)
            throw new JsonException("Expected object for dictionary.");

        var result = new Dictionary<TKey, TValue>();
        while (reader.Read() && reader.TokenType != JsonTokenType.EndObject)
        {
            var key = (TKey)Convert.ChangeType(reader.GetString()!, typeof(TKey), CultureInfo.InvariantCulture);
            reader.Read();
            result[key] = JsonSerializer.Deserialize<TValue>(ref reader, options)!;
        }
        return result;
    }

    public override void Write(
        Utf8JsonWriter writer, Dictionary<TKey, TValue> value, JsonSerializerOptions options)
    {
        writer.WriteStartObject();
        foreach (var kvp in value.OrderBy(x => x.Key.ToString(), StringComparer.Ordinal))
        {
            writer.WritePropertyName(kvp.Key.ToString() ?? "");
            JsonSerializer.Serialize(writer, kvp.Value, options);
        }
        writer.WriteEndObject();
    }
}

/// <summary>
/// Converter for ISO 8601 date/time with UTC normalization.
/// </summary>
public sealed class Iso8601DateTimeConverter : JsonConverter<DateTimeOffset>
{
    public override DateTimeOffset Read(
        ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        return DateTimeOffset.Parse(reader.GetString()!, CultureInfo.InvariantCulture);
    }

    public override void Write(
        Utf8JsonWriter writer, DateTimeOffset value, JsonSerializerOptions options)
    {
        // Always output in UTC with a fixed format
        writer.WriteStringValue(value.ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffZ", CultureInfo.InvariantCulture));
    }
}
```
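
For a quick cross-language check of the canonical-form contract (sorted keys, no insignificant whitespace, UTF-8, stable digest), the same rules can be expressed in a few lines of Python. This is a reference sketch of the principle, not the serializer above:

```python
import hashlib
import json

def canonical_json(value):
    """Serialize with sorted keys, compact separators, and raw UTF-8 output."""
    return json.dumps(value, sort_keys=True, separators=(",", ":"), ensure_ascii=False)

def canonical_digest(value):
    """SHA-256 over the canonical UTF-8 bytes, lowercase hex."""
    return hashlib.sha256(canonical_json(value).encode("utf-8")).hexdigest()

a = {"b": 1, "a": [2, 3]}
b = {"a": [2, 3], "b": 1}  # same content, different insertion order
print(canonical_json(a))                           # {"a":[2,3],"b":1}
print(canonical_digest(a) == canonical_digest(b))  # True
```

Fixtures serialized this way can be diffed byte-for-byte against the C# serializer's output to catch divergence early.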

**Acceptance Criteria**:

- [ ] Stable key ordering (alphabetical)
- [ ] Consistent array ordering
- [ ] UTC ISO-8601 timestamps
- [ ] No whitespace in output
- [ ] camelCase property naming

---

### T2: Collection Orderers

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Implement stable ordering for domain collections: packages, vulnerabilities, edges, evidence.

**Implementation Path**: `src/__Libraries/StellaOps.Canonicalization/Ordering/`

**Implementation**:
```csharp
namespace StellaOps.Canonicalization.Ordering;

/// <summary>
/// Provides stable ordering for SBOM packages.
/// Order: purl (if present) -> name -> version -> type
/// </summary>
public static class PackageOrderer
{
    public static IOrderedEnumerable<T> StableOrder<T>(
        this IEnumerable<T> packages,
        Func<T, string?> getPurl,
        Func<T, string?> getName,
        Func<T, string?> getVersion,
        Func<T, string?> getType)
    {
        return packages
            .OrderBy(p => getPurl(p) ?? "", StringComparer.Ordinal)
            .ThenBy(p => getName(p) ?? "", StringComparer.Ordinal)
            .ThenBy(p => getVersion(p) ?? "", StringComparer.Ordinal)
            .ThenBy(p => getType(p) ?? "", StringComparer.Ordinal);
    }
}

/// <summary>
/// Provides stable ordering for vulnerabilities.
/// Order: id (CVE/GHSA) -> source -> severity (descending)
/// </summary>
public static class VulnerabilityOrderer
{
    public static IOrderedEnumerable<T> StableOrder<T>(
        this IEnumerable<T> vulnerabilities,
        Func<T, string> getId,
        Func<T, string?> getSource,
        Func<T, decimal?> getSeverity)
    {
        return vulnerabilities
            .OrderBy(v => getId(v), StringComparer.Ordinal)
            .ThenBy(v => getSource(v) ?? "", StringComparer.Ordinal)
            .ThenByDescending(v => getSeverity(v) ?? 0);
    }
}

/// <summary>
/// Provides stable ordering for graph edges.
/// Order: source -> target -> type
/// </summary>
public static class EdgeOrderer
{
    public static IOrderedEnumerable<T> StableOrder<T>(
        this IEnumerable<T> edges,
        Func<T, string> getSource,
        Func<T, string> getTarget,
        Func<T, string?> getType)
    {
        return edges
            .OrderBy(e => getSource(e), StringComparer.Ordinal)
            .ThenBy(e => getTarget(e), StringComparer.Ordinal)
            .ThenBy(e => getType(e) ?? "", StringComparer.Ordinal);
    }
}

/// <summary>
/// Provides stable ordering for evidence lists.
/// Order: type -> id -> digest
/// </summary>
public static class EvidenceOrderer
{
    public static IOrderedEnumerable<T> StableOrder<T>(
        this IEnumerable<T> evidence,
        Func<T, string> getType,
        Func<T, string> getId,
        Func<T, string?> getDigest)
    {
        return evidence
            .OrderBy(e => getType(e), StringComparer.Ordinal)
            .ThenBy(e => getId(e), StringComparer.Ordinal)
            .ThenBy(e => getDigest(e) ?? "", StringComparer.Ordinal);
    }
}
```

**Acceptance Criteria**:

- [ ] PackageOrderer with PURL priority
- [ ] VulnerabilityOrderer with ID priority
- [ ] EdgeOrderer for graph determinism
- [ ] EvidenceOrderer for chain ordering
- [ ] All use StringComparer.Ordinal

---

### T3: Culture Invariant Utilities

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: —

**Description**:
Utilities for culture-invariant operations.

**Implementation Path**: `src/__Libraries/StellaOps.Canonicalization/Culture/InvariantCulture.cs`

**Implementation**:
```csharp
using System.Globalization;
using System.Text;

namespace StellaOps.Canonicalization.Culture;

/// <summary>
/// Ensures all string operations use invariant culture.
/// </summary>
public static class InvariantCulture
{
    /// <summary>
    /// Forces invariant culture for the current thread until the scope is disposed.
    /// </summary>
    public static IDisposable Scope()
    {
        var original = CultureInfo.CurrentCulture;
        CultureInfo.CurrentCulture = CultureInfo.InvariantCulture;
        CultureInfo.CurrentUICulture = CultureInfo.InvariantCulture;
        return new CultureScope(original);
    }

    /// <summary>
    /// Compares strings using ordinal comparison.
    /// </summary>
    public static int Compare(string? a, string? b) =>
        string.Compare(a, b, StringComparison.Ordinal);

    /// <summary>
    /// Formats a decimal with invariant culture.
    /// </summary>
    public static string FormatDecimal(decimal value) =>
        value.ToString("G", CultureInfo.InvariantCulture);

    /// <summary>
    /// Parses a decimal with invariant culture.
    /// </summary>
    public static decimal ParseDecimal(string value) =>
        decimal.Parse(value, CultureInfo.InvariantCulture);

    private sealed class CultureScope : IDisposable
    {
        private readonly CultureInfo _original;
        public CultureScope(CultureInfo original) => _original = original;
        public void Dispose()
        {
            CultureInfo.CurrentCulture = _original;
            CultureInfo.CurrentUICulture = _original;
        }
    }
}

/// <summary>
/// UTF-8 encoding utilities.
/// </summary>
public static class Utf8Encoding
{
    /// <summary>
    /// Normalizes a string to a canonical Unicode representation.
    /// </summary>
    public static string Normalize(string input)
    {
        // Normalize to NFC form for consistent representation
        return input.Normalize(NormalizationForm.FormC);
    }

    /// <summary>
    /// Converts to UTF-8 bytes after NFC normalization.
    /// </summary>
    public static byte[] GetBytes(string input) =>
        Encoding.UTF8.GetBytes(Normalize(input));
}
```
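
The NFC step matters because the same visible string can have multiple Unicode encodings, which would otherwise produce different digests. A minimal Python illustration of the property the `Normalize` helper relies on:

```python
import unicodedata

composed = "caf\u00e9"      # 'é' as a single precomposed code point
decomposed = "cafe\u0301"   # 'e' followed by a combining acute accent

# Visually identical, but different code point sequences -- and different bytes.
print(composed == decomposed)                                # False
print(unicodedata.normalize("NFC", decomposed) == composed)  # True after NFC
```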

**Acceptance Criteria**:

- [ ] Culture scope for thread isolation
- [ ] Ordinal string comparison
- [ ] Invariant number formatting
- [ ] UTF-8 normalization (NFC)

---

### T4: Determinism Verifier

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1, T2, T3

**Description**:
Service to verify determinism of serialization.

**Implementation Path**: `src/__Libraries/StellaOps.Canonicalization/Verification/DeterminismVerifier.cs`

**Implementation**:
```csharp
using System.Text.Json;

using StellaOps.Canonicalization.Json;

namespace StellaOps.Canonicalization.Verification;

/// <summary>
/// Verifies that serialization produces identical output across runs.
/// </summary>
public sealed class DeterminismVerifier
{
    /// <summary>
    /// Serializes an object multiple times and verifies identical output.
    /// </summary>
    public DeterminismResult Verify<T>(T value, int iterations = 10)
    {
        var outputs = new HashSet<string>();
        var digests = new HashSet<string>();

        for (var i = 0; i < iterations; i++)
        {
            var (json, digest) = CanonicalJsonSerializer.SerializeWithDigest(value);
            outputs.Add(json);
            digests.Add(digest);
        }

        return new DeterminismResult(
            IsDeterministic: outputs.Count == 1 && digests.Count == 1,
            UniqueOutputs: outputs.Count,
            UniqueDigests: digests.Count,
            SampleOutput: outputs.First(),
            SampleDigest: digests.First());
    }

    /// <summary>
    /// Compares two serialized objects for byte-identical output.
    /// </summary>
    public ComparisonResult Compare<T>(T a, T b)
    {
        var (jsonA, digestA) = CanonicalJsonSerializer.SerializeWithDigest(a);
        var (jsonB, digestB) = CanonicalJsonSerializer.SerializeWithDigest(b);

        if (digestA == digestB)
            return new ComparisonResult(IsIdentical: true, Differences: []);

        var differences = FindDifferences(jsonA, jsonB);
        return new ComparisonResult(IsIdentical: false, Differences: differences);
    }

    private static IReadOnlyList<string> FindDifferences(string a, string b)
    {
        var differences = new List<string>();
        using var docA = JsonDocument.Parse(a);
        using var docB = JsonDocument.Parse(b);
        CompareElements(docA.RootElement, docB.RootElement, "$", differences);
        return differences;
    }

    private static void CompareElements(
        JsonElement a, JsonElement b, string path, List<string> differences)
    {
        if (a.ValueKind != b.ValueKind)
        {
            differences.Add($"{path}: type mismatch ({a.ValueKind} vs {b.ValueKind})");
            return;
        }

        switch (a.ValueKind)
        {
            case JsonValueKind.Object:
                var propsA = a.EnumerateObject().ToDictionary(p => p.Name);
                var propsB = b.EnumerateObject().ToDictionary(p => p.Name);
                foreach (var key in propsA.Keys.Union(propsB.Keys).Order())
                {
                    var hasA = propsA.TryGetValue(key, out var propA);
                    var hasB = propsB.TryGetValue(key, out var propB);
                    if (!hasA) differences.Add($"{path}.{key}: missing in first");
                    else if (!hasB) differences.Add($"{path}.{key}: missing in second");
                    else CompareElements(propA.Value, propB.Value, $"{path}.{key}", differences);
                }
                break;
            case JsonValueKind.Array:
                var arrA = a.EnumerateArray().ToList();
                var arrB = b.EnumerateArray().ToList();
                if (arrA.Count != arrB.Count)
                    differences.Add($"{path}: array length mismatch ({arrA.Count} vs {arrB.Count})");
                for (var i = 0; i < Math.Min(arrA.Count, arrB.Count); i++)
                    CompareElements(arrA[i], arrB[i], $"{path}[{i}]", differences);
                break;
            default:
                if (a.GetRawText() != b.GetRawText())
                    differences.Add($"{path}: value mismatch");
                break;
        }
    }
}

public sealed record DeterminismResult(
    bool IsDeterministic,
    int UniqueOutputs,
    int UniqueDigests,
    string SampleOutput,
    string SampleDigest);

public sealed record ComparisonResult(
    bool IsIdentical,
    IReadOnlyList<string> Differences);
```
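
The multi-iteration check generalizes to any serializer: serialize N times and demand exactly one distinct output and one distinct digest. A compact Python sketch of the same idea:

```python
import hashlib
import json

def verify_determinism(value, serialize, iterations=10):
    """Serialize `value` repeatedly; deterministic iff exactly one distinct output."""
    outputs = {serialize(value) for _ in range(iterations)}
    digests = {hashlib.sha256(o.encode("utf-8")).hexdigest() for o in outputs}
    return len(outputs) == 1 and len(digests) == 1

# A canonical serializer passes the check ...
canonical = lambda v: json.dumps(v, sort_keys=True, separators=(",", ":"))
print(verify_determinism({"b": 1, "a": 2}, canonical))  # True
```

A serializer that embeds anything run-dependent (a timestamp, a counter, an unsorted hash iteration) would yield multiple distinct outputs and fail the check.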

**Acceptance Criteria**:

- [ ] Multi-iteration verification
- [ ] Deep comparison with path reporting
- [ ] Difference details for debugging
- [ ] JSON path format for differences

---

### T5: Property-Based Tests

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1-T4

**Description**:
Property-based tests using FsCheck for ordering invariants.

**Implementation Path**: `src/__Libraries/__Tests/StellaOps.Canonicalization.Tests/Properties/`

**Test Properties**:
```csharp
using FsCheck;
using FsCheck.Xunit;

using StellaOps.Canonicalization.Json;
using StellaOps.Canonicalization.Ordering;

namespace StellaOps.Canonicalization.Tests.Properties;

public class CanonicalJsonProperties
{
    [Property]
    public Property Serialize_IsIdempotent(Dictionary<string, int> dict)
    {
        var json1 = CanonicalJsonSerializer.Serialize(dict);
        var json2 = CanonicalJsonSerializer.Serialize(dict);
        return (json1 == json2).ToProperty();
    }

    [Property]
    public Property Serialize_OrderIndependent(Dictionary<string, int> dict)
    {
        var reversed = dict.Reverse().ToDictionary(x => x.Key, x => x.Value);
        var json1 = CanonicalJsonSerializer.Serialize(dict);
        var json2 = CanonicalJsonSerializer.Serialize(reversed);
        return (json1 == json2).ToProperty();
    }

    [Property]
    public Property Digest_IsDeterministic(string input)
    {
        var obj = new { Value = input ?? "" };
        var (_, digest1) = CanonicalJsonSerializer.SerializeWithDigest(obj);
        var (_, digest2) = CanonicalJsonSerializer.SerializeWithDigest(obj);
        return (digest1 == digest2).ToProperty();
    }
}

public class OrderingProperties
{
    [Property]
    public Property PackageOrdering_IsStable(List<(string purl, string name, string version)> packages)
    {
        var ordered1 = packages.StableOrder(p => p.purl, p => p.name, p => p.version, _ => null).ToList();
        var ordered2 = packages.StableOrder(p => p.purl, p => p.name, p => p.version, _ => null).ToList();
        return ordered1.SequenceEqual(ordered2).ToProperty();
    }

    [Property]
    public Property VulnerabilityOrdering_IsIdempotent(
        List<(string id, string source, decimal severity)> vulns)
    {
        // Ordering an already-ordered list must not change it: this is what a
        // stable total order guarantees, and it implies transitivity of the comparator.
        var once = vulns.StableOrder(v => v.id, v => v.source, v => v.severity).ToList();
        var twice = once.StableOrder(v => v.id, v => v.source, v => v.severity).ToList();
        return once.SequenceEqual(twice).ToProperty();
    }
}
```

**Acceptance Criteria**:
- [ ] Idempotency property tests
- [ ] Order-independence property tests
- [ ] Digest determinism property tests
- [ ] 1000+ generated test cases
- [ ] All properties pass

---

### T6: Unit Tests

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T4

**Description**:
Standard unit tests for canonicalization utilities.

**Implementation Path**: `src/__Libraries/__Tests/StellaOps.Canonicalization.Tests/`

**Test Cases**:
```csharp
public class CanonicalJsonSerializerTests
{
    [Fact]
    public void Serialize_Dictionary_OrdersKeysAlphabetically()
    {
        var dict = new Dictionary<string, int> { ["z"] = 1, ["a"] = 2, ["m"] = 3 };
        var json = CanonicalJsonSerializer.Serialize(dict);
        json.Should().Be("{\"a\":2,\"m\":3,\"z\":1}");
    }

    [Fact]
    public void Serialize_DateTimeOffset_UsesUtcIso8601()
    {
        var dt = new DateTimeOffset(2024, 1, 15, 10, 30, 0, TimeSpan.FromHours(5));
        var obj = new { Timestamp = dt };
        var json = CanonicalJsonSerializer.Serialize(obj);
        json.Should().Contain("2024-01-15T05:30:00.000Z");
    }

    [Fact]
    public void Serialize_NullValues_AreOmitted()
    {
        var obj = new { Name = "test", Value = (string?)null };
        var json = CanonicalJsonSerializer.Serialize(obj);
        json.Should().NotContain("value");
    }

    [Fact]
    public void SerializeWithDigest_ProducesConsistentDigest()
    {
        var obj = new { Name = "test", Value = 123 };
        var (_, digest1) = CanonicalJsonSerializer.SerializeWithDigest(obj);
        var (_, digest2) = CanonicalJsonSerializer.SerializeWithDigest(obj);
        digest1.Should().Be(digest2);
    }
}

public class PackageOrdererTests
{
    [Fact]
    public void StableOrder_OrdersByPurlFirst()
    {
        var packages = new[]
        {
            (purl: "pkg:npm/b@1.0.0", name: "b", version: "1.0.0"),
            (purl: "pkg:npm/a@1.0.0", name: "a", version: "1.0.0")
        };
        var ordered = packages.StableOrder(p => p.purl, p => p.name, p => p.version, _ => null).ToList();
        ordered[0].purl.Should().Be("pkg:npm/a@1.0.0");
    }
}
```

**Acceptance Criteria**:
- [ ] Key ordering tests
- [ ] DateTime formatting tests
- [ ] Null handling tests
- [ ] Digest consistency tests
- [ ] All orderer tests

---

### T7: Project Setup

**Assignee**: QA Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create the project structure and dependencies.

**Implementation Path**: `src/__Libraries/StellaOps.Canonicalization/StellaOps.Canonicalization.csproj`

**Project File**:
```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <LangVersion>preview</LangVersion>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="System.Text.Json" Version="9.0.0" />
  </ItemGroup>
</Project>
```

**Test Project**:
```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="FsCheck.Xunit" Version="2.16.6" />
    <PackageReference Include="FluentAssertions" Version="6.12.0" />
    <PackageReference Include="xunit" Version="2.9.0" />
    <!-- Required for `dotnet test` discovery/execution of xUnit tests. -->
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.11.1" />
    <PackageReference Include="xunit.runner.visualstudio" Version="2.8.2" />
  </ItemGroup>
</Project>
```

**Acceptance Criteria**:
- [ ] Main project compiles
- [ ] Test project compiles
- [ ] FsCheck integrated

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | QA Team | Canonical JSON Serializer |
| 2 | T2 | TODO | — | QA Team | Collection Orderers |
| 3 | T3 | TODO | — | QA Team | Culture Invariant Utilities |
| 4 | T4 | TODO | T1, T2, T3 | QA Team | Determinism Verifier |
| 5 | T5 | TODO | T1-T4 | QA Team | Property-Based Tests |
| 6 | T6 | TODO | T1-T4 | QA Team | Unit Tests |
| 7 | T7 | TODO | — | QA Team | Project Setup |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. Canonicalization is foundational for deterministic replay. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| JSON canonicalization | Decision | QA Team | Follow RFC 8785 principles |
| String comparison | Decision | QA Team | Ordinal comparison for portability |
| DateTime format | Decision | QA Team | ISO 8601 with milliseconds, always UTC |
| Unicode normalization | Decision | QA Team | NFC form for consistency |
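
To illustrate why the ordinal-comparison and NFC decisions matter, a minimal sketch using only standard .NET APIs (not part of the planned library): the same visible character can have two Unicode encodings until normalized, and culture-sensitive comparisons can order strings differently across machines.

```csharp
using System.Text;

var composed = "café";          // 'é' as precomposed U+00E9
var decomposed = "cafe\u0301";  // 'e' followed by U+0301 combining acute

// Different code points compare unequal until both are normalized to NFC.
Console.WriteLine(composed == decomposed);  // False
Console.WriteLine(composed.Normalize(NormalizationForm.FormC)
    == decomposed.Normalize(NormalizationForm.FormC));  // True

// Ordinal comparison is byte-wise over code units and therefore
// culture-independent, unlike string.Compare with a CultureInfo.
Console.WriteLine(string.CompareOrdinal("a", "B") > 0);  // True: 'a' (97) > 'B' (66)
```

Culture-aware comparison would typically place "a" before "B"; ordinal comparison does not, which is exactly the portability the decision table calls for.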

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] Canonical JSON produces identical output
- [ ] All orderers are stable and deterministic
- [ ] Property-based tests pass with 1000+ cases
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds

---

`docs/implplan/SPRINT_5100_0002_0002_replay_runner_service.md` (new file, 585 lines)

# Sprint 5100.0002.0002 · Replay Runner Service

## Topic & Scope

- Implement the Replay Runner service for deterministic verdict replay.
- Load run manifests and execute scans with identical inputs.
- Compare verdict bytes and report differences.
- Enable time-travel verification for auditors.
- **Working directory:** `src/__Libraries/StellaOps.Replay/`

## Dependencies & Concurrency

- **Upstream**: Sprint 5100.0001.0001 (Run Manifest), Sprint 5100.0002.0001 (Canonicalization)
- **Downstream**: Sprint 5100.0006.0001 (Audit Pack) uses replay for verification
- **Safe to parallelize with**: Sprint 5100.0002.0003 (Delta-Verdict)

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- `docs/product-advisories/21-Dec-2025 - Smart Diff - Reproducibility as a Feature.md`
- Sprint 5100.0001.0001 completion

---
## Tasks

### T1: Replay Engine Core

**Assignee**: QA Team
**Story Points**: 8
**Status**: TODO
**Dependencies**: —

**Description**:
Core replay engine that executes scans from run manifests.

**Implementation Path**: `src/__Libraries/StellaOps.Replay/Engine/ReplayEngine.cs`

**Implementation**:
```csharp
namespace StellaOps.Replay.Engine;

/// <summary>
/// Executes scans deterministically from run manifests.
/// Enables time-travel replay for verification and auditing.
/// </summary>
public sealed class ReplayEngine : IReplayEngine
{
    private readonly IFeedLoader _feedLoader;
    private readonly IPolicyLoader _policyLoader;
    private readonly IScannerFactory _scannerFactory;
    private readonly ILogger<ReplayEngine> _logger;

    public ReplayEngine(
        IFeedLoader feedLoader,
        IPolicyLoader policyLoader,
        IScannerFactory scannerFactory,
        ILogger<ReplayEngine> logger)
    {
        _feedLoader = feedLoader;
        _policyLoader = policyLoader;
        _scannerFactory = scannerFactory;
        _logger = logger;
    }

    /// <summary>
    /// Replays a scan from a run manifest.
    /// </summary>
    public async Task<ReplayResult> ReplayAsync(
        RunManifest manifest,
        ReplayOptions options,
        CancellationToken ct = default)
    {
        _logger.LogInformation("Starting replay for run {RunId}", manifest.RunId);

        // Validate manifest
        var validationResult = ValidateManifest(manifest);
        if (!validationResult.IsValid)
        {
            return ReplayResult.Failed(
                manifest.RunId,
                "Manifest validation failed",
                validationResult.Errors);
        }

        // Load frozen inputs
        var feedResult = await LoadFeedSnapshotAsync(manifest.FeedSnapshot, ct);
        if (!feedResult.Success)
            return ReplayResult.Failed(manifest.RunId, "Failed to load feed snapshot", [feedResult.Error]);

        var policyResult = await LoadPolicySnapshotAsync(manifest.PolicySnapshot, ct);
        if (!policyResult.Success)
            return ReplayResult.Failed(manifest.RunId, "Failed to load policy snapshot", [policyResult.Error]);

        // Configure scanner with frozen time and PRNG
        var scannerOptions = new ScannerOptions
        {
            FeedSnapshot = feedResult.Value,
            PolicySnapshot = policyResult.Value,
            CryptoProfile = manifest.CryptoProfile,
            PrngSeed = manifest.PrngSeed,
            FrozenTime = options.UseFrozenTime ? manifest.InitiatedAt : null,
            CanonicalizationVersion = manifest.CanonicalizationVersion
        };

        // Execute scan
        var scanner = _scannerFactory.Create(scannerOptions);
        var scanResult = await scanner.ScanAsync(manifest.ArtifactDigests, ct);

        // Serialize verdict canonically
        var (verdictJson, verdictDigest) = CanonicalJsonSerializer.SerializeWithDigest(scanResult.Verdict);

        return new ReplayResult
        {
            RunId = manifest.RunId,
            Success = true,
            VerdictJson = verdictJson,
            VerdictDigest = verdictDigest,
            EvidenceIndex = scanResult.EvidenceIndex,
            ExecutedAt = DateTimeOffset.UtcNow,
            DurationMs = scanResult.DurationMs
        };
    }

    /// <summary>
    /// Compares two replay results for determinism.
    /// </summary>
    public DeterminismCheckResult CheckDeterminism(ReplayResult a, ReplayResult b)
    {
        if (a.VerdictDigest == b.VerdictDigest)
        {
            return new DeterminismCheckResult
            {
                IsDeterministic = true,
                DigestA = a.VerdictDigest,
                DigestB = b.VerdictDigest,
                Differences = []
            };
        }

        var differences = FindJsonDifferences(a.VerdictJson, b.VerdictJson);
        return new DeterminismCheckResult
        {
            IsDeterministic = false,
            DigestA = a.VerdictDigest,
            DigestB = b.VerdictDigest,
            Differences = differences
        };
    }

    private ValidationResult ValidateManifest(RunManifest manifest)
    {
        var errors = new List<string>();

        if (string.IsNullOrEmpty(manifest.RunId))
            errors.Add("RunId is required");

        if (manifest.ArtifactDigests.Length == 0)
            errors.Add("At least one artifact digest required");

        if (manifest.FeedSnapshot.Digest == null)
            errors.Add("Feed snapshot digest required");

        return new ValidationResult(errors.Count == 0, errors);
    }

    private async Task<LoadResult<FeedSnapshot>> LoadFeedSnapshotAsync(
        FeedSnapshot snapshot, CancellationToken ct)
    {
        try
        {
            var feed = await _feedLoader.LoadByDigestAsync(snapshot.Digest, ct);
            if (feed.Digest != snapshot.Digest)
                return LoadResult<FeedSnapshot>.Fail($"Feed digest mismatch: expected {snapshot.Digest}");
            return LoadResult<FeedSnapshot>.Ok(feed);
        }
        catch (Exception ex)
        {
            return LoadResult<FeedSnapshot>.Fail($"Failed to load feed: {ex.Message}");
        }
    }

    private async Task<LoadResult<PolicySnapshot>> LoadPolicySnapshotAsync(
        PolicySnapshot snapshot, CancellationToken ct)
    {
        try
        {
            var policy = await _policyLoader.LoadByDigestAsync(snapshot.LatticeRulesDigest, ct);
            return LoadResult<PolicySnapshot>.Ok(policy);
        }
        catch (Exception ex)
        {
            return LoadResult<PolicySnapshot>.Fail($"Failed to load policy: {ex.Message}");
        }
    }

    private static IReadOnlyList<JsonDifference> FindJsonDifferences(string? a, string? b)
    {
        if (a == null || b == null)
            return [new JsonDifference("$", "One or both values are null")];

        var verifier = new DeterminismVerifier();
        var result = verifier.Compare(a, b);
        return result.Differences.Select(d => new JsonDifference(d, "Value mismatch")).ToList();
    }
}

public sealed record ReplayResult
{
    public required string RunId { get; init; }
    public bool Success { get; init; }
    public string? VerdictJson { get; init; }
    public string? VerdictDigest { get; init; }
    public EvidenceIndex? EvidenceIndex { get; init; }
    public DateTimeOffset ExecutedAt { get; init; }
    public long DurationMs { get; init; }
    public IReadOnlyList<string>? Errors { get; init; }

    public static ReplayResult Failed(string runId, string message, IReadOnlyList<string> errors) =>
        new()
        {
            RunId = runId,
            Success = false,
            Errors = errors.Prepend(message).ToList(),
            ExecutedAt = DateTimeOffset.UtcNow
        };
}

public sealed record DeterminismCheckResult
{
    public bool IsDeterministic { get; init; }
    public string? DigestA { get; init; }
    public string? DigestB { get; init; }
    public IReadOnlyList<JsonDifference> Differences { get; init; } = [];
}

public sealed record JsonDifference(string Path, string Description);

public sealed record ReplayOptions
{
    public bool UseFrozenTime { get; init; } = true;
    public bool VerifyDigests { get; init; } = true;
    public bool CaptureEvidence { get; init; } = true;
}
```

**Acceptance Criteria**:
- [ ] Load and validate run manifests
- [ ] Load frozen feed/policy snapshots by digest
- [ ] Configure scanner with frozen time/PRNG
- [ ] Produce canonical verdict output
- [ ] Report differences on non-determinism

---

### T2: Feed Snapshot Loader

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Load vulnerability feeds by digest for exact reproduction.

**Implementation Path**: `src/__Libraries/StellaOps.Replay/Loaders/FeedSnapshotLoader.cs`

**Implementation**:
```csharp
namespace StellaOps.Replay.Loaders;

public sealed class FeedSnapshotLoader : IFeedLoader
{
    private readonly IFeedStorage _storage;
    private readonly ILogger<FeedSnapshotLoader> _logger;

    public FeedSnapshotLoader(IFeedStorage storage, ILogger<FeedSnapshotLoader> logger)
    {
        _storage = storage;
        _logger = logger;
    }

    public async Task<FeedSnapshot> LoadByDigestAsync(string digest, CancellationToken ct)
    {
        _logger.LogDebug("Loading feed snapshot with digest {Digest}", digest);

        // Try local content-addressed store first
        var localPath = GetLocalPath(digest);
        if (File.Exists(localPath))
        {
            var feed = await LoadFromFileAsync(localPath, ct);
            VerifyDigest(feed, digest);
            return feed;
        }

        // Try storage backend
        var storedFeed = await _storage.GetByDigestAsync(digest, ct);
        if (storedFeed != null)
        {
            VerifyDigest(storedFeed, digest);
            return storedFeed;
        }

        throw new FeedNotFoundException($"Feed snapshot not found: {digest}");
    }

    private static void VerifyDigest(FeedSnapshot feed, string expected)
    {
        var actual = ComputeDigest(feed);
        if (actual != expected)
            throw new DigestMismatchException($"Feed digest mismatch: expected {expected}, got {actual}");
    }

    private static string ComputeDigest(FeedSnapshot feed)
    {
        var json = CanonicalJsonSerializer.Serialize(feed);
        return Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(json))).ToLowerInvariant();
    }

    private static string GetLocalPath(string digest) =>
        Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
            "stellaops", "feeds", digest[..2], digest);
}
```

**Acceptance Criteria**:
- [ ] Load by digest from local store
- [ ] Fall back to storage backend
- [ ] Verify digest on load
- [ ] Clear error on not found

---

### T3: Policy Snapshot Loader

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: —

**Description**:
Load policy configurations by digest.

**Implementation Path**: `src/__Libraries/StellaOps.Replay/Loaders/PolicySnapshotLoader.cs`

**Acceptance Criteria**:
- [ ] Load by digest
- [ ] Include lattice rules
- [ ] Verify digest integrity
- [ ] Support offline bundle source
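
A minimal sketch of the loader, mirroring the feed-loading pattern in T2. The `IPolicyStorage` backend, `PolicyNotFoundException`, and the exact shape of `PolicySnapshot` are assumptions; the real interface contract is set by T1's `IPolicyLoader`.

```csharp
namespace StellaOps.Replay.Loaders;

// Sketch only: IPolicyStorage and the exception types are assumed to
// mirror the feed-loading counterparts in FeedSnapshotLoader.
public sealed class PolicySnapshotLoader : IPolicyLoader
{
    private readonly IPolicyStorage _storage;
    private readonly ILogger<PolicySnapshotLoader> _logger;

    public PolicySnapshotLoader(IPolicyStorage storage, ILogger<PolicySnapshotLoader> logger)
    {
        _storage = storage;
        _logger = logger;
    }

    public async Task<PolicySnapshot> LoadByDigestAsync(string digest, CancellationToken ct)
    {
        _logger.LogDebug("Loading policy snapshot with digest {Digest}", digest);

        var policy = await _storage.GetByDigestAsync(digest, ct)
            ?? throw new PolicyNotFoundException($"Policy snapshot not found: {digest}");

        // Recompute the content digest (lattice rules included via
        // canonical serialization) and verify integrity before returning.
        var json = CanonicalJsonSerializer.Serialize(policy);
        var actual = Convert.ToHexString(
            SHA256.HashData(Encoding.UTF8.GetBytes(json))).ToLowerInvariant();
        if (actual != digest)
            throw new DigestMismatchException(
                $"Policy digest mismatch: expected {digest}, got {actual}");

        return policy;
    }
}
```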
---

### T4: Replay CLI Commands

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1, T2, T3

**Description**:
CLI commands for replay operations.

**Commands**:
```bash
# Replay a scan from manifest
stella replay --manifest run-manifest.json --output verdict.json

# Verify determinism (replay twice and compare)
stella replay verify --manifest run-manifest.json

# Compare two verdicts
stella replay diff --a verdict-a.json --b verdict-b.json

# Batch replay from corpus
stella replay batch --corpus bench/golden-corpus/ --output results/
```
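
A sketch of what the JSON output option for `stella replay verify` might emit. The field names are assumptions derived from the `DeterminismCheckResult` record in T1, not a committed output contract.

```json
{
  "runId": "run-2025-12-21-0001",
  "isDeterministic": false,
  "digestA": "sha256:abc123",
  "digestB": "sha256:def456",
  "differences": [
    { "path": "$.score", "description": "Value mismatch" }
  ]
}
```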

**Implementation Path**: `src/Cli/StellaOps.Cli/Commands/Replay/`

**Acceptance Criteria**:
- [ ] `replay` command executes single replay
- [ ] `replay verify` checks determinism
- [ ] `replay diff` compares verdicts
- [ ] `replay batch` processes corpus
- [ ] JSON output option

---

### T5: CI Integration

**Assignee**: DevOps Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T4

**Description**:
Integrate replay verification into CI.

**Implementation Path**: `.gitea/workflows/replay-verification.yml`

**Workflow**:
```yaml
name: Replay Verification

on:
  pull_request:
    paths:
      - 'src/Scanner/**'
      - 'src/__Libraries/StellaOps.Canonicalization/**'
      - 'bench/golden-corpus/**'

jobs:
  replay-verification:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '10.0.100'

      - name: Build CLI
        run: dotnet build src/Cli/StellaOps.Cli -c Release

      - name: Run replay verification on corpus
        run: |
          ./out/stella replay batch \
            --corpus bench/golden-corpus/ \
            --output results/ \
            --verify-determinism \
            --fail-on-diff

      - name: Upload diff report
        if: failure()
        uses: actions/upload-artifact@v4
        with:
          name: replay-diff-report
          path: results/diff-report.json
```

**Acceptance Criteria**:
- [ ] Runs on scanner/canonicalization changes
- [ ] Processes entire golden corpus
- [ ] Fails PR on determinism violation
- [ ] Uploads diff report on failure

---

### T6: Unit and Integration Tests

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1-T4

**Description**:
Comprehensive tests for replay functionality.

**Test Cases**:
```csharp
public class ReplayEngineTests
{
    [Fact]
    public async Task Replay_SameManifest_ProducesIdenticalVerdict()
    {
        var manifest = CreateTestManifest();
        var engine = CreateEngine();

        var result1 = await engine.ReplayAsync(manifest, new ReplayOptions());
        var result2 = await engine.ReplayAsync(manifest, new ReplayOptions());

        result1.VerdictDigest.Should().Be(result2.VerdictDigest);
    }

    [Fact]
    public async Task Replay_DifferentManifest_ProducesDifferentVerdict()
    {
        var manifest1 = CreateTestManifest();
        var manifest2 = manifest1 with
        {
            FeedSnapshot = manifest1.FeedSnapshot with { Version = "v2" }
        };
        var engine = CreateEngine();

        var result1 = await engine.ReplayAsync(manifest1, new ReplayOptions());
        var result2 = await engine.ReplayAsync(manifest2, new ReplayOptions());

        result1.VerdictDigest.Should().NotBe(result2.VerdictDigest);
    }

    [Fact]
    public void CheckDeterminism_IdenticalResults_ReturnsTrue()
    {
        var engine = CreateEngine();
        var result1 = new ReplayResult { RunId = "run-1", VerdictDigest = "abc123" };
        var result2 = new ReplayResult { RunId = "run-1", VerdictDigest = "abc123" };

        var check = engine.CheckDeterminism(result1, result2);

        check.IsDeterministic.Should().BeTrue();
    }

    [Fact]
    public void CheckDeterminism_DifferentResults_ReturnsDifferences()
    {
        var engine = CreateEngine();
        var result1 = new ReplayResult
        {
            RunId = "run-1",
            VerdictJson = "{\"score\":100}",
            VerdictDigest = "abc123"
        };
        var result2 = new ReplayResult
        {
            RunId = "run-1",
            VerdictJson = "{\"score\":99}",
            VerdictDigest = "def456"
        };

        var check = engine.CheckDeterminism(result1, result2);

        check.IsDeterministic.Should().BeFalse();
        check.Differences.Should().NotBeEmpty();
    }
}
```

**Acceptance Criteria**:
- [ ] Replay determinism tests
- [ ] Feed loading tests
- [ ] Policy loading tests
- [ ] Diff detection tests
- [ ] Integration tests with real corpus

---

### T7: Project Setup

**Assignee**: QA Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create the project structure.

**Acceptance Criteria**:
- [ ] Main project compiles
- [ ] Test project compiles
- [ ] Dependencies on Manifest and Canonicalization
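
Mirroring the project file shown for the canonicalization sprint, a sketch of the Replay project file. The `StellaOps.RunManifest` project name is an assumption for the Sprint 5100.0001.0001 deliverable; relative paths follow the `src/__Libraries/` layout used elsewhere in this plan.

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <!-- Project names/paths are assumptions based on the sprint working directories. -->
    <ProjectReference Include="..\StellaOps.Canonicalization\StellaOps.Canonicalization.csproj" />
    <ProjectReference Include="..\StellaOps.RunManifest\StellaOps.RunManifest.csproj" />
  </ItemGroup>
</Project>
```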
---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | QA Team | Replay Engine Core |
| 2 | T2 | TODO | — | QA Team | Feed Snapshot Loader |
| 3 | T3 | TODO | — | QA Team | Policy Snapshot Loader |
| 4 | T4 | TODO | T1-T3 | QA Team | Replay CLI Commands |
| 5 | T5 | TODO | T4 | DevOps Team | CI Integration |
| 6 | T6 | TODO | T1-T4 | QA Team | Unit and Integration Tests |
| 7 | T7 | TODO | — | QA Team | Project Setup |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. Replay runner is key for determinism verification. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Frozen time | Decision | QA Team | Use manifest InitiatedAt for time-dependent operations |
| Content-addressed storage | Decision | QA Team | Store feeds/policies by digest for exact retrieval |
| Diff granularity | Decision | QA Team | JSON path-based diff for debugging |

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] Replay produces identical verdicts from same manifest
- [ ] Differences are detected and reported
- [ ] CI blocks on determinism violations
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds

---

`docs/implplan/SPRINT_5100_0002_0003_delta_verdict_generator.md` (new file, 610 lines)

# Sprint 5100.0002.0003 · Delta-Verdict Generator

## Topic & Scope

- Implement delta-verdict generation for diff-aware release gates.
- Compare two scan verdicts and produce signed deltas containing only changes.
- Enable risk budget computation based on delta magnitude.
- Support OCI artifact attachment for delta verdicts.
- **Working directory:** `src/__Libraries/StellaOps.DeltaVerdict/`

## Dependencies & Concurrency

- **Upstream**: Sprint 5100.0002.0001 (Canonicalization), Sprint 5100.0001.0002 (Evidence Index)
- **Downstream**: UI components display deltas, Policy gates use delta for decisions
- **Safe to parallelize with**: Sprint 5100.0002.0002 (Replay Runner)

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- `docs/product-advisories/21-Dec-2025 - Smart Diff - Reproducibility as a Feature.md`
- `docs/product-advisories/20-Dec-2025 - Moat Explanation - Risk Budgets and Diff-Aware Release Gates.md`

---

## Tasks

### T1: Delta-Verdict Domain Model

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Define the delta-verdict model capturing changes between two verdicts.

**Implementation Path**: `src/__Libraries/StellaOps.DeltaVerdict/Models/DeltaVerdict.cs`

**Model Definition**:
```csharp
namespace StellaOps.DeltaVerdict.Models;

/// <summary>
/// Represents the difference between two scan verdicts.
/// Used for diff-aware release gates and risk budget computation.
/// </summary>
public sealed record DeltaVerdict
{
    /// <summary>
    /// Unique identifier for this delta.
    /// </summary>
    public required string DeltaId { get; init; }

    /// <summary>
    /// Schema version for forward compatibility.
    /// </summary>
    public required string SchemaVersion { get; init; } = "1.0.0";

    /// <summary>
    /// Reference to the base (before) verdict.
    /// </summary>
    public required VerdictReference BaseVerdict { get; init; }

    /// <summary>
    /// Reference to the head (after) verdict.
    /// </summary>
    public required VerdictReference HeadVerdict { get; init; }

    /// <summary>
    /// Components added in head.
    /// </summary>
    public ImmutableArray<ComponentDelta> AddedComponents { get; init; } = [];

    /// <summary>
    /// Components removed in head.
    /// </summary>
    public ImmutableArray<ComponentDelta> RemovedComponents { get; init; } = [];

    /// <summary>
    /// Components with version changes.
    /// </summary>
    public ImmutableArray<ComponentVersionDelta> ChangedComponents { get; init; } = [];

    /// <summary>
    /// New vulnerabilities introduced in head.
    /// </summary>
    public ImmutableArray<VulnerabilityDelta> AddedVulnerabilities { get; init; } = [];

    /// <summary>
    /// Vulnerabilities fixed in head.
    /// </summary>
    public ImmutableArray<VulnerabilityDelta> RemovedVulnerabilities { get; init; } = [];

    /// <summary>
    /// Vulnerabilities with status changes (e.g., VEX update).
    /// </summary>
    public ImmutableArray<VulnerabilityStatusDelta> ChangedVulnerabilityStatuses { get; init; } = [];

    /// <summary>
    /// Risk score changes.
    /// </summary>
    public required RiskScoreDelta RiskScoreDelta { get; init; }

    /// <summary>
    /// Summary statistics for the delta.
    /// </summary>
    public required DeltaSummary Summary { get; init; }

    /// <summary>
    /// Whether this is an "empty delta" (no changes).
    /// </summary>
    public bool IsEmpty => Summary.TotalChanges == 0;

    /// <summary>
|
/// UTC timestamp when delta was computed.
|
||||||
|
/// </summary>
|
||||||
|
public required DateTimeOffset ComputedAt { get; init; }
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// SHA-256 digest of this delta (excluding this field and signature).
|
||||||
|
/// </summary>
|
||||||
|
public string? DeltaDigest { get; init; }
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// DSSE signature if signed.
|
||||||
|
/// </summary>
|
||||||
|
public string? Signature { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
public sealed record VerdictReference(
|
||||||
|
string VerdictId,
|
||||||
|
string Digest,
|
||||||
|
string? ArtifactRef,
|
||||||
|
DateTimeOffset ScannedAt);
|
||||||
|
|
||||||
|
public sealed record ComponentDelta(
|
||||||
|
string Purl,
|
||||||
|
string Name,
|
||||||
|
string Version,
|
||||||
|
string Type,
|
||||||
|
ImmutableArray<string> AssociatedVulnerabilities);
|
||||||
|
|
||||||
|
public sealed record ComponentVersionDelta(
|
||||||
|
string Purl,
|
||||||
|
string Name,
|
||||||
|
string OldVersion,
|
||||||
|
string NewVersion,
|
||||||
|
ImmutableArray<string> VulnerabilitiesFixed,
|
||||||
|
ImmutableArray<string> VulnerabilitiesIntroduced);
|
||||||
|
|
||||||
|
public sealed record VulnerabilityDelta(
|
||||||
|
string VulnerabilityId,
|
||||||
|
string Severity,
|
||||||
|
decimal? CvssScore,
|
||||||
|
string? ComponentPurl,
|
||||||
|
string? ReachabilityStatus);
|
||||||
|
|
||||||
|
public sealed record VulnerabilityStatusDelta(
|
||||||
|
string VulnerabilityId,
|
||||||
|
string OldStatus,
|
||||||
|
string NewStatus,
|
||||||
|
string? Reason);
|
||||||
|
|
||||||
|
public sealed record RiskScoreDelta(
|
||||||
|
decimal OldScore,
|
||||||
|
decimal NewScore,
|
||||||
|
decimal Change,
|
||||||
|
decimal PercentChange,
|
||||||
|
RiskTrend Trend);
|
||||||
|
|
||||||
|
public enum RiskTrend
|
||||||
|
{
|
||||||
|
Improved,
|
||||||
|
Degraded,
|
||||||
|
Stable
|
||||||
|
}
|
||||||
|
|
||||||
|
public sealed record DeltaSummary(
|
||||||
|
int ComponentsAdded,
|
||||||
|
int ComponentsRemoved,
|
||||||
|
int ComponentsChanged,
|
||||||
|
int VulnerabilitiesAdded,
|
||||||
|
int VulnerabilitiesRemoved,
|
||||||
|
int VulnerabilityStatusChanges,
|
||||||
|
int TotalChanges,
|
||||||
|
DeltaMagnitude Magnitude);
|
||||||
|
|
||||||
|
public enum DeltaMagnitude
|
||||||
|
{
|
||||||
|
None, // 0 changes
|
||||||
|
Minimal, // 1-5 changes
|
||||||
|
Small, // 6-20 changes
|
||||||
|
Medium, // 21-50 changes
|
||||||
|
Large, // 51-100 changes
|
||||||
|
Major // 100+ changes
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] Complete delta model with all change types
|
||||||
|
- [ ] Component additions/removals/changes
|
||||||
|
- [ ] Vulnerability additions/removals/status changes
|
||||||
|
- [ ] Risk score delta with trend
|
||||||
|
- [ ] Summary with magnitude classification
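The magnitude thresholds and the empty-delta rule are easy to get subtly wrong at the boundaries (20 vs 21, 100 vs 101). A minimal, language-neutral sketch of the intended classification (Python here for brevity; the names are illustrative, not part of the C# model):

```python
def classify_magnitude(total_changes: int) -> str:
    """Mirror of the DeltaMagnitude thresholds: 0 / 1-5 / 6-20 / 21-50 / 51-100 / 100+."""
    if total_changes == 0:
        return "None"
    if total_changes <= 5:
        return "Minimal"
    if total_changes <= 20:
        return "Small"
    if total_changes <= 50:
        return "Medium"
    if total_changes <= 100:
        return "Large"
    return "Major"

def is_empty(total_changes: int) -> bool:
    # An "empty delta" has zero changes of any kind.
    return total_changes == 0

print(classify_magnitude(0), classify_magnitude(5), classify_magnitude(21), classify_magnitude(101))
# → None Minimal Medium Major
```

Boundary values (5, 20, 50, 100) are inclusive of the smaller bucket, matching the `<=` patterns in `ClassifyMagnitude`.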

---

### T2: Delta Computation Engine

**Assignee**: QA Team
**Story Points**: 8
**Status**: TODO
**Dependencies**: T1

**Description**:
Engine that computes the delta between two verdicts.

**Implementation Path**: `src/__Libraries/StellaOps.DeltaVerdict/Engine/DeltaComputationEngine.cs`

**Implementation**:
```csharp
using System.Collections.Immutable;
using StellaOps.DeltaVerdict.Models;

namespace StellaOps.DeltaVerdict.Engine;

public sealed class DeltaComputationEngine : IDeltaComputationEngine
{
    public DeltaVerdict ComputeDelta(Verdict baseVerdict, Verdict headVerdict)
    {
        // Component diff
        var baseComponents = baseVerdict.Components.ToDictionary(c => c.Purl);
        var headComponents = headVerdict.Components.ToDictionary(c => c.Purl);

        var addedComponents = ComputeAddedComponents(baseComponents, headComponents);
        var removedComponents = ComputeRemovedComponents(baseComponents, headComponents);
        var changedComponents = ComputeChangedComponents(baseComponents, headComponents);

        // Vulnerability diff
        var baseVulns = baseVerdict.Vulnerabilities.ToDictionary(v => v.Id);
        var headVulns = headVerdict.Vulnerabilities.ToDictionary(v => v.Id);

        var addedVulns = ComputeAddedVulnerabilities(baseVulns, headVulns);
        var removedVulns = ComputeRemovedVulnerabilities(baseVulns, headVulns);
        var changedStatuses = ComputeStatusChanges(baseVulns, headVulns);

        // Risk score delta
        var riskDelta = ComputeRiskScoreDelta(baseVerdict.RiskScore, headVerdict.RiskScore);

        // Summary
        var totalChanges =
            addedComponents.Length + removedComponents.Length + changedComponents.Length +
            addedVulns.Length + removedVulns.Length + changedStatuses.Length;
        var summary = new DeltaSummary(
            ComponentsAdded: addedComponents.Length,
            ComponentsRemoved: removedComponents.Length,
            ComponentsChanged: changedComponents.Length,
            VulnerabilitiesAdded: addedVulns.Length,
            VulnerabilitiesRemoved: removedVulns.Length,
            VulnerabilityStatusChanges: changedStatuses.Length,
            TotalChanges: totalChanges,
            Magnitude: ClassifyMagnitude(totalChanges));

        return new DeltaVerdict
        {
            DeltaId = Guid.NewGuid().ToString(),
            SchemaVersion = "1.0.0",
            BaseVerdict = CreateVerdictReference(baseVerdict),
            HeadVerdict = CreateVerdictReference(headVerdict),
            AddedComponents = addedComponents,
            RemovedComponents = removedComponents,
            ChangedComponents = changedComponents,
            AddedVulnerabilities = addedVulns,
            RemovedVulnerabilities = removedVulns,
            ChangedVulnerabilityStatuses = changedStatuses,
            RiskScoreDelta = riskDelta,
            Summary = summary,
            ComputedAt = DateTimeOffset.UtcNow
        };
    }

    private static ImmutableArray<ComponentDelta> ComputeAddedComponents(
        Dictionary<string, Component> baseComponents,
        Dictionary<string, Component> headComponents)
    {
        return headComponents
            .Where(kv => !baseComponents.ContainsKey(kv.Key))
            .Select(kv => new ComponentDelta(
                kv.Value.Purl,
                kv.Value.Name,
                kv.Value.Version,
                kv.Value.Type,
                kv.Value.Vulnerabilities.ToImmutableArray()))
            .ToImmutableArray();
    }

    private static RiskScoreDelta ComputeRiskScoreDelta(decimal oldScore, decimal newScore)
    {
        var change = newScore - oldScore;
        var percentChange = oldScore > 0 ? (change / oldScore) * 100 : (newScore > 0 ? 100 : 0);
        var trend = change switch
        {
            < 0 => RiskTrend.Improved,
            > 0 => RiskTrend.Degraded,
            _ => RiskTrend.Stable
        };

        return new RiskScoreDelta(oldScore, newScore, change, percentChange, trend);
    }

    private static DeltaMagnitude ClassifyMagnitude(int totalChanges) => totalChanges switch
    {
        0 => DeltaMagnitude.None,
        <= 5 => DeltaMagnitude.Minimal,
        <= 20 => DeltaMagnitude.Small,
        <= 50 => DeltaMagnitude.Medium,
        <= 100 => DeltaMagnitude.Large,
        _ => DeltaMagnitude.Major
    };
}
```

**Acceptance Criteria**:
- [ ] Compute component diffs (add/remove/change)
- [ ] Compute vulnerability diffs
- [ ] Calculate risk score delta
- [ ] Classify magnitude
- [ ] Handle edge cases (empty verdicts, identical verdicts)
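The add/remove/change computation reduces to set operations over purl-keyed maps. A minimal sketch of that core (Python for brevity; the dictionaries stand in for the engine's `ToDictionary(c => c.Purl)` lookups), with sorted outputs to mirror the determinism requirement:

```python
def diff_keys(base: dict, head: dict):
    """Added/removed/changed sets for two purl -> version maps, in stable order."""
    added = sorted(set(head) - set(base))
    removed = sorted(set(base) - set(head))
    # A component is "changed" when the purl is in both maps but the value differs.
    changed = sorted(k for k in set(base) & set(head) if base[k] != head[k])
    return added, removed, changed

base = {"pkg:npm/a": "1.0.0", "pkg:npm/b": "2.0.0"}
head = {"pkg:npm/b": "2.1.0", "pkg:npm/c": "0.1.0"}
print(diff_keys(base, head))
# → (['pkg:npm/c'], ['pkg:npm/a'], ['pkg:npm/b'])
```

Identical inputs yield three empty lists, which is the "empty delta" edge case from the acceptance criteria.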

---

### T3: Delta Signing Service

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Sign delta verdicts using the DSSE format.

**Implementation Path**: `src/__Libraries/StellaOps.DeltaVerdict/Signing/DeltaSigningService.cs`

**Implementation**:
```csharp
namespace StellaOps.DeltaVerdict.Signing;

public sealed class DeltaSigningService : IDeltaSigningService
{
    private readonly ISignerService _signer;

    public DeltaSigningService(ISignerService signer) => _signer = signer;

    public async Task<DeltaVerdict> SignAsync(
        DeltaVerdict delta,
        SigningOptions options,
        CancellationToken ct = default)
    {
        // Compute digest of the unsigned delta
        var withDigest = DeltaVerdictSerializer.WithDigest(delta);

        // Create DSSE envelope
        var payload = DeltaVerdictSerializer.Serialize(withDigest);
        var envelope = await _signer.CreateDsseEnvelopeAsync(
            payload,
            "application/vnd.stellaops.delta-verdict+json",
            options,
            ct);

        return withDigest with { Signature = envelope.Signature };
    }

    public async Task<VerificationResult> VerifyAsync(
        DeltaVerdict delta,
        VerificationOptions options,
        CancellationToken ct = default)
    {
        if (string.IsNullOrEmpty(delta.Signature))
            return VerificationResult.Fail("Delta is not signed");

        // Verify signature
        var payload = DeltaVerdictSerializer.Serialize(delta with { Signature = null });
        var result = await _signer.VerifyDsseEnvelopeAsync(
            payload,
            delta.Signature,
            options,
            ct);

        // Verify digest
        if (delta.DeltaDigest != null)
        {
            var computed = DeltaVerdictSerializer.ComputeDigest(delta);
            if (computed != delta.DeltaDigest)
                return VerificationResult.Fail("Delta digest mismatch");
        }

        return result;
    }
}
```

**Acceptance Criteria**:
- [ ] Sign deltas with DSSE format
- [ ] Verify signatures
- [ ] Verify digest integrity
- [ ] Support multiple key types

---

### T4: Risk Budget Evaluator

**Assignee**: Policy Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Evaluate deltas against risk budgets for release gates.

**Implementation Path**: `src/__Libraries/StellaOps.DeltaVerdict/Policy/RiskBudgetEvaluator.cs`

**Implementation**:
```csharp
using System.Collections.Immutable;

namespace StellaOps.DeltaVerdict.Policy;

/// <summary>
/// Evaluates delta verdicts against risk budgets for release gates.
/// </summary>
public sealed class RiskBudgetEvaluator : IRiskBudgetEvaluator
{
    public RiskBudgetResult Evaluate(DeltaVerdict delta, RiskBudget budget)
    {
        var violations = new List<RiskBudgetViolation>();

        // Check new critical vulnerabilities
        var criticalAdded = delta.AddedVulnerabilities
            .Count(v => v.Severity == "critical");
        if (criticalAdded > budget.MaxNewCriticalVulnerabilities)
        {
            violations.Add(new RiskBudgetViolation(
                "CriticalVulnerabilities",
                $"Added {criticalAdded} critical vulnerabilities (budget: {budget.MaxNewCriticalVulnerabilities})"));
        }

        // Check new high vulnerabilities
        var highAdded = delta.AddedVulnerabilities
            .Count(v => v.Severity == "high");
        if (highAdded > budget.MaxNewHighVulnerabilities)
        {
            violations.Add(new RiskBudgetViolation(
                "HighVulnerabilities",
                $"Added {highAdded} high vulnerabilities (budget: {budget.MaxNewHighVulnerabilities})"));
        }

        // Check risk score increase
        if (delta.RiskScoreDelta.Change > budget.MaxRiskScoreIncrease)
        {
            violations.Add(new RiskBudgetViolation(
                "RiskScoreIncrease",
                $"Risk score increased by {delta.RiskScoreDelta.Change} (budget: {budget.MaxRiskScoreIncrease})"));
        }

        // Check magnitude threshold
        if ((int)delta.Summary.Magnitude > (int)budget.MaxMagnitude)
        {
            violations.Add(new RiskBudgetViolation(
                "DeltaMagnitude",
                $"Delta magnitude {delta.Summary.Magnitude} exceeds budget {budget.MaxMagnitude}"));
        }

        // Check specific vulnerability additions
        foreach (var vuln in delta.AddedVulnerabilities)
        {
            if (budget.BlockedVulnerabilities.Contains(vuln.VulnerabilityId))
            {
                violations.Add(new RiskBudgetViolation(
                    "BlockedVulnerability",
                    $"Added blocked vulnerability {vuln.VulnerabilityId}"));
            }
        }

        return new RiskBudgetResult(
            IsWithinBudget: violations.Count == 0,
            Violations: violations,
            Delta: delta,
            Budget: budget);
    }
}

public sealed record RiskBudget
{
    public int MaxNewCriticalVulnerabilities { get; init; } = 0;
    public int MaxNewHighVulnerabilities { get; init; } = 3;
    public decimal MaxRiskScoreIncrease { get; init; } = 10;
    public DeltaMagnitude MaxMagnitude { get; init; } = DeltaMagnitude.Medium;
    public ImmutableHashSet<string> BlockedVulnerabilities { get; init; } = [];
}

public sealed record RiskBudgetResult(
    bool IsWithinBudget,
    IReadOnlyList<RiskBudgetViolation> Violations,
    DeltaVerdict Delta,
    RiskBudget Budget);

public sealed record RiskBudgetViolation(string Category, string Message);
```

**Acceptance Criteria**:
- [ ] Check new vulnerability counts
- [ ] Check risk score increases
- [ ] Check magnitude thresholds
- [ ] Check blocked vulnerabilities
- [ ] Return detailed violations
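The gate logic is "collect every violated limit, pass only when none are violated" rather than failing fast, so operators see all violations at once. A minimal sketch of that shape (Python; the dictionary field names are illustrative stand-ins for the C# records):

```python
def evaluate_budget(delta: dict, budget: dict) -> tuple[bool, list[str]]:
    """Release gate: accumulate violations instead of failing fast."""
    violations = []
    if delta["critical_added"] > budget["max_new_critical"]:
        violations.append("CriticalVulnerabilities")
    if delta["risk_score_change"] > budget["max_risk_increase"]:
        violations.append("RiskScoreIncrease")
    # Any explicitly blocked vulnerability is an individual violation.
    blocked = set(delta["added_vuln_ids"]) & set(budget["blocked_vulns"])
    violations.extend(f"BlockedVulnerability:{v}" for v in sorted(blocked))
    return len(violations) == 0, violations

budget = {"max_new_critical": 0, "max_risk_increase": 10,
          "blocked_vulns": ["CVE-2021-44228"]}
delta = {"critical_added": 1, "risk_score_change": 4,
         "added_vuln_ids": ["CVE-2021-44228"]}
print(evaluate_budget(delta, budget))
# → (False, ['CriticalVulnerabilities', 'BlockedVulnerability:CVE-2021-44228'])
```

An empty violation list is the only passing outcome, which matches `IsWithinBudget: violations.Count == 0`.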

---

### T5: OCI Attachment Support

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1, T3

**Description**:
Attach delta verdicts to OCI artifacts.

**Implementation Path**: `src/__Libraries/StellaOps.DeltaVerdict/Oci/DeltaOciAttacher.cs`

**Acceptance Criteria**:
- [ ] Attach delta to OCI artifact
- [ ] Use standardized media type
- [ ] Include base/head references
- [ ] Support cosign-style annotations

---

### T6: CLI Commands

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T5

**Description**:
CLI commands for delta operations.

**Commands**:
```bash
# Compute delta between two verdicts
stella delta compute --base verdict-v1.json --head verdict-v2.json --output delta.json

# Compute and sign delta
stella delta compute --base verdict-v1.json --head verdict-v2.json --sign --output delta.json

# Check delta against risk budget
stella delta check --delta delta.json --budget prod

# Attach delta to OCI artifact
stella delta attach --delta delta.json --artifact registry/image:tag
```

**Acceptance Criteria**:
- [ ] `delta compute` command
- [ ] `delta check` command
- [ ] `delta attach` command
- [ ] JSON output option

---

### T7: Unit Tests

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T4

**Description**:
Comprehensive tests for delta functionality.

**Acceptance Criteria**:
- [ ] Delta computation tests
- [ ] Risk budget evaluation tests
- [ ] Signing/verification tests
- [ ] Edge case tests (empty deltas, identical verdicts)

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | QA Team | Delta-Verdict Domain Model |
| 2 | T2 | TODO | T1 | QA Team | Delta Computation Engine |
| 3 | T3 | TODO | T1 | QA Team | Delta Signing Service |
| 4 | T4 | TODO | T1, T2 | Policy Team | Risk Budget Evaluator |
| 5 | T5 | TODO | T1, T3 | QA Team | OCI Attachment Support |
| 6 | T6 | TODO | T1-T5 | QA Team | CLI Commands |
| 7 | T7 | TODO | T1-T4 | QA Team | Unit Tests |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. Delta verdicts enable diff-aware release gates. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Magnitude thresholds | Decision | Policy Team | Configurable per environment |
| Risk budget defaults | Decision | Policy Team | Conservative for prod, permissive for dev |
| OCI media type | Decision | QA Team | `application/vnd.stellaops.delta-verdict+json` |

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] Delta computation is deterministic
- [ ] Risk budgets block excessive changes
- [ ] Deltas can be signed and verified
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds
docs/implplan/SPRINT_5100_0003_0001_sbom_interop_roundtrip.md
@@ -0,0 +1,639 @@

# Sprint 5100.0003.0001 · SBOM Interop Round-Trip

## Topic & Scope

- Implement comprehensive SBOM interoperability testing with third-party tools.
- Create round-trip tests: Syft → cosign attest → Grype consume → verify findings parity.
- Support both CycloneDX 1.6 and SPDX 3.0.1 formats.
- Establish interop as a release-blocking contract.
- **Working directory:** `tests/interop/` and `src/__Libraries/StellaOps.Interop/`

## Dependencies & Concurrency

- **Upstream**: Sprint 5100.0001.0002 (Evidence Index) for evidence chain tracking
- **Downstream**: CI gates depend on interop pass/fail
- **Safe to parallelize with**: Sprint 5100.0003.0002 (No-Egress)

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- CycloneDX 1.6 specification
- SPDX 3.0.1 specification
- cosign attestation documentation

---

## Tasks

### T1: Interop Test Harness

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Create a test harness for running interop tests with third-party tools.

**Implementation Path**: `tests/interop/StellaOps.Interop.Tests/`

**Implementation**:
```csharp
using System.Security.Cryptography;
using System.Text;

namespace StellaOps.Interop.Tests;

/// <summary>
/// Test harness for SBOM interoperability testing.
/// Coordinates Syft, Grype, Trivy, and cosign tools.
/// </summary>
public sealed class InteropTestHarness : IAsyncLifetime
{
    private readonly ToolManager _toolManager;
    private readonly string _workDir;

    public InteropTestHarness()
    {
        _workDir = Path.Combine(Path.GetTempPath(), $"interop-{Guid.NewGuid():N}");
        _toolManager = new ToolManager(_workDir);
    }

    public async Task InitializeAsync()
    {
        Directory.CreateDirectory(_workDir);

        // Verify tools are available
        await _toolManager.VerifyToolAsync("syft", "--version");
        await _toolManager.VerifyToolAsync("grype", "--version");
        await _toolManager.VerifyToolAsync("cosign", "version");
    }

    /// <summary>
    /// Generate an SBOM using Syft.
    /// </summary>
    public async Task<SbomResult> GenerateSbomWithSyft(
        string imageRef,
        SbomFormat format,
        CancellationToken ct = default)
    {
        var formatArg = format switch
        {
            SbomFormat.CycloneDx16 => "cyclonedx-json",
            SbomFormat.Spdx30 => "spdx-json",
            _ => throw new ArgumentException($"Unsupported format: {format}")
        };

        var outputPath = Path.Combine(_workDir, $"sbom-{format}.json");
        var result = await _toolManager.RunAsync(
            "syft",
            $"{imageRef} -o {formatArg}={outputPath}",
            ct);

        if (!result.Success)
            return SbomResult.Failed(result.Error);

        var content = await File.ReadAllTextAsync(outputPath, ct);
        var digest = ComputeDigest(content);

        return new SbomResult(
            Success: true,
            Path: outputPath,
            Format: format,
            Content: content,
            Digest: digest);
    }

    /// <summary>
    /// Generate an SBOM using the Stella scanner.
    /// </summary>
    public async Task<SbomResult> GenerateSbomWithStella(
        string imageRef,
        SbomFormat format,
        CancellationToken ct = default)
    {
        var formatArg = format switch
        {
            SbomFormat.CycloneDx16 => "cyclonedx",
            SbomFormat.Spdx30 => "spdx",
            _ => throw new ArgumentException($"Unsupported format: {format}")
        };

        var outputPath = Path.Combine(_workDir, $"stella-sbom-{format}.json");
        var result = await _toolManager.RunAsync(
            "stella",
            $"scan {imageRef} --sbom-format {formatArg} --sbom-output {outputPath}",
            ct);

        if (!result.Success)
            return SbomResult.Failed(result.Error);

        var content = await File.ReadAllTextAsync(outputPath, ct);
        var digest = ComputeDigest(content);

        return new SbomResult(
            Success: true,
            Path: outputPath,
            Format: format,
            Content: content,
            Digest: digest);
    }

    /// <summary>
    /// Attest an SBOM using cosign.
    /// </summary>
    public async Task<AttestationResult> AttestWithCosign(
        string sbomPath,
        string imageRef,
        CancellationToken ct = default)
    {
        var result = await _toolManager.RunAsync(
            "cosign",
            $"attest --predicate {sbomPath} --type cyclonedx {imageRef} --yes",
            ct);

        if (!result.Success)
            return AttestationResult.Failed(result.Error);

        return new AttestationResult(Success: true, ImageRef: imageRef);
    }

    /// <summary>
    /// Scan using Grype from an SBOM (no image pull).
    /// </summary>
    public async Task<GrypeScanResult> ScanWithGrypeFromSbom(
        string sbomPath,
        CancellationToken ct = default)
    {
        var outputPath = Path.Combine(_workDir, "grype-findings.json");
        var result = await _toolManager.RunAsync(
            "grype",
            $"sbom:{sbomPath} -o json --file {outputPath}",
            ct);

        if (!result.Success)
            return GrypeScanResult.Failed(result.Error);

        var content = await File.ReadAllTextAsync(outputPath, ct);
        var findings = ParseGrypeFindings(content);

        return new GrypeScanResult(
            Success: true,
            Findings: findings,
            RawOutput: content);
    }

    /// <summary>
    /// Compare findings between Stella and Grype.
    /// </summary>
    public FindingsComparisonResult CompareFindings(
        IReadOnlyList<Finding> stellaFindings,
        IReadOnlyList<GrypeFinding> grypeFindings,
        decimal tolerancePercent = 5)
    {
        var stellaVulns = stellaFindings
            .Select(f => (f.VulnerabilityId, f.PackagePurl))
            .ToHashSet();

        var grypeVulns = grypeFindings
            .Select(f => (f.VulnerabilityId, f.PackagePurl))
            .ToHashSet();

        var onlyInStella = stellaVulns.Except(grypeVulns).ToList();
        var onlyInGrype = grypeVulns.Except(stellaVulns).ToList();
        var inBoth = stellaVulns.Intersect(grypeVulns).ToList();

        var totalUnique = stellaVulns.Union(grypeVulns).Count();
        var parityPercent = totalUnique > 0
            ? (decimal)inBoth.Count / totalUnique * 100
            : 100;

        return new FindingsComparisonResult(
            ParityPercent: parityPercent,
            IsWithinTolerance: parityPercent >= (100 - tolerancePercent),
            StellaTotalFindings: stellaFindings.Count,
            GrypeTotalFindings: grypeFindings.Count,
            MatchingFindings: inBoth.Count,
            OnlyInStella: onlyInStella.Count,
            OnlyInGrype: onlyInGrype.Count,
            OnlyInStellaDetails: onlyInStella,
            OnlyInGrypeDetails: onlyInGrype);
    }

    public Task DisposeAsync()
    {
        if (Directory.Exists(_workDir))
            Directory.Delete(_workDir, recursive: true);
        return Task.CompletedTask;
    }

    private static string ComputeDigest(string content) =>
        Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(content))).ToLowerInvariant();
}

public enum SbomFormat
{
    CycloneDx16,
    Spdx30
}

public sealed record SbomResult(
    bool Success,
    string? Path = null,
    SbomFormat? Format = null,
    string? Content = null,
    string? Digest = null,
    string? Error = null)
{
    public static SbomResult Failed(string error) => new(false, Error: error);
}

public sealed record FindingsComparisonResult(
    decimal ParityPercent,
    bool IsWithinTolerance,
    int StellaTotalFindings,
    int GrypeTotalFindings,
    int MatchingFindings,
    int OnlyInStella,
    int OnlyInGrype,
    IReadOnlyList<(string VulnId, string Purl)> OnlyInStellaDetails,
    IReadOnlyList<(string VulnId, string Purl)> OnlyInGrypeDetails);
```

**Acceptance Criteria**:
- [ ] Tool management (Syft, Grype, cosign)
- [ ] SBOM generation with both tools
- [ ] Attestation with cosign
- [ ] Findings comparison
- [ ] Parity percentage calculation
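The parity percentage is the matching findings over the union of both tools' finding sets, with 100% when both are empty. A small sketch of the calculation (Python; the tuples mirror the `(VulnerabilityId, PackagePurl)` keys used by `CompareFindings`):

```python
def parity_percent(stella: set, grype: set) -> float:
    """Matching findings over the union of both tools' findings; 100 when both are empty."""
    union = stella | grype
    if not union:
        return 100.0
    return len(stella & grype) / len(union) * 100

stella = {("CVE-1", "pkg:a"), ("CVE-2", "pkg:b")}
grype = {("CVE-1", "pkg:a"), ("CVE-3", "pkg:c")}
print(round(parity_percent(stella, grype), 1))
# → 33.3
```

Note the denominator is the union, not either tool's count, so findings unique to either tool lower parity symmetrically.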

---

### T2: CycloneDX 1.6 Round-Trip Tests

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Complete round-trip tests for the CycloneDX 1.6 format.

**Implementation Path**: `tests/interop/StellaOps.Interop.Tests/CycloneDx/CycloneDxRoundTripTests.cs`

**Test Cases**:
```csharp
[Trait("Category", "Interop")]
[Trait("Format", "CycloneDX")]
public class CycloneDxRoundTripTests : IClassFixture<InteropTestHarness>
{
    private readonly InteropTestHarness _harness;

    public CycloneDxRoundTripTests(InteropTestHarness harness) => _harness = harness;

    [Theory]
    [MemberData(nameof(TestImages))]
    public async Task Syft_GeneratesCycloneDx_GrypeCanConsume(string imageRef)
    {
        // Generate SBOM with Syft
        var sbomResult = await _harness.GenerateSbomWithSyft(
            imageRef, SbomFormat.CycloneDx16);
        sbomResult.Success.Should().BeTrue();

        // Scan from SBOM with Grype
        var grypeResult = await _harness.ScanWithGrypeFromSbom(sbomResult.Path);
        grypeResult.Success.Should().BeTrue();

        // Grype should be able to parse and find vulnerabilities
        grypeResult.Findings.Should().NotBeNull();
    }

    [Theory]
    [MemberData(nameof(TestImages))]
    public async Task Stella_GeneratesCycloneDx_GrypeCanConsume(string imageRef)
    {
        // Generate SBOM with Stella
        var sbomResult = await _harness.GenerateSbomWithStella(
            imageRef, SbomFormat.CycloneDx16);
        sbomResult.Success.Should().BeTrue();

        // Scan from SBOM with Grype
        var grypeResult = await _harness.ScanWithGrypeFromSbom(sbomResult.Path);
        grypeResult.Success.Should().BeTrue();
    }

    [Theory]
    [MemberData(nameof(TestImages))]
    public async Task Stella_And_Grype_FindingsParity_Above95Percent(string imageRef)
    {
        // Generate SBOM with Stella
        var stellaSbom = await _harness.GenerateSbomWithStella(
            imageRef, SbomFormat.CycloneDx16);

        // Get Stella findings
        var stellaFindings = await _harness.GetStellaFindings(imageRef);

        // Scan SBOM with Grype
        var grypeResult = await _harness.ScanWithGrypeFromSbom(stellaSbom.Path);

        // Compare findings
        var comparison = _harness.CompareFindings(
            stellaFindings,
            grypeResult.Findings,
            tolerancePercent: 5);

        comparison.ParityPercent.Should().BeGreaterOrEqualTo(95,
            $"Findings parity {comparison.ParityPercent}% is below 95% threshold. " +
            $"Only in Stella: {comparison.OnlyInStella}, Only in Grype: {comparison.OnlyInGrype}");
    }

    [Theory]
    [MemberData(nameof(TestImages))]
    public async Task CycloneDx_Attestation_RoundTrip(string imageRef)
    {
        // Generate SBOM
        var sbomResult = await _harness.GenerateSbomWithStella(
            imageRef, SbomFormat.CycloneDx16);

        // Attest with cosign
        var attestResult = await _harness.AttestWithCosign(
            sbomResult.Path, imageRef);
        attestResult.Success.Should().BeTrue();

        // Verify attestation
        var verifyResult = await _harness.VerifyCosignAttestation(imageRef);
        verifyResult.Success.Should().BeTrue();
|
||||||
|
|
||||||
|
// Digest should match
|
||||||
|
var attestedDigest = verifyResult.PredicateDigest;
|
||||||
|
attestedDigest.Should().Be(sbomResult.Digest);
|
||||||
|
}
|
||||||
|
|
||||||
|
public static IEnumerable<object[]> TestImages =>
|
||||||
|
[
|
||||||
|
["alpine:3.18"],
|
||||||
|
["debian:12-slim"],
|
||||||
|
["node:20-alpine"],
|
||||||
|
["python:3.12-slim"],
|
||||||
|
["golang:1.22-alpine"]
|
||||||
|
];
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] Syft CycloneDX generation test
|
||||||
|
- [ ] Stella CycloneDX generation test
|
||||||
|
- [ ] Grype consumption tests
|
||||||
|
- [ ] Findings parity at 95%+
|
||||||
|
- [ ] Attestation round-trip
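
The 95% parity assertion above reduces to set arithmetic over matched findings. A minimal sketch of that math, assuming findings are keyed by CVE identifier (the `compare_findings` helper here is hypothetical, not the harness's actual `CompareFindings`):

```python
# Parity = matched CVEs / total distinct CVEs across both tools, as a percentage.
def compare_findings(stella_cves: set, grype_cves: set) -> dict:
    matched = stella_cves & grype_cves
    union = stella_cves | grype_cves
    parity = 100.0 if not union else 100.0 * len(matched) / len(union)
    return {
        "parity_percent": round(parity, 2),
        "only_in_stella": sorted(stella_cves - grype_cves),
        "only_in_grype": sorted(grype_cves - stella_cves),
    }

report = compare_findings(
    {"CVE-2024-0001", "CVE-2024-0002", "CVE-2024-0003"},
    {"CVE-2024-0001", "CVE-2024-0002", "CVE-2024-0004"},
)
print(report["parity_percent"])  # 2 matched of 4 distinct -> 50.0
```

Keying on CVE ID alone is the simplest choice; matching on (CVE, PURL) pairs would be stricter and would surface the package-identification differences T4 analyzes.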

---

### T3: SPDX 3.0.1 Round-Trip Tests

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Complete round-trip tests for the SPDX 3.0.1 format.

**Implementation Path**: `tests/interop/StellaOps.Interop.Tests/Spdx/SpdxRoundTripTests.cs`

**Acceptance Criteria**:

- [ ] Syft SPDX generation test
- [ ] Stella SPDX generation test
- [ ] Consumer compatibility tests
- [ ] Schema validation tests
- [ ] Evidence chain verification

---

### T4: Cross-Tool Findings Parity Analysis

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T2, T3

**Description**:
Analyze and document expected differences between tools.

**Implementation Path**: `tests/interop/StellaOps.Interop.Tests/Analysis/FindingsParityAnalyzer.cs`

**Analysis Categories**:

```csharp
public sealed class FindingsParityAnalyzer
{
    /// <summary>
    /// Categorizes differences between tools.
    /// </summary>
    public ParityAnalysisReport Analyze(
        IReadOnlyList<Finding> stellaFindings,
        IReadOnlyList<GrypeFinding> grypeFindings)
    {
        var differences = new List<FindingDifference>();

        // Category 1: Version matching differences
        // (e.g., semver vs non-semver interpretation)
        var versionDiffs = AnalyzeVersionMatchingDifferences(...);

        // Category 2: Feed coverage differences
        // (e.g., Stella has feed X, Grype doesn't)
        var feedDiffs = AnalyzeFeedCoverageDifferences(...);

        // Category 3: Package identification differences
        // (e.g., different PURL generation)
        var purlDiffs = AnalyzePurlDifferences(...);

        // Category 4: VEX application differences
        // (e.g., Stella applies VEX, Grype doesn't)
        var vexDiffs = AnalyzeVexDifferences(...);

        // Aggregate the categories so the totals below are accurate
        differences.AddRange(versionDiffs);
        differences.AddRange(feedDiffs);
        differences.AddRange(purlDiffs);
        differences.AddRange(vexDiffs);

        return new ParityAnalysisReport
        {
            TotalDifferences = differences.Count,
            VersionMatchingDifferences = versionDiffs,
            FeedCoverageDifferences = feedDiffs,
            PurlDifferences = purlDiffs,
            VexDifferences = vexDiffs,
            AcceptableDifferences = differences.Count(d => d.IsAcceptable),
            RequiresInvestigation = differences.Count(d => !d.IsAcceptable)
        };
    }
}
```

**Acceptance Criteria**:

- [ ] Categorize difference types
- [ ] Document acceptable vs concerning differences
- [ ] Generate parity report
- [ ] Track trends over time
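
As a rough illustration of the four categories, a classifier can triage each raw difference in priority order. This is a hypothetical sketch; the field names below are not the analyzer's actual model:

```python
# Triage a difference record into one of the four T4 categories (sketch only).
def categorize(diff: dict) -> str:
    if diff.get("vex_suppressed"):
        return "vex"            # one tool applied a VEX statement, the other didn't
    if diff.get("purl_a") != diff.get("purl_b"):
        return "purl"           # the tools identified the package differently
    if diff.get("feed") not in diff.get("common_feeds", []):
        return "feed-coverage"  # the finding comes from a feed only one tool has
    return "version-matching"   # same package and feed; range logic differs

diffs = [
    {"vex_suppressed": True},
    {"purl_a": "pkg:apk/alpine/zlib", "purl_b": "pkg:generic/zlib"},
    {"purl_a": "p", "purl_b": "p", "feed": "osv", "common_feeds": ["nvd"]},
    {"purl_a": "p", "purl_b": "p", "feed": "nvd", "common_feeds": ["nvd"]},
]
print([categorize(d) for d in diffs])  # ['vex', 'purl', 'feed-coverage', 'version-matching']
```

The priority order matters: a VEX suppression explains the whole difference, so it must be checked before the weaker package-identity and feed heuristics.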

---

### T5: Interop CI Pipeline

**Assignee**: DevOps Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T2, T3, T4

**Description**:
CI pipeline for interop testing.

**Implementation Path**: `.gitea/workflows/interop-e2e.yml`

**Workflow**:

```yaml
name: Interop E2E Tests

on:
  pull_request:
    paths:
      - 'src/Scanner/**'
      - 'src/Excititor/**'
      - 'tests/interop/**'
  schedule:
    - cron: '0 6 * * *'  # Nightly

jobs:
  interop-tests:
    runs-on: ubuntu-22.04
    strategy:
      matrix:
        format: [cyclonedx, spdx]
        arch: [amd64]
        include:
          - format: cyclonedx
            format_flag: cyclonedx-json
          - format: spdx
            format_flag: spdx-json

    steps:
      - uses: actions/checkout@v4

      - name: Install tools
        run: |
          # Install Syft
          curl -sSfL https://raw.githubusercontent.com/anchore/syft/main/install.sh | sh -s -- -b /usr/local/bin

          # Install Grype
          curl -sSfL https://raw.githubusercontent.com/anchore/grype/main/install.sh | sh -s -- -b /usr/local/bin

          # Install cosign
          curl -sSfL https://github.com/sigstore/cosign/releases/latest/download/cosign-linux-amd64 -o /usr/local/bin/cosign
          chmod +x /usr/local/bin/cosign

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '10.0.100'

      - name: Build Stella CLI
        run: dotnet build src/Cli/StellaOps.Cli -c Release

      - name: Run interop tests
        run: |
          dotnet test tests/interop/StellaOps.Interop.Tests \
            --filter "Format=${{ matrix.format }}" \
            --logger "trx;LogFileName=interop-${{ matrix.format }}.trx" \
            --results-directory ./results

      - name: Upload parity report
        uses: actions/upload-artifact@v4
        with:
          name: parity-report-${{ matrix.format }}
          path: ./results/parity-report.json

      - name: Check parity threshold
        run: |
          PARITY=$(jq '.parityPercent' ./results/parity-report.json)
          if (( $(echo "$PARITY < 95" | bc -l) )); then
            echo "::error::Findings parity $PARITY% is below 95% threshold"
            exit 1
          fi
```

**Acceptance Criteria**:

- [ ] Matrix for CycloneDX and SPDX
- [ ] Tool installation steps
- [ ] Parity threshold enforcement
- [ ] Report artifacts
- [ ] Nightly schedule
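
The threshold gate can also be expressed without `jq`/`bc`. A hedged Python equivalent, assuming only the `parityPercent` field used by the workflow step above:

```python
# Python version of the parity gate: read the report, fail below threshold.
import json
import os
import tempfile

def check_parity(report_path: str, threshold: float = 95.0) -> int:
    with open(report_path, encoding="utf-8") as f:
        parity = json.load(f)["parityPercent"]
    if parity < threshold:
        print(f"::error::Findings parity {parity}% is below {threshold}% threshold")
        return 1  # nonzero exit code fails the CI step
    return 0

# Demo with a throwaway report file
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"parityPercent": 96.4}, f)
print(check_parity(f.name))  # 0: above threshold
os.unlink(f.name)
```

Using the `::error::` workflow command keeps the failure annotated in the CI UI either way; the only real difference is avoiding the `bc -l` float comparison.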

---

### T6: Interop Documentation

**Assignee**: QA Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T4

**Description**:
Document interop test results and known differences.

**Implementation Path**: `docs/interop/README.md`

**Acceptance Criteria**:

- [ ] Tool compatibility matrix
- [ ] Known differences documentation
- [ ] Parity expectations per format
- [ ] Troubleshooting guide

---

### T7: Project Setup

**Assignee**: QA Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create the interop test project structure.

**Acceptance Criteria**:

- [ ] Test project compiles
- [ ] Dependencies resolved
- [ ] Tool wrappers functional

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | QA Team | Interop Test Harness |
| 2 | T2 | TODO | T1 | QA Team | CycloneDX 1.6 Round-Trip Tests |
| 3 | T3 | TODO | T1 | QA Team | SPDX 3.0.1 Round-Trip Tests |
| 4 | T4 | TODO | T2, T3 | QA Team | Cross-Tool Findings Parity Analysis |
| 5 | T5 | TODO | T2-T4 | DevOps Team | Interop CI Pipeline |
| 6 | T6 | TODO | T4 | QA Team | Interop Documentation |
| 7 | T7 | TODO | — | QA Team | Project Setup |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. SBOM interop is critical for ecosystem compatibility. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Parity threshold | Decision | QA Team | 95% threshold, adjustable per format |
| Acceptable differences | Decision | QA Team | VEX application expected to differ |
| Tool versions | Risk | QA Team | Pin tool versions for reproducibility |

---

## Success Criteria

- [ ] All 7 tasks marked DONE
- [ ] CycloneDX round-trip at 95%+ parity
- [ ] SPDX round-trip at 95%+ parity
- [ ] CI blocks on parity regression
- [ ] Differences documented and categorized
- [ ] `dotnet test` passes all interop tests

---

**New file** (632 lines): `docs/implplan/SPRINT_5100_0003_0002_no_egress_enforcement.md`

# Sprint 5100.0003.0002 · No-Egress Test Enforcement

## Topic & Scope

- Implement network isolation for air-gap compliance testing.
- Ensure all offline tests run with no network egress.
- Detect and fail tests that attempt network calls.
- Prove air-gap operation works correctly.
- **Working directory:** `tests/offline/` and `.gitea/workflows/`

## Dependencies & Concurrency

- **Upstream**: Sprint 5100.0001.0003 (Offline Bundle Manifest)
- **Downstream**: All offline E2E tests require this infrastructure
- **Safe to parallelize with**: Sprint 5100.0003.0001 (SBOM Interop)

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- `docs/24_OFFLINE_KIT.md`
- Docker/Podman network isolation documentation

---

## Tasks

### T1: Network Isolation Test Base Class

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Create a base class for tests that must run without network access.

**Implementation Path**: `src/__Libraries/StellaOps.Testing.AirGap/NetworkIsolatedTestBase.cs`

**Implementation**:

```csharp
using System.Net;
using System.Net.Sockets;
using System.Runtime.ExceptionServices;

namespace StellaOps.Testing.AirGap;

/// <summary>
/// Base class for tests that must run without network access.
/// Monitors and blocks any network calls during test execution.
/// </summary>
public abstract class NetworkIsolatedTestBase : IAsyncLifetime
{
    private readonly NetworkMonitor _monitor;
    private readonly List<NetworkAttempt> _blockedAttempts = [];

    protected NetworkIsolatedTestBase()
    {
        _monitor = new NetworkMonitor(OnNetworkAttempt);
    }

    public virtual async Task InitializeAsync()
    {
        // Install network interception
        await _monitor.StartMonitoringAsync();

        // Limit each endpoint to a single connection so pooled sockets
        // cannot bypass the monitor (zero is an invalid value and throws)
        ServicePointManager.DefaultConnectionLimit = 1;

        // Block DNS resolution
        _monitor.BlockDns();
    }

    public virtual async Task DisposeAsync()
    {
        await _monitor.StopMonitoringAsync();

        // Fail the test if any network calls were attempted
        if (_blockedAttempts.Count > 0)
        {
            var attempts = string.Join("\n", _blockedAttempts.Select(a =>
                $"  - {a.Host}:{a.Port} at {a.StackTrace}"));
            throw new NetworkIsolationViolationException(
                $"Test attempted {_blockedAttempts.Count} network call(s):\n{attempts}");
        }
    }

    private void OnNetworkAttempt(NetworkAttempt attempt)
    {
        _blockedAttempts.Add(attempt);
    }

    /// <summary>
    /// Asserts that no network calls were made during the test.
    /// </summary>
    protected void AssertNoNetworkCalls()
    {
        if (_blockedAttempts.Count > 0)
        {
            throw new NetworkIsolationViolationException(
                $"Network isolation violated: {_blockedAttempts.Count} attempts blocked");
        }
    }

    /// <summary>
    /// Gets the offline bundle path for this test.
    /// </summary>
    protected string GetOfflineBundlePath() =>
        Environment.GetEnvironmentVariable("STELLAOPS_OFFLINE_BUNDLE")
        ?? Path.Combine(AppContext.BaseDirectory, "fixtures", "offline-bundle");
}

public sealed class NetworkMonitor : IAsyncDisposable
{
    private readonly Action<NetworkAttempt> _onAttempt;
    private bool _isMonitoring;

    public NetworkMonitor(Action<NetworkAttempt> onAttempt)
    {
        _onAttempt = onAttempt;
    }

    public Task StartMonitoringAsync()
    {
        _isMonitoring = true;

        // Observe socket failures via first-chance exceptions
        AppDomain.CurrentDomain.FirstChanceException += OnException;

        return Task.CompletedTask;
    }

    public Task StopMonitoringAsync()
    {
        _isMonitoring = false;
        AppDomain.CurrentDomain.FirstChanceException -= OnException;
        return Task.CompletedTask;
    }

    public void BlockDns()
    {
        // Set environment to prevent DNS lookups
        Environment.SetEnvironmentVariable("RES_OPTIONS", "timeout:0 attempts:0");
    }

    private void OnException(object? sender, FirstChanceExceptionEventArgs e)
    {
        if (!_isMonitoring) return;

        if (e.Exception is SocketException se)
        {
            _onAttempt(new NetworkAttempt(
                Host: "unknown",
                Port: 0,
                StackTrace: se.StackTrace ?? "",
                Timestamp: DateTimeOffset.UtcNow));
        }
    }

    public ValueTask DisposeAsync()
    {
        _isMonitoring = false;
        return ValueTask.CompletedTask;
    }
}

public sealed record NetworkAttempt(
    string Host,
    int Port,
    string StackTrace,
    DateTimeOffset Timestamp);

public sealed class NetworkIsolationViolationException : Exception
{
    public NetworkIsolationViolationException(string message) : base(message) { }
}
```

**Acceptance Criteria**:

- [ ] Base class intercepts network calls
- [ ] Fails test on network attempt
- [ ] Records attempt details with stack trace
- [ ] Configurable via environment variables
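
The interception idea is language-agnostic. A compact Python analogue for illustration only — the real base class hooks .NET first-chance exceptions rather than patching `socket.connect`:

```python
# Record-and-block sketch: patch socket.socket.connect for the duration of a test.
import socket

class NetworkIsolationViolation(Exception):
    pass

class IsolationGuard:
    """Context manager that blocks outbound connects and records the attempts."""

    def __init__(self):
        self.attempts = []
        self._original_connect = socket.socket.connect

    def __enter__(self):
        guard = self

        def blocked_connect(sock, address):
            guard.attempts.append(address)          # record the attempt for teardown
            raise NetworkIsolationViolation(f"blocked connect to {address}")

        socket.socket.connect = blocked_connect
        return self

    def __exit__(self, *exc_info):
        socket.socket.connect = self._original_connect  # restore on teardown
        return False

with IsolationGuard() as guard:
    s = socket.socket()
    try:
        s.connect(("192.0.2.1", 443))  # TEST-NET address; never actually reached
    except NetworkIsolationViolation:
        pass
    finally:
        s.close()
print(len(guard.attempts))  # 1
```

Like the C# base class, the guard both fails the offending call immediately and keeps a record so teardown can report every attempt, not just the first.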

---

### T2: Docker Network Isolation

**Assignee**: DevOps Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Configure Docker/Testcontainers for network-isolated testing.

**Implementation Path**: `src/__Libraries/StellaOps.Testing.AirGap/Docker/IsolatedContainerBuilder.cs`

**Implementation**:

```csharp
namespace StellaOps.Testing.AirGap.Docker;

/// <summary>
/// Builds containers with network isolation for air-gap testing.
/// </summary>
public sealed class IsolatedContainerBuilder
{
    /// <summary>
    /// Creates a container with no network access.
    /// </summary>
    public async Task<IContainer> CreateIsolatedContainerAsync(
        string image,
        IReadOnlyList<string> volumes,
        CancellationToken ct = default)
    {
        var container = new ContainerBuilder()
            .WithImage(image)
            .WithNetwork(NetworkMode.None) // No network!
            .WithAutoRemove(true)
            .WithCleanUp(true);

        foreach (var volume in volumes)
        {
            // Each entry is a "hostPath:containerPath" pair
            var parts = volume.Split(':', 2);
            container = container.WithBindMount(parts[0], parts[1]);
        }

        var built = container.Build();
        await built.StartAsync(ct);

        // Verify isolation
        await VerifyNoNetworkAsync(built, ct);

        return built;
    }

    /// <summary>
    /// Creates an isolated network for multi-container tests.
    /// </summary>
    public async Task<INetwork> CreateIsolatedNetworkAsync(CancellationToken ct = default)
    {
        var network = new NetworkBuilder()
            .WithName($"isolated-{Guid.NewGuid():N}")
            .WithDriver(NetworkDriver.Bridge)
            .WithOption("com.docker.network.bridge.enable_ip_masquerade", "false")
            .Build();

        await network.CreateAsync(ct);
        return network;
    }

    private static async Task VerifyNoNetworkAsync(IContainer container, CancellationToken ct)
    {
        var result = await container.ExecAsync(
            ["ping", "-c", "1", "-W", "1", "8.8.8.8"],
            ct);

        if (result.ExitCode == 0)
        {
            throw new InvalidOperationException(
                "Container has network access - isolation failed!");
        }
    }
}

/// <summary>
/// Extension methods for Testcontainers with isolation.
/// </summary>
public static class ContainerBuilderExtensions
{
    /// <summary>
    /// Configures a container for air-gap testing.
    /// </summary>
    public static ContainerBuilder WithAirGapMode(this ContainerBuilder builder)
    {
        return builder
            .WithNetwork(NetworkMode.None)
            .WithEnvironment("STELLAOPS_OFFLINE_MODE", "true")
            .WithEnvironment("HTTP_PROXY", "")
            .WithEnvironment("HTTPS_PROXY", "")
            .WithEnvironment("NO_PROXY", "*");
    }
}
```

**Acceptance Criteria**:

- [ ] Containers run with NetworkMode.None
- [ ] Verify isolation on container start
- [ ] Multi-container isolated network option
- [ ] Extension methods for easy configuration
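
Taking `WithAirGapMode` at face value, the equivalent raw `docker run` invocation can be sketched as follows. The flag-to-environment mapping mirrors the extension method above; everything else (helper name, mount path) is hypothetical:

```python
# Build the docker argv implied by WithAirGapMode (sketch, not the real wiring).
def airgap_docker_args(image: str, bundle_path: str) -> list:
    return [
        "docker", "run", "--rm",
        "--network", "none",                        # NetworkMode.None
        "-e", "STELLAOPS_OFFLINE_MODE=true",
        "-e", "HTTP_PROXY=", "-e", "HTTPS_PROXY=",  # clear any inherited proxies
        "-e", "NO_PROXY=*",
        "-v", f"{bundle_path}:/bundle:ro",          # bundle mounted read-only
        image,
    ]

args = airgap_docker_args("stellaops/scanner:test", "/tmp/offline-bundle")
print(args[args.index("--network") + 1])  # none
```

Seeing the flags spelled out makes the verification step above concrete: with `--network none` there is no interface to ping 8.8.8.8 over, so the probe must fail.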

---

### T3: Offline E2E Test Suite

**Assignee**: QA Team
**Story Points**: 8
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Complete E2E test suite that runs entirely offline.

**Implementation Path**: `tests/offline/StellaOps.Offline.E2E.Tests/`

**Test Cases**:

```csharp
[Trait("Category", "AirGap")]
[Trait("Category", "E2E")]
public class OfflineE2ETests : NetworkIsolatedTestBase
{
    [Fact]
    public async Task Scan_WithOfflineBundle_ProducesVerdict()
    {
        // Arrange
        var bundlePath = GetOfflineBundlePath();
        var imageTarball = Path.Combine(bundlePath, "images", "test-image.tar");

        // Act
        var result = await RunScannerOfflineAsync(imageTarball, bundlePath);

        // Assert
        result.Success.Should().BeTrue();
        result.Verdict.Should().NotBeNull();
        AssertNoNetworkCalls();
    }

    [Fact]
    public async Task Scan_ProducesSbom_WithOfflineBundle()
    {
        var bundlePath = GetOfflineBundlePath();
        var imageTarball = Path.Combine(bundlePath, "images", "test-image.tar");

        var result = await RunScannerOfflineAsync(imageTarball, bundlePath);

        result.Sbom.Should().NotBeNull();
        result.Sbom.Components.Should().NotBeEmpty();
        AssertNoNetworkCalls();
    }

    [Fact]
    public async Task Attestation_SignAndVerify_WithOfflineBundle()
    {
        var bundlePath = GetOfflineBundlePath();
        var imageTarball = Path.Combine(bundlePath, "images", "test-image.tar");

        // Scan and generate attestation
        var scanResult = await RunScannerOfflineAsync(imageTarball, bundlePath);

        // Sign attestation (offline with local keys)
        var signResult = await SignAttestationOfflineAsync(
            scanResult.Sbom,
            Path.Combine(bundlePath, "keys", "signing-key.pem"));

        signResult.Success.Should().BeTrue();

        // Verify signature (offline with local trust roots)
        var verifyResult = await VerifyAttestationOfflineAsync(
            signResult.Attestation,
            Path.Combine(bundlePath, "certs", "trust-root.pem"));

        verifyResult.Valid.Should().BeTrue();
        AssertNoNetworkCalls();
    }

    [Fact]
    public async Task PolicyEvaluation_WithOfflineBundle_Works()
    {
        var bundlePath = GetOfflineBundlePath();
        var imageTarball = Path.Combine(bundlePath, "images", "vuln-image.tar");

        var scanResult = await RunScannerOfflineAsync(imageTarball, bundlePath);

        // Policy evaluation should work offline
        var policyResult = await EvaluatePolicyOfflineAsync(
            scanResult.Verdict,
            Path.Combine(bundlePath, "policies", "default.rego"));

        policyResult.Should().NotBeNull();
        policyResult.Decision.Should().BeOneOf("allow", "deny", "warn");
        AssertNoNetworkCalls();
    }

    [Fact]
    public async Task Replay_WithOfflineBundle_ProducesIdenticalVerdict()
    {
        var bundlePath = GetOfflineBundlePath();
        var imageTarball = Path.Combine(bundlePath, "images", "test-image.tar");

        // First scan
        var result1 = await RunScannerOfflineAsync(imageTarball, bundlePath);

        // Replay
        var result2 = await ReplayFromManifestOfflineAsync(
            result1.RunManifest,
            bundlePath);

        result1.Verdict.Digest.Should().Be(result2.Verdict.Digest);
        AssertNoNetworkCalls();
    }

    [Fact]
    public async Task VexApplication_WithOfflineBundle_Works()
    {
        var bundlePath = GetOfflineBundlePath();
        var imageTarball = Path.Combine(bundlePath, "images", "vuln-with-vex.tar");

        var scanResult = await RunScannerOfflineAsync(imageTarball, bundlePath);

        // VEX should be applied from the offline bundle
        var vexApplied = scanResult.Verdict.VexStatements.Any();
        vexApplied.Should().BeTrue("VEX from offline bundle should be applied");

        AssertNoNetworkCalls();
    }
}
```

**Acceptance Criteria**:

- [ ] Scan with offline bundle
- [ ] SBOM generation offline
- [ ] Attestation sign/verify offline
- [ ] Policy evaluation offline
- [ ] Replay offline
- [ ] VEX application offline
- [ ] All tests assert no network calls
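
The replay test's digest equality rests on canonical serialization: the same verdict content must always hash to the same digest. A minimal sketch of that contract (the real verdict schema and digest rules are defined elsewhere):

```python
# Determinism contract in miniature: canonical JSON -> stable SHA-256 digest.
import hashlib
import json

def verdict_digest(verdict: dict) -> str:
    # sort_keys + fixed separators give a canonical byte stream
    canonical = json.dumps(verdict, sort_keys=True, separators=(",", ":"))
    return "sha256:" + hashlib.sha256(canonical.encode("utf-8")).hexdigest()

run1 = {"image": "test-image", "decision": "deny", "findings": ["CVE-2024-0001"]}
run2 = {"findings": ["CVE-2024-0001"], "image": "test-image", "decision": "deny"}
print(verdict_digest(run1) == verdict_digest(run2))  # True: key order never changes the digest
```

Any nondeterministic field (a wall-clock timestamp, an unsorted list) breaks this property, which is exactly what the replay test exists to catch.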

---

### T4: CI Network Isolation Workflow

**Assignee**: DevOps Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T3

**Description**:
CI workflow with strict network isolation.

**Implementation Path**: `.gitea/workflows/offline-e2e.yml`

**Workflow**:

```yaml
name: Offline E2E Tests

on:
  pull_request:
    paths:
      - 'src/AirGap/**'
      - 'src/Scanner/**'
      - 'tests/offline/**'
  schedule:
    - cron: '0 4 * * *'  # Nightly at 4 AM

env:
  STELLAOPS_OFFLINE_MODE: 'true'

jobs:
  offline-e2e:
    runs-on: ubuntu-22.04
    # Disable all network access for this job.
    # Note: this requires a self-hosted runner with network policy support
    # or Docker-in-Docker with an isolated network.

    steps:
      - uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '10.0.100'
        # Cache must be pre-populated; no network during test

      - name: Download offline bundle
        run: |
          # Bundle must be pre-built and cached
          cp -r /cache/offline-bundles/latest ./offline-bundle

      - name: Build in isolated environment
        run: |
          # Build must work with no network
          docker run --rm --network none \
            -v $(pwd):/src \
            -v /cache/nuget:/root/.nuget \
            mcr.microsoft.com/dotnet/sdk:10.0 \
            dotnet build /src/tests/offline/StellaOps.Offline.E2E.Tests

      - name: Run offline E2E tests
        run: |
          docker run --rm --network none \
            -v $(pwd):/src \
            -v $(pwd)/offline-bundle:/bundle \
            -e STELLAOPS_OFFLINE_BUNDLE=/bundle \
            -e STELLAOPS_OFFLINE_MODE=true \
            mcr.microsoft.com/dotnet/sdk:10.0 \
            dotnet test /src/tests/offline/StellaOps.Offline.E2E.Tests \
              --logger "trx;LogFileName=offline-e2e.trx" \
              --results-directory /src/results

      - name: Verify no network calls
        run: |
          # Parse test output for any NetworkIsolationViolationException
          if grep -q "NetworkIsolationViolation" ./results/offline-e2e.trx; then
            echo "::error::Tests attempted network calls in offline mode!"
            exit 1
          fi

      - name: Upload results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: offline-e2e-results
          path: ./results/

  verify-isolation:
    runs-on: ubuntu-22.04
    needs: offline-e2e

    steps:
      - name: Verify network isolation was effective
        run: |
          # Check Docker network stats
          # Verify no egress bytes during test window
          echo "Network isolation verification passed"
```

**Acceptance Criteria**:

- [ ] Runs with --network none
- [ ] Pre-populated caches for builds
- [ ] Offline bundle pre-staged
- [ ] Verifies no network calls
- [ ] Uploads results on failure
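
The grep-based verification step can be expressed in Python if more structured handling of the `.trx` file is ever needed. A sketch (the `.trx` content shown is illustrative, not real test output):

```python
# Scan a .trx results file for isolation violations, like the grep step above.
import os
import tempfile

def has_isolation_violation(trx_path: str) -> bool:
    with open(trx_path, encoding="utf-8") as f:
        return "NetworkIsolationViolation" in f.read()

# Demo with a throwaway results file
with tempfile.NamedTemporaryFile("w", suffix=".trx", delete=False) as f:
    f.write("<TestRun><Output>NetworkIsolationViolationException: blocked 2 calls</Output></TestRun>")
print(has_isolation_violation(f.name))  # True
os.unlink(f.name)
```

A substring match keeps parity with the grep in the workflow; parsing the TRX XML properly would additionally let the step report which test violated isolation.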

---

### T5: Offline Bundle Fixtures

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T3

**Description**:
Create pre-packaged offline bundle fixtures for testing.

**Implementation Path**: `tests/fixtures/offline-bundle/`

**Bundle Contents**:

```
tests/fixtures/offline-bundle/
├── manifest.json              # Bundle manifest
├── feeds/
│   ├── nvd-snapshot.json      # NVD feed snapshot
│   ├── ghsa-snapshot.json     # GHSA feed snapshot
│   └── distro/
│       ├── alpine.json
│       ├── debian.json
│       └── rhel.json
├── policies/
│   ├── default.rego           # Default policy
│   └── strict.rego            # Strict policy
├── keys/
│   ├── signing-key.pem        # Test signing key
│   └── signing-key.pub        # Test public key
├── certs/
│   ├── trust-root.pem         # Test trust root
│   └── intermediate.pem       # Test intermediate CA
├── vex/
│   └── vendor-vex.json        # Sample VEX document
└── images/
    ├── test-image.tar         # Basic test image
    ├── vuln-image.tar         # Image with known vulns
    └── vuln-with-vex.tar      # Image with VEX coverage
```

**Acceptance Criteria**:

- [ ] Complete bundle with all components
- [ ] Test images as tarballs
- [ ] Feed snapshots from real feeds
- [ ] Sample VEX documents
- [ ] Test keys and certificates
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
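The "complete bundle with all components" criterion can be checked mechanically by diffing the fixture tree against a required-paths list derived from the layout above. A Python sketch; `REQUIRED_COMPONENTS` here is an illustrative subset, not the authoritative manifest schema:

```python
# One representative path per bundle component from the tree above.
REQUIRED_COMPONENTS = (
    "manifest.json",
    "feeds/nvd-snapshot.json",
    "policies/default.rego",
    "keys/signing-key.pem",
    "certs/trust-root.pem",
    "vex/vendor-vex.json",
    "images/test-image.tar",
)


def missing_components(bundle_entries: set) -> list:
    """Return required bundle paths absent from the fixture tree.

    `bundle_entries` is the set of relative paths found on disk,
    e.g. collected with pathlib's rglob over the bundle root.
    """
    return [p for p in REQUIRED_COMPONENTS if p not in bundle_entries]
```

An empty return value is the pass condition; anything else names exactly which fixtures still need to be staged.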
### T6: Unit Tests

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Unit tests for the network isolation utilities.

**Acceptance Criteria**:

- [ ] NetworkMonitor tests
- [ ] IsolatedContainerBuilder tests
- [ ] Network detection accuracy tests

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | QA Team | Network Isolation Test Base Class |
| 2 | T2 | TODO | — | DevOps Team | Docker Network Isolation |
| 3 | T3 | TODO | T1, T2 | QA Team | Offline E2E Test Suite |
| 4 | T4 | TODO | T3 | DevOps Team | CI Network Isolation Workflow |
| 5 | T5 | TODO | T3 | QA Team | Offline Bundle Fixtures |
| 6 | T6 | TODO | T1, T2 | QA Team | Unit Tests |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. No-egress enforcement is critical for air-gap compliance. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Isolation method | Decision | DevOps Team | Docker `--network none` primary; process-level secondary |
| CI runner requirements | Risk | DevOps Team | May need self-hosted runners for strict isolation |
| Cache pre-population | Decision | DevOps Team | NuGet and tool caches must be pre-built |

---

## Success Criteria

- [ ] All 6 tasks marked DONE
- [ ] All offline E2E tests pass with no network
- [ ] CI workflow verifies network isolation
- [ ] Bundle fixtures complete and working
- [ ] `dotnet test` passes all offline tests
570 docs/implplan/SPRINT_5100_0004_0001_unknowns_budget_ci_gates.md Normal file
@@ -0,0 +1,570 @@
# Sprint 5100.0004.0001 · Unknowns Budget CI Gates

## Topic & Scope

- Integrate unknowns budget enforcement into CI/CD pipelines.
- Create CLI commands for budget checking in CI.
- Add CI workflow for unknowns budget gates.
- Surface unknowns in PR checks and UI.
- **Working directory:** `src/Cli/StellaOps.Cli/Commands/` and `.gitea/workflows/`

## Dependencies & Concurrency

- **Upstream**: Sprint 4100.0001.0001 (Reason-Coded Unknowns), Sprint 4100.0001.0002 (Unknown Budgets)
- **Downstream**: Release gates depend on budget pass/fail
- **Safe to parallelize with**: Sprint 5100.0003.0001 (SBOM Interop)

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- `docs/product-advisories/19-Dec-2025 - Moat #5.md`
- Sprint 4100.0001.0002 (Unknown Budgets model)

---

## Tasks
### T1: CLI Budget Check Command

**Assignee**: CLI Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Create a CLI command for checking scan results against unknowns budgets.

**Implementation Path**: `src/Cli/StellaOps.Cli/Commands/Budget/BudgetCheckCommand.cs`

**Implementation**:
```csharp
namespace StellaOps.Cli.Commands.Budget;

[Command("budget", Description = "Unknowns budget operations")]
public class BudgetCommand
{
    [Command("check", Description = "Check scan results against unknowns budget")]
    public class CheckCommand
    {
        [Option("--scan-id", Description = "Scan ID to check")]
        public string? ScanId { get; set; }

        [Option("--verdict", Description = "Path to verdict JSON file")]
        public string? VerdictPath { get; set; }

        [Option("--environment", Description = "Environment budget to use (prod, stage, dev)")]
        public string Environment { get; set; } = "prod";

        [Option("--config", Description = "Path to budget configuration file")]
        public string? ConfigPath { get; set; }

        [Option("--fail-on-exceed", Description = "Exit with error code if budget exceeded")]
        public bool FailOnExceed { get; set; } = true;

        [Option("--output", Description = "Output format (text, json, sarif)")]
        public string Output { get; set; } = "text";

        public async Task<int> ExecuteAsync(
            IUnknownBudgetService budgetService,
            IConsole console,
            CancellationToken ct)
        {
            // Load verdict
            var verdict = await LoadVerdictAsync(ct);
            if (verdict == null)
            {
                console.Error.WriteLine("Failed to load verdict");
                return 1;
            }

            // Load budget configuration
            var budget = await LoadBudgetAsync(budgetService, ct);

            // Check budget
            var result = budgetService.CheckBudget(Environment, verdict.Unknowns);

            // Output result
            await OutputResultAsync(result, console, ct);

            // Return exit code
            if (FailOnExceed && !result.IsWithinBudget)
            {
                console.Error.WriteLine($"Budget exceeded: {result.Message}");
                return 2; // Distinct exit code for budget failure
            }

            return 0;
        }

        private async Task OutputResultAsync(
            BudgetCheckResult result,
            IConsole console,
            CancellationToken ct)
        {
            switch (Output.ToLower())
            {
                case "json":
                    var json = JsonSerializer.Serialize(result, new JsonSerializerOptions
                    {
                        WriteIndented = true
                    });
                    console.Out.WriteLine(json);
                    break;

                case "sarif":
                    var sarif = ConvertToSarif(result);
                    console.Out.WriteLine(sarif);
                    break;

                default:
                    OutputTextResult(result, console);
                    break;
            }
        }

        private static void OutputTextResult(BudgetCheckResult result, IConsole console)
        {
            var status = result.IsWithinBudget ? "[PASS]" : "[FAIL]";
            console.Out.WriteLine($"{status} Unknowns Budget Check");
            console.Out.WriteLine($"  Environment: {result.Environment}");
            console.Out.WriteLine($"  Total Unknowns: {result.TotalUnknowns}");

            if (result.TotalLimit.HasValue)
                console.Out.WriteLine($"  Budget Limit: {result.TotalLimit}");

            if (result.Violations.Count > 0)
            {
                console.Out.WriteLine("\n  Violations:");
                foreach (var (code, violation) in result.Violations)
                {
                    console.Out.WriteLine($"    - {code}: {violation.Count}/{violation.Limit}");
                }
            }

            if (!string.IsNullOrEmpty(result.Message))
                console.Out.WriteLine($"\n  Message: {result.Message}");
        }

        private static string ConvertToSarif(BudgetCheckResult result)
        {
            // Convert to SARIF format for integration with GitHub/GitLab
            var sarif = new
            {
                version = "2.1.0",
                runs = new[]
                {
                    new
                    {
                        tool = new { driver = new { name = "StellaOps Budget Check" } },
                        results = result.Violations.Select(v => new
                        {
                            ruleId = $"UNKNOWN_{v.Key}",
                            level = "error",
                            message = new { text = $"{v.Key}: {v.Value.Count} unknowns exceed limit of {v.Value.Limit}" }
                        })
                    }
                }
            };
            // Indented so downstream assertions on `"version": "2.1.0"` match
            return JsonSerializer.Serialize(sarif, new JsonSerializerOptions { WriteIndented = true });
        }
    }
}
```

**Acceptance Criteria**:

- [ ] `stella budget check` command
- [ ] Support verdict file or scan ID
- [ ] Environment-based budget selection
- [ ] Exit codes for CI integration
- [ ] JSON, text, SARIF output formats

---
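The exit-code contract is what CI gates key on: 0 = within budget, 1 = load/config error, 2 = budget exceeded. As a language-neutral sketch of that decision logic (Python, illustrative — not the planned implementation):

```python
EXIT_OK = 0        # budget satisfied
EXIT_ERROR = 1     # could not load verdict or budget config
EXIT_EXCEEDED = 2  # budget exceeded (distinct code so CI can gate on it)


def exit_code(verdict_loaded: bool, within_budget: bool,
              fail_on_exceed: bool = True) -> int:
    """Map a budget-check outcome to the CLI's exit code."""
    if not verdict_loaded:
        return EXIT_ERROR
    if fail_on_exceed and not within_budget:
        return EXIT_EXCEEDED
    return EXIT_OK
```

Keeping the "exceeded" code distinct from the generic error code lets a pipeline distinguish "policy said no" from "the tool broke", which matters for the warn-vs-fail behavior in non-prod environments.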
### T2: CI Budget Gate Workflow

**Assignee**: DevOps Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
CI workflow for enforcing unknowns budgets on PRs.

**Implementation Path**: `.gitea/workflows/unknowns-gate.yml`

**Workflow**:
```yaml
name: Unknowns Budget Gate

on:
  pull_request:
    paths:
      - 'src/**'
      - 'Dockerfile*'
      - '*.lock'
  push:
    branches: [main]

env:
  STELLAOPS_BUDGET_CONFIG: ./etc/policy.unknowns.yaml

jobs:
  scan-and-check-budget:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '10.0.100'

      - name: Build CLI
        run: dotnet build src/Cli/StellaOps.Cli -c Release

      - name: Determine environment
        id: env
        run: |
          if [[ "${{ github.ref }}" == "refs/heads/main" ]]; then
            echo "environment=prod" >> $GITHUB_OUTPUT
          elif [[ "${{ github.event_name }}" == "pull_request" ]]; then
            echo "environment=stage" >> $GITHUB_OUTPUT
          else
            echo "environment=dev" >> $GITHUB_OUTPUT
          fi

      - name: Scan container image
        id: scan
        run: |
          ./out/stella scan ${{ env.IMAGE_REF }} \
            --output verdict.json \
            --sbom-output sbom.json

      - name: Check unknowns budget
        id: budget
        continue-on-error: true
        run: |
          ./out/stella budget check \
            --verdict verdict.json \
            --environment ${{ steps.env.outputs.environment }} \
            --config ${{ env.STELLAOPS_BUDGET_CONFIG }} \
            --output json \
            --fail-on-exceed > budget-result.json

          echo "result=$(cat budget-result.json | jq -c '.')" >> $GITHUB_OUTPUT

      - name: Upload budget report
        uses: actions/upload-artifact@v4
        with:
          name: budget-report
          path: budget-result.json

      - name: Post PR comment
        if: github.event_name == 'pull_request'
        uses: actions/github-script@v7
        with:
          script: |
            const result = ${{ steps.budget.outputs.result }};
            const status = result.isWithinBudget ? ':white_check_mark:' : ':x:';
            const body = `## ${status} Unknowns Budget Check

            | Metric | Value |
            |--------|-------|
            | Environment | ${result.environment || '${{ steps.env.outputs.environment }}'} |
            | Total Unknowns | ${result.totalUnknowns} |
            | Budget Limit | ${result.totalLimit || 'Unlimited'} |
            | Status | ${result.isWithinBudget ? 'PASS' : 'FAIL'} |

            ${result.violations?.length > 0 ? `
            ### Violations
            ${result.violations.map(v => `- **${v.code}**: ${v.count}/${v.limit}`).join('\n')}
            ` : ''}

            ${result.message || ''}
            `;

            github.rest.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: body
            });

      - name: Fail if budget exceeded (prod)
        if: steps.env.outputs.environment == 'prod' && steps.budget.outcome == 'failure'
        run: |
          echo "::error::Production unknowns budget exceeded!"
          exit 1

      - name: Warn if budget exceeded (non-prod)
        if: steps.env.outputs.environment != 'prod' && steps.budget.outcome == 'failure'
        run: |
          echo "::warning::Unknowns budget exceeded for ${{ steps.env.outputs.environment }}"
```

**Acceptance Criteria**:

- [ ] Runs on PRs and pushes
- [ ] Environment detection (prod/stage/dev)
- [ ] Budget check with appropriate config
- [ ] PR comment with results
- [ ] Fail for prod, warn for non-prod

---
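The "Determine environment" step above is plain branch logic; mirroring it as a pure function makes the selection rule unit-testable outside CI. A Python sketch of the same branching (illustrative, not part of the planned tooling):

```python
def detect_environment(ref: str, event_name: str) -> str:
    """Mirror of the workflow's environment-selection step.

    Pushes to main are gated as prod, pull requests as stage,
    everything else falls back to dev.
    """
    if ref == "refs/heads/main":
        return "prod"
    if event_name == "pull_request":
        return "stage"
    return "dev"
```

Because the ref check comes first, a hypothetical pull_request event targeting `refs/heads/main` would still resolve to prod, which is the stricter and therefore safer default.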
### T3: GitHub/GitLab PR Integration

**Assignee**: DevOps Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Rich PR integration for unknowns budget results.

**Implementation Path**: `src/Cli/StellaOps.Cli/Commands/Budget/`

**Features**:

- Status check annotations
- PR comments with budget summary
- SARIF upload for code scanning integration

**Acceptance Criteria**:

- [ ] GitHub status checks
- [ ] GitLab merge request comments
- [ ] SARIF format for security tab
- [ ] Deep links to unknowns in UI

---
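The "SARIF format for security tab" feature amounts to emitting a minimal SARIF 2.1.0 document whose results are the budget violations. A hedged Python sketch of that shape — the rule-ID and message conventions mirror the C# `ConvertToSarif` in T1, but this is illustrative, not the planned implementation:

```python
import json


def to_sarif(violations: dict) -> str:
    """Render budget violations as a minimal SARIF 2.1.0 document.

    `violations` maps reason code -> (count, limit). Sorting keeps
    the output deterministic, in line with the determinism contract.
    """
    doc = {
        "version": "2.1.0",
        "runs": [{
            "tool": {"driver": {"name": "StellaOps Budget Check"}},
            "results": [
                {
                    "ruleId": f"UNKNOWN_{code}",
                    "level": "error",
                    "message": {
                        "text": f"{code}: {count} unknowns exceed limit of {limit}"
                    },
                }
                for code, (count, limit) in sorted(violations.items())
            ],
        }],
    }
    return json.dumps(doc, indent=2)
```

A document of this shape is what the GitHub/GitLab security tabs ingest, so each violation shows up as a distinct rule rather than one opaque failure.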
### T4: Unknowns Dashboard Integration

**Assignee**: UI Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Surface unknowns budget status in the web UI.

**Implementation Path**: `src/Web/StellaOps.Web/src/app/components/unknowns-budget/`

**Components**:
```typescript
// unknowns-budget-widget.component.ts
@Component({
  selector: 'stella-unknowns-budget-widget',
  template: `
    <div class="budget-widget" [class.exceeded]="result?.isWithinBudget === false">
      <h3>Unknowns Budget</h3>

      <div class="budget-meter">
        <div class="meter-fill" [style.width.%]="usagePercent"></div>
        <span class="meter-label">{{ result?.totalUnknowns }} / {{ result?.totalLimit || '∞' }}</span>
      </div>

      <div class="budget-status">
        <span [class]="statusClass">{{ statusText }}</span>
      </div>

      <div class="violations" *ngIf="result && (result.violations | keyvalue).length > 0">
        <h4>Violations by Reason</h4>
        <ul>
          <li *ngFor="let v of result.violations | keyvalue">
            <span class="code">{{ v.key }}</span>:
            {{ v.value.count }} / {{ v.value.limit }}
          </li>
        </ul>
      </div>

      <div class="unknowns-list" *ngIf="showDetails">
        <h4>Unknown Items</h4>
        <stella-unknown-item
          *ngFor="let unknown of unknowns"
          [unknown]="unknown">
        </stella-unknown-item>
      </div>
    </div>
  `
})
export class UnknownsBudgetWidgetComponent {
  @Input() result!: BudgetCheckResult;
  @Input() unknowns: Unknown[] = [];
  @Input() showDetails = false;

  get usagePercent(): number {
    if (!this.result?.totalLimit) return 0;
    return (this.result.totalUnknowns / this.result.totalLimit) * 100;
  }

  get statusClass(): string {
    return this.result?.isWithinBudget ? 'status-pass' : 'status-fail';
  }

  get statusText(): string {
    return this.result?.isWithinBudget ? 'Within Budget' : 'Budget Exceeded';
  }
}
```

**Acceptance Criteria**:

- [ ] Budget meter visualization
- [ ] Violation breakdown
- [ ] Unknowns list with details
- [ ] Status badge component

---
### T5: Attestation Integration

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Description**:
Include unknowns budget status in attestations.

**Implementation Path**: `src/Attestor/__Libraries/StellaOps.Attestor.Predicates/`

**Predicate Extension**:
```csharp
public sealed record VerdictPredicate
{
    // Existing fields...

    /// <summary>
    /// Unknowns budget evaluation result.
    /// </summary>
    public UnknownsBudgetPredicate? UnknownsBudget { get; init; }
}

public sealed record UnknownsBudgetPredicate
{
    public required string Environment { get; init; }
    public required int TotalUnknowns { get; init; }
    public int? TotalLimit { get; init; }
    public required bool IsWithinBudget { get; init; }
    public ImmutableDictionary<string, BudgetViolationPredicate> Violations { get; init; }
        = ImmutableDictionary<string, BudgetViolationPredicate>.Empty;
}

public sealed record BudgetViolationPredicate(
    string ReasonCode,
    int Count,
    int Limit);
```

**Acceptance Criteria**:

- [ ] Unknowns budget in verdict attestation
- [ ] Environment recorded
- [ ] Violations detailed
- [ ] Schema backward compatible

---
### T6: Unit Tests

**Assignee**: QA Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T5

**Description**:
Comprehensive tests for budget gate functionality.

**Test Cases**:
```csharp
public class BudgetCheckCommandTests
{
    [Fact]
    public async Task Execute_WithinBudget_ReturnsZero()
    {
        var verdict = CreateVerdict(unknowns: 2);
        var budget = CreateBudget(limit: 5);

        var result = await ExecuteCommand(verdict, budget, "prod");

        result.ExitCode.Should().Be(0);
    }

    [Fact]
    public async Task Execute_ExceedsBudget_ReturnsTwo()
    {
        var verdict = CreateVerdict(unknowns: 10);
        var budget = CreateBudget(limit: 5);

        var result = await ExecuteCommand(verdict, budget, "prod");

        result.ExitCode.Should().Be(2);
    }

    [Fact]
    public async Task Execute_JsonOutput_ValidJson()
    {
        var verdict = CreateVerdict(unknowns: 3);
        var result = await ExecuteCommand(verdict, output: "json");

        var json = result.Output;
        var parsed = JsonSerializer.Deserialize<BudgetCheckResult>(json);
        parsed.Should().NotBeNull();
    }

    [Fact]
    public async Task Execute_SarifOutput_ValidSarif()
    {
        var verdict = CreateVerdict(unknowns: 3);
        var result = await ExecuteCommand(verdict, output: "sarif");

        var sarif = result.Output;
        sarif.Should().Contain("\"version\": \"2.1.0\"");
    }
}
```

**Acceptance Criteria**:

- [ ] Command exit code tests
- [ ] Output format tests
- [ ] Budget calculation tests
- [ ] CI workflow simulation tests

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | CLI Team | CLI Budget Check Command |
| 2 | T2 | TODO | T1 | DevOps Team | CI Budget Gate Workflow |
| 3 | T3 | TODO | T1 | DevOps Team | GitHub/GitLab PR Integration |
| 4 | T4 | TODO | T1 | UI Team | Unknowns Dashboard Integration |
| 5 | T5 | TODO | T1 | QA Team | Attestation Integration |
| 6 | T6 | TODO | T1-T5 | QA Team | Unit Tests |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. CI gates for unknowns budget enforcement. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Exit codes | Decision | CLI Team | 0=pass, 1=error, 2=budget exceeded |
| PR comment format | Decision | DevOps Team | Markdown table with status emoji |
| Prod enforcement | Decision | DevOps Team | Hard fail for prod, soft warn for others |

---

## Success Criteria

- [ ] All 6 tasks marked DONE
- [ ] CLI command works in CI
- [ ] PR comments display budget status
- [ ] Prod builds fail on budget exceed
- [ ] UI shows budget visualization
- [ ] Attestations include budget status
649 docs/implplan/SPRINT_5100_0005_0001_router_chaos_suite.md Normal file
@@ -0,0 +1,649 @@
# Sprint 5100.0005.0001 · Router Chaos Suite

## Topic & Scope

- Implement chaos testing for router backpressure and resilience.
- Validate HTTP 429/503 responses with Retry-After headers.
- Test graceful degradation under load spikes.
- Verify no data loss during throttling.
- **Working directory:** `tests/load/` and `tests/chaos/`

## Dependencies & Concurrency

- **Upstream**: Router implementation with backpressure (existing)
- **Downstream**: Production confidence in router behavior
- **Safe to parallelize with**: All other Phase 4+ sprints

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- `docs/product-advisories/15-Dec-2025 - Designing 202 + Retry-After Backpressure Control.md`
- Router architecture documentation

---

## Tasks
### T1: Load Test Harness

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Create a load testing harness using k6 or equivalent.

**Implementation Path**: `tests/load/router/`

**k6 Script**:
```javascript
// tests/load/router/spike-test.js
import http from 'k6/http';
import { check, sleep } from 'k6';
import { Rate, Trend } from 'k6/metrics';

// Custom metrics
const throttledRate = new Rate('throttled_requests');
const retryAfterTrend = new Trend('retry_after_seconds');
const recoveryTime = new Trend('recovery_time_ms');

export const options = {
  scenarios: {
    // Normal load baseline
    baseline: {
      executor: 'constant-arrival-rate',
      rate: 100,
      timeUnit: '1s',
      duration: '1m',
      preAllocatedVUs: 50,
    },
    // Spike to 10x
    spike_10x: {
      executor: 'constant-arrival-rate',
      rate: 1000,
      timeUnit: '1s',
      duration: '30s',
      startTime: '1m',
      preAllocatedVUs: 500,
    },
    // Spike to 50x
    spike_50x: {
      executor: 'constant-arrival-rate',
      rate: 5000,
      timeUnit: '1s',
      duration: '30s',
      startTime: '2m',
      preAllocatedVUs: 2000,
    },
    // Recovery observation
    recovery: {
      executor: 'constant-arrival-rate',
      rate: 100,
      timeUnit: '1s',
      duration: '2m',
      startTime: '3m',
      preAllocatedVUs: 50,
    },
  },
  thresholds: {
    // At least 95% of requests should succeed OR return a proper throttle response
    'http_req_failed{expected_response:true}': ['rate<0.05'],
    // Throttled requests should have a Retry-After header
    'throttled_requests': ['rate>0'], // We expect some throttling during spike
    // Recovery should happen within reasonable time
    'recovery_time_ms': ['p(95)<30000'], // 95% recover within 30s
  },
};

const ROUTER_URL = __ENV.ROUTER_URL || 'http://localhost:8080';

export default function () {
  const response = http.post(`${ROUTER_URL}/api/v1/scan`, JSON.stringify({
    image: 'alpine:latest',
  }), {
    headers: { 'Content-Type': 'application/json' },
    tags: { expected_response: 'true' },
  });

  // Check for proper throttle response
  if (response.status === 429 || response.status === 503) {
    throttledRate.add(1);

    // Verify Retry-After header
    const retryAfter = response.headers['Retry-After'];
    check(response, {
      'has Retry-After header': (r) => r.headers['Retry-After'] !== undefined,
      'Retry-After is valid number': (r) => !isNaN(parseInt(r.headers['Retry-After'])),
    });

    if (retryAfter) {
      retryAfterTrend.add(parseInt(retryAfter));
    }
  } else {
    throttledRate.add(0);

    check(response, {
      'status is 200 or 202': (r) => r.status === 200 || r.status === 202,
      'response has body': (r) => r.body && r.body.length > 0,
    });
  }
}

export function handleSummary(data) {
  return {
    'results/spike-test-summary.json': JSON.stringify(data, null, 2),
  };
}
```

**Acceptance Criteria**:

- [ ] k6 test scripts for spike patterns
- [ ] Custom metrics for throttling
- [ ] Threshold definitions
- [ ] Summary output to JSON

---
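The thresholds the k6 script encodes can also be re-checked offline against the JSON summary it writes, which is useful when aggregating runs in CI. A Python sketch under the assumption of a pre-flattened summary — `http_req_failed_rate`, `throttled_rate`, and `recovery_p95_ms` are illustrative keys, not k6's native summary schema:

```python
def thresholds_pass(summary: dict) -> bool:
    """Re-evaluate the spike-test pass criteria from a flattened summary.

    Mirrors the k6 thresholds: <5% hard failures, some throttling
    observed during the spike, and p95 recovery under 30 seconds.
    """
    return (
        summary["http_req_failed_rate"] < 0.05
        and summary["throttled_rate"] > 0
        and summary["recovery_p95_ms"] < 30000
    )
```

Note that `throttled_rate > 0` is a positive expectation: a spike run with zero throttling means backpressure never engaged, which is itself a failure of the experiment.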
### T2: Backpressure Verification Tests
|
||||||
|
|
||||||
|
**Assignee**: QA Team
|
||||||
|
**Story Points**: 5
|
||||||
|
**Status**: TODO
|
||||||
|
**Dependencies**: T1
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Verify router emits correct 429/503 responses with Retry-After.
|
||||||
|
|
||||||
|
**Implementation Path**: `tests/chaos/StellaOps.Chaos.Router.Tests/`
|
||||||
|
|
||||||
|
**Test Cases**:
|
||||||
|
```csharp
|
||||||
|
[Trait("Category", "Chaos")]
|
||||||
|
[Trait("Category", "Router")]
|
||||||
|
public class BackpressureVerificationTests : IClassFixture<RouterTestFixture>
|
||||||
|
{
|
||||||
|
private readonly RouterTestFixture _fixture;
|
||||||
|
|
||||||
|
[Fact]
|
||||||
|
public async Task Router_UnderLoad_Returns429WithRetryAfter()
|
||||||
|
{
|
||||||
|
// Arrange
|
||||||
|
var client = _fixture.CreateClient();
|
||||||
|
var tasks = new List<Task<HttpResponseMessage>>();
|
||||||
|
|
||||||
|
// Act - Send burst of requests
|
||||||
|
for (var i = 0; i < 1000; i++)
|
||||||
|
{
|
||||||
|
tasks.Add(client.PostAsync("/api/v1/scan", CreateScanRequest()));
|
||||||
|
}
|
||||||
|
|
||||||
|
var responses = await Task.WhenAll(tasks);
|
||||||
|
|
||||||
|
// Assert - Some should be throttled
|
||||||
|
var throttled = responses.Where(r => r.StatusCode == HttpStatusCode.TooManyRequests).ToList();
|
||||||
|
throttled.Should().NotBeEmpty("Expected throttling under heavy load");
|
||||||
|
|
||||||
|
foreach (var response in throttled)
|
||||||
|
{
|
||||||
|
response.Headers.Should().Contain(h => h.Key == "Retry-After");
|
||||||
|
var retryAfter = response.Headers.GetValues("Retry-After").First();
|
||||||
|
int.TryParse(retryAfter, out var seconds).Should().BeTrue();
|
||||||
|
            seconds.Should().BeInRange(1, 300, "Retry-After should be reasonable");
        }
    }

    [Fact]
    public async Task Router_UnderLoad_Returns503WhenOverloaded()
    {
        // Arrange - Configure lower limits
        _fixture.ConfigureLowLimits();
        var client = _fixture.CreateClient();

        // Act - Massive burst
        var tasks = Enumerable.Range(0, 5000)
            .Select(_ => client.PostAsync("/api/v1/scan", CreateScanRequest()));
        var responses = await Task.WhenAll(tasks);

        // Assert - Should see 503s when completely overloaded
        var overloaded = responses.Where(r =>
            r.StatusCode == HttpStatusCode.ServiceUnavailable).ToList();

        if (overloaded.Any())
        {
            foreach (var response in overloaded)
            {
                response.Headers.Should().Contain(h => h.Key == "Retry-After");
            }
        }
    }

    [Fact]
    public async Task Router_RetryAfterHonored_EventuallySucceeds()
    {
        var client = _fixture.CreateClient();

        // First request triggers throttle
        var response1 = await client.PostAsync("/api/v1/scan", CreateScanRequest());

        if (response1.StatusCode == HttpStatusCode.TooManyRequests)
        {
            var retryAfter = int.Parse(
                response1.Headers.GetValues("Retry-After").First());

            // Wait for Retry-After duration
            await Task.Delay(TimeSpan.FromSeconds(retryAfter + 1));

            // Retry should succeed
            var response2 = await client.PostAsync("/api/v1/scan", CreateScanRequest());
            response2.StatusCode.Should().BeOneOf(
                HttpStatusCode.OK,
                HttpStatusCode.Accepted);
        }
    }

    [Fact]
    public async Task Router_ThrottleMetrics_AreExposed()
    {
        // Arrange
        var client = _fixture.CreateClient();

        // Trigger some throttling
        await TriggerThrottling(client);

        // Act - Check metrics endpoint
        var metricsResponse = await client.GetAsync("/metrics");
        var metrics = await metricsResponse.Content.ReadAsStringAsync();

        // Assert - Throttle metrics present
        metrics.Should().Contain("router_requests_throttled_total");
        metrics.Should().Contain("router_retry_after_seconds");
        metrics.Should().Contain("router_queue_depth");
    }
}
```

**Acceptance Criteria**:
- [ ] 429 response verification
- [ ] 503 response verification
- [ ] Retry-After header validation
- [ ] Eventual success after wait
- [ ] Metrics exposure verification

---
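The 429/503 contract verified above implies the router can compute a meaningful `Retry-After` value from its rate limiter state. A minimal illustrative sketch (in Python, not the router's actual implementation; `TokenBucket` and its parameters are hypothetical) of deriving the header from a token bucket:

```python
import math

class TokenBucket:
    """Minimal token bucket that reports a Retry-After hint when empty."""

    def __init__(self, capacity: float, refill_per_sec: float, now: float = 0.0):
        self.capacity = capacity
        self.refill_per_sec = refill_per_sec
        self.tokens = capacity
        self.updated = now

    def try_acquire(self, now: float) -> tuple[bool, int]:
        """Returns (allowed, retry_after_seconds); retry_after is 0 when allowed."""
        # Refill based on elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.updated) * self.refill_per_sec)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True, 0
        # Seconds until one full token accrues, rounded up for the header.
        return False, math.ceil((1 - self.tokens) / self.refill_per_sec)

bucket = TokenBucket(capacity=2, refill_per_sec=0.5)
print(bucket.try_acquire(0.0))  # (True, 0)
print(bucket.try_acquire(0.0))  # (True, 0)
print(bucket.try_acquire(0.0))  # (False, 2)
print(bucket.try_acquire(2.0))  # one token refilled: (True, 0)
```

Rounding up keeps the hint conservative: a client that sleeps exactly `Retry-After` seconds should find at least one token available.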

### T3: Recovery and Resilience Tests

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Test router recovery after load spikes.

**Implementation Path**: `tests/chaos/StellaOps.Chaos.Router.Tests/RecoveryTests.cs`

**Test Cases**:
```csharp
public class RecoveryTests : IClassFixture<RouterTestFixture>
{
    [Fact]
    public async Task Router_AfterSpike_RecoveryWithin30Seconds()
    {
        var client = _fixture.CreateClient();
        var stopwatch = Stopwatch.StartNew();

        // Phase 1: Normal operation
        var normalResponse = await client.PostAsync("/api/v1/scan", CreateScanRequest());
        normalResponse.IsSuccessStatusCode.Should().BeTrue();

        // Phase 2: Spike load
        await CreateLoadSpike(client, requestCount: 2000, durationSeconds: 10);

        // Phase 3: Measure recovery
        var recovered = false;
        while (stopwatch.Elapsed < TimeSpan.FromSeconds(60))
        {
            var response = await client.PostAsync("/api/v1/scan", CreateScanRequest());
            if (response.IsSuccessStatusCode)
            {
                recovered = true;
                break;
            }
            await Task.Delay(1000);
        }

        stopwatch.Stop();

        recovered.Should().BeTrue("Router should recover after spike");
        stopwatch.Elapsed.Should().BeLessThan(TimeSpan.FromSeconds(30),
            "Recovery should happen within 30 seconds");
    }

    [Fact]
    public async Task Router_NoDataLoss_DuringThrottling()
    {
        var client = _fixture.CreateClient();
        var submittedIds = new ConcurrentBag<string>();
        var successfulIds = new ConcurrentBag<string>();

        // Submit requests with tracking
        var tasks = Enumerable.Range(0, 500).Select(async _ =>
        {
            var scanId = Guid.NewGuid().ToString();
            submittedIds.Add(scanId);

            var response = await client.PostAsync("/api/v1/scan",
                CreateScanRequest(scanId));

            // If throttled, retry
            while (response.StatusCode == HttpStatusCode.TooManyRequests)
            {
                var retryAfter = int.Parse(
                    response.Headers.GetValues("Retry-After").FirstOrDefault() ?? "5");
                await Task.Delay(TimeSpan.FromSeconds(retryAfter));
                response = await client.PostAsync("/api/v1/scan",
                    CreateScanRequest(scanId));
            }

            if (response.IsSuccessStatusCode)
            {
                successfulIds.Add(scanId);
            }
        });

        await Task.WhenAll(tasks);

        // All submitted requests should eventually succeed
        successfulIds.Should().HaveCount(submittedIds.Count,
            "No data loss - all requests should eventually succeed");
    }

    [Fact]
    public async Task Router_GracefulDegradation_MaintainsPartialService()
    {
        var client = _fixture.CreateClient();

        // Start continuous background load
        var cts = new CancellationTokenSource();
        var backgroundTask = CreateContinuousLoad(client, cts.Token);

        // Allow load to stabilize
        await Task.Delay(5000);

        // Check that some requests are still succeeding
        var successCount = 0;
        for (var i = 0; i < 10; i++)
        {
            var response = await client.PostAsync("/api/v1/scan", CreateScanRequest());
            if (response.IsSuccessStatusCode || response.StatusCode == HttpStatusCode.Accepted)
            {
                successCount++;
            }
            await Task.Delay(100);
        }

        cts.Cancel();
        await backgroundTask;

        successCount.Should().BeGreaterThan(0,
            "Router should maintain partial service under load");
    }
}
```

**Acceptance Criteria**:
- [ ] Recovery within 30 seconds
- [ ] No data loss during throttling
- [ ] Graceful degradation maintained
- [ ] Latencies bounded during spike

---

### T4: Valkey Failure Injection

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T2

**Description**:
Test router behavior when the Valkey cache fails.

**Implementation Path**: `tests/chaos/StellaOps.Chaos.Router.Tests/ValkeyFailureTests.cs`

**Test Cases**:
```csharp
[Trait("Category", "Chaos")]
public class ValkeyFailureTests : IClassFixture<RouterWithValkeyFixture>
{
    [Fact]
    public async Task Router_ValkeyDown_FallsBackToLocal()
    {
        // Arrange
        var client = _fixture.CreateClient();

        // Verify normal operation
        var response1 = await client.PostAsync("/api/v1/scan", CreateScanRequest());
        response1.IsSuccessStatusCode.Should().BeTrue();

        // Kill Valkey
        await _fixture.StopValkeyAsync();

        // Act - Router should degrade gracefully
        var response2 = await client.PostAsync("/api/v1/scan", CreateScanRequest());

        // Assert - Should still work with local rate limiter
        response2.IsSuccessStatusCode.Should().BeTrue(
            "Router should fall back to local rate limiting when Valkey is down");

        // Restore Valkey
        await _fixture.StartValkeyAsync();
    }

    [Fact]
    public async Task Router_ValkeyReconnect_ResumesDistributedLimiting()
    {
        var client = _fixture.CreateClient();

        // Kill and restart Valkey
        await _fixture.StopValkeyAsync();
        await Task.Delay(5000);
        await _fixture.StartValkeyAsync();
        await Task.Delay(2000); // Allow reconnection

        // Check metrics show distributed limiting active
        var metricsResponse = await client.GetAsync("/metrics");
        var metrics = await metricsResponse.Content.ReadAsStringAsync();

        metrics.Should().Contain("rate_limiter_backend=\"distributed\"",
            "Should resume distributed rate limiting after Valkey reconnect");
    }

    [Fact]
    public async Task Router_ValkeyLatency_DoesNotBlock()
    {
        // Configure Valkey with artificial latency
        await _fixture.ConfigureValkeyLatencyAsync(TimeSpan.FromSeconds(2));

        var client = _fixture.CreateClient();
        var stopwatch = Stopwatch.StartNew();

        var response = await client.PostAsync("/api/v1/scan", CreateScanRequest());

        stopwatch.Stop();

        // Request should complete without waiting for slow Valkey
        stopwatch.Elapsed.Should().BeLessThan(TimeSpan.FromSeconds(1),
            "Slow Valkey should not block request processing");
    }
}
```

**Acceptance Criteria**:
- [ ] Fallback to local limiter
- [ ] Automatic reconnection
- [ ] No blocking on Valkey latency
- [ ] Metrics reflect backend state

---

### T5: CI Chaos Workflow

**Assignee**: DevOps Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T4

**Description**:
CI workflow for running chaos tests.

**Implementation Path**: `.gitea/workflows/router-chaos.yml`

**Workflow**:
```yaml
name: Router Chaos Tests

on:
  schedule:
    - cron: '0 3 * * *'  # Nightly at 3 AM
  workflow_dispatch:
    inputs:
      spike_multiplier:
        description: 'Load spike multiplier (e.g., 10, 50, 100)'
        default: '10'

jobs:
  chaos-tests:
    runs-on: ubuntu-22.04

    services:
      postgres:
        image: postgres:16-alpine
        env:
          POSTGRES_PASSWORD: test
        ports:
          - 5432:5432

      valkey:
        image: valkey/valkey:7-alpine
        ports:
          - 6379:6379

    steps:
      - uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '10.0.100'

      - name: Install k6
        run: |
          curl -sSL https://github.com/grafana/k6/releases/download/v0.47.0/k6-v0.47.0-linux-amd64.tar.gz | tar xz
          sudo mv k6-v0.47.0-linux-amd64/k6 /usr/local/bin/

      - name: Start Router
        run: |
          dotnet run --project src/Router/StellaOps.Router &
          sleep 10  # Wait for startup

      - name: Run load spike test
        run: |
          k6 run tests/load/router/spike-test.js \
            -e ROUTER_URL=http://localhost:8080 \
            --out json=results/k6-results.json

      - name: Run chaos unit tests
        run: |
          dotnet test tests/chaos/StellaOps.Chaos.Router.Tests \
            --logger "trx;LogFileName=chaos-results.trx"

      - name: Analyze results
        run: |
          python3 tests/load/analyze-results.py \
            --k6-results results/k6-results.json \
            --chaos-results results/chaos-results.trx \
            --output results/analysis.json

      - name: Check thresholds
        run: |
          python3 tests/load/check-thresholds.py \
            --analysis results/analysis.json \
            --thresholds tests/load/thresholds.json

      - name: Upload results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: chaos-test-results
          path: results/

```

**Acceptance Criteria**:
- [ ] Nightly schedule
- [ ] k6 load tests
- [ ] .NET chaos tests
- [ ] Results analysis
- [ ] Threshold checking

---

### T6: Documentation

**Assignee**: QA Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1-T5

**Description**:
Document the chaos testing approach and results interpretation.

**Acceptance Criteria**:
- [ ] Chaos test runbook
- [ ] Threshold tuning guide
- [ ] Result interpretation guide
- [ ] Recovery playbook

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | QA Team | Load Test Harness |
| 2 | T2 | TODO | T1 | QA Team | Backpressure Verification Tests |
| 3 | T3 | TODO | T1, T2 | QA Team | Recovery and Resilience Tests |
| 4 | T4 | TODO | T2 | QA Team | Valkey Failure Injection |
| 5 | T5 | TODO | T1-T4 | DevOps Team | CI Chaos Workflow |
| 6 | T6 | TODO | T1-T5 | QA Team | Documentation |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. Router chaos testing for production confidence. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Load tool | Decision | QA Team | k6 for scripting flexibility |
| Spike levels | Decision | QA Team | 10x, 50x, 100x normal load |
| Recovery threshold | Decision | QA Team | 30 seconds maximum |

---

## Success Criteria

- [ ] All 6 tasks marked DONE
- [ ] 429/503 responses verified correct
- [ ] Retry-After headers present and valid
- [ ] Recovery within 30 seconds
- [ ] No data loss during throttling
- [ ] Valkey failure handled gracefully

---

<!-- docs/implplan/SPRINT_5100_0006_0001_audit_pack_export_import.md (new file, +790 lines) -->

# Sprint 5100.0006.0001 · Audit Pack Export/Import

## Topic & Scope

- Implement sealed audit pack export for auditors and compliance.
- Bundle: run manifest + offline bundle + evidence + verdict.
- Enable one-command replay in a clean environment.
- Verify signatures under imported trust roots.
- **Working directory:** `src/__Libraries/StellaOps.AuditPack/` and `src/Cli/StellaOps.Cli/Commands/`
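Concretely, the scope above amounts to a single sealed archive. The layout below mirrors the entries written by the exporter in T2 (directory names follow that code; the exact archive name is illustrative):

```text
audit-pack.tar.gz
├── manifest.json          # AuditPack manifest: inventory, digests, metadata
├── run-manifest.json      # replay inputs (Sprint 5100.0001.0001)
├── evidence-index.json    # links verdict to evidence
├── verdict.json           # canonical scan verdict
├── sboms/<id>.json        # CycloneDX / SPDX documents
├── attestations/<id>.json # DSSE envelopes
├── vex/<id>.json          # applied VEX documents
├── trust-roots/<id>.pem   # roots for offline signature verification
├── bundle/                # minimal offline bundle subset (feeds, policies)
└── signature.sig          # DSSE signature over the pack (optional)
```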

## Dependencies & Concurrency

- **Upstream**: Sprint 5100.0001.0001 (Run Manifest), Sprint 5100.0002.0002 (Replay Runner)
- **Downstream**: Auditor workflows, compliance verification
- **Safe to parallelize with**: All other Phase 5 sprints

## Documentation Prerequisites

- `docs/product-advisories/20-Dec-2025 - Testing strategy.md`
- `docs/24_OFFLINE_KIT.md`
- Sprint 5100.0001.0001 (Run Manifest Schema)
- Sprint 5100.0002.0002 (Replay Runner)

---

## Tasks

### T1: Audit Pack Domain Model

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: —

**Description**:
Define the audit pack model and structure.

**Implementation Path**: `src/__Libraries/StellaOps.AuditPack/Models/AuditPack.cs`

**Model Definition**:
```csharp
using System.Collections.Immutable;

namespace StellaOps.AuditPack.Models;

/// <summary>
/// A sealed, self-contained audit pack for verification and compliance.
/// Contains all inputs and outputs required to reproduce and verify a scan.
/// </summary>
public sealed record AuditPack
{
    /// <summary>
    /// Unique identifier for this audit pack.
    /// </summary>
    public required string PackId { get; init; }

    /// <summary>
    /// Schema version for forward compatibility.
    /// </summary>
    public required string SchemaVersion { get; init; } = "1.0.0";

    /// <summary>
    /// Human-readable name for this pack.
    /// </summary>
    public required string Name { get; init; }

    /// <summary>
    /// UTC timestamp when pack was created.
    /// </summary>
    public required DateTimeOffset CreatedAt { get; init; }

    /// <summary>
    /// Run manifest for replay.
    /// </summary>
    public required RunManifest RunManifest { get; init; }

    /// <summary>
    /// Evidence index linking verdict to all evidence.
    /// </summary>
    public required EvidenceIndex EvidenceIndex { get; init; }

    /// <summary>
    /// The verdict from the scan.
    /// </summary>
    public required Verdict Verdict { get; init; }

    /// <summary>
    /// Offline bundle manifest (contents stored separately).
    /// </summary>
    public required BundleManifest OfflineBundle { get; init; }

    /// <summary>
    /// All attestations in the evidence chain.
    /// </summary>
    public required ImmutableArray<Attestation> Attestations { get; init; }

    /// <summary>
    /// SBOM documents (CycloneDX and SPDX).
    /// </summary>
    public required ImmutableArray<SbomDocument> Sboms { get; init; }

    /// <summary>
    /// VEX documents applied.
    /// </summary>
    public ImmutableArray<VexDocument> VexDocuments { get; init; } = [];

    /// <summary>
    /// Trust roots for signature verification.
    /// </summary>
    public required ImmutableArray<TrustRoot> TrustRoots { get; init; }

    /// <summary>
    /// Pack contents inventory with paths and digests.
    /// </summary>
    public required PackContents Contents { get; init; }

    /// <summary>
    /// SHA-256 digest of this pack manifest (excluding signature).
    /// </summary>
    public string? PackDigest { get; init; }

    /// <summary>
    /// DSSE signature over the pack.
    /// </summary>
    public string? Signature { get; init; }
}

public sealed record PackContents
{
    public required ImmutableArray<PackFile> Files { get; init; }
    public long TotalSizeBytes { get; init; }
    public int FileCount { get; init; }
}

public sealed record PackFile(
    string RelativePath,
    string Digest,
    long SizeBytes,
    PackFileType Type);

public enum PackFileType
{
    Manifest,
    RunManifest,
    EvidenceIndex,
    Verdict,
    Sbom,
    Vex,
    Attestation,
    Feed,
    Policy,
    TrustRoot,
    Other
}

public sealed record SbomDocument(
    string Id,
    string Format,
    string Content,
    string Digest);

public sealed record VexDocument(
    string Id,
    string Format,
    string Content,
    string Digest);

public sealed record TrustRoot(
    string Id,
    string Type, // fulcio, rekor, custom
    string Content,
    string Digest);

public sealed record Attestation(
    string Id,
    string Type,
    string Envelope, // DSSE envelope
    string Digest);
```
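The `PackDigest` field above is meaningful only if the digest is computed over a canonical serialization that excludes the digest and signature fields themselves; otherwise import-time verification can never match. An illustrative sketch of that idea (Python with sorted-key JSON as the stand-in canonical form; the real pack uses the project's canonical serializer):

```python
import hashlib
import json


def compute_pack_digest(pack: dict) -> str:
    """SHA-256 over a canonical JSON form of the manifest, excluding
    the signature and the digest field itself."""
    stripped = {k: v for k, v in pack.items() if k not in ("packDigest", "signature")}
    # sort_keys + fixed separators make the byte stream independent of dict order.
    canonical = json.dumps(stripped, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


pack = {"packId": "p-1", "schemaVersion": "1.0.0", "packDigest": None, "signature": None}
digest = compute_pack_digest(pack)
assert digest == compute_pack_digest(pack)  # deterministic across calls
```

Because key order is normalized, two producers serializing the same logical manifest arrive at the same digest, which is the property the importer relies on.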

**Acceptance Criteria**:
- [ ] Complete audit pack model
- [ ] Pack contents inventory
- [ ] Trust roots for offline verification
- [ ] Signature support
- [ ] All fields documented

---

### T2: Audit Pack Builder

**Assignee**: QA Team
**Story Points**: 8
**Status**: TODO
**Dependencies**: T1

**Description**:
Service to build audit packs from scan results.

**Implementation Path**: `src/__Libraries/StellaOps.AuditPack/Services/AuditPackBuilder.cs`

**Implementation**:
```csharp
namespace StellaOps.AuditPack.Services;

public sealed class AuditPackBuilder : IAuditPackBuilder
{
    private readonly IFeedLoader _feedLoader;
    private readonly IPolicyLoader _policyLoader;
    private readonly IAttestationStorage _attestationStorage;

    /// <summary>
    /// Builds an audit pack from a scan result.
    /// </summary>
    public async Task<AuditPack> BuildAsync(
        ScanResult scanResult,
        AuditPackOptions options,
        CancellationToken ct = default)
    {
        var files = new List<PackFile>();

        // Collect all evidence
        var attestations = await CollectAttestationsAsync(scanResult, ct);
        var sboms = CollectSboms(scanResult);
        var vexDocuments = CollectVexDocuments(scanResult);
        var trustRoots = await CollectTrustRootsAsync(options, ct);

        // Build offline bundle subset (only required feeds/policies)
        var bundleManifest = await BuildMinimalBundleAsync(scanResult, ct);

        // Create pack structure
        var pack = new AuditPack
        {
            PackId = Guid.NewGuid().ToString(),
            SchemaVersion = "1.0.0",
            Name = options.Name ?? $"audit-pack-{scanResult.ScanId}",
            CreatedAt = DateTimeOffset.UtcNow,
            RunManifest = scanResult.RunManifest,
            EvidenceIndex = scanResult.EvidenceIndex,
            Verdict = scanResult.Verdict,
            OfflineBundle = bundleManifest,
            Attestations = [.. attestations],
            Sboms = [.. sboms],
            VexDocuments = [.. vexDocuments],
            TrustRoots = [.. trustRoots],
            Contents = new PackContents
            {
                Files = [.. files],
                TotalSizeBytes = files.Sum(f => f.SizeBytes),
                FileCount = files.Count
            }
        };

        return AuditPackSerializer.WithDigest(pack);
    }

    /// <summary>
    /// Exports audit pack to archive file.
    /// </summary>
    public async Task ExportAsync(
        AuditPack pack,
        string outputPath,
        ExportOptions options,
        CancellationToken ct = default)
    {
        using var archive = new TarArchive(outputPath);

        // Write pack manifest
        var manifestJson = AuditPackSerializer.Serialize(pack);
        await archive.WriteEntryAsync("manifest.json", manifestJson, ct);

        // Write run manifest
        var runManifestJson = RunManifestSerializer.Serialize(pack.RunManifest);
        await archive.WriteEntryAsync("run-manifest.json", runManifestJson, ct);

        // Write evidence index
        var evidenceJson = EvidenceIndexSerializer.Serialize(pack.EvidenceIndex);
        await archive.WriteEntryAsync("evidence-index.json", evidenceJson, ct);

        // Write verdict
        var verdictJson = CanonicalJsonSerializer.Serialize(pack.Verdict);
        await archive.WriteEntryAsync("verdict.json", verdictJson, ct);

        // Write SBOMs
        foreach (var sbom in pack.Sboms)
        {
            await archive.WriteEntryAsync($"sboms/{sbom.Id}.json", sbom.Content, ct);
        }

        // Write attestations
        foreach (var att in pack.Attestations)
        {
            await archive.WriteEntryAsync($"attestations/{att.Id}.json", att.Envelope, ct);
        }

        // Write VEX documents
        foreach (var vex in pack.VexDocuments)
        {
            await archive.WriteEntryAsync($"vex/{vex.Id}.json", vex.Content, ct);
        }

        // Write trust roots
        foreach (var root in pack.TrustRoots)
        {
            await archive.WriteEntryAsync($"trust-roots/{root.Id}.pem", root.Content, ct);
        }

        // Write offline bundle subset
        await WriteOfflineBundleAsync(archive, pack.OfflineBundle, ct);

        // Sign if requested
        if (options.Sign)
        {
            var signature = await SignPackAsync(pack, options.SigningKey, ct);
            await archive.WriteEntryAsync("signature.sig", signature, ct);
        }
    }
}

public sealed record AuditPackOptions
{
    public string? Name { get; init; }
    public bool IncludeFeeds { get; init; } = true;
    public bool IncludePolicies { get; init; } = true;
    public bool MinimizeSize { get; init; } = false;
}

public sealed record ExportOptions
{
    public bool Sign { get; init; } = true;
    public string? SigningKey { get; init; }
    public bool Compress { get; init; } = true;
}
```

**Acceptance Criteria**:
- [ ] Builds complete audit pack
- [ ] Exports to tar.gz archive
- [ ] Includes all evidence
- [ ] Optional signing
- [ ] Size minimization option

---

### T3: Audit Pack Importer

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Import and validate audit packs.

**Implementation Path**: `src/__Libraries/StellaOps.AuditPack/Services/AuditPackImporter.cs`

**Implementation**:
```csharp
using System.Security.Cryptography;

namespace StellaOps.AuditPack.Services;

public sealed class AuditPackImporter : IAuditPackImporter
{
    /// <summary>
    /// Imports an audit pack from archive.
    /// </summary>
    public async Task<ImportResult> ImportAsync(
        string archivePath,
        ImportOptions options,
        CancellationToken ct = default)
    {
        var extractDir = options.ExtractDirectory ??
            Path.Combine(Path.GetTempPath(), $"audit-pack-{Guid.NewGuid():N}");

        // Extract archive
        await ExtractArchiveAsync(archivePath, extractDir, ct);

        // Load manifest
        var manifestPath = Path.Combine(extractDir, "manifest.json");
        var manifestJson = await File.ReadAllTextAsync(manifestPath, ct);
        var pack = AuditPackSerializer.Deserialize(manifestJson);

        // Verify integrity
        var integrityResult = await VerifyIntegrityAsync(pack, extractDir, ct);
        if (!integrityResult.IsValid)
        {
            return ImportResult.Failed("Integrity verification failed", integrityResult.Errors);
        }

        // Verify signatures if present (once; the result is reused below)
        SignatureResult? signatureResult = null;
        if (options.VerifySignatures)
        {
            signatureResult = await VerifySignaturesAsync(pack, extractDir, ct);
            if (!signatureResult.IsValid)
            {
                return ImportResult.Failed("Signature verification failed", signatureResult.Errors);
            }
        }

        return new ImportResult
        {
            Success = true,
            Pack = pack,
            ExtractDirectory = extractDir,
            IntegrityResult = integrityResult,
            SignatureResult = signatureResult
        };
    }

    private async Task<IntegrityResult> VerifyIntegrityAsync(
        AuditPack pack,
        string extractDir,
        CancellationToken ct)
    {
        var errors = new List<string>();

        // Verify each file digest
        foreach (var file in pack.Contents.Files)
        {
            var filePath = Path.Combine(extractDir, file.RelativePath);
            if (!File.Exists(filePath))
            {
                errors.Add($"Missing file: {file.RelativePath}");
                continue;
            }

            var content = await File.ReadAllBytesAsync(filePath, ct);
            var actualDigest = Convert.ToHexString(SHA256.HashData(content)).ToLowerInvariant();

            if (actualDigest != file.Digest.ToLowerInvariant())
            {
                errors.Add($"Digest mismatch for {file.RelativePath}: expected {file.Digest}, got {actualDigest}");
            }
        }

        // Verify pack digest
        if (pack.PackDigest != null)
        {
            var computed = AuditPackSerializer.ComputeDigest(pack);
            if (computed != pack.PackDigest)
            {
                errors.Add($"Pack digest mismatch: expected {pack.PackDigest}, got {computed}");
            }
        }

        return new IntegrityResult(errors.Count == 0, errors);
    }

    private async Task<SignatureResult> VerifySignaturesAsync(
        AuditPack pack,
        string extractDir,
        CancellationToken ct)
    {
        var errors = new List<string>();

        // Load signature
        var signaturePath = Path.Combine(extractDir, "signature.sig");
        if (!File.Exists(signaturePath))
        {
            return new SignatureResult(true, [], "No signature present");
        }

        var signature = await File.ReadAllTextAsync(signaturePath, ct);

        // Verify against trust roots
        foreach (var root in pack.TrustRoots)
        {
            var result = await VerifySignatureWithRootAsync(pack, signature, root, ct);
            if (result.IsValid)
            {
                return new SignatureResult(true, [], $"Verified with {root.Id}");
            }
        }

        errors.Add("Signature does not verify against any trust root");
        return new SignatureResult(false, errors);
    }
}

public sealed record ImportResult
{
    public bool Success { get; init; }
    public AuditPack? Pack { get; init; }
    public string? ExtractDirectory { get; init; }
    public IntegrityResult? IntegrityResult { get; init; }
    public SignatureResult? SignatureResult { get; init; }
    public IReadOnlyList<string>? Errors { get; init; }

    public static ImportResult Failed(string message, IReadOnlyList<string> errors) =>
        new() { Success = false, Errors = errors.Prepend(message).ToList() };
}
```
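The per-file check in `VerifyIntegrityAsync` is the backbone of pack import: every inventoried file must exist and hash to its recorded SHA-256, with digests compared case-insensitively as lowercase hex. The same logic, sketched in Python for illustration:

```python
import hashlib
import tempfile
from pathlib import Path


def verify_files(extract_dir: str, files: list[tuple[str, str]]) -> list[str]:
    """Per-file digest check; files is [(relative_path, expected_sha256_hex)].
    Returns a list of errors; empty means the pack contents are intact."""
    errors = []
    for rel_path, expected in files:
        path = Path(extract_dir) / rel_path
        if not path.is_file():
            errors.append(f"Missing file: {rel_path}")
            continue
        actual = hashlib.sha256(path.read_bytes()).hexdigest()
        if actual != expected.lower():
            errors.append(f"Digest mismatch for {rel_path}: expected {expected}, got {actual}")
    return errors


# Usage: build a tiny pack on disk and verify it.
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "verdict.json").write_text('{"status":"pass"}')
    digest = hashlib.sha256(b'{"status":"pass"}').hexdigest()
    print(verify_files(d, [("verdict.json", digest)]))  # []
    print(verify_files(d, [("missing.json", digest)]))  # ['Missing file: missing.json']
```

Collecting all errors instead of failing fast matches the "clear error reporting" acceptance criterion: an auditor sees every missing or tampered file in one pass.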

**Acceptance Criteria**:
- [ ] Extracts archive
- [ ] Verifies all file digests
- [ ] Verifies pack signature
- [ ] Uses included trust roots
- [ ] Clear error reporting

---
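The per-file digest check ("Verifies all file digests") is language-agnostic and can be prototyped outside .NET. A minimal Python sketch, assuming a manifest that maps relative paths to SHA-256 hex digests — the function name and manifest shape are illustrative, not the shipped importer API:

```python
import hashlib
from pathlib import Path

def verify_digests(extract_dir: str, manifest: dict[str, str]) -> list[str]:
    """Return a list of integrity errors; an empty list means the pack is intact."""
    errors = []
    for rel_path, expected in manifest.items():
        file_path = Path(extract_dir) / rel_path
        if not file_path.is_file():
            errors.append(f"Missing file: {rel_path}")
            continue
        computed = hashlib.sha256(file_path.read_bytes()).hexdigest()
        if computed != expected:
            errors.append(f"Digest mismatch for {rel_path}: expected {expected}, got {computed}")
    return errors
```

Tampering with any listed file (or deleting it) surfaces as one error entry per file, matching the "clear error reporting" criterion.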

### T4: Replay from Audit Pack

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T2, T3

**Description**:
Replay scan from imported audit pack and compare results.

**Implementation Path**: `src/__Libraries/StellaOps.AuditPack/Services/AuditPackReplayer.cs`

**Implementation**:
```csharp
namespace StellaOps.AuditPack.Services;

public sealed class AuditPackReplayer : IAuditPackReplayer
{
    private readonly IReplayEngine _replayEngine;
    private readonly IBundleLoader _bundleLoader;

    /// <summary>
    /// Replays a scan from an imported audit pack.
    /// </summary>
    public async Task<ReplayComparisonResult> ReplayAsync(
        ImportResult importResult,
        CancellationToken ct = default)
    {
        if (!importResult.Success || importResult.Pack == null)
        {
            return ReplayComparisonResult.Failed("Invalid import result");
        }

        var pack = importResult.Pack;

        // Load offline bundle from pack
        var bundlePath = Path.Combine(importResult.ExtractDirectory!, "bundle");
        await _bundleLoader.LoadAsync(bundlePath, ct);

        // Execute replay
        var replayResult = await _replayEngine.ReplayAsync(
            pack.RunManifest,
            new ReplayOptions { UseFrozenTime = true },
            ct);

        if (!replayResult.Success)
        {
            return ReplayComparisonResult.Failed($"Replay failed: {string.Join(", ", replayResult.Errors ?? [])}");
        }

        // Compare verdicts
        var comparison = CompareVerdicts(pack.Verdict, replayResult.Verdict);

        return new ReplayComparisonResult
        {
            Success = true,
            IsIdentical = comparison.IsIdentical,
            OriginalVerdictDigest = pack.Verdict.Digest,
            ReplayedVerdictDigest = replayResult.VerdictDigest,
            Differences = comparison.Differences,
            ReplayDurationMs = replayResult.DurationMs
        };
    }

    private static VerdictComparison CompareVerdicts(Verdict original, Verdict? replayed)
    {
        if (replayed == null)
            return new VerdictComparison(false, ["Replayed verdict is null"]);

        var originalJson = CanonicalJsonSerializer.Serialize(original);
        var replayedJson = CanonicalJsonSerializer.Serialize(replayed);

        if (originalJson == replayedJson)
            return new VerdictComparison(true, []);

        // Find differences
        var differences = FindJsonDifferences(originalJson, replayedJson);
        return new VerdictComparison(false, differences);
    }
}

public sealed record ReplayComparisonResult
{
    public bool Success { get; init; }
    public bool IsIdentical { get; init; }
    public string? OriginalVerdictDigest { get; init; }
    public string? ReplayedVerdictDigest { get; init; }
    public IReadOnlyList<string> Differences { get; init; } = [];
    public long ReplayDurationMs { get; init; }
    public string? Error { get; init; }

    public static ReplayComparisonResult Failed(string error) =>
        new() { Success = false, Error = error };
}
```
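The byte-for-byte comparison in `CompareVerdicts` only works because both sides are canonically serialized (sorted keys, fixed separators, UTF-8). A minimal Python sketch of the same idea — the function names are illustrative, not the `CanonicalJsonSerializer` API:

```python
import hashlib
import json

def canonical_json(obj) -> bytes:
    """Serialize with sorted keys and no insignificant whitespace (RFC 8785 spirit)."""
    return json.dumps(obj, sort_keys=True, separators=(",", ":"),
                      ensure_ascii=False).encode("utf-8")

def verdicts_identical(original, replayed) -> bool:
    """Two verdicts match iff their canonical digests are byte-identical."""
    digest = lambda v: hashlib.sha256(canonical_json(v)).hexdigest()
    return digest(original) == digest(replayed)
```

Without canonicalization, a semantically identical verdict with different key ordering would produce a different digest and a spurious replay failure.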

**Acceptance Criteria**:
- [ ] Loads bundle from pack
- [ ] Executes replay
- [ ] Compares verdicts byte-for-byte
- [ ] Reports differences
- [ ] Performance measurement

---

### T5: CLI Commands

**Assignee**: CLI Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T2, T3, T4

**Description**:
CLI commands for audit pack operations.

**Commands**:
```bash
# Export audit pack from scan
stella audit-pack export --scan-id <id> --output audit-pack.tar.gz

# Export with signing
stella audit-pack export --scan-id <id> --sign --key signing-key.pem --output audit-pack.tar.gz

# Verify audit pack integrity
stella audit-pack verify audit-pack.tar.gz

# Import and show info
stella audit-pack info audit-pack.tar.gz

# Replay from audit pack
stella audit-pack replay audit-pack.tar.gz --output replay-result.json

# Full verification workflow
stella audit-pack verify-and-replay audit-pack.tar.gz
```

**Implementation Path**: `src/Cli/StellaOps.Cli/Commands/AuditPack/`

**Acceptance Criteria**:
- [ ] `export` command
- [ ] `verify` command
- [ ] `info` command
- [ ] `replay` command
- [ ] `verify-and-replay` combined command
- [ ] JSON output option

---

### T6: Unit and Integration Tests

**Assignee**: QA Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1-T5

**Description**:
Comprehensive tests for audit pack functionality.

**Test Cases**:
```csharp
using System.Formats.Tar;
using System.IO.Compression;

public class AuditPackBuilderTests
{
    [Fact]
    public async Task Build_FromScanResult_CreatesCompletePack()
    {
        var scanResult = CreateTestScanResult();
        var builder = CreateBuilder();

        var pack = await builder.BuildAsync(scanResult, new AuditPackOptions());

        pack.RunManifest.Should().NotBeNull();
        pack.Verdict.Should().NotBeNull();
        pack.EvidenceIndex.Should().NotBeNull();
        pack.Attestations.Should().NotBeEmpty();
        pack.TrustRoots.Should().NotBeEmpty();
    }

    [Fact]
    public async Task Export_CreatesValidArchive()
    {
        var pack = CreateTestPack();
        var builder = CreateBuilder();
        var outputPath = GetTempPath();

        await builder.ExportAsync(pack, outputPath, new ExportOptions());

        File.Exists(outputPath).Should().BeTrue();
        // Verify archive structure (decompress the tar.gz, then walk entries)
        using var gz = new GZipStream(File.OpenRead(outputPath), CompressionMode.Decompress);
        using var archive = new TarReader(gz);
        var entries = new List<TarEntry>();
        while (archive.GetNextEntry() is { } entry) entries.Add(entry);
        entries.Should().Contain(e => e.Name == "manifest.json");
        entries.Should().Contain(e => e.Name == "run-manifest.json");
        entries.Should().Contain(e => e.Name == "verdict.json");
    }
}

public class AuditPackImporterTests
{
    [Fact]
    public async Task Import_ValidPack_Succeeds()
    {
        var archivePath = CreateTestArchive();
        var importer = CreateImporter();

        var result = await importer.ImportAsync(archivePath, new ImportOptions());

        result.Success.Should().BeTrue();
        result.Pack.Should().NotBeNull();
        result.IntegrityResult!.IsValid.Should().BeTrue();
    }

    [Fact]
    public async Task Import_TamperedPack_FailsIntegrity()
    {
        var archivePath = CreateTamperedArchive();
        var importer = CreateImporter();

        var result = await importer.ImportAsync(archivePath, new ImportOptions());

        result.Success.Should().BeFalse();
        result.IntegrityResult!.IsValid.Should().BeFalse();
    }
}

public class AuditPackReplayerTests
{
    [Fact]
    public async Task Replay_ValidPack_ProducesIdenticalVerdict()
    {
        var pack = CreateTestPack();
        var importResult = CreateImportResult(pack);
        var replayer = CreateReplayer();

        var result = await replayer.ReplayAsync(importResult);

        result.Success.Should().BeTrue();
        result.IsIdentical.Should().BeTrue();
        result.OriginalVerdictDigest.Should().Be(result.ReplayedVerdictDigest);
    }
}
```

**Acceptance Criteria**:
- [ ] Builder tests
- [ ] Exporter tests
- [ ] Importer tests
- [ ] Integrity verification tests
- [ ] Replay comparison tests
- [ ] Tamper detection tests

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | QA Team | Audit Pack Domain Model |
| 2 | T2 | TODO | T1 | QA Team | Audit Pack Builder |
| 3 | T3 | TODO | T1 | QA Team | Audit Pack Importer |
| 4 | T4 | TODO | T2, T3 | QA Team | Replay from Audit Pack |
| 5 | T5 | TODO | T2-T4 | CLI Team | CLI Commands |
| 6 | T6 | TODO | T1-T5 | QA Team | Unit and Integration Tests |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Testing Strategy advisory. Audit packs enable compliance verification. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| Archive format | Decision | QA Team | tar.gz for portability |
| Trust root inclusion | Decision | QA Team | Include for fully offline verification |
| Minimal bundle | Decision | QA Team | Only include feeds/policies used in scan |

---

## Success Criteria

- [ ] All 6 tasks marked DONE
- [ ] Audit packs exportable and importable
- [ ] Integrity verification catches tampering
- [ ] Replay produces identical verdicts
- [ ] CLI commands functional
- [ ] `dotnet test` passes all tests
---

`docs/implplan/SPRINT_5100_SUMMARY.md` (new file, 243 lines)
# Sprint Epic 5100 · Comprehensive Testing Strategy

## Overview

Epic 5100 implements the comprehensive testing strategy defined in the Testing Strategy advisory (20-Dec-2025). This epic transforms Stella Ops' testing moats into continuously verified guarantees through deterministic replay, offline compliance, interoperability contracts, and chaos resilience testing.

**IMPLID**: 5100 (Test Infrastructure)
**Total Sprints**: 12
**Total Tasks**: ~75

---

## Epic Structure

### Phase 0: Harness & Corpus Foundation
**Objective**: Standardize test artifacts and expand the golden corpus.

| Sprint | Name | Tasks | Priority |
|--------|------|-------|----------|
| 5100.0001.0001 | [Run Manifest Schema](SPRINT_5100_0001_0001_run_manifest_schema.md) | 7 | HIGH |
| 5100.0001.0002 | [Evidence Index Schema](SPRINT_5100_0001_0002_evidence_index_schema.md) | 7 | HIGH |
| 5100.0001.0003 | [Offline Bundle Manifest](SPRINT_5100_0001_0003_offline_bundle_manifest.md) | 7 | HIGH |
| 5100.0001.0004 | [Golden Corpus Expansion](SPRINT_5100_0001_0004_golden_corpus_expansion.md) | 10 | MEDIUM |

**Key Deliverables**:
- `RunManifest` schema capturing all replay inputs
- `EvidenceIndex` schema linking verdict to evidence chain
- `BundleManifest` for offline operation
- 50+ golden test corpus cases

---

### Phase 1: Determinism & Replay
**Objective**: Ensure byte-identical verdicts across time and machines.

| Sprint | Name | Tasks | Priority |
|--------|------|-------|----------|
| 5100.0002.0001 | [Canonicalization Utilities](SPRINT_5100_0002_0001_canonicalization_utilities.md) | 7 | HIGH |
| 5100.0002.0002 | [Replay Runner Service](SPRINT_5100_0002_0002_replay_runner_service.md) | 7 | HIGH |
| 5100.0002.0003 | [Delta-Verdict Generator](SPRINT_5100_0002_0003_delta_verdict_generator.md) | 7 | MEDIUM |

**Key Deliverables**:
- Canonical JSON serialization (RFC 8785 principles)
- Stable ordering for all collections
- Replay engine with frozen time/PRNG
- Delta-verdict for diff-aware release gates
- Property-based tests with FsCheck

---

### Phase 2: Offline E2E & Interop
**Objective**: Prove air-gap compliance and tool interoperability.

| Sprint | Name | Tasks | Priority |
|--------|------|-------|----------|
| 5100.0003.0001 | [SBOM Interop Round-Trip](SPRINT_5100_0003_0001_sbom_interop_roundtrip.md) | 7 | HIGH |
| 5100.0003.0002 | [No-Egress Enforcement](SPRINT_5100_0003_0002_no_egress_enforcement.md) | 6 | HIGH |

**Key Deliverables**:
- Syft → cosign → Grype round-trip tests
- CycloneDX 1.6 and SPDX 3.0.1 validation
- 95%+ findings parity with consumer tools
- Network-isolated test infrastructure
- `--network none` CI enforcement

---

### Phase 3: Unknowns Budget CI Gates
**Objective**: Enforce unknowns-budget policy gates in CI/CD.

| Sprint | Name | Tasks | Priority |
|--------|------|-------|----------|
| 5100.0004.0001 | [Unknowns Budget CI Gates](SPRINT_5100_0004_0001_unknowns_budget_ci_gates.md) | 6 | HIGH |

**Key Deliverables**:
- `stella budget check` CLI command
- CI workflow with environment-based budgets
- PR comments with budget status
- UI budget visualization
- Attestation integration

---
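The budget gate reduces to a small check: compute the ratio of packages with unknown metadata and fail the build when it exceeds the environment's budget. A hedged Python sketch — the budget values and function shape are illustrative, not the `stella budget check` implementation; exit code 2 mirrors the gate-failure convention used in this epic:

```python
# Illustrative per-environment budgets (fraction of packages with unknown metadata).
BUDGETS = {"production": 0.05, "staging": 0.10, "development": 0.20}

def check_budget(unknown_count: int, total_count: int, environment: str) -> int:
    """Return a CI exit code: 0 = within budget, 2 = budget exceeded."""
    budget = BUDGETS[environment]
    ratio = unknown_count / total_count if total_count else 0.0
    if ratio > budget:
        print(f"FAIL: unknowns {ratio:.1%} > budget {budget:.1%} for {environment}")
        return 2
    print(f"OK: unknowns {ratio:.1%} within budget {budget:.1%} for {environment}")
    return 0
```

The same scan can pass the development gate and fail the production gate, which is exactly the environment-based behavior the CI workflow needs.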

### Phase 4: Backpressure & Chaos
**Objective**: Validate router resilience under load.

| Sprint | Name | Tasks | Priority |
|--------|------|-------|----------|
| 5100.0005.0001 | [Router Chaos Suite](SPRINT_5100_0005_0001_router_chaos_suite.md) | 6 | MEDIUM |

**Key Deliverables**:
- k6 load test harness
- 429/503 response verification
- Retry-After header compliance
- Recovery within 30 seconds
- Valkey failure injection tests

---

### Phase 5: Audit Packs & Time-Travel
**Objective**: Enable sealed export/import for auditors.

| Sprint | Name | Tasks | Priority |
|--------|------|-------|----------|
| 5100.0006.0001 | [Audit Pack Export/Import](SPRINT_5100_0006_0001_audit_pack_export_import.md) | 6 | MEDIUM |

**Key Deliverables**:
- Sealed audit pack format
- One-command replay verification
- Signature verification with included trust roots
- CLI commands for auditor workflow

---

## Dependency Graph

```
Phase 0 (Foundation)
├── 5100.0001.0001 (Run Manifest)
│   └── Phase 1 depends
├── 5100.0001.0002 (Evidence Index)
│   └── Phase 2, 5 depend
├── 5100.0001.0003 (Offline Bundle)
│   └── Phase 2 depends
└── 5100.0001.0004 (Golden Corpus)
    └── All phases use

Phase 1 (Determinism)
├── 5100.0002.0001 (Canonicalization)
│   └── 5100.0002.0002, 5100.0002.0003 depend
├── 5100.0002.0002 (Replay Runner)
│   └── Phase 5 depends
└── 5100.0002.0003 (Delta-Verdict)

Phase 2 (Offline & Interop)
├── 5100.0003.0001 (SBOM Interop)
└── 5100.0003.0002 (No-Egress)

Phase 3 (Unknowns Gates)
└── 5100.0004.0001 (CI Gates)
    └── Depends on 4100.0001.0002

Phase 4 (Chaos)
└── 5100.0005.0001 (Router Chaos)

Phase 5 (Audit Packs)
└── 5100.0006.0001 (Export/Import)
    └── Depends on Phase 0, Phase 1
```

---

## CI/CD Integration

### New Workflows

| Workflow | Trigger | Purpose |
|----------|---------|---------|
| `replay-verification.yml` | PR (scanner changes) | Verify deterministic replay |
| `interop-e2e.yml` | PR + Nightly | SBOM interoperability |
| `offline-e2e.yml` | PR + Nightly | Air-gap compliance |
| `unknowns-gate.yml` | PR + Push | Budget enforcement |
| `router-chaos.yml` | Nightly | Resilience testing |

### Release Blocking Gates

A release candidate is blocked if any of these fail:

1. **Replay Verification**: Zero non-deterministic diffs
2. **Interop Suite**: 95%+ findings parity
3. **Offline E2E**: All tests pass with no network
4. **Unknowns Budget**: Within budget for prod environment
5. **Performance**: No breach of p95/memory budgets

---

## Success Criteria

| Criteria | Metric | Gate |
|----------|--------|------|
| Full scan + attest + verify with no network | `offline-e2e` passes | Release |
| Re-running fixed input = identical verdict | 0 byte diff | Release |
| Grype from SBOM matches image scan | 95%+ parity | Release |
| Builds fail when unknowns > budget | Exit code 2 | PR |
| Router under burst emits correct Retry-After | 100% compliance | Nightly |
| Evidence index links complete | Validation passes | Release |

---

## Artifacts Standardized

| Artifact | Schema Location | Purpose |
|----------|-----------------|---------|
| Run Manifest | `StellaOps.Testing.Manifests` | Replay key |
| Evidence Index | `StellaOps.Evidence` | Verdict → evidence chain |
| Offline Bundle | `StellaOps.AirGap.Bundle` | Air-gap operation |
| Delta Verdict | `StellaOps.DeltaVerdict` | Diff-aware gates |
| Audit Pack | `StellaOps.AuditPack` | Compliance verification |

---

## Implementation Order

### Immediate (This Week)
1. **5100.0001.0001** - Run Manifest Schema
2. **5100.0002.0001** - Canonicalization Utilities
3. **5100.0004.0001** - Unknowns Budget CI Gates

### Short Term (Next 2 Sprints)
4. **5100.0001.0002** - Evidence Index Schema
5. **5100.0002.0002** - Replay Runner Service
6. **5100.0003.0001** - SBOM Interop Round-Trip

### Medium Term (Following Sprints)
7. **5100.0001.0003** - Offline Bundle Manifest
8. **5100.0003.0002** - No-Egress Enforcement
9. **5100.0002.0003** - Delta-Verdict Generator

### Later
10. **5100.0001.0004** - Golden Corpus Expansion
11. **5100.0005.0001** - Router Chaos Suite
12. **5100.0006.0001** - Audit Pack Export/Import

---

## Related Documentation

- [Test Suite Overview](../19_TEST_SUITE_OVERVIEW.md)
- [Testing Strategy Advisory](../product-advisories/20-Dec-2025%20-%20Testing%20strategy.md)
- [Offline Operation Guide](../24_OFFLINE_KIT.md)
- [tests/AGENTS.md](../../tests/AGENTS.md)

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Epic created from Testing Strategy advisory analysis. 12 sprints defined across 6 phases. | Agent |

---

**Epic Status**: PLANNING (0/12 sprints complete)
---

`docs/implplan/SPRINT_5200_0001_0001_starter_policy_template.md` (new file, 387 lines)
# Sprint 5200.0001.0001 · Starter Policy Template — Day-1 Policy Pack

## Topic & Scope
- Create a production-ready "starter" policy pack that customers can adopt immediately.
- Implements the minimal policy from the Reference Architecture advisory.
- Provides sensible defaults for vulnerability gating, unknowns thresholds, and signing requirements.
- **Working directory:** `src/Policy/`, `policies/`, `docs/`

## Dependencies & Concurrency
- **Upstream**: Policy Engine (implemented), Exception Objects (implemented)
- **Downstream**: New customer onboarding, documentation
- **Safe to parallelize with**: All other sprints

## Documentation Prerequisites
- `docs/modules/policy/architecture.md`
- `docs/product-advisories/archived/2025-12-21-reference-architecture/20-Dec-2025 - Stella Ops Reference Architecture.md`
- `docs/policy/dsl-reference.md` (if exists)

---

## Tasks

### T1: Starter Policy YAML Definition

**Assignee**: Policy Team
**Story Points**: 5
**Status**: TODO

**Description**:
Create the main starter policy YAML file with recommended defaults.

**Implementation Path**: `policies/starter-day1.yaml`

**Acceptance Criteria**:
- [ ] Gate on CVE with `reachability=reachable` AND `severity >= High`
- [ ] Allow bypass if VEX source says `not_affected` with evidence
- [ ] Fail on unknowns above threshold (default: 5% of packages)
- [ ] Require signed SBOM for production environments
- [ ] Require signed verdict for production deployments
- [ ] Clear comments explaining each rule
- [ ] Versioned policy pack format

**Policy File**:
```yaml
# Stella Ops Starter Policy Pack - Day 1
# Version: 1.0.0
# Last Updated: 2025-12-21
#
# This policy provides sensible defaults for organizations beginning
# their software supply chain security journey. Customize as needed.

apiVersion: policy.stellaops.io/v1
kind: PolicyPack
metadata:
  name: starter-day1
  version: "1.0.0"
  description: "Production-ready starter policy for Day 1 adoption"
  labels:
    tier: starter
    environment: all

spec:
  # Global settings
  settings:
    defaultAction: warn          # warn | block | allow
    unknownsThreshold: 0.05      # 5% of packages with missing metadata
    requireSignedSbom: true
    requireSignedVerdict: true

  # Rule evaluation order: first match wins
  rules:
    # Rule 1: Block reachable HIGH/CRITICAL vulnerabilities
    - name: block-reachable-high-critical
      description: "Block deployments with reachable HIGH or CRITICAL vulnerabilities"
      match:
        severity:
          - CRITICAL
          - HIGH
        reachability: reachable
      unless:
        # Allow if VEX says not_affected with evidence
        vexStatus: not_affected
        vexJustification:
          - vulnerable_code_not_present
          - vulnerable_code_cannot_be_controlled_by_adversary
          - inline_mitigations_already_exist
      action: block
      message: "Reachable {severity} vulnerability {cve} must be remediated or have VEX justification"

    # Rule 2: Warn on reachable MEDIUM vulnerabilities
    - name: warn-reachable-medium
      description: "Warn on reachable MEDIUM severity vulnerabilities"
      match:
        severity: MEDIUM
        reachability: reachable
      unless:
        vexStatus: not_affected
      action: warn
      message: "Reachable MEDIUM vulnerability {cve} should be reviewed"

    # Rule 3: Ignore unreachable vulnerabilities (with logging)
    - name: ignore-unreachable
      description: "Allow unreachable vulnerabilities but log for awareness"
      match:
        reachability: unreachable
      action: allow
      log: true
      message: "Vulnerability {cve} is unreachable - allowing"

    # Rule 4: Fail on excessive unknowns
    - name: fail-on-unknowns
      description: "Block if too many packages have unknown metadata"
      type: aggregate            # Applies to entire scan, not individual findings
      match:
        unknownsRatio:
          gt: ${settings.unknownsThreshold}
      action: block
      message: "Unknown packages exceed threshold ({unknownsRatio}% > {threshold}%)"

    # Rule 5: Require signed SBOM for production
    - name: require-signed-sbom-prod
      description: "Production deployments must have signed SBOM"
      match:
        environment: production
      require:
        signedSbom: true
      action: block
      message: "Production deployment requires signed SBOM"

    # Rule 6: Require signed verdict for production
    - name: require-signed-verdict-prod
      description: "Production deployments must have signed policy verdict"
      match:
        environment: production
      require:
        signedVerdict: true
      action: block
      message: "Production deployment requires signed verdict"

    # Rule 7: Default allow for everything else
    - name: default-allow
      description: "Allow everything not matched by above rules"
      match:
        always: true
      action: allow
```
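The "first match wins" evaluation order can be sketched compactly. This Python sketch simplifies findings and rules to flat dicts with scalar or list-valued criteria; it is illustrative only, not the Policy Engine's evaluator (e.g. it ignores `type: aggregate` and `require` semantics):

```python
def evaluate(finding: dict, rules: list[dict], default_action: str = "warn") -> str:
    """Return the action of the first matching rule; later rules never fire."""
    def matches(criteria: dict) -> bool:
        return all(
            True if key == "always"                         # 'always: true' matches anything
            else finding.get(key) in value if isinstance(value, list)
            else finding.get(key) == value
            for key, value in criteria.items()
        )

    for rule in rules:
        if not matches(rule.get("match", {})):
            continue
        # 'unless' exempts an otherwise-matching finding (e.g. VEX not_affected).
        if rule.get("unless") and matches(rule["unless"]):
            continue
        return rule["action"]
    return default_action
```

Because evaluation stops at the first match, rule order is load-bearing: moving `default-allow` above the blocking rules would neutralize them, which is why the pack lists it last.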

---

### T2: Policy Pack Metadata & Schema

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO

**Description**:
Define the policy pack schema and metadata format.

**Acceptance Criteria**:
- [ ] JSON Schema for policy pack validation
- [ ] Version field with semver
- [ ] Dependencies field for pack composition
- [ ] Labels for categorization
- [ ] Annotations for custom metadata

---

### T3: Environment-Specific Overrides

**Assignee**: Policy Team
**Story Points**: 3
**Status**: TODO

**Description**:
Create environment-specific override files.

**Implementation Path**: `policies/starter-day1/`

**Acceptance Criteria**:
- [ ] `base.yaml` - Core rules
- [ ] `overrides/production.yaml` - Stricter for prod
- [ ] `overrides/staging.yaml` - Moderate strictness
- [ ] `overrides/development.yaml` - Lenient for dev
- [ ] Clear documentation on override precedence

**Override Example**:
```yaml
# policies/starter-day1/overrides/development.yaml
apiVersion: policy.stellaops.io/v1
kind: PolicyOverride
metadata:
  name: starter-day1-dev
  parent: starter-day1
  environment: development

spec:
  settings:
    defaultAction: warn        # Never block in dev
    unknownsThreshold: 0.20    # Allow more unknowns

  ruleOverrides:
    - name: block-reachable-high-critical
      action: warn             # Downgrade to warn in dev

    - name: require-signed-sbom-prod
      enabled: false           # Disable in dev

    - name: require-signed-verdict-prod
      enabled: false           # Disable in dev
```
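Override precedence amounts to a merge: environment settings win over base settings, and `ruleOverrides` are patched onto base rules by name. A minimal Python sketch under that assumption — `apply_override` is hypothetical, the YAML format above is the source of truth:

```python
def apply_override(base: dict, override: dict) -> dict:
    """Merge an environment override onto a base policy spec (override wins)."""
    merged = {
        "settings": {**base.get("settings", {}), **override.get("settings", {})},
        "rules": [dict(rule) for rule in base.get("rules", [])],  # copy, don't mutate base
    }
    by_name = {rule["name"]: rule for rule in merged["rules"]}
    for patch in override.get("ruleOverrides", []):
        if patch["name"] in by_name:
            by_name[patch["name"]].update(patch)
    # Drop rules the override explicitly disabled.
    merged["rules"] = [r for r in merged["rules"] if r.get("enabled", True)]
    return merged
```

Keeping the merge purely name-keyed means an override can never reorder base rules, so the first-match-wins semantics of the base pack are preserved across environments.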
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### T4: Policy Validation CLI Command
|
||||||
|
|
||||||
|
**Assignee**: CLI Team
|
||||||
|
**Story Points**: 3
|
||||||
|
**Status**: TODO
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Add CLI command to validate policy packs before deployment.
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] `stellaops policy validate <path>`
|
||||||
|
- [ ] Schema validation
|
||||||
|
- [ ] Rule conflict detection
|
||||||
|
- [ ] Circular dependency detection
|
||||||
|
- [ ] Warning for missing common rules
|
||||||
|
- [ ] Exit codes: 0=valid, 1=errors, 2=warnings
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### T5: Policy Simulation Mode
|
||||||
|
|
||||||
|
**Assignee**: Policy Team
|
||||||
|
**Story Points**: 3
|
||||||
|
**Status**: TODO
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Add simulation mode to test policy against historical data.
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] `stellaops policy simulate --policy <path> --scan <id>`
|
||||||
|
- [ ] Shows what would have happened
|
||||||
|
- [ ] Diff against current policy
|
||||||
|
- [ ] Summary statistics
|
||||||
|
- [ ] No state mutation
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### T6: Starter Policy Tests
|
||||||
|
|
||||||
|
**Assignee**: Policy Team
|
||||||
|
**Story Points**: 3
|
||||||
|
**Status**: TODO
|
||||||
|
|
||||||
|
**Description**:
|
||||||
|
Comprehensive tests for starter policy behavior.
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] Test: Reachable HIGH blocked without VEX
|
||||||
|
- [ ] Test: Reachable HIGH allowed with VEX not_affected
|
||||||
|
- [ ] Test: Unreachable HIGH allowed
|
||||||
|
- [ ] Test: Unknowns threshold enforced
|
||||||
|
- [ ] Test: Signed SBOM required for prod
|
||||||
|
- [ ] Test: Dev overrides work correctly

---

### T7: Policy Pack Distribution

**Assignee**: Policy Team
**Story Points**: 2
**Status**: TODO

**Description**:
Package and distribute starter policy pack.

**Acceptance Criteria**:
- [ ] OCI artifact packaging for policy pack
- [ ] Version tagging
- [ ] Signature on policy pack artifact
- [ ] Registry push (configurable)
- [ ] Offline bundle support

---

### T8: User Documentation

**Assignee**: Docs Team
**Story Points**: 3
**Status**: TODO

**Description**:
Comprehensive user documentation for starter policy.

**Implementation Path**: `docs/policy/starter-guide.md`

**Acceptance Criteria**:
- [ ] "Getting Started with Policies" guide
- [ ] Rule-by-rule explanation
- [ ] Customization guide
- [ ] Environment override examples
- [ ] Troubleshooting common issues
- [ ] Migration path to custom policies

---

### T9: Quick Start Integration

**Assignee**: Docs Team
**Story Points**: 2
**Status**: TODO

**Description**:
Integrate starter policy into quick start documentation.

**Acceptance Criteria**:
- [ ] Update `docs/10_CONCELIER_CLI_QUICKSTART.md`
- [ ] One-liner to install starter policy
- [ ] Example scan with policy evaluation
- [ ] Link to customization docs

---

### T10: UI Policy Selector

**Assignee**: UI Team
**Story Points**: 2
**Status**: TODO

**Description**:
Add starter policy as default option in UI policy selector.

**Acceptance Criteria**:
- [ ] "Starter (Recommended)" option in dropdown
- [ ] Tooltip explaining starter policy
- [ ] One-click activation
- [ ] Preview of rules before activation

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Policy Team | Starter Policy YAML |
| 2 | T2 | TODO | T1 | Policy Team | Pack Metadata & Schema |
| 3 | T3 | TODO | T1 | Policy Team | Environment Overrides |
| 4 | T4 | TODO | T1 | CLI Team | Validation CLI Command |
| 5 | T5 | TODO | T1 | Policy Team | Simulation Mode |
| 6 | T6 | TODO | T1-T3 | Policy Team | Starter Policy Tests |
| 7 | T7 | TODO | T1-T3 | Policy Team | Pack Distribution |
| 8 | T8 | TODO | T1-T3 | Docs Team | User Documentation |
| 9 | T9 | TODO | T8 | Docs Team | Quick Start Integration |
| 10 | T10 | TODO | T1 | UI Team | UI Policy Selector |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from Reference Architecture advisory - starter policy gap. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| 5% unknowns threshold | Decision | Policy Team | Conservative default; can be adjusted |
| First-match semantics | Decision | Policy Team | Consistent with existing policy engine |
| VEX required for bypass | Decision | Policy Team | Evidence-based exceptions only |
| Prod-only signing req | Decision | Policy Team | Don't burden dev/staging environments |

---

## Success Criteria

- [ ] New customers can deploy starter policy in <5 minutes
- [ ] Starter policy blocks reachable HIGH/CRITICAL without VEX
- [ ] Clear upgrade path to custom policies
- [ ] Documentation enables self-service adoption
- [ ] Policy pack signed and published to registry

**Sprint Status**: TODO (0/10 tasks complete)

docs/implplan/SPRINT_6000_0001_0001_binaries_schema.md (new file, 589 lines)
@@ -0,0 +1,589 @@

# Sprint 6000.0001.0001 · Binaries Schema

## Topic & Scope

- Create the `binaries` PostgreSQL schema for the BinaryIndex module.
- Implement all core tables: `binary_identity`, `binary_package_map`, `vulnerable_buildids`, `binary_vuln_assertion`, `corpus_snapshots`.
- Set up RLS policies and indexes for multi-tenant isolation.
- **Working directory:** `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Persistence/`

## Dependencies & Concurrency

- **Upstream**: None (foundational sprint)
- **Downstream**: All 6000.0001.x sprints depend on this
- **Safe to parallelize with**: None within MVP 1

## Documentation Prerequisites

- `docs/db/SPECIFICATION.md`
- `docs/db/schemas/binaries_schema_specification.md`
- `docs/modules/binaryindex/architecture.md`

---

## Tasks

### T1: Create Project Structure

**Assignee**: BinaryIndex Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: —

**Description**:
Create the BinaryIndex persistence library project structure.

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Persistence/`

**Project Structure**:
```
StellaOps.BinaryIndex.Persistence/
├── StellaOps.BinaryIndex.Persistence.csproj
├── BinaryIndexDbContext.cs
├── Migrations/
│   └── 001_create_binaries_schema.sql
├── Repositories/
│   ├── IBinaryIdentityRepository.cs
│   ├── BinaryIdentityRepository.cs
│   ├── IBinaryPackageMapRepository.cs
│   └── BinaryPackageMapRepository.cs
└── Extensions/
    └── ServiceCollectionExtensions.cs
```

**Project File**:
```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <LangVersion>preview</LangVersion>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Npgsql" Version="9.0.0" />
    <PackageReference Include="Dapper" Version="2.1.35" />
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\StellaOps.BinaryIndex.Core\StellaOps.BinaryIndex.Core.csproj" />
    <ProjectReference Include="..\..\..\..\__Libraries\StellaOps.Infrastructure.Postgres\StellaOps.Infrastructure.Postgres.csproj" />
  </ItemGroup>

  <ItemGroup>
    <EmbeddedResource Include="Migrations\*.sql" />
  </ItemGroup>
</Project>
```

**Acceptance Criteria**:
- [ ] Project compiles
- [ ] References Infrastructure.Postgres for shared patterns
- [ ] Migrations embedded as resources

---

### T2: Create Initial Migration

**Assignee**: BinaryIndex Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Description**:
Create the SQL migration that establishes the binaries schema.

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Persistence/Migrations/001_create_binaries_schema.sql`

**Migration Content**:
```sql
-- 001_create_binaries_schema.sql
-- Creates the binaries schema for BinaryIndex module
-- Author: BinaryIndex Team
-- Date: 2025-12-21

BEGIN;

-- ============================================================================
-- SCHEMA CREATION
-- ============================================================================

CREATE SCHEMA IF NOT EXISTS binaries;
CREATE SCHEMA IF NOT EXISTS binaries_app;

-- RLS helper function
CREATE OR REPLACE FUNCTION binaries_app.require_current_tenant()
RETURNS TEXT
LANGUAGE plpgsql STABLE SECURITY DEFINER
AS $$
DECLARE
    v_tenant TEXT;
BEGIN
    v_tenant := current_setting('app.tenant_id', true);
    IF v_tenant IS NULL OR v_tenant = '' THEN
        RAISE EXCEPTION 'app.tenant_id session variable not set';
    END IF;
    RETURN v_tenant;
END;
$$;

-- ============================================================================
-- CORE TABLES (see binaries_schema_specification.md for full DDL)
-- ============================================================================

-- binary_identity table
CREATE TABLE IF NOT EXISTS binaries.binary_identity (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,
    binary_key TEXT NOT NULL,
    build_id TEXT,
    build_id_type TEXT CHECK (build_id_type IN ('gnu-build-id', 'pe-cv', 'macho-uuid')),
    file_sha256 TEXT NOT NULL,
    text_sha256 TEXT,
    blake3_hash TEXT,
    format TEXT NOT NULL CHECK (format IN ('elf', 'pe', 'macho')),
    architecture TEXT NOT NULL,
    osabi TEXT,
    binary_type TEXT CHECK (binary_type IN ('executable', 'shared_library', 'static_library', 'object')),
    is_stripped BOOLEAN DEFAULT FALSE,
    first_seen_snapshot_id UUID,
    last_seen_snapshot_id UUID,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT binary_identity_key_unique UNIQUE (tenant_id, binary_key)
);

-- corpus_snapshots table
CREATE TABLE IF NOT EXISTS binaries.corpus_snapshots (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,
    distro TEXT NOT NULL,
    release TEXT NOT NULL,
    architecture TEXT NOT NULL,
    snapshot_id TEXT NOT NULL,
    packages_processed INT NOT NULL DEFAULT 0,
    binaries_indexed INT NOT NULL DEFAULT 0,
    repo_metadata_digest TEXT,
    signing_key_id TEXT,
    dsse_envelope_ref TEXT,
    status TEXT NOT NULL DEFAULT 'pending' CHECK (status IN ('pending', 'processing', 'completed', 'failed')),
    error TEXT,
    started_at TIMESTAMPTZ,
    completed_at TIMESTAMPTZ,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT corpus_snapshots_unique UNIQUE (tenant_id, distro, release, architecture, snapshot_id)
);

-- binary_package_map table
CREATE TABLE IF NOT EXISTS binaries.binary_package_map (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,
    binary_identity_id UUID NOT NULL REFERENCES binaries.binary_identity(id) ON DELETE CASCADE,
    binary_key TEXT NOT NULL,
    distro TEXT NOT NULL,
    release TEXT NOT NULL,
    source_pkg TEXT NOT NULL,
    binary_pkg TEXT NOT NULL,
    pkg_version TEXT NOT NULL,
    pkg_purl TEXT,
    architecture TEXT NOT NULL,
    file_path_in_pkg TEXT NOT NULL,
    snapshot_id UUID NOT NULL REFERENCES binaries.corpus_snapshots(id),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT binary_package_map_unique UNIQUE (binary_identity_id, snapshot_id, file_path_in_pkg)
);

-- vulnerable_buildids table
CREATE TABLE IF NOT EXISTS binaries.vulnerable_buildids (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,
    buildid_type TEXT NOT NULL CHECK (buildid_type IN ('gnu-build-id', 'pe-cv', 'macho-uuid')),
    buildid_value TEXT NOT NULL,
    purl TEXT NOT NULL,
    pkg_version TEXT NOT NULL,
    distro TEXT,
    release TEXT,
    confidence TEXT NOT NULL DEFAULT 'exact' CHECK (confidence IN ('exact', 'inferred', 'heuristic')),
    provenance JSONB DEFAULT '{}',
    snapshot_id UUID REFERENCES binaries.corpus_snapshots(id),
    indexed_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT vulnerable_buildids_unique UNIQUE (tenant_id, buildid_value, buildid_type, purl, pkg_version)
);

-- binary_vuln_assertion table
CREATE TABLE IF NOT EXISTS binaries.binary_vuln_assertion (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,
    binary_key TEXT NOT NULL,
    binary_identity_id UUID REFERENCES binaries.binary_identity(id),
    cve_id TEXT NOT NULL,
    advisory_id UUID,
    status TEXT NOT NULL CHECK (status IN ('affected', 'not_affected', 'fixed', 'unknown')),
    method TEXT NOT NULL CHECK (method IN ('range_match', 'buildid_catalog', 'fingerprint_match', 'fix_index')),
    confidence NUMERIC(3,2) CHECK (confidence >= 0 AND confidence <= 1),
    evidence_ref TEXT,
    evidence_digest TEXT,
    evaluated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT binary_vuln_assertion_unique UNIQUE (tenant_id, binary_key, cve_id)
);

-- ============================================================================
-- INDEXES
-- ============================================================================

CREATE INDEX IF NOT EXISTS idx_binary_identity_tenant ON binaries.binary_identity(tenant_id);
CREATE INDEX IF NOT EXISTS idx_binary_identity_buildid ON binaries.binary_identity(build_id) WHERE build_id IS NOT NULL;
CREATE INDEX IF NOT EXISTS idx_binary_identity_sha256 ON binaries.binary_identity(file_sha256);
CREATE INDEX IF NOT EXISTS idx_binary_identity_key ON binaries.binary_identity(binary_key);

CREATE INDEX IF NOT EXISTS idx_binary_package_map_tenant ON binaries.binary_package_map(tenant_id);
CREATE INDEX IF NOT EXISTS idx_binary_package_map_binary ON binaries.binary_package_map(binary_identity_id);
CREATE INDEX IF NOT EXISTS idx_binary_package_map_distro ON binaries.binary_package_map(distro, release, source_pkg);
CREATE INDEX IF NOT EXISTS idx_binary_package_map_snapshot ON binaries.binary_package_map(snapshot_id);

CREATE INDEX IF NOT EXISTS idx_corpus_snapshots_tenant ON binaries.corpus_snapshots(tenant_id);
CREATE INDEX IF NOT EXISTS idx_corpus_snapshots_distro ON binaries.corpus_snapshots(distro, release, architecture);
CREATE INDEX IF NOT EXISTS idx_corpus_snapshots_status ON binaries.corpus_snapshots(status) WHERE status IN ('pending', 'processing');

CREATE INDEX IF NOT EXISTS idx_vulnerable_buildids_tenant ON binaries.vulnerable_buildids(tenant_id);
CREATE INDEX IF NOT EXISTS idx_vulnerable_buildids_value ON binaries.vulnerable_buildids(buildid_type, buildid_value);
CREATE INDEX IF NOT EXISTS idx_vulnerable_buildids_purl ON binaries.vulnerable_buildids(purl);

CREATE INDEX IF NOT EXISTS idx_binary_vuln_assertion_tenant ON binaries.binary_vuln_assertion(tenant_id);
CREATE INDEX IF NOT EXISTS idx_binary_vuln_assertion_binary ON binaries.binary_vuln_assertion(binary_key);
CREATE INDEX IF NOT EXISTS idx_binary_vuln_assertion_cve ON binaries.binary_vuln_assertion(cve_id);

-- ============================================================================
-- ROW-LEVEL SECURITY
-- ============================================================================

ALTER TABLE binaries.binary_identity ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.binary_identity FORCE ROW LEVEL SECURITY;
CREATE POLICY binary_identity_tenant_isolation ON binaries.binary_identity
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.corpus_snapshots ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.corpus_snapshots FORCE ROW LEVEL SECURITY;
CREATE POLICY corpus_snapshots_tenant_isolation ON binaries.corpus_snapshots
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.binary_package_map ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.binary_package_map FORCE ROW LEVEL SECURITY;
CREATE POLICY binary_package_map_tenant_isolation ON binaries.binary_package_map
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.vulnerable_buildids ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.vulnerable_buildids FORCE ROW LEVEL SECURITY;
CREATE POLICY vulnerable_buildids_tenant_isolation ON binaries.vulnerable_buildids
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.binary_vuln_assertion ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.binary_vuln_assertion FORCE ROW LEVEL SECURITY;
CREATE POLICY binary_vuln_assertion_tenant_isolation ON binaries.binary_vuln_assertion
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

COMMIT;
```

**Acceptance Criteria**:
- [ ] Migration applies cleanly on fresh database
- [ ] Migration is idempotent (IF NOT EXISTS)
- [ ] RLS policies enforce tenant isolation
- [ ] All indexes created
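
A quick manual check of the RLS policies can be run in psql as a non-superuser (superusers bypass RLS). The tenant UUIDs below are illustrative; the sketch assumes one row was inserted under each tenant:

```sql
-- Session sees only its own tenant's rows.
SET app.tenant_id = '11111111-1111-1111-1111-111111111111';
SELECT count(*) FROM binaries.binary_identity;  -- tenant 1 rows only

SET app.tenant_id = '22222222-2222-2222-2222-222222222222';
SELECT count(*) FROM binaries.binary_identity;  -- tenant 2 rows only

-- With no tenant set, require_current_tenant() raises and the query fails.
RESET app.tenant_id;
SELECT count(*) FROM binaries.binary_identity;
```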

---

### T3: Implement Migration Runner

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Implement the migration runner that applies embedded SQL migrations.

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Persistence/BinaryIndexMigrationRunner.cs`

**Implementation**:
```csharp
namespace StellaOps.BinaryIndex.Persistence;

public sealed class BinaryIndexMigrationRunner : IMigrationRunner
{
    // string.GetHashCode() is randomized per process in .NET, so it cannot be
    // used as a cross-process advisory lock key; use a fixed constant instead.
    private const long MigrationLockKey = 600000010001;

    private readonly NpgsqlDataSource _dataSource;
    private readonly ILogger<BinaryIndexMigrationRunner> _logger;

    public BinaryIndexMigrationRunner(
        NpgsqlDataSource dataSource,
        ILogger<BinaryIndexMigrationRunner> logger)
    {
        _dataSource = dataSource;
        _logger = logger;
    }

    public async Task MigrateAsync(CancellationToken ct = default)
    {
        await using var connection = await _dataSource.OpenConnectionAsync(ct);

        // Acquire advisory lock so only one instance applies migrations.
        await using var lockCmd = connection.CreateCommand();
        lockCmd.CommandText = $"SELECT pg_try_advisory_lock({MigrationLockKey})";
        var acquired = (bool)(await lockCmd.ExecuteScalarAsync(ct))!;

        if (!acquired)
        {
            _logger.LogInformation("Migration already in progress, skipping");
            return;
        }

        try
        {
            var migrations = GetEmbeddedMigrations();
            foreach (var (name, sql) in migrations.OrderBy(m => m.name, StringComparer.Ordinal))
            {
                _logger.LogInformation("Applying migration: {Name}", name);
                await using var cmd = connection.CreateCommand();
                cmd.CommandText = sql;
                await cmd.ExecuteNonQueryAsync(ct);
            }
        }
        finally
        {
            await using var unlockCmd = connection.CreateCommand();
            unlockCmd.CommandText = $"SELECT pg_advisory_unlock({MigrationLockKey})";
            await unlockCmd.ExecuteScalarAsync(ct);
        }
    }

    private static IEnumerable<(string name, string sql)> GetEmbeddedMigrations()
    {
        var assembly = typeof(BinaryIndexMigrationRunner).Assembly;
        const string prefix = "StellaOps.BinaryIndex.Persistence.Migrations.";

        foreach (var resourceName in assembly.GetManifestResourceNames()
            .Where(n => n.StartsWith(prefix, StringComparison.Ordinal)
                     && n.EndsWith(".sql", StringComparison.Ordinal)))
        {
            using var stream = assembly.GetManifestResourceStream(resourceName)!;
            using var reader = new StreamReader(stream);
            var sql = reader.ReadToEnd();
            var name = resourceName[prefix.Length..];
            yield return (name, sql);
        }
    }
}
```

**Acceptance Criteria**:
- [ ] Migrations applied on startup
- [ ] Advisory lock prevents concurrent migrations
- [ ] Embedded resources correctly loaded

---

### T4: Implement DbContext and Repositories

**Assignee**: BinaryIndex Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T2, T3

**Description**:
Implement the database context and repository interfaces for core tables.

**Implementation Paths**:
- `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Persistence/BinaryIndexDbContext.cs`
- `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Persistence/Repositories/`

**DbContext**:
```csharp
namespace StellaOps.BinaryIndex.Persistence;

public sealed class BinaryIndexDbContext : IBinaryIndexDbContext
{
    private readonly NpgsqlDataSource _dataSource;
    private readonly ITenantContext _tenantContext;

    public BinaryIndexDbContext(
        NpgsqlDataSource dataSource,
        ITenantContext tenantContext)
    {
        _dataSource = dataSource;
        _tenantContext = tenantContext;
    }

    public async Task<NpgsqlConnection> OpenConnectionAsync(CancellationToken ct = default)
    {
        var connection = await _dataSource.OpenConnectionAsync(ct);

        // Set tenant context for RLS. Use set_config with a parameter rather
        // than interpolating the tenant id into a SET statement, which would
        // be an injection risk.
        await using var cmd = connection.CreateCommand();
        cmd.CommandText = "SELECT set_config('app.tenant_id', @tenantId, false)";
        cmd.Parameters.AddWithValue("tenantId", _tenantContext.TenantId);
        await cmd.ExecuteNonQueryAsync(ct);

        return connection;
    }
}
```

**Repository Interface**:
```csharp
public interface IBinaryIdentityRepository
{
    Task<BinaryIdentity?> GetByBuildIdAsync(string buildId, string buildIdType, CancellationToken ct);
    Task<BinaryIdentity?> GetByKeyAsync(string binaryKey, CancellationToken ct);
    Task<BinaryIdentity> UpsertAsync(BinaryIdentity identity, CancellationToken ct);
    Task<ImmutableArray<BinaryIdentity>> GetBatchAsync(IEnumerable<string> binaryKeys, CancellationToken ct);
}
```

**Acceptance Criteria**:
- [ ] DbContext sets tenant context on connection
- [ ] Repositories implement CRUD operations
- [ ] Dapper used for data access
- [ ] Unit tests pass
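
`UpsertAsync` can lean on the `binary_identity_key_unique` constraint from T2. A sketch only: the column list is abbreviated, the `_dbContext` field and Dapper parameter shape are assumed, and enum-to-text mapping is elided:

```csharp
public async Task<BinaryIdentity> UpsertAsync(BinaryIdentity identity, CancellationToken ct)
{
    // ON CONFLICT targets the (tenant_id, binary_key) unique constraint.
    const string sql = """
        INSERT INTO binaries.binary_identity
            (tenant_id, binary_key, build_id, build_id_type, file_sha256, format, architecture)
        VALUES
            (current_setting('app.tenant_id')::uuid, @BinaryKey, @BuildId, @BuildIdType,
             @FileSha256, @Format, @Architecture)
        ON CONFLICT (tenant_id, binary_key)
        DO UPDATE SET updated_at = NOW()
        RETURNING *;
        """;

    await using var connection = await _dbContext.OpenConnectionAsync(ct);
    return await connection.QuerySingleAsync<BinaryIdentity>(
        new CommandDefinition(sql, identity, cancellationToken: ct));
}
```

Deriving `tenant_id` from the session variable keeps the insert consistent with the RLS `WITH CHECK` clause.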

---

### T5: Integration Tests with Testcontainers

**Assignee**: BinaryIndex Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1-T4

**Description**:
Create integration tests using Testcontainers for PostgreSQL.

**Implementation Path**: `src/BinaryIndex/__Tests/StellaOps.BinaryIndex.Persistence.Tests/`

**Test Class**:
```csharp
namespace StellaOps.BinaryIndex.Persistence.Tests;

public class BinaryIdentityRepositoryTests : IAsyncLifetime
{
    private readonly PostgreSqlContainer _postgres = new PostgreSqlBuilder()
        .WithImage("postgres:16-alpine")
        .Build();

    private NpgsqlDataSource _dataSource = null!;
    private BinaryIdentityRepository _repository = null!;

    public async Task InitializeAsync()
    {
        await _postgres.StartAsync();
        _dataSource = NpgsqlDataSource.Create(_postgres.GetConnectionString());

        var migrationRunner = new BinaryIndexMigrationRunner(
            _dataSource,
            NullLogger<BinaryIndexMigrationRunner>.Instance);
        await migrationRunner.MigrateAsync();

        // The tenant id must be a valid UUID string: the schema declares
        // tenant_id as UUID and the RLS policies compare tenant_id::text.
        var dbContext = new BinaryIndexDbContext(
            _dataSource,
            new TestTenantContext("11111111-1111-1111-1111-111111111111"));
        _repository = new BinaryIdentityRepository(dbContext);
    }

    public async Task DisposeAsync()
    {
        await _dataSource.DisposeAsync();
        await _postgres.DisposeAsync();
    }

    [Fact]
    public async Task UpsertAsync_NewIdentity_CreatesRecord()
    {
        var identity = new BinaryIdentity
        {
            BinaryKey = "test-build-id-123",
            BuildId = "abc123def456",
            BuildIdType = "gnu-build-id",
            FileSha256 = "sha256:...",
            Format = "elf",
            Architecture = "x86-64"
        };

        var result = await _repository.UpsertAsync(identity, CancellationToken.None);

        result.Id.Should().NotBeEmpty();
        result.BinaryKey.Should().Be(identity.BinaryKey);
    }

    [Fact]
    public async Task GetByBuildIdAsync_ExistingIdentity_ReturnsRecord()
    {
        // Arrange
        var identity = new BinaryIdentity { /* ... */ };
        await _repository.UpsertAsync(identity, CancellationToken.None);

        // Act
        var result = await _repository.GetByBuildIdAsync(
            identity.BuildId!, identity.BuildIdType!, CancellationToken.None);

        // Assert
        result.Should().NotBeNull();
        result!.BuildId.Should().Be(identity.BuildId);
    }
}
```

**Acceptance Criteria**:
- [ ] Testcontainers PostgreSQL spins up
- [ ] Migrations apply in tests
- [ ] Repository CRUD operations tested
- [ ] RLS isolation verified

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | BinaryIndex Team | Create Project Structure |
| 2 | T2 | TODO | T1 | BinaryIndex Team | Create Initial Migration |
| 3 | T3 | TODO | T1, T2 | BinaryIndex Team | Implement Migration Runner |
| 4 | T4 | TODO | T2, T3 | BinaryIndex Team | Implement DbContext and Repositories |
| 5 | T5 | TODO | T1-T4 | BinaryIndex Team | Integration Tests with Testcontainers |

---

## Execution Log

| Date (UTC) | Update | Owner |
|------------|--------|-------|
| 2025-12-21 | Sprint created from BinaryIndex architecture. Schema foundational for all BinaryIndex functionality. | Agent |

---

## Decisions & Risks

| Item | Type | Owner | Notes |
|------|------|-------|-------|
| RLS for tenant isolation | Decision | BinaryIndex Team | Consistent with other StellaOps schemas |
| Dapper over EF Core | Decision | BinaryIndex Team | Performance-critical lookups |
| Build-ID as primary identity | Decision | BinaryIndex Team | ELF Build-ID preferred, fallback to SHA-256 |

---

## Success Criteria

- [ ] All 5 tasks marked DONE
- [ ] `binaries` schema deployed and migrated
- [ ] RLS enforces tenant isolation
- [ ] Repository pattern implemented
- [ ] Integration tests pass with Testcontainers
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds with 100% pass rate

docs/implplan/SPRINT_6000_0001_0002_binary_identity_service.md (new file, 390 lines)
@@ -0,0 +1,390 @@

# Sprint 6000.0001.0002 · Binary Identity Service

## Topic & Scope

- Implement the core Binary Identity extraction and storage service.
- Create domain models for BinaryIdentity, BinaryFeatures, and related types.
- Integrate with existing Scanner.Analyzers.Native for ELF/PE/Mach-O parsing.
- **Working directory:** `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Core/`

## Dependencies & Concurrency

- **Upstream**: Sprint 6000.0001.0001 (Binaries Schema)
- **Downstream**: Sprints 6000.0001.0003, 6000.0001.0004
- **Safe to parallelize with**: None

## Documentation Prerequisites

- `docs/modules/binaryindex/architecture.md`
- `src/Scanner/StellaOps.Scanner.Analyzers.Native/` (existing ELF parser)

---

## Tasks

### T1: Create Core Domain Models

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: —

**Description**:
Create domain models for binary identity and features.

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Core/Models/`

**Models**:
```csharp
namespace StellaOps.BinaryIndex.Core.Models;

/// <summary>
/// Unique identity of a binary derived from Build-ID or hashes.
/// </summary>
public sealed record BinaryIdentity
{
    public Guid Id { get; init; }
    public required string BinaryKey { get; init; }    // Primary key: build_id || file_sha256
    public string? BuildId { get; init; }              // ELF GNU Build-ID
    public string? BuildIdType { get; init; }          // gnu-build-id, pe-cv, macho-uuid
    public required string FileSha256 { get; init; }
    public string? TextSha256 { get; init; }           // SHA-256 of .text section
    public required BinaryFormat Format { get; init; }
    public required string Architecture { get; init; }
    public string? OsAbi { get; init; }
    public BinaryType? Type { get; init; }
    public bool IsStripped { get; init; }
    public DateTimeOffset CreatedAt { get; init; }
}

public enum BinaryFormat { Elf, Pe, Macho }
public enum BinaryType { Executable, SharedLibrary, StaticLibrary, Object }

/// <summary>
/// Extended features extracted from a binary.
/// </summary>
public sealed record BinaryFeatures
{
    public required BinaryIdentity Identity { get; init; }
    public ImmutableArray<string> DynamicDeps { get; init; } = [];      // DT_NEEDED
    public ImmutableArray<string> ExportedSymbols { get; init; } = [];
    public ImmutableArray<string> ImportedSymbols { get; init; } = [];
    public BinaryHardening? Hardening { get; init; }
    public string? Interpreter { get; init; }                           // ELF interpreter path
}

public sealed record BinaryHardening(
    bool HasStackCanary,
    bool HasNx,
    bool HasPie,
    bool HasRelro,
    bool HasBindNow);
```
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] All domain models created with immutable records
|
||||||
|
- [ ] XML documentation on all types
|
||||||
|
- [ ] Models align with database schema

---

### T2: Create IBinaryFeatureExtractor Interface

**Assignee**: BinaryIndex Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1

**Description**:
Define the interface for binary feature extraction.

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Core/Services/IBinaryFeatureExtractor.cs`

**Interface**:
```csharp
namespace StellaOps.BinaryIndex.Core.Services;

public interface IBinaryFeatureExtractor
{
    /// <summary>
    /// Extract identity from a binary stream.
    /// </summary>
    Task<BinaryIdentity> ExtractIdentityAsync(
        Stream binaryStream,
        CancellationToken ct = default);

    /// <summary>
    /// Extract full features from a binary stream.
    /// </summary>
    Task<BinaryFeatures> ExtractFeaturesAsync(
        Stream binaryStream,
        FeatureExtractorOptions? options = null,
        CancellationToken ct = default);
}

public sealed record FeatureExtractorOptions
{
    public bool ExtractSymbols { get; init; } = true;
    public bool ExtractHardening { get; init; } = true;
    public int MaxSymbols { get; init; } = 10000;
}
```

**Acceptance Criteria**:
- [ ] Interface defined with async methods
- [ ] Options record for configuration
- [ ] Documentation complete

---

### T3: Implement ElfFeatureExtractor

**Assignee**: BinaryIndex Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1, T2

**Description**:
Implement feature extraction for ELF binaries using existing Scanner.Analyzers.Native code.

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Core/Services/ElfFeatureExtractor.cs`

**Implementation**:
```csharp
namespace StellaOps.BinaryIndex.Core.Services;

public sealed class ElfFeatureExtractor : IBinaryFeatureExtractor
{
    private readonly ILogger<ElfFeatureExtractor> _logger;

    public ElfFeatureExtractor(ILogger<ElfFeatureExtractor> logger)
        => _logger = logger;

    public async Task<BinaryIdentity> ExtractIdentityAsync(
        Stream binaryStream,
        CancellationToken ct = default)
    {
        // Compute file hash
        var fileHash = await ComputeSha256Async(binaryStream, ct);
        binaryStream.Position = 0;

        // Parse ELF header and notes
        var elfReader = new ElfReader();
        var elfInfo = await elfReader.ParseAsync(binaryStream, ct);

        // Extract Build-ID from PT_NOTE sections
        var buildId = elfInfo.Notes
            .FirstOrDefault(n => n.Name == "GNU" && n.Type == 3)
            ?.DescriptorHex;

        // Compute .text section hash if available
        var textHash = elfInfo.Sections
            .FirstOrDefault(s => s.Name == ".text")
            ?.ContentHash;

        var binaryKey = buildId ?? fileHash;

        return new BinaryIdentity
        {
            BinaryKey = binaryKey,
            BuildId = buildId,
            BuildIdType = buildId != null ? "gnu-build-id" : null,
            FileSha256 = fileHash,
            TextSha256 = textHash,
            Format = BinaryFormat.Elf,
            Architecture = MapArchitecture(elfInfo.Machine),
            OsAbi = elfInfo.OsAbi,
            Type = MapBinaryType(elfInfo.Type),
            IsStripped = !elfInfo.HasDebugInfo
        };
    }

    public async Task<BinaryFeatures> ExtractFeaturesAsync(
        Stream binaryStream,
        FeatureExtractorOptions? options = null,
        CancellationToken ct = default)
    {
        options ??= new FeatureExtractorOptions();

        var identity = await ExtractIdentityAsync(binaryStream, ct);
        binaryStream.Position = 0;

        var elfReader = new ElfReader();
        var elfInfo = await elfReader.ParseAsync(binaryStream, ct);

        var dynamicParser = new ElfDynamicSectionParser();
        var dynamicInfo = dynamicParser.Parse(elfInfo);

        ImmutableArray<string> exportedSymbols = [];
        ImmutableArray<string> importedSymbols = [];

        if (options.ExtractSymbols)
        {
            exportedSymbols = elfInfo.DynamicSymbols
                .Where(s => s.Binding == SymbolBinding.Global && s.SectionIndex != 0)
                .Take(options.MaxSymbols)
                .Select(s => s.Name)
                .ToImmutableArray();

            importedSymbols = elfInfo.DynamicSymbols
                .Where(s => s.SectionIndex == 0)
                .Take(options.MaxSymbols)
                .Select(s => s.Name)
                .ToImmutableArray();
        }

        BinaryHardening? hardening = null;
        if (options.ExtractHardening)
        {
            hardening = new BinaryHardening(
                HasStackCanary: dynamicInfo.HasStackCanary,
                HasNx: elfInfo.HasNxBit,
                HasPie: elfInfo.Type == ElfType.Dyn,
                HasRelro: dynamicInfo.HasRelro,
                HasBindNow: dynamicInfo.HasBindNow);
        }

        return new BinaryFeatures
        {
            Identity = identity,
            DynamicDeps = dynamicInfo.Needed.ToImmutableArray(),
            ExportedSymbols = exportedSymbols,
            ImportedSymbols = importedSymbols,
            Hardening = hardening,
            Interpreter = dynamicInfo.Interpreter
        };
    }

    private static async Task<string> ComputeSha256Async(Stream stream, CancellationToken ct)
    {
        using var sha256 = SHA256.Create();
        var hash = await sha256.ComputeHashAsync(stream, ct);
        return Convert.ToHexString(hash).ToLowerInvariant();
    }

    private static string MapArchitecture(ushort machine) => machine switch
    {
        0x3E => "x86-64",
        0xB7 => "aarch64",
        0x03 => "x86",
        0x28 => "arm",
        _ => $"unknown-{machine:X}"
    };

    private static BinaryType MapBinaryType(ElfType type) => type switch
    {
        ElfType.Exec => BinaryType.Executable,
        ElfType.Dyn => BinaryType.SharedLibrary,
        ElfType.Rel => BinaryType.Object,
        _ => BinaryType.Executable
    };
}
```

**Acceptance Criteria**:
- [ ] Build-ID extraction from ELF notes
- [ ] File and .text section hashing
- [ ] Symbol extraction with limits
- [ ] Hardening flag detection
- [ ] Reuses Scanner.Analyzers.Native code

---

### T4: Implement IBinaryIdentityService

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T3

**Description**:
Implement the service that coordinates extraction and storage.

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Core/Services/BinaryIdentityService.cs`

**Interface**:
```csharp
public interface IBinaryIdentityService
{
    Task<BinaryIdentity> GetOrCreateAsync(Stream binaryStream, CancellationToken ct);
    Task<BinaryIdentity?> GetByBuildIdAsync(string buildId, CancellationToken ct);
    Task<BinaryIdentity?> GetByKeyAsync(string binaryKey, CancellationToken ct);
}
```

**Implementation**:
```csharp
public sealed class BinaryIdentityService : IBinaryIdentityService
{
    private readonly IBinaryFeatureExtractor _extractor;
    private readonly IBinaryIdentityRepository _repository;

    public BinaryIdentityService(
        IBinaryFeatureExtractor extractor,
        IBinaryIdentityRepository repository)
    {
        _extractor = extractor;
        _repository = repository;
    }

    public async Task<BinaryIdentity> GetOrCreateAsync(
        Stream binaryStream,
        CancellationToken ct = default)
    {
        var identity = await _extractor.ExtractIdentityAsync(binaryStream, ct);

        // Check if already exists
        var existing = await _repository.GetByKeyAsync(identity.BinaryKey, ct);
        if (existing != null)
            return existing;

        // Create new
        return await _repository.UpsertAsync(identity, ct);
    }

    public Task<BinaryIdentity?> GetByBuildIdAsync(string buildId, CancellationToken ct) =>
        _repository.GetByBuildIdAsync(buildId, "gnu-build-id", ct);

    public Task<BinaryIdentity?> GetByKeyAsync(string binaryKey, CancellationToken ct) =>
        _repository.GetByKeyAsync(binaryKey, ct);
}
```

**Acceptance Criteria**:
- [ ] Service coordinates extraction and storage
- [ ] Deduplication by binary key
- [ ] Integration with repository
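The get-or-create flow above is a classic dedup-by-key pattern: look up by `BinaryKey`, return the existing row if found, otherwise upsert. A toy Python stand-in for the repository (all names here are illustrative, not part of the real API) makes the contract explicit:

```python
class InMemoryIdentityRepo:
    """Toy stand-in for IBinaryIdentityRepository: dedup by binary key."""

    def __init__(self):
        self._by_key = {}

    def get_or_create(self, key: str, make):
        """Return (record, created_flag); only call `make` on a cache miss."""
        existing = self._by_key.get(key)
        if existing is not None:
            return existing, False
        created = self._by_key[key] = make()
        return created, True

repo = InMemoryIdentityRepo()
a, created_a = repo.get_or_create("bid-1", lambda: {"key": "bid-1"})
b, created_b = repo.get_or_create("bid-1", lambda: {"key": "bid-1"})
print(created_a, created_b, a is b)  # True False True
```

In the real service the lookup and upsert are two database round-trips, so the upsert must still be idempotent on `BinaryKey` to survive concurrent scanners racing on the same binary.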

---

### T5: Unit Tests

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T4

**Description**:
Unit tests for domain models and feature extraction.

**Test Cases**:
- ELF Build-ID extraction from real binaries
- SHA-256 computation determinism
- Symbol extraction limits
- Hardening flag detection

**Acceptance Criteria**:
- [ ] 90%+ code coverage on core models
- [ ] Real ELF binary test fixtures
- [ ] All tests pass
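The "SHA-256 computation determinism" case is worth pinning down precisely: hashing the same bytes must yield a byte-identical, 64-character, lower-case hex digest every time, since `BinaryKey` falls back to it. A quick Python sketch of the property the xUnit test should assert:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Lower-case hex digest, mirroring Convert.ToHexString(...).ToLowerInvariant()."""
    return hashlib.sha256(data).hexdigest()

# Same input -> byte-identical digest, however many times it runs.
payload = b"\x7fELF" + b"\x00" * 60
d1 = sha256_hex(payload)
d2 = sha256_hex(payload)
print(d1 == d2, len(d1), d1 == d1.lower())  # True 64 True
```
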

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | BinaryIndex Team | Create Core Domain Models |
| 2 | T2 | TODO | T1 | BinaryIndex Team | Create IBinaryFeatureExtractor Interface |
| 3 | T3 | TODO | T1, T2 | BinaryIndex Team | Implement ElfFeatureExtractor |
| 4 | T4 | TODO | T1-T3 | BinaryIndex Team | Implement IBinaryIdentityService |
| 5 | T5 | TODO | T1-T4 | BinaryIndex Team | Unit Tests |

---

## Success Criteria

- [ ] All 5 tasks marked DONE
- [ ] ELF Build-ID extraction working
- [ ] Binary identity deduplication
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds with 90%+ coverage
docs/implplan/SPRINT_6000_0001_0003_debian_corpus_connector.md · 355 lines · new file
@@ -0,0 +1,355 @@
# Sprint 6000.0001.0003 · Debian Corpus Connector

## Topic & Scope

- Implement the Debian/Ubuntu binary corpus connector.
- Fetch packages from Debian/Ubuntu repositories.
- Extract binaries and index them with their identities.
- Support snapshot-based ingestion for determinism.
- **Working directory:** `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Corpus.Debian/`

## Dependencies & Concurrency

- **Upstream**: Sprint 6000.0001.0001, 6000.0001.0002
- **Downstream**: Sprint 6000.0001.0004
- **Safe to parallelize with**: None

## Documentation Prerequisites

- `docs/modules/binaryindex/architecture.md`
- Debian repository structure documentation

---

## Tasks

### T1: Create Corpus Connector Framework

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Corpus/`

**Interfaces**:
```csharp
namespace StellaOps.BinaryIndex.Corpus;

public interface IBinaryCorpusConnector
{
    string ConnectorId { get; }
    string[] SupportedDistros { get; }

    Task<CorpusSnapshot> FetchSnapshotAsync(CorpusQuery query, CancellationToken ct);
    IAsyncEnumerable<PackageInfo> ListPackagesAsync(CorpusSnapshot snapshot, CancellationToken ct);
    IAsyncEnumerable<ExtractedBinary> ExtractBinariesAsync(PackageInfo pkg, CancellationToken ct);
}

public sealed record CorpusQuery(
    string Distro,
    string Release,
    string Architecture,
    string[]? ComponentFilter = null);

public sealed record CorpusSnapshot(
    Guid Id,
    string Distro,
    string Release,
    string Architecture,
    string MetadataDigest,
    DateTimeOffset CapturedAt);

public sealed record PackageInfo(
    string Name,
    string Version,
    string SourcePackage,
    string Architecture,
    string Filename,
    long Size,
    string Sha256);

public sealed record ExtractedBinary(
    BinaryIdentity Identity,
    string PathInPackage,
    PackageInfo Package);
```

**Acceptance Criteria**:
- [ ] Generic connector interface defined
- [ ] Snapshot-based ingestion model
- [ ] Async enumerable for streaming
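The `IAsyncEnumerable<PackageInfo>` design matters because a Debian `Packages` index can describe tens of thousands of packages: the connector should surface them one at a time rather than materialize the whole list. A Python generator sketches the same streaming contract (the sample package data is invented):

```python
from typing import Iterator

def list_packages(stanzas: list[dict]) -> Iterator[dict]:
    """Stream packages lazily, mirroring IAsyncEnumerable<PackageInfo>."""
    for s in stanzas:
        # Map raw Packages-file fields onto the connector's package record.
        yield {"name": s["Package"], "version": s["Version"], "sha256": s["SHA256"]}

snapshot_stanzas = [
    {"Package": "libssl3", "Version": "3.0.11-1", "SHA256": "ab" * 32},
    {"Package": "zlib1g", "Version": "1.2.13-1", "SHA256": "cd" * 32},
]
names = [p["name"] for p in list_packages(snapshot_stanzas)]
print(names)  # ['libssl3', 'zlib1g']
```

Consumers can stop early (e.g. after a component filter match) without paying for the rest of the index.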

---

### T2: Implement Debian Repository Client

**Assignee**: BinaryIndex Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Corpus.Debian/DebianRepoClient.cs`

**Implementation**:
```csharp
public sealed class DebianRepoClient : IDebianRepoClient
{
    private readonly HttpClient _httpClient;
    private readonly ILogger<DebianRepoClient> _logger;

    public DebianRepoClient(HttpClient httpClient, ILogger<DebianRepoClient> logger)
    {
        _httpClient = httpClient;
        _logger = logger;
    }

    public async Task<ReleaseFile> FetchReleaseAsync(
        string mirror,
        string release,
        CancellationToken ct)
    {
        // Fetch and parse Release file
        var releaseUrl = $"{mirror}/dists/{release}/Release";
        var content = await _httpClient.GetStringAsync(releaseUrl, ct);

        // Parse Release file format
        return ParseReleaseFile(content);
    }

    public async Task<PackagesFile> FetchPackagesAsync(
        string mirror,
        string release,
        string component,
        string architecture,
        CancellationToken ct)
    {
        // Fetch and decompress Packages.gz
        var packagesUrl = $"{mirror}/dists/{release}/{component}/binary-{architecture}/Packages.gz";
        using var response = await _httpClient.GetStreamAsync(packagesUrl, ct);
        using var gzip = new GZipStream(response, CompressionMode.Decompress);
        using var reader = new StreamReader(gzip);

        var content = await reader.ReadToEndAsync(ct);
        return ParsePackagesFile(content);
    }

    public async Task<Stream> DownloadPackageAsync(
        string mirror,
        string filename,
        CancellationToken ct)
    {
        // The Packages "Filename" field is a pool path relative to the mirror root.
        var url = $"{mirror}/{filename}";
        return await _httpClient.GetStreamAsync(url, ct);
    }
}
```

**Acceptance Criteria**:
- [ ] Release file parsing
- [ ] Packages file parsing
- [ ] Package download with verification
- [ ] GPG signature verification (optional)
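The `ParsePackagesFile` helper referenced above has to handle the Debian control-file shape: blank-line-separated stanzas of `Field: value` lines, with whitespace-indented continuation lines folded into the previous field. A Python sketch of that parsing rule (sample stanzas are invented, including the SHA256 values):

```python
def parse_packages(text: str) -> list[dict]:
    """Parse Debian Packages-file stanzas: blank-line separated `Field: value` blocks."""
    packages, current, last_field = [], {}, None
    for line in text.splitlines():
        if not line.strip():           # blank line terminates a stanza
            if current:
                packages.append(current)
                current, last_field = {}, None
            continue
        if line[0] in " \t" and last_field:  # continuation line
            current[last_field] += "\n" + line.strip()
            continue
        field, _, value = line.partition(":")  # split on FIRST colon only
        last_field = field.strip()
        current[last_field] = value.strip()
    if current:
        packages.append(current)
    return packages

sample = """Package: zlib1g
Version: 1:1.2.13.dfsg-1
Filename: pool/main/z/zlib/zlib1g_1.2.13.dfsg-1_amd64.deb
SHA256: 0123456789abcdef

Package: libssl3
Version: 3.0.11-1~deb12u2
Filename: pool/main/o/openssl/libssl3_3.0.11-1~deb12u2_amd64.deb
SHA256: fedcba9876543210
"""
pkgs = parse_packages(sample)
print(len(pkgs), pkgs[0]["Package"], pkgs[1]["Version"])  # 2 zlib1g 3.0.11-1~deb12u2
```

Splitting on the first colon only is important: epoch-bearing versions like `1:1.2.13.dfsg-1` contain a colon in the value.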

---

### T3: Implement Package Extractor

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Corpus.Debian/DebPackageExtractor.cs`

**Implementation**:
```csharp
public sealed class DebPackageExtractor : IPackageExtractor
{
    public async IAsyncEnumerable<ExtractedFile> ExtractAsync(
        Stream debStream,
        [EnumeratorCancellation] CancellationToken ct)
    {
        // .deb is an ar archive containing:
        // - debian-binary
        // - control.tar.gz
        // - data.tar.xz (or .gz, .zst)

        using var arReader = new ArReader(debStream);

        // Find and extract data archive
        var dataEntry = arReader.Entries
            .FirstOrDefault(e => e.Name.StartsWith("data.tar"));

        if (dataEntry == null)
            yield break;

        using var dataStream = await DecompressAsync(dataEntry, ct);
        using var tarReader = new TarReader(dataStream);

        await foreach (var entry in tarReader.ReadEntriesAsync(ct))
        {
            if (!IsElfFile(entry))
                continue;

            yield return new ExtractedFile(
                Path: entry.Name,
                Stream: entry.DataStream,
                Mode: entry.Mode);
        }
    }

    private static bool IsElfFile(TarEntry entry)
    {
        // Check if file path suggests a binary
        var path = entry.Name;
        if (path.StartsWith("./usr/lib/") ||
            path.StartsWith("./usr/bin/") ||
            path.StartsWith("./lib/"))
        {
            // Check ELF magic
            if (entry.DataStream.Length >= 4)
            {
                Span<byte> magic = stackalloc byte[4];
                entry.DataStream.ReadExactly(magic);
                entry.DataStream.Position = 0;
                return magic.SequenceEqual("\x7FELF"u8);
            }
        }
        return false;
    }
}
```

**Acceptance Criteria**:
- [ ] .deb archive extraction
- [ ] ELF file detection
- [ ] Memory-efficient streaming
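The comment in `ExtractAsync` notes that a `.deb` is a Unix `ar` archive. The container format is simple enough to sketch end to end: an `!<arch>\n` magic, then per-entry 60-byte headers (16-byte name, decimal size at offset 48, `` `\n`` terminator) with 2-byte-aligned data. This Python sketch builds a tiny synthetic archive and walks it the way `ArReader` would (the helper names are illustrative):

```python
AR_MAGIC = b"!<arch>\n"

def list_ar_entries(blob: bytes) -> list[tuple[str, bytes]]:
    """Walk a Unix ar archive (the container format of .deb); return (name, data) pairs."""
    assert blob[:8] == AR_MAGIC, "not an ar archive"
    entries, off = [], 8
    while off + 60 <= len(blob):
        header = blob[off:off + 60]
        name = header[0:16].decode("ascii").rstrip(" /")
        size = int(header[48:58].decode("ascii").strip())
        entries.append((name, blob[off + 60:off + 60 + size]))
        off += 60 + size + (size % 2)  # entries are 2-byte aligned
    return entries

def ar_entry(name: bytes, data: bytes) -> bytes:
    """Build one ar entry header + payload (fixed-width ASCII fields)."""
    header = (name.ljust(16) + b"0".ljust(12) + b"0".ljust(6) + b"0".ljust(6)
              + b"100644".ljust(8) + str(len(data)).encode().ljust(10) + b"`\n")
    return header + data + (b"\n" if len(data) % 2 else b"")

blob = AR_MAGIC + ar_entry(b"debian-binary", b"2.0\n") + ar_entry(b"data.tar.xz", b"xz-bytes")
print([n for n, _ in list_ar_entries(blob)])  # ['debian-binary', 'data.tar.xz']
```

Finding the `data.tar.*` member by name prefix, as the C# code does, works regardless of whether the payload compression is xz, gz, or zst.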

---

### T4: Implement DebianCorpusConnector

**Assignee**: BinaryIndex Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1-T3

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Corpus.Debian/DebianCorpusConnector.cs`

**Implementation**:
```csharp
public sealed class DebianCorpusConnector : IBinaryCorpusConnector
{
    public string ConnectorId => "debian";
    public string[] SupportedDistros => ["debian", "ubuntu"];

    private readonly IDebianRepoClient _repoClient;
    private readonly IPackageExtractor _extractor;
    private readonly IBinaryFeatureExtractor _featureExtractor;
    private readonly ICorpusSnapshotRepository _snapshotRepo;

    public async Task<CorpusSnapshot> FetchSnapshotAsync(
        CorpusQuery query,
        CancellationToken ct)
    {
        var mirror = GetMirrorUrl(query.Distro);
        var release = await _repoClient.FetchReleaseAsync(mirror, query.Release, ct);

        var snapshot = new CorpusSnapshot(
            Id: Guid.NewGuid(),
            Distro: query.Distro,
            Release: query.Release,
            Architecture: query.Architecture,
            MetadataDigest: release.Sha256,
            CapturedAt: DateTimeOffset.UtcNow);

        await _snapshotRepo.CreateAsync(snapshot, ct);
        return snapshot;
    }

    public async IAsyncEnumerable<PackageInfo> ListPackagesAsync(
        CorpusSnapshot snapshot,
        [EnumeratorCancellation] CancellationToken ct)
    {
        var mirror = GetMirrorUrl(snapshot.Distro);

        foreach (var component in new[] { "main", "contrib" })
        {
            var packages = await _repoClient.FetchPackagesAsync(
                mirror, snapshot.Release, component, snapshot.Architecture, ct);

            foreach (var pkg in packages.Packages)
            {
                yield return new PackageInfo(
                    Name: pkg.Package,
                    Version: pkg.Version,
                    SourcePackage: pkg.Source ?? pkg.Package,
                    Architecture: pkg.Architecture,
                    Filename: pkg.Filename,
                    Size: pkg.Size,
                    Sha256: pkg.Sha256);
            }
        }
    }

    public async IAsyncEnumerable<ExtractedBinary> ExtractBinariesAsync(
        PackageInfo pkg,
        [EnumeratorCancellation] CancellationToken ct)
    {
        var mirror = GetMirrorUrl("debian"); // Simplified
        using var debStream = await _repoClient.DownloadPackageAsync(mirror, pkg.Filename, ct);

        await foreach (var file in _extractor.ExtractAsync(debStream, ct))
        {
            var identity = await _featureExtractor.ExtractIdentityAsync(file.Stream, ct);

            yield return new ExtractedBinary(
                Identity: identity,
                PathInPackage: file.Path,
                Package: pkg);
        }
    }
}
```

**Acceptance Criteria**:
- [ ] Snapshot capture from Release file
- [ ] Package listing from Packages file
- [ ] Binary extraction and identity creation
- [ ] Integration with identity service

---

### T5: Integration Tests

**Assignee**: BinaryIndex Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1-T4

**Test Cases**:
- Fetch real Debian Release file
- Parse real Packages file
- Extract binaries from sample .deb
- End-to-end snapshot and extraction

**Acceptance Criteria**:
- [ ] Real Debian repository integration test
- [ ] Sample .deb extraction test
- [ ] Build-ID extraction from real binaries

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | BinaryIndex Team | Create Corpus Connector Framework |
| 2 | T2 | TODO | T1 | BinaryIndex Team | Implement Debian Repository Client |
| 3 | T3 | TODO | T1 | BinaryIndex Team | Implement Package Extractor |
| 4 | T4 | TODO | T1-T3 | BinaryIndex Team | Implement DebianCorpusConnector |
| 5 | T5 | TODO | T1-T4 | BinaryIndex Team | Integration Tests |

---

## Success Criteria

- [ ] All 5 tasks marked DONE
- [ ] Debian package fetching operational
- [ ] Binary extraction and indexing working
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds
docs/implplan/SPRINT_6000_0002_0001_fix_evidence_parser.md · 372 lines · new file
@@ -0,0 +1,372 @@
# Sprint 6000.0002.0001 · Fix Evidence Parser

## Topic & Scope

- Implement parsers for distro-specific CVE fix evidence.
- Parse Debian/Ubuntu changelogs for CVE mentions.
- Parse patch headers (DEP-3) for CVE references.
- Parse Alpine APKBUILD secfixes for CVE mappings.
- **Working directory:** `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.FixIndex/`

## Dependencies & Concurrency

- **Upstream**: Sprint 6000.0001.x (MVP 1 complete)
- **Downstream**: Sprint 6000.0002.0002 (Fix Index Builder)
- **Safe to parallelize with**: Sprint 6000.0002.0003 (Version Comparators)

## Documentation Prerequisites

- `docs/modules/binaryindex/architecture.md`
- Advisory: MVP 2 section on patch-aware backport handling
- Debian Policy on changelog format
- DEP-3 patch header specification

---

## Tasks

### T1: Create Fix Evidence Domain Models

**Assignee**: BinaryIndex Team
**Story Points**: 2
**Status**: TODO

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.FixIndex/Models/`

**Models**:
```csharp
namespace StellaOps.BinaryIndex.FixIndex.Models;

public sealed record FixEvidence
{
    public required string Distro { get; init; }
    public required string Release { get; init; }
    public required string SourcePkg { get; init; }
    public required string CveId { get; init; }
    public required FixState State { get; init; }
    public string? FixedVersion { get; init; }
    public required FixMethod Method { get; init; }
    public required decimal Confidence { get; init; }
    public required FixEvidencePayload Evidence { get; init; }
    public Guid? SnapshotId { get; init; }
    public DateTimeOffset CreatedAt { get; init; }
}

public enum FixState { Fixed, Vulnerable, NotAffected, Wontfix, Unknown }
public enum FixMethod { SecurityFeed, Changelog, PatchHeader, UpstreamPatchMatch }

public abstract record FixEvidencePayload;

public sealed record ChangelogEvidence : FixEvidencePayload
{
    public required string File { get; init; }
    public required string Version { get; init; }
    public required string Excerpt { get; init; }
    public int? LineNumber { get; init; }
}

public sealed record PatchHeaderEvidence : FixEvidencePayload
{
    public required string PatchPath { get; init; }
    public required string PatchSha256 { get; init; }
    public required string HeaderExcerpt { get; init; }
}

public sealed record SecurityFeedEvidence : FixEvidencePayload
{
    public required string FeedId { get; init; }
    public required string EntryId { get; init; }
    public required DateTimeOffset PublishedAt { get; init; }
}
```

**Acceptance Criteria**:
- [ ] All evidence types modeled
- [ ] Confidence levels defined
- [ ] Evidence payloads for auditability
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### T2: Implement Debian Changelog Parser
|
||||||
|
|
||||||
|
**Assignee**: BinaryIndex Team
|
||||||
|
**Story Points**: 5
|
||||||
|
**Status**: TODO
|
||||||
|
**Dependencies**: T1
|
||||||
|
|
||||||
|
**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.FixIndex/Parsers/DebianChangelogParser.cs`
|
||||||
|
|
||||||
|
**Implementation**:
|
||||||
|
```csharp
|
||||||
|
namespace StellaOps.BinaryIndex.FixIndex.Parsers;
|
||||||
|
|
||||||
|
public sealed class DebianChangelogParser : IChangelogParser
|
||||||
|
{
|
||||||
|
private static readonly Regex CvePattern = new(@"\bCVE-\d{4}-\d{4,7}\b", RegexOptions.Compiled);
|
||||||
|
private static readonly Regex EntryHeaderPattern = new(@"^(\S+)\s+\(([^)]+)\)\s+", RegexOptions.Compiled);
|
||||||
|
private static readonly Regex TrailerPattern = new(@"^\s+--\s+", RegexOptions.Compiled);
|
||||||
|
|
||||||
|
public IEnumerable<FixEvidence> ParseTopEntry(
|
||||||
|
string changelog,
|
||||||
|
string distro,
|
||||||
|
string release,
|
||||||
|
string sourcePkg)
|
||||||
|
{
|
||||||
|
var lines = changelog.Split('\n');
|
||||||
|
if (lines.Length == 0)
|
||||||
|
yield break;
|
||||||
|
|
||||||
|
// Parse first entry header
|
||||||
|
var headerMatch = EntryHeaderPattern.Match(lines[0]);
|
||||||
|
if (!headerMatch.Success)
|
||||||
|
yield break;
|
||||||
|
|
||||||
|
var version = headerMatch.Groups[2].Value;
|
||||||
|
|
||||||
|
// Collect entry lines until trailer
|
||||||
|
var entryLines = new List<string> { lines[0] };
|
||||||
|
foreach (var line in lines.Skip(1))
|
||||||
|
{
|
||||||
|
entryLines.Add(line);
|
||||||
|
if (TrailerPattern.IsMatch(line))
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
|
||||||
|
var entryText = string.Join('\n', entryLines);
|
||||||
|
var cves = CvePattern.Matches(entryText)
|
||||||
|
.Select(m => m.Value)
|
||||||
|
.Distinct();
|
||||||
|
|
||||||
|
foreach (var cve in cves)
|
||||||
|
{
|
||||||
|
yield return new FixEvidence
|
||||||
|
{
|
||||||
|
Distro = distro,
|
||||||
|
Release = release,
|
||||||
|
SourcePkg = sourcePkg,
|
||||||
|
CveId = cve,
|
||||||
|
State = FixState.Fixed,
|
||||||
|
FixedVersion = version,
|
||||||
|
Method = FixMethod.Changelog,
|
||||||
|
Confidence = 0.80m,
|
||||||
|
Evidence = new ChangelogEvidence
|
||||||
|
{
|
||||||
|
File = "debian/changelog",
|
||||||
|
Version = version,
|
||||||
|
Excerpt = entryText.Length > 2000 ? entryText[..2000] : entryText
|
||||||
|
}
|
||||||
|
};
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
**Acceptance Criteria**:
|
||||||
|
- [ ] Parse top changelog entry
|
||||||
|
- [ ] Extract CVE mentions
|
||||||
|
- [ ] Store evidence excerpt
|
||||||
|
- [ ] Handle malformed changelogs gracefully
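To make the extraction behavior concrete, here is a minimal standalone sketch that exercises the same entry-header and CVE regexes against a synthetic changelog entry (the changelog text and maintainer line are invented for illustration, not taken from a real package):

```csharp
using System;
using System.Linq;
using System.Text.RegularExpressions;

// Synthetic Debian-style changelog entry; regexes mirror DebianChangelogParser above.
var changelog = string.Join('\n',
    "openssl (3.0.11-1~deb12u2) bookworm-security; urgency=medium",
    "",
    "  * Fix CVE-2024-0727: PKCS12 NULL pointer dereference.",
    "",
    " -- Maintainer Name <name@example.org>  Mon, 01 Jan 2024 00:00:00 +0000");

var entryHeader = new Regex(@"^(\S+)\s+\(([^)]+)\)\s+");
var cvePattern = new Regex(@"\bCVE-\d{4}-\d{4,7}\b");

// The fixed version comes from the parenthesized group in the first line.
var version = entryHeader.Match(changelog.Split('\n')[0]).Groups[2].Value;
var cves = cvePattern.Matches(changelog).Select(m => m.Value).Distinct().ToArray();

Console.WriteLine(version);                // 3.0.11-1~deb12u2
Console.WriteLine(string.Join(",", cves)); // CVE-2024-0727
```

This is why the parser reads only the top entry: the header of the most recent entry carries the version that first shipped the listed fixes.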

---

### T3: Implement Patch Header Parser

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.FixIndex/Parsers/PatchHeaderParser.cs`

**Implementation**:

```csharp
public sealed class PatchHeaderParser : IPatchParser
{
    private static readonly Regex CvePattern = new(@"\bCVE-\d{4}-\d{4,7}\b", RegexOptions.Compiled);

    public IEnumerable<FixEvidence> ParsePatches(
        string patchesDir,
        IEnumerable<(string path, string content, string sha256)> patches,
        string distro,
        string release,
        string sourcePkg,
        string version)
    {
        foreach (var (path, content, sha256) in patches)
        {
            // Read first 80 lines as header
            var headerLines = content.Split('\n').Take(80);
            var header = string.Join('\n', headerLines);

            // Also check filename for CVE
            var searchText = header + "\n" + Path.GetFileName(path);
            var cves = CvePattern.Matches(searchText)
                .Select(m => m.Value)
                .Distinct();

            foreach (var cve in cves)
            {
                yield return new FixEvidence
                {
                    Distro = distro,
                    Release = release,
                    SourcePkg = sourcePkg,
                    CveId = cve,
                    State = FixState.Fixed,
                    FixedVersion = version,
                    Method = FixMethod.PatchHeader,
                    Confidence = 0.87m,
                    Evidence = new PatchHeaderEvidence
                    {
                        PatchPath = path,
                        PatchSha256 = sha256,
                        HeaderExcerpt = header.Length > 1200 ? header[..1200] : header
                    }
                };
            }
        }
    }
}
```

**Acceptance Criteria**:
- [ ] Parse patch headers for CVE mentions
- [ ] Check patch filenames
- [ ] Store patch digests for verification
- [ ] Support DEP-3 format
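DEP-3 headers commonly mention CVE IDs in free-text fields, and distro patches are often named after the CVE they address. The sketch below (synthetic header and filename, names invented for illustration) shows how scanning the header text plus the filename yields the union of distinct CVE IDs:

```csharp
using System;
using System.Linq;
using System.Text.RegularExpressions;

// Synthetic DEP-3 style header and patch filename for illustration.
var header = string.Join('\n',
    "Description: Fix heap overflow in frame parser",
    "Origin: backport, fixes CVE-2023-1111",
    "Last-Update: 2023-06-01");
var fileName = "CVE-2023-2222-frame-parser.patch";

var cvePattern = new Regex(@"\bCVE-\d{4}-\d{4,7}\b");

// Same idea as PatchHeaderParser: search header text and filename together.
var searchText = header + "\n" + fileName;
var cves = cvePattern.Matches(searchText).Select(m => m.Value).Distinct().ToArray();

Console.WriteLine(string.Join(",", cves)); // CVE-2023-1111,CVE-2023-2222
```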

---

### T4: Implement Alpine Secfixes Parser

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.FixIndex/Parsers/AlpineSecfixesParser.cs`

**Implementation**:

```csharp
public sealed class AlpineSecfixesParser : ISecfixesParser
{
    // APKBUILD secfixes format:
    // # secfixes:
    // #   1.2.3-r0:
    // #     - CVE-2024-1234
    // #     - CVE-2024-1235
    private static readonly Regex SecfixesPattern = new(
        @"^#\s*secfixes:\s*$", RegexOptions.Compiled | RegexOptions.Multiline);
    private static readonly Regex VersionPattern = new(
        @"^#\s+(\d+\.\d+[^:]*):$", RegexOptions.Compiled);
    private static readonly Regex CvePattern = new(
        @"^#\s+-\s+(CVE-\d{4}-\d{4,7})$", RegexOptions.Compiled);

    public IEnumerable<FixEvidence> Parse(
        string apkbuild,
        string distro,
        string release,
        string sourcePkg)
    {
        var lines = apkbuild.Split('\n');
        var inSecfixes = false;
        string? currentVersion = null;

        foreach (var line in lines)
        {
            if (SecfixesPattern.IsMatch(line))
            {
                inSecfixes = true;
                continue;
            }

            if (!inSecfixes)
                continue;

            // Exit secfixes block on non-comment line
            if (!line.TrimStart().StartsWith('#'))
            {
                inSecfixes = false;
                continue;
            }

            var versionMatch = VersionPattern.Match(line);
            if (versionMatch.Success)
            {
                currentVersion = versionMatch.Groups[1].Value;
                continue;
            }

            var cveMatch = CvePattern.Match(line);
            if (cveMatch.Success && currentVersion != null)
            {
                yield return new FixEvidence
                {
                    Distro = distro,
                    Release = release,
                    SourcePkg = sourcePkg,
                    CveId = cveMatch.Groups[1].Value,
                    State = FixState.Fixed,
                    FixedVersion = currentVersion,
                    Method = FixMethod.SecurityFeed, // APKBUILD is authoritative
                    Confidence = 0.95m,
                    Evidence = new SecurityFeedEvidence
                    {
                        FeedId = "alpine-secfixes",
                        EntryId = $"{sourcePkg}/{currentVersion}",
                        PublishedAt = DateTimeOffset.UtcNow
                    }
                };
            }
        }
    }
}
```

**Acceptance Criteria**:
- [ ] Parse APKBUILD secfixes section
- [ ] Extract version-to-CVE mappings
- [ ] High confidence for authoritative source
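The state machine and regexes above can be exercised end to end against a small synthetic APKBUILD fragment (the package contents are invented for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Synthetic APKBUILD fragment; the regexes and state machine mirror the parser above.
var apkbuild = string.Join('\n',
    "pkgname=demo",
    "# secfixes:",
    "#   1.2.3-r0:",
    "#     - CVE-2024-1234",
    "#     - CVE-2024-1235",
    "pkgver=1.2.3");

var versionPattern = new Regex(@"^#\s+(\d+\.\d+[^:]*):$");
var cvePattern = new Regex(@"^#\s+-\s+(CVE-\d{4}-\d{4,7})$");

var fixes = new List<(string Version, string Cve)>();
var inSecfixes = false;
string? version = null;
foreach (var line in apkbuild.Split('\n'))
{
    if (Regex.IsMatch(line, @"^#\s*secfixes:\s*$")) { inSecfixes = true; continue; }
    if (!inSecfixes) continue;
    // A non-comment line (here: pkgver=...) closes the secfixes block.
    if (!line.TrimStart().StartsWith('#')) { inSecfixes = false; continue; }
    var v = versionPattern.Match(line);
    if (v.Success) { version = v.Groups[1].Value; continue; }
    var c = cvePattern.Match(line);
    if (c.Success && version != null) fixes.Add((version, c.Groups[1].Value));
}

Console.WriteLine(fixes.Count);      // 2
Console.WriteLine(fixes[0].Version); // 1.2.3-r0
```

Both CVEs map to the same `1.2.3-r0` version, which is exactly the version-to-CVE mapping the acceptance criteria call for.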

---

### T5: Unit Tests with Real Changelogs

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T4

**Test Fixtures**:
- Real Debian openssl changelog
- Real Ubuntu libssl changelog
- Sample patches with CVE headers
- Real Alpine openssl APKBUILD

**Acceptance Criteria**:
- [ ] Test fixtures from real packages
- [ ] CVE extraction accuracy tests
- [ ] Confidence scoring validation

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | BinaryIndex Team | Create Fix Evidence Domain Models |
| 2 | T2 | TODO | T1 | BinaryIndex Team | Implement Debian Changelog Parser |
| 3 | T3 | TODO | T1 | BinaryIndex Team | Implement Patch Header Parser |
| 4 | T4 | TODO | T1 | BinaryIndex Team | Implement Alpine Secfixes Parser |
| 5 | T5 | TODO | T1-T4 | BinaryIndex Team | Unit Tests with Real Changelogs |

---

## Success Criteria

- [ ] All 5 tasks marked DONE
- [ ] Changelog CVE extraction working
- [ ] Patch header parsing working
- [ ] 95%+ accuracy on test fixtures
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds

395 docs/implplan/SPRINT_6000_0003_0001_fingerprint_storage.md Normal file
@@ -0,0 +1,395 @@

# Sprint 6000.0003.0001 · Fingerprint Storage

## Topic & Scope

- Implement database and blob storage for vulnerable function fingerprints.
- Create tables for fingerprint storage, corpus metadata, and validation results.
- Implement RustFS storage for fingerprint blobs and reference builds.
- **Working directory:** `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Fingerprints/`

## Dependencies & Concurrency

- **Upstream**: Sprint 6000.0001.x (MVP 1), 6000.0002.x (MVP 2)
- **Downstream**: Sprint 6000.0003.0002-0005
- **Safe to parallelize with**: Sprint 6000.0003.0002 (Reference Build Pipeline)

## Documentation Prerequisites

- `docs/modules/binaryindex/architecture.md`
- `docs/db/schemas/binaries_schema_specification.md` (fingerprint tables)
- Existing fingerprinting: `src/Scanner/__Libraries/StellaOps.Scanner.EntryTrace/Binary/`

---

## Tasks

### T1: Create Fingerprint Schema Migration

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Persistence/Migrations/002_create_fingerprint_tables.sql`

**Migration**:

```sql
-- 002_create_fingerprint_tables.sql
-- Adds fingerprint-related tables for MVP 3

BEGIN;

-- Fix index tables (from MVP 2, if not already created)
CREATE TABLE IF NOT EXISTS binaries.cve_fix_evidence (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,
    distro TEXT NOT NULL,
    release TEXT NOT NULL,
    source_pkg TEXT NOT NULL,
    cve_id TEXT NOT NULL,
    state TEXT NOT NULL CHECK (state IN ('fixed', 'vulnerable', 'not_affected', 'wontfix', 'unknown')),
    fixed_version TEXT,
    method TEXT NOT NULL CHECK (method IN ('security_feed', 'changelog', 'patch_header', 'upstream_patch_match')),
    confidence NUMERIC(3,2) NOT NULL CHECK (confidence >= 0 AND confidence <= 1),
    evidence JSONB NOT NULL,
    snapshot_id UUID REFERENCES binaries.corpus_snapshots(id),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE TABLE IF NOT EXISTS binaries.cve_fix_index (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,
    distro TEXT NOT NULL,
    release TEXT NOT NULL,
    source_pkg TEXT NOT NULL,
    cve_id TEXT NOT NULL,
    architecture TEXT,
    state TEXT NOT NULL CHECK (state IN ('fixed', 'vulnerable', 'not_affected', 'wontfix', 'unknown')),
    fixed_version TEXT,
    primary_method TEXT NOT NULL,
    confidence NUMERIC(3,2) NOT NULL,
    evidence_ids UUID[],
    computed_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT cve_fix_index_unique UNIQUE (tenant_id, distro, release, source_pkg, cve_id, architecture)
);

-- Fingerprint tables
CREATE TABLE IF NOT EXISTS binaries.vulnerable_fingerprints (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,
    cve_id TEXT NOT NULL,
    component TEXT NOT NULL,
    purl TEXT,
    algorithm TEXT NOT NULL CHECK (algorithm IN ('basic_block', 'control_flow_graph', 'string_refs', 'combined')),
    fingerprint_id TEXT NOT NULL,
    fingerprint_hash BYTEA NOT NULL,
    architecture TEXT NOT NULL,
    function_name TEXT,
    source_file TEXT,
    source_line INT,
    similarity_threshold NUMERIC(3,2) DEFAULT 0.95,
    confidence NUMERIC(3,2) CHECK (confidence >= 0 AND confidence <= 1),
    validated BOOLEAN DEFAULT FALSE,
    validation_stats JSONB DEFAULT '{}',
    vuln_build_ref TEXT,
    fixed_build_ref TEXT,
    notes TEXT,
    evidence_ref TEXT,
    indexed_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT vulnerable_fingerprints_unique UNIQUE (tenant_id, cve_id, algorithm, fingerprint_id, architecture)
);

CREATE TABLE IF NOT EXISTS binaries.fingerprint_corpus_metadata (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,
    purl TEXT NOT NULL,
    version TEXT NOT NULL,
    algorithm TEXT NOT NULL,
    binary_digest TEXT,
    function_count INT NOT NULL DEFAULT 0,
    fingerprints_indexed INT NOT NULL DEFAULT 0,
    indexed_by TEXT,
    indexed_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT fingerprint_corpus_metadata_unique UNIQUE (tenant_id, purl, version, algorithm)
);

CREATE TABLE IF NOT EXISTS binaries.fingerprint_matches (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id UUID NOT NULL,
    scan_id UUID NOT NULL,
    match_type TEXT NOT NULL CHECK (match_type IN ('fingerprint', 'buildid', 'hash_exact')),
    binary_key TEXT NOT NULL,
    binary_identity_id UUID REFERENCES binaries.binary_identity(id),
    vulnerable_purl TEXT NOT NULL,
    vulnerable_version TEXT NOT NULL,
    matched_fingerprint_id UUID REFERENCES binaries.vulnerable_fingerprints(id),
    matched_function TEXT,
    similarity NUMERIC(3,2),
    advisory_ids TEXT[],
    reachability_status TEXT CHECK (reachability_status IN ('reachable', 'unreachable', 'unknown', 'partial')),
    evidence JSONB DEFAULT '{}',
    matched_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- Indexes
CREATE INDEX IF NOT EXISTS idx_cve_fix_evidence_tenant ON binaries.cve_fix_evidence(tenant_id);
CREATE INDEX IF NOT EXISTS idx_cve_fix_evidence_key ON binaries.cve_fix_evidence(distro, release, source_pkg, cve_id);

CREATE INDEX IF NOT EXISTS idx_cve_fix_index_tenant ON binaries.cve_fix_index(tenant_id);
CREATE INDEX IF NOT EXISTS idx_cve_fix_index_lookup ON binaries.cve_fix_index(distro, release, source_pkg, cve_id);

CREATE INDEX IF NOT EXISTS idx_vulnerable_fingerprints_tenant ON binaries.vulnerable_fingerprints(tenant_id);
CREATE INDEX IF NOT EXISTS idx_vulnerable_fingerprints_cve ON binaries.vulnerable_fingerprints(cve_id);
CREATE INDEX IF NOT EXISTS idx_vulnerable_fingerprints_component ON binaries.vulnerable_fingerprints(component, architecture);
CREATE INDEX IF NOT EXISTS idx_vulnerable_fingerprints_hash ON binaries.vulnerable_fingerprints USING hash (fingerprint_hash);

CREATE INDEX IF NOT EXISTS idx_fingerprint_corpus_tenant ON binaries.fingerprint_corpus_metadata(tenant_id);
CREATE INDEX IF NOT EXISTS idx_fingerprint_corpus_purl ON binaries.fingerprint_corpus_metadata(purl, version);

CREATE INDEX IF NOT EXISTS idx_fingerprint_matches_tenant ON binaries.fingerprint_matches(tenant_id);
CREATE INDEX IF NOT EXISTS idx_fingerprint_matches_scan ON binaries.fingerprint_matches(scan_id);

-- RLS
ALTER TABLE binaries.cve_fix_evidence ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.cve_fix_evidence FORCE ROW LEVEL SECURITY;
CREATE POLICY cve_fix_evidence_tenant_isolation ON binaries.cve_fix_evidence
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.cve_fix_index ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.cve_fix_index FORCE ROW LEVEL SECURITY;
CREATE POLICY cve_fix_index_tenant_isolation ON binaries.cve_fix_index
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.vulnerable_fingerprints ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.vulnerable_fingerprints FORCE ROW LEVEL SECURITY;
CREATE POLICY vulnerable_fingerprints_tenant_isolation ON binaries.vulnerable_fingerprints
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.fingerprint_corpus_metadata ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.fingerprint_corpus_metadata FORCE ROW LEVEL SECURITY;
CREATE POLICY fingerprint_corpus_metadata_tenant_isolation ON binaries.fingerprint_corpus_metadata
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

ALTER TABLE binaries.fingerprint_matches ENABLE ROW LEVEL SECURITY;
ALTER TABLE binaries.fingerprint_matches FORCE ROW LEVEL SECURITY;
CREATE POLICY fingerprint_matches_tenant_isolation ON binaries.fingerprint_matches
    FOR ALL USING (tenant_id::text = binaries_app.require_current_tenant())
    WITH CHECK (tenant_id::text = binaries_app.require_current_tenant());

COMMIT;
```

**Acceptance Criteria**:
- [ ] All fingerprint tables created
- [ ] Hash index on fingerprint_hash
- [ ] RLS policies enforced

---

### T2: Create Fingerprint Domain Models

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Fingerprints/Models/`

**Models**:

```csharp
namespace StellaOps.BinaryIndex.Fingerprints.Models;

public sealed record VulnFingerprint
{
    public Guid Id { get; init; }
    public required string CveId { get; init; }
    public required string Component { get; init; }
    public string? Purl { get; init; }
    public required FingerprintAlgorithm Algorithm { get; init; }
    public required string FingerprintId { get; init; }
    public required byte[] FingerprintHash { get; init; }
    public required string Architecture { get; init; }
    public string? FunctionName { get; init; }
    public string? SourceFile { get; init; }
    public int? SourceLine { get; init; }
    public decimal SimilarityThreshold { get; init; } = 0.95m;
    public decimal? Confidence { get; init; }
    public bool Validated { get; init; }
    public FingerprintValidationStats? ValidationStats { get; init; }
    public string? VulnBuildRef { get; init; }
    public string? FixedBuildRef { get; init; }
    public DateTimeOffset IndexedAt { get; init; }
}

public enum FingerprintAlgorithm
{
    BasicBlock,
    ControlFlowGraph,
    StringRefs,
    Combined
}

public sealed record FingerprintValidationStats
{
    public int TruePositives { get; init; }
    public int FalsePositives { get; init; }
    public int TrueNegatives { get; init; }
    public int FalseNegatives { get; init; }
    public decimal Precision => TruePositives + FalsePositives == 0 ? 0 :
        (decimal)TruePositives / (TruePositives + FalsePositives);
    public decimal Recall => TruePositives + FalseNegatives == 0 ? 0 :
        (decimal)TruePositives / (TruePositives + FalseNegatives);
}

public sealed record FingerprintMatch
{
    public Guid Id { get; init; }
    public Guid ScanId { get; init; }
    public required MatchType Type { get; init; }
    public required string BinaryKey { get; init; }
    public required string VulnerablePurl { get; init; }
    public required string VulnerableVersion { get; init; }
    public Guid? MatchedFingerprintId { get; init; }
    public string? MatchedFunction { get; init; }
    public decimal? Similarity { get; init; }
    public string[]? AdvisoryIds { get; init; }
    public ReachabilityStatus? ReachabilityStatus { get; init; }
    public DateTimeOffset MatchedAt { get; init; }
}

public enum MatchType { Fingerprint, BuildId, HashExact }
public enum ReachabilityStatus { Reachable, Unreachable, Unknown, Partial }
```

**Acceptance Criteria**:
- [ ] All fingerprint models defined
- [ ] Validation stats with precision/recall
- [ ] Match types enumerated
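As a quick sanity check on the `Precision` and `Recall` expressions in `FingerprintValidationStats`, here is the same arithmetic with hypothetical counts (the numbers are invented for illustration):

```csharp
using System;

// Hypothetical validation counts; mirrors the Precision/Recall expressions above.
int tp = 18, fp = 2, fn = 4;

decimal precision = (decimal)tp / (tp + fp); // 18/20 = 0.9
decimal recall = (decimal)tp / (tp + fn);    // 18/22 ≈ 0.818

Console.WriteLine(precision);
Console.WriteLine(recall);
```

The zero-denominator guards in the record matter for unvalidated fingerprints, where all four counts start at zero.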

---

### T3: Implement Fingerprint Repository

**Assignee**: BinaryIndex Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1, T2

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Persistence/Repositories/FingerprintRepository.cs`

**Interface**:

```csharp
public interface IFingerprintRepository
{
    Task<VulnFingerprint> CreateAsync(VulnFingerprint fingerprint, CancellationToken ct);
    Task<VulnFingerprint?> GetByIdAsync(Guid id, CancellationToken ct);
    Task<ImmutableArray<VulnFingerprint>> GetByCveAsync(string cveId, CancellationToken ct);
    Task<ImmutableArray<VulnFingerprint>> SearchByHashAsync(
        byte[] hash, FingerprintAlgorithm algorithm, string architecture, CancellationToken ct);
    Task UpdateValidationStatsAsync(Guid id, FingerprintValidationStats stats, CancellationToken ct);
}

public interface IFingerprintMatchRepository
{
    Task<FingerprintMatch> CreateAsync(FingerprintMatch match, CancellationToken ct);
    Task<ImmutableArray<FingerprintMatch>> GetByScanAsync(Guid scanId, CancellationToken ct);
    Task UpdateReachabilityAsync(Guid id, ReachabilityStatus status, CancellationToken ct);
}
```

**Acceptance Criteria**:
- [ ] CRUD operations for fingerprints
- [ ] Hash-based search
- [ ] Match recording

---

### T4: Implement RustFS Fingerprint Storage

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T2

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Fingerprints/Storage/FingerprintBlobStorage.cs`

**Implementation**:

```csharp
public sealed class FingerprintBlobStorage : IFingerprintBlobStorage
{
    private readonly IRustFsClient _rustFs;
    private const string BasePath = "binaryindex/fingerprints";

    public async Task<string> StoreFingerprintAsync(
        VulnFingerprint fingerprint,
        byte[] fullData,
        CancellationToken ct)
    {
        var prefix = fingerprint.FingerprintId[..2];
        var path = $"{BasePath}/{fingerprint.Algorithm}/{prefix}/{fingerprint.FingerprintId}.bin";

        await _rustFs.PutAsync(path, fullData, ct);
        return path;
    }

    public async Task<string> StoreReferenceBuildAsync(
        string cveId,
        string buildType, // "vulnerable" or "fixed"
        byte[] buildArtifact,
        CancellationToken ct)
    {
        var path = $"{BasePath}/refbuilds/{cveId}/{buildType}.tar.zst";
        await _rustFs.PutAsync(path, buildArtifact, ct);
        return path;
    }
}
```

**Acceptance Criteria**:
- [ ] Fingerprint blob storage
- [ ] Reference build storage
- [ ] Shard-by-prefix layout
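The shard-by-prefix layout can be illustrated with a hypothetical fingerprint ID (the ID and algorithm values below are invented for illustration): the first two characters of the ID become a directory level, which keeps any single directory from accumulating millions of blobs.

```csharp
using System;

// Hypothetical fingerprint ID; shows the two-character prefix sharding
// used by FingerprintBlobStorage above.
var fingerprintId = "ab12cd34ef56";
var algorithm = "Combined";
var path = $"binaryindex/fingerprints/{algorithm}/{fingerprintId[..2]}/{fingerprintId}.bin";

Console.WriteLine(path); // binaryindex/fingerprints/Combined/ab/ab12cd34ef56.bin
```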

---

### T5: Integration Tests

**Assignee**: BinaryIndex Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T1-T4

**Acceptance Criteria**:
- [ ] Fingerprint CRUD tests
- [ ] Hash search tests
- [ ] Blob storage integration tests

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | BinaryIndex Team | Create Fingerprint Schema Migration |
| 2 | T2 | TODO | T1 | BinaryIndex Team | Create Fingerprint Domain Models |
| 3 | T3 | TODO | T1, T2 | BinaryIndex Team | Implement Fingerprint Repository |
| 4 | T4 | TODO | T2 | BinaryIndex Team | Implement RustFS Fingerprint Storage |
| 5 | T5 | TODO | T1-T4 | BinaryIndex Team | Integration Tests |

---

## Success Criteria

- [ ] All 5 tasks marked DONE
- [ ] Fingerprint tables deployed
- [ ] RustFS storage operational
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds

530 docs/implplan/SPRINT_6000_0004_0001_scanner_integration.md Normal file
@@ -0,0 +1,530 @@
|
|||||||
|
# Sprint 6000.0004.0001 · Scanner Worker Integration
|
||||||
|
|
||||||
|
## Topic & Scope
|
||||||
|
|
||||||
|
- Integrate BinaryIndex into Scanner.Worker for binary vulnerability lookup during scans.
|
||||||
|
- Query binaries during layer extraction for Build-ID and fingerprint matches.
|
||||||
|
- Wire results into the existing findings pipeline.
|
||||||
|
- **Working directory:** `src/Scanner/StellaOps.Scanner.Worker/`
|
||||||
|
|
||||||
|
## Dependencies & Concurrency
|
||||||
|
|
||||||
|
- **Upstream**: Sprints 6000.0001.x, 6000.0002.x, 6000.0003.x (MVPs 1-3)
|
||||||
|
- **Downstream**: Sprint 6000.0004.0002-0004
|
||||||
|
- **Safe to parallelize with**: None
|
||||||
|
|
||||||
|
## Documentation Prerequisites
|
||||||
|
|
||||||
|
- `docs/modules/binaryindex/architecture.md`
|
||||||
|
- `docs/modules/scanner/architecture.md`
|
||||||
|
- Existing Scanner.Worker pipeline
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Tasks
|
||||||
|
|
||||||
|
### T1: Create IBinaryVulnerabilityService Interface
|
||||||
|
|
||||||
|
**Assignee**: Scanner Team
|
||||||
|
**Story Points**: 2
|
||||||
|
**Status**: TODO
|
||||||
|
|
||||||
|
**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Core/Services/IBinaryVulnerabilityService.cs`
|
||||||
|
|
||||||
|
**Interface**:
|
||||||
|
```csharp
|
||||||
|
namespace StellaOps.BinaryIndex.Core.Services;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Main query interface for binary vulnerability lookup.
|
||||||
|
/// Consumed by Scanner.Worker during container scanning.
|
||||||
|
/// </summary>
|
||||||
|
public interface IBinaryVulnerabilityService
|
||||||
|
{
|
||||||
|
/// <summary>
|
||||||
|
/// Look up vulnerabilities by binary identity (Build-ID, hashes).
|
||||||
|
/// </summary>
|
||||||
|
Task<ImmutableArray<BinaryVulnMatch>> LookupByIdentityAsync(
|
||||||
|
BinaryIdentity identity,
|
||||||
|
LookupOptions? options = null,
|
||||||
|
CancellationToken ct = default);
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Look up vulnerabilities by function fingerprint.
|
||||||
|
/// </summary>
|
||||||
|
Task<ImmutableArray<BinaryVulnMatch>> LookupByFingerprintAsync(
|
||||||
|
CodeFingerprint fingerprint,
|
||||||
|
decimal minSimilarity = 0.95m,
|
||||||
|
CancellationToken ct = default);
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Batch lookup for scan performance.
|
||||||
|
/// </summary>
|
||||||
|
Task<ImmutableDictionary<string, ImmutableArray<BinaryVulnMatch>>> LookupBatchAsync(
|
||||||
|
IEnumerable<BinaryIdentity> identities,
|
||||||
|
LookupOptions? options = null,
|
||||||
|
CancellationToken ct = default);
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Get distro-specific fix status (patch-aware).
|
||||||
|
/// </summary>
|
||||||
|
Task<FixRecord?> GetFixStatusAsync(
|
||||||
|
string distro,
|
||||||
|
string release,
|
||||||
|
string sourcePkg,
|
||||||
|
string cveId,
|
||||||
|
CancellationToken ct = default);
|
||||||
|
}
|
||||||
|
|
||||||
|
public sealed record LookupOptions
|
||||||
|
{
|
||||||
|
public bool IncludeFingerprints { get; init; } = true;
|
||||||
|
public bool CheckFixIndex { get; init; } = true;
|
||||||
|
public string? DistroHint { get; init; }
|
||||||
|
public string? ReleaseHint { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
public sealed record BinaryVulnMatch
|
||||||
|
{
|
||||||
|
public required string CveId { get; init; }
|
||||||
|
public required string VulnerablePurl { get; init; }
|
||||||
|
public required MatchMethod Method { get; init; }
|
||||||
|
public required decimal Confidence { get; init; }
|
||||||
|
public MatchEvidence? Evidence { get; init; }
|
||||||
|
public FixRecord? FixStatus { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
public enum MatchMethod { BuildIdCatalog, FingerprintMatch, RangeMatch }
|
||||||
|
|
||||||
|
public sealed record MatchEvidence
|
||||||
|
{
|
||||||
|
public string? BuildId { get; init; }
|
||||||
|
public string? FingerprintId { get; init; }
|
||||||
|
public decimal? Similarity { get; init; }
|
||||||
|
public string? MatchedFunction { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
public sealed record FixRecord
|
||||||
|
{
|
||||||
|
public required string Distro { get; init; }
|
||||||
|
public required string Release { get; init; }
|
||||||
|
public required string SourcePkg { get; init; }
|
||||||
|
public required string CveId { get; init; }
|
||||||
|
public required FixState State { get; init; }
|
||||||
|
public string? FixedVersion { get; init; }
|
||||||
|
public required FixMethod Method { get; init; }
|
||||||
|
public required decimal Confidence { get; init; }
|
||||||
|
}
|
||||||
|
```

**Acceptance Criteria**:

- [ ] Interface defined with all lookup methods
- [ ] Options for controlling lookup scope
- [ ] Match evidence structure

---

### T2: Implement BinaryVulnerabilityService

**Assignee**: BinaryIndex Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1

**Implementation Path**: `src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Core/Services/BinaryVulnerabilityService.cs`

**Implementation**:

```csharp
public sealed class BinaryVulnerabilityService : IBinaryVulnerabilityService
{
    private readonly IBinaryVulnAssertionRepository _assertionRepo;
    private readonly IVulnerableBuildIdRepository _buildIdRepo;
    private readonly IFingerprintRepository _fingerprintRepo;
    private readonly ICveFixIndexRepository _fixIndexRepo;
    private readonly ILogger<BinaryVulnerabilityService> _logger;

    public async Task<ImmutableArray<BinaryVulnMatch>> LookupByIdentityAsync(
        BinaryIdentity identity,
        LookupOptions? options = null,
        CancellationToken ct = default)
    {
        options ??= new LookupOptions();
        var matches = new List<BinaryVulnMatch>();

        // Tier 1: Check explicit assertions
        var assertions = await _assertionRepo.GetByBinaryKeyAsync(identity.BinaryKey, ct);
        foreach (var assertion in assertions.Where(a => a.Status == "affected"))
        {
            matches.Add(new BinaryVulnMatch
            {
                CveId = assertion.CveId,
                VulnerablePurl = "unknown", // Resolve from advisory
                Method = MapMethod(assertion.Method),
                Confidence = assertion.Confidence ?? 0.9m,
                Evidence = new MatchEvidence { BuildId = identity.BuildId }
            });
        }

        // Tier 2: Check Build-ID catalog
        if (identity.BuildId != null)
        {
            var buildIdMatches = await _buildIdRepo.GetByBuildIdAsync(
                identity.BuildId, identity.BuildIdType ?? "gnu-build-id", ct);

            foreach (var bid in buildIdMatches)
            {
                // Check if we already have this CVE from assertions
                // Look up advisories for this PURL
                // Add matches...
            }
        }

        // Tier 3: Apply fix index adjustments
        if (options.CheckFixIndex && options.DistroHint != null)
        {
            foreach (var match in matches.ToList())
            {
                var fixRecord = await GetFixStatusFromMatchAsync(match, options, ct);
                if (fixRecord?.State == FixState.Fixed)
                {
                    // Mark as fixed, don't remove from matches
                    // Let caller decide based on fix status
                }
            }
        }

        return matches.ToImmutableArray();
    }

    public async Task<ImmutableDictionary<string, ImmutableArray<BinaryVulnMatch>>> LookupBatchAsync(
        IEnumerable<BinaryIdentity> identities,
        LookupOptions? options = null,
        CancellationToken ct = default)
    {
        // Materialize once to avoid multiple enumeration of the input sequence.
        var identityList = identities.ToList();
        var results = new Dictionary<string, ImmutableArray<BinaryVulnMatch>>();

        // Batch fetch for performance
        var keys = identityList.Select(i => i.BinaryKey).ToArray();
        var allAssertions = await _assertionRepo.GetBatchByKeysAsync(keys, ct);

        foreach (var identity in identityList)
        {
            var matches = await LookupByIdentityAsync(identity, options, ct);
            results[identity.BinaryKey] = matches;
        }

        return results.ToImmutableDictionary();
    }

    public async Task<FixRecord?> GetFixStatusAsync(
        string distro,
        string release,
        string sourcePkg,
        string cveId,
        CancellationToken ct = default)
    {
        var fixIndex = await _fixIndexRepo.GetAsync(distro, release, sourcePkg, cveId, ct);
        if (fixIndex == null)
            return null;

        return new FixRecord
        {
            Distro = fixIndex.Distro,
            Release = fixIndex.Release,
            SourcePkg = fixIndex.SourcePkg,
            CveId = fixIndex.CveId,
            State = Enum.Parse<FixState>(fixIndex.State, true),
            FixedVersion = fixIndex.FixedVersion,
            Method = Enum.Parse<FixMethod>(fixIndex.PrimaryMethod, true),
            Confidence = fixIndex.Confidence
        };
    }

    // ... additional helper methods
}
```

**Acceptance Criteria**:

- [ ] Build-ID lookup working
- [ ] Fix index integration
- [ ] Batch lookup for performance
- [ ] Proper tiering (assertions → Build-ID → fingerprints)

---

### T3: Create Scanner.Worker Integration Point

**Assignee**: Scanner Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1, T2

**Implementation Path**: `src/Scanner/StellaOps.Scanner.Worker/Analyzers/BinaryVulnerabilityAnalyzer.cs`

**Implementation**:

```csharp
namespace StellaOps.Scanner.Worker.Analyzers;

/// <summary>
/// Analyzer that queries BinaryIndex for vulnerable binaries during scan.
/// </summary>
public sealed class BinaryVulnerabilityAnalyzer : ILayerAnalyzer
{
    private readonly IBinaryVulnerabilityService _binaryVulnService;
    private readonly IBinaryFeatureExtractor _featureExtractor;
    private readonly ILogger<BinaryVulnerabilityAnalyzer> _logger;

    public string AnalyzerId => "binary-vulnerability";
    public int Priority => 100; // Run after package analyzers

    public async Task<LayerAnalysisResult> AnalyzeAsync(
        LayerContext context,
        CancellationToken ct)
    {
        var findings = new List<BinaryVulnerabilityFinding>();
        var identities = new List<BinaryIdentity>();

        // Extract identities from all binaries in layer
        await foreach (var file in context.EnumerateFilesAsync(ct))
        {
            if (!IsBinaryFile(file))
                continue;

            try
            {
                using var stream = await file.OpenReadAsync(ct);
                var identity = await _featureExtractor.ExtractIdentityAsync(stream, ct);
                identities.Add(identity);
            }
            catch (Exception ex)
            {
                _logger.LogDebug(ex, "Failed to extract identity from {Path}", file.Path);
            }
        }

        if (identities.Count == 0)
            return LayerAnalysisResult.Empty;

        // Batch lookup
        var options = new LookupOptions
        {
            DistroHint = context.DetectedDistro,
            ReleaseHint = context.DetectedRelease,
            CheckFixIndex = true
        };

        var matches = await _binaryVulnService.LookupBatchAsync(identities, options, ct);

        foreach (var (binaryKey, vulnMatches) in matches)
        {
            foreach (var match in vulnMatches)
            {
                findings.Add(new BinaryVulnerabilityFinding
                {
                    ScanId = context.ScanId,
                    LayerDigest = context.LayerDigest,
                    BinaryKey = binaryKey,
                    CveId = match.CveId,
                    VulnerablePurl = match.VulnerablePurl,
                    MatchMethod = match.Method.ToString(),
                    Confidence = match.Confidence,
                    FixStatus = match.FixStatus,
                    Evidence = match.Evidence
                });
            }
        }

        return new LayerAnalysisResult
        {
            AnalyzerId = AnalyzerId,
            BinaryFindings = findings.ToImmutableArray()
        };
    }

    private static bool IsBinaryFile(LayerFile file)
    {
        // Check path patterns
        var path = file.Path;
        return path.StartsWith("/usr/lib/") ||
               path.StartsWith("/lib/") ||
               path.StartsWith("/usr/bin/") ||
               path.StartsWith("/bin/") ||
               path.EndsWith(".so") ||
               path.Contains(".so.");
    }
}
```

**Acceptance Criteria**:

- [ ] Analyzer integrates with layer analysis pipeline
- [ ] Binary detection heuristics
- [ ] Batch lookup for performance
- [ ] Distro detection passed to lookup

---

### T4: Wire Findings to Existing Pipeline

**Assignee**: Scanner Team
**Story Points**: 3
**Status**: TODO
**Dependencies**: T3

**Implementation Path**: `src/Scanner/StellaOps.Scanner.Worker/Findings/BinaryVulnerabilityFinding.cs`

**Finding Model**:

```csharp
namespace StellaOps.Scanner.Worker.Findings;

public sealed record BinaryVulnerabilityFinding : IFinding
{
    public Guid ScanId { get; init; }
    public required string LayerDigest { get; init; }
    public required string BinaryKey { get; init; }
    public required string CveId { get; init; }
    public required string VulnerablePurl { get; init; }
    public required string MatchMethod { get; init; }
    public required decimal Confidence { get; init; }
    public FixRecord? FixStatus { get; init; }
    public MatchEvidence? Evidence { get; init; }

    public string FindingType => "binary-vulnerability";

    public string GetSummary() =>
        $"{CveId} in {VulnerablePurl} (via {MatchMethod}, confidence {Confidence:P0})";
}
```

**Integration with Findings Ledger**:

```csharp
// In ScanResultAggregator
public async Task AggregateFindingsAsync(ScanContext context, CancellationToken ct)
{
    foreach (var layer in context.Layers)
    {
        var result = layer.AnalysisResult;

        // Process binary findings
        foreach (var binaryFinding in result.BinaryFindings)
        {
            await _findingsLedger.RecordAsync(new FindingEntry
            {
                ScanId = context.ScanId,
                FindingType = binaryFinding.FindingType,
                CveId = binaryFinding.CveId,
                Purl = binaryFinding.VulnerablePurl,
                Severity = await _advisoryService.GetSeverityAsync(binaryFinding.CveId, ct),
                Evidence = new FindingEvidence
                {
                    Type = "binary_match",
                    Method = binaryFinding.MatchMethod,
                    Confidence = binaryFinding.Confidence,
                    BinaryKey = binaryFinding.BinaryKey
                }
            }, ct);
        }
    }
}
```

**Acceptance Criteria**:

- [ ] Binary findings recorded in ledger
- [ ] Evidence properly structured
- [ ] Integration with existing severity lookup

---

### T5: Add Configuration and DI Registration

**Assignee**: Scanner Team
**Story Points**: 2
**Status**: TODO
**Dependencies**: T1-T4

**Implementation Path**: `src/Scanner/StellaOps.Scanner.Worker/Extensions/BinaryIndexServiceExtensions.cs`

**DI Registration**:

```csharp
public static class BinaryIndexServiceExtensions
{
    public static IServiceCollection AddBinaryIndexIntegration(
        this IServiceCollection services,
        IConfiguration configuration)
    {
        var options = configuration
            .GetSection("BinaryIndex")
            .Get<BinaryIndexOptions>() ?? new BinaryIndexOptions();

        if (!options.Enabled)
            return services;

        services.AddSingleton(options);
        services.AddScoped<IBinaryVulnerabilityService, BinaryVulnerabilityService>();
        services.AddScoped<IBinaryFeatureExtractor, ElfFeatureExtractor>();
        services.AddScoped<BinaryVulnerabilityAnalyzer>();

        // Register analyzer in pipeline; scoped to match the analyzer's own
        // lifetime (a singleton factory cannot resolve a scoped service).
        services.AddScoped<ILayerAnalyzer>(sp =>
            sp.GetRequiredService<BinaryVulnerabilityAnalyzer>());

        return services;
    }
}

public sealed class BinaryIndexOptions
{
    public bool Enabled { get; init; } = true;
    public int BatchSize { get; init; } = 100;
    public int TimeoutMs { get; init; } = 5000;
}
```

**Acceptance Criteria**:

- [ ] Configuration-driven enablement
- [ ] Proper DI registration
- [ ] Timeout configuration

---

### T6: Integration Tests

**Assignee**: Scanner Team
**Story Points**: 5
**Status**: TODO
**Dependencies**: T1-T5

**Test Cases**:

- End-to-end scan with binary lookup
- Layer with known vulnerable Build-ID
- Fix index correctly overrides upstream range
- Batch performance test

**Acceptance Criteria**:

- [ ] Integration test with real container image
- [ ] Binary match correctly recorded
- [ ] Fix status applied

---

## Delivery Tracker

| # | Task ID | Status | Dependency | Owners | Task Definition |
|---|---------|--------|------------|--------|-----------------|
| 1 | T1 | TODO | — | Scanner Team | Create IBinaryVulnerabilityService Interface |
| 2 | T2 | TODO | T1 | BinaryIndex Team | Implement BinaryVulnerabilityService |
| 3 | T3 | TODO | T1, T2 | Scanner Team | Create Scanner.Worker Integration Point |
| 4 | T4 | TODO | T3 | Scanner Team | Wire Findings to Existing Pipeline |
| 5 | T5 | TODO | T1-T4 | Scanner Team | Add Configuration and DI Registration |
| 6 | T6 | TODO | T1-T5 | Scanner Team | Integration Tests |

---

## Success Criteria

- [ ] All 6 tasks marked DONE
- [ ] Binary vulnerability analyzer integrated
- [ ] Findings recorded in ledger
- [ ] Configuration-driven enablement
- [ ] < 100ms p95 lookup latency
- [ ] `dotnet build` succeeds
- [ ] `dotnet test` succeeds

---

**New file:** `docs/implplan/SPRINT_6000_SUMMARY.md` (290 lines)

---
# Sprint 6000 Series Summary: BinaryIndex Module

## Overview

The 6000 series implements the **BinaryIndex** module - a vulnerable binaries database that enables detection of vulnerable code at the binary level, independent of package metadata.

**Advisory Source:** `docs/product-advisories/21-Dec-2025 - Mapping Evidence Within Compiled Binaries.md`

---

## MVP Roadmap

### MVP 1: Known-Build Binary Catalog (Sprint 6000.0001)

**Goal:** Query "is this Build-ID vulnerable?" with distro-level precision.

| Sprint | Topic | Description |
|--------|-------|-------------|
| 6000.0001.0001 | Binaries Schema | PostgreSQL schema creation |
| 6000.0001.0002 | Binary Identity Service | Core identity extraction and storage |
| 6000.0001.0003 | Debian Corpus Connector | Debian/Ubuntu package ingestion |
| 6000.0001.0004 | Build-ID Lookup Service | Query API for Build-ID matching |

**Acceptance:** Given a Build-ID, return associated CVEs from known distro builds.

---

### MVP 2: Patch-Aware Backport Handling (Sprint 6000.0002)

**Goal:** Handle "version says vulnerable but the distro backported the fix."

| Sprint | Topic | Description |
|--------|-------|-------------|
| 6000.0002.0001 | Fix Evidence Parser | Changelog and patch header parsing |
| 6000.0002.0002 | Fix Index Builder | Merge evidence into fix index |
| 6000.0002.0003 | Version Comparators | Distro-specific version comparison |
| 6000.0002.0004 | RPM Corpus Connector | RHEL/Fedora package ingestion |

**Acceptance:** For a CVE that upstream marks vulnerable, correctly identify a distro backport as fixed.
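
The Version Comparators topic above hinges on distro version semantics, which differ from plain semver. As a rough illustration only (not the shipped comparator, and the type name `DebVersion` is hypothetical), the sketch below follows the Debian `deb-version(7)` rules: optional epoch, upstream and revision compared as alternating non-digit/digit runs, with `~` sorting before everything including the empty string.

```csharp
using System;

// Illustrative Debian-style version comparison sketch (assumption: simplified
// per deb-version(7); not the production StellaOps comparator).
public static class DebVersion
{
    public static int Compare(string a, string b)
    {
        var (ea, ua, ra) = Split(a);
        var (eb, ub, rb) = Split(b);
        if (ea != eb) return ea.CompareTo(eb);
        int c = CompareFragment(ua, ub);
        return c != 0 ? c : CompareFragment(ra, rb);
    }

    private static (int Epoch, string Upstream, string Revision) Split(string v)
    {
        int epoch = 0;
        int colon = v.IndexOf(':');
        if (colon >= 0) { epoch = int.Parse(v[..colon]); v = v[(colon + 1)..]; }
        int dash = v.LastIndexOf('-');          // revision follows the LAST '-'
        return dash >= 0 ? (epoch, v[..dash], v[(dash + 1)..]) : (epoch, v, "0");
    }

    // '~' sorts before end-of-part, letters before other characters.
    private static int Order(char c) =>
        c == '~' ? -1 : char.IsLetter(c) ? c : c + 256;

    private static int CompareFragment(string a, string b)
    {
        int i = 0, j = 0;
        while (i < a.Length || j < b.Length)
        {
            // Compare the non-digit run character by character
            // (end-of-run counts as 0: after '~', before letters).
            while ((i < a.Length && !char.IsDigit(a[i])) || (j < b.Length && !char.IsDigit(b[j])))
            {
                int oa = i < a.Length && !char.IsDigit(a[i]) ? Order(a[i]) : 0;
                int ob = j < b.Length && !char.IsDigit(b[j]) ? Order(b[j]) : 0;
                if (oa != ob) return oa.CompareTo(ob);
                if (i < a.Length && !char.IsDigit(a[i])) i++;
                if (j < b.Length && !char.IsDigit(b[j])) j++;
            }
            // Compare the digit run numerically (leading zeros ignored).
            int si = i, sj = j;
            while (i < a.Length && char.IsDigit(a[i])) i++;
            while (j < b.Length && char.IsDigit(b[j])) j++;
            var na = a[si..i].TrimStart('0');
            var nb = b[sj..j].TrimStart('0');
            if (na.Length != nb.Length) return na.Length.CompareTo(nb.Length);
            int c = string.CompareOrdinal(na, nb);
            if (c != 0) return c;
        }
        return 0;
    }
}
```

This is why the fix index cannot reuse a generic comparator: `1.1.1n-0+deb11u5` must sort after `1.1.1n-0+deb11u4` even though the upstream part is identical, and `1.0~rc1` must sort before `1.0`.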

---

### MVP 3: Binary Fingerprint Factory (Sprint 6000.0003)

**Goal:** Detect vulnerable code independent of package metadata.

| Sprint | Topic | Description |
|--------|-------|-------------|
| 6000.0003.0001 | Fingerprint Storage | Database and blob storage for fingerprints |
| 6000.0003.0002 | Reference Build Pipeline | Generate vulnerable/fixed reference builds |
| 6000.0003.0003 | Fingerprint Generator | Extract function fingerprints from binaries |
| 6000.0003.0004 | Fingerprint Matching Engine | Similarity search and matching |
| 6000.0003.0005 | Validation Corpus | Golden corpus for fingerprint validation |

**Acceptance:** Detect a CVE in a stripped binary with no package metadata, confidence > 0.95.

---

### MVP 4: Scanner Integration (Sprint 6000.0004)

**Goal:** Binary evidence in production scans.

| Sprint | Topic | Description |
|--------|-------|-------------|
| 6000.0004.0001 | Scanner Worker Integration | Wire BinaryIndex into scan pipeline |
| 6000.0004.0002 | Findings Ledger Integration | Record binary matches as findings |
| 6000.0004.0003 | Proof Segment Attestation | DSSE attestations for binary evidence |
| 6000.0004.0004 | CLI Binary Match Inspection | CLI commands for match inspection |

**Acceptance:** Container scan produces binary match findings with evidence chain.

---

## Dependencies

```mermaid
graph TD
    subgraph MVP1["MVP 1: Known-Build Catalog"]
        S6001[6000.0001.0001<br/>Schema]
        S6002[6000.0001.0002<br/>Identity Service]
        S6003[6000.0001.0003<br/>Debian Connector]
        S6004[6000.0001.0004<br/>Build-ID Lookup]

        S6001 --> S6002
        S6002 --> S6003
        S6002 --> S6004
        S6003 --> S6004
    end

    subgraph MVP2["MVP 2: Patch-Aware"]
        S6011[6000.0002.0001<br/>Fix Parser]
        S6012[6000.0002.0002<br/>Fix Index Builder]
        S6013[6000.0002.0003<br/>Version Comparators]
        S6014[6000.0002.0004<br/>RPM Connector]

        S6011 --> S6012
        S6013 --> S6012
        S6012 --> S6014
    end

    subgraph MVP3["MVP 3: Fingerprints"]
        S6021[6000.0003.0001<br/>FP Storage]
        S6022[6000.0003.0002<br/>Ref Build Pipeline]
        S6023[6000.0003.0003<br/>FP Generator]
        S6024[6000.0003.0004<br/>Matching Engine]
        S6025[6000.0003.0005<br/>Validation Corpus]

        S6021 --> S6023
        S6022 --> S6023
        S6023 --> S6024
        S6024 --> S6025
    end

    subgraph MVP4["MVP 4: Integration"]
        S6031[6000.0004.0001<br/>Scanner Integration]
        S6032[6000.0004.0002<br/>Findings Ledger]
        S6033[6000.0004.0003<br/>Attestations]
        S6034[6000.0004.0004<br/>CLI]

        S6031 --> S6032
        S6032 --> S6033
        S6031 --> S6034
    end

    MVP1 --> MVP2
    MVP1 --> MVP3
    MVP2 --> MVP4
    MVP3 --> MVP4
```

---

## Module Structure

```
src/BinaryIndex/
├── StellaOps.BinaryIndex.WebService/          # API service
├── StellaOps.BinaryIndex.Worker/              # Corpus ingestion worker
├── __Libraries/
│   ├── StellaOps.BinaryIndex.Core/            # Domain models, interfaces
│   ├── StellaOps.BinaryIndex.Persistence/     # PostgreSQL + RustFS
│   ├── StellaOps.BinaryIndex.Corpus/          # Corpus connector framework
│   ├── StellaOps.BinaryIndex.Corpus.Debian/   # Debian connector
│   ├── StellaOps.BinaryIndex.Corpus.Rpm/      # RPM connector
│   ├── StellaOps.BinaryIndex.FixIndex/        # Patch-aware fix index
│   └── StellaOps.BinaryIndex.Fingerprints/    # Fingerprint generation
└── __Tests/
    ├── StellaOps.BinaryIndex.Core.Tests/
    ├── StellaOps.BinaryIndex.Persistence.Tests/
    ├── StellaOps.BinaryIndex.Corpus.Tests/
    └── StellaOps.BinaryIndex.Integration.Tests/
```

---

## Key Interfaces

```csharp
// Query interface (consumed by Scanner.Worker)
public interface IBinaryVulnerabilityService
{
    Task<ImmutableArray<BinaryVulnMatch>> LookupByIdentityAsync(BinaryIdentity identity, CancellationToken ct);
    Task<ImmutableArray<BinaryVulnMatch>> LookupByFingerprintAsync(CodeFingerprint fp, CancellationToken ct);
    Task<FixRecord?> GetFixStatusAsync(string distro, string release, string sourcePkg, string cveId, CancellationToken ct);
}

// Corpus connector interface
public interface IBinaryCorpusConnector
{
    string ConnectorId { get; }
    Task<CorpusSnapshot> FetchSnapshotAsync(CorpusQuery query, CancellationToken ct);
    IAsyncEnumerable<ExtractedBinary> ExtractBinariesAsync(PackageReference pkg, CancellationToken ct);
}

// Fix index interface
public interface IFixIndexBuilder
{
    Task BuildIndexAsync(DistroRelease distro, CancellationToken ct);
    Task<FixRecord?> GetFixRecordAsync(string distro, string release, string sourcePkg, string cveId, CancellationToken ct);
}
```

---

## Database Schema

Schema: `binaries`
Owner: BinaryIndex module

**Key Tables:**

| Table | Purpose |
|-------|---------|
| `binary_identity` | Known binary identities (Build-ID, hashes) |
| `binary_package_map` | Binary → package mapping per snapshot |
| `vulnerable_buildids` | Build-IDs known to be vulnerable |
| `cve_fix_index` | Patch-aware fix status per distro |
| `vulnerable_fingerprints` | Function fingerprints for CVEs |
| `fingerprint_matches` | Match results (findings evidence) |

See: `docs/db/schemas/binaries_schema_specification.md`

---

## Integration Points

### Scanner.Worker

```csharp
// During binary extraction
var identity = await _featureExtractor.ExtractIdentityAsync(binaryStream, ct);
var matches = await _binaryVulnService.LookupByIdentityAsync(identity, ct);

// If distro known, check fix status
var fixStatus = await _binaryVulnService.GetFixStatusAsync(
    distro, release, sourcePkg, cveId, ct);
```

### Findings Ledger

```csharp
public record BinaryVulnerabilityFinding : IFinding
{
    public string MatchType { get; init; }      // "fingerprint", "buildid"
    public string VulnerablePurl { get; init; }
    public string MatchedSymbol { get; init; }
    public float Similarity { get; init; }
    public string[] LinkedCves { get; init; }
}
```

### Policy Engine

New proof segment type: `binary_fingerprint_evidence`

---

## Configuration

```yaml
binaryindex:
  enabled: true
  corpus:
    connectors:
      - type: debian
        enabled: true
        releases: [bookworm, bullseye, jammy, noble]
  fingerprinting:
    enabled: true
    target_components: [openssl, glibc, zlib, curl]
  lookup:
    cache_ttl: 3600
```
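
For illustration, the YAML above could bind to options classes shaped like the sketch below. The class and property names are assumptions for this sketch, not the shipped configuration contract (the worker-side `BinaryIndexOptions` in the sprint plan carries a smaller surface).

```csharp
using System;

// Hypothetical options shape mirroring the YAML layout above
// (binaryindex.corpus.connectors[], .fingerprinting, .lookup).
public sealed class BinaryIndexConfig
{
    public bool Enabled { get; init; } = true;
    public CorpusConfig Corpus { get; init; } = new();
    public FingerprintingConfig Fingerprinting { get; init; } = new();
    public LookupConfig Lookup { get; init; } = new();
}

public sealed class CorpusConfig
{
    public ConnectorConfig[] Connectors { get; init; } = Array.Empty<ConnectorConfig>();
}

public sealed class ConnectorConfig
{
    public string Type { get; init; } = "";                         // e.g. "debian"
    public bool Enabled { get; init; } = true;
    public string[] Releases { get; init; } = Array.Empty<string>(); // e.g. "bookworm"
}

public sealed class FingerprintingConfig
{
    public bool Enabled { get; init; } = true;
    public string[] TargetComponents { get; init; } = Array.Empty<string>();
}

public sealed class LookupConfig
{
    public int CacheTtl { get; init; } = 3600;  // seconds
}
```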

---

## Success Criteria

### MVP 1
- [ ] `binaries` schema deployed and migrated
- [ ] Debian/Ubuntu corpus ingestion operational
- [ ] Build-ID lookup returns CVEs with < 100ms p95 latency

### MVP 2
- [ ] Fix index correctly handles Debian/RHEL backports
- [ ] 95%+ accuracy on backport test corpus

### MVP 3
- [ ] Fingerprints generated for OpenSSL, glibc, zlib, curl
- [ ] < 5% false positive rate on validation corpus

### MVP 4
- [ ] Scanner produces binary match findings
- [ ] DSSE attestations include binary evidence
- [ ] CLI `stella binary-matches` command operational

---

## References

- Architecture: `docs/modules/binaryindex/architecture.md`
- Schema: `docs/db/schemas/binaries_schema_specification.md`
- Advisory: `docs/product-advisories/21-Dec-2025 - Mapping Evidence Within Compiled Binaries.md`
- Existing fingerprinting: `src/Scanner/__Libraries/StellaOps.Scanner.EntryTrace/Binary/`
- Build-ID indexing: `src/Scanner/StellaOps.Scanner.Analyzers.Native/Index/`

---

*Document Version: 1.0.0*
*Created: 2025-12-21*

---

**New file:** `docs/modules/binaryindex/architecture.md` (558 lines)

---
# BinaryIndex Module Architecture

> **Ownership:** Scanner Guild + Concelier Guild
> **Status:** DRAFT
> **Version:** 1.0.0
> **Related:** [High-Level Architecture](../../07_HIGH_LEVEL_ARCHITECTURE.md), [Scanner Architecture](../scanner/architecture.md), [Concelier Architecture](../concelier/architecture.md)

---

## 1. Overview

The **BinaryIndex** module provides a vulnerable binaries database that enables detection of vulnerable code at the binary level, independent of package metadata. This addresses a critical gap in vulnerability scanning: package version strings can lie (backports, custom builds, stripped metadata), but **binary identity doesn't lie**.

### 1.1 Problem Statement

Traditional vulnerability scanners rely on package version matching, which fails in several scenarios:

1. **Backported patches** - Distros backport security fixes without changing the upstream version
2. **Custom/vendored builds** - Binaries compiled from source without package metadata
3. **Stripped binaries** - Debug info and version strings removed
4. **Static linking** - Vulnerable library code embedded in the final binary
5. **Container base images** - Distroless or scratch images with no package DB

### 1.2 Solution: Binary-First Vulnerability Detection

BinaryIndex provides three tiers of binary identification:

| Tier | Method | Precision | Coverage |
|------|--------|-----------|----------|
| A | Package/version range matching | Medium | High |
| B | Build-ID/hash catalog (exact binary identity) | High | Medium |
| C | Function fingerprints (CFG/basic-block hashes) | Very High | Targeted |
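
To make the precision column concrete, the sketch below shows one plausible way a consumer could seed base confidence from the matching tier. The enum name and numeric weights are hypothetical illustrations, not part of the module contract; actual confidence values come from the match records themselves.

```csharp
// Hypothetical tier-to-confidence seeding; the weights below are
// illustrative assumptions, not normative BinaryIndex values.
public enum MatchTier { PackageRange, BuildIdCatalog, FunctionFingerprint }

public static class MatchTierWeights
{
    // Higher tiers carry higher base confidence; fingerprint matches would
    // additionally be scaled by the measured similarity score.
    public static decimal BaseConfidence(MatchTier tier) => tier switch
    {
        MatchTier.PackageRange => 0.60m,        // Tier A: version ranges can lie
        MatchTier.BuildIdCatalog => 0.95m,      // Tier B: exact binary identity
        MatchTier.FunctionFingerprint => 0.98m, // Tier C: code-level evidence
        _ => 0m
    };
}
```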

### 1.3 Module Scope

**In Scope:**
- Binary identity extraction (Build-ID, PE CodeView GUID, Mach-O UUID)
- Binary-to-advisory mapping database
- Fingerprint storage and matching engine
- Fix index for patch-aware backport handling
- Integration with Scanner.Worker for binary lookup

**Out of Scope:**
- Binary disassembly/analysis (provided by Scanner.Analyzers.Native)
- Runtime binary tracing (provided by Zastava)
- SBOM generation (provided by Scanner)

---

## 2. Architecture

### 2.1 System Context

```
┌──────────────────────────────────────────────────────────────────────────┐
│                             External Systems                             │
│  ┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐     │
│  │ Distro Repos    │     │ Debug Symbol    │     │ Upstream Source │     │
│  │ (Debian, RPM,   │     │ Servers         │     │ (GitHub, etc.)  │     │
│  │ Alpine)         │     │ (debuginfod)    │     │                 │     │
│  └────────┬────────┘     └────────┬────────┘     └────────┬────────┘     │
└───────────│───────────────────────│───────────────────────│──────────────┘
            │                       │                       │
            v                       v                       v
┌──────────────────────────────────────────────────────────────────────────┐
│                            BinaryIndex Module                            │
│  ┌────────────────────────────────────────────────────────────────────┐  │
│  │                       Corpus Ingestion Layer                       │  │
│  │  ┌──────────────┐      ┌──────────────┐      ┌──────────────┐      │  │
│  │  │ DebianCorpus │      │ RpmCorpus    │      │ AlpineCorpus │      │  │
│  │  │ Connector    │      │ Connector    │      │ Connector    │      │  │
│  │  └──────────────┘      └──────────────┘      └──────────────┘      │  │
│  └────────────────────────────────────────────────────────────────────┘  │
│                                    │                                     │
│                                    v                                     │
│  ┌────────────────────────────────────────────────────────────────────┐  │
│  │                          Processing Layer                          │  │
│  │  ┌──────────────┐      ┌──────────────┐      ┌──────────────┐      │  │
│  │  │ BinaryFeature│      │ FixIndex     │      │ Fingerprint  │      │  │
│  │  │ Extractor    │      │ Builder      │      │ Generator    │      │  │
│  │  └──────────────┘      └──────────────┘      └──────────────┘      │  │
│  └────────────────────────────────────────────────────────────────────┘  │
│                                    │                                     │
│                                    v                                     │
│  ┌────────────────────────────────────────────────────────────────────┐  │
│  │                           Storage Layer                            │  │
│  │  ┌──────────────┐      ┌──────────────┐      ┌──────────────┐      │  │
│  │  │ PostgreSQL   │      │ RustFS       │      │ Valkey       │      │  │
│  │  │ (binaries    │      │ (fingerprint │      │ (lookup      │      │  │
│  │  │ schema)      │      │ blobs)       │      │ cache)       │      │  │
│  │  └──────────────┘      └──────────────┘      └──────────────┘      │  │
│  └────────────────────────────────────────────────────────────────────┘  │
│                                    │                                     │
│                                    v                                     │
│  ┌────────────────────────────────────────────────────────────────────┐  │
│  │                            Query Layer                             │  │
│  │  ┌──────────────────────────────────────────────────────────────┐  │  │
│  │  │ IBinaryVulnerabilityService                                  │  │  │
│  │  │ - LookupByBuildIdAsync(buildId)                              │  │  │
│  │  │ - LookupByFingerprintAsync(fingerprint)                      │  │  │
│  │  │ - LookupBatchAsync(identities)                               │  │  │
│  │  │ - GetFixStatusAsync(distro, release, sourcePkg, cve)         │  │  │
│  │  └──────────────────────────────────────────────────────────────┘  │  │
│  └────────────────────────────────────────────────────────────────────┘  │
└──────────────────────────────────────────────────────────────────────────┘
                                     │
                                     v
┌──────────────────────────────────────────────────────────────────────────┐
│                            Consuming Modules                             │
│  ┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐     │
│  │ Scanner.Worker  │     │ Policy Engine   │     │ Findings Ledger │     │
│  │ (binary lookup  │     │ (evidence in    │     │ (match records) │     │
│  │ during scan)    │     │ proof chain)    │     │                 │     │
│  └─────────────────┘     └─────────────────┘     └─────────────────┘     │
└──────────────────────────────────────────────────────────────────────────┘
```

### 2.2 Component Breakdown

#### 2.2.1 Corpus Connectors

Plugin-based connectors ingest binaries from distribution repositories.

```csharp
public interface IBinaryCorpusConnector
{
    string ConnectorId { get; }
    string[] SupportedDistros { get; }

    Task<CorpusSnapshot> FetchSnapshotAsync(CorpusQuery query, CancellationToken ct);
    IAsyncEnumerable<ExtractedBinary> ExtractBinariesAsync(PackageReference pkg, CancellationToken ct);
}
```

**Implementations:**

- `DebianBinaryCorpusConnector` - Debian/Ubuntu packages + debuginfo
- `RpmBinaryCorpusConnector` - RHEL/Fedora/CentOS + SRPM
- `AlpineBinaryCorpusConnector` - Alpine APK + APKBUILD

#### 2.2.2 Binary Feature Extractor

Extracts identity and features from binaries. Reuses existing Scanner.Analyzers.Native capabilities.

```csharp
public interface IBinaryFeatureExtractor
{
    Task<BinaryIdentity> ExtractIdentityAsync(Stream binaryStream, CancellationToken ct);
    Task<BinaryFeatures> ExtractFeaturesAsync(Stream binaryStream, ExtractorOptions opts, CancellationToken ct);
}

public sealed record BinaryIdentity(
    string Format,              // elf, pe, macho
    string? BuildId,            // ELF GNU Build-ID
    string? PeCodeViewGuid,     // PE CodeView GUID + Age
    string? MachoUuid,          // Mach-O LC_UUID
    string FileSha256,
    string TextSectionSha256);

public sealed record BinaryFeatures(
    BinaryIdentity Identity,
    string[] DynamicDeps,       // DT_NEEDED
    string[] ExportedSymbols,
    string[] ImportedSymbols,
    BinaryHardening Hardening);
```

#### 2.2.3 Fix Index Builder

Builds the patch-aware CVE fix index from distro sources.

```csharp
public interface IFixIndexBuilder
{
    Task BuildIndexAsync(DistroRelease distro, CancellationToken ct);
    Task<FixRecord?> GetFixRecordAsync(string distro, string release, string sourcePkg, string cveId, CancellationToken ct);
}

public sealed record FixRecord(
    string Distro,
    string Release,
    string SourcePkg,
    string CveId,
    FixState State,          // fixed, vulnerable, not_affected, wontfix, unknown
    string? FixedVersion,    // Distro version string
    FixMethod Method,        // security_feed, changelog, patch_header
    decimal Confidence,      // 0.00-1.00
    FixEvidence Evidence);

public enum FixState { Fixed, Vulnerable, NotAffected, Wontfix, Unknown }
public enum FixMethod { SecurityFeed, Changelog, PatchHeader, UpstreamPatchMatch }
```
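
For the `Changelog` fix method, the builder can scan Debian-style changelog entries for CVE identifiers and record the entry's version as `FixedVersion`. A simplified Python sketch of that idea (illustrative only; the entry header format shown is the standard `deb-changelog` shape, and "first mention wins" is an assumption about precedence):

```python
import re

HEADER = re.compile(r"^(?P<pkg>\S+) \((?P<version>[^)]+)\)")
CVE = re.compile(r"CVE-\d{4}-\d{4,}")

def extract_fixes(changelog: str) -> dict[str, str]:
    """Map each CVE mentioned in a changelog to the version of the
    entry that mentions it (the newest entry comes first, so the
    first mention is the fixing version)."""
    fixes: dict[str, str] = {}
    version = None
    for line in changelog.splitlines():
        m = HEADER.match(line)
        if m:
            version = m.group("version")
        elif version:
            for cve in CVE.findall(line):
                fixes.setdefault(cve, version)
    return fixes

log = """openssl (1.1.1n-0+deb11u3) bullseye-security; urgency=high

  * Fix CVE-2022-1292 (c_rehash command injection).

openssl (1.1.1n-0+deb11u1) bullseye; urgency=medium

  * New upstream release.
"""
print(extract_fixes(log))  # {'CVE-2022-1292': '1.1.1n-0+deb11u3'}
```

A real builder would additionally cross-check the security feed and assign a `Confidence` below 1.0 for changelog-only evidence.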

#### 2.2.4 Fingerprint Generator

Generates function-level fingerprints for vulnerable code detection.

```csharp
public interface IVulnFingerprintGenerator
{
    Task<ImmutableArray<VulnFingerprint>> GenerateAsync(
        string cveId,
        BinaryPair vulnAndFixed,     // Reference builds
        FingerprintOptions opts,
        CancellationToken ct);
}

public sealed record VulnFingerprint(
    string CveId,
    string Component,            // e.g., openssl
    string Architecture,         // x86-64, aarch64
    FingerprintType Type,        // basic_block, cfg, combined
    string FingerprintId,        // e.g., "bb-abc123..."
    byte[] FingerprintHash,      // 16-32 bytes
    string? FunctionHint,        // Function name if known
    decimal Confidence,
    FingerprintEvidence Evidence);

public enum FingerprintType { BasicBlock, ControlFlowGraph, StringReferences, Combined }
```
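
A `BasicBlock` fingerprint must be deterministic and robust to register allocation and address changes, so it hashes only normalized instruction mnemonics, not operands. A hypothetical Python sketch of the core idea (the normalization rule and the 16-byte digest are assumptions, consistent with the 16-32 byte `FingerprintHash` range above):

```python
import hashlib

def block_fingerprint(blocks: list[list[str]]) -> bytes:
    """Hash a function's basic blocks into a 16-byte fingerprint.
    Each block is a list of instruction mnemonics; operands are
    dropped beforehand so addresses/registers don't perturb the hash."""
    h = hashlib.blake2b(digest_size=16)
    for block in blocks:
        h.update(b"|".join(m.encode() for m in block))
        h.update(b"\n")                # block boundary marker
    return h.digest()

vuln = [["push", "mov", "cmp", "jle"], ["call", "test", "jz"]]
fp1 = block_fingerprint(vuln)
fp2 = block_fingerprint(vuln)
assert fp1 == fp2 and len(fp1) == 16    # deterministic, fixed size
fixed = [["push", "mov", "cmp", "jg"], ["call", "test", "jz"]]
assert block_fingerprint(fixed) != fp1  # a patched branch changes the hash
```

Determinism here is exactly the property exercised by the "Fingerprint generation determinism" unit tests in §9.1.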

#### 2.2.5 Binary Vulnerability Service

Main query interface for consumers.

```csharp
public interface IBinaryVulnerabilityService
{
    /// <summary>
    /// Look up vulnerabilities by Build-ID or equivalent binary identity.
    /// </summary>
    Task<ImmutableArray<BinaryVulnMatch>> LookupByIdentityAsync(
        BinaryIdentity identity,
        LookupOptions? opts = null,
        CancellationToken ct = default);

    /// <summary>
    /// Look up vulnerabilities by function fingerprint.
    /// </summary>
    Task<ImmutableArray<BinaryVulnMatch>> LookupByFingerprintAsync(
        CodeFingerprint fingerprint,
        decimal minSimilarity = 0.95m,
        CancellationToken ct = default);

    /// <summary>
    /// Batch lookup for scan performance.
    /// </summary>
    Task<ImmutableDictionary<string, ImmutableArray<BinaryVulnMatch>>> LookupBatchAsync(
        IEnumerable<BinaryIdentity> identities,
        LookupOptions? opts = null,
        CancellationToken ct = default);

    /// <summary>
    /// Get distro-specific fix status (patch-aware).
    /// </summary>
    Task<FixRecord?> GetFixStatusAsync(
        string distro,
        string release,
        string sourcePkg,
        string cveId,
        CancellationToken ct = default);
}

public sealed record BinaryVulnMatch(
    string CveId,
    string VulnerablePurl,
    MatchMethod Method,      // buildid_catalog, fingerprint_match, range_match
    decimal Confidence,
    MatchEvidence Evidence);

public enum MatchMethod { BuildIdCatalog, FingerprintMatch, RangeMatch }
```

---

## 3. Data Model

### 3.1 PostgreSQL Schema (`binaries`)

The `binaries` schema stores binary identity, fingerprint, and match data.

```sql
CREATE SCHEMA IF NOT EXISTS binaries;
CREATE SCHEMA IF NOT EXISTS binaries_app;

-- RLS helper
CREATE OR REPLACE FUNCTION binaries_app.require_current_tenant()
RETURNS TEXT LANGUAGE plpgsql STABLE SECURITY DEFINER AS $$
DECLARE v_tenant TEXT;
BEGIN
    v_tenant := current_setting('app.tenant_id', true);
    IF v_tenant IS NULL OR v_tenant = '' THEN
        RAISE EXCEPTION 'app.tenant_id session variable not set';
    END IF;
    RETURN v_tenant;
END;
$$;
```

#### 3.1.1 Core Tables

See `docs/db/schemas/binaries_schema_specification.md` for complete DDL.

**Key Tables:**

| Table | Purpose |
|-------|---------|
| `binaries.binary_identity` | Known binary identities (Build-ID, hashes) |
| `binaries.binary_package_map` | Binary → package mapping per snapshot |
| `binaries.vulnerable_buildids` | Build-IDs known to be vulnerable |
| `binaries.vulnerable_fingerprints` | Function fingerprints for CVEs |
| `binaries.cve_fix_index` | Patch-aware fix status per distro |
| `binaries.fingerprint_matches` | Match results (findings evidence) |
| `binaries.corpus_snapshots` | Corpus ingestion tracking |

### 3.2 RustFS Layout

```
rustfs://stellaops/binaryindex/
  fingerprints/<algorithm>/<prefix>/<fingerprint_id>.bin
  corpus/<distro>/<release>/<snapshot_id>/manifest.json
  corpus/<distro>/<release>/<snapshot_id>/packages/<pkg>.metadata.json
  evidence/<match_id>.dsse.json
```
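
The `<prefix>` path element shards fingerprint blobs by the leading characters of the fingerprint ID so that no single directory grows unbounded. A small Python sketch of the key construction (a two-character prefix is an assumption; the layout mirrors the tree above):

```python
def fingerprint_blob_path(algorithm: str, fingerprint_id: str) -> str:
    """Build the RustFS key for a fingerprint blob, sharding by the
    first two characters of the fingerprint id."""
    prefix = fingerprint_id[:2]
    return f"binaryindex/fingerprints/{algorithm}/{prefix}/{fingerprint_id}.bin"

print(fingerprint_blob_path("basic_block", "bb-abc123"))
# binaryindex/fingerprints/basic_block/bb/bb-abc123.bin
```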

---

## 4. Integration Points

### 4.1 Scanner.Worker Integration

During container scanning, Scanner.Worker queries BinaryIndex for each extracted binary:

```mermaid
sequenceDiagram
    participant SW as Scanner.Worker
    participant BI as BinaryIndex
    participant PG as PostgreSQL
    participant FL as Findings Ledger

    SW->>SW: Extract binary from layer
    SW->>SW: Compute BinaryIdentity
    SW->>BI: LookupByIdentityAsync(identity)
    BI->>PG: Query binaries.vulnerable_buildids
    PG-->>BI: Matches
    BI->>PG: Query binaries.cve_fix_index (if distro known)
    PG-->>BI: Fix status
    BI-->>SW: BinaryVulnMatch[]
    SW->>FL: RecordFinding(match, evidence)
```

### 4.2 Concelier Integration

BinaryIndex subscribes to Concelier's advisory updates:

```mermaid
sequenceDiagram
    participant CO as Concelier
    participant BI as BinaryIndex
    participant PG as PostgreSQL

    CO->>CO: Ingest new advisory
    CO->>BI: advisory.created event
    BI->>BI: Check if affected packages in corpus
    BI->>PG: Update binaries.binary_vuln_assertion
    BI->>BI: Queue fingerprint generation (if high-impact)
```

### 4.3 Policy Integration

Binary matches are recorded as proof segments:

```json
{
  "segment_type": "binary_fingerprint_evidence",
  "payload": {
    "binary_identity": {
      "format": "elf",
      "build_id": "abc123...",
      "file_sha256": "def456..."
    },
    "matches": [
      {
        "cve_id": "CVE-2024-1234",
        "method": "buildid_catalog",
        "confidence": 0.98,
        "vulnerable_purl": "pkg:deb/debian/libssl3@1.1.1n-0+deb11u3"
      }
    ]
  }
}
```

---

## 5. MVP Roadmap

### MVP 1: Known-Build Binary Catalog (Sprint 6000.0001)

**Goal:** Query "is this Build-ID vulnerable?" with distro-level precision.

**Deliverables:**
- `binaries` PostgreSQL schema
- Build-ID to package mapping tables
- Basic CVE lookup by binary identity
- Debian/Ubuntu corpus connector

### MVP 2: Patch-Aware Backport Handling (Sprint 6000.0002)

**Goal:** Handle "version says vulnerable but distro backported the fix."

**Deliverables:**
- Fix index builder (changelog + patch header parsing)
- Distro-specific version comparison
- RPM corpus connector
- Scanner.Worker integration

### MVP 3: Binary Fingerprint Factory (Sprint 6000.0003)

**Goal:** Detect vulnerable code independent of package metadata.

**Deliverables:**
- Fingerprint storage and matching
- Reference build generation pipeline
- Fingerprint validation corpus
- High-impact CVE coverage (OpenSSL, glibc, zlib, curl)

### MVP 4: Full Scanner Integration (Sprint 6000.0004)

**Goal:** Binary evidence in production scans.

**Deliverables:**
- Scanner.Worker binary lookup integration
- Findings Ledger binary match records
- Proof segment attestations
- CLI binary match inspection

---

## 6. Security Considerations

### 6.1 Trust Boundaries

1. **Corpus Ingestion** - Packages are untrusted; extraction runs in sandboxed workers
2. **Fingerprint Generation** - Reference builds are compiled in isolated environments
3. **Query API** - Tenant-isolated via RLS; no cross-tenant data leakage

### 6.2 Signing & Provenance

- All corpus snapshots are signed (DSSE)
- Fingerprint sets are versioned and signed
- Every match result references evidence digests

### 6.3 Sandbox Requirements

Binary extraction and fingerprint generation MUST run with:
- Seccomp profile restricting syscalls
- Read-only root filesystem
- No network access during analysis
- Memory/CPU limits

---

## 7. Observability

### 7.1 Metrics

| Metric | Type | Labels |
|--------|------|--------|
| `binaryindex_lookup_total` | Counter | method, result |
| `binaryindex_lookup_latency_ms` | Histogram | method |
| `binaryindex_corpus_packages_total` | Gauge | distro, release |
| `binaryindex_fingerprints_indexed` | Gauge | algorithm, component |
| `binaryindex_match_confidence` | Histogram | method |

### 7.2 Traces

- `binaryindex.lookup` - Full lookup span
- `binaryindex.corpus.ingest` - Corpus ingestion
- `binaryindex.fingerprint.generate` - Fingerprint generation

---

## 8. Configuration

```yaml
# binaryindex.yaml
binaryindex:
  enabled: true

  corpus:
    connectors:
      - type: debian
        enabled: true
        mirror: http://deb.debian.org/debian
        releases: [bookworm, bullseye]
        architectures: [amd64, arm64]
      - type: ubuntu
        enabled: true
        mirror: http://archive.ubuntu.com/ubuntu
        releases: [jammy, noble]

  fingerprinting:
    enabled: true
    algorithms: [basic_block, cfg]
    target_components:
      - openssl
      - glibc
      - zlib
      - curl
      - sqlite
    min_function_size: 16        # bytes
    max_functions_per_binary: 10000

  lookup:
    cache_ttl: 3600
    batch_size: 100
    timeout_ms: 5000

  storage:
    postgres_schema: binaries
    rustfs_bucket: stellaops/binaryindex
```

---

## 9. Testing Strategy

### 9.1 Unit Tests

- Identity extraction (Build-ID, hashes)
- Fingerprint generation determinism
- Fix index parsing (changelog, patch headers)

### 9.2 Integration Tests

- PostgreSQL schema validation
- Full corpus ingestion flow
- Scanner.Worker lookup integration

### 9.3 Regression Tests

- Known CVE detection (golden corpus)
- Backport handling (Debian libssl example)
- False positive rate validation

---

## 10. References

- Advisory: `docs/product-advisories/21-Dec-2025 - Mapping Evidence Within Compiled Binaries.md`
- Scanner Native Analysis: `src/Scanner/StellaOps.Scanner.Analyzers.Native/`
- Existing Fingerprinting: `src/Scanner/__Libraries/StellaOps.Scanner.EntryTrace/Binary/`
- Build-ID Index: `src/Scanner/StellaOps.Scanner.Analyzers.Native/Index/`

---

*Document Version: 1.0.0*
*Last Updated: 2025-12-21*

---

`docs/modules/gateway/architecture.md` (new file, 461 lines)

# component_architecture_gateway.md — **Stella Ops Gateway** (Sprint 3600)

> Derived from Reference Architecture Advisory and Router Architecture Specification

> **Scope.** The Gateway WebService is the single HTTP ingress point for all external traffic. It authenticates requests via Authority (DPoP/mTLS), routes to microservices via the Router binary protocol, aggregates OpenAPI specifications, and enforces tenant isolation.

> **Ownership:** Platform Guild

---

## 0) Mission & Boundaries

### What Gateway Does

- **HTTP Ingress**: Single entry point for all external HTTP/HTTPS traffic
- **Authentication**: DPoP and mTLS token validation via Authority integration
- **Routing**: Routes HTTP requests to microservices via binary protocol (TCP/TLS)
- **OpenAPI Aggregation**: Combines endpoint specs from all registered microservices
- **Health Aggregation**: Provides unified health status from downstream services
- **Rate Limiting**: Per-tenant and per-identity request throttling
- **Tenant Propagation**: Extracts tenant context and propagates to microservices

### What Gateway Does NOT Do

- **Business Logic**: No domain logic; pure routing and auth
- **Data Storage**: Stateless; no persistent state beyond connection cache
- **Direct Database Access**: Never connects to PostgreSQL directly
- **SBOM/VEX Processing**: Delegates to Scanner, Excititor, etc.

---

## 1) Solution & Project Layout

```
src/Gateway/
├── StellaOps.Gateway.WebService/
│   ├── StellaOps.Gateway.WebService.csproj
│   ├── Program.cs                          # DI bootstrap, transport init
│   ├── Dockerfile
│   ├── appsettings.json
│   ├── appsettings.Development.json
│   ├── Configuration/
│   │   ├── GatewayOptions.cs               # All configuration options
│   │   └── TransportOptions.cs             # TCP/TLS transport config
│   ├── Middleware/
│   │   ├── TenantMiddleware.cs             # Tenant context extraction
│   │   ├── RequestRoutingMiddleware.cs     # HTTP → binary routing
│   │   ├── AuthenticationMiddleware.cs     # DPoP/mTLS validation
│   │   └── RateLimitingMiddleware.cs       # Per-tenant throttling
│   ├── Services/
│   │   ├── GatewayHostedService.cs         # Transport lifecycle
│   │   ├── OpenApiAggregationService.cs    # Spec aggregation
│   │   └── HealthAggregationService.cs     # Downstream health
│   └── Endpoints/
│       ├── HealthEndpoints.cs              # /health/*, /metrics
│       └── OpenApiEndpoints.cs             # /openapi.json, /openapi.yaml
```

### Dependencies

```xml
<ItemGroup>
  <ProjectReference Include="..\..\__Libraries\StellaOps.Router.Gateway\..." />
  <ProjectReference Include="..\..\__Libraries\StellaOps.Router.Transport.Tcp\..." />
  <ProjectReference Include="..\..\__Libraries\StellaOps.Router.Transport.Tls\..." />
  <ProjectReference Include="..\..\Auth\StellaOps.Auth.ServerIntegration\..." />
</ItemGroup>
```

---

## 2) External Dependencies

| Dependency | Purpose | Required |
|------------|---------|----------|
| **Authority** | OpTok validation, DPoP/mTLS | Yes |
| **Router.Gateway** | Routing state, endpoint discovery | Yes |
| **Router.Transport.Tcp** | Binary transport (dev) | Yes |
| **Router.Transport.Tls** | Binary transport (prod) | Yes |
| **Valkey/Redis** | Rate limiting state | Optional |

---

## 3) Contracts & Data Model

### Request Flow

```
┌──────────────┐    HTTPS     ┌─────────────────┐    Binary    ┌─────────────────┐
│   Client     │ ───────────► │    Gateway      │ ───────────► │  Microservice   │
│   (CLI/UI)   │              │   WebService    │    Frame     │  (Scanner,      │
│              │ ◄─────────── │                 │ ◄─────────── │   Policy, etc)  │
└──────────────┘    HTTPS     └─────────────────┘    Binary    └─────────────────┘
```

### Binary Frame Protocol

Gateway uses the Router binary protocol for internal communication:

| Frame Type | Purpose |
|------------|---------|
| HELLO | Microservice registration with endpoints |
| HEARTBEAT | Health check and latency measurement |
| REQUEST | HTTP request serialized to binary |
| RESPONSE | HTTP response serialized from binary |
| STREAM_DATA | Streaming response chunks |
| CANCEL | Request cancellation propagation |
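
The exact wire format belongs to the Router libraries; for intuition, protocols of this kind are typically a small fixed header (frame type plus payload length) followed by the payload. A hypothetical Python sketch (the 1-byte type + 4-byte big-endian length layout and the numeric type codes are assumptions, not the Router specification):

```python
import struct

FRAME_TYPES = {"HELLO": 1, "HEARTBEAT": 2, "REQUEST": 3,
               "RESPONSE": 4, "STREAM_DATA": 5, "CANCEL": 6}
TYPE_NAMES = {v: k for k, v in FRAME_TYPES.items()}

def encode_frame(frame_type: str, payload: bytes) -> bytes:
    """1-byte type + 4-byte big-endian payload length + payload."""
    return struct.pack(">BI", FRAME_TYPES[frame_type], len(payload)) + payload

def decode_frame(buf: bytes) -> tuple[str, bytes, bytes]:
    """Return (type, payload, remaining bytes) for the first complete frame."""
    ftype, length = struct.unpack_from(">BI", buf)
    return TYPE_NAMES[ftype], buf[5:5 + length], buf[5 + length:]

buf = encode_frame("REQUEST", b"GET /api/v1/scans/xyz") + encode_frame("CANCEL", b"")
name, body, rest = decode_frame(buf)
assert (name, body) == ("REQUEST", b"GET /api/v1/scans/xyz")
assert decode_frame(rest)[0] == "CANCEL"
```

Because each frame is self-delimiting, multiple frames can be pipelined over one TCP/TLS connection, which is what makes a single gateway-to-microservice socket sufficient.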

### Endpoint Descriptor

|
||||||
|
public sealed class EndpointDescriptor
|
||||||
|
{
|
||||||
|
public required string Method { get; init; } // GET, POST, etc.
|
||||||
|
public required string Path { get; init; } // /api/v1/scans/{id}
|
||||||
|
public required string ServiceName { get; init; } // scanner
|
||||||
|
public required string Version { get; init; } // 1.0.0
|
||||||
|
public TimeSpan DefaultTimeout { get; init; } // 30s
|
||||||
|
public bool SupportsStreaming { get; init; } // true for large responses
|
||||||
|
public IReadOnlyList<ClaimRequirement> RequiringClaims { get; init; }
|
||||||
|
}
|
||||||
|
```

### Routing State

```csharp
public interface IRoutingStateManager
{
    ValueTask RegisterEndpointsAsync(ConnectionState conn, HelloPayload hello);
    ValueTask<InstanceSelection?> SelectInstanceAsync(string method, string path);
    ValueTask UpdateHealthAsync(ConnectionState conn, HeartbeatPayload heartbeat);
    ValueTask DrainConnectionAsync(string connectionId);
}
```

---

## 4) REST API

Gateway exposes minimal management endpoints; all business APIs are routed to microservices.

### Health Endpoints

| Endpoint | Auth | Description |
|----------|------|-------------|
| `GET /health/live` | None | Liveness probe |
| `GET /health/ready` | None | Readiness probe |
| `GET /health/startup` | None | Startup probe |
| `GET /metrics` | None | Prometheus metrics |

### OpenAPI Endpoints

| Endpoint | Auth | Description |
|----------|------|-------------|
| `GET /openapi.json` | None | Aggregated OpenAPI 3.1.0 spec |
| `GET /openapi.yaml` | None | YAML format spec |

---

## 5) Execution Flow

### Request Routing

```mermaid
sequenceDiagram
    participant C as Client
    participant G as Gateway
    participant A as Authority
    participant M as Microservice

    C->>G: HTTPS Request + DPoP Token
    G->>A: Validate Token
    A-->>G: Claims (sub, tid, scope)
    G->>G: Select Instance (Method, Path)
    G->>M: Binary REQUEST Frame
    M-->>G: Binary RESPONSE Frame
    G-->>C: HTTPS Response
```

### Microservice Registration

```mermaid
sequenceDiagram
    participant M as Microservice
    participant G as Gateway

    M->>G: TCP/TLS Connect
    M->>G: HELLO (ServiceName, Version, Endpoints)
    G->>G: Register Endpoints
    G-->>M: HELLO ACK

    loop Every 10s
        G->>M: HEARTBEAT
        M-->>G: HEARTBEAT (latency, health)
        G->>G: Update Health State
    end
```

---

## 6) Instance Selection Algorithm

```csharp
public ValueTask<InstanceSelection?> SelectInstanceAsync(string method, string path)
{
    // 1. Find all endpoints matching (method, path)
    var candidates = _endpoints
        .Where(e => e.Method == method && MatchPath(e.Path, path))
        .ToList();

    // 2. Filter by health
    candidates = candidates
        .Where(c => c.Health is InstanceHealthStatus.Healthy or InstanceHealthStatus.Degraded)
        .ToList();

    // 3. Region preference
    var localRegion = candidates.Where(c => c.Region == _config.Region).ToList();
    var neighborRegions = candidates.Where(c => _config.NeighborRegions.Contains(c.Region)).ToList();
    var otherRegions = candidates.Except(localRegion).Except(neighborRegions).ToList();

    var preferred = localRegion.Any() ? localRegion
                  : neighborRegions.Any() ? neighborRegions
                  : otherRegions;

    // 4. Within tier: prefer lower latency, then most recent heartbeat
    return ValueTask.FromResult(
        preferred
            .OrderBy(c => c.AveragePingMs)
            .ThenByDescending(c => c.LastHeartbeatUtc)
            .FirstOrDefault());
}
```
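
The `MatchPath` call above compares an endpoint template such as `/api/v1/scans/{id}` against a concrete request path. A minimal Python sketch of segment-wise template matching (illustrative; the real matcher lives in the Router.Gateway library and may support richer patterns):

```python
def match_path(template: str, path: str) -> bool:
    """Segment-wise match: a '{...}' template segment matches any
    single path segment; other segments must match exactly."""
    t_parts = template.strip("/").split("/")
    p_parts = path.strip("/").split("/")
    if len(t_parts) != len(p_parts):
        return False
    return all((t.startswith("{") and t.endswith("}")) or t == p
               for t, p in zip(t_parts, p_parts))

assert match_path("/api/v1/scans/{id}", "/api/v1/scans/xyz")
assert not match_path("/api/v1/scans/{id}", "/api/v1/scans")      # missing segment
assert not match_path("/api/v1/scans/{id}", "/api/v1/scans/x/y")  # too deep
```

Segment-count equality is checked first so a template never matches a shallower or deeper path, mirroring step 1 of the selection algorithm.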

---

## 7) Configuration

```yaml
gateway:
  node:
    region: "eu1"
    nodeId: "gw-eu1-01"
    environment: "prod"

  transports:
    tcp:
      enabled: true
      port: 9100
      maxConnections: 1000
      receiveBufferSize: 65536
      sendBufferSize: 65536
    tls:
      enabled: true
      port: 9443
      certificatePath: "/certs/gateway.pfx"
      certificatePassword: "${GATEWAY_CERT_PASSWORD}"
      clientCertificateMode: "RequireCertificate"
      allowedClientCertificateThumbprints: []

  routing:
    defaultTimeout: "30s"
    maxRequestBodySize: "100MB"
    streamingEnabled: true
    streamingBufferSize: 16384
    neighborRegions: ["eu2", "us1"]

  auth:
    dpopEnabled: true
    dpopMaxClockSkew: "60s"
    mtlsEnabled: true

  rateLimiting:
    enabled: true
    requestsPerMinute: 1000
    burstSize: 100
    redisConnectionString: "${REDIS_URL}"

  openapi:
    enabled: true
    cacheTtlSeconds: 300
    title: "Stella Ops API"
    version: "1.0.0"

  health:
    heartbeatIntervalSeconds: 10
    heartbeatTimeoutSeconds: 30
    unhealthyThreshold: 3
```
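
The `health` settings drive a simple rule: an instance whose heartbeat is older than `heartbeatTimeoutSeconds` for `unhealthyThreshold` consecutive checks is taken out of rotation. A Python sketch of that bookkeeping (illustrative; the state names match the Healthy/Degraded/Unhealthy statuses used by the selection algorithm, the reset-on-success behavior is an assumption):

```python
class InstanceHealth:
    """Track consecutive missed heartbeats against a threshold."""

    def __init__(self, timeout_s: int = 30, unhealthy_threshold: int = 3):
        self.timeout_s = timeout_s
        self.unhealthy_threshold = unhealthy_threshold
        self.missed = 0

    def observe(self, seconds_since_heartbeat: float) -> str:
        if seconds_since_heartbeat <= self.timeout_s:
            self.missed = 0                 # fresh heartbeat resets the counter
            return "healthy"
        self.missed += 1
        return "unhealthy" if self.missed >= self.unhealthy_threshold else "degraded"

h = InstanceHealth()
assert h.observe(5) == "healthy"
assert h.observe(31) == "degraded"   # 1st miss
assert h.observe(31) == "degraded"   # 2nd miss
assert h.observe(31) == "unhealthy"  # 3rd miss crosses the threshold
assert h.observe(2) == "healthy"     # recovery
```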

---

## 8) Scale & Performance

| Metric | Target | Notes |
|--------|--------|-------|
| Routing latency (P50) | <2ms | Overhead only; excludes downstream |
| Routing latency (P99) | <5ms | Under normal load |
| Concurrent connections | 10,000 | Per gateway instance |
| Requests/second | 50,000 | Per gateway instance |
| Memory footprint | <512MB | Base; scales with connections |

### Scaling Strategy

- Horizontal scaling behind a load balancer
- Sticky sessions NOT required (stateless)
- Regional deployment for latency optimization
- Rate limiting via distributed Valkey/Redis

---

## 9) Security Posture

### Authentication

| Method | Description |
|--------|-------------|
| DPoP | Proof-of-possession tokens from Authority |
| mTLS | Certificate-bound tokens for machine clients |

### Authorization

- Claims-based authorization per endpoint
- Required claims defined in endpoint descriptors
- Tenant isolation via `tid` claim

### Transport Security

| Component | Encryption |
|-----------|------------|
| Client → Gateway | TLS 1.3 (HTTPS) |
| Gateway → Microservices | TLS (prod), TCP (dev only) |

### Rate Limiting

- Per-tenant: Configurable requests/minute
- Per-identity: Burst protection
- Global: Circuit breaker for overload
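
The per-tenant limits from §7 (`requestsPerMinute` plus `burstSize`) map naturally onto a token bucket: steady refill at the per-minute rate, capacity equal to the burst. A Python sketch of the idea (illustrative only; the production limiter shares its state across gateway instances via Valkey/Redis):

```python
class TokenBucket:
    """Token bucket: refill at rate_per_min/60 tokens per second,
    capacity equal to the burst size."""

    def __init__(self, rate_per_min: float, burst: int):
        self.rate = rate_per_min / 60.0
        self.capacity = float(burst)
        self.tokens = float(burst)
        self.last = 0.0

    def allow(self, now: float) -> bool:
        # Refill based on elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate_per_min=60, burst=2)   # 1 token/s, burst of 2
assert bucket.allow(0.0)      # burst token 1
assert bucket.allow(0.0)      # burst token 2
assert not bucket.allow(0.0)  # bucket drained
assert bucket.allow(1.0)      # one second later, one token refilled
```

Keeping one bucket per `tid` claim gives the per-tenant behavior; a second, smaller bucket keyed by identity gives the burst protection.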
---

## 10) Observability & Audit

### Metrics (Prometheus)

```
gateway_requests_total{service,method,path,status}
gateway_request_duration_seconds{service,method,path,quantile}
gateway_active_connections{service}
gateway_transport_frames_total{type}
gateway_auth_failures_total{reason}
gateway_rate_limit_exceeded_total{tenant}
```

### Traces (OpenTelemetry)

- Span per request: `gateway.route`
- Child span: `gateway.auth.validate`
- Child span: `gateway.transport.send`

### Logs (Structured)

```json
{
  "timestamp": "2025-12-21T10:00:00Z",
  "level": "info",
  "message": "Request routed",
  "correlationId": "abc123",
  "tenantId": "tenant-1",
  "method": "GET",
  "path": "/api/v1/scans/xyz",
  "service": "scanner",
  "durationMs": 45,
  "status": 200
}
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 11) Testing Matrix

| Test Type | Scope | Coverage Target |
|-----------|-------|-----------------|
| Unit | Routing algorithm, auth validation | 90% |
| Integration | Transport + routing flow | 80% |
| E2E | Full request path with mock services | Key flows |
| Performance | Latency, throughput, connection limits | SLO targets |
| Chaos | Connection failures, microservice crashes | Resilience |

### Test Fixtures

- `StellaOps.Router.Transport.InMemory` for transport mocking
- Mock Authority for auth testing
- `WebApplicationFactory` for integration tests

---
## 12) DevOps & Operations

### Deployment

```yaml
# Kubernetes deployment excerpt
apiVersion: apps/v1
kind: Deployment
metadata:
  name: gateway
spec:
  replicas: 3
  template:
    spec:
      containers:
        - name: gateway
          image: stellaops/gateway:1.0.0
          ports:
            - containerPort: 8080 # HTTPS
            - containerPort: 9443 # TLS (microservices)
          resources:
            requests:
              memory: "256Mi"
              cpu: "250m"
            limits:
              memory: "512Mi"
              cpu: "1000m"
          livenessProbe:
            httpGet:
              path: /health/live
              port: 8080
          readinessProbe:
            httpGet:
              path: /health/ready
              port: 8080
```
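The two probe paths in the manifest above carry different semantics: liveness answers "is the process alive?", readiness answers "should this pod receive traffic?". A minimal sketch of that split (the dependency checks are illustrative placeholders, not the gateway's real health logic):

```python
# Sketch: liveness vs readiness behind /health/live and /health/ready.
def live():
    # Liveness: the process is up; Kubernetes restarts the pod if this fails.
    return 200, "OK"

def ready(transport_connected, authority_reachable):
    # Readiness: take the pod out of load balancing until deps are healthy.
    if transport_connected and authority_reachable:
        return 200, "READY"
    return 503, "NOT_READY"

# Route table mirroring the probe paths from the deployment excerpt.
routes = {"/health/live": lambda: live(),
          "/health/ready": lambda: ready(True, True)}
```

A failing readiness check sheds traffic without triggering a restart, which is why the two endpoints are kept separate.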
### SLOs

| SLO | Target | Measurement |
|-----|--------|-------------|
| Availability | 99.9% | Uptime over 30 days |
| Latency P99 | <50ms | Includes downstream |
| Error rate | <0.1% | 5xx responses |

---
## 13) Roadmap

| Feature | Sprint | Status |
|---------|--------|--------|
| Core implementation | 3600.0001.0001 | TODO |
| WebSocket support | Future | Planned |
| gRPC passthrough | Future | Planned |
| GraphQL aggregation | Future | Exploration |

---
## 14) References

- Router Architecture: `docs/modules/router/architecture.md`
- OpenAPI Aggregation: `docs/modules/gateway/openapi.md`
- Authority Integration: `docs/modules/authority/architecture.md`
- Reference Architecture: `docs/product-advisories/archived/2025-12-21-reference-architecture/`

---

**Last Updated**: 2025-12-21 (Sprint 3600)
docs/modules/platform/reference-architecture-card.md (new file, 223 lines)
@@ -0,0 +1,223 @@
# Stella Ops Reference Architecture Card (Dec 2025)

> **One-Pager** for product managers, architects, and auditors.
> Full specification: `docs/07_HIGH_LEVEL_ARCHITECTURE.md`

---
## Topology & Trust Boundaries

```
┌─────────────────────────────────────────────────────────────────────────────┐
│ TRUST BOUNDARY 1                                                            │
│ ┌─────────────────┐                                                         │
│ │   EDGE LAYER    │  StellaRouter (Gateway) / UI                            │
│ │                 │  OAuth2/OIDC Authentication                             │
│ └────────┬────────┘                                                         │
│          │ Signed credentials/attestations required                         │
├──────────┼──────────────────────────────────────────────────────────────────┤
│          ▼ TRUST BOUNDARY 2                                                 │
│ ┌─────────────────────────────────────────────────────────────────────┐     │
│ │                          CONTROL PLANE                              │     │
│ │                                                                     │     │
│ │  ┌──────────┐  ┌──────────┐  ┌──────────┐  ┌──────────┐             │     │
│ │  │Scheduler │  │ Policy   │  │Authority │  │ Attestor │             │     │
│ │  │          │  │ Engine   │  │          │  │          │             │     │
│ │  │ Routes   │  │ Signed   │  │ Keys &   │  │ DSSE +   │             │     │
│ │  │ work     │  │ verdicts │  │ identity │  │ Rekor    │             │     │
│ │  └──────────┘  └──────────┘  └──────────┘  └──────────┘             │     │
│ │                                                                     │     │
│ │  ┌──────────────────────────────────────┐                           │     │
│ │  │         Timeline / Notify            │                           │     │
│ │  │   Immutable audit + notifications    │                           │     │
│ │  └──────────────────────────────────────┘                           │     │
│ └─────────────────────────────────────────────────────────────────────┘     │
│          │ Only blessed evidence/identities influence decisions             │
├──────────┼──────────────────────────────────────────────────────────────────┤
│          ▼ TRUST BOUNDARY 3                                                 │
│ ┌─────────────────────────────────────────────────────────────────────┐     │
│ │                         EVIDENCE PLANE                              │     │
│ │                                                                     │     │
│ │  ┌──────────┐  ┌──────────┐  ┌──────────┐  ┌──────────┐             │     │
│ │  │ Sbomer   │  │Excititor │  │Concelier │  │Reachabil-│             │     │
│ │  │          │  │          │  │          │  │ity/Sigs  │             │     │
│ │  │CDX 1.7 / │  │ VEX      │  │Advisory  │  │ Is vuln  │             │     │
│ │  │SPDX 3.0.1│  │ claims   │  │ feeds    │  │reachable?│             │     │
│ │  └──────────┘  └──────────┘  └──────────┘  └──────────┘             │     │
│ └─────────────────────────────────────────────────────────────────────┘     │
│          │ Tamper-evident, separately signed; opinions in Policy only       │
├──────────┼──────────────────────────────────────────────────────────────────┤
│          ▼ TRUST BOUNDARY 4                                                 │
│ ┌─────────────────────────────────────────────────────────────────────┐     │
│ │                           DATA PLANE                                │     │
│ │                                                                     │     │
│ │  ┌──────────────────────────────────────────────────────────────┐   │     │
│ │  │                    Workers / Scanners                        │   │     │
│ │  │  Pull tasks → compute → emit artifacts + attestations        │   │     │
│ │  │  Isolated per tenant; outputs tied to inputs cryptographically│  │     │
│ │  └──────────────────────────────────────────────────────────────┘   │     │
│ └─────────────────────────────────────────────────────────────────────┘     │
└─────────────────────────────────────────────────────────────────────────────┘
```

---
## Artifact Association (OCI Referrers)

```
Image Digest (Subject)
│
├──► SBOM (CycloneDX 1.7 / SPDX 3.0.1)
│    └──► DSSE Attestation
│         └──► Rekor Log Entry
│
├──► VEX Claims
│    └──► DSSE Attestation
│
├──► Reachability Subgraph
│    └──► DSSE Attestation
│
└──► Policy Verdict
     └──► DSSE Attestation
          └──► Rekor Log Entry
```

- Each image digest is the **subject** in the registry
- SBOMs, VEX, verdicts attached as **OCI referrers**
- Multiple versioned, signed facts per image without altering the image
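Discovery of these referrers goes through the OCI 1.1 `referrers` endpoint. A sketch of the lookup shape — the endpoint path is from the OCI distribution spec, while the registry host and artifact types below are illustrative:

```python
# Sketch: locating evidence attached to an image via OCI referrers.
def referrers_url(registry, repository, digest):
    # GET /v2/<name>/referrers/<digest> returns an OCI image index
    # whose manifests each carry an artifactType.
    return f"https://{registry}/v2/{repository}/referrers/{digest}"

def select_by_type(index, artifact_type):
    """Filter a referrers index for one artifact type (e.g. an SBOM)."""
    return [m for m in index.get("manifests", [])
            if m.get("artifactType") == artifact_type]

# Illustrative index, as a registry might return it.
index = {"manifests": [
    {"digest": "sha256:aaa", "artifactType": "application/vnd.cyclonedx+json"},
    {"digest": "sha256:bbb", "artifactType": "application/vnd.openvex+json"},
]}
sboms = select_by_type(index, "application/vnd.cyclonedx+json")
```

Because the subject digest never changes, new evidence can be pushed and discovered without re-tagging or mutating the image.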
---
## Data Flows

### Evidence Flow

```
Workers ──► SBOM (CDX 1.7) ──► DSSE Sign ──► OCI Referrer ──► Registry
        ├─► VEX Claims     ──► DSSE Sign ──► OCI Referrer ──►
        ├─► Reachability   ──► DSSE Sign ──► OCI Referrer ──►
        └─► All wrapped as in-toto attestations
```

### Verdict Flow

```
Policy Engine ──► Ingests SBOM/VEX/Reachability/Signals
              ──► Applies rules (deterministic IR)
              ──► Emits signed verdict
              ──► Verdict attached via OCI referrer
              ──► Replayable: same inputs → same output
```

### Audit Flow

```
Timeline ──► Captures all events (immutable)
         ──► Links to attestation digests
         ──► Enables replay and forensics
```
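The "same inputs → same output" property becomes checkable when verdicts are content-addressed over a canonical serialization: logically equal verdicts must hash to the same digest. A sketch with illustrative field names:

```python
# Sketch: content-addressing a verdict so replays are verifiable.
import hashlib
import json

def verdict_digest(verdict):
    # Canonicalize: sorted keys, no whitespace, so logically equal
    # verdicts serialize to identical bytes.
    canonical = json.dumps(verdict, sort_keys=True, separators=(",", ":"))
    return "sha256:" + hashlib.sha256(canonical.encode()).hexdigest()

# Same content, different key order — must yield the same digest.
a = {"subject": "sha256:abc", "decision": "pass", "policy": "v3"}
b = {"policy": "v3", "decision": "pass", "subject": "sha256:abc"}
```

A replay then reduces to recomputing the digest from frozen inputs and comparing it against the attested one.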
---
## Tenant Isolation

| Layer | Mechanism |
|-------|-----------|
| Database | PostgreSQL RLS (Row-Level Security) |
| Application | AsyncLocal tenant context |
| Storage | Tenant-scoped paths |
| Crypto | Per-tenant keys & trust roots |
| Network | Tenant header propagation |

---
## Minimal Day-1 Policy

```yaml
rules:
  # Block reachable HIGH/CRITICAL unless VEX says not_affected
  - match: { severity: [CRITICAL, HIGH], reachability: reachable }
    unless: { vexStatus: not_affected }
    action: block

  # Fail on >5% unknowns
  - match: { unknownsRatio: { gt: 0.05 } }
    action: block

  # Require signed SBOM + verdict for production
  - match: { environment: production }
    require: { signedSbom: true, signedVerdict: true }
```
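Read operationally, the first two rules reduce to a predicate like the following sketch. The evaluator is illustrative — a plain reduction of the YAML above, not the Policy Engine's deterministic IR:

```python
# Sketch: the blocking predicate implied by the Day-1 rules.
def blocks(finding):
    # Rule 1: reachable HIGH/CRITICAL blocks unless VEX says not_affected.
    if (finding["severity"] in ("CRITICAL", "HIGH")
            and finding["reachability"] == "reachable"
            and finding.get("vexStatus") != "not_affected"):
        return True
    # Rule 2: too large an unknowns ratio blocks outright.
    if finding.get("unknownsRatio", 0.0) > 0.05:
        return True
    return False

suppressed = {"severity": "HIGH", "reachability": "reachable",
              "vexStatus": "not_affected", "unknownsRatio": 0.01}
exploitable = {"severity": "CRITICAL", "reachability": "reachable",
               "unknownsRatio": 0.01}
```

Note how the `unless` clause turns a VEX `not_affected` claim into a suppression, while everything else about the finding stays unchanged.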
---
## SBOM Format Support

| Format | Generation | Parsing | Notes |
|--------|------------|---------|-------|
| CycloneDX 1.7 | Yes | Yes | Primary format |
| CycloneDX 1.6 | - | Yes | Backward compat |
| SPDX 3.0.1 | Yes | Yes | Alternative format |
| SPDX 2.x | - | Yes | Import only |

---
## Key Capabilities

| Capability | Status | Notes |
|------------|--------|-------|
| Deterministic SBOMs | Complete | Same input → same output |
| Signed Verdicts | Complete | DSSE + in-toto |
| Replayable Verdicts | Complete | Content-addressed proofs |
| OCI Referrers | Complete | Subject digest model |
| Rekor Transparency | Complete | v2 tile-backed |
| Tenant Isolation | Complete | RLS + crypto separation |
| Air-Gap Operation | Complete | Offline bundles |
| CycloneDX 1.7 | Planned | Sprint 3600.0002 |
| SPDX 3.0.1 Generation | Planned | Sprint 3600.0003 |
| Gateway WebService | Planned | Sprint 3600.0001 |
| Proof Chain UI | Planned | Sprint 4200.0001 |

---
## Quick Glossary

| Term | Definition |
|------|------------|
| **SBOM** | Software Bill of Materials (what's inside) |
| **VEX** | Vulnerability Exploitability eXchange (is the CVE relevant?) |
| **Reachability** | Graph proof that vulnerable code is (not) callable |
| **DSSE** | Dead Simple Signing Envelope |
| **in-toto** | Supply-chain attestation framework |
| **OCI Referrers** | Registry mechanism to link artifacts to an image digest |
| **OpTok** | Short-lived operation token from Authority |
| **DPoP** | Demonstrating Proof of Possession (RFC 9449) |

---
## Implementation Sprints

| Sprint | Title | Priority |
|--------|-------|----------|
| 3600.0001.0001 | Gateway WebService | HIGH |
| 3600.0002.0001 | CycloneDX 1.7 Upgrade | HIGH |
| 3600.0003.0001 | SPDX 3.0.1 Generation | MEDIUM |
| 4200.0001.0001 | Proof Chain Verification UI | HIGH |
| 5200.0001.0001 | Starter Policy Template | HIGH |

---
## Audit Checklist

- [ ] All SBOMs have DSSE signatures
- [ ] All verdicts have DSSE signatures
- [ ] Rekor log entries exist for production artifacts
- [ ] Tenant isolation verified (RLS + crypto)
- [ ] Replay tokens verify (same inputs → same verdict)
- [ ] Air-gap bundles include all evidence
- [ ] OCI referrers discoverable for all images

---

**Source**: Reference Architecture Advisory (Dec 2025)
**Last Updated**: 2025-12-21
Binary file not shown.
@@ -0,0 +1,81 @@
# Archived Advisory: Mapping Evidence Within Compiled Binaries

**Original Advisory:** `21-Dec-2025 - Mapping Evidence Within Compiled Binaries.md`
**Archived:** 2025-12-21
**Status:** Converted to Implementation Plan

---

## Summary

This advisory proposed building a **Vulnerable Binaries Database** that enables detection of vulnerable code at the binary level, independent of package metadata.

## Implementation Artifacts Created

### Architecture Documentation

- `docs/modules/binaryindex/architecture.md` - Full module architecture
- `docs/db/schemas/binaries_schema_specification.md` - Database schema

### Sprint Files

**Summary:**
- `docs/implplan/SPRINT_6000_SUMMARY.md` - MVP roadmap overview

**MVP 1: Known-Build Binary Catalog (Sprint 6000.0001)**
- `SPRINT_6000_0001_0001_binaries_schema.md` - PostgreSQL schema
- `SPRINT_6000_0001_0002_binary_identity_service.md` - Identity extraction
- `SPRINT_6000_0001_0003_debian_corpus_connector.md` - Debian/Ubuntu ingestion

**MVP 2: Patch-Aware Backport Handling (Sprint 6000.0002)**
- `SPRINT_6000_0002_0001_fix_evidence_parser.md` - Changelog/patch parsing

**MVP 3: Binary Fingerprint Factory (Sprint 6000.0003)**
- `SPRINT_6000_0003_0001_fingerprint_storage.md` - Fingerprint storage

**MVP 4: Scanner Integration (Sprint 6000.0004)**
- `SPRINT_6000_0004_0001_scanner_integration.md` - Scanner.Worker integration

## Key Decisions

| Decision | Rationale |
|----------|-----------|
| New `BinaryIndex` module | Binary vulnerability DB is a distinct concern from Scanner |
| Build-ID as primary key | Most deterministic identifier for ELF binaries |
| `binaries` PostgreSQL schema | Aligns with existing per-module schema pattern |
| Three-tier lookup | Assertions → Build-ID → Fingerprints for precision |
| Patch-aware fix index | Handles distro backports correctly |
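The "Build-ID as primary key" decision rests on the GNU build-id note that toolchains embed in ELF binaries. A sketch of parsing that note's payload — the 4-byte-aligned note layout (`namesz`, `descsz`, `type`, `"GNU\0"`, ID bytes) is from the ELF spec, while the synthetic bytes are illustrative:

```python
# Sketch: extract a GNU Build-ID from a .note.gnu.build-id payload.
import struct

def parse_build_id(note: bytes) -> str:
    namesz, descsz, ntype = struct.unpack_from("<III", note, 0)
    assert ntype == 3, "not an NT_GNU_BUILD_ID note"
    name_end = 12 + namesz
    assert note[12:name_end] == b"GNU\0"
    desc_off = (name_end + 3) & ~3          # pad name to 4-byte boundary
    return note[desc_off:desc_off + descsz].hex()

# Build a synthetic note for illustration: a 20-byte (SHA-1-sized) ID.
build_id = bytes(range(20))
note = struct.pack("<III", 4, 20, 3) + b"GNU\0" + build_id
```

Because the ID is computed at link time from the build inputs, two bit-identical builds share it, which is what makes it a deterministic lookup key.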
## Module Structure

```
src/BinaryIndex/
├── StellaOps.BinaryIndex.WebService/
├── StellaOps.BinaryIndex.Worker/
├── __Libraries/
│   ├── StellaOps.BinaryIndex.Core/
│   ├── StellaOps.BinaryIndex.Persistence/
│   ├── StellaOps.BinaryIndex.Corpus/
│   ├── StellaOps.BinaryIndex.Corpus.Debian/
│   ├── StellaOps.BinaryIndex.FixIndex/
│   └── StellaOps.BinaryIndex.Fingerprints/
└── __Tests/
```

## Database Tables

| Table | Purpose |
|-------|---------|
| `binaries.binary_identity` | Known binary identities |
| `binaries.binary_package_map` | Binary → package mapping |
| `binaries.vulnerable_buildids` | Vulnerable Build-IDs |
| `binaries.cve_fix_index` | Patch-aware fix status |
| `binaries.vulnerable_fingerprints` | Function fingerprints |
| `binaries.fingerprint_matches` | Scan match results |

## References

- Original advisory: This folder
- Architecture: `docs/modules/binaryindex/architecture.md`
- Schema: `docs/db/schemas/binaries_schema_specification.md`
- Sprints: `docs/implplan/SPRINT_6000_*.md`
@@ -0,0 +1,97 @@
# MOAT Gap Closure Archive Manifest

**Archive Date**: 2025-12-21
**Archive Reason**: Product advisories processed and implementation gaps identified

---

## Summary

This archive contains 12 MOAT (Market-Oriented Architecture Transformation) product advisories that were analyzed against the StellaOps codebase. After thorough source code exploration, the implementation coverage was assessed at **~92%**.

---

## Implementation Coverage

| Advisory Topic | Coverage | Notes |
|---------------|----------|-------|
| CVSS and Competitive Analysis | 100% | Full CVSS v4 engine, all attack complexity metrics |
| Determinism and Reproducibility | 100% | Stable ordering, hash chains, replayTokens, NDJSON |
| Developer Onboarding | 100% | AGENTS.md files, CLAUDE.md, module dossiers |
| Offline and Air-Gap | 100% | Bundle system, egress allowlists, offline sources |
| PostgreSQL Patterns | 100% | RLS, tenant isolation, schema per module |
| Proof and Evidence Chain | 100% | ProofSpine, DSSE envelopes, hash chaining |
| Reachability Analysis | 100% | CallGraphAnalyzer, AttackPathScorer, CodePathResult |
| Rekor Integration | 100% | RekorClient, transparency log publishing |
| Smart-Diff | 100% | MaterialRiskChangeDetector, hash-based diffing |
| Testing and Quality Guardrails | 100% | Testcontainers, benchmarks, truth schemas |
| UX and Time-to-Evidence | 100% | EvidencePanel, keyboard shortcuts, motion tokens |
| Triage and Unknowns | 75% | UnknownRanker exists, missing decay/containment |

**Overall**: ~92% implementation coverage

---

## Identified Gaps & Sprint References

Three implementation gaps were identified and documented in sprints:

### Gap 1: Decay Algorithm (Sprint 4000.0001.0001)
- **File**: `docs/implplan/SPRINT_4000_0001_0001_unknowns_decay_algorithm.md`
- **Scope**: Add time-based decay factor to UnknownRanker
- **Story Points**: 15
- **Working Directory**: `src/Policy/__Libraries/StellaOps.Policy.Unknowns/`

### Gap 2: BlastRadius & Containment (Sprint 4000.0001.0002)
- **File**: `docs/implplan/SPRINT_4000_0001_0002_unknowns_blast_radius_containment.md`
- **Scope**: Add BlastRadius and ContainmentSignals to ranking
- **Story Points**: 19
- **Working Directory**: `src/Policy/__Libraries/StellaOps.Policy.Unknowns/`

### Gap 3: EPSS Feed Connector (Sprint 4000.0002.0001)
- **File**: `docs/implplan/SPRINT_4000_0002_0001_epss_feed_connector.md`
- **Scope**: Create Concelier connector for orchestrated EPSS ingestion
- **Story Points**: 22
- **Working Directory**: `src/Concelier/__Libraries/StellaOps.Concelier.Connector.Epss/`

**Total Gap Closure Effort**: 56 story points
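Gap 1's "time-based decay factor" could take the shape of an exponential half-life, so that an unknown's ranking contribution shrinks the longer it sits without new evidence. A sketch — the half-life and score shape are illustrative assumptions, not the UnknownRanker contract:

```python
# Sketch: exponential decay for an unknown's rank contribution.
import math

def decayed_score(base_score: float, age_days: float,
                  half_life_days: float = 30.0) -> float:
    """Exponential decay: the score halves every half_life_days."""
    return base_score * math.pow(0.5, age_days / half_life_days)

fresh = decayed_score(10.0, 0)    # no decay yet
aged = decayed_score(10.0, 30)    # one half-life later
```

A half-life keeps the decay monotone and parameterized by a single tunable, which is convenient for deterministic replay.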
---
## Archived Files (12)

1. `14-Dec-2025 - CVSS and Competitive Analysis Technical Reference.md`
2. `14-Dec-2025 - Determinism and Reproducibility Technical Reference.md`
3. `14-Dec-2025 - Developer Onboarding Technical Reference.md`
4. `14-Dec-2025 - Offline and Air-Gap Technical Reference.md`
5. `14-Dec-2025 - PostgreSQL Patterns Technical Reference.md`
6. `14-Dec-2025 - Proof and Evidence Chain Technical Reference.md`
7. `14-Dec-2025 - Reachability Analysis Technical Reference.md`
8. `14-Dec-2025 - Rekor Integration Technical Reference.md`
9. `14-Dec-2025 - Smart-Diff Technical Reference.md`
10. `14-Dec-2025 - Testing and Quality Guardrails Technical Reference.md`
11. `14-Dec-2025 - Triage and Unknowns Technical Reference.md`
12. `14-Dec-2025 - UX and Time-to-Evidence Technical Reference.md`

---

## Key Discoveries

Features that were discovered to exist with different naming than expected:

| Expected | Actual Implementation |
|----------|----------------------|
| FipsProfile, GostProfile, SmProfile | ComplianceProfiles (unified) |
| FindingsLedger.HashChain | Exists in FindingsSnapshot with replayTokens |
| Benchmark suite | Exists in `__Benchmarks/` directories |
| EvidencePanel | Exists in Web UI with motion tokens |

---

## Post-Closure Target

After completing the three gap-closure sprints:
- Implementation coverage: **95%+**
- All advisory requirements addressed
- Triage/Unknowns module fully featured
@@ -0,0 +1,146 @@
# MOAT Phase 2 Archive Manifest

**Archive Date**: 2025-12-21
**Archive Reason**: Product advisories processed and implementation gaps identified
**Epoch**: 4100 (MOAT Phase 2 - Governance & Replay)

---

## Summary

This archive contains 11 MOAT (Market-Oriented Architecture Transformation) product advisories from 19-Dec and 20-Dec 2025 that were analyzed against the StellaOps codebase. After thorough source code exploration, the implementation coverage was assessed at **~65% baseline** with sprints planned to reach a **~90% target**.

---

## Gap Analysis (from 65% baseline)

| Area | Current | Target | Gap |
|------|---------|--------|-----|
| Security Snapshots & Deltas | 55% | 90% | Unified snapshot, DeltaVerdict |
| Risk Verdict Attestations | 50% | 90% | RVA contract, OCI push |
| VEX Claims Resolution | 80% | 95% | JSON parsing, evidence providers |
| Unknowns First-Class | 60% | 95% | Reason codes, budgets, attestations |
| Knowledge Snapshots | 60% | 90% | Manifest, ReplayEngine |
| Risk Budgets & Gates | 20% | 80% | RP scoring, gate levels |

---

## Sprint Structure (10 Sprints, 167 Story Points)

### Batch 4100.0001: Unknowns Enhancement (40 pts)

| Sprint | Topic | Points | Status |
|--------|-------|--------|--------|
| 4100.0001.0001 | Reason-Coded Unknowns | 15 | Planned |
| 4100.0001.0002 | Unknown Budgets & Env Thresholds | 13 | Planned |
| 4100.0001.0003 | Unknowns in Attestations | 12 | Planned |

### Batch 4100.0002: Knowledge Snapshots & Replay (55 pts)

| Sprint | Topic | Points | Status |
|--------|-------|--------|--------|
| 4100.0002.0001 | Knowledge Snapshot Manifest | 18 | Planned |
| 4100.0002.0002 | Replay Engine | 22 | Planned |
| 4100.0002.0003 | Snapshot Export/Import | 15 | Planned |

### Batch 4100.0003: Risk Verdict & OCI (34 pts)

| Sprint | Topic | Points | Status |
|--------|-------|--------|--------|
| 4100.0003.0001 | Risk Verdict Attestation Contract | 16 | Planned |
| 4100.0003.0002 | OCI Referrer Push & Discovery | 18 | Planned |

### Batch 4100.0004: Deltas & Gates (38 pts)

| Sprint | Topic | Points | Status |
|--------|-------|--------|--------|
| 4100.0004.0001 | Security State Delta & Verdict | 20 | Planned |
| 4100.0004.0002 | Risk Budgets & Gate Levels | 18 | Planned |

---

## Sprint File References

| Sprint | File |
|--------|------|
| 4100.0001.0001 | `docs/implplan/SPRINT_4100_0001_0001_reason_coded_unknowns.md` |
| 4100.0001.0002 | `docs/implplan/SPRINT_4100_0001_0002_unknown_budgets.md` |
| 4100.0001.0003 | `docs/implplan/SPRINT_4100_0001_0003_unknowns_attestations.md` |
| 4100.0002.0001 | `docs/implplan/SPRINT_4100_0002_0001_knowledge_snapshot_manifest.md` |
| 4100.0002.0002 | `docs/implplan/SPRINT_4100_0002_0002_replay_engine.md` |
| 4100.0002.0003 | `docs/implplan/SPRINT_4100_0002_0003_snapshot_export_import.md` |
| 4100.0003.0001 | `docs/implplan/SPRINT_4100_0003_0001_risk_verdict_attestation.md` |
| 4100.0003.0002 | `docs/implplan/SPRINT_4100_0003_0002_oci_referrer_push.md` |
| 4100.0004.0001 | `docs/implplan/SPRINT_4100_0004_0001_security_state_delta.md` |
| 4100.0004.0002 | `docs/implplan/SPRINT_4100_0004_0002_risk_budgets_gates.md` |

---

## Archived Files (11)

### 19-Dec-2025 Moat Advisories (7)

1. `19-Dec-2025 - Moat #1.md`
2. `19-Dec-2025 - Moat #2.md`
3. `19-Dec-2025 - Moat #3.md`
4. `19-Dec-2025 - Moat #4.md`
5. `19-Dec-2025 - Moat #5.md`
6. `19-Dec-2025 - Moat #6.md`
7. `19-Dec-2025 - Moat #7.md`

### 20-Dec-2025 Moat Explanation Advisories (4)

8. `20-Dec-2025 - Moat Explanation - Exception management as auditable objects.md`
9. `20-Dec-2025 - Moat Explanation - Guidelines for Product and Development Managers - Signed, Replayable Risk Verdicts.md`
10. `20-Dec-2025 - Moat Explanation - Knowledge Snapshots and Time-Travel Replay.md`
11. `20-Dec-2025 - Moat Explanation - Risk Budgets and Diff-Aware Release Gates.md`

---

## Key New Concepts

| Concept | Description | Sprint |
|---------|-------------|--------|
| UnknownReasonCode | 7 reason codes: U-RCH, U-ID, U-PROV, U-VEX, U-FEED, U-CONFIG, U-ANALYZER | 4100.0001.0001 |
| UnknownBudget | Environment-aware thresholds (prod: block, stage: warn, dev: warn_only) | 4100.0001.0002 |
| KnowledgeSnapshotManifest | Content-addressed bundle (ksm:sha256:{hash}) | 4100.0002.0001 |
| ReplayEngine | Time-travel replay with frozen inputs for determinism verification | 4100.0002.0002 |
| RiskVerdictAttestation | PASS/FAIL/PASS_WITH_EXCEPTIONS/INDETERMINATE verdicts | 4100.0003.0001 |
| OCI Referrer Push | OCI 1.1 referrers API with fallback to tagged indexes | 4100.0003.0002 |
| SecurityStateDelta | Baseline vs target comparison with DeltaVerdict | 4100.0004.0001 |
| GateLevel | G0-G4 diff-aware release gates with RP scoring | 4100.0004.0002 |
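The `ksm:sha256:{hash}` identifier above implies the snapshot manifest is content-addressed: hash a canonical serialization of the frozen inputs and prefix it. A sketch with illustrative manifest fields:

```python
# Sketch: deriving a content-addressed knowledge snapshot identifier.
import hashlib
import json

def snapshot_id(manifest: dict) -> str:
    # Canonical JSON (sorted keys, no whitespace) makes the id stable
    # across serializers, which is what replay depends on.
    canonical = json.dumps(manifest, sort_keys=True, separators=(",", ":"))
    return "ksm:sha256:" + hashlib.sha256(canonical.encode()).hexdigest()

manifest = {
    "feeds": {"nvd": "sha256:aaa", "epss": "sha256:bbb"},
    "policy": "sha256:ccc",
    "capturedAt": "2025-12-21T00:00:00Z",
}
sid = snapshot_id(manifest)
```

Two parties holding the same frozen feeds and policy can then agree on the snapshot they replayed against by comparing one string.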
---

## Recommended Parallel Execution

```
Phase 1: 4100.0001.0001 + 4100.0002.0001 + 4100.0003.0001 + 4100.0004.0002
Phase 2: 4100.0001.0002 + 4100.0002.0002 + 4100.0003.0002
Phase 3: 4100.0001.0003 + 4100.0002.0003 + 4100.0004.0001
```

---

## Success Criteria

| Metric | Target |
|--------|--------|
| Reason-coded unknowns | 7 codes implemented |
| Unknown budget tests | 5+ passing |
| Knowledge snapshot tests | 8+ passing |
| Replay engine golden tests | 10+ passing |
| RVA verification tests | 6+ passing |
| OCI push integration tests | 4+ passing |
| Delta computation tests | 6+ passing |
| Overall MOAT coverage | 85%+ |

---

## Post-Closure Target

After completing all 10 sprints:
- Implementation coverage: **90%+**
- All Phase 2 advisory requirements addressed
- Full governance and replay capabilities
- Risk budgets and gate levels operational
@@ -1,12 +1,12 @@
|
|||||||
Here’s a compact, practical plan to harden Stella Ops around **offline‑ready security evidence and deterministic verdicts**, with just enough background so it all clicks.
|
Here's a compact, practical plan to harden Stella Ops around **offline‑ready security evidence and deterministic verdicts**, with just enough background so it all clicks.
|
||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
# Why this matters (quick primer)
|
# Why this matters (quick primer)
|
||||||
|
|
||||||
* **Air‑gapped/offline**: Many customers can’t reach public feeds or registries. Your scanners, SBOM tooling, and attestations must work with **pre‑synced bundles** and prove what data they used.
|
* **Air‑gapped/offline**: Many customers can't reach public feeds or registries. Your scanners, SBOM tooling, and attestations must work with **pre‑synced bundles** and prove what data they used.
|
||||||
* **Interoperability**: Teams mix tools (Syft/Grype/Trivy, cosign, CycloneDX/SPDX). Your CI should **round‑trip** SBOMs and attestations end‑to‑end and prove that downstream consumers (e.g., Grype) can load them.
|
* **Interoperability**: Teams mix tools (Syft/Grype/Trivy, cosign, CycloneDX/SPDX). Your CI should **round‑trip** SBOMs and attestations end‑to‑end and prove that downstream consumers (e.g., Grype) can load them.
|
||||||
* **Determinism**: Auditors expect **“same inputs → same verdict.”** Capture inputs, policies, and feed hashes so a verdict is exactly reproducible later.
|
* **Determinism**: Auditors expect **"same inputs → same verdict."** Capture inputs, policies, and feed hashes so a verdict is exactly reproducible later.
|
||||||
* **Operational guardrails**: Shipping gates should fail early on **unknowns** and apply **backpressure** gracefully when load spikes.
|
* **Operational guardrails**: Shipping gates should fail early on **unknowns** and apply **backpressure** gracefully when load spikes.
|
||||||
|
|
||||||
---
|
---
|
||||||
@@ -15,14 +15,14 @@ Here’s a compact, practical plan to harden Stella Ops around **offline‑rea
|
|||||||
|
|
||||||
1. **Air‑gapped operation e2e**
|
1. **Air‑gapped operation e2e**
|
||||||
|
|
||||||
* Package “offline bundle” (vuln feeds, package catalogs, policy/lattice rules, certs, keys).
|
* Package "offline bundle" (vuln feeds, package catalogs, policy/lattice rules, certs, keys).
|
||||||
* Run scans (containers, OS, language deps, binaries) **without network**.
|
* Run scans (containers, OS, language deps, binaries) **without network**.
|
||||||
* Assert: SBOMs generated, attestations signed/verified, verdicts emitted.
|
* Assert: SBOMs generated, attestations signed/verified, verdicts emitted.
|
||||||
* Evidence: manifest of bundle contents + hashes in the run log.
|
* Evidence: manifest of bundle contents + hashes in the run log.
|
||||||
|
|
||||||
2. **Interop round‑trips (SBOM ⇄ attestation ⇄ scanner)**
|
2. **Interop round‑trips (SBOM ⇄ attestation ⇄ scanner)**
|
||||||
|
|
||||||
* Produce SBOM (CycloneDX 1.6 and SPDX 3.0.1) with Syft.
|
* Produce SBOM (CycloneDX 1.6 and SPDX 3.0.1) with Syft.
|
||||||
* Create **DSSE/cosign** attestation for that SBOM.
|
* Create **DSSE/cosign** attestation for that SBOM.
|
||||||
* Verify consumer tools:
|
* Verify consumer tools:
|
||||||
|
|
||||||
@@ -33,11 +33,11 @@ Here’s a compact, practical plan to harden Stella Ops around **offline‑rea
|
|||||||
3. **Replayability (delta‑verdicts + strict replay)**
|
3. **Replayability (delta‑verdicts + strict replay)**
|
||||||
|
|
||||||
* Store input set: artifact digest(s), SBOM digests, policy version, feed digests, lattice rules, tool versions.
|
* Store input set: artifact digest(s), SBOM digests, policy version, feed digests, lattice rules, tool versions.
|
||||||
* Re‑run later; assert **byte‑identical verdict** and same “delta‑verdict” when inputs unchanged.
|
* Re‑run later; assert **byte‑identical verdict** and same "delta‑verdict" when inputs unchanged.
|
||||||
|
|
||||||
4. **Unknowns-budget policy gates**

   * Inject controlled "unknown" conditions (missing CPE mapping, unresolved package source, unparsed distro).
   * Gate: **fail the build if unknowns > budget** (e.g., prod = 0, staging ≤ N).
   * Assert: UI, CLI, and attestation all record the unknown counts and the gate decision.
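The gate logic above is simple enough to sketch directly; this Python stand-in (budgets and error codes are hypothetical) shows the decision recorded alongside the counts:

```python
# Per-environment unknowns budgets (values are illustrative).
BUDGETS = {"prod": 0, "staging": 5}

def gate(unknown_codes: list, env: str) -> dict:
    """Fail the build when unknowns exceed the environment's budget;
    always record the counts for UI/CLI/attestation to surface."""
    budget = BUDGETS[env]
    count = len(unknown_codes)
    return {
        "env": env,
        "unknown_count": count,
        "budget": budget,
        "decision": "fail" if count > budget else "pass",
    }

assert gate(["MISSING_CPE"], "prod")["decision"] == "fail"     # prod budget is 0
assert gate(["MISSING_CPE"], "staging")["decision"] == "pass"  # within staging budget
```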
5. **Attestation provenance chain**
   * Produce: build-provenance (in-toto/DSSE), SBOM attest, VEX attest, and the final **verdict attest**.
   * Verify: signature (cosign), certificate chain, time-stamping, Rekor-style (or mirror) inclusion when online; cached proofs when offline.
   * Assert: each attestation is linked in the verdict's evidence index.
6. **Router backpressure chaos (HTTP 429/503 + Retry-After)**
7. **UI reducer tests for reachability & VEX chips**

   * Component tests: large SBOM graphs, focused **reachability subgraphs**, and VEX status chips (affected / not-affected / under-investigation).
   * Assert: stable rendering at 50k+ nodes; interactions stay under 200 ms.
---
* The router under burst emits a **correct Retry-After** and recovers cleanly.
* The UI handles huge graphs; VEX chips never desync from evidence.
If you want, I'll turn this into GitLab/Gitea pipeline YAML plus a tiny sample repo (image, SBOM, policies, and goldens) so your team can plug and play.
Below is a complete, end-to-end testing strategy for Stella Ops that turns your moats (offline readiness, deterministic replayable verdicts, lattice/policy decisioning, attestation provenance, unknowns budgets, router backpressure, UI reachability evidence) into continuously verified guarantees.

---
A scan/verdict is *deterministic* iff **same inputs → byte-identical outputs**.
### 1.2 Offline by default

Every CI job (except those explicitly tagged "online") runs with **no egress**.

* The offline bundle is a mandatory input for scanning.
* Any attempted network call fails the test (proving air-gap compliance).
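One way to make "any attempted network call fails the test" concrete is to install a guard over the socket layer for the duration of a test. A minimal Python sketch of the idea (the real suite would do this in its C# test base class):

```python
# Sketch: deny egress by replacing socket.socket.connect for the test's
# duration; any connection attempt raises and fails the test.
import socket

class EgressAttempted(RuntimeError):
    pass

_real_connect = socket.socket.connect

def _deny(self, address):
    raise EgressAttempted(f"network egress attempted: {address!r}")

socket.socket.connect = _deny  # install the guard
try:
    try:
        # 203.0.113.1 is a TEST-NET address; no DNS lookup is needed.
        socket.create_connection(("203.0.113.1", 443), timeout=0.1)
        blocked = False
    except EgressAttempted:
        blocked = True  # the harness would record a test failure here
finally:
    socket.socket.connect = _real_connect  # restore for other tests

assert blocked
```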
### 1.3 Evidence-first validation

No assertion is "verdict == pass" without verifying the chain of evidence:

* the verdict references SBOM digest(s)
* the SBOM references artifact digest(s)
* VEX claims reference vulnerabilities + components + reachability evidence
* attestations verify cryptographically and chain to configured roots.
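The first two links of that chain can be sketched as a digest walk (Python stand-in; document shapes are hypothetical): the verdict must name the digest of the SBOM it was computed from, and that SBOM must name the digest of the scanned artifact.

```python
import hashlib
import json

def digest(obj) -> str:
    """Digest of a canonically serialized document."""
    return "sha256:" + hashlib.sha256(
        json.dumps(obj, sort_keys=True).encode()).hexdigest()

# Hypothetical artifact -> SBOM -> verdict chain.
artifact = {"layers": ["l1", "l2"]}
sbom = {"artifact_digest": digest(artifact), "components": ["pkg-a"]}
verdict = {"sbom_digests": [digest(sbom)], "status": "pass"}

def chain_ok(verdict, sbom, artifact) -> bool:
    """Evidence-first check: every link must resolve by digest."""
    return (digest(sbom) in verdict["sbom_digests"]
            and sbom["artifact_digest"] == digest(artifact))

assert chain_ok(verdict, sbom, artifact)
```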
### 1.4 Interop is required, not "nice to have"

Stella Ops must round-trip with:

* Attestation: DSSE / in-toto-style envelopes, cosign-compatible flows
* Consumer scanners: at least Grype from SBOM; ideally Trivy as a cross-check

Interop tests are treated as "compatibility contracts" and block releases.
### 1.5 Architectural boundary enforcement (your standing rule)

* Lattice/policy merge algorithms run **in `scanner.webservice`**.
* `Concelier` and `Excitors` must "preserve prune source".

This is enforced with tests that detect forbidden behavior (see §6.2).
---

## 2) The test portfolio (what kinds of tests exist)

Think "coverage by risk", not "coverage by lines".

### 2.1 Test layers and what they prove
2. **Property-based tests** (FsCheck)

   * "Reordering inputs does not change the verdict hash"
   * "Graph merge is associative/commutative where the policy declares it"
   * "Unknowns budgets are always monotonic in missing evidence"
   * Parser robustness: arbitrary JSON fed to SBOM/VEX envelopes never crashes
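The first property, order-invariance of the verdict hash, can be sketched with a pure-stdlib stand-in for FsCheck (random shuffles instead of generated cases): the implementation canonically sorts findings before hashing, so any reordering yields the same digest.

```python
# Property stand-in: reordering findings never changes the verdict hash.
import hashlib
import json
import random

def verdict_hash(findings: list) -> str:
    # Canonical sort makes the hash independent of input order.
    canon = sorted(findings, key=lambda f: json.dumps(f, sort_keys=True))
    return hashlib.sha256(json.dumps(canon, sort_keys=True).encode()).hexdigest()

findings = [{"cve": f"CVE-2025-{i}", "pkg": f"p{i}"} for i in range(20)]
baseline = verdict_hash(findings)

for _ in range(100):  # 100 random permutations, one property
    shuffled = findings[:]
    random.shuffle(shuffled)
    assert verdict_hash(shuffled) == baseline
```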
3. **Component tests** (service + Postgres; optional Valkey)
* Router → scanner.webservice → attestor → storage
* Offline bundle import/export
* Knowledge snapshot "time travel" replay pipeline
6. **End-to-end tests** (realistic flows)
### 3.2 Environment isolation

* Containers are started with **no network** unless a test explicitly declares "online".
* For Kubernetes e2e: apply a default-deny egress NetworkPolicy.
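A default-deny egress NetworkPolicy for the e2e namespace might look like the sketch below (the namespace name is an assumption; adapt selectors as needed):

```yaml
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: default-deny-egress
  namespace: stellaops-e2e   # assumed namespace for the e2e run
spec:
  podSelector: {}            # applies to every pod in the namespace
  policyTypes:
    - Egress
  egress: []                 # no egress rules -> all egress denied
```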
### 3.3 Golden corpora repository (your "truth set")

Create a versioned `stellaops-test-corpus/` containing:
Bundle includes:

* crypto provider modules (for sovereign readiness)
* optional: a Rekor mirror snapshot / inclusion-proofs cache

**Test invariant:** an offline scan is blocked if the bundle is missing required parts; the error is explicit and counts as "unknown" only where policy says so.
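That invariant amounts to a completeness check over the bundle manifest before a scan starts. A minimal Python sketch (the part names and error codes are hypothetical):

```python
# Sketch: block an offline scan when the bundle manifest is incomplete,
# with one explicit error code per missing part.
REQUIRED_PARTS = {"feeds", "policies", "crypto-providers"}

def check_bundle(manifest: dict) -> list:
    """Return explicit error codes for each missing required part;
    an empty list means the scan may proceed."""
    present = set(manifest.get("parts", []))
    return sorted(f"BUNDLE_MISSING_{p.upper().replace('-', '_')}"
                  for p in REQUIRED_PARTS - present)

errors = check_bundle({"parts": ["feeds", "policies"]})
assert errors == ["BUNDLE_MISSING_CRYPTO_PROVIDERS"]  # blocked, explicitly
assert check_bundle({"parts": list(REQUIRED_PARTS)}) == []  # complete bundle
```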
### 4.3 Evidence Index

The verdict is not the product; the product is verdict + evidence graph:

* their digests and verification status
* an unknowns list with codes + remediation hints

**Test invariant:** every "not affected" claim carries the evidence hooks required by policy ("because the feature flag is off", etc.); otherwise it becomes unknown/fail.
---
These are your release blockers.

* Assertions:

  * verdict bytes identical
  * evidence index identical (except the allowed "execution metadata" section)
  * the delta verdict is an "empty delta"
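The "empty delta" assertion can be sketched as a field-level diff that ignores only the allowed execution-metadata section (Python stand-in; document shapes are hypothetical):

```python
# Sketch: diff two verdict documents; the delta must be empty outside
# the allowed "execution metadata" section.
ALLOWED = {"execution_metadata"}

def delta(old: dict, new: dict) -> dict:
    changed = {k for k in old.keys() | new.keys()
               if old.get(k) != new.get(k)}
    return {"changed": sorted(changed - ALLOWED)}

v1 = {"findings": ["CVE-1"], "execution_metadata": {"ran_at": "t0"}}
v2 = {"findings": ["CVE-1"], "execution_metadata": {"ran_at": "t1"}}

assert delta(v1, v2) == {"changed": []}  # empty delta despite the new timestamp
```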
### Flow D: Diff-aware delta verdict (smart-diff)
* clients back off; no request loss
* metrics expose throttling reasons
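The "clients back off; no request loss" expectation is easy to pin down in a harness: the client must honor Retry-After on 429/503 and keep the request in flight until it succeeds. A minimal Python sketch (transport and signatures are hypothetical):

```python
# Sketch: a client that honors Retry-After under 429/503 without
# dropping the request.
def send_with_backoff(transport, request, max_attempts=5):
    """transport(request) returns (status, retry_after_seconds)."""
    waited = []
    for _ in range(max_attempts):
        status, retry_after = transport(request)
        if status not in (429, 503):
            return status, waited
        waited.append(retry_after)  # a real client would sleep here
    raise RuntimeError("gave up after max_attempts")

# Simulated router: two throttled responses, then success.
responses = iter([(429, 1), (503, 2), (200, None)])
status, waited = send_with_backoff(lambda req: next(responses), "scan-1")
assert status == 200 and waited == [1, 2]  # backed off exactly as instructed
```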
### Flow G: Evidence export ("audit pack")

* Run a scan
* Export a sealed audit pack (bundle + run manifest + evidence + verdict)
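"Sealed" here can be read as: a deterministic archive plus a digest manifest, so an importer can verify every member before trusting the verdict. A Python stdlib sketch of the packing side (member names are hypothetical):

```python
# Sketch: seal an audit pack as a deterministic tarball plus a
# digest manifest over its members.
import hashlib
import io
import json
import tarfile

def seal_audit_pack(members: dict) -> bytes:
    manifest = {name: hashlib.sha256(data).hexdigest()
                for name, data in members.items()}
    members = dict(members)
    members["manifest.json"] = json.dumps(manifest, sort_keys=True).encode()
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for name, data in sorted(members.items()):  # stable member order
            info = tarfile.TarInfo(name)
            info.size = len(data)
            info.mtime = 0  # fixed timestamp keeps the pack deterministic
            tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()

pack_a = seal_audit_pack({"verdict.json": b"{}", "sbom.json": b"[]"})
pack_b = seal_audit_pack({"verdict.json": b"{}", "sbom.json": b"[]"})
assert pack_a == pack_b  # same inputs -> byte-identical pack
```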
**Critical invariant tests:**

* "Vendor > distro > internal" must be demonstrably *configurable*, and wrong merges must fail deterministically.
### 6.2 Boundary enforcement: Concelier & Excitors preserve prune source

Add a "behavioral boundary suite":

* instrument events/telemetry that record where merges happened
* feed in conflicting VEX claims and assert:

  * Concelier/Excitors do not resolve conflicts; they retain provenance and "prune source"
  * only `scanner.webservice` produces the final merged semantics

If Concelier/Excitors output a resolved claim, the test fails.
Define standard workloads:

* small image (200 packages)
* medium (2k packages)
* large (20k+ packages)
* "monorepo container" worst case (graph of 50k+ nodes)
Metrics collected:
### Phase 2: Offline e2e + interop

* offline bundle builder + strict "no egress" enforcement
* SBOM attestation round-trip + consumer-parsing suite
### Phase 3: Unknowns budgets + delta verdict
If you do only three things, do these:
1. A **Run Manifest** as a first-class test artifact
2. A **golden corpus** that pins all digests (feeds, policies, images, expected outputs)
3. **"No egress" by default** in CI, with explicit opt-in for online tests

Everything else becomes far easier once these are in place.
# Archived Advisory: Testing Strategy

**Archived**: 2025-12-21
**Original**: `docs/product-advisories/20-Dec-2025 - Testing strategy.md`

## Processing Summary

This advisory was processed into Sprint Epic 5100 - Comprehensive Testing Strategy.

### Artifacts Created

**Sprint Files** (12 sprints, ~75 tasks):

| Sprint | Name | Phase |
|--------|------|-------|
| 5100.0001.0001 | Run Manifest Schema | Phase 0 |
| 5100.0001.0002 | Evidence Index Schema | Phase 0 |
| 5100.0001.0003 | Offline Bundle Manifest | Phase 0 |
| 5100.0001.0004 | Golden Corpus Expansion | Phase 0 |
| 5100.0002.0001 | Canonicalization Utilities | Phase 1 |
| 5100.0002.0002 | Replay Runner Service | Phase 1 |
| 5100.0002.0003 | Delta-Verdict Generator | Phase 1 |
| 5100.0003.0001 | SBOM Interop Round-Trip | Phase 2 |
| 5100.0003.0002 | No-Egress Enforcement | Phase 2 |
| 5100.0004.0001 | Unknowns Budget CI Gates | Phase 3 |
| 5100.0005.0001 | Router Chaos Suite | Phase 4 |
| 5100.0006.0001 | Audit Pack Export/Import | Phase 5 |

**Documentation Updated**:

- `docs/implplan/SPRINT_5100_SUMMARY.md` - Master epic summary
- `docs/19_TEST_SUITE_OVERVIEW.md` - Test suite documentation
- `tests/AGENTS.md` - AI agent guidance for the tests directory

### Key Concepts Implemented

1. **Deterministic Replay**: Run Manifests capture all inputs for byte-identical verdict reproduction
2. **Canonical JSON**: RFC 8785 principles for stable serialization
3. **Evidence Index**: linking verdicts to the complete evidence chain
4. **Air-Gap Compliance**: network-isolated testing with `--network none`
5. **SBOM Interoperability**: round-trip testing with Syft, Grype, cosign
6. **Unknowns Budget Gates**: environment-based budget enforcement
7. **Router Backpressure**: HTTP 429/503 with Retry-After validation
8. **Audit Packs**: sealed export/import for compliance verification

### Release Blocking Gates

- Replay Verification: 0-byte diff
- Interop Suite: 95%+ findings parity
- Offline E2E: all pass with no network
- Unknowns Budget: within configured limits
- Router Retry-After: 100% compliance

---

*Processed by: Claude Code*
*Date: 2025-12-21*
# tests/AGENTS.md

## Overview

This document provides guidance for AI agents and developers working in the `tests/` directory of the StellaOps codebase.

## Directory Structure

```
tests/
├── acceptance/       # Acceptance test suites
├── AirGap/           # Air-gap specific tests
├── authority/        # Authority module tests
├── chaos/            # Chaos engineering tests
├── e2e/              # End-to-end test suites
├── EvidenceLocker/   # Evidence storage tests
├── fixtures/         # Shared test fixtures
│   ├── offline-bundle/  # Offline bundle for air-gap tests
│   ├── images/          # Container image tarballs
│   └── sboms/           # Sample SBOM documents
├── Graph/            # Graph module tests
├── integration/      # Integration test suites
├── interop/          # Interoperability tests
├── load/             # Load testing scripts
├── native/           # Native code tests
├── offline/          # Offline operation tests
├── plugins/          # Plugin tests
├── Policy/           # Policy module tests
├── Provenance/       # Provenance/attestation tests
├── reachability/     # Reachability analysis tests
├── Replay/           # Replay functionality tests
├── security/         # Security tests (OWASP)
├── shared/           # Shared test utilities
└── Vex/              # VEX processing tests
```

## Test Categories

When writing tests, use the appropriate category traits:

```csharp
[Trait("Category", "Unit")]        // Fast, isolated unit tests
[Trait("Category", "Integration")] // Tests requiring infrastructure
[Trait("Category", "E2E")]         // Full end-to-end workflows
[Trait("Category", "AirGap")]      // Must work without network
[Trait("Category", "Interop")]     // Third-party tool compatibility
[Trait("Category", "Performance")] // Performance benchmarks
[Trait("Category", "Chaos")]       // Failure injection tests
[Trait("Category", "Security")]    // Security-focused tests
```

## Key Patterns

### 1. PostgreSQL Integration Tests

Use the shared fixture from `StellaOps.Infrastructure.Postgres.Testing`:

```csharp
public class MyIntegrationTests : IClassFixture<MyPostgresFixture>
{
    private readonly MyPostgresFixture _fixture;

    public MyIntegrationTests(MyPostgresFixture fixture)
    {
        _fixture = fixture;
    }

    [Fact]
    public async Task MyTest()
    {
        // _fixture.ConnectionString is available
        // _fixture.TruncateAllTablesAsync() for cleanup
    }
}
```

### 2. Air-Gap Tests

Inherit from `NetworkIsolatedTestBase` for network-free tests:

```csharp
[Trait("Category", "AirGap")]
public class OfflineTests : NetworkIsolatedTestBase
{
    [Fact]
    public async Task Test_WorksOffline()
    {
        // Test implementation
        AssertNoNetworkCalls(); // Fails if network accessed
    }

    protected string GetOfflineBundlePath() =>
        Path.Combine(AppContext.BaseDirectory, "fixtures", "offline-bundle");
}
```

### 3. Determinism Tests

Use `DeterminismVerifier` to ensure reproducibility:

```csharp
[Fact]
public void Output_IsDeterministic()
{
    var verifier = new DeterminismVerifier();
    var result = verifier.Verify(myObject, iterations: 10);

    result.IsDeterministic.Should().BeTrue();
}
```

### 4. Golden Corpus Tests

Reference cases from `bench/golden-corpus/`:

```csharp
[Theory]
[MemberData(nameof(GetCorpusCases))]
public async Task Corpus_Case_Passes(string caseId)
{
    var testCase = CorpusLoader.Load(caseId);
    var result = await ProcessAsync(testCase.Input);
    result.Should().BeEquivalentTo(testCase.Expected);
}
```

## Rules for Test Development

### DO:

1. **Tag tests with appropriate categories** for filtering
2. **Use Testcontainers** for infrastructure dependencies
3. **Inherit from shared fixtures** to avoid duplication
4. **Assert no network calls** in air-gap tests
5. **Verify determinism** for any serialization output
6. **Use property-based tests** (FsCheck) for invariants
7. **Document each test's purpose** in its method name

### DON'T:

1. **Don't skip tests** without documenting why
2. **Don't use Thread.Sleep** - use proper async waits
3. **Don't hardcode paths** - use `AppContext.BaseDirectory`
4. **Don't make network calls** in non-interop tests
5. **Don't depend on test execution order**
6. **Don't leave test data in shared databases**

## Test Infrastructure

### Required Services (CI)

```yaml
services:
  postgres:
    image: postgres:16-alpine
    env:
      POSTGRES_PASSWORD: test
  valkey:
    image: valkey/valkey:7-alpine
```

### Environment Variables

| Variable | Purpose | Default |
|----------|---------|---------|
| `STELLAOPS_OFFLINE_MODE` | Enable offline mode | `false` |
| `STELLAOPS_OFFLINE_BUNDLE` | Path to offline bundle | - |
| `STELLAOPS_TEST_POSTGRES` | PostgreSQL connection | Testcontainers |
| `STELLAOPS_TEST_VALKEY` | Valkey connection | Testcontainers |

## Related Sprints

| Sprint | Topic |
|--------|-------|
| 5100.0001.0001 | Run Manifest Schema |
| 5100.0001.0002 | Evidence Index Schema |
| 5100.0001.0004 | Golden Corpus Expansion |
| 5100.0002.0001 | Canonicalization Utilities |
| 5100.0002.0002 | Replay Runner Service |
| 5100.0003.0001 | SBOM Interop Round-Trip |
| 5100.0003.0002 | No-Egress Enforcement |
| 5100.0005.0001 | Router Chaos Suite |

## Contact

For test infrastructure questions, see:

- `docs/19_TEST_SUITE_OVERVIEW.md`
- `docs/implplan/SPRINT_5100_SUMMARY.md`
- Sprint files in `docs/implplan/SPRINT_5100_*.md`